Any observed ``ordering'' in a collection of particles
can be interpreted as information.
For example, a pair of coins on a table could be examined to determine
whether zero, one, or two ``tails'' are showing.
Typically, much of the ``ordering'' in a collection is ignored
when examining it for its informational content.
In the foregoing case of a pair of coins, for example, we might be
uninterested in their location on the table.
In general, an entropic object contains information only
if we can enumerate various possible states of that object.
We could extract this information if we were allowed to examine
the object carefully enough to determine which state it is in.
In our pair-of-coins example, there are three states: 0, 1, or 2 ``tails'' showing.
Information is thus entropy with a syntax, that is,
a structural context which allows enumeration.
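The information content of the enumerated states can be sketched numerically. The following minimal example assumes two fair, independent coins (an assumption not made in the text above) and computes the Shannon entropy of the three observable states:

```python
from math import log2

# The pair-of-coins example has three enumerable states:
# 0, 1, or 2 "tails" showing.  Assuming two fair, independent
# coins, the state probabilities are 1/4, 1/2, and 1/4.
probs = {0: 0.25, 1: 0.50, 2: 0.25}

# Shannon entropy of the enumerated states, in bits.
entropy_bits = -sum(p * log2(p) for p in probs.values())
print(entropy_bits)  # 1.5
```

Note that the answer is less than the two bits needed to record each coin separately: by ignoring which coin shows tails (part of the ``ordering'' we chose to disregard), the chosen syntax yields only 1.5 bits of information.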
Clark Thomborson
Fri Oct 3 14:28:46 NZST 1997