
Quantity.

Given an entropic system, we can make precise measurements of the ``quantity'' of its entropic and informational content. In Maxwell's theory, for example, entropy is a well-defined quantitative measure. The entropy of a physical object increases with its temperature, and its value at any given temperature can, at least in principle, be determined precisely from various physico-chemical properties of that object, such as its heat capacity.
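To make ``precise measurement'' concrete, recall the classical Clausius definition, standard in thermodynamics texts and cited here only as a familiar illustration: the entropy change of an object absorbing heat reversibly at absolute temperature T, and the absolute entropy obtained by integrating a measured heat capacity C(T') up from absolute zero, are

\[ dS = \frac{dQ_{\mathrm{rev}}}{T}, \qquad S(T) = \int_0^{T} \frac{C(T')}{T'}\, dT'. \]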

In Shannon's theory of information, we also find a precise measure. Indeed, Shannon's theory is built on Maxwell's theory: in both theories, we must posit a probability distribution before measuring quantity. For example, we may assume that a fair coin has a 50-50 chance of coming up ``heads'' or ``tails'' on each flip; that is, we posit a uniform (50-50) probability distribution over an event space with exactly two outcomes (heads, tails). Applying Shannon's analytic technique to this probability distribution, we discover that each coin-toss yields precisely one bit of information. Equivalently, we assert that the result of a coin-toss could be communicated by sending precisely one bit of information.
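The calculation is brief; here it is worked out for the fair coin, a standard application of Shannon's entropy formula, spelled out for illustration:

\[ H = -\sum_i p_i \log_2 p_i = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2} = \tfrac{1}{2} + \tfrac{1}{2} = 1 \mbox{ bit}. \]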

It may someday be possible to quantify doctrine, knowledge, and wisdom, although this seems quite unlikely to me. At most I would assert that any reasonable quantity metric must be monotonic in my hierarchy: any entropic system must have more entropy than information, more information than doctrine, more doctrine than knowledge, and more knowledge than wisdom. The first of these assertions is in Shannon's theory of information; the others may be novel. Even so, the study of entropic quantity does not seem to me a promising avenue for exploration.

Let us now turn to questions of value and valuation.


