In general, entropy is a property of a probability distribution. Each
(typically discrete) distribution has its own entropy. ...
The entropy of a distribution objectively measures (nonparametrically)
how "random" or "uncertain" the outcomes are (on average) for samples
that are drawn from the distribution. The entropy of a distribution is
the average (minimal) amount of information necessary to exactly
determine which outcome obtains when a sample is drawn from the
distribution, given only the information contained in the specification
of the distribution itself.
Let {P_r} represent a probability distribution, where P_r means the
probability that the r-th outcome obtains when a sample is drawn. Here
r is a label that runs over each distinct (disjoint) possible outcome,
and SUM_r{P_r} = 1, where SUM_r{...} means sum the quantity ... over all
values of the label r. The entropy of the distribution {P_r} is:
S = SUM_r{P_r * log(1/P_r)}. It is the expectation, over the set of
outcomes, of the logarithm of the reciprocal of the probability of each
outcome. In the special case that there are N outcomes and each outcome
is equally likely (i.e. P_r = 1/N for r = 1,2,3,...,N), then S = log(N). .....
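(To make the formula concrete, here is a small Python sketch of my own,
not from the original post. It takes the log base 2, so S comes out in
bits; the text above leaves the base unspecified, and any base works up
to a constant factor:)

    import math

    def entropy(probs):
        # S = SUM_r{P_r * log(1/P_r)}, with log base 2 (bits).
        # Outcomes with P_r = 0 contribute nothing, since
        # p * log(1/p) -> 0 as p -> 0.
        return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

    # Uniform case: N = 8 equally likely outcomes, S = log2(8) = 3 bits.
    print(entropy([1.0 / 8] * 8))   # 3.0
    # A biased coin is less uncertain (lower entropy) than a fair one.
    print(entropy([0.9, 0.1]))      # ~0.47 bits
    print(entropy([0.5, 0.5]))      # 1.0 bit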
...............................
..................................
On occasion it may be useful to make a distinction between the disorder
seen in a system (at the *same* level of description as is used in some
generic entropy measure on some generic probability distribution) and
the entropy of the distribution of outcomes for that (generic) system.
For such a generic system the entropy is the average (minimal) information
needed to exactly specify which outcome occurs when a sample is drawn from
the distribution, whereas the disorder of a given outcome can be defined as
the minimal information necessary to uniquely characterize or describe that
outcome. With this distinction, the entropy is a property of the
distribution, and the disorder is a property of the individual
realizations (outcomes).
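(One crude way to see the per-outcome vs. per-distribution split, again
my own sketch rather than anything in the post: under an optimal coding
of the outcomes, pinning down the r-th outcome takes about log2(1/P_r)
bits, which plays the role of a per-realization quantity, while the
entropy is its average over the distribution. The "disorder" above is a
minimal description length, so this coding picture is only a loose
stand-in for it:)

    import math

    def surprisal(p_r):
        # Bits needed to pin down one particular outcome of
        # probability p_r under an optimal code: log2(1/p_r).
        return math.log2(1.0 / p_r)

    probs = [0.5, 0.25, 0.25]
    per_outcome = [surprisal(p) for p in probs]   # [1.0, 2.0, 2.0] bits
    # Entropy = probability-weighted average of the per-outcome values.
    S = sum(p * i for p, i in zip(probs, per_outcome))
    print(per_outcome, S)   # [1.0, 2.0, 2.0] 1.5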
...................................
.............................................