It is easy to confuse the concepts when people get cute and use an
established term [entropy] for what is really a new concept [Shannon's
"entropy" described by David].
In thermodynamics it makes no sense to ask "What is the entropy of the
binary sequence 1010101010?" Only physical systems have entropies, and the
entropy of a physical system is a function of its state. We could consider
a physical system, say ten ferrite cores .... on which the sequence above
has been written. In such a case there is no uncertainty associated with
the sequence *per se*; that's why we use the cores, after all. Any ten
digit sequence has the same property. .... Thus all states with five ones
and five zeros have the same entropy as the system in state 1010101010.
I would not disagree with anyone who says that the state 1010101010 is more
orderly than, say, 1101010010, and that should adequately demonstrate that
order and entropy are not necessarily correlated.
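The point can be illustrated with a short sketch (not from the original post): if we compute Shannon entropy purely from symbol frequencies, the "orderly" sequence 1010101010 and the less orderly 1101010010 come out identical, since both contain five ones and five zeros.

```python
from collections import Counter
from math import log2

def shannon_entropy(seq):
    """Per-symbol Shannon entropy in bits, from symbol frequencies."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Both sequences have five ones and five zeros, so their frequency-based
# entropies are identical regardless of how "orderly" one of them looks.
print(shannon_entropy("1010101010"))  # 1.0 bit per symbol
print(shannon_entropy("1101010010"))  # 1.0 bit per symbol
```

This only captures the information-theoretic sense of "entropy"; the thermodynamic entropy of the ferrite-core system is, as the post says, a function of its physical state, not of the sequence itself.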