
Order being born from disorder?



On Mon, 19 May 1997, Leigh Palmer made an interesting contribution to this
thread. I selected some of his sentences and mixed them with the ideas
they generated. I apologize if I am repeating what was already discussed
in November (I had no time to fetch the old messages from the archive).

It is easy to confuse the concepts when people get cute and use an
established term [entropy] for what is really a new concept [Shannon's
"entropy" described by David].

Consider a "highly organized" static system: two layers of sand (dark
and light) in a bottle. We shake the bottle and the order is destroyed.
I often say (teaching conceptual physics to non-science majors) that
no matter how long I shake, the initial state will not be re-established
unless the number of particles (big pebbles) is very small. The correctness
of this statement is obvious. But then, following the authors of many
textbooks, I add something (next paragraph) which is not totally correct.
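
(A minimal numerical sketch of mine, in Python, not part of the original
argument; the grain counts are illustrative. With N dark and N light
grains, only one of the C(2N, N) equally likely color arrangements is the
layered one, so the chance of shaking your way back to it collapses fast.)

# Probability that random shaking reproduces the layered state.
from math import comb

for n in (2, 5, 10, 50):
    arrangements = comb(2 * n, n)   # distinguishable color patterns
    p_ordered = 1 / arrangements    # chance of "all dark below all light"
    print(f"N = {n:3d}: 1 in {arrangements}  (p = {p_ordered:.3e})")

# For N = 50 this is already 1 in about 1.0e29; for real sand, with
# N on the order of 10^23, the layered state is unrecoverable in practice.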

"Scientists invented a concept, called entropy, which quantifies
orerliness. And they discovered that when transformations take place
the entropy of the universe increases. The entropy of the initial state
is lower than the entropy of the final state (after shaking)."
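
(The standard quantitative bookkeeping behind that classroom statement,
as a sketch of mine using Boltzmann's S = k ln W on the same two-color
sand model; W counts the arrangements compatible with each macrostate,
and the grain count is again illustrative.)

from math import comb, log

k = 1.380649e-23            # Boltzmann constant, J/K
n = 50                      # dark grains (and as many light ones)

W_layered = 1               # exactly one fully layered arrangement
W_mixed = comb(2 * n, n)    # count every arrangement as "mixed"

print(k * log(W_layered))   # 0.0 J/K for the layered state
print(k * log(W_mixed))     # ~9.2e-22 J/K: higher after shaking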

Notice that I am assigning entropies to quasi-static sand systems. How
does this differ from entropies assigned to different realizations of a
deck of playing cards? Or from entropies assigned to patterns of bits?
Bits, or letters in encrypted messages, do not interact through forces and
are not subject to temperature effects (except in hot debates); they are
static, not DYNAMIC.

In thermodynamics it makes no sense to ask "What is the entropy of the
binary sequence 1010101010?" Only physical systems have entropies, and the
entropy of a physical system is a function of its state. We could consider
a physical system, say ten ferrite cores .... on which the sequence above
has been written. In such a case there is no uncertainty associated with
the sequence *per se*; that's why we use the cores, after all. Any
ten-digit sequence has the same property. .... Thus all states with five
ones and five zeros have the same entropy as the system in state
1010101010.
I would not disagree with anyone who says that this system is more orderly
than is, say, 1101010010, and that should adequately demonstrate that
order and entropy are not necessarily correlated.
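
Shannon's formula makes the same point numerically. Here is a sketch of
mine (not from Palmer's text): computed from symbol frequencies,
H = -sum(p_i * log2(p_i)) assigns both strings exactly one bit per
symbol, because each contains five ones and five zeros.

from collections import Counter
from math import log2

def shannon_entropy(s):
    # H = -sum(p * log2(p)) over the symbol frequencies of s
    n = len(s)
    return -sum(c / n * log2(c / n) for c in Counter(s).values())

print(shannon_entropy("1010101010"))  # 1.0 bit per symbol
print(shannon_entropy("1101010010"))  # 1.0 bit per symbol as well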

But the "entropy" of information theory (intropy ?) is said to be highly
sucessful in dealing with crypted messages and in linguistics. Is this
true? And what about applying intropy to the loaded die outcomes decribed
previously in this thread? Can we say that Brian demonstrated to us that
higher intropy is not necessarily correlated with higher probability?
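
One way to make the die question concrete (a sketch of mine; Brian's
actual loading is not reproduced here, so the probabilities below are
illustrative): the loaded die has the lower Shannon entropy, yet its
favored face is far more probable than any face of the fair die.

from math import log2

def entropy(p):
    # Shannon entropy of a distribution, in bits
    return -sum(q * log2(q) for q in p if q > 0)

fair = [1/6] * 6
loaded = [1/2] + [1/10] * 5    # illustrative loading, not Brian's

print(entropy(fair))    # ~2.585 bits, the maximum for six outcomes
print(entropy(loaded))  # ~2.161 bits: lower "intropy", yet a much
                        # more probable favored outcome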

Ludwik Kowalski