
Re: Entropy: sorted=0 unsorted=237



At 03:02 PM 2/11/00 -0800, Andy Piacsek wrote:

But now I wonder: the reference state of a deck of cards seems
arbitrary;

Yes, the configuration I call "standard sorted order" is
arbitrary. Arbitrary but well known. The crucial point is that it is well
known.

as has been pointed out, the "standard order" is equally
likely to exist as any other order.

That's true. Microcanonically speaking, each microstate is equally likely.

Perhaps it clarifies matters if we refer to the "standard order" of a
deck of cards as a "reference state", in the same way that a temperature
scale needs a reference temperature.

I don't see why having a reference state is related to having a reference
temperature.

As discussed above, there's nothing special about the reference state,
except that it is well known. We could choose some other state as a
reference, no problem.

But the same cannot be said
of an ideal gas: its reference state (s=0) is NOT as likely
as the many other states it might occupy (equilibrium or not).

There is danger of miscommunication here. In quantum mechanics, usually
the word state is defined to mean microstate (as in "the ground state", or
such-and-such "excited state"). In classical thermodynamics, often the
word state means macrostate (as in "the equation of state"). When talking
to a mixed audience about the thermodynamics of systems with identifiable
microstates, the safe course is to avoid the word "state" and use the more
explicit versions.

So let's move beyond terminology to real physics:

1) The entropy of the ground (micro)state is zero.
2) The entropy of any other microstate is zero. If you know exactly which
microstate the system is in, there is no missing information, and hence no
entropy.
3) The entropy of a macrostate is generally the logarithm of the number of
ways that macrostate could have happened, i.e. the logarithm of the number
of different microstates that are consistent with the given macrostate.
(Take the log base 2 if you want the answer in bits.)
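In case a concrete calculation helps, here is that counting rule as a
minimal Python sketch (the function name entropy_bits is mine; I'm using
base-2 logs so the answer comes out in bits):

    import math

    def entropy_bits(num_microstates):
        # Entropy of a macrostate, given the number of equally-likely
        # microstates consistent with it: S = log2(W).
        return math.log2(num_microstates)

    print(entropy_bits(1))   # a fully-specified microstate: 0.0 bits
    print(entropy_bits(4))   # a macrostate with 4 microstates: 2.0 bits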

Let's do an example: Suppose we have a box containing two distinguishable
but equally-massive particles. Perhaps they are electrons with opposite
(therefore distinguishable) spin. Suppose I tell you they are in the state
where k1=3 and k2=4 in the appropriate units. Then this is a
fully-specified microstate. It has energy 3^2 + 4^2 = 25 in the
appropriate units. It has zero entropy.

In contrast, now suppose I shuffle the system and tell you that it is in a
microcanonical macrostate with energy E=25 exactly. This comprises four
possible microstates:
k1=0 k2=5;
k1=3 k2=4;
k1=4 k2=3;
k1=5 k2=0.
Therefore the entropy is 2 bits.
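That count is easy to verify mechanically. Here is a minimal Python sketch
(variable names are mine) that enumerates the nonnegative integer pairs
with k1^2 + k2^2 = 25 and takes the base-2 log of the count:

    import math

    E = 25
    # Enumerate the microstates (k1, k2) consistent with the
    # macrostate of energy E = 25, in the appropriate units.
    microstates = [(k1, k2)
                   for k1 in range(6)
                   for k2 in range(6)
                   if k1**2 + k2**2 == E]
    print(microstates)                   # [(0, 5), (3, 4), (4, 3), (5, 0)]
    print(math.log2(len(microstates)))   # 2.0 bits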


Yet perhaps this does not matter; if shuffling a deck of cards
corresponds to doing work on this system,

I wouldn't have said that. Shuffling is an increase in entropy. That can
occur with no change in energy.

I must also confess that I am unfamiliar with what John refers to
as Shannon entropy.

Shannon single-handedly invented the field of information theory. In his
epochal 1948 paper, "A Mathematical Theory of Communication", he defined
something he called "entropy".
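For what it's worth, Shannon's entropy of a probability distribution
p_1, p_2, ... is H = - sum_i p_i log2(p_i), measured in bits. A minimal
Python sketch (the function name is mine):

    import math

    def shannon_entropy(probs):
        # H = -sum(p * log2(p)), in bits; terms with p = 0 contribute nothing.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A uniform distribution over the 4 microstates in the example
    # above gives 2 bits, agreeing with the microcanonical count.
    print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0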

Check out
http://www.math.washington.edu/~hillman/Entropy/infcode.html