Re: Order being born from disorder?



When Leigh says that thermodynamic entropy is not related to order/disorder
what he means (I think) is that any increase in the order in a
*macroscopic* pattern seen in a system (such as in the arrangement of
floating whisker fragments in a toilet bowl) is not related to the system's
*thermodynamic* entropy. The thermodynamic entropy measures uncertainty (or
disorder if you will) at the level of the individual (atomic) microstates
for the system, *not* at the level of any possible macroscopic patterns.
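
For concreteness, assuming the Gibbs form S = -k_B * sum_i p_i ln(p_i) over
microstate probabilities p_i, a toy Python sketch with invented numbers makes
the point that relabelling which microstate one chooses to call "orderly"
cannot change S:

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def gibbs_entropy(probs):
        # S = -k_B * sum_i p_i ln(p_i) over microstate probabilities
        return -k_B * sum(p * math.log(p) for p in probs if p > 0)

    # Invented probabilities for four microstates of some tiny system.
    print(gibbs_entropy([0.4, 0.3, 0.2, 0.1]))
    # Permuting which microstate we regard as the "orderly" one only
    # reorders the list; the sum, and hence S, is unchanged.
    print(gibbs_entropy([0.1, 0.2, 0.3, 0.4]))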

David is so helpful I can't bring myself to berate him for interpreting me,
especially since he got it very nearly right. His posting (which I've saved
in its corrected form) may get commented on after I've read it at leisure,
and not from a CRT.

I don't think I said anything about "thermodynamic entropy". When one says
"entropy" among physicists I believe it is implicit that the word is used in
its original sense. The other "entropies" David describes are called that
only because their quantification is formally similar to the entropy in the
statistical thermodynamics of Gibbs and Boltzmann, just as electromagnetic
"waves" are called that even though they are not wet. I believe also that
it is *the entropy* to which Ludwik referred initially, but perhaps I'm
guilty of a false interpretation there. It is easy to confuse the concepts
when people get cute and use an established term for what is really a new
concept. It was once thought that thermodynamics and the properties of
materials set a fundamental limit on the maximum computer speed attainable.
That limit was subsequently shown to have been based on a mass delusion
engendered by just such a confusion. There are even reasonable people now
who worry about what entropy change is associated with the loss of a book
into a black hole if what is written in the book contains "information". It
would appear to me that this is not science. Any conclusion these people
reach is likely not falsifiable even in principle! Entropy is a quantity of
great utility, but it is one which fluctuates not only with time, but also
with its very definition, which can vary in minor ways.

The entropy has been shown to be a measure of uncertainty, indeed, and I'm
pleased to see David using that term in preference to the odious "disorder".
It is, however, a measure of uncertainty at *all* scales of length, not only
the submicroscopic. Uncertainty is also related to timescale, by the way, a
fact both David and I have swept under the rug to this point. The
uncertainty in the order of a pack of cards is zero once it has been
isolated from a source of shuffling; that is why the entropy contribution
due to any particular order is the same: zero. In the limit of incredibly
long times one could fancifully imagine that cards could change places, say
by quantum mechanical tunneling, and that would increase the "uncertainty".
The order of an initially arranged pack would be as uncertain as that of an
initially shuffled pack, however; the entropy increase due to consideration
on an incredibly long time scale would be the same for both cases (and
negligible compared to other terms, including the amplitude of the expected
entropy fluctuations).
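
To put a rough number on "negligible": a back-of-the-envelope Python sketch,
taking k_B ln(52!) as the term one would assign if the deck's order were
entirely unknown, gives about 2 x 10^-21 J/K, some twenty orders of magnitude
below the thermal entropy of the paper and ink themselves.

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    # Full uncertainty of a completely unknown deck order:
    # S = k_B * ln(52!), and math.lgamma(53) = ln(52!).
    print(k_B * math.lgamma(53))   # roughly 2.2e-21 J/K

    # For a deck whose order is known the corresponding term is
    # k_B * ln(1) = 0, whatever that order happens to be.
    print(k_B * math.log(1))       # 0.0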

In thermodynamics it makes no sense to ask "What is the entropy of the
binary sequence 1010101010?" Only physical systems have entropies, and the
entropy of a physical system is a function of its state. We could consider
a physical system, say ten ferrite cores (I'm an old coot, remember?), on
which the sequence above has been written. In such a case there is no
uncertainty associated with the sequence *per se*; that's why we use the
cores, after all. Any ten-digit sequence has the same property. A
particular sequence, though it is necessary to define the state of the
system, contributes nothing to the entropy of the system. Another way to
look at it is to note that it is possible to transform the digital state of
this system to any other digital state and reverse the transformation with
no increase in the entropy of the universe. I will not go into detail about how
this may be accomplished, but I will exemplify one sort of process, one in
which the number of ones and zeros remains unchanged. In any such process
it is possible to physically rearrange the cores simply by doing reversible
work against conservative electromagnetic forces. Thus all states with five
ones and five zeros have the same entropy as the system in state 1010101010.
I would not disagree with anyone who says that this system is more orderly
than is, say, 1101010010, and that should adequately demonstrate that order
and entropy are not necessarily correlated.
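
As nothing more than an illustrative sketch of that last point, in Python: a
register whose contents are known is a single microstate, so whatever pattern
sits on the cores contributes k_B ln(1) = 0 to the entropy; a term would
appear only if the contents were genuinely uncertain.

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    # Known contents: one microstate with probability 1, so the pattern
    # contributes k_B * ln(1) = 0, for 1010101010 and 1101010010 alike.
    print(k_B * math.log(1))   # 0.0

    # If all we knew were "five ones and five zeros" on ten cores, an
    # uncertainty term would appear: k_B * ln C(10,5) = k_B * ln 252.
    ln_states = math.lgamma(11) - 2 * math.lgamma(6)   # ln C(10,5)
    print(k_B * ln_states)     # roughly 7.6e-23 J/K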

On rereading this I'm not sure I've done anyone a service, but I expect
that I'll attract some potshots, and that should focus discussion on
something interesting.

Leigh