
Re: Order being born from disorder?



Warning! Excessively long post to follow!

Leigh wrote:

David is so helpful I can't bring myself to berate him for interpreting me,
especially since he got it very nearly right. His posting (which I've saved
in its corrected form) may get commented on after I've read it at leisure,
and not from a CRT.

You can berate me if you want to. It seems I ought to speak only for myself
in the future.

I don't think I said anything about "thermodynamic entropy". When one says
"entropy" among physicists I believe it is implicit that the word is used in
its original sense.

This is the usual case although there are exceptions. When information
theoretic topics impinge on physics then sometimes other kinds of
entropies are meant. For instance, in the field of quantum measurement
theory various information theoretic entropies are sometimes introduced to
quantify the information changes that occur when various kinds of
correlated quantum measurements are performed. Such information measures
are not the original sense of the meaning of the word entropy. I did not
mean to imply that Leigh had used the phrase "thermodynamic entropy". The
reason that I attached the modifier "thermodynamic" in front of the word
entropy was precisely to avoid any misinterpretation of what kind of
entropy was meant in the context at hand.

The other "entropies" David describes are called that
only because their quantification is formally similar to the entropy in the
statistical thermodynamics of Gibbs and Boltzmann, just as electromagnetic
"waves" are called that even though they are not wet. I believe also that
it is *the entropy* to which Ludwik referred initially, but perhaps I'm
guilty of a false interpretation there.

It seems the concept of a "wave" may have an analogous similarity to that
of "entropy" in that both words seem to have been used originally to
describe a specific thing (i.e. motion of the ocean surface, and inability
of a thermal energy transfer to perform macroscopic work), but later were
generalized to mean a more abstract and general concept. I doubt that
Clausius would recognize the uses to which the term he coined has been put.
I also doubt that the Old Norse mariners or prehistoric Germanic peoples
or whoever were first to use the root that became the word "wave" thought
of the concept in its modern abstract form as a generic oscillation in both
time and space.

I'm reminded of the story of how Shannon named his information theoretic
function "entropy" in the context of communication theory. Apparently
Shannon originally wanted to call it some name with the word "information"
in it (I think it may have been "information content"). He was reluctant to
name it that for fear of confusion because the term "information" has many
different meanings. He then asked John Von Neumann what he should call his
function. Von Neumann said he should call it "entropy" for two reasons:

"First, the expression is the same as the expression for entropy in
thermodynamics and as such you should not use two different names for
the same mathematical expression, and second, and more importantly,
entropy, in spite of one hundred years of history, is not very well
understood yet and so as such you will win every time you use entropy
in an argument." -- John Von Neumann

So if Leigh doesn't like the use of the term "entropy" as the generic
mathematical measure on a probability distribution that we recognize it as
today, and would rather restrict its usage to its original thermodynamic
meaning, then he can blame Von Neumann for the broadening of the term's
meaning. I might add that I think since Von Neumann made his above comments
there has been some further progress in the understanding of the concept of
entropy and the relationship of its thermodynamic manifestation to its other
more abstract generic usages in Bayesian probability theory. Much of this
further work was done by E. T. Jaynes.

It is easy to confuse the concepts
when people get cute and use an established term for what is really a new
concept.

I don't see it so much as being cute nor as using an established term for
a really *new* concept. Rather, I see it as a natural process of
generalization from the specific to the more abstract and metaphorical, and
in the process becoming closer to the underlying Platonic conceptual
essence that the original context illustrates as a special case. To me the
concepts of physics are Platonic things. As we learn more about them and
learn more about the underlying mathematical role played by those concepts,
they get redefined in an ever more abstract setting. This generalization/
abstraction process happens for most of our deepest physical concepts,
(energy, momentum, angular momentum, state, & gravitation to name a few
others). The way I see it, the essence of the concept of entropy is the
abstract idea that it is the average minimal information about the outcome
drawn from some probability distribution which is not contained in the
definition of that distribution, and which is needed, on average, to
identify the specific outcome. All the different specific uses of this
concept, whether or not they arise in a specifically thermodynamic context,
do not alter the fact that it is still the same mathematical concept. If two ostensibly
different phenomena are governed by the same mathematics, then in a very
deep sense they are both just different manifestations of the same
underlying phenomenon. This universality of phenomena governed by the same
underlying mathematics is one of the main sources of beauty in physics. For
another example: the gauge group governing the weak interactions, the 3-d
"surface" of a 4-dimensional sphere, the space of unit quaternions, and the
phase space of a spinor are, in a deep sense, all the same thing, i.e. just
different manifestations of SU(2).
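
To put a little numerical flesh on the abstract definition of entropy given
above, here is a minimal sketch (in Python, purely my choice of illustration)
of the generic entropy of a probability distribution, measured in bits:

    import math

    def entropy_bits(probs):
        # Average missing information (in bits), beyond the distribution
        # itself, needed to identify which outcome actually occurred.
        assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin leaves one full bit of missing information per toss;
    # a heavily biased coin leaves much less.
    print(entropy_bits([0.5, 0.5]))    # 1.0
    print(entropy_bits([0.99, 0.01]))  # ~0.081

Nothing in that little function depends on the distribution being a
thermodynamic one; the same measure applies to any probability assignment
whatsoever.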

It was once thought that thermodynamics and the properties of
materials set a fundamental limit on the maximum computer speed attainable.
That limit was subsequently shown to have been based on a mass delusion
engendered by just such a confusion.

I was under the impression that the earlier proposed theoretical limits on
computational speed had more to do with inappropriate assumptions about what
was irreducibly necessary for a computation (the need to irreversibly
distinguish a signal above the background thermal noise level, the need to
erase partial computations and reset the computer's state, etc.) and were
not due to a confusion between different kinds of entropies.

There are even reasonable people now
who worry about what entropy change is associated with the loss of a book
into a black hole if what is written in the book contains "information". It
would appear to me that this is not science. Any conclusion these people
reach is likely not falsifiable even in principle!

No argument from me here.

Entropy is a quantity of
great utility, but it is one which fluctuates not only with time, but also
with its very definition, which can vary in minor ways.

This is why I thought it was a good idea to attach appropriate modifiers to
the term entropy when discussing it to avoid possible confusion.

The entropy has been shown to be a measure of uncertainty, indeed, and I'm
pleased to see David using that term in preference to the odious "disorder".

I do prefer the term "uncertainty" over "disorder" when discussing entropy.
"Uncertainty" conjures up in the mind the idea that some information about
something is missing, and the essential point of the idea of entropy is that
it is a quantitative measure of that missing information. "Disorder", OTOH,
usually seems to suggest (at least to me) a patternless arrangement of the
parts of a whole. A problem with quantifying this notion is that it is
quite subjective and a particular arrangement may seem quite disordered on
one level but actually have a hidden order. If one looked at the
statistical pattern of the bits of a compressed (say PKZIPed) version of an
executable computer program, a PhD thesis, a .gif image file, or a
digitized sound file of the performance of some musical composition, one
might think that the compressed file is a completely meaningless, disordered
random string of bits. If so, one would be wrong. That's why I like the
term "complexity" when discussing a particular arrangement of something.
This notion has been objectively quantified by Chaitin and independently by
Kolmogorov. The Chaitin-Kolmogorov complexity of any thing is the length
(in bits, for instance) of the shortest possible *complete* description of
that thing. Such a maximally-compressed description will tend for
intrinsically or *irreducibly* complex things to have an arrangement of its
bits which is statistically indistinguishable from a very long string of
random bits.
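
The Chaitin-Kolmogorov complexity itself is uncomputable in general, but the
length of a file after a good compressor has squeezed it gives a crude upper
bound, which is enough to illustrate the point. A small sketch (Python, with
zlib standing in for the compressor; the file contents are made up):

    import os, zlib

    def compressed_length(data):
        # Length in bytes of a zlib-compressed copy: a rough upper bound
        # on the Chaitin-Kolmogorov complexity of the data.
        return len(zlib.compress(data, 9))

    patterned = b"10" * 5000        # highly regular, like 101010...10
    random_ish = os.urandom(10000)  # irreducibly complex, almost surely

    print(compressed_length(patterned))   # tiny: the pattern compresses away
    print(compressed_length(random_ish))  # ~10000: essentially incompressible

    # An already-compressed file barely shrinks any further; its bits are
    # statistically indistinguishable from a long random string.
    print(compressed_length(zlib.compress(patterned, 9)))

The same experiment on a PKZIPed thesis or .gif would show the same thing:
maximal compression leaves behind something that looks random even though it
is anything but meaningless.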

It is, however, a measure of uncertainty at *all* scales of length, not only
the submicroscopic.

This is true in a reductionistic sense. All macroscopic differences can be
thought of as due to an accumulation of microscopic differences. A
thermodynamic system has *some* macroscopic definition. If the system is
isolated and in equilibrium then that system's entropy is just the
logarithm of the number of possible *microstates* the system can be in and
still have the *same macroscopic* description. If two different
microstates have the same pre-specified defining macrostate, and they
differ in terms of some *other* long length-scale property which is *not*
used to define the macrostate, then both microstates contribute equally to
the entropy (for the isolated system in equilibrium).
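
As a toy illustration of "entropy = log of the number of microstates
consistent with a given macrostate", take N two-state spins whose macrostate
is specified only by how many of them point up (a sketch only, not a model of
any real material):

    from math import comb, log2

    def macrostate_entropy_bits(n_spins, n_up):
        # log2 of the number of spin microstates sharing the macrostate
        # "n_up of the n_spins point up".
        return log2(comb(n_spins, n_up))

    # Every individual microstate with 50 up-spins counts exactly once;
    # a visually "orderly" arrangement contributes no more and no less
    # than a "messy" one.
    print(macrostate_entropy_bits(100, 50))  # ~96.3 bits
    print(macrostate_entropy_bits(100, 0))   # 0 bits: macrostate fixes the microstate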

Uncertainty is also related to timescale, by the way, a
fact both David and I have swept under the rug to this point. The
uncertainty in the order of a pack of cards is zero once it has been
isolated from a source of shuffling; that is why the entropy contribution
due to any particular order is the same: zero. In the limit of incredibly
long times one could fancifully imagine that cards could change places, say
by quantum mechanical tunneling, and that would increase the "uncertainty".

This is an important point. I *did* sweep this consideration under the
rug. The time-scale is important for the notion of entropy because the
underlying dynamics which chooses from the ensemble of microstates for the
sequence of realized microstates can only sample the subset of microstates
which are accessible to the dynamics over the experimentally relevant
time-scale. If a thermodynamic system's dynamics is such that some
accessible region of (microscopic) phase space is disconnected from some
other regions of phase space (or if the connections to them are very tenuous
bottlenecks which are nearly impossible to find or traverse, such as
through macroscopic quantum tunneling) then the system point in phase
space cannot get to the other regions and sample those microstates in the
time allowed for any experimental observation. When this happens the
dynamics is effectively non-ergodic and the space of accessible
microstates is a subset of those which are ostensibly consistent with the
macrostate. In this case (as in the case of glassy materials and other
systems with quenched disorder) the experimentally relevant version of the
thermodynamic entropy is the information needed to determine the
microstate of the system *given* the particular subset of *accessible*
microstates that the system finds itself trapped in. In this case the
system does not equilibrate to "true equilibrium" over the experimentally
feasible time scales and the system becomes "stuck" in a metastable
"pseudo"equilibrium macrostate which (for an isolated system) maximizes
the entropy for the distribution over the restricted phase space. If a
*much* longer time scale is allowed for then the system may relax to
"true" equilibrium by sampling the other nearly-isolated regions of
(microscopic) phase space. Once enough time has passed for all of the
(nearly) separated regions of phase space to become accessible by the
system's dynamics, the system's entropy grows because the newly opened-up
collections of previously disallowed microstates become available to be
realized as "true" equilibration takes place. In the case of a
glassy system the extra entropy generated by waiting so long represents
the information necessary to pick from among the various "frozen" glassy
configurations when the T -> 0 limit is taken at a more experimentally
reasonable rate.
(This commentary on glassy materials was inserted to head off a potential
objection by Doug Craigen.)
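
As a cartoon of that trapping effect, compare the entropy computed over every
microstate nominally consistent with the macrostate against the entropy
computed over only the dynamically accessible valley (a sketch with made-up
counts):

    from math import log2

    n_consistent = 2**60   # all microstates consistent with the macrostate
    n_accessible = 2**45   # the nearly isolated region the dynamics is stuck in

    s_true_equilibrium = log2(n_consistent)   # 60 bits
    s_trapped = log2(n_accessible)            # 45 bits: experimentally relevant value

    # The difference is the information needed to say *which* nearly
    # isolated region of phase space the system happened to freeze into.
    print(s_true_equilibrium - s_trapped)     # 15 bits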

...
In thermodynamics it makes no sense to ask "What is the entropy of the
binary sequence 1010101010?" Only physical systems have entropies, and the
entropy of a physical system is a function of its state.

To be picky I would say only *macroscopic* physical systems have
*thermodynamic* entropies. And I would mention the thermodynamic entropy
of a given physical system is a function of its *macro*state. I do agree
that a particular sequence has neither a thermodynamic nor a
(Shannon-like) information-theoretic entropy. Any such entropy is
defined for a probability distribution, and a given sequence by itself does
not define a probability distribution.
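
One quick way to see this is that any "entropy" you compute for the bare
digits depends entirely on the probability model you choose to impose on them
first (a sketch; both models below are arbitrary choices of mine):

    from math import log2

    seq = "1010101010"

    # Model A: treat the digits as i.i.d. bits, with p("1") estimated
    # from the sequence itself.
    p1 = seq.count("1") / len(seq)
    h_iid = -len(seq) * (p1 * log2(p1) + (1 - p1) * log2(1 - p1))  # 10.0 bits

    # Model B: the sequence is known with certainty to be this one string,
    # so there is no missing information at all.
    h_certain = 0.0

    print(h_iid, h_certain)  # very different numbers for the very same digits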

We could consider
a physical system, say ten ferrite cores (I'm an old coot, remember?), on
which the sequence above has been written. In such a case there is no
uncertainty associated with the sequence *per se*; that's why we use the
cores, after all. Any ten digit sequence has the same property. A
particular sequence, though it is necessary to define the state of the
system, contributes nothing to the entropy of the system.

This would be the case if the presence or absence of a net spontaneous
magnetic moment on a given ferrite core did not influence the number of
microstates accessible to the core. I doubt that this is precisely the
case, however. Thus there is expected to be some *small* influence on the
system's entropy from, at the least, changes in the total number of "1"s in a
pattern (assuming that the construction of each core is essentially
identical down to the atomic level; otherwise there would be entropy
changes even for different patterns with the same number of "1"s, by changing
which of the cores have the magnetic moments).

Another way to
look at it is to note that it is possible to transform the digital state of
this system to any other digital state and reverse the transformation with
no increase in the entropy of the universe. I will not go into detail how
this may be accomplished, but I will exemplify one sort of process, one in
which the number of ones and zeros remains unchanged. In any such process
it is possible to physically rearrange the cores simply by doing reversible
work against conservative electromagnetic forces.

I'll take your word on this, but it seems that it would be difficult indeed
to accomplish this task without the extraneous creation of at least a few
very low-energy photons. The information carried away by them could
certainly mess up the results. Note that for the case of Planck blackbody
radiation there are about 5.1 bits of entropy per photon present.
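
For the curious, the back-of-the-envelope arithmetic behind that figure goes
roughly as follows, using the standard photon-gas results (mean photon energy
pi^4/(30 zeta(3)) kT and S = (4/3) U/T); my quick estimate lands right around
the value quoted:

    from math import pi, log

    zeta3 = 1.2020569  # Apery's constant, zeta(3)

    # Mean energy per blackbody photon, in units of kT: ~2.70
    mean_energy_kT = pi**4 / (30 * zeta3)

    # Photon gas: S = (4/3) U/T, so entropy per photon in units of k_B: ~3.60
    s_per_photon_kB = (4.0 / 3.0) * mean_energy_kT

    print(s_per_photon_kB / log(2))  # ~5.2 bits of entropy per photon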

Thus all states with five
ones and five zeros have the same entropy as the system in state 1010101010.
I would not disagree with anyone who says that this system is more orderly
than is, say, 1101010010, and that should adequately demonstrate that order
and entropy are not necessarily correlated.

It seems that the 1010101010 pattern has a slightly lower Chaitin-
Kolmogorov complexity than the 1101010010 pattern.

On rereading this I'm not sure I've done anyone a service, but I expect
that I'll attract some potshots, and that should focus discussion on
something interesting.

I hope you (Leigh) don't consider this post as anything more than a few
potshots. Unfortunately, (as is the usual case with my posts) it is too
long and its length may make it seem more like a machine gun burst (which
it is not meant to be).

David Bowman
dbowman@gtc.georgetown.ky.us