
Re: entropy



Responding to my last post James M. said:

I don't see how this is substantially different from Doug's definition of
disorder. He says disorder is when there is no "ordering rule" to describe
the configuration, implying that such a rule should be simple; you are
suggesting that disorder is when the configuration requires a lot of
information to define: isn't that the same thing?

I don't want to speak for Doug here, but his web page didn't say that the
ordering rule must be simple to define, although it is possible that he meant
for this to be the case. He seems to have said that disorder is the amount of
deviation a particular configuration has from the one specified by the
ordering rule. So apparently this notion of disorder is some kind of measure
of the difference a given configuration has from the ordered case. He didn't
give details on whether a large measure of deviation necessarily entails a
long (i.e., information-rich) characterizing description. Even if the
definition of disorder that I gave is used rather than
Doug's, this still doesn't quite equate entropy with disorder. The entropy of
the situation is the amount of information needed to specify *on the average*
exactly which sample/realization obtains given the probability distribution
for choosing the samples. The definition of disorder I gave above is the amount
of information needed to characterize a given individual sample. Essentially,
using the definitions I gave, the entropy is the average of the disorder for
each sample over the probability distribution for the samples.
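
To make that distinction concrete, here is a small Python sketch (my own
illustration with a made-up distribution, not anything from Doug's page):
the "disorder" of an individual sample is the information -log2(p_i) needed
to single that sample out, and the entropy is the probability-weighted
average of those per-sample values.

    import math

    # A hypothetical probability distribution over four sample configurations.
    p = [0.5, 0.25, 0.125, 0.125]

    # "Disorder" of an individual sample: the information (in bits) needed
    # to specify that particular sample, -log2(p_i).
    disorder = [-math.log2(pi) for pi in p]          # [1.0, 2.0, 3.0, 3.0]

    # Entropy: the average of the per-sample disorder over the distribution.
    entropy = sum(pi * di for pi, di in zip(p, disorder))
    print(entropy)                                   # 1.75 bits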

A question about the entropy of amorphous stuff at zero Kelvin: The
statistical mechanics definition of entropy usually involves some phrase
like "the number of microstates accessible to the system". While the
amorphous stuff could have frozen into numerous microstates, once it is
frozen there is only one 'accessible' microstate. I guess the buzzword is
the system isn't ergodic.

Yup. You have hit on the difference. If one only includes the microstates
accessible to the system's dynamics for a *given* quenched background
positional configuration, then the entropy vanishes as T -> 0. In this
case one needs to consider the particular quenched molecular arrangement as
part of the system's macrostate, which strongly limits the allowed set of
possible microstates. However, the metastable state of the glassy system is
*not* in true thermodynamic equilibrium; it is only effectively nonergodic
over experimental time scales because of anomalously slow configurational
equilibration times. When one takes the true infinite-time average over the
states, we see that the various "quenched" configurational states are not
really frozen, but are dynamically interconvertible, making all of the
different "frozen" configurations accessible over a sufficiently long time
scale (probably *much* too long for experimental situations). In this case
the various microscopically detailed configurational arrangements become
part of the microscopic dynamics and are not seen as an external macroscopic
constraint on the dynamics. When the full phase space is included the
*nonequilibrium* system is seen to possess a residual configurational
entropy as T -> 0. Effectively, this zero-point configurational entropy
represents the information necessary to find out which one of the frozen
glassy states the nonequilibrium system is in as T -> 0.

Presumably, if one took such a frozen glassy system and waited for a nearly
infinite time it would decay (via tunneling if not via thermal activation)
to the crystalline phase, which would liberate the excess frozen potential
energy stored in the bonds between the particles and reheat the system to
T > 0. If this newly liberated thermal energy were removed from the system,
the system's entropy would then smoothly go to zero as T -> 0 again, and
this time the limit would agree with the 3rd law, since the system would
pass through a quasistatic sequence of equilibrium (macro)states during the
limiting process.
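
As a toy illustration of that residual entropy (my own numbers, chosen only
to set the scale): if each of the N molecular units in the quenched glass
could have frozen into one of, say, 2 local configurations, then counting
all of those frozen arrangements as accessible microstates gives
S = k*ln(2^N) = N*k*ln(2), or about 5.8 J/(mol K) per mole of units.

    import math

    k_B = 1.380649e-23    # Boltzmann constant, J/K
    N_A = 6.02214076e23   # Avogadro's number, 1/mol

    # Assumed toy number: each molecular unit freezes into one of 2
    # local configurations, so Omega = 2**N and S = N * k_B * ln(2).
    configs_per_unit = 2

    # Residual configurational entropy per mole of units as T -> 0.
    S_residual = N_A * k_B * math.log(configs_per_unit)
    print(f"{S_residual:.2f} J/(mol K)")   # ~5.76 J/(mol K)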

Can an entropy even be properly defined for such a system?

Yes. But as can be seen from the above discussion, one needs to be very
careful in doing so, since slightly different definitions give significantly
different answers.

Leigh said:
I have a very clear idea of what disorder is. Related to a pack of cards
I can tell you that a new pack is rarely as disordered straight out of
the box as it is after it has been shuffled. I shall demonstrate* that
the disorder of the pack of cards has nothing whatever to do with its
entropy, and that should thoroughly discredit the analogy.

This example has (at least) two different entropies present. Certainly, the
thermodynamic entropy has nothing to do with the disorder in the arrangement
of the pack of cards. But this entropy *is* related to the disorder (as I
defined it) in the pack's microscopic degrees of freedom. There is a second
(nonthermodynamic) entropy present in this problem that *is* related to the
disorder in the arrangement of cards in the pack. This entropy is the
average information needed to exactly specify the actual card sequence. If
the deck has been thoroughly shuffled then this entropy is log2(52!) = 226
bits (we are neglecting the jokers here). If the deck is a fresh unopened
pack the entropy is just the few bits needed to specify the order of the 4
suits and whether the cards count up in sequence or count down:
log2(4!) + 1, about 5.6 bits.
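
A quick check of those two numbers in Python (my sketch; it assumes the
only unknowns in a fresh pack are the order of the 4 suits and the up/down
direction of the count):

    import math

    # Thoroughly shuffled deck: all 52! orderings equally likely.
    shuffled_bits = math.log2(math.factorial(52))   # ~225.58 bits

    # Fresh pack: 4! possible suit orders times 2 count directions.
    fresh_bits = math.log2(math.factorial(4) * 2)   # ~5.58 bits

    print(f"shuffled: {shuffled_bits:.1f} bits, fresh: {fresh_bits:.2f} bits")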

David Bowman
dbowman@gtc.georgetown.ky.us