
Re: Entropy



In his important warning about simplistic explanations, Leigh said:

.... There are many examples of
such erroneous high school "explanations" that would be better off
left as mysteries*, or perhaps shown to be so when they come up. ....

*Perhaps the worst is the association of entropy with disorder. ....

and Doug C. chimed in with:

This business of equating entropy with disorder is a pet peeve of mine.

It seems to me that whether or not one can associate/equate entropy with
disorder depends on just how the notion of disorder is defined. (There is
not much wiggle room in changing the standard Boltzmann/Gibbs/Shannon/Jaynes
definition of entropy.) If disorder is defined, as Doug does in his Entropy
& Evolution essay, as a particular metric on the (phase) space of
states/outcomes/realizations that measures the deviation of a given state
from a fiducial state called an "ordered" state, which is itself defined
according to an algorithm called the "ordering rule", then Doug is certainly
correct that entropy is *not* disorder. For in this case disorder is a
property possessed by individual (micro)states, whereas entropy, being a
functional of the probability measures on that realization space, is a
property possessed by each probability distribution on that space. In the
case of stat mech/thermo this means that each macrostate has an entropy but
each microstate does not. In thermo the microstates are not individually
characterized--only the macrostates are. The entropy of a given macrostate
is the (average minimal) amount of information necessary to exactly
determine the microstate given only the macrostate (and the constraints on
the possible microstates that the macrostate provides).*
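
To make the distinction concrete, here is a toy Python sketch (all states
and numbers are invented for illustration): "disorder" attaches a number to
each individual microstate, while entropy attaches a number to each
probability distribution over the microstates.

    import math

    def disorder(state, ordered_state):
        """Per-state disorder: deviation of one microstate from a
        fiducial 'ordered' state (here, just the Hamming distance)."""
        return sum(a != b for a, b in zip(state, ordered_state))

    def entropy(probabilities):
        """Shannon/Gibbs entropy of a distribution over microstates,
        in bits: H = -sum_i p_i log2 p_i."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # Four microstates of two binary "spins"; the ordered state is (0, 0).
    states = [(0, 0), (0, 1), (1, 0), (1, 1)]
    print([disorder(s, (0, 0)) for s in states])  # 0, 1, 1, 2: one value per state
    print(entropy([0.25, 0.25, 0.25, 0.25]))      # 2.0 bits: one value per distribution
    print(entropy([1.0, 0.0, 0.0, 0.0]))          # 0.0 bits: this "macrostate" pins down the microstate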

If one defines disorder in another way it may very well be possible to
associate entropy with disorder. For instance, if we say that a disordered
state is one that requires a large amount of information to characterize
precisely, and a relatively ordered state is one that can be characterized
with little information, then the association of disorder with entropy
becomes quite tight.
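
As a hedged Python illustration of that reading (the distributions below
are made up): by Shannon's source-coding result, the minimal average number
of bits needed to specify which microstate obtains is essentially the
entropy of the distribution, so "hard to characterize" and "high entropy"
line up.

    import math

    def ideal_code_lengths(probabilities):
        # An optimal (Shannon) code spends about -log2(p) bits on a
        # state of probability p.
        return [-math.log2(p) for p in probabilities]

    p_sharp = [0.97, 0.01, 0.01, 0.01]   # nearly pinned down: little info needed
    p_broad = [0.25, 0.25, 0.25, 0.25]   # maximally uncertain: most info needed

    for p in (p_sharp, p_broad):
        avg_bits = sum(pi * li for pi, li in zip(p, ideal_code_lengths(p)))
        print(round(avg_bits, 3))        # ~0.242 bits vs. 2.0 bits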

Doug goes on to claim that:

I guess I got into it while doing my thesis research on amorphous silicon.
It occurred to me that if one cooled a-Si towards zero K, its entropy would
approach 0, i.e., there is no residual entropy due to the disorder in the
structure.

Why do you make this claim? It seems to me that in the T -> 0 limit a-Si
*does* have residual entropy, which is exactly the configurational entropy
associated with the distribution describing the quenched "disorder" of the
particle positions.
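
For comparison, the textbook example of such quenched-in residual entropy is
Pauling's estimate for ice, where each H2O molecule effectively retains 3/2
frozen-in proton configurations as T -> 0, giving S(0) = R ln(3/2) ~ 3.4
J/(mol K). A back-of-envelope in Python (the a-Si per-site count below is a
pure placeholder, not a measured value):

    import math

    R = 8.314  # gas constant, J/(mol K)
    print(R * math.log(1.5))         # ~3.37 J/(mol K): Pauling's residual entropy of ice

    W_per_site = 1.1                 # hypothetical effective configurations per Si site
    print(R * math.log(W_per_site))  # nonzero whenever W_per_site > 1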

David Bowman
dbowman@gtc.georgetown.ky.us

*In the case of microcanonical equilibrium (system isolated from the rest of
the universe) this means that the entropy boils down to essentially the
logarithm of the number of accessible microstates. The reason for this
simplification is that in microcanonical equilibrium all (up to a set of
essentially measure zero) of the accessible microstates are equally likely,
and when there are W equally likely possibilities it takes log(W) digits of
information to specify which one obtains. If the system is not in
microcanonical equilibrium then the probability distribution on the
available microstates is not uniform and the entropy is, accordingly, more
complicated.
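
A small Python sketch of that last point (the 10-spin "macrostate" is an
invented toy): for a macrostate that makes its W accessible microstates
equally likely, the entropy is exactly log(W), and any non-uniform
distribution over the same W states comes out lower.

    import math
    from itertools import product

    N, k = 10, 4  # macrostate: N two-state "spins" with exactly k up
    microstates = [s for s in product((0, 1), repeat=N) if sum(s) == k]
    W = len(microstates)                  # = C(10, 4) = 210
    print(math.log2(W))                   # ~7.71 bits to pick out one microstate

    def entropy(p):
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    print(entropy([1.0 / W] * W))         # equals log2(W) in the uniform case
    nonuniform = [0.5] + [0.5 / (W - 1)] * (W - 1)
    print(entropy(nonuniform))            # strictly less than log2(W)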