
Re: Entropy (was very long)



At 08:23 AM 2/5/00 -0500, David Bowman wrote:

I personally see nothing objectionable with the usage of these terms in
the article.

Right. Neither do I.

I see little conceptual danger in the association as long as the term
'disorder' is used in an imprecise and nontechnical sense,

I see very, very little conceptual danger. If the word is used in an
imprecise context, then obviously it is imprecise. OTOH the concept of
disorder can be made precise, in which case it is precise.

It is possible, I suppose, that an
unsuspecting student might bring to the conceptual table ideas of the
relevant disorder possibly being at the level of a macroscopic
arrangement of macroscopic parts of a thermodynamic system. If so then
the student would be liable to be confused.

I would say the student brings the _correct_ concept. The only possible
area of confusion has to do with magnitudes; see below.

For instance, a thoroughly
shuffled deck of cards has essentially the same thermodynamic entropy
as a well-ordered deck (assuming, of course, both decks have the same
temperature and are subject to the same external macroscopic
environment). OTOH, a cold deck of cards has less thermodynamic entropy
than a warm one regardless of how the decks were shuffled or not.

Shuffling the deck increases its entropy by a couple hundred bits (roughly
log2(52!) ≈ 226 bits). Cooling the deck (in the usual way) decreases its
entropy by something like 10^23 bits.

(I say "in the usual way" because it is possible to heat and cool things
without changing the entropy at all. We all know about reversible
adiabatic compression and expansion of gas, for example.)
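For anyone who wants to check the magnitudes, here is a rough
back-of-the-envelope sketch (the mass, specific heat, and temperature
change of the deck are my guesses, not measured values):

    import math

    # Entropy of shuffling: log2(52!) bits, i.e. the information needed
    # to specify one ordering out of 52! equally likely orderings.
    shuffle_bits = math.lgamma(53) / math.log(2)   # log2(52!) ~ 226 bits
    print(f"shuffling: {shuffle_bits:.0f} bits")

    # Entropy removed by cooling the deck from 300 K to 295 K.
    # Assumed values: mass ~0.1 kg, specific heat ~1340 J/(kg K) (paper).
    m, c = 0.1, 1340.0                  # kg, J/(kg K)
    k_B = 1.380649e-23                  # Boltzmann's constant, J/K
    dS_J_per_K = m * c * math.log(295.0 / 300.0)   # dS = m c ln(T2/T1)
    dS_bits = dS_J_per_K / (k_B * math.log(2))     # convert J/K to bits
    print(f"cooling:   {dS_bits:.2e} bits")        # ~ -2.4e23 bits

With these guessed numbers the cooling term comes out around 10^23 bits,
dwarfing the couple hundred bits from shuffling.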

The terms 'information', 'disorder', 'order', 'complexity',
'uncertainty', (even 'entropy'), etc. can all be used in both an
imprecise and colloquial as well as a precise and technical manner.

Yes indeed.

But the same goes for such words as 'energy', 'power', 'force',
'work', 'color', 'black', 'white', 'flavor', 'heat', 'flow', etc.

Yes again.

Another problem that sometimes arises in the terminology of physics is
that the same common word may be occasionally adopted for multiple
technical meanings. This, too, can lead to confusion for the student.
For instance the term 'heat' is used quite differently in the next
two sentences. "The latent heat for the phase change for the substance
is 750 kJ/kg." "The Joule paddle-wheel experiment shows that
1 cal. of heat is equivalent to 4.18 J of work."

Let me remind everyone that there are two schools of thought here.
1) One school defines heat = heat flow.
2) The other school defines heat = thermal energy.

Folks in school #2 see no inconsistency in the two statements Bob just
cited. I suspect that strict adherents to school #1 would reject _both_
statements as abuse of the terminology.

In the specific cases of the terms 'entropy' and 'disorder', I would
*not* use them as synonyms.

OK. Disorder is a colloquial term. Entropy is the corresponding technical
term.

I would consider the concepts they
signify as related, but not identical, to each other. What relates them
is that they are both measures of (missing) information. What
distinguishes them is that they are *different* such measures.

I don't know any way to quantify disorder except by measuring the entropy.

This definition implies that since entropy is an *average* over a
distribution, it is a statistical concept: it is a property of the
statistical ensemble of possibilities (i.e. the probability distribution
itself). It is not a property of individual outcomes or states drawn
from the universe of possibilities. The entropy is a functional on the
space of probability distributions. It is not a function of an
individual given realization or sample drawn from such a distribution.

Quite so. To say it in other words: If you know the exact microstate,
there's zero entropy. The same can be said for disorder: If I shuffle the
cards and tell you the resulting order, in some sense they're not
"disordered" -- they are ordered in a very particular way that I just told you.

Since entropy is really dimensionless

Right. It is dimensionless. Sometimes we choose to measure it in bits
(which are dimensionless). As Bob points out elsewhere, this is analogous to
measuring angles in radians (which are dimensionless). We have to specify
the units of measure even for dimensionless quantities because there are
choices (bits/nats; degrees/radians/revolutions, et cetera).

If this is done then a temperature of 1 K really
means an intensive quantity (in energy-per-entropy units) of
1.38065x10^(-23) J/nat.

Exactly so.
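As a quick check of that conversion (a sketch; the only input is the
accepted value of Boltzmann's constant):

    import math

    k_B = 1.380649e-23        # Boltzmann's constant, J/K

    # With entropy counted in nats, a temperature of 1 K is an
    # energy-per-entropy of k_B joules per nat.
    J_per_nat = k_B * 1.0                 # 1.380649e-23 J/nat

    # With entropy counted in bits (1 nat = 1/ln2 bits), the same
    # temperature is k_B * ln2 joules per bit.
    J_per_bit = k_B * math.log(2)         # ~9.57e-24 J/bit

    print(J_per_nat, J_per_bit)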

....

The composite object(s) that has(have) this minimal possible
complexity value over the whole ensemble is/are said to be the
most-ordered object(s). The difference between the actual complexity of
each of the ensemble members and this minimal complexity value defines
(by my definition of the term) the 'disorder' of each such composite
object.

That's an interesting definition, but it has a bug that I can't immediately
see how to resolve. Conceptually, the bug has to do with neglecting the
internal disorder of the component subsystems.

As an illustration of this conceptual bug, consider a crystal built by
assembling a vast number of micro-crystals. If I do it right, the entropy
of the whole is much less than the sum of the entropy of the parts (because
knowing the microstate in one part lets you make predictions about the
others). Similar remarks apply to a stack of 10^20 brand-new unshuffled
decks of cards. AFAICT the suggested definition of disorder yields a
strongly negative value in such cases; I consider this a bug.
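Here is a toy version of that point (my own two-coin illustration, not
anything from the original post): two perfectly correlated subsystems
each look maximally "disordered" on their own, yet the joint system has
only half the total entropy.

    import math
    from collections import Counter

    def entropy_bits(p):
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    # Two coins that are forced to agree: outcomes (H,H) and (T,T),
    # each with probability 1/2.
    joint = {('H', 'H'): 0.5, ('T', 'T'): 0.5}

    marginal_A, marginal_B = Counter(), Counter()
    for (a, b), p in joint.items():
        marginal_A[a] += p
        marginal_B[b] += p

    H_A = entropy_bits(marginal_A.values())   # 1 bit
    H_B = entropy_bits(marginal_B.values())   # 1 bit
    H_joint = entropy_bits(joint.values())    # 1 bit, not 2

    # Knowing one part lets you predict the other, so the entropy of
    # the whole (1 bit) is less than the sum of the parts (2 bits).
    print(H_A, H_B, H_joint)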

=====================

I'd like to thank Bob for discussing a difficult subject in clear and
thoughtful terms.