
Re: [Phys-l] Entropy again



On 01/06/2012 06:30 AM, Savinainen Antti wrote:
It seems that there still is a lively discussion - even a debate? -
on how entropy should be interpreted or at least on how it should be
taught.

Yes. It's not really a physics discussion; it is an outright
religious dispute. It has been going on for more than 40 years:

FL Lambert
"Chaos, Entropy, and Original Sin"
Religion in Life (1967)

Frank L. Lambert
"THE ONTOLOGY OF EVIL"
Zygon, The Journal of Religion and Science (1968)


The principal advocates for the "spreading metaphor" don't publish
in the physics journals (on this topic or any other); they publish
more-or-less exclusively in the religious literature, the Journal
of Chemical Education, and a few other narrowly-specialized venues.
Check Google Scholar if you want to see for yourself.

People who publish in the scientific literature use the probabilistic
approach. Examples include Boltzmann, Einstein, Fermi, Shannon,
Feynman, Landauer, Bennett, Zurek, et cetera.

Switching from the "disorder metaphor" to the "spreading metaphor"
is jumping from the frying pan into the fire.

Leff has another paper on entropy which is online for free: [1]
<http://homepages.wmich.edu/~korista/phys3300/entropy_language-interpretation.pdf>.

The journal should have rejected that paper out of hand ... on multiple
grounds, starting with the failure to discuss the available evidence.
There is only one mention of the probabilistic approach to entropy,
which Leff dismisses in a single sentence:

It does not, however, use space, time, and energy in a
qualitatively useful way, and thus cannot replace the spreading
metaphor that is proposed herein as an interpretive tool. [2]

That sentence is just plain false, and even if it were true, it
would not be an adequate /explanation/ or an adequate /discussion/
of the issue. As for the physics: First of all, the definition
and/or explanation of entropy does not need to "use space, time and
energy". Secondly, in all cases where entropy is connected to energy,
temperature, et cetera, the probabilistic approach gets it right.
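For concreteness, here is a minimal sketch of the probabilistic definition being invoked here (my own illustration, in Python; the function name is mine): the entropy is S = -k Σ p_i ln p_i over the microstate probabilities, which reduces to k ln W when the W microstates are equally likely.

```python
import math

def prob_entropy(probs, k=1.0):
    """Entropy S = -k * sum(p * ln p) over microstate probabilities."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# For W equally likely microstates this reduces to k ln W:
W = 16
S = prob_entropy([1.0 / W] * W)
assert abs(S - math.log(W)) < 1e-12
```

Note that nothing in this definition mentions space, time, or energy; those enter only when the probabilities themselves depend on them.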

The canonical example of this is the entropy of the nuclear spins
in a sample of copper. The maximum molar entropy is R ln(4), as you
can calculate from first principles in less time than it takes to
talk about it. The entropy can be calculated using the probabilistic
approach, even in situations where the temperature is unknown or even
undefinable. OTOH in cases where the temperature is well behaved,
the probabilistic entropy agrees with classical expressions involving
entropy, temperature, and energy ... so the claims that the classical
entropy is somehow different from the information-theoretic entropy
are quite untenable.
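The R ln(4) figure really is a few-line calculation. As a sketch (mine, not part of the original post): both stable copper isotopes, Cu-63 and Cu-65, have nuclear spin I = 3/2, so each nucleus has 2I+1 = 4 spin states, equiprobable in zero field.

```python
import math

R = 8.314462618          # molar gas constant, J/(mol K)

I = 1.5                  # nuclear spin of Cu-63 and Cu-65
W = int(2 * I + 1)       # 4 spin states per nucleus
p = 1.0 / W              # equiprobable in zero magnetic field

# Molar entropy: S = -R * sum(p ln p) over the 4 states = R ln 4
S_molar = -R * W * p * math.log(p)
print(S_molar)           # about 11.5 J/(mol K), i.e. R ln(4)
```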

In zero magnetic field, the energy of this system is zero, so changes
in entropy simply cannot be described in terms of the "spreading" of
energy ... not spreading in space or any other kind of spreading.
This is important, because (a) it completely falsifies the Lambert /
Leff theory of entropy, and (b) it tells you that pedagogical examples
involving tossing coins and shuffling cards (which Lambert calls
"nonsense") are in fact correct --- qualitatively, quantitatively,
and pedagogically correct --- even though they do not involve energy.
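The coin and card examples are quantitative in exactly the same sense, with no energy anywhere in the problem. A quick sketch (my illustration, taking k = 1 so entropy is in nats):

```python
import math

# Entropy of N fair coin tosses: ln(2**N) = N ln 2
N = 100
S_coins = N * math.log(2)

# Entropy of a well-shuffled 52-card deck: ln(52!)
# math.lgamma(n + 1) computes ln(n!) without overflow.
S_deck = math.lgamma(53)

print(S_coins, S_deck)
```

Both numbers come straight out of counting equally likely arrangements; temperature and energy never appear.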

Returning to statement [2]: As a point of process, if you want to
have a *scientific* discussion, you need to account for all the
evidence, including evidence for AND AGAINST your pet theory. I
wish these guys would act more like scientists and less like lawyers.

There are so many other wrong statements in reference [1] that it's
not worth counting them, let alone discussing them all. Just for fun,
here's one more:

In real systems, the total energy is never exactly constant and
energy exchanges with the environment, even if weak, cause
continual changes in the system’s microstate.

That is /at best/ irrelevant. By mentioning it, the paper implies that
it is relevant, and the implication is wrong. It goes on to imply that
the stray coupling determines the rate of equilibration, which is also
quite wrong.

I have personally done experiments involving the entropy, energy, and
temperature of a chunk of copper, including the nuclear R ln(4) term.
It works. I have also personally used the probabilistic definition of
entropy to solve real-world problems in pattern recognition, electronic
chip design, communications, experiment design, cryptography, et cetera.
In one case, the work went from an insight about the measure-theoretic
foundations of probability theory to a product in the streets of Toledo
in 14 months.

If this approach didn't work, people would have noticed by now! I would
have noticed, my customers would have noticed, and a huuuuge number of
other people would have noticed.

The main pedagogical issue that affects the probabilistic approach is
that it requires students to have some clue about probability ... which
most of them don't.

However, this is an example of "praising with faint damns". Students
need to get a clue about probability, for this reason and many others.
So the strategy is simple: Teach them some probability, and then
explain entropy in terms of probability. This is the win/win strategy.

There is one side-issue that you ought to watch out for: Some students
may adhere to a religion that forbids gambling. This includes Islam,
LDS, and some Protestant denominations. (This is why it was so amazing
to see a Mormon offering a bet to a born-again Methodist hard-liner;
the $10,000.00 magnitude of the bet was the least of his problems.)
Anyway ... the reason I mention this is that some classroom examples of
entropy involve a deck of cards, which some students might find offensive.
As I understand it, "most" of these denominations don't consider it
gambling unless there is money at stake, but I'm not an expert and I'd
be willing to bXX [guess] that there are some folks who would be offended
by the cards _per se_. I've never heard of anybody having an actual
problem with this, but it's wise to be sensitive to the possibility. In
particular, when assigning homework, you should not assume that everybody
has a deck of cards at home.