
Re: Entropy (statistical +- thermal)



At 10:15 PM 2/11/00 -0800, Bernard G. Cleyet wrote:

I find
the statistical nature of entropy very easy to understand;
I was introduced to it in my first-year university
physics course. The alternative is the mathematical
concept in thermodynamics, which I learned in my first-year
university chemistry course.

That's an interesting way to describe the contrast.

Both have their place.

Yes.

Entropy is, of course, best understood through Boltzmann's famous
equation, the one engraved on his gravestone: S = k log W, i.e.
entropy = Boltzmann's constant times the log of the number of states.

That's certainly how I understand it.
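
Spelled out, with W standing for the number of microstates and k for
Boltzmann's constant, the gravestone formula reads:

    S = k \log W, \qquad k \approx 1.38 \times 10^{-23}\ \mathrm{J/K}

In the counting examples below I'll drop the factor of k and just
count states.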

To relate
entropy to energy one needs to talk about energy states;
the energy with more states is the most likely, assuming
that states are equally likely.

Evidently this statement uses the terminology "state" = "microstate", not
"macrostate".

These states do not need to be quantum ones.

Indeed. Playing cards will do. Or coins; see below.

Much of Gibbs's work was based
on classical considerations, though his work on gases
and indistinguishability is one of the first precursors
of quantum mechanics. I personally believe the statistical
approach to entropy should be taught as early as possible.

Right.

A graphic demonstration is to bring in a box containing 100 coins. It's
even better to have coin-like objects that are colored white on one side
and black on the other. Start them out in the all-black microstate. Then
de-sort them by tapping the bottom of the box.

Observe that disorder increases. Observe that for all practical purposes,
this increase is irreversible. You could tap for a very long time before
the all-black microstate randomly reappears: even if a tap completely
re-randomized all 100 coins, the chance of landing on all-black would be
only 1 in 2^100, about 1 in 10^30.
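
For readers without a box of coins handy, here is a minimal simulation
sketch in Python. The assumption that one tap fully re-randomizes every
coin is mine; real taps are gentler, which only lengthens the wait:

    # Simulate the coin-box demo: start in the all-black microstate,
    # then "tap" by re-randomizing every coin, and watch the number of
    # black faces settle near 50 while all-black never returns.
    import random

    N = 100
    coins = [0] * N          # 0 = black side up, 1 = white side up
    for tap in range(20):
        coins = [random.randint(0, 1) for _ in coins]
        print(f"tap {tap + 1:2d}: {coins.count(0)} black faces")
    # Chance that a fully randomizing tap restores all-black:
    print(f"P(all black per tap) = {2.0 ** -100:.1e}")   # about 7.9e-31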

In this demonstration, temperature is irrelevant; the tapping is
non-thermal, and between taps you've got essentially a zero-temperature
situation, since the thermal energy kT at room temperature is so small
compared to the energy required to flip a coin. Entropy is defined
without reference to temperature.
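
A back-of-the-envelope check, in Python; the 10 g coin mass and the
2 cm lift needed to flip it are my assumed round numbers, there only
to set the scale:

    # Compare thermal energy kT at room temperature to the gravitational
    # energy needed to flip a coin.
    k = 1.38e-23                  # Boltzmann's constant, J/K
    T = 300                       # room temperature, K
    m, g, h = 0.010, 9.8, 0.02    # assumed: 10 g coin lifted 2 cm
    print(f"kT    = {k * T:.1e} J")              # about 4.1e-21 J
    print(f"mgh   = {m * g * h:.1e} J")          # about 2.0e-03 J
    print(f"ratio = {m * g * h / (k * T):.0e}")  # about 5e+17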

You can take one of the random microstates and mark it by spraying
quick-dry red paint over the coins. Then tap some more. The all-red
microstate gets lost as surely as the all-black microstate did. The point is
that any particular random microstate is vanishingly unlikely; it is the
random macrostate that is likely.
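
For concreteness, here are the 100-coin numbers; this is straight
counting, nothing assumed:

    # Every individual microstate of 100 coins has probability 2**-100,
    # but the "about half black" macrostate contains so many microstates
    # that it is the one you actually see.
    from math import comb

    total = 2 ** 100                  # all microstates, about 1.3e30
    fifty_fifty = comb(100, 50)       # microstates with exactly 50 black
    print(f"one specific microstate: {1 / total:.1e}")            # 7.9e-31
    print(f"exactly 50 black:        {fifty_fifty / total:.3f}")  # 0.080
    near_half = sum(comb(100, i) for i in range(45, 56))
    print(f"45..55 black:            {near_half / total:.3f}")    # about 0.73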

Hint: It is more practical to keep a second set of red/black/white coins
in a separate box. That way you don't need to mess with real paint during
class, and you can use the same coins year after year. Just turn the
red/black/white coins to the all-red (micro)state before class, just as
you turn the black/white coins to all-black.

The difficulty students have derives mostly from their poor education
in probability theory.

That's certainly part of it.

A somewhat-related point is that statistics is often the first time
students face combinatorially large numbers like 54 factorial. These are
hard to visualize.
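
For what it's worth, the number itself is a one-liner in Python
(54 presumably being a 52-card deck plus two jokers):

    # 54! = number of distinct orderings of 54 cards.
    from math import factorial
    print(f"{factorial(54):.3e}")   # about 2.308e+71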

I think probability theory should be taught very early and more widely,
since it is the branch of mathematics most applicable to our daily lives.
People often make irrational decisions that affect us all, based on
misunderstandings of probability. How many people are afraid to get onto
a plane, but are willing to drive 80 mph in the rain to catch that plane
on time?

Amen, brother.