
Re: Entropy and states of matter (long)



Regarding John D.'s latest response:

I usually agree 100% with what David Bowman writes, but
this time I'd like to push back on a couple of points.

And I usually agree with John 100% except for a couple of points
mentioned below.

...
In the context of my statement:
(Classical thermo
requires us to count degrees of freedom, but doesn't
tell us how to do it.

He replied:
No it doesn't. Classical thermo doesn't even know about any
microscopic degrees of freedom in the first place--let alone count
them. Classical thermo is exclusively concerned with the
thermodynamic macrostate and the macroscopic quantities describing
it and their inter-relationships.

Call that proposition [B]. I disagree with it. First let
me get some experimental facts on the table, and interpret
them a little; then let's see how all this fits in with
what David is saying.

OK with me.

Let's look at the data for the adiabatic exponent, also
known as the ratio of specific heats, often denoted k or
gamma.

1) Classical thermodynamics makes an iron-clad prediction
that a monatomic gas should have gamma = 5/3.

No it doesn't. Classical thermodynamics is incapable of making any
ab initio predictions about what the specific heat is for any
substance. Classical thermo is completely uninformed about the
microscopics of any thermodynamic situation. What classical thermo
*can* and does do is relate the specific heat function (whatever it
happens to be) in a consistent manner to the other thermodynamics
functions for a given system. All these inter-related functions
describe the macro-level. Thus, classical thermodynamics places
certain mutual consistency constraints on the behavior of the
various thermodynamics functions, but it doesn't predict the actual
values of any of them.

Classical *statistical mechanics* predicts that the specific
heat ratio for a monatomic *ideal gas* is *5/3*. The level of the
model is that of tiny mutually negligibly-interacting point masses
that obey classical mechanics.
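The counting behind that prediction can be sketched in a couple of lines
(my own illustration, not something from the thread): equipartition gives
each classical quadratic degree of freedom (1/2)*k*T, so Cv = (f/2)*R,
Cp = Cv + R, and gamma = (f + 2)/f.

```python
# Equipartition sketch: each classical quadratic degree of freedom
# contributes (1/2)*k*T to the energy, so Cv = (f/2)*R, Cp = Cv + R,
# and the specific heat ratio is gamma = Cp/Cv = (f + 2)/f.

def gamma(f):
    """Specific heat ratio for f classical quadratic degrees of freedom."""
    return (f + 2) / f

# Monatomic ideal gas: 3 translational degrees of freedom.
print(gamma(3))   # 5/3 ~ 1.667
```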

The observed
data is pretty close to this value, but there are measurable
deviations. See the tables in:
http://www.physicsofmatter.com/Book/Chapters/Chapter5/5.html

Not all gases are ideal. If the interparticle interactions are
significant enough for a sufficiently dense gas, the potential
energy associated with those interactions will make a
significant contribution to the Hamiltonian, and the statistical
expectation of that contribution will add to the contribution from
the kinetic energy when the specific heats are calculated.

2) Similarly, classical thermo makes an iron-clad prediction
that a diatomic gas should have gamma = 7/5. It's different
because the diatomic gas has rotational "degrees of freedom"
that the monatomic gas lacks. Again, the data isn't too far
off, but it's definitely off, and there is a clear trend
toward lower gamma at higher temperature.

Again, no. Classical thermo makes no such prediction. But classical
stat mech predicts that an *ideal gas* of tiny rigid dumbbell rotators
(whose moment of inertia about the symmetry axis of the dumbbell is
zero) will have a [gamma] value of 7/5. Often it is convenient to
model a diatomic gas as such a system. If we make the gas rarified
enough we can make their interparticle interactions weak enough to
neglect so we can consider it an ideal gas. If the temperature is
high enough so that h-bar^2/(2*I) << k*T (where I is the molecules'
moment of inertia through the center of mass about an axis that is
perpendicular to the symmetry axis of the molecule) then the
particles will be behaving classically for their rotational degrees
of freedom. Typically by the time we get up to around room
temperature, this tends to be the case (H2 is the worst case in this
regard since it has such a low I value). Since the electrons in a
diatomic molecule have a negligible mass this means that the
molecule for all practical purposes has no moment of inertia about
the axis connecting the molecule's nuclei. If the chemical bond
that holds the molecule together is stiff enough so that at the same
temperature that allows the rotational degrees of freedom to behave
classically is still too low for thermal excitations to excite the
internal vibrational mode (i.e. k*T << h-bar*[omega] where omega is
the frequency of the internal vibrational mode) *then* the conditions
on the model are satisfied, and the classical stat mech prediction
for that model is borne out in the experimental data for that system.
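As a rough sanity check on the h-bar^2/(2*I) << k*T criterion, here is a
numerical estimate of the rotational temperature of H2. The bond length
and atomic mass are approximate textbook values that I am supplying for
the illustration:

```python
# Estimate the rotational temperature T_rot = h-bar^2/(2*I*k) for H2.
# Bond length (~0.741 Angstrom) and the hydrogen mass are approximate
# textbook inputs supplied for this sketch.

hbar = 1.0546e-34      # J*s
k    = 1.381e-23       # J/K
amu  = 1.6605e-27      # kg

r  = 0.741e-10               # H2 bond length, m (approximate)
mu = 0.5 * 1.0078 * amu      # reduced mass of the two hydrogen atoms
I  = mu * r**2               # moment of inertia through the c.o.m.

T_rot = hbar**2 / (2 * I * k)
print(T_rot)   # ~88 K -- below room temperature, so H2 rotation is
               # (barely) classical at 300 K, the "worst case" noted above
```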

But if our gas is not ideal (in that there are non-trivial
interactions between the gas particles), or if its molecules have a
significant degree of internal flexibility (so that they are not
really rigid), or if the temperature is so low that the quantization
of the rotational angular momenta of the rotators, or the quantum
statistics of the occupancy of the various single-particle states of
the identical particles becomes a relevant issue, then, again, there
will be discrepancies between the model prediction and the
experimental data.

3) You would think that a triatomic _linear_ molecule
like CO2 would behave just like a diatomic molecule, but
it doesn't. Gamma is markedly lower.

So it seems that a CO2 molecule is sufficiently flexible that under
normal room temperature-type conditions its bending modes are
significantly excited. This is not surprising. We would expect
bond bending modes to be less stiff and to have a lower frequency
than the bond stretching modes.
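The stiffness comparison can be made quantitative with the vibrational
temperature T_vib = h-bar*[omega]/k. The wavenumbers below are standard
spectroscopic values that I am supplying for illustration:

```python
# T_vib = h*c*nu_tilde/k for a mode of wavenumber nu_tilde (in cm^-1).
# The conversion factor h*c/k is approximately 1.4388 cm*K.

HC_OVER_K = 1.4388  # cm*K

def t_vib(wavenumber_cm):
    """Vibrational temperature (K) for a mode given in cm^-1."""
    return HC_OVER_K * wavenumber_cm

# CO2 bending mode (~667 cm^-1) vs. the N2 stretch (~2359 cm^-1):
print(t_vib(667))    # ~960 K: partially excited at room temperature
print(t_vib(2359))   # ~3400 K: frozen out at room temperature
```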

4) OK, you say, let's let the triatomic molecules have a
lower gamma. You think you've spotted a pattern? Consider
a pentatomic molecule, CH4. That should have a really low
gamma, right? Oops, it's larger than CO2. Closer to CO
than to CO2.

It all depends on the frequencies of the internal modes relative to
the temperature (assuming, of course, we are always keeping our gases
sufficiently rarified so as to be ideal). In the case of CH4 the
hydrogens are so light and the corresponding covalent bonds holding
them to the carbon are plenty stiff enough for them to have a pretty
high oscillation frequency and thus not be very excited at room
temperature.
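For reference, the frozen-vibration ("rigid molecule") predictions for
these gases follow from the same (f + 2)/f equipartition rule, where f
counts translational plus classically-behaving rotational degrees of
freedom (my own tabulation):

```python
# gamma = (f + 2)/f for rigid-molecule models with all vibrations frozen.

def gamma(f):
    return (f + 2) / f

models = {
    "CO  (rigid linear,    f = 3 + 2)": gamma(5),  # 7/5 = 1.400
    "CO2 (rigid linear,    f = 3 + 2)": gamma(5),  # 7/5 = 1.400
    "CH4 (rigid nonlinear, f = 3 + 3)": gamma(6),  # 8/6 ~ 1.333
}
for name, g in models.items():
    print(f"{name}: {g:.3f}")

# Measured CO2 falls below its rigid value because the soft bending
# modes are partially excited; CH4's stiff C-H modes stay frozen, so
# it sits near its rigid-model value.
```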

IMHO there is no way you can reconcile this data with
a classical notion of 1/2 kT per degree of freedom.
Even if you do a shameless smoke-and-mirrors routine
to figure out _a posteriori_ how many "degrees of
freedom" there must have been, you can't really make
it work.

Clearly, if some mode is behaving non-classically, then we ought not
expect it to contribute its full classical allotment to the energy
and the specific heat. OTOH, if a mode *is* behaving classically
because k*T >> h-bar*[omega], then we do expect that the classical
results will obtain. Of course care still has to be taken when
heating up a mode enough so that it is a classical one. Often when a
vibrational mode is so hot that it is effectively classical the mode
is *so* excited that the restoring force no longer obeys Hooke's law
and the oscillator is *nonlinear* (rather than a SHO), and can easily
be so excited that there is a significant probability that the
'spring' breaks and the system dissociates. Remember the classical
(1/2)*k*T prediction for both the KE and the PE is for a SHO. If
the PE is for a nonlinear force then the classical predicted energy
for it is correspondingly different.
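That last point can be checked numerically. For a power-law potential
V(x) = a*|x|^n the classical virial/equipartition theorem gives
<V> = k*T/n, which reduces to the SHO value (1/2)*k*T only for n = 2.
A quick average over the Boltzmann distribution (my own check):

```python
import math

def mean_pe(n, kT=1.0, a=1.0, xmax=20.0, steps=40001):
    """Average V(x) = a*|x|^n over the Boltzmann weight exp(-V/kT),
    by a simple rectangle-rule sum (the grid spacing cancels)."""
    dx = 2 * xmax / (steps - 1)
    num = den = 0.0
    for i in range(steps):
        x = -xmax + i * dx
        v = a * abs(x) ** n
        w = math.exp(-v / kT)
        num += v * w
        den += w
    return num / den

print(mean_pe(2))   # ~0.50 = (1/2)*k*T  (SHO)
print(mean_pe(4))   # ~0.25 = (1/4)*k*T  (quartic 'spring')
```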

I had this data very much in mind when I wrote:
Classical thermo
requires us to count degrees of freedom, but doesn't
tell us how to do it.

My point is still that classical thermo has nothing to say on the
matter at all because it is not concerned with any of the
microscopics. It is in the domain of stat mech to count the
states--whether they be described classically or according to QM.

...

I don't know enough about liquid water to respond specifically
to the "freezing" example. So instead
let's consider sublimation of ice directly to water
vapor. This is not particularly unusual under terrestrial
conditions, especially in the winter. ISTM the H2O molecules
are free to rotate in the vapor, and not free in the solid,
so this is a better illustration of my notion that we can
have a first-order phase transition where the kinetic energy
per molecule is not constant.

(What does ISTM mean?)

I just remembered that treating water-ice is a weird case. In ice
the molecules cease to exist *as* H2O molecules. The ice crystal
is a regular lattice of oxygen atoms with each of the hydrogens stuck
in one of two inequivalent interstitial sites along the line segment
connecting a pair of nearest neighbor oxygens. Since the oxygen
lattice has twice as many nearest neighbor pairs as it does
lattice sites, there is exactly one hydrogen between each pair of
oxygens. The oxygen atom closest to a given hydrogen may be
modeled as being chemically bound to it, and the farther oxygen
on the other side of the hydrogen may be thought of as being
hydrogen bonded to it.

Since the molecules themselves don't exist in ice it is kind of
unfair to consider their kinetic energy in the solid. In this
weird case of ice, it is better to treat each oxygen and each
hydrogen separately, and to remember that the hydrogens are quite
light and the bond connecting each one to its nearest oxygen is so
strong that the mode is probably *not* classical at the melting
point of ice.

If we considered the vaporization (from the liquid) and condensation
of water rather than its freezing to ice, I believe the molecular
kinetic energy on both sides of the transition is the same when the
transition occurs at a constant temperature.

But suppose we considered a much simpler system like Krypton (that
doesn't have all the complications of water). In this case whether
it freezes, melts, vaporizes, condenses, sublimes, or deposits, its
kinetic energy is constant on both sides of these transitions and the
latent heat of the transitions is only due to changes in the
potential energies of the particle interactions.

And regarding diatomic molecular freezing where the frozen lattice is
still made up of molecules whose atoms are chemically bonded to each
other, and the molecules are held in the lattice by Van der Waals
bonds (say maybe nitrogen), then just because the molecules in the
lattice are not free to rotate unhindered, and instead may act like
a collection of torsion pendula, that does not necessarily mean that
there is a jump in the molecular kinetic energy across the
transition. It is not a question of whether or not the motion
(either vibrational or rotational) is hindered, but whether or not
the motion can still be described as *classical* at the temperature
of interest. There will *not* be such a jump as long as the rocking
modes have a sufficiently low frequency that those modes behave
*classically*. This requires that k*T >> h-bar*[omega].
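How close a single mode of frequency [omega] comes to its full classical
heat-capacity allotment can be read off from the Einstein formula; the
ratio below goes to 1 when k*T >> h-bar*[omega] and to 0 when the mode
freezes out (a generic sketch, not specific to nitrogen):

```python
import math

def classical_fraction(x):
    """C_mode/k for a harmonic mode, with x = h-bar*omega/(k*T).
    Einstein result: x^2 * e^x / (e^x - 1)^2, which -> 1 as x -> 0."""
    if x == 0:
        return 1.0
    return x**2 * math.exp(x) / (math.exp(x) - 1.0)**2

print(classical_fraction(0.1))   # ~0.999: classical (k*T >> h-bar*omega)
print(classical_fraction(10.0))  # ~0.005: frozen out (k*T << h-bar*omega)
```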

Now I don't know offhand what the freezing point of nitrogen is,
but it must be quite low. If it is *so* low that the above
inequality is violated then these van der Waals modes may not be
sufficiently excited to behave classically. In that case I would
agree with you that the molecular kinetic energy would not be
expected to remain constant across the transition.

I guess, in hindsight, I should have prefaced my remarks with the
proviso that they were predicated on the assumption that whenever the
1st order transition occurs at a temperature that is hot enough so
that the molecular motions remain classical, and *also* that the
integrity of the molecules remains intact across the
transition, *then* for that transition, the kinetic energy of the
molecules is constant, and that all the latent heat is due to changes
in the potential energy of the particles.

...
That's a fine theoretical argument, but I don't buy it.
In particular, consider the !!careful!! wording of proposition
[A]. There is a requirement that the DoF in question be
classical. That's crucial!

Yes it is.

And I strongly suspect that the rotational DoF in the water vapor
is classical,

Undoubtedly.

while the
torsion-pendulum DoF in the ice crystal is not.

I think it probably is warm enough to be classical, but the problem
is, now that I think about it some more, the molecules themselves
no longer exist, and there is no nice neat distinction between the
contributions of the whole "molecules" and the contributions of
their supposedly stiffer "internal" motions.

That is,
I suspect that the hydrogen bonding is so strong that the
torsion-pendulum frequency is not small enough compared
to kT/hbar.

Now that I think about it, the hydrogen bonding is so strong as to
violate the integrity of the molecules themselves.

Even if I'm wrong about this particular example,
I'm sure I can (with a little work) come up with another
example that indubitably illustrates my point.

This is probably the case.

The point remains that classical thermo requires us to
count degrees of freedom, but doesn't tell us how to do
it.

And my point remains that this is *not* the domain of classical
thermo. What you keep calling 'classical thermo' is really classical
stat mech.

In particular, we might agree on a set of possible
modes, but classical arguments won't tell us which of
the possible modes are "classical" and which are not.

Agreed. The criterion for classicality (is that a word?) does
involve h-bar.

...
We agree on the physics. Perhaps we disagree as to how
mundane this freeze-out is. I claim it happens quite commonly.
The gamma of CH4 seems like pretty good support for this
claim.

The quantum-freezing of modes is, in hindsight, maybe more common
than I first thought. I agree the hydrogens in CH4 are probably not
acting classically at room temperature. My point about the
1st order transitions was concerned with the external relative
reconfiguration of the particles w.r.t. each other rather than with
what is happening with the presumably stiffer internal modes. If
those modes happen to be *so* stiff that they can be modeled as
perfectly rigid then they are not excited at all and the degrees
of freedom that describe them can safely be ignored. This is, for
instance, why we can ignore the nucleon/quark degrees of freedom in
our examples (and even ignore the electron degrees of freedom, too,
when we don't have to worry about their macroscopic delocalization).

...
Really? I thought that was exactly how one analyzed
the Joule-Thomson expansion.

We can't restrict thermodynamics to cases where everything
is always _exactly_ in equilibrium; otherwise there would
never be any heat flow from anywhere to anywhere. The
usual requirement is that things be close enough to
equilibrium that you can transfer energy without "too
much" entropy production, where "too much" is relative
to some relevant scale of the overall problem.

True. But the definition of temperature and the meaning of
quasi-static are concepts that involve a certain amount of asymptotic
idealization that may be hard to perfectly achieve in practice.
That is not necessarily a problem with such definitions--as long as
we can achieve a decent approximation to the ideal situations when we
need to.

...

"Known" is not the same as "monitored" or "observed". In
particular, consider the spin-echo experiment. If I know
what pulse-sequence to apply, I can turn what otherwise
would have been a high-entropy description into a low-
entropy description.

True.

What we mean by macro-work is a macro-level change in the system's
energy because of a change in one or more macro-level parameters in
the system's Hamiltonian (e.g. a change in the system's volume,
shape, magnetic moment, etc.). If the system's energy changes
by a noticeably macroscopic amount, and yet all of such energy-
changing macro-parameters are held rigidly constant, then the energy
change is due to heat.

How do you apply that to the spin-echo case?
You can't tell what's thermal and what's not
if I don't tell you what pulse-sequence to try.

True. In the spin-echo situation the particular pulse-sequence is
supposed to be part of the macro-description if you want to be able
to reconstruct the initial macro spin state. If it is not part of that
description, then things that would not be thermal with that extra
information *would* be considered thermal in its absence.

If the system's 'macrostate' is exactly defined in terms of its actual
microstate (as in the Szilard engine) then there is no randomness to
average the generalized force over when tweaking the system's
macrostate, and the supposed (statistically calculable) macro-work
expression just boils down to the ordinary mechanical work.

The Szilard engine is quite a bit more interesting than
that, because it has a heat reservoir whose macrostate
is, by hypothesis, not known. The one-particle "working
fluid" is used to extract energy from the heat reservoir.

Ok. I didn't catch in your bringing up of this example that you
wanted to insist that there be a relevant heat bath. That *does*
change things. I thought you only brought up the example because it
was supposedly 'microscopic', not because it was both that *and*
thermal.

... in this case the state of the engine is not
written in terms of a temperature *either*.

The state of the heat reservoir is most certainly
written in terms of its temperature.

That system is just not a thermodynamic system.

Completely disagree.

What I said above was not a thermodynamic system was the isolated
engine of one molecule whose particle trajectory was known. If you
want the heat bath to influence the behavior of the water molecule,
then I *agree* with you that it *is* a thermal situation--albeit one
that is just not in the thermodynamic limit of a quasi-infinite
number of degrees of freedom.

A good fraction of what I know
about entropy I learned by thinking about the thermodynamics
of Szilard engines.

It's been quite a while since I had looked at them. In particular, I
had forgotten that the heat bath is crucial to Szilard's analysis--
and apparently to your reason for bringing up the example.

In particular, there's a paper by
Zurek which lays out the only really-convincing argument
I've ever seen that Shannon entropy really is the same
as Carnot entropy ... and it is based in large part on
an analysis of a Szilard engine.

I've never had any problem being convinced of this. Come to think of
it, it seems to me that the name Shannon entropy may be somewhat of a
misnomer in physics. Since Szilard was the first to relate the
Boltzmann/Gibbs entropy to *information*, we probably ought to be
calling the info-theoretic version of entropy the Szilard entropy.
Shannon's main contribution was to apply this info-theoretic function
to problems in communication theory (but he did derive it
independently).

The main influence on my thinking in this matter has been the work of
E.T. Jaynes.

(Another large part
goes back to the physics-of-computation ideas of
Landauer and Bennett).

This field certainly *does* depend crucially on understanding that
thermodynamic entropy is really info-theoretic at its heart.

...
But my point remains that classical arguments won't tell
you the limits of their own validity.

Of course. Newton's physics doesn't have h-bar in it.

One would think
that the gamma of CH4 is a perfectly well-posed classical
question, but if you try to answer it in classical terms
you've got three strikes against you before you even come
to the plate.

I don't expect *any* model to contain within itself the limits
of its own validity. That goes for whether or not the model
is classical or quantum. Ultimately the arbiter of the
validity of any model of nature is nature itself--not the
model.

To repeat: There's a very great deal of what David is
saying that I agree with, and it seems ungracious to
focus on the places where I have doubts ... but I didn't
want to clutter 700 mailboxes by quoting a long list
of agreed-upon points......

Thank Goodness for that. These posts are already much too long.

I think our disagreements now hinge on at most only two relatively
minor points. First I claim that when John uses the phrase
"classical thermo" he really means "classical stat mech". And
secondly, I claim that just because some mode is not free, such as a
rotor that is prevented by steric forces from freely rotating and acts
as a torsion pendulum, or a particle in a weakly bound lattice or
liquid that is prevented by neighboring Van der Waals forces from
freely wandering across the lattice or liquid, that does not mean that
(in
thermal equilibrium) any degrees of freedom are "lost" or that the
kinetic energy of the mode doesn't obey the equipartition theorem
rule of contributing (1/2)*k*T to the energy. The only thing that
would prevent this is if the temperature is so low that the mode's
discrete quantum energy levels are just not sufficiently excited for
the mode to act effectively classically. As long as the mode *is*
classical, then the presence or absence of any potential energy is
irrelevant to the contribution that the kinetic energy itself makes.
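This last claim can be verified directly: because the Boltzmann factor
factorizes into momentum and coordinate pieces, the average kinetic
energy comes out to (1/2)*k*T per momentum no matter how strong the
hindering potential is. A brute-force phase-space average for a
hindered rotor with V(theta) = V0*(1 - cos theta) (my toy model, not
anyone's specific system):

```python
import math

def mean_ke(V0, kT=1.0, I=1.0, n_p=1001, n_th=361, pmax=10.0):
    """<p^2/(2I)> over the full phase-space Boltzmann weight
    exp(-[p^2/(2I) + V0*(1 - cos th)]/kT), by a 2-D rectangle-rule sum."""
    dp = 2 * pmax / (n_p - 1)
    dth = 2 * math.pi / (n_th - 1)
    # Precompute the coordinate-space weights once per call.
    th_w = [math.exp(-V0 * (1 - math.cos(-math.pi + j * dth)) / kT)
            for j in range(n_th)]
    num = den = 0.0
    for i in range(n_p):
        p = -pmax + i * dp
        ke = p * p / (2 * I)
        wp = math.exp(-ke / kT)
        for w in th_w:
            num += ke * wp * w
            den += wp * w
    return num / den

# Free rotor and strongly hindered rotor give the same <KE> = kT/2,
# because the potential-energy weights cancel out of the ratio:
print(mean_ke(0.0))    # ~0.5
print(mean_ke(10.0))   # ~0.5
```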

David Bowman