
Re: Max-entropy (cont)



Joel Rauber wrote:
....
My point is that statistical mechanics is intended to provide a basis for
thermodynamics and is statistical in nature!

Agreed. I guess that's why they call it 'statistical' mechanics.

E.g. Consider two finite systems A and B in thermal contact. In
principle one can write the entropy S as S(U_A) , where U_A is the internal
energy of A. Statistical mechanics predicts that there are fluctuations in
the energy of system A around the mean <U_A>. These fluctuations are
inherent in the macrostate. The equilibrium state of the system does not
just consist of the system staying at the maximum value of S(U_A); it
includes the fluctuations of the system around the mean energy.

We seem to have a disagreement about just what constitutes the microstate
and the macrostate. The way I see it, when the composite system is in
equilibrium the mean value <U_A> is a macroscopic parameter which is part of
the macrostate description of A. The exact instantaneous value of
U_A (or at least the difference between it and <U_A>) is part of a
*microscopic* description. Actually, for a truly quantum system A
interacting with another system B, an exact value for U_A at any instant of
time is not precisely determined due to the constraints of the uncertainty
principle (|DU_A|*|Dt| > h_bar/2). Here the microscopic dynamics of the system
A are not determined by just the Hamiltonian of A, H_A alone. The
interaction Hamiltonian involving the terms which couple the microscopic
degrees of freedom of A to those of B induces transitions between eigenstates
of H_A and allows A's energy to fluctuate. The quantum state of A cannot be
a stationary state, and its time dependence keeps it from being characterized
by a unique U_A value.
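
For concreteness, here is a small Python sketch of this effect (my own toy
model, with arbitrary made-up coupling constants: a single qubit A weakly
coupled to a single 'bath' qubit B). Because the interaction term fails to
commute with H_A, a state prepared in an eigenstate of H_A does not stay
there, and the instantaneous expectation <H_A> oscillates in time:

  import numpy as np
  from scipy.linalg import expm

  I2 = np.eye(2)
  sz = np.diag([1.0, -1.0])
  sx = np.array([[0.0, 1.0], [1.0, 0.0]])

  H_A   = np.kron(sz, I2)          # energy operator of subsystem A alone
  H_B   = 0.7 * np.kron(I2, sz)    # energy operator of subsystem B alone
  H_int = 0.2 * np.kron(sx, sx)    # weak A-B coupling (arbitrary strength)
  H = H_A + H_B + H_int

  # Prepare A in an eigenstate of H_A; the coupling makes it non-stationary.
  psi0 = np.kron([1.0, 0.0], [0.0, 1.0]).astype(complex)

  for t in np.linspace(0.0, 10.0, 6):
      psi_t = expm(-1j * H * t) @ psi0
      U_A = np.real(psi_t.conj() @ H_A @ psi_t)   # instantaneous <H_A>
      print(f"t = {t:5.2f}   <U_A> = {U_A:+.5f}")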

For this equilibrium case the system A has a probability distribution {p_r}
for its various microstates {r}. The macroscopic <U_A> quantity is
determined from both the {p_r} distribution and the spectrum of microscopic
energy values {E_r} of the Hamiltonian H_A; <U_A> = SUM_r{p_r*E_r}. The
precise values for the {p_r} distribution are determined by maximizing the
entropy of the composite A+B system over all possible allowed distributions
{p_r}. In the (normal) case that the interaction terms in the total A+B
Hamiltonian between the A & B degrees of freedom are negligible enough to
consider the thermal contact between A & B as 'weakly interacting', then the
result of this maximization is that:
p_r = exp((S_B(U_tot - E_r) - S_B(U_tot))/k)/Z, where Z is given by:
Z = SUM_r{exp((S_B(U_tot - E_r) - S_B(U_tot))/k)}, where U_tot is the fixed
total energy of the composite A+B system. Here the function S_B(U) is the
microcanonical entropy function of the *isolated* system B when it has energy
U. In the limit that the heat capacity of B becomes infinitely greater than
that of A, the system B reduces to a simple infinite heat reservoir
and the above {p_r} distribution reduces to the usual Boltzmann
distribution. Notice that in the limit of an infinite B system (for fixed
microstate energy E_r for system A) the expression:
(S_B(U_tot - E_r) - S_B(U_tot))/k --> -E_r/(k*T), where T is the temperature
of the reservoir B. In the finite B case the entropy of A is
S_A = -k*SUM_r{p_r*ln(p_r)} = k*ln(Z) + S_B(U_tot) - <S_B(U_tot - E_r)>, where
the expectation in the last term is over the {p_r} distribution.
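
To make the finite-bath formula concrete, here's a little Python sketch (my
own toy model, not anything from the literature). I model B with a constant
heat capacity C (in units of k), so S_B(U)/k = C*ln(U) up to an additive
constant that cancels in the difference, and give A four arbitrary levels.
At fixed bath temperature T = U_tot/(C*k), letting C grow should push the
finite-bath {p_r} toward the usual Boltzmann distribution:

  import numpy as np

  k, T = 1.0, 1.0
  E = np.array([0.0, 1.0, 2.0, 3.0])        # arbitrary microstate energies of A

  def p_finite_bath(E, C):
      U_tot = C * k * T                      # bath energy giving temperature T
      dS_over_k = C * np.log(1.0 - E / U_tot)  # (S_B(U_tot-E_r)-S_B(U_tot))/k
      w = np.exp(dS_over_k)
      return w / w.sum()                     # p_r, with Z = w.sum()

  for C in (5.0, 50.0, 500.0):
      p = p_finite_bath(E, C)
      S_A = -k * np.sum(p * np.log(p))       # S_A = -k*SUM_r{p_r*ln(p_r)}
      print(f"C = {C:5.0f}  p_r = {np.round(p, 4)}  S_A = {S_A:.4f}")

  p_B = np.exp(-E / (k * T)); p_B /= p_B.sum()
  print(f"Boltzmann  p_r = {np.round(p_B, 4)}")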

In any event, the entropy S_A of the system A in thermal contact with the
finite system B (with the A+B composite system being in equilibrium) is
*still* a constant since it is still determined by the constant (in time)
{p_r} distribution.

For finite systems, there are finite probabilities of large fluctuations,
although for "large" systems the probabilities of large fluctuations become
exceedingly small, to the point of being negligible in practice; that's the
reason I don't worry about all the air molecules in my office congregating
over in the corner. But one can apply statistical mechanics to "small"
systems and "medium" systems.

I think you mean *relatively* large and *relatively* small fluctuations here.
The absolute magnitude of the fluctuations of the floating (i.e. not rigidly
fixed by the boundary conditions) extensive macroscopic parameters (e.g. the
U_A parameter above) increases with size proportional to sqrt(N). Since the
mean values of these quantities are proportional to N we see that the
relative fluctuations decrease with size proportional to 1/sqrt(N) (where N
is a measure of the number of particles or of the number of microscopic
degrees of freedom).
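
This scaling is easy to check numerically. A sketch (my own illustration,
treating U as a sum of N independent per-particle energies drawn from an
arbitrarily chosen distribution):

  import numpy as np

  rng = np.random.default_rng(0)
  for N in (100, 10_000, 1_000_000):
      # 200 independent realizations of U = sum of N per-particle energies
      U = np.array([rng.exponential(1.0, N).sum() for _ in range(200)])
      print(f"N = {N:>9,}  <U> = {U.mean():12.1f}  dU = {U.std():8.1f}"
            f"  dU/<U> = {U.std()/U.mean():.5f}")

The dU column grows like sqrt(N) while dU/<U> falls like 1/sqrt(N), as
claimed.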

One can indeed apply stat mech to "small" and "medium" systems, but in such
a case one has to be extra careful about which quantities are considered
part of the definition of the macro-"like" state. Some of these parameters
are considered 'rigid': their values are assumed known and are fixed by
the boundary conditions/constraints of the system. Others of these
macro-like parameters are considered 'floating': their average values
are fixed, and those averages are taken as the experimental macro-values
of those parameters. The difference between the average of a floating
macro-like parameter and the exact instantaneous value of that parameter is
considered as a micro-like variable and is *not* part of the system's partial
description that we call the macrostate. All the other detailed parameters
(i.e. degrees of freedom of the system) are also considered as micro-
variables. The whole point of stat mech is to calculate the *averages* of
the floating macro-like parameters over the probability distribution of the
micro-variables. The specification of the collection of all of the
micro-like degrees of freedom determines the microstate of the system. But
knowledge of this state is assumed not accessible from the partial
macro-level description. Hence the need for the statistical averaging. The
entropy of the system is a function of the macro-level state (which
determines the micro-level distribution {p_r} which then determines the
entropy).

For a 'small' system it is easy to confuse variables that play a macro-type
role in the theory with those that play a micro-type role. For any size
system the entropy lives in the macro-level -- however that is defined.
After all, it is a measure of just how partial that macro-level description
actually is, since it is precisely the average amount of (inaccessible)
information needed to determine the exact microstate given just the
macro-level description.

...
Hard sphere ideal gas in a constant volume container which has been isolated
from its environment for a long time. Presumably it is in an equilibrium
macrostate given by the thermodynamic coordinates P,V and T (only two of
which need to be specified).

What is a hard sphere ideal gas? Hard sphere systems are decidedly *not*
ideal since they have a positive particle size. Maybe you are using the term
'ideal' to mean something other than a system obeying the ideal gas eqn. of
state. Maybe it means a hypothetical hard sphere gas model? If so, for a
classical hard sphere gas the p/T ratio depends only on the V/N ratio. Thus,
once the V/N ratio is measured in units of the cube of the hard sphere
radius, the equation of state for a classical hard sphere gas is
characterized by a single universal function of *one* argument rather than
the usual (species-specific) function of two arguments.
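
For a concrete picture of that universal one-argument function, here is a
sketch using the well-known Carnahan-Starling approximation for the
classical hard sphere fluid (an accurate empirical fit, which I'm using
purely for illustration):

  import numpy as np

  def Z_CS(eta):
      # Carnahan-Starling compressibility factor pV/(N*k*T) as a function of
      # the packing fraction eta = (pi/6)*(N/V)*sigma^3, sigma = sphere diameter
      return (1.0 + eta + eta**2 - eta**3) / (1.0 - eta)**3

  for eta in (0.0, 0.1, 0.2, 0.3, 0.4):
      print(f"eta = {eta:.1f}   pV/(NkT) = {Z_CS(eta):7.3f}")

Note that eta -> 0 recovers pV/(NkT) = 1, the ideal gas value, and that at
fixed eta the p/T ratio depends on nothing else about the gas.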

What are the statistical mechanics underpinnings of this state? Take
Pressure P.

Presumably the pressure is related to the impulse delivered to the walls of
the container when the spheres collide with the walls. I would maintain
that it is only the average value of this impulse that is constant in time;
and only if averaged over some suitably long period of time. Otherwise it
is a varying quantity when I look carefully at the microscopic details.

The definition of the pressure requires that it *is* an averaged quantity.
The microscopic impulses do not define a time dependent pressure function.
Actually, for a hard sphere system the instantaneous value is not well defined
mathematically since it represents a very discontinuous function whose value
is zero for all times except for a set of times of measure zero, and at those
times the value is infinite.

The pressure of any (fluid) system is the negative of the *statistical
average* of the partial derivative of each microstate's energy E_r with
respect to the system's volume over the microstate distribution, i.e.
p = - <dE_r/dV> = - SUM_r{p_r*dE_r/dV}. Being an average, the pressure does
not depend on time in equilibrium since the {p_r} distribution is time
independent in this case. If your pressure gauge is so sensitive and has such
a fast response time that it is actually registering fluctuations, then your
gauge is not registering the pressure; it is registering part of the
microscopics (and thus not the purely macroscopic quantity that the pressure
is supposed to be).
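
A small numerical check of the identity p = -<dE_r/dV> (my own example: one
particle in a cubical box, in units with hbar = m = k = 1). Since each level
scales as E_r ~ V^(-2/3), we have dE_r/dV = -(2/3)*E_r/V, and the statistical
average should match the thermodynamic pressure k*T*dln(Z)/dV computed by a
finite difference:

  import numpy as np

  def levels(V, nmax=30):
      # particle-in-a-box levels E = (pi^2/(2*L^2))*(nx^2+ny^2+nz^2), L = V^(1/3)
      L = V ** (1.0 / 3.0)
      n = np.arange(1, nmax + 1)
      nx, ny, nz = np.meshgrid(n, n, n, indexing="ij")
      return (np.pi**2 / (2.0 * L**2)) * (nx**2 + ny**2 + nz**2).ravel()

  T, V, dV = 50.0, 1.0, 1e-5
  E = levels(V)
  w = np.exp(-E / T)
  p_r = w / w.sum()                               # canonical {p_r} at temperature T
  p_avg = -np.sum(p_r * (-(2.0 / 3.0) * E / V))   # p = -<dE_r/dV>
  lnZ1 = np.log(w.sum())
  lnZ2 = np.log(np.exp(-levels(V + dV) / T).sum())
  p_thermo = T * (lnZ2 - lnZ1) / dV               # p = k*T*dln(Z)/dV
  print(p_avg, p_thermo)                          # the two should agree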

Entropy of mixing: Gas A and Gas B originally separated by a removable
partition. Remove the partition and wait a long time T_o.

Continue to observe the system. The 2nd law at the thermodynamic level says
that the gases will never unmix, or even partially unmix.

Not quite. The second law implies that the entropy of the system will rise
as the system re-equilibrates after the partition is removed until a new
equilibrium is established. The entropy of the final equilibrium situation
is constant and is objectively greater than that of the initial macrostate,
which forced all the particles of each species to occupy only its own side
of the container. Since so many new microstates become available once the
partition is removed, the entropy goes up. That doesn't say anything about
statistical fluctuations in the instantaneous microscopic allocation of
particles among the various regions of the container. In calculating
macroscopic quantities, such as the entropy, the macroscopic value is an
average over the microstate distribution {p_r}.
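
The standard count behind "so many new microstates become available" (my own
numbers, chosen arbitrarily): when the partition is removed each particle's
accessible volume doubles, so each species gains k*ln(2) of entropy per
particle, regardless of any later fluctuations:

  import numpy as np

  k = 1.380649e-23            # Boltzmann constant, J/K
  N_A = N_B = 1.0e22          # particles of each species (arbitrary choice)
  dS = k * (N_A + N_B) * np.log(2.0)   # each particle's volume doubles
  print(f"entropy of mixing = {dS:.3e} J/K")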

Looked at in terms of the statistical situation, there is a finite
probability that they will unmix.

Agreed, but that doesn't make the entropy drop -- unless you include, as
part of the macrostate definition, a continuous monitoring of the location
(as to which side of the container) of each of the particles in the system.
For such a detailed macrostate specification, the system cannot ever reach
(time-independent) equilibrium since this weird macrostate will continue to
fluctuate indefinitely and never settle down once the partition is removed.
Needless to say, such a detailed intrusive microscopic monitoring of the
system violates the whole spirit of what a macrostate is supposed to be.

Reductio ad absurdum example:
Gas A is two molecules in 1/2 liter, Gas B is two molecules in 1/2 liter;
apply the procedure above; the probability is rather large that the system
may find itself in the unmixed state after T_o.

This is true. The entropy doesn't fluctuate though -- unless you choose a
macrostate description so restrictive that you actually cheat and gain
knowledge of the microscopics to the extent that you always monitor
which side of the system each molecule is in as part of the macrostate.
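
A quick enumeration for the four-molecule case (assuming each molecule is
equally likely to be found on either side, independently, and calling a
configuration 'unmixed' when both A molecules are on one side and both B
molecules on the other):

  from itertools import product

  count = 0
  for a1, a2, b1, b2 in product((0, 1), repeat=4):   # 0 = left, 1 = right
      if a1 == a2 and b1 == b2 and a1 != b1:
          count += 1
  print(count, "of 16 equally likely configurations ->", count / 16)  # 1/8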

For an example designed to upset Leigh, consider the case of a thoroughly
shuffled deck of 52 distinct cards. The entropy of the (sequence of cards in
the) deck is S = log_2(52!) = 225.6 bits if you do nothing to gain any
knowledge of the card sequence. As long as the cards are continually
shuffled without regard to the sequence then the entropy *stays* at 225.6
bits even though occasionally some interesting patterns may crop up in the
sequence for some shuffles. If you take note of a pattern of a particular
shuffle you are cheating and already have gained knowledge of the sequence
(i.e. the microstate). If you know the whole sequence then no information
about the sequence is missing, so the entropy in that case is zero -- since
your ignorance about the sequence is zero.
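
(For the record, the 225.6 figure is just log_2(52!), which a one-liner
confirms:)

  import math
  print(f"log2(52!) = {math.log2(math.factorial(52)):.1f} bits")   # 225.6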

Entropy is both subjective and objective. It is subjective in that it
measures one's own ignorance about the details of a given situation. In the
case of thermodynamics however, the conditions that determine that ignorance
are objectively defined once the macrostate is defined. Thus for a given
macrostate description the thermodynamic entropy is an objective function
of the macrostate.

David Bowman
dbowman@gtc.georgetown.ky.us