Re: errata- Max entropy



While explaining his comments about negative-going entropy fluctuations,
Joel Rauber wrote:
I simply mean that one is using the formalism of the canonical ensemble to
analyse some physical system. In that context the physical system could be
viewed as a representative member of the ensemble.

I think you may be mixing things up a little here. The canonical ensemble
describes the statistics of a system in equilibrium with a surrounding
thermal reservoir held at a fixed temperature. The temperature of the
reservoir, not the system's internal energy, is the relevant parameter
defining the system's macrostate. This temperature has a fixed,
non-fluctuating value. The internal energy, OTOH, is subject to microscopic
fluctuations, and its instantaneous value is not subject to experimental
monitoring. It is true that as the system temporarily loses some internal
energy to its environment due to fluctuations, the number of microscopic
states accessible to the system consistent with this reduced energy is
temporarily decreased. But,
this is *not* the system's entropy. Since microscopic fluctuations are not
part of the macrostate specification, they do not influence the entropy,
which is a function of the macrostate (not the microstate). For this
system, described by the canonical ensemble, the entropy is *fixed* by the
Boltzmann distribution of microstates, which is the relevant microstate distribution for
the system. The entropy is the statistical function(al) of that microstate
distribution given by S = - k*SUM_r(p_r*ln(p_r)) where p_r is the probability
of the r-th microstate. For the Boltzmann distribution we have that
p_r = exp(-E_r/(k*T))/Z where Z = SUM_s(exp(-E_s/(k*T))) is the system's
canonical partition function. In this case the entropy is: S = k*ln(Z) + U/T
where U = SUM_s(p_s*E_s) = - dln(Z)/d[beta] is the system's mean internal
energy. In the above formula [beta] = 1/(k*T) by definition, and the
derivative is a partial derivative taken holding the system rigid (i.e. no
changes in quantities such as the volume which would allow work to be done).
We thus see that in equilibrium the entropy for the canonical ensemble is
determined by the partition function -- which is itself determined by the
environmental temperature and all the macroscopic parameters (e.g. volume)
whose variation would cause work to be done, but which are held rigid by the
boundary conditions. Since the entropy is a function of only fixed
parameters it does not fluctuate, but rather remains fixed (in equilibrium).
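
To make this concrete, here is a minimal numerical sketch (Python; the
three-level toy spectrum and the temperature below are arbitrary choices
invented purely for illustration) checking that the statistical form
S = - k*SUM_r(p_r*ln(p_r)) computed from the Boltzmann distribution agrees
with the thermodynamic form S = k*ln(Z) + U/T:

    import numpy as np

    k = 1.0                          # Boltzmann's constant in natural units (illustrative choice)
    T = 2.0                          # reservoir temperature (arbitrary choice)
    E = np.array([0.0, 1.0, 3.0])    # assumed toy energy levels E_r

    beta = 1.0 / (k * T)
    boltz = np.exp(-beta * E)
    Z = boltz.sum()                  # canonical partition function Z = SUM_s exp(-E_s/(k*T))
    p = boltz / Z                    # Boltzmann probabilities p_r
    U = np.sum(p * E)                # mean internal energy U = SUM_s p_s*E_s

    S_direct = -k * np.sum(p * np.log(p))   # S = -k*SUM_r p_r*ln(p_r)
    S_thermo = k * np.log(Z) + U / T        # S = k*ln(Z) + U/T

    print(S_direct, S_thermo)        # the two values coincide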

The quantity which corresponds to the entropy of an isolated system, i.e.
Boltzmann's constant times the logarithm of the number of microscopic
states accessible to the system at a fixed internal energy U, *does*
fluctuate in equilibrium for a system in thermal contact with a thermostat
(thermal reservoir), as the internal energy itself fluctuates, *but* this
quantity no longer represents the entropy of the system in this canonical case.
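
A small sketch of this distinction (Python; a toy system of N independent
two-level units, with the energy splitting, temperature, and N all invented
for illustration): sampling the instantaneous energy from the canonical
distribution makes k*ln(Omega(E)) jump around, while the canonical entropy
stays put.

    import numpy as np
    from math import comb, log

    rng = np.random.default_rng(0)
    k, T, eps, N = 1.0, 1.0, 1.0, 100        # natural units; toy parameters (assumptions)
    p_up = np.exp(-eps/(k*T)) / (1 + np.exp(-eps/(k*T)))  # excitation probability per unit

    # Fixed canonical entropy of N independent two-level units:
    S_fixed = -N * k * (p_up*np.log(p_up) + (1 - p_up)*np.log(1 - p_up))

    for _ in range(5):
        m = rng.binomial(N, p_up)            # instantaneous number of excited units
        lnOmega = log(comb(N, int(m)))       # Omega(E): microstates with energy m*eps
        print(m*eps, k*lnOmega, S_fixed)     # k*ln(Omega) fluctuates; S_fixed does not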

Do you prefer to compute an average value of entropy for a system?

Yes (where the average is taken according to the general definition of
entropy).

I have
trouble understanding the 2nd law statement that entropy never decreases
for a process, unless I can view it as a calculable quantity that evolves
in time.

The entropy is a calculable quantity, but it does not evolve in time in
equilibrium. By the time equilibrium obtains, the entropy has finished
evolving. The second law is for systems that are not yet equilibrated.

For non-equilibrium systems the entropy is *still* an average functional
value over the statistical distribution of microstates. In this case both
the microstate distribution and its corresponding entropy evolve in time in
such a way that the entropy never decreases (if the system is isolated;
otherwise the sum of the entropies of the system and its environment never
decreases).
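
One can illustrate this monotonic approach to equilibrium with a toy master
equation (Python; the 3-state transition matrix below is an invented
example, chosen to be doubly stochastic so that the uniform distribution is
stationary, a discrete stand-in for an isolated system):

    import numpy as np

    k = 1.0
    # Assumed toy dynamics: a doubly stochastic transition matrix on 3
    # microstates (rows and columns each sum to 1), so the uniform
    # distribution is stationary and the Gibbs entropy cannot decrease.
    W = np.array([[0.8, 0.1, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.1, 0.1, 0.8]])

    p = np.array([1.0, 0.0, 0.0])    # start far from equilibrium
    for t in range(6):
        q = p[p > 0]                 # drop zero-probability states (0*ln 0 -> 0)
        S = -k * np.sum(q * np.log(q))
        print(t, S)                  # S rises monotonically toward k*ln(3)
        p = W @ p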

Which I interpret to mean that I can calculate it, given enough
mathematical prowess, at some instant of time.

This is true (that the entropy is calculable at any time) even though the
entropy involves averaging over the microstate distribution.

Since I wished to view systems in the context of thermodynamic equilibrium,
I needed to view the system in terms of the canonical ensemble.

Why is that necessary? Isolated systems can come to equilibrium too.

...
On the one hand, you have the impressive edifice of Thermodynamics (recall
Einstein's quote that Thermodynamics is probably the one edifice that will
stand the test of time, unchanged). And this edifice has the 2nd law, hard,
fast, and immutable, engraved on tablets of stone: the entropy of a system
in equilibrium does not decrease.

The entropy of a system in equilibrium remains constant -- as do some other
macroscopic thermodynamic parameters. When a system is in equilibrium its
macrostate is time invariant.

On the other hand, following Boltzmann and Gibbs, we have the statistical
mechanical underpinnings of Thermodynamics, and there one finds the idea of
fluctuations of quantities, implying that the entropy can, over some
periods of time, decrease.

Fluctuations, yes. Of entropy, no.

This apparent(?) conundrum(?) I find alluring, and it is a factor in how I
came to be stuck in a career in physics.

It's not a conundrum if one thinks sufficiently carefully about the
situation.

And if it's not known to be in some specific eigenstate?

Then the entropy is positive.
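
This follows directly from S = - k*SUM_r(p_r*ln(p_r)): a state known
exactly has a single p_r equal to 1 (so S = 0), while any spread over two
or more states forces S > 0. A one-line check (Python; the 50/50 mixture is
just an arbitrary example):

    import numpy as np

    k = 1.0
    def S(probs):
        p = probs[probs > 0]             # drop zero-probability states (0*ln 0 -> 0)
        return -k * np.sum(p * np.log(p))

    print(S(np.array([1.0, 0.0])))       # definite state: S = 0
    print(S(np.array([0.5, 0.5])))       # 50/50 uncertainty: S = k*ln(2) > 0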

David Bowman
dbowman@gtc.georgetown.ky.us