
Explanation of def. 4 of Q (long)



Considering Carl M.'s comment:

I confess that I can't parse defn 4 ....

and Jim Green's rejoinder:

.... I can't make any sense of Def 4 either.

Let me attempt to explain this definition.

I wrote:

4. I'm for the standard definition of heat given in statistical
mechanics as the integral of the infinitesimal contributions to the
differential change in the macroscopic energy expectation due to a
change in the probability distribution for the system occupying its
various microscopic states. Such a change in the macroscopic energy
expectation will typically be associated with a change in the
system's entropy resulting from a change in the system's
distribution of microstates which are accessible to the system's
microscopic dynamics over the time interval that the changes occur.

First off, def. 4 (like def. 1) is a definition of the Q term in the
Q + W partition of the contributions to the change in the system's
internal energy. It is not related to defs. 2 or 3. The main
advantage of def. 4 over def. 1 is that it is defined even for
situations where the concept of temperature is not defined. It also
avoids the ambiguity problems that arise when various dissipative
processes are occurring (either internally or at the system's
boundaries).

This definition assumes that there is a distinction between a
system's macroscopic (macro) state and its microscopic (micro) state.
Only the macrostate is assumed accessible in principle to
experimental determination and to the physicist. OTOH the microstate
is taken to be known only to God. For each macrostate of the system
there is an enormous collection of microstates that are compatible
with that macrostate, in the sense that if the system is in any one
of those microstates it has the same macrostate within our ability
to resolve it experimentally at the macroscopic level.

Consider the set of all microstates that are both compatible with a
given macrostate and also are mutually accessible to the system's
microscopic dynamics over the experimentally relevant time scale.
The average minimal amount of further information necessary to
determine with certainty the system's exact microstate from this
set, given only the system's macroscopic description, is precisely
the system's entropy. In a sense we can think of the system's
entropy as being the average minimal number of yes-no questions
we would have to have God truthfully answer for us to be able to
identify with certainty which microscopic state the system is
actually in, given that we are only allowed to initially know the
system's macrostate. The conversion between bits and conventional
entropy units is 1 J/K = 1.0449388(18) x 10^23 bits.
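
To see where that conversion factor comes from: one bit of missing
information corresponds to k_B*ln(2) of conventional entropy, where
k_B is Boltzmann's constant. A minimal Python sketch of the
arithmetic:

  import math

  k_B = 1.380649e-23            # Boltzmann's constant in J/K
  # one bit of entropy is k_B*ln(2) J/K, so one J/K of entropy
  # corresponds to 1/(k_B*ln(2)) bits
  bits_per_JK = 1.0 / (k_B * math.log(2))
  print(bits_per_JK)            # ~1.0449 x 10^23 bits per J/K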

The probability distribution for which of the possible microstates
is actually the real microstate the system is in, given a particular
macroscopic description (macrostate) for the system, is found using
a Bayesian interpretation of those probabilities. Let r be an index
labeling each possible microstate. Let P_r be the probability
that the r-th microstate is the system's actual microstate. The
particular probability distribution {P_r} for all the possible
microstates is determined once the system's macrostate is
determined. The precise set {P_r} for a given macrostate is
found by maximizing the functional:

SUM{r, P_r*log(1/P_r)}

over the collection of all possible probability distributions over
those microstates (where all such microstates are both compatible
with the macrostate and mutually accessible to the system's internal
dynamics). The particular distribution which maximizes this
functional (subject to whichever constraints are necessary so that
the macrostate remains fixed at the actual macrostate) is taken *as*
the actual probability distribution for the system's microstate, and
the corresponding extremal value of the functional *is* the system's
actual entropy. If the logarithm is taken to base 2 the entropy is
in bits; if a natural (Naperian) logarithm is used and the result is
multiplied by Boltzmann's constant, the entropy is in J/K.
Thus the entropy is a function of the system's macrostate (as is the
whole probability distribution as well).
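
As a concrete illustration of this maximization, here is a small
Python sketch using an entirely hypothetical set of four energy
levels, with the mean energy held fixed as the macroscopic
constraint. (The maximizing distribution under such a constraint is
the Boltzmann form, P_r proportional to exp(-beta*E_r); the sketch
finds the appropriate beta by bisection and reports the extremal
value of the functional in bits.)

  import math

  E = [0.0, 1.0, 2.0, 3.0]   # hypothetical microstate energies (arbitrary units)
  U_target = 1.2             # hypothetical value of the fixed mean energy

  def mean_energy(beta):
      # mean energy under the Boltzmann distribution at inverse temperature beta
      w = [math.exp(-beta * e) for e in E]
      return sum(wi * e for wi, e in zip(w, E)) / sum(w)

  # mean_energy(beta) decreases monotonically with beta, so bisect
  # for the beta whose distribution meets the mean-energy constraint
  lo, hi = -50.0, 50.0
  for _ in range(100):
      mid = 0.5 * (lo + hi)
      lo, hi = (mid, hi) if mean_energy(mid) > U_target else (lo, mid)
  beta = 0.5 * (lo + hi)

  w = [math.exp(-beta * e) for e in E]
  P = [wi / sum(w) for wi in w]
  S_bits = sum(p * math.log2(1.0 / p) for p in P if p > 0.0)
  print(P)       # the maximum-entropy distribution {P_r}
  print(S_bits)  # the extremal value SUM{r, P_r*log2(1/P_r)}, in bits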

Now consider a macroscopic (i.e. thermodynamic) system which is in
some sort of thermal contact with its surroundings. Because of this
thermal contact many of the microscopic degrees of freedom of the
system interact weakly with the neighboring microscopic degrees of
freedom of the system's surroundings. It is possible for energy to
be exchanged between the system and its surroundings via these weak
interactions. If the system happens to be in equilibrium the average
of these microscopic energy exchanges is zero (within experimental
resolution) over the experimentally relevant time scale and the
system's macrostate will remain constant (up to the ability of
experimental resolution to determine it). If the system is not in
equilibrium then the system's macrostate will tend to vary with
macroscopic time in some way. Here macroscopic time means the time
scale of the experimental response of any equipment needed to
measure and determine the parameters of the macrostate. The
macrostate will be determined by the average of a set of relevant
macroscopically measurable quantities averaged over macroscopic
time and over the experimental resolution of the measuring equipment.
For instance, if we consider the pressure to be a macroscopically
relevant parameter for defining the macrostate, the system's pressure
will be taken to be the average of the impulse delivered by the system
to a fiducial boundary area due to microscopic collisions per unit
time per unit area where the relevant time and area scales are
determined by the apparatus (i.e. pressure gauge).

At any instant of time any temporary microscopic energy imbalance
w.r.t. the system's macroscopic average energy is a thermal
fluctuation of the system's energy. Indeed, any temporary
microscopic deviation in any macroscopically relevant (relevant in
defining the macrostate, that is) quantity from its macroscopic
average value is a thermal fluctuation of that particular quantity
for the system. In general thermal fluctuations are tiny
changes in the system's relevant macroscopic parameters that are
both too small in value and too short in duration to show up at
the macroscopically observable level.

Typically (except at a critical point) the magnitude of thermal
fluctuations (for some extensive quantity) scales in proportion to
the square root of the number of microscopic degrees of freedom in
the system. This makes the *relative* magnitude of the fluctuations
w.r.t. the average value of such a quantity scale in proportion to
1/sqrt of the number of microscopic degrees of freedom in the system.
Thus, typically, if we have a system with about 10^24 degrees of
freedom we expect that the magnitude of the thermal fluctuations in
the macroscopic parameters will show up in about the 12th significant
figure and will not be experimentally measurable. This means that
we can, in effect, theoretically calculate the experimentally
measurable macroscopic quantities relevant in determining the
macrostate by just calculating the average of the quantity of
interest over the probability distribution of microstates found
above (by maximizing the entropy functional).
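
A quick numerical check of this 1/sqrt(N) scaling, using a toy Monte
Carlo in which an extensive "total" is the sum of N invented,
independent microscopic contributions (not a model of any particular
physical system):

  import math, random

  random.seed(0)
  for N in (100, 10_000, 1_000_000):
      # sample the extensive total several times and compare its
      # relative spread with the predicted 1/sqrt(3*N) for uniform draws
      totals = [sum(random.random() for _ in range(N)) for _ in range(20)]
      mean = sum(totals) / len(totals)
      var = sum((t - mean) ** 2 for t in totals) / (len(totals) - 1)
      print(N, math.sqrt(var) / mean, 1.0 / math.sqrt(3 * N))

  # extrapolating the scaling to N ~ 10^24 degrees of freedom gives
  # relative fluctuations of order 1/sqrt(10^24) = 10^-12, i.e. the
  # 12th significant figure mentioned above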

The internal energy is in general considered a relevant macroscopic
quantity. Because of the arguments of the preceding paragraph,
we can calculate the macroscopic energy of the system by performing
the statistical average:

U = SUM{r, P_r*E_r}

where E_r denotes the precise internal energy of microstate r.
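
In code this statistical average is just a probability-weighted sum;
for instance, with made-up numbers for {P_r} and {E_r}:

  # a hypothetical microstate distribution and its energy levels
  P = [0.4, 0.3, 0.2, 0.1]
  E = [0.0, 1.0, 2.0, 3.0]
  U = sum(p_r * e_r for p_r, e_r in zip(P, E))   # U = SUM{r, P_r*E_r}
  print(U)   # 1.0 for these invented numbers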

Now let's assume the system undergoes some macroscopic process
that measurably changes the value of U. Perhaps some macroscopically
controllable parameters are varied in some way during the change
which provokes a corresponding change in U. We next imagine the
total finite change delta-U as being the integral of a sequence
of infinitesimal differential changes in U, i.e. dU. For
each of these tiny (yet still macroscopic) changes we have (courtesy
of the product rule of differential calculus):

dU = SUM{r, (dP_r)*E_r} + SUM{r, P_r*(dE_r)}

The first term is the contribution to the change in U due to
changes in the {P_r} distribution for a given set of microstate
energy levels (which are determined by the system's current
Hamiltonian using the particular values of the control parameters
that appear in the Hamiltonian at that stage of the process). The
second term is the change in U due to changes in the energies of
the individual microscopic states for the particular microstate
distribution in force at that part of the process. The total finite
energy change for the process is the integral sum of the individual
dU's over the sequence of the process as it unfolds. The integral
sum of the infinitesimal first-term contributions *is* the heat Q
for the process, and the integral sum of the infinitesimal 2nd-term
contributions *is* the (macro)work W done on the system during the
process. Thus, delta-U = Q + W.
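
To make the two terms concrete, here is a numerical sketch for a
hypothetical two-level system whose energy gap and inverse
temperature are both varied along an invented path; it accumulates
the two kinds of infinitesimal contributions separately and checks
that their integral sums reproduce delta-U:

  import math

  def probs(beta, levs):
      # canonical (maximum-entropy) probabilities for the given levels
      w = [math.exp(-beta * e) for e in levs]
      return [wi / sum(w) for wi in w]

  def levels(t):
      # invented control-parameter path: the two-level gap grows from 1 to 2
      return [0.0, 1.0 + t]

  def beta(t):
      # invented inverse-temperature path along the same process
      return 1.0 + 0.5 * t

  N = 10_000          # number of small steps in the discretized process
  Q = W = 0.0
  for i in range(N):
      t1, t2 = i / N, (i + 1) / N
      E1, E2 = levels(t1), levels(t2)
      P1, P2 = probs(beta(t1), E1), probs(beta(t2), E2)
      for e1, e2, p1, p2 in zip(E1, E2, P1, P2):
          Q += 0.5 * (e1 + e2) * (p2 - p1)   # the SUM{r, (dP_r)*E_r} piece
          W += 0.5 * (p1 + p2) * (e2 - e1)   # the SUM{r, P_r*(dE_r)} piece

  U0 = sum(p * e for p, e in zip(probs(beta(0.0), levels(0.0)), levels(0.0)))
  Uf = sum(p * e for p, e in zip(probs(beta(1.0), levels(1.0)), levels(1.0)))
  print(Q, W, Q + W, Uf - U0)   # Q + W matches delta-U

(Using midpoint values of E_r and P_r within each step makes Q + W
equal to delta-U exactly, step by step, up to rounding.)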

Since Q is the sum of all the infinitesimal first terms,
and since they are all due to changes in the probability
distribution for the system's microstate (for the current
macrostate at that stage for each infinitesimal contribution), we see
that the heat Q is (according to def. 4.) "the integral of the
infinitesimal contributions to the differential change in the
macroscopic energy expectation due to a change in the probability
distribution for the system occupying its various microscopic
states." Furthermore, since the entropy is also determined by the
{P_r} distribution, any change in that distribution will "typically
be associated with a change in the system's entropy resulting from a
change in the system's distribution of microstates which are
accessible to the system's microscopic dynamics over the time
interval that the changes occur."

I hope this exhaustive explanation helps clarify the meaning of
my def. 4.

David Bowman
David_Bowman@georgetowncollege.edu