
Re: Entropy, Objectivity, and Timescales



This time Leigh says,

that in principle it *could have* reached any of the states which are
considered to be in the set of accessible states. It is not necessary
that the system reach even 10^-10 (or any small fraction) of them.

I'm sorry, I'm afraid I don't understand what is meant by "in principle
it could have...".

That is the criterion for attainment of what is called "thermodynamic
equilibrium". There is an identifiable time associated with any system's
tenure in a single microstate (a single cell in phase space). The time
required to reach the remotest cell in the phase space under a given set
of constraints is the equilibration time. Of course the system does not
have to attain that particular microstate; it is enough that it is, in
principle, accessible from the initial state.
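
(An aside: I will happily grant the counting here. A minimal sketch,
using assumed round numbers of my own choosing, an entropy of order
100 J/K and one microstate transition per picosecond:

import math

k_B = 1.380649e-23    # Boltzmann constant, J/K

# Assumed, illustrative numbers (my choices, not anyone's quoted figures):
S_system = 100.0      # entropy of a modest macroscopic system, J/K
tau = 1e-12           # assumed time spent per microstate, s
t_universe = 4.3e17   # rough age of the universe, s

# Accessible microstates: Omega = exp(S/k), expressed as a power of ten
log10_omega = S_system / (k_B * math.log(10.0))

# Generous upper bound on the number of microstates actually visited
visited = t_universe / tau

print(f"accessible states ~ 10^{log10_omega:.2e}")   # ~ 10^(3e24)
print(f"states visited    ~ {visited:.1e}")          # ~ 4e29
print(f"fraction visited  ~ 10^{math.log10(visited) - log10_omega:.2e}")

My quarrel below is not with this arithmetic but with what "in
principle accessible" is supposed to add to it.)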

I'm very sorry, I still don't understand the phrase "in principle".
Either the system reaches a state or it doesn't. I don't see any middle
ground, unless you want to introduce an observer and talk about what
information is available to the observer. But then entropy becomes
subjective.

You don't know (and you can't know) the exact microstate of any
physical system, and it is fundamentally impossible to predict even
its microstate after the next transition, let alone in the distant
future.

Not true. For a sufficiently small system I can know the precise
microstate, and I can predict its state into the future using the
Schrodinger equation (or Newton's laws for a classical system).
I don't know of any fundamental limitations for larger systems--
only practical limitations. Am I missing something?

Two things. Thermodynamics does not apply to small systems; it was
never meant to. The other is that the evolution of a small system
is not predicted by quantum mechanics. The billiard game in Mr.
Tompkins is a nice example of that.

1. Then please tell me exactly how large a system has to be before
we may talk about its entropy.

2. Excuse me? The time-dependent Schrodinger equation is fully
deterministic and allows one to predict a future state with precision,
provided that the initial state is also known with precision.
In any case, the issue of quantum indeterminacy surely is not
relevant to understanding entropy. (If you think it is relevant,
then please elaborate.)
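
To spell out the determinism I mean in point 2: assuming a
time-independent Hamiltonian H, the equation and its formal solution
can be written in one line,

  i\hbar \, \partial_t \psi = \hat{H}\psi
  \quad\Longrightarrow\quad
  \psi(t) = e^{-i\hat{H}t/\hbar} \, \psi(0)

so a precisely known psi(0) fixes psi(t) exactly; probabilities enter
only when a measurement is made.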

Let me approach this in a
different way. I have before me a thermodynamic system. I know enough
about it to be able to compute the entropy of this system. You, on
the other hand, know nothing about the system, and therefore you
cannot calculate its entropy. Do you not believe that the system has
a perfectly definite entropy even though you are ignorant of it?

No. What I call its entropy depends on exactly how much information
I have about it. The more information you give me about its state, the
smaller its entropy becomes. For instance, if you tell me that the
system is a mole of helium at standard temperature and pressure, I can
now calculate an entropy using this information. But now, if you
provide the further information that the helium is separated into
two chambers of equal volume with exactly half the atoms in each chamber,
I can calculate the entropy again and get a number that is very slightly
smaller. In practice this point is inconsequential because the two
entropies differ by only a tiny amount and you'll never give me enough
information about the state of the helium to reduce the entropy
significantly. But fundamentally, the entropy still depends on how
much information I have.
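
For concreteness, here is the arithmetic behind "very slightly
smaller". This is only a rough sketch: it uses the Sackur-Tetrode
formula for the unconstrained entropy, Stirling's approximation for
the counting, and my own reading of "standard temperature and
pressure" as 273.15 K and 101325 Pa.

import math

# Physical constants (SI)
k_B = 1.380649e-23                    # Boltzmann constant, J/K
h   = 6.62607015e-34                  # Planck constant, J*s
N_A = 6.02214076e23                   # Avogadro's number
m_He = 4.002602 * 1.66053906660e-27   # mass of one helium atom, kg

T = 273.15       # assumed "standard" temperature, K
P = 101325.0     # assumed "standard" pressure, Pa

N = N_A                  # one mole of atoms
V = N * k_B * T / P      # ideal-gas volume, m^3

# Sackur-Tetrode entropy of the unconstrained gas:
#   S = N k [ ln( V / (N lambda^3) ) + 5/2 ],  lambda = h / sqrt(2 pi m k T)
lam = h / math.sqrt(2.0 * math.pi * m_He * k_B * T)
S = N * k_B * (math.log(V / (N * lam ** 3)) + 2.5)

# Extra information: exactly N/2 atoms in each half of the volume.
# The number of left/right assignments drops from 2^N to C(N, N/2),
# and Stirling gives C(N, N/2) ~ 2^N * sqrt(2/(pi N)), so
#   dS = k ln( 2^N / C(N, N/2) ) ~ (k/2) ln( pi N / 2 )
dS = 0.5 * k_B * math.log(math.pi * N / 2.0)

print(f"unconstrained entropy:      {S:.1f} J/K")    # about 1.2e2 J/K
print(f"reduction from 50/50 split: {dS:.2e} J/K")   # about 4e-22 J/K

The reduction is real, but it is a few parts in 10^24 of the total,
which is exactly the sense in which the point is inconsequential in
practice.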

When you understand what is meant by the statement "The entropy of a
thermodynamic system is a function of its state" you will be well on
the road to understanding the concept. As it stands you now do not
believe that statement, and progress is very difficult in consequence.

It's not that I don't believe the statement. It's that I think the
word "state" is ambiguous and subjective. Who's to decide how much
information should go into specifying the "state"?

This morning I took an informal poll of a half dozen of my colleagues
and all but one initially agreed with you, saying entropy is objective.
With further discussion, however, I've already convinced one of them
that it's subjective after all (I'll work on the others later).
Then I e-mailed a higher authority, the professor who taught my
graduate-level statistical mechanics course (and who is generally
a very careful thinker whom I and others highly respect). He agreed
with me. My point is not that we should rely on authority, only
that intelligent people who have thought very carefully about entropy
are *not* in agreement over whether it is objective or subjective.
So let's be a little more polite to each other, ok?

-dan