
Re: Entropy, Objectivity, and Timescales



Dan quibbles:

OK, I'm going to quibble with this. I guess the question is what
you mean by understand. The way I put it to my students is this:
Clausius defined entropy to be the thing that changes by Q/T when
heat Q enters a system at temperature T. But Clausius never explained
what entropy actually *is*. It was Boltzmann who figured that out,
several years later: Entropy is the logarithm of the number of
accessible states.
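(As a minimal illustration of Boltzmann's definition, not anything from Dan's
text: the sketch below counts the accessible microstates of a hypothetical
two-level spin system and takes the logarithm, in units where k_B = 1.)

```python
import math

def boltzmann_entropy(num_states):
    """Entropy in units of k_B: S = ln(Omega), Omega = number of accessible states."""
    return math.log(num_states)

# Toy system: N independent two-state spins, of which n point "up".
# The number of accessible microstates is the binomial coefficient C(N, n).
N, n = 100, 50
omega = math.comb(N, n)
S = boltzmann_entropy(omega)

# For large N the entropy per spin approaches ln 2,
# since each spin has two equally accessible states.
print(S / N)  # close to ln 2 ~ 0.693
```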

You have (in the words of the late Bill Burke) committed the sin of
reification! (Note how well the vocabulary of religion fits this.)
You have treated an idea (a formula) as a thing. Entropy is not a
thing any more than energy is. It is an attribute of a system that
"has to be computed"; it does not have an independent existence. In
the words of Feynman, "there are no blocks". If you have not read
Chapter 4 in Volume 1, please read the first two pages. Then pay
your penance by attending the football game of your choice holding
a large lettered sign reading "Feynman I:4-1". That would be an
entirely appropriate act of contrition given the venial nature of
your sin.

If you're willing to settle for the "macroscopic" level of understanding
that Clausius had, then ok. But most people, I think, want to know in
more fundamental terms what entropy is and so are much happier with
Boltzmann's explanation, despite the fact that the word "accessible"
is difficult to define precisely.

Another error, as I explained in my last posting. Boltzmann's entropy
is *exactly* equivalent to Clausius's, where the word *exactly* is to
be interpreted in the physical sense, meaning that there is no
measurable difference between them.
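(A quick numerical sketch of that equivalence for one case I'm sure of, a
monatomic ideal gas heated at constant volume; the temperatures and the mole
of atoms are illustrative choices, not anything from the thread.)

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N = 6.02214076e23    # one mole of monatomic ideal gas atoms
C_V = 1.5 * N * k_B  # heat capacity at constant volume

T1, T2 = 300.0, 600.0

# Clausius: integrate dS = dQ/T with dQ = C_V dT at constant volume.
dS_clausius = C_V * math.log(T2 / T1)

# Statistical: the Sackur-Tetrode entropy depends on T only through
# (3/2) N k_B ln T, so heating at fixed V and N changes it by:
dS_statistical = 1.5 * N * k_B * math.log(T2 / T1)

print(dS_clausius, dS_statistical)  # identical, as claimed
```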

Of course quantum mechanics is still irrelevant to understanding
entropy (aside from the very minor point of determining the arbitrary
additive constant). I'm only quibbling with the word "macroscopic".

Oh, someone also mentioned that I'm writing my own book on thermal
physics. True indeed. But the reason I'm not satisfied with the
existing texts has nothing to do with their treatment (or nontreatment)
of this issue. I do prefer a statistical approach to entropy,
and I feel that the existing books that take this approach just
aren't very carefully written or well organized... Also none of them
really do justice to classical thermodynamics, which is an important
subject.

I confess that I haven't spent very much time with your book yet,
Dan. I do fear that its philosophical orientation will not
appeal to me. Going deeper into quantum mechanics than invoking the
quantization of phase space (the so-called uncertainty principle)
is probably not justified in such a text, because it is not
physically justifiable there.

I like Reif's book. When I read it I do so hearing his Austrian
accent in my head. It treats both classical and statistical thermo
in a satisfactory way, I think.

Leigh