OK, I'm going to quibble with this. I guess the question is what
you mean by "understand." The way I put it to my students is this:
Clausius defined entropy to be the thing that changes by Q/T when
heat Q enters a system at temperature T. But Clausius never explained
what entropy actually *is*. It was Boltzmann who figured that out,
several years later: Entropy is the logarithm of the number of
accessible states.
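To make Boltzmann's definition concrete, here's a small sketch (my own toy example, not from the post above): a system of N two-state spins, where the number of accessible microstates for a macrostate with n up-spins is the binomial coefficient C(N, n), and the entropy is S = k ln(Omega).

```python
import math

# Boltzmann's statistical definition: S = k * ln(Omega),
# where Omega is the number of accessible microstates.
# Toy model (an illustrative choice): N two-state spins,
# with n of them "up"; Omega = C(N, n).

k_B = 1.380649e-23  # Boltzmann constant, J/K

def multiplicity(N, n):
    """Number of microstates with n up-spins out of N spins."""
    return math.comb(N, n)

def entropy(N, n, k=k_B):
    """Boltzmann entropy S = k * ln(Omega)."""
    return k * math.log(multiplicity(N, n))

# A fully ordered macrostate has Omega = 1, so S = 0;
# entropy is largest for the most "mixed" macrostate, n = N/2.
N = 100
print(entropy(N, 0))                    # 0.0 (a single microstate)
print(entropy(N, 50) > entropy(N, 10))  # True
```

Note that the hard word "accessible" shows up here as the choice of which microstates to count; the code just assumes every arrangement with the given n is reachable.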
If you're willing to settle for the "macroscopic" level of understanding
that Clausius had, then ok. But most people, I think, want to know in
more fundamental terms what entropy is and so are much happier with
Boltzmann's explanation, despite the fact that the word "accessible"
is difficult to define precisely.
Of course quantum mechanics is still irrelevant to understanding
entropy (aside from the very minor point of determining the arbitrary
additive constant). I'm only quibbling with the word "macroscopic".
Oh, someone also mentioned that I'm writing my own book on thermal
physics. True indeed. But the reason I'm not satisfied with the
existing texts has nothing to do with their treatment (or nontreatment)
of this issue. I do prefer a statistical approach to entropy,
and I feel that the existing books that take this approach just
aren't very carefully written or well organized... Also none of them
really do justice to classical thermodynamics, which is an important
subject.