
Re: Entropy: quenched disorder is entropy



> However, if we are to adopt Leigh and David's pedagogical advice, then
> ENTROPY is a forbidden word until students hit a stat-mech course, which is
> often not until graduate school. ;-(

That is certainly not an impression I would like left here. Changes in the
entropy of a system are associated with changes in the state of the system.
These changes in entropy can be calculated in a well-defined manner,
similar to the manner in which one calculates the change in the energy
of a system on which one does work. There are important differences, of
course, none of which I wish to go into here. They are well covered in
elementary physics textbooks at first year university level or, I suppose,
AP Physics level. The calculations to which I refer are, of course,
classical. The idea of disorder being related to entropy is an almost
purely quantum mechanical concept. Introducing this concept as the first
to be associated with entropy is similar to introducing energy by telling
students that the energy of any macroscopic system is a weighted average
over the eigenvalues associated with the solutions to the Schrodinger
equation corresponding to that system's Hamiltonian. Neither description
will lead to a quantitative determination of the quantity in question by
a beginning student.

The association of entropy with disorder in the absence of mathematical
foundation is not scientific. It is not even stamp collecting in Lord
Rutherford's sense of that term. It is, however, a compelling false
concept because the referent, disorder, seems to be so easy to grasp. The
concept, even if made mathematical, would not really help the student to
understand what is meant by the physical concept. The association with
disorder is traditionally made later, as you point out, in statistical
mechanics*.

Explaining to a student how to calculate the entropy change associated
with a change in the state of a system is, in principle, no more difficult
than explaining how to calculate the energy change associated with the
same process. In practice the former *is* more difficult to accomplish
than the latter, but it can be done, and I succeed in getting a significant
fraction of an introductory class to understand it at an appropriate level
of mastery. (Chemists do it differently and also succeed.) My message is
that knowing in advance that "entropy is a measure of the degree of
disorder" does not advance their ease of understanding what entropy is by
the slightest amount, and it is my impression based on many years of
teaching that it is an idea they feel they can't abandon because they think
they understand it. IT IS A BARRIER TO CONCEPTUAL GRASP! (Excuse me, but
that felt good.)
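The kind of classical, reversible entropy-change calculation described above can be sketched briefly. This is a minimal illustration I am adding, assuming a textbook-style example (reversibly heating water with constant specific heat); the particular numbers are not from the original post.

```python
import math

def entropy_change(m, c, T1, T2):
    # Classical, reversible entropy change for heating a substance of
    # mass m (kg) with constant specific heat c (J/(kg K)) from T1 to
    # T2 (K):  dS = integral of dQ_rev / T = m * c * ln(T2 / T1)
    return m * c * math.log(T2 / T1)

# Example: warming 1 kg of water (c about 4186 J/(kg K)) from 293 K
# to 373 K gives an entropy change of roughly 1.0e3 J/K.
dS = entropy_change(1.0, 4186.0, 293.0, 373.0)
print(dS)
```

Nothing here requires statistical mechanics or "disorder"; the calculation depends only on the initial and final states, just as an energy calculation does.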

The business with the deck of cards is also a barrier, as John Denker's
dilemma will show. It is clear to me that I have not convinced him. John,
consider the process of rearranging a deck of cards from one sequence to
another. Can this be carried out in such an ideal manner that no
irreversible work is done and no heat flows? (Rhetorical question; the
answer is yes.) Then the entropy of the deck is unchanged by the process.
Did I specify any particular initial or final state? Then decks in all
sequences have the same entropy, all other
things being equal.
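The same conclusion can be put in statistical-mechanical terms. This is an illustrative sketch I am adding, not part of the classical argument above: with Boltzmann's S = k ln W, a deck in any one fully specified sequence is a single microstate, so every such sequence has the same entropy.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W):
    # S = k_B ln W, where W counts the microstates consistent with
    # what we know about the system.
    return K_B * math.log(W)

# Any one fully specified sequence is a single microstate (W = 1),
# so its entropy is the same -- zero -- whichever sequence it is:
S_sorted = boltzmann_entropy(1)
S_shuffled_but_known = boltzmann_entropy(1)

# A nonzero entropy appears only if the ordering is treated as
# unknown: W = 52! equally likely sequences.
S_unknown = K_B * math.lgamma(53)  # lgamma(53) = ln(52!)
```

A "disordered" sequence that is fully specified is no more entropic than a sorted one; only ignorance of the arrangement contributes.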

Leigh

* In my view there is no justification for putting statistical mechanics
off until graduate school. There exists a brilliant textbook in the
subject by Fred Reif published in 1964 which is accessible to the student
after an introductory physics course. We teach it in third year here at
Simon Fraser University and I believe that is common practice throughout
the English speaking world. (By the way, this book exists in a much less
expensive paperback edition. See http://bookshop.blackwell.co.uk/. Their
price is 27.99 pounds (12.7 kg?).) Statistical concepts, as pioneered by
Ludwig Boltzmann and Josiah Willard Gibbs (the greatest of all American
physicists), revolutionized our worldview at the turn of the last century.
The subject must be introduced at the same time as quantum ideas.

P.S. Daniel Schroeder's comment is entirely appropriate. The entropies
tabulated in chemistry books are numbers. Energy is a number too. Both
quantities are abstract, depend only upon the state of the system in
question, and, above all, are quantitative.