
Re: entropy

Once again, we get 'ex-cathedra' pronouncements [mostly from north of the
border ;-) ] that dozens of texts and years of instruction are wrong--or
damaging--or useless. The problem here, as in a number of other cases, is
that the advice given may well be appropriate for the future physics majors
in our various courses, but not really helpful for the multitudes who
don't go on beyond the introductory courses. Entropy IS a word that shows
up in 'popular' scientific and technical contexts, and even these
introductory students need some (maybe vague) understanding of the term.
'A measure of disorder' seems a reasonable, low-level, admittedly
imprecise, but nonetheless useful definition--_for this clientele_--a
clientele that won't be taking stat mech, won't be doing graduate-level
thermo, but should not be left without any meaning for the term. My point
in quoting Kittel was that if a high-level stat-mech book can relate
entropy back to disorder (or randomness), then it doesn't seem to me
_all that bad_ to do so with introductory students. As I've suggested
before, the physics majors who come out of these intro courses should not
have too much trouble later on refining these earlier imprecise
definitions and concepts as their knowledge and skills in physics build
with later work. {If they do, then maybe they really shouldn't be physics
majors.}

Rick Tarara
----------
Date: Sat, 23 Nov 1996 21:49:41 -0800
From: palmer@sfu.ca (Leigh Palmer)

Donald's examples of how naive students understand order are
entirely correct. That is exactly why relating entropy to disorder
is useless when it is done to enlighten naive students. The practice
of the overwhelming majority of teachers is to parrot that "meaning"
anyway. It is evident to me that it is not at all useful to define
disorder as being proportional to k log omega when telling naive
students about entropy. That was a frivolous suggestion at best.
Is there anyone out there who has actually approached the task of
introducing entropy in that way?
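
[For readers joining the thread: the formula alluded to above is
Boltzmann's relation S = k log W, where W (often written omega) counts
the microstates consistent with a macrostate. A minimal sketch in
Python of how that relation connects entropy to "disorder" for a toy
system of N independent two-state spins; the system, names, and numbers
are illustrative assumptions, not anything proposed in the thread.

import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def multiplicity(n, n_up):
    # W (omega): number of microstates of n two-state spins
    # with exactly n_up of them "up"
    return math.comb(n, n_up)

def boltzmann_entropy(omega):
    # Boltzmann's relation: S = k ln(omega)
    return k * math.log(omega)

n = 100
for n_up in (0, 10, 25, 50):  # 50/50 is the most "disordered" split
    omega = multiplicity(n, n_up)
    print(f"n_up={n_up:3d}  W={omega:.3e}  S={boltzmann_entropy(omega):.3e} J/K")

W is largest, and S peaks, at the 50/50 split: the macrostate realized
by the most microstates is exactly the one an introductory student
would call the most disordered, which is the intuition under dispute.]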