
Re: entropy



Joel Rauber asked:
<snip> ... I'd like to hear how some of you
give naive students an idea of what entropy is in a few classes without
using disorder as an analogy. In other words, what do you really do to
approach the subject or just some ideas. ....
I confess to always mentioning the idea of disorder; but I'm not so sure
(even after what's been said) that it's an entirely stupid thing to do ....
So let's hear what you guys do!

I'm not too sure you want to hear from me since I also mention disorder when
discussing entropy (for both lower and upper level classes). We finished up
the thermo unit 2 weeks ago in my algebra-based Gen. Physics I class. When
discussing entropy I told them its real definition and even illustrated my
points with Leigh's card deck example. I don't think the concept of "amount
of information" is all that difficult for lower level students to grasp.
After all, most of them are quite familiar with computers
and have heard of bits & bytes. The students have all had logarithms in high
school, and even though they may not be adept at algebraic manipulations
involving them, they mostly know what a logarithm is. The idea that an
N-digit decimal number can uniquely specify 1 of 10^N possibilities is not so
difficult to follow--after all, you just label each possibility with a number
(starting with 0). Similarly, most students can get the idea that the set of
N-digit binary numbers (N-bit strings) can uniquely specify 2^N possibilities.
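
If it helps, here is a minimal sketch of that labeling idea in Python
(just my own illustration -- the function name bits_to_label is nothing
standard, only a convenient label for the counting argument):

    import math

    def bits_to_label(m):
        # smallest N such that the 2**N distinct N-bit strings
        # are enough to label m possibilities
        return math.ceil(math.log2(m))

    # label each of 8 possibilities with a 3-bit string, starting from 0
    for i in range(8):
        print(format(i, "03b"), "-> possibility", i)

    print(bits_to_label(8))      # 3, since 2**3 = 8
    print(bits_to_label(1000))   # 10, since 2**10 = 1024 >= 1000

The same counting argument in base 10 gives the N-digit decimal case.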

Although I didn't try to mention it in my class this year, I think (maybe
hope is more honest) most of them could even grasp (with careful
explanation) Chaitin's algorithmic-complexity notion that the information
content of a state is the length of the shortest possible description of
(or set of instructions needed to reproduce) it. Thus the disorder in a
situation can easily be quantified as the minimal information necessary to
characterize that situation. I think this idea should be particularly easy
to explain to any student who has ever used a file compression utility
like PKZIP (a rough sketch of this point follows after this paragraph). I
think that as students become more computer literate it may become easier
to discuss at lower levels the information-theoretic topics that
previously were reserved for upper (& grad) level courses. I
don't see how what I said above can be much more difficult to follow than our
usual explanations of negative focal lengths and virtual images in geometric
optics, or the difference between the motion of the individual disturbances in
a wave and the overall motion of the wave as a whole. Come to think of it,
since there are no calculus or vector concepts embedded in the idea of
information, I suspect that information is even easier to comprehend than
instantaneous acceleration is (since it is notoriously quite difficult to get
a non-calculus student to understand a quantity that is fundamentally a second
time derivative of a vector quantity). I suspect if we try, we can get
students to understand entropy better than we can get them to understand
acceleration.
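
Here is the rough compression sketch I promised above. It uses Python's
zlib module simply as a stand-in for something like PKZIP, and the
particular strings are just my own example:

    import random
    import zlib

    ordered = b"0123456789" * 100     # 1000 characters in a simple repeating pattern
    scrambled = bytearray(ordered)
    random.shuffle(scrambled)         # the same 1000 characters, in a random order

    print(len(zlib.compress(ordered)))           # short: "repeat 0123456789 a hundred times"
    print(len(zlib.compress(bytes(scrambled))))  # several times longer: the ordering
                                                 # itself has to be spelled out

The compressed lengths are only an upper bound on the true (algorithmic)
information content, but the contrast between the two cases makes the point.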

So since Joel asked, I tell my students that the thermodynamic entropy of a
given (macroscopically specified) thermodynamic system is the average minimal
information needed to specify the exact microscopic state that system is in.
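
In the simplest case, where all W accessible microstates are equally
likely, that average minimal information is just log2(W) bits, and
Boltzmann's S = k ln W is the same count expressed in different units.
A small sketch using Leigh's card deck as the "system" (the numbers in
the comments are approximate):

    import math

    k_B = 1.380649e-23           # Boltzmann's constant, J/K

    W = math.factorial(52)       # number of orderings of a 52-card deck
    bits = math.log2(W)          # minimal information to single out one ordering
    S = k_B * math.log(W)        # the same count expressed as an entropy

    print(round(bits))           # about 226 bits
    print(S)                     # about 2.2e-21 J/K -- tiny on thermodynamic scales

The deck is of course only a toy system, but the same counting is what
the k ln W formula does for a real collection of molecules.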

David Bowman
dbowman@gtc.georgetown.ky.us