
Re: entropy



On Mon, 25 Nov 1996, Leigh Palmer wrote:
> That won't wash. The entropy of a system does not depend one tiny bit
> on "the INFORMATION we have about" it! If you believe that it does,

I would have expected that the writer was using a little verbal shorthand
here, meaning that information theory provides the tools for a clear
understanding of the meaning of entropy.

!!Anecdote alert
I once had a student seek me out first thing in September with
the following confession: "When you did the section on Information
Theory in SM last year I thought it was the most useless thing I had
ever studied". So much for my presentation skills :-(. He went on to
say that his summer job had consisted of using Information Theory to
advance an image processing technique, and the section on Information
Theory had turned out to be the most useful thing he had ever studied.

> you believe in magic. The entropy of a physical system (apart from a
> question about its zero point) is a function only of its thermodynamic
> coordinates, and those depend not at all on human knowledge. My entire
> problem with teaching this particular pernicious factoid is that it
> tends to mystify physics. Students hear it and quite justifiably think
> that physicists believe in magic.

Much like telling students that the quantum state of a system depends on
our observation of it. ;-)

Presented carefully, I think that Information Theory is a good means of
showing what entropy is. The possible misconception, that "information"
is somebody's knowledge, should be easy enough to deal with.
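To make the point concrete: the Shannon entropy of a probability distribution
is a property of the distribution itself, not of anyone's knowledge. Here is a
minimal sketch (the function name and examples are my own, not from the
original post):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.

    Depends only on the distribution 'probs', not on who knows what
    about the system it describes.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less,
# because its outcomes are more predictable.
print(shannon_entropy([0.5, 0.5]))   # fair coin: 1 bit
print(shannon_entropy([0.9, 0.1]))   # biased coin: about 0.47 bits
```

Two different observers running this on the same distribution get the same
number, which is one way to defuse the "entropy depends on what we know"
objection.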

|++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++|
| Doug Craigen |
| |
| If you think Physics is no laughing matter, think again .... |
| http://cyberspc.mb.ca/~dcc/phys/humor.html |
|++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++|