
Re: [Phys-L] let's define energy



On 09/29/2015 01:54 PM, Diego Saravia wrote:

It is interesting to think of entropy as "energy dispersion", and not,
for example, mass dispersion, or some other stuff dispersing.

That's interesting, but not correct. It's a very widespread
misconception, but a misconception nonetheless.

So Shannon entropy is not the same as thermodynamic entropy;
you must account for the energy of the bits.

As far as I can tell, Shannon entropy is /exactly/ the same
as thermodynamic entropy. I've seen lots of cases where it
is, and lots of reasons why it has to be ... and never the
slightest evidence to the contrary.

Example: The observed, macroscopic entropy of the copper
nuclei in a demagnetization refrigerator is R ln(4), exactly
as you would expect from counting states.
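To make the counting concrete, here is a minimal sketch (not from the original post): the dimensionless entropy per nucleus is -Σ p ln(p), and for four equally likely spin states (spin-3/2 copper) that comes out to ln(4); multiplying by the gas constant gives the molar value R ln(4) quoted above.

```python
import math

def shannon_entropy(probs):
    """Entropy in natural units (nats): S = -sum(p * ln(p))."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Four equally likely nuclear spin states of spin-3/2 copper:
S_per_nucleus = shannon_entropy([0.25] * 4)

R = 8.314462618  # gas constant, J/(mol K)
S_molar = R * S_per_nucleus  # = R ln(4), as observed macroscopically

print(S_per_nucleus, math.log(4))  # both are ln(4) ≈ 1.386
```

The point is that nothing in the calculation mentions energy; only the probabilities of the states enter.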

Insofar as entropy is related to dispersion or spreading, it
has to do with /probability/ spreading out in /phase space/.
We're talking about the probability of states, which are not
necessarily energy states. If you think of it in terms of
energy spreading in position-space you will get the wrong
answer. If you think of it in terms of mass spreading in
position-space you will get the wrong answer.

I am not really sure if there is a better metaphor for this.

The correct approach is to keep track of the states.
Define entropy in terms of the probability(*) of the states.

(*) Technical note: This works in any situation where
the classical probability is well defined. In the very
unlikely event that you are dealing with Schrödinger cat
states, the procedure gets slightly more complicated,
but the idea remains the same.
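For the quantum case alluded to in the note, the standard generalization (my gloss, not spelled out in the original) is the von Neumann entropy S = -Tr(ρ ln ρ), which reduces to -Σ p ln(p) when the density matrix ρ is diagonal. A sketch of the contrast between a classical mixture and a coherent "cat" superposition:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-15]  # drop zeros: 0 ln 0 -> 0
    return -np.sum(evals * np.log(evals))

# A classical 50/50 mixture of two states has entropy ln(2) ...
mixed = np.diag([0.5, 0.5])

# ... but a coherent superposition (cat state) is a pure state
# with zero entropy, even though both basis states are "occupied".
psi = np.array([1.0, 1.0]) / np.sqrt(2)
cat = np.outer(psi, psi)

print(von_neumann_entropy(mixed))  # ln(2) ≈ 0.693
print(von_neumann_entropy(cat))    # 0.0
```

In the generic (non-cat) case the eigenvalues are just the classical probabilities, and the procedure is the same as before.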

States are states. They are not energy states. Entropy is
not defined in terms of energy, nor vice versa.