
[Phys-l] entropy +- heat



Chuck Britton remarked parenthetically:

[a] I refuse to relegate 'heat' to the dustbin of
useful Anglo-Saxon four letter words. EVERYBODY uses the word so why
not address some of its shortcomings and usefulnesses.

[b] I feel the
same about Entropy. It's not just for those who delight in
enumerating states.

I mostly agree with each of those statements *separately*, but the
indicated parallelism between the two statements doesn't sit right
with me.

We can agree that "everybody" uses the word heat. However, because
the word has multiple incompatible meanings, it might or might not
be obvious what "everybody" means by it. At best it is context-
dependent. In particular, there is AFAICT no agreed-upon _core_
meaning to the term, so any attempt to give a precise definition
of the term is guaranteed to be unsatisfactory. Many people have
tried and failed, often without realizing how badly they failed.

We can compare this to the term "blue", another common four-letter
word. Nobody in his right mind would try to give a quantitative
definition of blue. Instead, if you want to be quantitative, you
don't quantify the blueness, you quantify something else, perhaps
power versus wavelength.

Actually "heat" is far more problematic than "blue", because there's
something even worse than imprecision, namely holy wars between the
big-endians and the little-endians, each of whom thinks they know
"the one true meaning" of the word.

My point is that unlike heat, entropy does have a satisfactory _core_
meaning. This does not preclude using the word in imprecise, non-
technical, and/or metaphorical senses. I find the entropy situation
remarkably non-problematic, because AFAICT all the common metaphorical
usages are reasonably close to the core technical usage. To repeat:
both of the following are OK with me:
-- It is OK to determine the entropy in terms of p log p. This is the
gold standard. This is the core meaning. This always works. This
requires knowing the state-by-state probabilities.
-- It is OK to use the term entropy in less-precise ways. If you're
not using p log p, it might be a corollary, it might be some kind
of approximation, or it might be merely a metaphor. This doesn't
change the core meaning.

Note: In special cases, p log p reduces to counting states. (Let's
be clear: counting states is *not* the gold standard.)
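To make the two points above concrete, here is a minimal sketch (in
Python, with function and variable names of my own choosing) of the
p log p definition, along with the special case where all W states
are equally likely and the formula reduces to log W, i.e. to counting
states:

```python
import math

def entropy(probs):
    """Gibbs/Shannon entropy S = -sum_i p_i log(p_i), in nats.

    Requires the state-by-state probabilities.  Terms with p = 0
    contribute nothing, since p log p -> 0 as p -> 0.
    """
    return -sum(p * math.log(p) for p in probs if p > 0)

# General case: a non-uniform distribution over three states.
biased = [0.5, 0.25, 0.25]
print(entropy(biased))

# Special case: W equally likely states, p_i = 1/W for each.
# Then S = -sum (1/W) log(1/W) = log W, which is just counting states.
W = 8
uniform = [1.0 / W] * W
print(math.isclose(entropy(uniform), math.log(W)))  # True
```

The general formula works for any distribution; the log W shortcut
works only in the equiprobable special case, which is why p log p,
not state-counting, is the gold standard.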

Also note: Entropy does not solve all the world's problems. Entropy
measures a particular average property of the distribution. It is
easy to find situations where other properties of the distribution
are worth knowing.

The contrast between "heat" and "entropy" is particularly stark for
me, because I always tell people to use entropy to solve the heat
problem. That is, don't quantify heat, for the same reason you don't
quantify blue. Quantify something else instead!

*** Instead of heat, quantify energy and entropy. ***