I personally see nothing objectionable about the usage of these terms in
the article. I see little conceptual danger in the association as long as
the term 'disorder' is used in an imprecise and nontechnical sense.
It is possible, I suppose, that an
unsuspecting student might bring to the conceptual table the idea that the
relevant disorder is at the level of a macroscopic arrangement of
macroscopic parts of a thermodynamic system. If so, then the student would
be liable to be confused.
For instance, a thoroughly
shuffled deck of cards has essentially the same thermodynamic entropy
as a well-ordered deck (assuming, of course, both decks have the same
temperature and are subject to the same external macroscopic
environment). OTOH, a cold deck of cards has less thermodynamic entropy
than a warm one regardless of whether or not the decks were shuffled.
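To put a rough scale on that claim, here is a small back-of-the-envelope
sketch in Python (the deck's heat capacity is an assumed round figure,
purely for illustration, not a measured value for playing cards):

    import math

    # Missing information about the card ordering: ln(52!) nats,
    # converted to conventional entropy units with Boltzmann's constant.
    k_B = 1.380649e-23                       # Boltzmann constant, J/K
    S_shuffle = k_B * math.lgamma(53)        # lgamma(53) = ln(52!)

    # Thermal entropy change of the same deck warmed by just 1 K,
    # assuming (purely for scale) a heat capacity of about 100 J/K.
    C = 100.0                                # J/K, assumed round figure
    S_thermal = C * math.log(301.0 / 300.0)  # warming 300 K -> 301 K

    print(f"entropy of the shuffle : {S_shuffle:.2e} J/K")   # ~2e-21 J/K
    print(f"entropy of 1 K warming : {S_thermal:.2e} J/K")   # ~3e-1 J/K

The shuffle contributes roughly twenty orders of magnitude less entropy
than a single degree of warming, which is why the two decks are
thermodynamically indistinguishable.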
The terms 'information', 'disorder', 'order', 'complexity',
'uncertainty' (even 'entropy'), etc. can all be used both in an imprecise
and colloquial manner and in a precise and technical one.
But the same goes for such words as 'energy', 'power', 'force',
'work', 'color', 'black', 'white', 'flavor', 'heat', 'flow', etc.
Another problem that sometimes arises in the terminology of physics is
that the same common word may occasionally be adopted for multiple
technical meanings. This, too, can lead to confusion for the student.
For instance, the term 'heat' is used quite differently in the following
two sentences. "The latent heat for the phase change of the substance
is 750 kJ/kg." "The Joule paddle-wheel experiment shows that
1 cal. of heat is equivalent to 4.18 J of work."
In the specific cases of the terms 'entropy' and 'disorder', I would
*not* use them as synonyms.
I would consider the concepts they
signify to be related, but not identical, to each other. What relates them
is that they are both measures of (missing) information. What
distinguishes them is that they are *different* such measures.
This definition implies that since entropy is an *average* over a
distribution, it is a statistical concept that is a property of the
statistical ensemble of possibilities (i.e., of the probability
distribution itself). It is not a property of individual outcomes or
states drawn from the universe of possibilities. The entropy is a
functional on the space of probability distributions, not a function of
any individual realization or sample drawn from such a distribution.
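A minimal numerical illustration of that distinction, using an arbitrary
made-up distribution over four outcomes:

    import math
    import random

    # An arbitrary made-up distribution over four outcomes.
    p = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}

    # Entropy is a functional of the distribution itself:
    # the *average* missing information, in nats.
    H = -sum(q * math.log(q) for q in p.values())

    # A single drawn outcome has a surprisal, -ln p(x),
    # but no entropy of its own.
    x = random.choices(list(p), weights=list(p.values()))[0]
    surprisal = -math.log(p[x])

    print(f"entropy of the distribution: {H:.3f} nats")   # 1.213 nats
    print(f"surprisal of the draw {x!r}: {surprisal:.3f} nats")

The entropy is computed from the distribution p alone; the surprisal
attaches to one particular draw and changes from sample to sample, while
its average over many draws tends to the entropy.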
Since entropy is really dimensionless, it can be measured in pure-number
units such as nats rather than in J/K, with Boltzmann's constant absorbed
into the temperature. If this is done, then a temperature of 1 K really
means an intensive quantity (in units of energy per unit entropy) of
1.38065x10^(-23) J/nat.
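For the arithmetic behind that figure, a quick check using the CODATA
value of Boltzmann's constant:

    # With entropy counted in dimensionless nats, S[J/K] = k_B * S[nats],
    # so temperature becomes an energy per nat: T[J/nat] = k_B * T[K].
    k_B = 1.380649e-23                           # Boltzmann constant, J/K
    print(f"1 K   -> {k_B * 1.0:.5e} J/nat")     # 1.38065e-23 J/nat
    print(f"300 K -> {k_B * 300.0:.3e} J/nat")   # room temperature, for scale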
The composite object(s) that has(have) this minimal possible
complexity value over the whole ensemble is/are said to be the
most-ordered object(s). The difference between the actual complexity of
each of the ensemble members and this minimal complexity value defines
(by my definition of the term) the 'disorder' of each such composite
object.
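To make the bookkeeping in that definition concrete, here is a toy sketch.
The (uncomputable) complexity of each object is approximated by its
zlib-compressed length, which is my own crude stand-in and not part of the
definition above; the ensemble of bit strings is likewise a made-up
example:

    import zlib

    # A made-up ensemble of composite objects (equal-length byte strings).
    ensemble = [
        b"0101010101010101010101010101010101010101",  # highly patterned
        b"0000000000000000000000000000000000000000",  # highly patterned
        b"0110100110010110100101100110100110010110",  # still patterned
        b"1101001000110111010010111000101101001101",  # irregular-looking
    ]

    # Crude stand-in for an object's complexity: its compressed length.
    def complexity(s: bytes) -> int:
        return len(zlib.compress(s, 9))

    complexities = [complexity(s) for s in ensemble]
    c_min = min(complexities)   # attained by the most-ordered member(s)

    # 'Disorder' of each member: its complexity minus the ensemble minimum.
    for s, c in zip(ensemble, complexities):
        print(f"{s!r}: complexity={c}, disorder={c - c_min}")

Compressed length is only a rough proxy for whatever complexity measure is
actually intended; the point of the sketch is just the subtraction of the
ensemble-wide minimum.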