Entropy, Objectivity, and Timescales



First let me confess that I haven't followed the entire thread on
order and disorder, etc. But the example of shuffling a deck of
cards brings up an issue that I'd like to comment on.

We usually define entropy as the logarithm of the number of "accessible"
microstates. But the word "accessible" cannot possibly be meant to
imply that the system is ever *actually* in more than a tiny fraction
of the microstates. Look at the numbers: e^(10^25) "accessible"
microstates for a mole of helium under standard conditions. Bringing
up the timescale issue (a good issue, generally) doesn't help here;
wait as long as you like, billions and billions of years, this system
will never actually explore the vast majority of the supposedly
"accessible" states.

It seems to me, therefore, that entropy is fundamentally a *subjective*
quantity, a measure of our ignorance. If I knew the precise microstate
of the helium now, then I could predict its microstate at all times
for the next billion years, and even counting every state that it
explores as "accessible," I'd get an entropy much less than the
generally agreed-upon value. The agreed-upon value, though, includes
an enormous number of other microstates that the helium will never
actually explore (though we have no way of knowing which ones).
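
(Just to put numbers on it, with an assumed collision rate: even a
wildly generous bound on how many microstates the gas actually
visits comes nowhere near e^(10^25). Suppose every collision of
every atom lands the gas in a brand-new microstate, at roughly
10^9 collisions per atom per second, for a billion years:

    import math

    N_atoms   = 6.02214076e23    # one mole of helium atoms
    coll_rate = 1.0e9            # assumed collisions per atom per second
    seconds   = 1.0e9 * 3.156e7  # a billion years, in seconds

    visited = N_atoms * coll_rate * seconds   # (over)count of visited states
    print("ln(states visited) ~ %.0f" % math.log(visited))   # about 113

So the "entropy" of the states actually explored would be of order
100, not 10^25.)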

Now back to the deck of cards. If you shuffle the deck and I don't
look at the order, why can't I consistently say that its entropy
is log(52!)? From my perspective, your shuffling of the deck in the
past made all states "accessible", in the same sense that all those
microstates were accessible to the helium gas. True, the cards won't
be exploring any more microstates in the future if you're no longer
shuffling them, so there is a sense in which their entropy is zero,
but since entropy is subjective anyway, it doesn't bother me that
entropy might be zero in one sense and nonzero in another. In cases
where it matters (i.e., large systems), the difference between
one person's entropy and another person's entropy will be negligible,
since nobody will have enough information about the system to cut
down the entropy significantly.
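
(To put numbers on the card example, in the same sketchy spirit:
ln(52!) is only about 156, so whether you assign the shuffled deck
an entropy of log(52!) or of zero makes no thermodynamic difference
next to the ~10^25 of a macroscopic sample of gas.

    import math

    ln_52_fact = math.lgamma(53)   # ln(52!) via the log-gamma function
    print("ln(52!) ~ %.0f" % ln_52_fact)                         # about 156
    print("log2(52!) ~ %.0f bits" % (ln_52_fact / math.log(2)))  # about 226
)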

Well, that's how it seems to me. But I'm posting this so that those
of you who disagree will (I hope) explain why. Can't wait to hear
from you!

Dan Schroeder
Weber State University
dschroeder@cc.weber.edu