
[Phys-L] infinite entropy ... for a discrete distribution



Here's an open-ended question that you may find amusing.

Exercise: Come up with a discrete distribution, a distribution
over the positive integers, such that the probabilities p_i are
well behaved:
  p_i > 0 for all integers i > 0 [3]
  Σ p_i = 1 [4]

and the entropy is infinite:
  Σ p_i log(1/p_i) = ∞ [5]

This is not one of those trivia questions that students can answer
in 45 seconds or less. The first time I thought of the question,
it took me a couple of hours to figure out the answer. It turns
out I could have found /part/ of the answer by googling, but it
was more fun to just figure it out.

Note that if you want to succeed, you cannot assign (let alone
define) the entropy to be equal to log(multiplicity) or anything
like that, because that would assign zero probability to each of
the microstates, in violation of equation [3].

Note that we are threading a rather fine needle, since we want
the series in [4] to converge, but the closely-related series
in [5] to diverge.

For some solutions and additional discussion including a pie
chart, see
http://www.av8n.com/physics/thermo/entropy-more.html#sec-infinite-s
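(Spoiler warning.) For anyone who wants a numerical sanity check, here
is a short Python sketch of one well-known family of solutions: weights
proportional to 1/(i (log i)^2) for i ≥ 2. The truncation point and the
numerically-computed normalizer are my own choices, not from the post.
By the integral test the weight series converges, so [3] and [4] can be
satisfied after normalizing, while p_i log(1/p_i) behaves like
1/(i log i), whose partial sums grow like log log N without bound, so
series [5] diverges.

```python
import math

# One well-known family of solutions (constants here are illustrative,
# not from the post): weights w_i = 1/(i * (log i)^2) for i >= 2.
# sum w_i converges (integral test), so normalizing gives a valid
# distribution satisfying [3] and [4]; the entropy series [5] behaves
# like sum 1/(i log i) ~ log log N, which diverges.

N = 1_000_000  # truncation point; the neglected normalizer tail is ~1/log N

weights = [1.0 / (i * math.log(i) ** 2) for i in range(2, N + 1)]
C = sum(weights)              # approximate normalizing constant
p = [w / C for w in weights]  # truncated, normalized distribution

total = sum(p)  # partial sum of series [4]; equals 1 by construction

# Partial sums of the entropy series [5] at geometric checkpoints.
# They keep creeping upward (like log log N), never leveling off.
entropy_partials = []
H = 0.0
checkpoint = 100
for n, pi in enumerate(p, start=2):
    H += pi * math.log(1.0 / pi)
    if n == checkpoint:
        entropy_partials.append((n, H))
        checkpoint *= 10

print("sum of p_i up to N:", total)
for n, h in entropy_partials:
    print(f"entropy partial sum up to i={n:>7}: {h:.4f}")
```

Note that the divergence is log log N, which is glacially slow, so a
numerical run can only hint at it: the entropy partial sums keep
increasing from one decade of N to the next while the probability
partial sums flatten out. That slow growth is exactly the fine needle
being threaded in [4] versus [5].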