
Re: [Phys-l] Gibbs paradox (redux)



On 06/25/2011 05:23 PM, Carl Mungan wrote:
Suppose we have a glass of milk (or snow). Now insert a partition
dividing that glass into 2 equal portions. Each of the new portions
has entropy:

S_new = 1.5(N/2)k + (N/2)k ln(V/(2u)).

So the change in entropy of all the milk is 2 S_new - S = -Nk ln 2.

By dividing the glass of milk into 2 equal portions, I seem to have
decreased the entropy of the system. This seems like a strange
result: where is the corresponding gain of entropy in the surroundings to
ensure the 2nd law is not violated?

Perhaps it would help to start with chocolate milk on one side and
white milk laced with radioactive cesium on the other. Remove the
partition. Observe the mixing. Re-insert the partition. Unmixing
does not occur. If it did, then you would get to ask how the system
entropy decreased and where the missing entropy went to. But since
unmixing does not occur, the question does not actually arise.

If it _appears_ that inserting the partition lowers the entropy, that
appearance is illusory. I suspect the illusion is caused, at least
partially, by assuming that the entropy should be extensive. For this
system, the entropy is not extensive; not even close.
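For concreteness, the quoted algebra can be checked numerically. A minimal sketch, taking the quoted formula at face value: k is Boltzmann's constant, u is just some constant with units of volume, and the values of N, V, and u below are arbitrary placeholders:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
N = 6.022e23       # particle count (illustrative value)
V = 1.0e-3         # volume in m^3 (illustrative value)
u = 1.0e-30        # constant with units of volume (hypothetical value)

# Entropy of the whole glass, per the quoted (non-extensive) formula:
S_whole = 1.5 * N * k + N * k * math.log(V / u)

# Entropy of one partitioned half: N/2 particles in volume V/2:
S_half = 1.5 * (N / 2) * k + (N / 2) * k * math.log((V / 2) / u)

# Change on inserting the partition:
delta = 2 * S_half - S_whole

print(delta, -N * k * math.log(2))  # the two values agree
```

Whatever values are chosen for N, V, and u, the difference works out to -Nk ln 2, exactly as in the quoted question.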

========================

Here's another version of the same thing. This version may allow some
people to follow the quantitative details more easily.

a) Take a deck of cards, starting in the canonical factory configuration,
which has no entropy, because we know the microstate exactly.

b) Give Moe the clubs and spades. He shuffles them. His deck now has
88.4 bits of entropy, i.e. log2(26 factorial).

Give Joe the hearts and diamonds. He shuffles them. His deck now has
88.4 bits of entropy, i.e. log2(26 factorial).

c) Put the two halves together and shuffle. The entropy is now 226 bits,
i.e. log2(52 factorial). A goodly amount of "entropy of mixing" has been
created ... about 48.8 bits' worth.

d) Separate the deck again, giving Moe the top half and Joe the bottom
half. This does not change the entropy. The entropy is still 226 bits.

Moe has 2^88.4 ways of re-arranging his hand, and Joe has 2^88.4 ways
of re-arranging his hand ... but that does not account for all the
entropy. The total entropy is still 226 bits, so in some sense there
is about 48.8 bits of entropy associated with not knowing which cards
got transferred between Moe and Joe during the big mashup in step (c)
above.

Entropy measures how much we don't know about the system ... the system
as a whole.
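The card-deck numbers above can be verified directly; a minimal sketch:

```python
import math

def log2_factorial(n):
    # log2(n!) summed term by term, avoiding overflow for large n
    return sum(math.log2(m) for m in range(1, n + 1))

h_half = log2_factorial(26)    # entropy of one shuffled 26-card half
h_whole = log2_factorial(52)   # entropy of the fully shuffled 52-card deck

print(round(h_half, 1))               # 88.4 bits
print(round(h_whole, 1))              # 225.6 bits, i.e. "226"
print(round(h_whole - 2 * h_half, 1)) # entropy of mixing: 48.8 bits
```

The mixing term, log2(52!) - 2 log2(26!), is the same thing as log2(52 choose 26): it counts the ways the 26 cards could have been apportioned between the two hands.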

=====================

Another word about the formalism:

When we have a two-part system, intuition might suggest we can calculate
the entropy of each subsystem separately and then add ... but this
intuition is just wrong. The math and physics say that we need to count
the microstates of the *whole* system. Just because two subsystems are
physically separated at the moment does not mean their probabilities are
statistically independent. They might have history.
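A toy illustration of that last point (my example, not part of the argument above): shuffle a two-card deck and deal one card each to Moe and Joe. Each hand, viewed alone, looks like one bit of entropy, but the whole system carries only one bit total, because the two hands share history:

```python
import math

def entropy(dist):
    # Shannon entropy in bits of a probability distribution
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Joint microstates of (Moe's card, Joe's card): two equally likely deals.
joint = {('red', 'black'): 0.5, ('black', 'red'): 0.5}

# Each subsystem viewed alone: a 50/50 marginal distribution.
moe = {'red': 0.5, 'black': 0.5}
joe = {'red': 0.5, 'black': 0.5}

print(entropy(moe) + entropy(joe))  # 2.0 bits if (wrongly) assumed independent
print(entropy(joint))               # 1.0 bit for the system as a whole
```

The one-bit gap is exactly the correlation the subsystem-by-subsystem calculation throws away: learn Moe's card and you know Joe's for free.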