
[Phys-L] How to convince a Second Law skeptic



A friend of mine wrote the passage appended below. I do not wish to offend
him, and I most certainly do not wish to tell him he is crazy. I have
tried to explain the Second Law to him, and I have borrowed explanations from
others, to no avail. Is there anything that can disabuse him of his
misconceptions? Is there a graphic way to help him understand why he should
accept that the Second Law is correct, and what can and what cannot be
inferred from it?

Tom Wayburn, Houston, Texas

Remember, the object is NOT to prevent someone else from accepting this. The
object is to inform the author, who might as well be a skeptical student
accustomed to thinking for himself. Here is the passage in question:

You may have noticed that in this whole discussion I have never mentioned
the word entropy. It is a word that causes great amounts of confusion.
People link it to order, or to the cooling of systems. When confusion
abounds, clarify your definitions. Entropy as commonly employed by engineers
is a mathematical concept that really has no visual references. Many people,
including myself, have a great attraction to being able to visualize
something in order to understand it. Since it cannot be visualized, it
causes confusion. I also find contradictory statements about entropy, which
compounds the confusion. Entropy to engineers is defined as the heat
transferred out of a system, divided by the temperature at the time of
transfer. It is a useful number in working with problems of heat transfer.
There are no measurements of order involved with this definition, and it
only gives relative changes in entropy. It is easy to see that these changes
tend to be in the downward direction, since heat transfer tends to slow down
as equilibrium is reached.
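
To put rough numbers on that working definition, the same bookkeeping can be
written out in a few lines of Python; the 1000 joules and 300 kelvin below
are invented purely for illustration, not taken from any real problem:

    Q = 1000.0       # heat transferred out of the system, joules (invented)
    T = 300.0        # absolute temperature of the transfer, kelvin (invented)
    delta_S = Q / T  # the entropy change engineers compute, joules per kelvin
    print(delta_S)   # prints 3.333..., i.e. about 3.3 J/K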

There is another definition of entropy, one that does use the concept of order.
It says that S = k ln W, where S is entropy, k is a constant (the
Boltzmann constant), and W is a measure of the ways of being. This equation
fits rough observation. The ways of being decrease as order increases, which
happens as energy flows out of the system, and the entropy decreases. This
fits, since entropy at absolute zero should be zero, and it should be zero
at equilibrium, as well. Since things are continually going to equilibrium,
the heat transfer is continually decreasing, order is continually
increasing, and entropy decreasing with it. Why then do we constantly hear
the statement that entropy always increases? Where have physicists obtained
the idea that systems tend toward disorder? I have an encyclopedia article
in front of me that says this, along with the equations. My engineering
textbook on thermodynamics, by Bernhardt Skrotzki, points out that entropy
at absolute zero is zero. But the encyclopedia article says that "when
entropy is at its maximum, the amount of work that can be transferred is
equal to zero." These are contradictory statements. I am very suspicious
that we have a case of the "naked emperor" here, with no one daring to say
that they don’t understand contradictions like this. Biologists have noted
the statement from physics that the tendency for disorder is statistical,
and have seized on this to say that the order presented by life is a thing
of chance, that life is the improbable come true, as it should, given enough
time. Mystics scoff. But if you look objectively and closely, systems do not
tend to disorder; life is indeed based on chance encounters, but the
attractions and repulsions of certain particles make life as predictable as
the existence of a whirlpool when you pull the plug in a tub.
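
Anyone curious can evaluate S = k ln W directly; the sketch below does so in
Python, using the accepted value of the Boltzmann constant and a few invented
counts of "ways of being" chosen only for illustration:

    import math

    k = 1.380649e-23         # Boltzmann constant, joules per kelvin
    for W in (1, 10, 1000):  # invented counts of "ways of being"
        S = k * math.log(W)  # S = k ln W
        print(W, S)          # W = 1 gives S = 0; larger W gives larger S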

It is actually not difficult to see where the idea that entropy always
increases comes from. Some simple mathematics will show entropy always
increasing. The trouble is that a logical error is being made in that
simple mathematics. Entropy is defined as the heat transfer in or out of a
system, divided by the absolute temperature in the system at which the
transfer takes place. So the logic goes: heat is transferred out of one
system at a high temperature, into another system at a lower temperature.
The same amount of heat is transferred, but the temperature of the receiving
system is lower. With the same amount of heat being divided by a lower
temperature, the entropy of the receiving system is a bigger number.
Therefore it is said that entropy always increases. But a subtle error has
been made here. You do not find entropy by adding entropy from one system to
another system. To find what has happened to the entropy of the combined
systems, you must find the heat transfer over the boundaries of this
combined system. Heat flow from one part of the system to another is an
internal matter that is not important to the definition of entropy. All
kinds of internal heat transfer can be happening, but entropy is defined as
the heat transfer in or out of the system. You simply cannot add entropies
from two systems and claim that this says something about the combined
system. Once this problem is recognized, contradictions about entropy go
away. But basically, entropy is not really important to the discussion of
order and life. As I wrote before, it is a useful number in dealing with
problems of heat transfer. I am not aware of any practical significance to
the equation for entropy that does involve order. The only thing that I can
see that it has done is confuse things.
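
For concreteness, the arithmetic behind the argument described above can be
written out in a few lines of Python; the 400 kelvin, 300 kelvin, and 1000
joules below are invented purely for illustration:

    Q = 1000.0                    # heat passing from hot body to cold, joules (invented)
    T_hot, T_cold = 400.0, 300.0  # temperatures of the two bodies, kelvin (invented)

    dS_hot = -Q / T_hot           # entropy change assigned to the body losing heat
    dS_cold = Q / T_cold          # entropy change assigned to the body gaining heat
    dS_total = dS_hot + dS_cold   # the sum the usual argument points to

    print(dS_hot, dS_cold, dS_total)  # -2.5, 3.333..., 0.833...; the sum is positive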