
[Phys-L] Re: entropy: increased by stirring, decreased by observation



On 06/13/05 11:02, John Denker wrote:

"Call it what you like, or ignore it altogether if you don't
find it helpful. It is
a) just an analogy, and
b) not the crux of my argument."

I disagree with this. The analogy was brought up to answer the question by Bob Sciamanda, and if the answer was intended to be relevant, the analogy must be essential. In this respect, at least, it was the crux of the argument.

"Well, if that's how you look at it, the analogy is much
better than you think. When we assign probability to
events in nature, all probabilities are conditional
probabilities, conditioned on the state of the observer.
As a corollary, all entropies are conditional entropies
(since entropy is defined in terms of probability)."

The probability in the definition of entropy is not merely the measure of our knowledge. Even if I know the exact microstate of a system, this by itself will not affect the system's entropy.
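For concreteness, the formula both sides are implicitly leaning on is the Gibbs entropy S = -k_B * sum_i p_i ln p_i. A minimal numerical sketch (a toy two-distribution example, nothing more) shows what is at stake: the value of S depends entirely on which probability distribution is plugged in.

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def gibbs_entropy(probs):
        # Gibbs entropy S = -k_B * sum(p_i * ln p_i) over a distribution of microstates
        return -k_B * sum(p * math.log(p) for p in probs if p > 0.0)

    # Uniform distribution over 4 microstates (complete ignorance of which one is occupied)
    print(gibbs_entropy([0.25, 0.25, 0.25, 0.25]))   # k_B * ln 4, about 1.9e-23 J/K

    # One microstate assigned probability 1 (exact knowledge of the microstate)
    print(gibbs_entropy([1.0, 0.0, 0.0, 0.0]))       # 0.0 J/K

Whether the probabilities that belong in this formula describe the observer's knowledge or something observer-independent is exactly the point in dispute.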


" -- For starters, there is no such thing as dQ. There is no
Q that can be differentiated to yield T dS (except maybe
in trivial cases).
http://www.av8n.com/physics/thermo-laws.htm#sec-non-exact
-- Also, S is well defined even in cases where T is zero,
unknown, or undefineable.
http://www.av8n.com/physics/thermo-laws.htm#sec-entropy"

Just because dQ is not a perfect differential does not mean that there is no such thing as dQ. Once you specify the path in the phase space (physically, the way you add heat to a system in a state close to equilibrium), dQ is exactly defined.
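As a back-of-the-envelope sketch of this (one mole of a monatomic ideal gas taken between the same two equilibrium states along two different quasi-static paths; the numbers are made up):

    import math

    R  = 8.314        # gas constant, J/(mol K)
    Cv = 1.5 * R      # molar heat capacity at constant volume, monatomic ideal gas
    n  = 1.0          # moles

    T1, T2 = 300.0, 400.0     # K
    V1, V2 = 1.0e-3, 2.0e-3   # m^3

    # Path A: isothermal expansion at T1 from V1 to V2, then isochoric heating to T2
    Q_A = n * R * T1 * math.log(V2 / V1) + n * Cv * (T2 - T1)

    # Path B: isochoric heating to T2 at V1, then isothermal expansion at T2 to V2
    Q_B = n * Cv * (T2 - T1) + n * R * T2 * math.log(V2 / V1)

    print(Q_A, Q_B)   # different totals

The two totals differ, so there is no state function Q, but each total is perfectly well defined once the path is specified.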
I find the websites John referred me to very interesting, and some examples just fascinating. I'll try to read both of them more carefully. However, here too I disagree with (or maybe do not quite understand) some of the statements regarding entropy. Here are some excerpts:

"(The Laws of Thermodynamics).
The entropy of some things goes to zero as temperature goes to zero.
This is true except when it's not true."

I wish I knew when it is not true.

"(Section 1.1) Equating energy with doable work is inconsistent with =
thermodynamics.=20
* A box containing a hot potato and a cold potato has some energy and=
some ability to do work.
* In contrast, a box containing just two hot potatos has more energy =
but less ability to do work."

This statement is misleading. It is based on severely restricted conditions for doing work. In the conventional definition, it is the work done on the system in question. In the first box, you have two systems at different temperatures, and work is done by one system on the other. In the second box, I cannot get any work from the thermal energy, because both potatoes are at the same temperature and should be considered as one system close to thermodynamic equilibrium. In this case, the whole box is a system that can do work only on a colder environment. So if I put both boxes into a big container with T near zero, and operate within this container, then I find that the second potato box has more ability to do work than the first one, by the amount of the additional thermal energy of the second potato.
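As a rough numerical sketch of this comparison (identical potatoes with a made-up heat capacity, made-up temperatures, and the standard reversible-engine bounds):

    import math

    C  = 1000.0   # heat capacity of one potato, J/K
    Th = 360.0    # hot potato, K
    Tc = 280.0    # cold potato, K
    T0 = 3.0      # big surrounding container, K (near zero)

    def max_work_to_environment(T, T0, C):
        # Maximum work from a body of heat capacity C cooled reversibly from T to T0:
        # W = C*(T - T0) - T0*C*ln(T/T0)
        return C * (T - T0) - T0 * C * math.log(T / T0)

    # With the near-zero-T container available:
    W_box1 = max_work_to_environment(Th, T0, C) + max_work_to_environment(Tc, T0, C)
    W_box2 = 2.0 * max_work_to_environment(Th, T0, C)

    # With no cold environment, box 1 can still run an engine between its own potatoes
    # (reversible final common temperature sqrt(Th*Tc)); box 2 then yields nothing.
    W_box1_internal = C * (Th + Tc - 2.0 * math.sqrt(Th * Tc))

    print(W_box1, W_box2, W_box1_internal)

With the cold container available, W_box2 exceeds W_box1 by roughly the extra thermal energy of the second hot potato; without it, only the first box can do any work at all, which is the restricted situation the quoted example has in mind.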

"(Sec. 4.3) When we want to quantify things, it is better to forget a=
bout "heat" and just to quantify E and S, which are unambiguous and u=
nproblematic."

This does not sit well with the statement that S is a subjective characteristic depending on the knowledge of the observer.

"There are cases when E =3D E (V, S)"

On this point, I agree with John. But now consider the implications. If E is a uniquely defined function of V and S, and at the same time you say that S only reflects the state of knowledge of an observer, then automatically the same becomes true of E. John, sorry, but if in a boxing match you knock me down, then even the fact that I know very little about entropy won't make your punch any gentler on me.

Moses Fayngold,
NJIT
_______________________________________________
Phys-L mailing list
Phys-L@electron.physics.buffalo.edu
https://www.physics.buffalo.edu/mailman/listinfo/phys-l