
Re: [Phys-L] heat content



On 02/14/2014 12:53 PM, Jeffrey Schnick wrote:

The language is not precise. The water swirling around in the bowl
helps to highlight that. There is a gray area between when the energy
is kinetic energy of bulk motion and when it is energy of unorganized
molecular interactions and motion.

Some of that will always be a problem, but some of that is
a fixable problem. Think of the flow pattern as a fractal,
and look at it as a function of /length scale/. The whole
bowlful of water has no center-of-mass motion, but if you
divide it into parcels each parcel has some regular center-
of-mass motion as it flows around, and if you look at even
smaller parcels you see some irregular motion. The dividing
line between "thermal" and "non-thermal" is gray and fuzzy
... but that's OK, because it doesn't actually matter.

Consider the famous Brownian motion paper that I cited
yesterday: The idea is to treat the pollen grain as a
macroscopic object and work out the plain old Newtonian
mechanics. Then turn around and apply a statistical,
thermodynamic analysis to the same object. The thermal
analysis and the non-thermal analysis are not mutually
exclusive! Thermodynamics is a tool that you can bring
to bear in addition to -- not instead of -- other methods.

Suppose I have two black boxes that look and feel the same to me. I
push down on the top of each one and it goes down. Let each black
box and its contents be a system, more specifically, a control mass.
In each case, I have increased the energy of the system. By
mechanical means, I have caused energy to flow into each system. I
say I did work on the system. Next I bring a hot object in contact
with each black box. In each case the hot object is at a higher
temperature than the black box. In each case, some energy
spontaneously flows from the hot object into the system. I say I
have caused heat to flow into the system. In either case, the energy
of the system, the black box and its contents increases.

That's all good.

We can quantify the process by writing something like:

dE = T dS

so the change in energy is ∫ dE
and the "heat" is T dS and/or ∫ T dS (your choice).
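
To make the bookkeeping concrete, here is a minimal numeric sketch
(not from the post; the heat capacity and temperatures are made-up
values): for a system with constant heat capacity C, dS = C dT/T,
so ∫ T dS comes out to C·(T2 − T1).

```python
# Numeric sketch with assumed values: constant heat capacity C,
# so dS = C dT / T and the heat integral of T dS equals C * (T2 - T1).
import math

C = 100.0                      # heat capacity, J/K (assumed constant)
T1, T2 = 300.0, 350.0          # K (assumed)
N = 200000                     # integration steps

heat = 0.0
for i in range(N):
    Ta = T1 + (T2 - T1) * i / N
    Tb = T1 + (T2 - T1) * (i + 1) / N
    dS = C * math.log(Tb / Ta)         # entropy increment for this step
    heat += 0.5 * (Ta + Tb) * dS       # trapezoid rule for T dS
print(heat)                            # close to C * (T2 - T1) = 5000 J
```

Note that tracking S explicitly (rather than just "heat added") is
exactly the hint below: the entropy increment is what makes the
integral well-defined here.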

Hint: In all such situations I recommend keeping track of
both the entropy and the energy ... not just the energy.

To exclude the possibility that someone thinks that some of the
energy of the system might have to do with the translational motion
of the center of mass of the system, I tend to say that the internal
energy of the system increases, rather than just saying that the
energy of the system increases. I don't think I should have to
include the adjective "internal" but it helps clarify things.

I agree with the sentiment, but in practice one needs to be
somewhat more cagey. Start by considering a pendulum. It
undergoes some center-of-mass motion, which we can
call a /collective mode/. To a very good approximation the
collective mode is decoupled from the thermal properties of
the materials. The decoupling is not a law of physics, but
we can /arrange/ for it by means of suitable engineering,
e.g. invar rods et cetera. So far so good.

Now consider a tuning fork. I say the fundamental physics
is the same. We can describe it using exactly the same
words if we speak in terms of collective modes.

In contrast, we get tangled up if we focus too much
attention on the "center-of-mass" motion. The collective
mode of the tuning fork does not involve center-of-mass
motion. This makes the tuning fork unlike the pendulum,
but IMHO this is an unimportant distinction in this context.
The pendulum is different as to momentum, but it is the
same as to energy and entropy.

It is easy to come up with additional examples where we have
an important collective mode that is not simply center-of-mass
motion:
-- Spinning flywheel
-- Waves on the ocean
-- Acoustic standing waves in an organ pipe
-- Electrical energy stored in a capacitor

The concept of "collective modes" doesn't solve all the world's
problems, but it solves some of them. One key property is that
a collective mode can have high energy without very much entropy.

I open up the boxes. I find that one of them contains a gas. When
I pushed down on the lid of the black box I compressed the gas. When
I brought the hot object in contact with the black box heat flowed
into the gas. But in the other box I find a bunch of gadgets and a
cold potato. When I pushed down on the lid I pushed a gear rack
down which caused a flywheel to spin. When I brought the hot object
into contact with the black box a heat engine inside the black box
caused the flywheel to spin faster and warmed up the potato.

That's all good. There are of course other possible black boxes.
-- fluid             :: dE = - P dV + T dS    uncramped
-- flywheel + potato :: dE = - P dV + T dS    cramped
-- flywheel only     :: dE = - P dV           cramped
-- potato only       :: dE =          T dS    cramped

One can also replace the potato with milk. It is sometimes advantageous
to replace the flywheel with an ideal Hookean spring. Et cetera.
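
The -P dV channel in the table above can be checked with a quick
sketch (assumed numbers, not from the post): the work done on an
ideal gas during a slow isothermal compression is n R T ln(V1/V2).

```python
# Sketch with assumed numbers: work done ON an ideal gas during a slow
# isothermal compression, W = -(integral of P dV) = n R T ln(V1/V2).
import math

n, R, T = 1.0, 8.314, 300.0    # mol, J/(mol K), K (assumed values)
V1, V2 = 0.020, 0.010          # m^3: compress to half the volume
N = 200000

W = 0.0
for i in range(N):
    Va = V1 + (V2 - V1) * i / N
    Vb = V1 + (V2 - V1) * (i + 1) / N
    P = n * R * T / (0.5 * (Va + Vb))   # ideal-gas law at the midpoint
    W += -P * (Vb - Va)                 # -P dV, positive for compression
print(W)                                # close to n*R*T*ln(2), about 1729 J
```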

Tangential remark: There are yet other boxes. For example, a
box containing two potatoes at different temperatures. The
temperature of the system as a whole is undefined and undefinable
... even though each of the two subsystems has a well-defined
temperature.

Despite the many different ways that heat is defined, I have the
impression that physicists have pretty much reached a consensus on a
formal definition of heat as energy in transit, in particular the
energy that spontaneously flows from an object to a colder object
when the two are brought into thermal contact.

That's the #1 dictionary definition ... but that's not the only
way in which the word is /actually/ used. I call 'em like I
hear 'em. If I say I'm going to heat some food in the microwave,
only the world's biggest pedant would object ... even though it
doesn't conform to the #1 definition. More generally, whenever
we look at dissipative processes, the #1 definition is pretty
much dead on arrival.

Once it gets there it is not heat, nothing can contain heat [1]

I understand the sentiment, but we have to be careful.
a) Technically speaking, assertion [1] is true *except* in
trivial cases.
b) Pedagogically speaking, if you just tell the class that
nothing can contain heat, you've lost everybody, and you'll
have a hard time getting them back. They've had years of
direct personal experience dealing with the "heat content"
of potatoes and baby bottles. On top of that, most of the
grade-school science instruction explicitly endorses and
reinforces this concept. The dictionary definition of "heat"
is fully consistent with "heat content" if we imagine that
caloric flows from the hot object to the cold object.

So here we have the makings of an enormous pedagogical disaster.

In my experience, it doesn't pay to tell students XYZ is not
true when they have years of experience and tons of evidence
that XYZ is true. Insisting that XYZ is not true would just
destroy credibility. Instead I recommend a softer approach.
I build a corral around XYZ. I concede that inside the corral,
XYZ is true and useful. However ... as soon as we step outside
the corral, XYZ is no longer true.

So as I see it, at the conceptual level and at the pedagogical
level, the big agenda item is to appreciate the distinction
between inside the corral and out.

In the present example, we need to distinguish cramped systems
from uncramped systems:
*) For a potato or a baby bottle,
++ you can write dQ = T dS
-- you cannot build a heat engine.
*) For a piston+cylinder full of steam,
-- you cannot write dQ = T dS
++ you can build a heat engine
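
The distinction can be made quantitative with a sketch (assumed
numbers, monatomic ideal gas; the reference state is made up): evaluate
∫ T dS along two different paths between the same two endpoints in
(S, V) space. For this uncramped system the two "heats" disagree,
which is exactly why dQ = T dS cannot be integrated to a well-defined Q.

```python
# Sketch (assumed numbers, monatomic ideal gas): the integral of T dS
# between the same two endpoints depends on the path taken in (S, V)
# space.  For a monatomic ideal gas,
#   T(S, V) = T0 * exp(2*(S - S0)/(3*n*R)) * (V0/V)**(2/3).
import math

n, R = 1.0, 8.314
T0, S0, V0 = 300.0, 0.0, 0.020         # reference state (assumed)
S1, S2 = 0.0, 5.0                      # J/K
V1, V2 = 0.020, 0.010                  # m^3
N = 200000

def T(S, V):
    return T0 * math.exp(2.0 * (S - S0) / (3.0 * n * R)) * (V0 / V) ** (2.0 / 3.0)

def heat_at_constant_V(V):
    # integral of T dS from S1 to S2, holding V fixed (midpoint rule)
    total = 0.0
    for i in range(N):
        Sa = S1 + (S2 - S1) * i / N
        Sb = S1 + (S2 - S1) * (i + 1) / N
        total += T(0.5 * (Sa + Sb), V) * (Sb - Sa)
    return total

# Path A: raise S at constant V = V1, then squeeze V at constant S
# (the adiabatic leg contributes no T dS).
# Path B: squeeze V first, then raise S at constant V = V2.
print(heat_at_constant_V(V1), heat_at_constant_V(V2))  # different values
```

Same endpoints, same ΔE, different ∫ T dS; the discrepancy is soaked
up by the -P dV term.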

====================

All this is layered on top of another pedagogical disaster,
namely the idea that every vector field "must" be the gradient
of a potential. This is a disaster in electromagnetism as well
as thermodynamics.

This misconception is widely held, and deeply held. People
are systematically taught that every voltage is a potential
difference. On 4 September 2013 one of the contributors to
Wikipedia wrote:
«"voltage" is a misconception people use to describe difference»

Again, as I see it, the big agenda item is to appreciate the
distinction between inside the corral and out. If you want
to live inside the corral, i.e. if you /want/ the voltage to
be a potential, you have to make sure there are no time-
dependent magnetic fields, or you have to control the geometry
so that the changing flux lines don't couple to the part of
the circuit you care about. And even then, you have to keep
in mind that there are some things that permanently live outside
the corral, including transformers, radios, ground loops,
betatrons, and lots of other things.
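
One way to see life outside the corral numerically, in a sketch with
assumed field geometry and made-up numbers: inside a long solenoid
with spatially uniform dB/dt, the induced field integrates to
-πr²·dB/dt around a closed loop. That is nonzero, whereas the
gradient of any single-valued potential integrates to zero around
every closed loop.

```python
# Sketch with assumed numbers: inside a long solenoid with spatially
# uniform dB/dt, the induced field is azimuthal, E_phi = -(r/2) * dB/dt.
# Its closed-loop line integral is -pi * r**2 * dB/dt -- nonzero, so
# this E field is not the gradient of any single-valued potential.
import math

dBdt = 2.0                     # T/s (assumed)
r = 0.05                       # loop radius, m (assumed)
N = 200000

E_phi = -(r / 2.0) * dBdt      # tangential component, constant on the loop
dphi = 2.0 * math.pi / N
loop_integral = 0.0
for i in range(N):
    loop_integral += E_phi * r * dphi      # E . dl around the loop
print(loop_integral)           # close to -pi * r**2 * dBdt, about -0.0157 V
```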

======================

All this is layered on top of a third pedagogical disaster,
namely a weak and not-entirely-correct understanding of
multivariate calculus. Among other things, the usual notation
for partial derivatives is just begging to be misunderstood.
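
A small sketch of the trap (hypothetical function, not from the post):
the symbol "partial f / partial x" is meaningless until you say what is
being held constant, and different choices give different answers.

```python
# Sketch: 'partial f / partial x' is ambiguous until you say what is
# held fixed.  Take f = x**2 + y, and a second coordinate system (x, v)
# with v = x + y.  Holding y fixed gives 2x; holding v fixed gives 2x - 1.
def f_xy(x, y):
    return x**2 + y

def f_xv(x, v):                # the same function, re-expressed via y = v - x
    return x**2 + (v - x)

x0, y0 = 1.5, 2.0
v0 = x0 + y0
h = 1e-6                       # central-difference step
ddx_const_y = (f_xy(x0 + h, y0) - f_xy(x0 - h, y0)) / (2 * h)   # about 2*x0 = 3.0
ddx_const_v = (f_xv(x0 + h, v0) - f_xv(x0 - h, v0)) / (2 * h)   # about 2*x0 - 1 = 2.0
print(ddx_const_y, ddx_const_v)
```

The usual notation writes both of these as ∂f/∂x; thermodynamics is
full of exactly this situation, which is why the subscript convention
(∂E/∂S)_V exists.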

It's not fair that the physics teacher has to teach much of
the math *and* all of the physics, but in most places that's
the way it is. That's the way it's been for a very long time.