
Re: tangent off of Order being born from disorder?

> These comments by Leigh raise a related issue that I've had trouble
> understanding. Hawking and others state that the directionality of
> time is related to entropy, something I don't quite grasp. When Leigh
> says 'long timescales', is this referring to time's directionality?
> If the uncertainty were not related to timescale, would there be a
> directionality?

When I mentioned timescale I was referring to the concept of thermodynamic
equilibrium. The default, when speaking of thermodynamics, is the behaviour
of systems in thermodynamic equilibrium. (Of course nonequilibrium
thermodynamics is also a nonempty field.) Whether a system counts as being
in thermodynamic equilibrium is a matter of timescale. For example, we can
use equilibrium thermodynamics to say useful things about supersaturated
solutions and supercooled pure substances, even though these systems are
subject to capricious irreversible transitions at any instant. On the
other hand, a system we would conventionally consider to be in stable
thermodynamic equilibrium is always susceptible to an incredibly
improbable catastrophic transition to a state of lower free energy, for
example collapse to a black hole, as Freeman Dyson points out. (Such a
transition can proceed by quantum mechanical tunneling.) The difference
between this and a supercooled droplet of water is one of timescale.
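
To make the timescale point concrete, here is a minimal sketch of my own
(not from the original discussion), using the standard Arrhenius estimate
rate = f0 * exp(-Eb/kT) for escape from a metastable state. The attempt
frequency f0 and the barrier heights below are illustrative assumptions;
the point is only that the expected lifetime grows exponentially with the
barrier, sliding from nanoseconds to times vastly exceeding the age of
the universe:

    import math

    K_B = 1.380649e-23  # Boltzmann's constant, J/K

    def metastable_lifetime(barrier_j, attempt_hz=1e13, temp_k=300.0):
        """Arrhenius estimate of a metastable state's mean lifetime (s).

        rate = attempt_hz * exp(-barrier / kT); lifetime = 1 / rate.
        attempt_hz ~ 1e13 Hz is a typical molecular attempt frequency,
        used here purely as an illustrative assumption.
        """
        rate = attempt_hz * math.exp(-barrier_j / (K_B * temp_k))
        return 1.0 / rate

    # Barriers expressed in units of kT at room temperature. The lifetime
    # grows exponentially with the barrier, which is the whole content of
    # "a matter of timescale".
    for n_kt in (10, 30, 60, 100):
        barrier = n_kt * K_B * 300.0
        print(f"barrier = {n_kt:3d} kT -> lifetime ~ "
              f"{metastable_lifetime(barrier):.3e} s")

A 10 kT barrier gives a lifetime of a few nanoseconds, while a 100 kT
barrier gives roughly 10^30 s, many orders of magnitude longer than the
age of the universe; both systems are "metastable", and only the
timescale differs.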

Of course the entropy of the universe does increase with time; that
observation is the second law of thermodynamics, and entropy is sometimes
poetically referred to as "time's arrow". The increase of entropy with
time is an expression of the limitation on our ability to predict the
future: the more remote the future, the greater our uncertainty about
what it will hold. I like David Bowman's characterization of entropy in
that spirit. Entropy is a measure of the magnitude of our uncertainty
regarding the state of the system. If a large number, N, of different
microscopic states (quantum states) of a macroscopic system all appear
equivalent on a macroscopic scale, then the entropy, S, of the system is
given by the relation

S = k ln N

where k is Boltzmann's constant and the logarithm is the natural one.
Boltzmann was so proud of his discovery of that relation that it is
engraved on his gravestone.
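
As a concrete illustration of the counting behind that relation, here is
a minimal sketch of my own (a toy example, not part of the original
post). For a system of n independent two-state spins there are N = 2^n
macroscopically equivalent microstates, so S = k ln N = n k ln 2: the
entropy grows linearly with system size even though the number of
microstates grows exponentially.

    import math

    K_B = 1.380649e-23  # Boltzmann's constant, J/K

    def boltzmann_entropy(num_microstates):
        """S = k ln N for N macroscopically equivalent microstates."""
        return K_B * math.log(num_microstates)

    # Toy example: n independent two-state spins give N = 2**n microstates.
    # Work with ln N = n ln 2 directly, so that n can be macroscopically
    # large without overflowing a float by forming 2**n.
    for n_spins in (1, 100, 6.022e23):  # the last is about a mole of spins
        ln_n = n_spins * math.log(2.0)
        entropy = K_B * ln_n  # equals boltzmann_entropy(2**n_spins)
        print(f"{n_spins:>10} spins: S = {entropy:.3e} J/K")

For a mole of such spins this gives S = R ln 2, about 5.76 J/K, a
perfectly ordinary thermodynamic magnitude arising from an astronomically
large count of microstates.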

Leigh