This is admittedly a more minor part of the discussion, but here goes:
Pressure is an average quantity; it is averaged over something.
Speaking from an operational viewpoint, I'd say my pressure gauge averages
over time.
And I'm not willing to say that the fluctuations my
sensitive gauge sees are not fluctuations in pressure. I think
that saying they aren't, and that the gauge is instead measuring something
about the microscopic state, is dangerously close to simply defining "the
problem" (my viewpoint) away.
How much of what my gauge measures reflects the macroscopic state, and how
much the microscopic state? The dividing line seems rather arbitrary, unless
you simply answer the question by saying, "whatever differs from the
average value isn't measuring pressure (but rather microstate information),
and whatever matches the average is measuring pressure and macrostate
information".
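To make the "arbitrary dividing line" point concrete, here is a quick numerical sketch (illustrative Python; the pressure model, noise level, and window sizes are all made up for illustration). A gauge that averages over a longer window reports smaller fluctuations, roughly shrinking like one over the square root of the window size, so what counts as "the pressure" versus "a fluctuation" depends on the averaging window you happen to pick:

```python
import random
import statistics

random.seed(0)

# Toy model (assumed, not from any real apparatus): each "instantaneous"
# reading is a mean pressure plus molecular-scale noise.
MEAN_P = 100.0
NOISE = 5.0
readings = [random.gauss(MEAN_P, NOISE) for _ in range(100_000)]

def window_averages(data, window):
    """Average over non-overlapping windows of the given size,
    mimicking a gauge with that response time."""
    return [statistics.fmean(data[i:i + window])
            for i in range(0, len(data) - window + 1, window)]

# The apparent fluctuation size shrinks as the averaging window grows,
# roughly like 1/sqrt(window) for independent noise.
spreads = {}
for window in (1, 100, 10_000):
    spreads[window] = statistics.stdev(window_averages(readings, window))
    print(f"window={window:>6}: apparent fluctuation ~ {spreads[window]:.3f}")
```

A slow gauge (large window) sees almost no fluctuation at all; a fast one sees the full molecular noise, with no principled cutoff in between.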
The problem I have with that is the following: suppose I don't know
whether the system is in equilibrium and my pressure gauge sees
fluctuations. Are they fluctuations in pressure (because the system is not
in equilibrium), or are they measuring micro information (because the
system is in equilibrium, in which case the fluctuations aren't pressure)?
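The difficulty can be illustrated numerically (again a made-up Python sketch; the drift rate and noise level are invented). Over a short observation window, an equilibrium record and a slowly relaxing, non-equilibrium record can be statistically indistinguishable, so the same gauge readings get classified as "micro information" or "pressure fluctuation" only once you already know which case you are in:

```python
import random
import statistics

random.seed(1)

STEPS = 10_000
NOISE = 5.0
DRIFT = 0.0005  # assumed slow pressure rise: system not in equilibrium

# Two synthetic gauge records: one in equilibrium, one slowly drifting.
equilibrium = [100.0 + random.gauss(0.0, NOISE) for _ in range(STEPS)]
relaxing = [100.0 + DRIFT * t + random.gauss(0.0, NOISE) for t in range(STEPS)]

# Over a short window the two records show similar fluctuation sizes...
short_eq = statistics.stdev(equilibrium[:500])
short_rx = statistics.stdev(relaxing[:500])

# ...and only comparing widely separated stretches reveals the drift.
shift_eq = statistics.fmean(equilibrium[-500:]) - statistics.fmean(equilibrium[:500])
shift_rx = statistics.fmean(relaxing[-500:]) - statistics.fmean(relaxing[:500])

print(f"short-window spread: eq ~ {short_eq:.2f}, relaxing ~ {short_rx:.2f}")
print(f"late-minus-early mean: eq ~ {shift_eq:.2f}, relaxing ~ {shift_rx:.2f}")
```

So the classification of a given wiggle depends on information (is the system in equilibrium?) that the gauge itself cannot supply on short timescales.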
This may of course be the difference implicit in taking the
information-theory approach versus the other (I'm not sure).
Joel