Ok, here's my next reply to Leigh...
Dan says (and I omit much I do not choose to argue further)...:
In other words, you're refusing to answer some of my honest questions.
I still want to know precisely how large a system has to be before
we're allowed to talk about its entropy. I've now asked you three
times.
Look, I don't want to argue quantum mechanics here. But I've never
before heard anyone claim that you need to understand quantum
indeterminacy to understand entropy.
I'm honestly not sure what "the authoritative view of entropy" is,
nor do I believe that the question should be decided by appeal
to authority. Please read what I said. I'm making no claims
about your intelligence. I'm simply countering your implication
that all the authorities are on your side.
For the record, I am neither a creationist nor a postmodernist.
As for the methodology of my poll, I simply stuck my head in each
office and, without any context, announced that I was taking a
poll: "Is entropy objective or subjective?". All but one answered
"objective". Then I e-mailed my former instructor, again with
no context, and he answered "subjective" (with a nice brief
explanation along the lines of what I've been arguing). I only
"badgered" one person, the one who changed his answer from objective
to subjective (though with some reservations that he's still
thinking about).
Will someone else please chime in here? A voice of sanity?
I also would like to hear from some others. It's becoming clear
that further exchanges between Leigh and me are unlikely to
get us much further.
Would you insert comments about the
subjective nature of entropy into such a course? I'll wager that
the texts cited here would not support your view.
In my course I raised the issue very briefly, several weeks ago,
and said it's controversial. Then this morning, as students were
coming in before class, I briefly mentioned this debate to them
and asked if they had an opinion. (Most of them did not.) But I
didn't spend any more class time on it.
I've never seen a text that even raises this question, much less tries
to answer it. If you know of one that does, please tell me. Landau
and Lifshitz *almost* touch on the question, especially in their discussion
of timescales on page 27 (Statistical Physics, Part 1, 3rd edition).
They say that there's no such thing as an "instantaneous" entropy,
but rather that entropy is always understood to be relative to some
relaxation timescale. But I'm making a stronger claim: even
over very long timescales, most "accessible" states will never
be realized, and so the conventional entropy also reflects our
large degree of ignorance (which could in principle be smaller,
yielding a smaller value of the entropy).
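To make the arithmetic behind that claim concrete, here is a toy sketch of my own (not from any of the texts discussed): using the Boltzmann formula S = k ln(Omega), the entropy you compute depends directly on how many microstates you choose to count as "accessible". The specific microstate counts below are made-up numbers purely for illustration.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(num_microstates):
    """S = k_B * ln(Omega) for Omega equally likely microstates."""
    return k_B * math.log(num_microstates)

# Hypothetical numbers: suppose we nominally count 2**100 "accessible"
# microstates, but only 2**90 of them would ever actually be realized
# over the timescale of interest.
S_nominal = boltzmann_entropy(2**100)     # entropy with the larger count
S_restricted = boltzmann_entropy(2**90)   # entropy with the smaller count

# Counting fewer states as accessible gives a strictly smaller entropy,
# by k_B * ln(2**100 / 2**90) = 10 * k_B * ln(2).
print(S_nominal - S_restricted)
```

The point of the sketch is only that "accessible" is doing real work in the formula: narrowing what counts as accessible (less ignorance) lowers the computed entropy, which is exactly the sensitivity at issue in this debate.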
Some other books define "accessible" in terms of "constraints". But
they're usually nebulous about what, exactly, qualifies as a constraint.