
Re: [Phys-L] Quantum measurement problem



On 6/30/22 12:07 AM, Antti Savinainen via Phys-l wrote:

The recent issue of Physics Today has an interesting article on the quantum
measurement problem. N. David Mermin recounts three typical responses that
physicists may have (the wording is mine):

1) There is such a problem.
2) There is no problem whatsoever.
3) The problem is not important.

Mermin supports the second option and provides interesting reasons for his
stance. Half of the piece seems to be available for free:
https://physicstoday.scitation.org/doi/10.1063/PT.3.5027

Do you agree with Mermin?

In the Physics Today article, Mermin makes a few major
great many minor points.

I disagree with several of the minor points, and a couple of the
major points are not well explained.

We agree on the #1 most-central major point. It is a mistake
to separate QM into "plain ordinary" QM plus a special
"measurement process" that plays by different rules. This
mistake is very widespread, but that doesn't make it any
less of a mistake.

The right thing to do is to treat the "system" plus the
"measurement apparatus" as one big apparatus containing a
measurand subsystem plus a measuring subsystem. The whole
apparatus plays by the "plain ordinary" QM rules. There is
only one universe, and it plays by the rules of QM. The
so-called classical world exists only as an approximation,
as explained by QM in certain limits. QM can explain
classical behavior but not vice versa.

The approach in the previous paragraph is more easily said
than done. It has actually been done in a few cases, but
more commonly the measuring subsystem is so gnarly that
people give up on analyzing it from QM first principles.
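To make the idea concrete in the easiest possible case,
here is a toy sketch of my own (numpy; not from Mermin's
article). One qubit stands in for the measurand and one
for the pointer. The "measurement" is nothing but ordinary
unitary evolution that correlates the two; no special
postulate anywhere.

import numpy as np

# Toy model: the measurand is a qubit in a superposition, the
# "apparatus" is a one-qubit pointer that starts in |0>.  The
# measurement interaction is ordinary unitary evolution (a CNOT).
alpha, beta = 1/np.sqrt(2), 1/np.sqrt(2)
measurand = np.array([alpha, beta])   # alpha|0> + beta|1>
pointer   = np.array([1.0, 0.0])      # |0>

psi = np.kron(measurand, pointer)     # joint state, 4 components

cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

psi = cnot @ psi
print(psi.reshape(2, 2))
# alpha|00> + beta|11>: the pointer is now perfectly correlated
# with the measurand, by plain ordinary QM evolution.

That's the easy case. A real pointer has something like
10²³ gnarly degrees of freedom, which is where the
analysis usually bogs down.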

This problem is not unique to the measurement process.
Even in plain old atomic physics, for atoms more complicated
than hydrogen, it is very messy to do the analysis from
QM first principles. So people introduce approximations,
with all the risks and conceptual burdens that entails.

Mermin says
Nobody has ever worried about a classical measurement problem.

I disagree. There is a classical problem that is directly
related (perhaps identical) to one of the alleged QM
problems. Classical probabilities are subjective. You
can calculate the probability of (say) each possible
poker hand, but your calculations will be no good if
I have stacked the deck. I know it's stacked but you
don't. You think the deck has 226 bits of randomness
but I think it has zero.
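For concreteness, the 226 is just the entropy of a
uniformly shuffled deck, which you can check in two lines:

import math

# entropy of a uniformly shuffled 52-card deck, in bits
print(math.log2(math.factorial(52)))   # about 225.6, i.e. the "226 bits" above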

Similarly, suppose I hand out a bunch of flash drives, each
containing a 1 megabyte encrypted file. If you don't
know the encryption key, it looks completely random.
You can't tell the difference between encrypted
valuable data and encrypted random noise. You see
it as 8 megabits of randomness. OTOH someone who
knows the key sees it as zero randomness.
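Here is a cartoon of that, using a seeded pseudo-random
keystream as a stand-in for real encryption (the seed
plays the role of the key; the point survives the
simplification):

import numpy as np

seed = 42                                      # the "key": I know it, you don't
key = np.random.default_rng(seed).integers(0, 256, size=10**6, dtype=np.uint8)

plaintext  = np.zeros(10**6, dtype=np.uint8)   # a megabyte of something boring
ciphertext = plaintext ^ key                   # toy XOR "encryption"

# To you the ciphertext bytes look uniform: ~8 bits of apparent
# randomness per byte, 8 megabits in all.  To me it carries zero
# surprise, because I can regenerate the keystream from the seed.
counts = np.bincount(ciphertext, minlength=256)
print(counts.min(), counts.max())              # roughly flat histogram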

The same story applies to macroscopic physical systems.
Using pulsed NMR techniques I can create a spin system
that appears to have 10²³ bits of randomness, unless
you know how it was created, in which case you can use
another pulse sequence to return it (nearly) to a zero
entropy state.
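Here is a cartoon of that too: classical phases standing
in for the spins, and a sign flip standing in for the
refocusing pulse. (A toy echo, nowhere near 10²³ spins.)

import numpy as np

rng = np.random.default_rng(1)

n = 100_000
omega = rng.normal(0.0, 1.0, n)   # fixed but "unknown" frequency offsets
t = 10.0

phase = omega * t                 # free precession: the phases spread out
print(abs(np.exp(1j * phase).mean()))   # ~0: the signal looks dead, "random"

# the refocusing pulse: flip the phases, then precess for the same time
phase = -phase + omega * t
print(abs(np.exp(1j * phase).mean()))   # back to 1: the order was there all along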

A "good" measurement subsystem "should" produce an
unambiguous indelible record, perhaps ink on paper.
You can then reason about that record using classical
methods.

This is only one issue among many that come up when
dealing with quantum measurement.

Mermin says:
Collapse is generally abrupt, discontinuous, and stochastic.

That's not clearly explained. Arguably it's "generally"
true in the sense of more often than not, but I would
not agree that it is "generally" true in the sense of
"true in all generality".

That's because you can have a *weak* measurement. You
can couple very loosely to the measurand. That is,
you can siphon off a tiny fraction of the signal and
measure that, leaving the signal mostly undisturbed.
For example, for a signal carried on a coax cable or
optical cable, you can use a 20 dB directional coupler.
By making repeated weak measurements, you can very
gradually "collapse" the wavefunction, insofar as
that phrase means anything at all.

I know Mermin knows about this, because I've discussed
it with him.
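(For the record, 20 dB means the coupler taps off
10^(-20/10) = 1% of the power; the other 99% sails on
through.) Here is a classical Bayesian caricature of
repeated weak measurement, not a quantum-trajectory
calculation, but it shows how the answer firms up only
gradually:

import numpy as np

rng = np.random.default_rng(0)

true_state = 1              # the hidden answer: |1> or |0>
signal, sigma = 1.0, 10.0   # each shot is weak: noise >> signal

p1 = 0.5                    # observer's odds that the answer is |1>
history = []
for shot in range(2000):
    r = rng.normal(signal * true_state, sigma)        # one weak readout
    like1 = np.exp(-(r - signal)**2 / (2 * sigma**2))
    like0 = np.exp(-(r - 0.0)**2 / (2 * sigma**2))
    p1 = p1 * like1 / (p1 * like1 + (1 - p1) * like0)
    history.append(p1)

print(history[10], history[200], history[-1])
# drifts from 0.5 toward certainty a little at a time;
# nothing abrupt or discontinuous about it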

Rather than talking about "collapse" it is better
to talk about amplifying the signal to the point
where it becomes unambiguously classical. This is
conceptually easy to understand right up to the
point where you re-measure the signal and start
worrying about EPR correlations.

===================

As for "interpretations" of QM, I belong to the school
that says he who interprets least interprets best.
That is to say, the equations are right, and if you
trust the equations you will get the right answer.
There is no need to "interpret" anything or to
introduce any "measurement postulates".

By way of analogy, consider Newtonian gravitation.
Newton was asked to explain *how* the gravitational
force traveled from the sun to Jupiter. Having gone
to school on Galileo, Newton famously replied
Hypotheses non fingo ("I frame no hypotheses").

For the next 250 years, for all practical purposes
and for many other purposes, we did not need to
know *how* it happens. As Galileo emphasized, it
suffices to know *what* happens.

Eventually some guy came along and figured out how
it happens, i.e. a mechanistic explanation for how
the gravitational field is transmitted, and this
was a tremendous advance in our understanding. OTOH
GR still doesn't have many practical applications.

QM is actually in better shape than Newtonian gravity
was. If we didn't have GR, there would be some observable
anomalies, such as the precession of Mercury and
a handful of other things. In contrast, the things
we don't understand about QM have no observable
consequences AFAIK.
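(For scale, the leading-order GR result for Mercury is
easy to check; my back-of-envelope numbers:)

import math

# perihelion advance per orbit: 6*pi*G*M / (c^2 * a * (1 - e^2))
GM_sun = 1.327e20      # m^3/s^2
c      = 2.998e8       # m/s
a      = 5.79e10       # Mercury's semi-major axis, m
e      = 0.2056        # eccentricity
orbits_per_century = 100 * 365.25 / 87.97

dphi = 6 * math.pi * GM_sun / (c**2 * a * (1 - e**2))   # radians per orbit
print(dphi * orbits_per_century * 206265)               # ~43 arcsec per century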

As for Mermin's three-way classification scheme: If we
put the three categories at the corners of a triangle,
I sit in the middle of the triangle. There are things
we don't understand, but they are things we don't
need to understand. So it's sort of a problem and
sort of not.

In particular, consider EPR "spooky correlations at
a distance". The equations tell us what must happen,
but we don't have a mechanistic explanation for how
it happens. I am not suggesting that everybody drop
everything and worry about this. But IMHO we should
recognize it as an annoyance, in the same category
as Newtonian gravity. Maybe 250 years hence somebody
will figure it out.

To repeat: There are things we don't understand, but
AFAICT they are things we don't need to understand.

=============

Mermin says:
Quantum mechanics describes a physical system entirely
in terms of states.

I disagree with that, too ... although it has no bearing
on any of the major points he makes.

QM did not emerge from classical mechanics (Hamiltonians,
Lagrangians, et cetera). Actually at the beginning it
evolved from thermodynamics. Planck was one of the few
guys on earth smart enough to read and understand
Boltzmann's book.

I would argue that you can't really understand QM in
the absence of temperature.

Classically, if you consider the positions of N particles,
you have N separate descriptions with 3 variables apiece.
QM says we should instead use one wavefunction of all 3N
variables jointly.

All electrons are identical, so if you follow the
advice to just trust the equations, you need to
include every electron in the universe in your one
function of 3N variables, where N is impossibly large.
If and only if you have a notion of temperature, you
can decide that some electrons are so far away that
they cannot possibly matter, resulting in separation
of variables, resulting in a manageable number of
variables that you need to worry about.
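To see that "impossibly large" is not hyperbole:
tabulating one function of 3N variables on even a coarse
grid costs exponentially in N, whereas separated
single-particle functions cost only linearly. A quick
back-of-envelope:

# one function of 3N variables on a grid of 10 points per coordinate,
# versus N separated functions of 3 variables each
grid = 10
for N in (1, 2, 3, 10, 25):
    full      = grid ** (3 * N)
    separated = N * grid ** 3
    print(f"N={N:2d}   full {full:9.2e}   separated {separated}")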

So, rather than fixating on quantum "states" as
the term is normally defined, I suggest a serious
discussion of fundamentals should deal with density
matrices. In particular I have a collection of trick
questions to which smart people give the wrong answer
more-or-less every time, because they think in terms
of plain old states when they shouldn't.
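Here is the flavor of the issue (my example, not one of
the trick questions). Take two spins in the entangled
state (|00> + |11>)/√2 and ask what "state" the first
spin is in, all by itself. Plain-old-state thinking gives
no good answer; the density matrix does:

import numpy as np

# joint state (|00> + |11>)/sqrt(2), stored as psi[a, b]
# with a = first spin, b = second spin
psi = np.zeros((2, 2))
psi[0, 0] = psi[1, 1] = 1 / np.sqrt(2)

# the "state" of the first spin alone: trace out the second spin
rho_a = np.einsum('ab,cb->ac', psi, psi.conj())
print(rho_a)
# [[0.5 0. ]
#  [0.  0.5]]  -- maximally mixed.  No ket, no plain old "state",
# describes the first spin by itself; only the density matrix does.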