OK, this is going to be a *very* long post. About something I don’t pretend to be an expert in. But it *is* science, at least.

A couple of weeks ago, Radio 4’s highbrow “In Our Time” tackled the so-called “Measurement Problem”. That is: quantum mechanics predicts probabilities, not definite outcomes. And yet we see a definite world. Whenever we look, a particle is in a particular place. A cat is either alive or dead, in Schrödinger’s infamous example. So, lots to explain in just setting up the problem, and even more in the various attempts so far to solve it (none quite satisfactory). This is especially difficult because the measurement problem is, I think, unique in physics: quantum mechanics appears to be completely true and experimentally verified, without contradiction so far. And yet it seems incomplete: the “problem” arises because the equations of quantum mechanics only provide a recipe for the calculation of probabilities, but don’t seem to explain what’s going on underneath. For that, we need to add a layer of interpretation on top. Melvyn Bragg had three physicists down to the BBC studios, each with his own idea of what that layer might look like.
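To make “predicts probabilities, not definite outcomes” concrete, here is a minimal toy sketch (my own illustration, not anything from the programme): write a quantum state as a vector of complex amplitudes over possible measurement outcomes, and the Born rule assigns each outcome the probability given by the squared magnitude of its amplitude.

```python
import numpy as np

# A toy two-level system in a superposition: the state is a vector of
# complex amplitudes, one per possible measurement outcome.
state = np.array([1 + 1j, 1 - 1j]) / 2.0  # normalized: |a0|^2 + |a1|^2 = 1

# The Born rule: each outcome's probability is the squared magnitude of
# its amplitude. This is all the formalism hands us -- a recipe for
# probabilities, not a story about what is going on underneath.
probabilities = np.abs(state) ** 2
print(probabilities)  # [0.5 0.5]

# Simulate repeated measurements: each run yields one definite outcome,
# and the relative frequencies track the Born-rule probabilities.
rng = np.random.default_rng(0)
outcomes = rng.choice(len(state), size=10_000, p=probabilities)
print(np.bincount(outcomes) / len(outcomes))  # close to [0.5 0.5]
```

The “measurement problem” is precisely that the last two lines — one definite outcome per run — are put in by hand, not derived from the state itself.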

Unfortunately, the broadcast seemed to me a bit of a shambles: the first long explanation of quantum mechanics, by Basil Hiley of Birkbeck, used the terms “wavefunction” and “linear superposition” without even an attempt at a definition. Things got a bit better as Bragg tried to tease things out, but I can’t imagine the non-physicists left listening got much out of it. Hiley himself worked with David Bohm on one possible solution to the measurement problem, the so-called “Pilot Wave Theory” (another term used a few times without definition), in which quantum mechanics is actually a deterministic theory — the probabilities come about because there is information about the locations and trajectories of particles to which we do not — and in principle cannot — have access.

Roger Penrose proved to be remarkably positivist in his outlook: he didn’t like the other interpretations on offer simply because they make no predictions beyond standard quantum mechanics and are therefore untestable. (Others see this as a selling point for these interpretations, however — there is no contradiction with experiment!) To the extent I understand his position, Penrose himself prefers the idea that quantum mechanics is actually incomplete, and that when it is finally reconciled with General Relativity (in a Theory of Everything or otherwise), we will find that it actually does make specific, testable predictions.

There was a long discussion by Simon Saunders of that sexiest of interpretations of quantum mechanics, the Many Worlds Interpretation. The latest incarnation of Many-Worlds theory is centered around workers in or near Oxford: Saunders himself, David Wallace and most famously David Deutsch. The Many-Worlds interpretation (also known as the Everett Interpretation after its initial proponent) attempts to solve the problem by saying that there is nothing special about measurement at all — the simple equations of quantum mechanics always obtain. In order for this to occur, *all possible outcomes* of any experiment must be actualized: that is, there must be a world for each outcome. But we’re not just talking about outcomes of *science experiments* here, but rather any time that quantum mechanics could have predicted something other than what (seemingly) actually happened. Which is all the time, to all of the particles in the Universe, everywhere. This is, to say the least, “ontologically extravagant”. Moreover, it has always been plagued by at least one fundamental problem: what, exactly, is the status of probability in the many-worlds view? When more than one quantum-mechanical possibility presents itself, each splits into its own world, with a probability related to the aforementioned wavefunction. But what beyond this does it mean for one branch to have a higher probability? The Oxonian many-worlders have tried to use decision theory to reconcile this with the prescriptions of quantum mechanics: from very minimal requirements of rationality alone, can we derive the probability rule? They claim to have done so, and they further claim that their proof only makes sense in the Many-Worlds picture. This is, roughly, because only in the Everett picture is there no “fact of the matter” at all about what actually happens in a quantum outcome — in all other interpretations the very existence of a single actual outcome is enough to scupper the proof.
(I’m not so sure I buy this — surely we are allowed to base rational decisions on only the information at hand, as opposed to all of the information potentially available?)

At bottom, these interpretations of quantum mechanics (aka solutions to the measurement problem) are trying to come to grips with the fact that quantum mechanics seems to be fundamentally about probability, rather than the way things actually are. And, as I’ve discussed elsewhere, time and time again, probability is about our states of knowledge, not the world. But we are justly uncomfortable with 70s-style “Tao-of-Physics” ideas that make silly links between consciousness and the world at large.

But there is an interpretation that takes subjective probability seriously without resorting to the extravagance of many (very, very many) worlds. Chris Fuchs, along with his collaborators Carlton Caves and Ruediger Schack, has pursued this idea with some success. Whereas the many-worlds interpretation requires a universe that seems far too full for me, the Bayesian interpretation is somewhat underdetermined: there is a level of being that is literally unspeakable; there is no information to be had about the quantum realm beyond our experimental results. This is, as Fuchs points out, a very strong restriction on how we can assign probabilities to events in the world. But I admit some dissatisfaction with the explanatory power of the underlying physics at this point (discussed in some technical detail in a review by yet another Oxford philosopher of science, Christopher Timpson).

In both the Bayesian and Many Worlds interpretations (at least in the modern versions of the latter), probability is supposed to be completely subjective, as it should be. But something still seems to be missing: probability assignments are, in fact, testable, using techniques such as Bayesian model selection. What does it mean, in the purely subjective interpretation, to be correct, or at least more correct? Sometimes, this is couched as David Lewis’s “principal principle” (it’s very hard to find a good distillation of this on the web, but here’s a try): there is something out there called “objective chance” and our subjective probabilities are meant to track it. (I am not sure this is coherent; even Lewis himself usually gave the example of a coin toss, in which there is nothing objective at all about the chance involved: if you know the initial conditions of the coin and the way it is flipped and caught, you can predict the outcome with certainty.) But something at least vaguely objective seems to be going on in quantum mechanics: more probable outcomes happen more often, at least for the probability assignments that physicists make given what we know about our experiments. This isn’t quite “objective chance” perhaps, but it’s not clear that there isn’t another layer of physics still to be understood.
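To make “probability assignments are testable” concrete, here is a minimal sketch (the numbers are invented purely for illustration): two rival probability assignments for a binary measurement are compared against observed counts by how well each one predicted the data, via the (log) Bayes factor.

```python
import math

# Two rival probability assignments for a binary quantum measurement
# (hypothetical numbers): assignment A says the "up" outcome has
# probability 0.5, assignment B says 0.7.
p_A, p_B = 0.5, 0.7

# Suppose we run the experiment 100 times and see 52 "up" results.
n, k = 100, 52

def log_likelihood(p, n, k):
    """Log-probability of k successes in n independent trials with rate p."""
    return k * math.log(p) + (n - k) * math.log(1 - p)

# The log Bayes factor compares how well each assignment predicted the
# data; a positive value favors assignment A.
log_bayes_factor = log_likelihood(p_A, n, k) - log_likelihood(p_B, n, k)
print(log_bayes_factor)  # positive: the data favor assignment A
```

In this sense even a “purely subjective” probability can be judged better or worse after the fact, which is exactly the tension the paragraph above is pointing at.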

## 7 responses to “The Measurement Problem”

We are a group that is challenging the current paradigm in physics which is Quantum Mechanics and String Theory. There is a new Theory of Everything Breakthrough. It exposes the flaws in both Quantum Theory and String Theory. Please Help us set the physics community back on the right course and prove that Einstein was right! Visit our site The Theory of Super Relativity: http://www.superrelativity.org

Rationally – who knows. Emotionally – I kinda hope Penrose is right. I once saw him give a wonderful public talk called “Faith, Fashion, and Fantasy in Modern Physics”. That was Quantum Mechanics, Inflation, and String Theory. David Gross was on the same bill, and was visibly cross. Lovely.

For some time now, I’ve been pretty firmly convinced that the many worlds interpretation is the most likely. I know that, back when I was an undergraduate, I vehemently objected to this interpretation on the grounds that it seemed to violate all sorts of things about how we understand the world to behave, most especially every conservation law in existence.

Since then, I’ve felt rather disappointed that we never covered decoherence theory (at least not that I remember) in quantum mechanics class. And I’ve come to realize that once you throw decoherence into the picture, the many worlds interpretation becomes so blatantly apparent that it must, in a sense, be true.

That is to say, if you just take a simple system, such as the two-slit experiment, and consider the superposition interacting with a more complex system, the destruction of the interference pattern necessarily occurs. What this means is that the wave function is now composed of two components that, at least on short time scales, cannot interact with one another. The interaction produces something that mimics wavefunction collapse!
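A toy numerical version of that argument (my own sketch, under a drastically simplified model): give each slit a screen amplitude, entangle each path with an environment state, and note that tracing out the environment multiplies the interference (cross) term by the overlap of the two environment states. Orthogonal environment states then wipe out the fringes.

```python
import numpy as np

# Toy two-slit model: each slit contributes an amplitude at screen
# position x, with a relative phase that varies across the screen.
x = np.linspace(-5, 5, 1001)
psi1 = np.exp(1j * 2.0 * x) / np.sqrt(2)   # path through slit 1
psi2 = np.exp(-1j * 2.0 * x) / np.sqrt(2)  # path through slit 2

def intensity(env_overlap):
    """Screen intensity when each path is entangled with an environment
    state; env_overlap = <E1|E2> measures how distinguishable the paths are."""
    # Tracing out the environment multiplies the interference term by
    # the overlap of the two environment states.
    cross = 2 * np.real(np.conj(psi1) * psi2 * env_overlap)
    return np.abs(psi1) ** 2 + np.abs(psi2) ** 2 + cross

def visibility(I):
    """Standard fringe visibility: (I_max - I_min) / (I_max + I_min)."""
    return (I.max() - I.min()) / (I.max() + I.min())

print(visibility(intensity(1.0)))  # no entanglement: full fringes, ~1
print(visibility(intensity(0.0)))  # orthogonal environment states: no fringes
```

Nothing here collapses: both components of the wavefunction survive, but once the environment states are orthogonal the components can no longer interfere, which is the “mimics collapse” point above.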

After recognizing this, I have always thought it patently absurd to add anything at all to the wave function dynamics of quantum mechanics. As those wave function dynamics alone explain the appearance of collapse, why should we think that there is any other physical process at work?

Sure, taking this view of quantum mechanics leads to some rather unsettling consequences, and it’s not an easy task to determine all of the predicted outcomes that observers who are described by quantum mechanics will observe. But just this schematic example alone was enough to convince me that this seems, by far, the most likely interpretation of quantum mechanics.

Completely agree about the program; I was quite disappointed. I didn’t feel the problem was explained well.

Interesting stuff. Not sure how accessible you wanted this post to be. It rather illustrates how hard it is for anyone, including “In Our Time”, to cover the topic without much compromise while still keeping it widely accessible.

QM really challenges the mind. The core issue for me is: QM gives us equations that make predictions to an unprecedented degree of accuracy. BUT we’ve been pandered to in our expectation that such theories should allow us to comprehend how the world works above and beyond being able to predict it, and alas QM doesn’t play easily on that score.

Either we’ve reached a level where our intuition needs tuning to break the need to go beyond predictive power (try How The Laws of Physics Lie if you want to see what it’s like to wear that hat), or the “meaning” of the theory beyond predictive power is not yet widely understood, or the theory is broken (as all have been in the past) and moreover its replacement will not be so broken.

I have observed that those who understand and comprehend this and other physics problems always have trouble articulating them to those who would be interested in such subjects but cannot grasp the jargon or find enough time to read through the relevant material. If physics had some good PR, people might be able to express and communicate these ideas in a more productive way, informing a wider audience and producing more interest in these ideas and problems, and resulting in more introspective pondering and pursuit of the answers on a much wider scale. With more brains on the problem, an answer (or perhaps more new problems) would statistically be found faster.

“.. quantum mechanics appears to be completely true and experimentally verified, without contradiction so far. And yet it seems incomplete …” One objection to quantum theory is that probability density functions should have some deterministic origin — however, Bell’s theorem suggests that if we want to restore determinism we need to look below the Planck scale. Unfortunately, nature does not seem to allow us to look below the Planck scale. Thus the choice seems to be between a semi-mystical quantum agnosticism and a semi-mystical belief in hidden determinism with alternate universes. I have predicted that the Rañada-Milgrom effect shall revolutionize cosmology within about one year — perhaps I am totally wrong in this prediction. In any case, I recently noticed a numerical coincidence: (proton mass)**2/(electron mass)**2 – 137.035999 * (( -.042294 + 4 pi)**4) = .3425 … Is there some physical reason why (4 pi) to the 4th power should show up here?
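For what it’s worth, the coincidence quoted in that last comment is easy to check numerically (a quick sketch using the CODATA 2018 proton–electron mass ratio; the exact residual one gets depends on which measured value of the ratio is plugged in, so it need not match the quoted .3425 exactly):

```python
import math

# CODATA 2018 proton/electron mass ratio (dimensionless).
mass_ratio = 1836.15267343

lhs = mass_ratio ** 2
rhs = 137.035999 * (-0.042294 + 4 * math.pi) ** 4
print(lhs - rhs)  # a residual of order 0.3, close to the quoted .3425
```

Note that both sides are of order 3.4 million, so a residual of order 0.3 is agreement to roughly one part in ten million — which is also why the result is so sensitive to the input values.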