But when we intervene to make a measurement, we are obliged to abandon these equations. The measurement itself is like a quantum jump. It is ‘discontinuous'. The properties of the measurement outcomes are not closely, smoothly and continuously connected to the initial wavefunction. The only connection between the initial wavefunction and the measurement outcomes is that the squared moduli of the amplitudes of the various components of the former can be used to determine the probabilities for the latter. The measurement is completely indeterministic.

This is the ‘collapse of the wavefunction'.

Von Neumann was also very clear that this collapse or process of ‘projecting' the wavefunction into its final measurement state is not inherent in the equations of quantum theory. It has to be postulated, which is a fancy way of saying that it has to be assumed. Oh, and by the way, there is no experimental or observational evidence for this collapse per se. We just know that we start with a wavefunction which can be expressed as a superposition of the different possible outcomes and we end up with one — and only one — outcome.
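To make the arithmetic concrete, here is a minimal sketch in Python. The amplitudes are illustrative numbers of my own choosing, not anything from the book; the point is simply that the squared moduli fix the probabilities, while the selection of a single outcome has to be put in by hand.

```python
import numpy as np

# A toy two-component superposition: |psi> = a|up> + b|down>.
# Illustrative amplitudes (not from the text); they satisfy |a|^2 + |b|^2 = 1.
a = np.sqrt(0.7)           # amplitude of the 'up' component
b = np.sqrt(0.3) * 1j      # amplitude of the 'down' component (complex phases are allowed)

# The Born rule: the squared modulus of each amplitude gives the
# probability of obtaining that outcome when a measurement is made.
p_up = abs(a) ** 2
p_down = abs(b) ** 2
print(f"P(up)   = {p_up:.2f}")     # 0.70
print(f"P(down) = {p_down:.2f}")   # 0.30

# The collapse itself is not in the equations: to get a single result
# we simply draw one outcome at random with these probabilities.
print("This run's outcome:", np.random.choice(["up", "down"], p=[p_up, p_down]))
```

Run it a few times and the printed outcome changes from run to run; nothing in the amplitudes tells you which result the next measurement will deliver.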

There are, in general, three ways in which we can attempt to get around this assumption. We can try to eliminate it altogether by supplementing quantum theory with an elaborate scheme based on hidden variables. As the experiments described in Chapter 2 amply demonstrate, this scheme has to be very elaborate indeed. We know that hidden variables which reintroduce local reality — variables which establish localized properties and behaviours in an entangled quantum system, for example — are pretty convincingly ruled out by experiments that test Bell's inequality. We also know that the experiments designed to test Leggett's inequality tell us that ‘crypto' non-local hidden variable theories in which we abandon the set-up assumption won't work either.

This leaves us with no choice but to embrace a full-blown non-local hidden variables theory.

Pilot waves

Such theories do exist, the best known being de Broglie—Bohm pilot wave theory, named for French theorist Louis de Broglie and American David Bohm. At great risk of oversimplifying, the de Broglie—Bohm theory assumes that the behaviour of completely localized (and therefore locally real) quantum particles is governed by a non-local field — the ‘pilot wave' — which guides the particles along a path from their initial to their final states.

The particles follow entirely predetermined paths, but the pilot wave is sensitive to the measurement apparatus and its environment. Change the nature of the measurement by changing the orientation of a polarizing filter or opening a second slit, and the pilot wave field changes instantaneously in response. The particles then follow paths dictated by the new pilot wave field.
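For the curious, the guidance rule at the heart of such theories can be sketched in a few lines. The one-dimensional ‘pilot wave' below is a toy of my own construction (two Gaussian packets loosely standing in for a pair of slits), but the velocity rule it applies, v = (ħ/m)·Im(ψ′/ψ), is the standard guidance equation of the theory. Change the wave and the particle's deterministic path changes with it.

```python
import numpy as np

hbar, m = 1.0, 1.0            # natural units, purely for illustration

def psi(x):
    # A toy pilot wave: two Gaussian packets, loosely standing in for 'both slits open'.
    g1 = np.exp(-(x - 2.0) ** 2) * np.exp(1j * 1.5 * x)
    g2 = np.exp(-(x + 2.0) ** 2) * np.exp(-1j * 1.5 * x)
    return g1 + g2

def velocity(x, dx=1e-5):
    # The guidance rule: v = (hbar/m) * Im( psi'(x) / psi(x) ).
    dpsi = (psi(x + dx) - psi(x - dx)) / (2 * dx)
    return (hbar / m) * np.imag(dpsi / psi(x))

# Once the starting point is fixed, the path is completely determined.
x, dt = 0.5, 0.01
for step in range(5):
    x += velocity(x) * dt      # simple Euler step along the deterministic trajectory
    print(f"step {step + 1}: x = {x:.4f}")
```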

The de Broglie—Bohm theory has attracted a small but dedicated group of advocates, but it is not regarded as mainstream physics. To all intents and purposes, we have simply traded the collapse assumption for a bunch of further assumptions. Yes, we have avoided the collapse assumption and regained determinism — the fates of quantum particles are determined entirely by the operation of classical cause-and-effect principles. But we have also gained a pilot wave field which remains responsible for all the ‘spooky' action-at-a-distance. And the end result is a theory that, by definition, predicts precisely the same results as quantum theory itself.

Einstein tended to dismiss this approach as ‘too cheap'.²

Decoherence and the irreversible act of amplification

The second approach is to find ways to supplement quantum theory with a mechanism that makes the collapse rather more explicit. In this approach we recognize a basic, unassailable fact about nature — the quantum world of atomic and subatomic particles and the classical world of experience are fundamentally different. At some point we must cross a threshold; we must cross the point at which all the quantum weirdness — the superpositions, the phantom-like ‘here' and ‘there' behaviour — disappears. In the process of being amplified to scales that we can directly perceive, the superpositions are eliminated and the phantoms banished, and we finish up with completely separate and non-interacting states of ‘here' or ‘there'.

Is it therefore possible to arrange it so that Schrödinger's cat is never both alive and dead? Can we fix it so that the quantum superposition is collapsed and separated into non-interacting measurement outcomes long before it can be scaled up to cat-sized dimensions?

The simple truth is that we gain information about the microscopic quantum world only when we can amplify elementary quantum events and turn them into perceptible macroscopic signals, such as the deflection of a pointer against a scale. We never (but never) see superpositions of pointers (or cats). It stands to reason that the process of amplification must kill off this kind of behaviour before it gets to perceptible levels.

The physicist Dieter Zeh was one of the first to note that the interaction of a quantum wavefunction with a classical measuring apparatus and its environment will lead to rapid, irreversible decoupling or ‘dephasing' of the components in a superposition, such that any interference terms are destroyed.

Each state will now produce macroscopically correlated states: different images on the retina, different events in the brain, and different reactions of the observer. The different components represent two completely decoupled worlds. This decoupling describes exactly the [‘collapse of the wavefunction']. As the ‘other' component cannot be observed any more, it serves only to save the consistency of quantum theory.³

But why would this happen? In the process of amplification, the various components of the wavefunction become strongly coupled to the innumerable quantum states of the measuring apparatus and its environment. This coupling selects components that we will eventually recognize as measurement outcomes, and suppresses the interference. The process is referred to as decoherence.

We can think of decoherence as acting like a kind of quantum ‘friction', but on a much faster timescale than classical friction. It recognizes that a wavefunction consisting of a superposition of different components is an extremely fragile thing.* Interactions with just a few photons or atoms can quickly result in a loss of phase coherence that we identify as a ‘collapse'. This is fast, but it is not instantaneous.
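A rough way to picture this numerically: the interference terms live in the off-diagonal entries of the system's density matrix, and decoherence damps them, schematically, as an exponential decay on the decoherence timescale. The sketch below is my own illustration, using the order-of-magnitude figure quoted in the next paragraph for a molecule in air, just to show how violently fast the suppression is.

```python
import numpy as np

# Schematic only: the density matrix of a two-state ('here'/'there') superposition.
# Decoherence damps the off-diagonal interference terms as exp(-t / tau_d).
tau_d = 1e-30   # decoherence time in seconds (the in-air molecule figure quoted below)

def rho(t):
    coherence = 0.5 * np.exp(-t / tau_d)   # the 'interference' entries
    return np.array([[0.5, coherence],
                     [coherence, 0.5]])

for t in [0.0, tau_d, 10 * tau_d, 100 * tau_d]:
    print(f"t = {t:8.1e} s   interference term = {rho(t)[0, 1]:.3e}")
```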

For example, it has been estimated that a large molecule with a radius of about a millionth (10⁻⁶) of a centimetre moving through the air has a ‘decoherence time' of the order of 10⁻³⁰ seconds, meaning that the molecule is localized within an unimaginably short time and behaves to all intents and purposes as a classical object.⁴ If we remove the air and observe the molecule in a vacuum, the estimated decoherence time increases to one hundredth of a femtosecond (10⁻¹⁷ seconds), which is getting large enough to be at least imaginable. Placing the molecule in intergalactic space, where it is exposed only to interactions with the cosmic microwave background radiation, increases the estimated decoherence time to 10¹² seconds, meaning that a molecule formed in a quantum superposition state would remain in this state for a little under 32,000 years.

In contrast, a dust particle with a radius of a thousandth of a centimetre — a thousand times larger than the molecule — has a decoherence time in intergalactic space of about a microsecond (10⁻⁶ seconds). So, even where the possibility of interactions with the environment is reduced to its lowest, the dust particle will behave largely as a classical object.
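The unit conversion behind these comparisons is easy to check. The lines below simply tabulate the four quoted estimates and convert the intergalactic molecule's 10¹² seconds into years (assuming roughly 3.156 × 10⁷ seconds per year), which is where the figure of a little under 32,000 years comes from.

```python
# The four decoherence-time estimates quoted above, in seconds.
estimates = {
    "large molecule in air":          1e-30,
    "large molecule in a vacuum":     1e-17,
    "large molecule, intergalactic":  1e12,
    "dust particle, intergalactic":   1e-6,
}

for label, seconds in estimates.items():
    print(f"{label:32s} {seconds:10.0e} s")

# Converting the intergalactic molecule's estimate into years,
# assuming roughly 3.156e7 seconds per year:
years = estimates["large molecule, intergalactic"] / 3.156e7
print(f"10^12 seconds is about {years:,.0f} years")   # ~31,700, a little under 32,000
```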

The kinds of timescales over which decoherence is expected to occur for any meaningful example of a quantum system interacting with a classical measuring device suggest that it will be impossible to catch the system in the act of losing coherence. This all seems very reasonable, but we should remember that decoherence is an assumption: we have no direct observational evidence that it happens.

But does this really solve the measurement problem?

Decoherence eliminates the potentially embarrassing interference terms in a superposition. We are left with separate, non-interacting states that are statistical mixtures — different proportions of states that are ‘up' or ‘down', ‘here' or ‘there', ‘alive' or ‘dead'. We lose all the curious juxtapositions of the different possible outcomes (blends of ‘up' and ‘down', etc.). But decoherence provides no explanation for why this specific measurement should give that specific outcome. As John Bell has argued:

The idea that elimination of coherence, in one way or another, implies the replacement of ‘and' by ‘or', is a very common one among solvers of the ‘measurement problem'. It has always puzzled me.⁵

This is sometimes referred to as the ‘problem of objectification'. Decoherence theory can eliminate all the superpositions and the interference, but we are still left to deal with quantum probability. We have no mechanism for determining which of the various outcomes — ‘up'/'down', ‘here'/'there', ‘alive'/'dead' — we are actually going to get in the next measurement.
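One way to see the gap Bell is pointing at is to write the ‘alive'/'dead' state as a 2 × 2 density matrix, as in the textbook toy case sketched below (my own illustration). Decoherence takes the superposition, with its off-diagonal terms, to the diagonal mixture, but the mixture is still just a table of probabilities; picking the next result still requires a random draw that the formalism itself does not supply.

```python
import numpy as np

# Textbook toy case: 'alive'/'dead' as a two-state system.
rho_superposition = np.array([[0.5, 0.5],
                              [0.5, 0.5]])   # 'alive' AND 'dead', interference present
rho_mixture       = np.array([[0.5, 0.0],
                              [0.0, 0.5]])   # 'alive' OR 'dead', interference gone

print("before decoherence:\n", rho_superposition)
print("after decoherence:\n", rho_mixture)

# Either way, the diagonal only supplies probabilities; selecting the next
# outcome still takes a random draw that the theory itself does not provide.
print("next result (by fiat, not by mechanism):",
      np.random.choice(["alive", "dead"], p=np.diag(rho_mixture)))
```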

There are other theories that seek to make the collapse of the wavefunction explicit. But, of course, they all typically involve the replacement of the collapse assumption with a bunch of other assumptions which similarly have no basis in observation or experiment.

This leaves us with one last resort.

Everett's ‘relative state' formulation of quantum theory

The third approach is to turn the quantum measurement problem completely on its head. If there is nothing in the structure of quantum theory to suggest that the collapse of the wavefunction actually happens, then why not simply leave it out? Ah, I hear you cry. We did that already and it led us to non-local hidden variables.

But there is another way of doing this that is astonishing in its simplicity and audacity. Let's do away with the collapse assumption and put our trust purely in quantum theory's deterministic equations. Let's not add anything.

Okay, I sense your confusion. If we don't add anything, then how can we possibly get from a smoothly and continuously evolving superposition of measurement possibilities to just one — and only one — measurement outcome? Easy. We note that as observers in this universe we detect just one — and only one — outcome. We assume that at the moment of measurement the universe splits into two separate, non-interacting ‘branches'. In this branch of the universe you observe the result ‘up', and you write this down in your laboratory notebook. But in another branch of the universe another you observes the result ‘down'.

In one branch we run to fetch a bowl of milk for Schrödinger's very much alive and kicking cat. In another branch we ponder what to do with this very dead cat we found in a box. All the different measurement possibilities inherent in the wavefunction are actually realized.
But they're realized in different branches of the universe.
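If it helps, the bookkeeping can be caricatured in a few lines: drop the collapse, keep every component, and let each measurement multiply the list of branches. This is only a toy illustration of the ‘nothing added, nothing thrown away' idea, with invented amplitudes, not Everett's actual formalism.

```python
# Each branch carries the outcomes recorded in it and the amplitude of that component.
branches = [{"history": [], "amplitude": 1.0}]

def measure(branches, outcomes_and_amplitudes):
    # Every existing branch splits once per possible outcome;
    # amplitudes multiply, and nothing is ever thrown away.
    new_branches = []
    for b in branches:
        for outcome, amp in outcomes_and_amplitudes:
            new_branches.append({"history": b["history"] + [outcome],
                                 "amplitude": b["amplitude"] * amp})
    return new_branches

# Two successive spin measurements, each with 'up'/'down' components:
for _ in range(2):
    branches = measure(branches, [("up", 0.7 ** 0.5), ("down", 0.3 ** 0.5)])

for b in branches:
    print(b["history"], "weight:", round(abs(b["amplitude"]) ** 2, 2))
```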

As Swedish-American theorist Max Tegmark explained in the BBC Horizon programme mentioned in the Preface:

I'm here right now but there are many, many different Maxes in parallel universes doing completely different things. Some branched off from this universe very recently and might look exactly the same except they put on a different shirt. Other Maxes may have never moved to the US in the first place or never been born.⁶

I have always found it really rather incredible that the sheer stubbornness of the measurement problem could lead us here, to Hugh Everett III's ‘relative state' formulation of quantum theory.

Everett was one of John Wheeler's graduate students at Princeton University. He began working on what was to become his ‘relative state' theory in 1954, though it was to have a rather tortured birth. The theory was born, ‘after a slosh or two of sherry',⁷ out of a complete rejection of the Copenhagen interpretation and its insistence on a boundary between the microscopic quantum world and the classical macroscopic world of measurement (the world of pointers and cats).

The problem was that Wheeler revered Niels Bohr and regarded him as his mentor (as did many younger physicists who had passed through Bohr's Institute for Theoretical Physics in Copenhagen in their formative years). Wheeler was excited by Everett's work and encouraged him to submit it as a PhD thesis. But he insisted that Everett tone down his language, eliminating his anti-Copenhagen rhetoric and all talk of a ‘splitting' or ‘branching' universe.

Everett was reluctant, but did as he was told. He was awarded his doctorate in 1957 and summarized his ideas in an article heavily influenced by Wheeler which was published later that year. Wheeler published a companion article in the same journal extolling the virtues of Everett's approach.

It made little difference. Bohr and his colleagues in Copenhagen accused Everett of using inappropriate language. Everett visited Bohr in 1959 in an attempt to move the debate forward, but neither man was prepared to change his position. By this time the disillusioned Everett had in any case left academia to join the Pentagon's Weapons Systems Evaluation Group. He went on to become a multimillionaire.

Despite Wheeler's attempts to massage the Everett theory into some form of acceptability, there was no escaping the theory's implications, nor its foundation in metaphysics. It seemed like the work of a crackpot. Wheeler ultimately came to reject it, declaring: ‘… its infinitely many unobservable worlds make a heavy load of metaphysical baggage'.⁸

Many worlds

And so Everett's interpretation of quantum theory languished for a decade. But in the early 1970s it would be resurrected and given a whole new lease of life.
