The first such experiment, performed by the
eminent physicist Madame Chien-Shiung Wu, along with her collaborator
Ernest Ambler and his colleagues at the National Bureau of Standards,
involved nothing other than a careful observation of
the decay of neutrons in the radioactive nucleus cobalt 60.
Neutrons behave as if they are spinning, and if one cools down
neutrons in nuclei to a very low temperature and puts them in a
magnetic field, one can arrange to have most of their spin axes
pointing in the same direction. When this was done for neutrons in
cobalt 60, Wu and collaborators observed an angular asymmetry in
the distribution of the electrons that were emitted in the decay of
the neutron: More electrons were produced heading in one direction
than another. With respect to the neutron spin axis, nature favored
left over right. Within weeks, Leon Lederman and colleagues, also
at Columbia, observed the weak decays of the recently discovered
pions and muons and obtained a similar result. Both experiments
reported that the left–right asymmetry associated with weak decays
was not small. Not only did nature, through weak interactions,
provide a way to distinguish right from left, but it produced the
maximal possible distinction. No longer could knowledgeable
scientists look into the mirror and wistfully imagine a world
behind the looking glass identical to their own. Like Alice, they
would find that the rules in this new world were in fact different
than the rules in their own world.
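As a rough guide to what was measured (the parametrization below is the standard textbook one, not a formula taken from this book), the rate of electron emission at an angle θ to the nuclear spin direction takes the form

\[
W(\theta) \;\propto\; 1 + a\,\frac{v}{c}\cos\theta ,
\]

where v is the electron's speed and the coefficient a would have to vanish if nature respected mirror symmetry, since a reflection reverses the electron's direction of motion but not the sense of the nuclear spin. For cobalt 60 the measured asymmetry was found to be close to its largest possible magnitude.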

The astounding significance of this totally
unexpected prediction of Lee and Yang’s is perhaps best reflected in
the fact that they were awarded the Nobel Prize in 1957, only a
year, almost to the day, from the date their paper first appeared in
print. Indeed, the surprise was so great that it was realized only
after the fact that the violation of parity had actually been
observed experimentally as early as 1928, even before the discovery
of the neutron, in the experiments of R. T. Cox, who
measured the scattering of electrons from the decay of radium and
who detected a different scattering rate in one direction than
another. His contemporaries, however, discounted his results.
Sometimes, alas, it doesn’t pay to be too far ahead of one’s
time.

The newfound complexity of the elementary
particle world was both a mystery and a challenge. It also
completely changed the framework for thinking about unification of
forces in nature, especially along the lines of Kaluza and Klein’s
extra-dimensional arguments. If electromagnetism and gravity were
not the only forces in nature, and if a host of new objects and
strange new forces played a fundamental role, then treating
electromagnetism as a residue of a purely gravitational, and thus
geometric, phenomenon in higher unobserved dimensions would no
longer suffice. What is surprising is that the attempt to address
the mysteries brought on by these new complexities provided a
completely independent impetus to consider extra dimensions.

CHAPTER 11
OUT OF CHAOS . . .

The day will perhaps come when physicists will no longer concern
themselves with questions which are inaccessible to positive methods,
and will leave them to the metaphysicians. That day has not yet come;
man does not so easily resign himself to remaining forever ignorant
of the causes of things.

—Henri Poincaré, Science and Hypothesis

The startling revelations about nature discovered through cosmic
rays stepped up
in pace once accelerators came online, as the number and complexity
of the particles produced by colliding high-energy beams on targets
continued to multiply. Physics had proceeded up to that point with
the presumption, generally supported by experiment, that as one
probed to smaller and smaller scales the apparent complexity of the
world was reduced, with increasing simplicity and economy of ideas
prevailing. But this new data suggested precisely the opposite. The
subatomic world appeared to be proliferating endlessly.

Two questions then naturally arose in the
particle physics community: (1) Was there anything fundamental at
all about any, if not all, of these particles? and (2) Would they
continue to proliferate indefinitely?

By the early 1960s these concerns had given
rise to several drastic proposals. One that became particularly
fashionable had a certain Zen quality about it, and was for a while
the dominant fad in particle theory. It was originated by physicist
Geoffrey Chew at Berkeley, then the center for much of elementary
particle research.

The central idea of his “bootstrap” model was
that perhaps all elementary particles, and at the same time none of
them, are fundamental. Put another way, perhaps all elementary
particles could be viewed as being made up of appropriate
combinations of other particle states. It is like imagining, for
example, that combinations of the three colors red, blue, and green
could make up all other colors, including themselves . . . so that
red combined with blue might make green, and green combined with
blue might make red. In such a case (unlike in the real world,
where these three colors can be considered fundamental), the choice
of which colors one considered fundamental, and which ones
composite, would clearly be arbitrary. If you’re bothered by this
kind of circular thinking—oddly reminiscent of the famous
“Ouroboros,” the snake from Indian philosophy whose head devours its
own tail, ultimately disappearing completely—don’t be too dismayed.
Remember that the quantum mechanical world is full of apparent
classical paradoxes, most of which reflect the fact that our
classical notions fail to capture what are truly the essential
concepts. Ultimately what the bootstrap model suggested was that
perhaps particles themselves, which seem so fundamental to us, are
really not the important objects to focus on, but instead are
merely different reflections of some other, more basic
quantities.

Perhaps instead, it was suggested, the
quantities that one should concentrate on were simply the
mathematical relations between the different configurations that
could be obtained by scattering particles off one another. The laws
of quantum mechanics and relativity provide many elegant
constraints on these mathematical relations, independent of the
specific particles involved. Since what one actually measures in a
laboratory are the processes of interactions and scattering, maybe
everything that could be experimentally measured could be derived
from the mathematical relationships that described the scattering
of particles, and not from the classification of the properties of
the particles themselves. I am probably not doing justice to the
bootstrap model, as it has since been confined to the dustbin of
history. It is thus tempting to dismiss all of the work done during
this period as merely a diversion, but that would not be fair.
Concentration on the mathematical properties of so-called
scattering amplitudes did reveal many illuminating and unexpected
relations between states in the theory and the mechanisms for
transformations between them. One of the realizations that arose
out of this kind of analysis was a particularly disturbing one. As
more and more strongly interacting particle states were discovered,
an interesting relation emerged between the masses of
particles and their “spin.” Recall that many elementary particles
behave as if they are spinning, and thus have an “angular momentum”
similar to that of a gyroscope, which remains aligned in a certain
direction and can precess about that direction and so on. The
faster a top spins, the more energy it possesses, and the larger
its angular momentum. Thus, it was perhaps not too surprising to
find that strongly interacting elementary particles with higher spin
angular momentum tended to be heavier than their lower-spin
counterparts. What was notable, however, was the roughly linear
relation between the square of particle masses and their spins that
was discovered.
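Written as a formula (one the text itself does not display), this pattern, known as a Regge trajectory, reads

\[
J \;\approx\; \alpha_0 + \alpha'\,M^{2} ,
\]

where J is the spin of a particle, M is its mass, α₀ is an offset, and the slope α′ turned out to be roughly the same for the different families of strongly interacting particles.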

In particular, it was tempting to predict that
more and more new heavy states would be discovered as one attempted
to produce states of higher and higher spin. Indeed, this
prediction was verified as far out as it could be tested, so there
was no reason to believe it would not carry on indefinitely. There
is a problem with this suggestion, however. If one applies the
rules of quantum mechanics and relativity to calculate the
scattering rates when one causes fundamental particles of higher
and higher spin to collide, these rates become very large as the
energy of scattering increases—much larger, in fact, than the
behavior observed in actual particle-scattering experiments.
Considerations of the mathematical relations associated with
scattering rates, however, offered a possible way out of this
dilemma. It turned out that while the calculated rates for
individual scattering processes involving the exchange of a
specified number of intermediate particles of a fixed spin might grow
large, it was just possible that if there were instead an infinite
number of possible intermediate states and if the total scattering
rate was determined by summing up over this infinite number of
possibilities, then it might just be that the infinite sum could be
better behaved than any of the individual terms.
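To make the dilemma slightly more concrete (in schematic notation that goes beyond the text), the quantum mechanical amplitude for two particles to scatter by exchanging a single particle of spin J and mass M_J behaves at large collision energies roughly as

\[
A(s,t) \;\sim\; \frac{g^{2}\, s^{\,J}}{t - M_J^{2}} ,
\]

where s is the square of the total collision energy; for spins greater than one this grows too fast to be consistent with sensible probabilities, whereas an infinite tower of such exchanges, summed with the right relative weights, can remain under control.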

I know this must sound weird in the extreme.
First, how could an infinite number of particles be involved in some
specific scattering process?

This is made possible, however, by the
uncertainty principle. Remember that quantum mechanics allows for
the existence of virtual particles that can spontaneously appear
and disappear over short time intervals. If the interaction time is
short enough, it turns out that an arbitrarily large number of
virtual particles can be exchanged between the external particles
undergoing a collision, with the heavier particles existing for
progressively shorter times.
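The estimate at work here is the energy–time form of the uncertainty principle,

\[
\Delta E\,\Delta t \;\gtrsim\; \hbar ,
\]

so that a virtual intermediate state carrying a large energy ΔE can contribute only over a correspondingly short time Δt, which is why the heavier members of the infinite tower matter only fleetingly.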

The second weirdness is worse, however. How
could an infinite sum of terms end up being smaller than the
magnitude of the individual terms in the sum? Let’s warm up with a
simple example. Imagine the individual terms in a sum alternate in
sign—something like 1 − 1⁄2 + 1⁄3 − 1⁄4 and so on. In this case the
sum of this series seems to be clearly less than 1. Namely, the sum
of the first two terms is 1⁄2, the sum of the first three is 5⁄6, the
sum of the first four is 7⁄12, and so on. (Try adding more and more
terms.) But it turns out that infinite sums behave even more
strangely. Indeed, the mathematics of infinite sums is quite
fascinating and unintuitive, based as it is on the properties of
infinity itself.
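For completeness (this closed form is a standard result of calculus, not something derived in the text), the particular alternating series above does settle down to a definite value below one:

\[
1 - \tfrac{1}{2} + \tfrac{1}{3} - \tfrac{1}{4} + \cdots \;=\; \ln 2 \;\approx\; 0.693 .
\]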

To get an idea about how the normal rules of
addition and subtraction can become meaningless when one is
considering infinite quantities, my favorite tool involves something
called Hilbert’s hotel, named after the famous mathematician David
Hilbert, who was one of the pioneers in studying the properties of
numbers, and whom I referred to earlier in the context of the
development of general relativity.

Hilbert’s hotel is not like a normal hotel,
because it has an infinite number of rooms. Other than being rather
large, you might think it would not be qualitatively different from
normal hotels, but you would be wrong. For example, say that one
evening Hilbert’s hotel has every room occupied. In a normal hotel
the manager would put up a NO VACANCY sign, but not so in this
case. Say a weary traveler comes in with his family and asks for a
room. The owner would happily reply that every room was now
occupied, but if the traveler just waited a bit, a room would be
available shortly. How would this be possible? Simple. Just take
the family from room 1 and put them in room 2, the family from room
2 and put them in room 3, and so on. Since there are an infinite
number of rooms, everyone gets accommodated, but now room 1 is
vacant, and free for the new traveler.
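In the language of maps (a notation the text does not use), this first trick simply sends the occupant of room n to room n + 1; the variation described next sends the occupant of room n to room 2n:

\[
n \mapsto n + 1 , \qquad\qquad n \mapsto 2n .
\]

Either way, every guest still ends up with a room of his or her own, yet rooms are freed up.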

Say that the new traveler arrives not merely
with his family, but with his friends, as well. Because he is a
very popular fellow, he brings an infinite number of friends along,
each of whom wants his or her own room. No problem. The manager
takes the family from room 1 and puts them in room 2, the family
from room 2 and puts them in room 4, the family from room 3 and
puts them in room 6, and so on. Now only the even-numbered rooms are
occupied, and there are an infinite number of odd-numbered rooms
vacant to accommodate the new travelers. As these examples
demonstrate, adding up infinite numbers of things is a confusing
process, but mathematicians have developed rules that allow one to
do so consistently. In performing such operations, however, one can
find not only that the sum of an infinite series may be smaller than
some of the individual terms, but that the sum of an infinite series
can be smaller than every single term.
Moreover, this can be the case not only
for series with alternating sign terms, but for series in which
every term is positive. Perhaps the most important example of this,
and one of great relevance for much of the physics that follows, is
the following: When considered using appropriate mathematical tools
developed to handle infinite series, the sum of the series 1 + 2 + 3
+ 4 + 5 + . . . can be shown to not equal infinity, but rather
−1⁄12!
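The “appropriate mathematical tools” here go by the name of zeta-function regularization. As a sketch: one defines

\[
\zeta(s) \;=\; \sum_{n=1}^{\infty} \frac{1}{n^{s}} ,
\]

a sum that converges only when s is greater than one, but the function it defines can be smoothly continued to other values of s, and the continued value ζ(−1) = −1⁄12 is what gets assigned to the series 1 + 2 + 3 + 4 + · · · .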

Now, in a similar vein, using similar
mathematical tools, it was recognized by those who studied the
mathematical relations associated with the scattering of strongly
interacting particles that, if a very specific relation called
“duality” (which I shall describe in more detail shortly) exists
between all of the particles in the theory, then it is possible to
write the total scattering rate as an infinite sum of individual
contributions, each of which might blow up as the energy of the
scattering particles increased, but the sum of which would instead
add up to a finite number.

In 1968 the physicist Gabriele Veneziano
postulated a precise formula for the scattering of strongly
interacting particles that had exactly the required duality
properties. It was, one should emphasize, a purely mathematical
postulate, with at most marginal physical or
experimental support. Nevertheless, the fact that it appeared to
possibly resolve a conundrum that had been plaguing particle
physics meant that many physicists started following up on
Veneziano’s ideas. It was soon discovered that Veneziano’s purely
mathematical “dual model” actually did have a physical framework
through a theory not of point particles, but of “relativistic
strings” (i.e., extended one-dimensional objects moving at near
light-speed). Specifically, if the fundamental objects that
interacted and scattered were not zero-dimensional pointlike
objects, but rather one-dimensional stringlike objects, then one
could show that the particular mathematical miracles associated
with duality could naturally and automatically result.
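For reference (the book does not display it), Veneziano’s formula can be written using Euler’s gamma function as

\[
A(s,t) \;=\; \frac{\Gamma\!\big(-\alpha(s)\big)\,\Gamma\!\big(-\alpha(t)\big)}{\Gamma\!\big(-\alpha(s)-\alpha(t)\big)} ,
\qquad \alpha(x) = \alpha_0 + \alpha' x ,
\]

where s and t encode the collision energy and scattering angle. Its manifest symmetry under swapping s and t is precisely the “duality” property described above, and the linear function α(x) is the Regge trajectory met earlier.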

Faced with the prospect of an embarrassing
plethora of new particle states and also of what appeared to be an
otherwise mathematically untenable theory based on that
old-fashioned idea that the fundamental quantum mechanical
excitations in nature are manifested as elementary particles, many
physicists felt that the strong interaction had to be, at its
foundation, a theory of strings.

This may all sound a bit too fantastic to be
true, and those of you who are old enough to have followed popular
science ideas in the 1960s and ’70s may wonder why you never heard
tell of strings. The answer is simple: It was
too fantastic to be true. Almost as soon as dual
string models were developed, a number of even more embarrassing
problems arose, both theoretical and experimental. The theoretical
problem was, as we physicists like to say, “highly nontrivial”: It
turns out that when one examines the specific mathematical miracle
associated with the infinite sums that duality is supposed to
provide, there is a slight hitch. The sums are supposed to produce
formulae for describing the scattering of objects one measures in
the laboratory. Now there is one simple rule that governs a
sensible universe: If one considers all of the possible outcomes of
an experiment and then conducts the experiment, one is guaranteed
that one of the outcomes will actually happen. This property, which
we call unitarity, really arises from the laws of probability:
namely, that the sum of the probabilities of all possible outcomes
of any experiment is precisely unity. With dual string models,
however, it turned out that the infinite sums in question do not, in
general, respect unitarity. Put another way, they predict that
sometimes when you perform an experiment, none of the allowed
outcomes of the experiment will actually occur.
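In equations, unitarity is the requirement that the scattering operator S satisfy

\[
S^{\dagger} S = 1 ,
\qquad\text{equivalently}\qquad
\sum_{f} \big|\langle f\,|\,S\,|\,i\rangle\big|^{2} = 1 ,
\]

so that the probabilities of all possible final states f reachable from a given initial state i add up to exactly one.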

Thankfully, however, there turned out to be an
explicit mathematical solution to this dilemma, which
will be far from obvious upon first reading it, but here goes: If
the fundamental objects in the theory, relativistic strings, lived
not in a four-dimensional world but in a twenty-six-dimensional
world, then unitarity (i.e., sensible probabilities) could be
preserved.
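Where the twenty-six comes from is worth a hedged sketch (a standard result, though not one derived in this book): the zero-point energies of a string’s vibrational modes involve exactly the regularized sum 1 + 2 + 3 + · · · = −1⁄12 encountered above, once for each of the D − 2 directions transverse to the string, so that the mass of the Nth level of an open string obeys

\[
\alpha' M^{2} \;=\; N - \frac{D-2}{24} ,
\]

and the first excited level (N = 1), which must be massless for the theory to respect relativity, forces D − 2 = 24, that is, D = 26.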
