Author: James A. Connor
At the bottom of quantum physics is the discovery that light has the qualities both of a particle and of a wave, a wave/particle duality. In the blackbody work of Max Planck, in the photoelectric-effect studies that won Einstein his Nobel Prize, and in the scattering of photons that Compton observed, light seemed to act like a particle. But anyone who has had high school physics knows that light is subject to refraction and will produce interference patterns like waves on a lake, and in that it acts like a wave. Therefore, we cannot explain light without both pictures. Once we got rid of Christiaan Huygens’s “luminiferous aether,” the undetectable medium that light was supposed to travel through even as it moves through a vacuum, the vacuum championed by Pascal, the nature of light took on two faces.
This is where Werner Heisenberg comes into the picture. During one of the conferences at the Copenhagen Institute, he asked what would happen if we wanted to measure both the position and the momentum of a specific particle. In order to do this, we must be able to “see” the particle so that we can measure it. So, to see it, we have to shine a light on it. The light has a wavelength λ. But in order to see clearly, the smaller the object to be observed, the shorter the wavelength λ will have to be, until, at the level of electrons, we must use gamma rays so energetic that we change the electron by our seeing. In this way, Heisenberg realized (actually following the philosophy of Immanuel Kant), we will never see the electron as it truly is, but only as it is once we’ve changed it by our seeing. This gives an uncertainty to the particle’s position (Δx ≈ λ).
This means that the uncertainty in x caused by our seeing is roughly equal to the wavelength of the light we use to see it. Now, this applies if we see light as a wave. If we see it as a particle, then we can say that the light we use gives up some of its momentum to the particle when it illuminates it. How much it gives up is unknown, and unknowable, because it is immeasurable. So the uncertainty in momentum, Δp, combines with the uncertainty in position according to the following formula, where h is Planck’s constant:

Δx Δp ≈ h
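The arithmetic behind this heuristic (a back-of-the-envelope sketch, not a rigorous derivation) runs as follows. The photon blurs the position by about one wavelength, and, by de Broglie’s relation, it carries momentum of order h/λ, some unknown part of which it kicks into the particle:

Δx ≈ λ
Δp ≈ h/λ
Δx Δp ≈ λ × (h/λ) = h

Notice the trade-off built into the formula: a shorter wavelength sharpens the position but delivers a harder, more uncertain kick to the momentum.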
We must note here that this uncertainty is not something that comes from the inadequacy of our technology or of our methods, but is so in principle. You cannot precisely measure both the position and the momentum of a subatomic particle. If you try to get a more accurate fix on one, the other will go askew, and contrariwise.
Now things really get strange. The Schrödinger equation, which spins off of Heisenberg, demonstrates how the basic nature of things is probabilistic and not classically real. Schrödinger took Heisenberg’s uncertainty principle and generalized it by describing wave functions about whose outcomes, given a large number of outcomes, his equation can make predictions. Most people have heard of his famous thought experiment about the cat placed in a box with a vial of cyanide gas that would break or not break depending upon whether a detector is hit by a particle from the decay of a radioactive atom. The chances of its being hit are fifty-fifty. For Schrödinger, as long as the box remains closed and no one looks inside, the wave function of the particle and the state of the detector remain undecided, and the cat is both alive and dead. It is only when someone opens the box that the wave function collapses and the cat turns up one way or the other. We all cheer for the cat to make it.
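What “predictions over a large number of outcomes” means can be illustrated with a toy simulation (purely illustrative, and not solving any quantum mechanics; the function name is ours): no single opening of the box can be predicted, but the frequency over many openings can be.

    import random

    def open_boxes(trials=100_000):
        # Each trial: the detector fires with probability 1/2
        # and the cat is found dead; otherwise it is found alive.
        alive = sum(random.random() < 0.5 for _ in range(trials))
        return alive / trials

    print(open_boxes())  # settles near 0.5 as trials grow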
Note the term “large number of outcomes” in the last paragraph. This is the part that has a direct bearing on the letters of Pascal and Fermat. In his letter on the problem of expectations, Pascal was aware that as the number of throws of the dice increased, an effect was produced on the expectations that the players could legitimately hold in relation to the game. Later practitioners of the arts of probability have studied this effect and have used it to send boats deep into the continent of this new mathematical world, into the land of big numbers.
One of the things they discovered is the so-called law of averages, or Bernoulli’s law. Many gamblers mistakenly think that this law means that everything will average out, that if over ten tosses of a coin you have six heads and four tails, eventually—over, say, a hundred or a thousand throws—there will be a corrective and the player will begin to get more tails to make up the difference, so that eventually the number of heads and tails will be the same. In other words, in the long run, the chances will even out. But this isn’t how it really works, though it’s a nice enough idea to have its own name, the “gambler’s fallacy,” or the “maturity of chances.” It has lost a lot of people a lot of money.
To understand the way the “law of big numbers” works, you have to understand two things: First, every time you toss a coin or throw a die, the probability of a single outcome is always the same. The probability of getting heads on each toss is fifty-fifty, the probability of getting a six on the toss of a single die is one in six, and the probability of getting double sixes with two dice is one in thirty-six. This means that every toss is a fresh start, and the uncertainty of the outcome is just as great as in the first toss. This takes care of the gambler’s fallacy, because there is no guiding hand making sure that the tosses even out, and at every moment while the coin is spinning, even the powerful mathematics of probability can’t make a prediction of what will happen next.
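A short simulation makes the independence point concrete (a hypothetical sketch with a fair coin; the function name is ours). If the gambler’s fallacy were right, heads would be “due” after a run of five tails; in fact, the next flip is still fifty-fifty.

    import random

    def after_tails_streak(flips=1_000_000, streak=5):
        # Flip a fair coin `flips` times; whenever a run of `streak`
        # tails has just occurred, record the very next outcome.
        run, followers = 0, []
        for _ in range(flips):
            toss = random.choice("HT")
            if run >= streak:
                followers.append(toss)
            run = run + 1 if toss == "T" else 0
        return followers.count("H") / len(followers)

    print(after_tails_streak())  # roughly 0.5, not something higher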
Let us take the example of a coin toss. The “law of averages,” or the “law of big numbers,” says that as the number of times we toss the coin grows, the number of heads we throw will grow closer, in proportion, to half the total number of throws. Therefore, as the number of tosses grows, the probability grows that the percentage of heads (or tails) will get closer to fifty. But even as the percentages converge, the absolute difference between the number of heads and the number of tails can keep growing. So, we flip a coin a hundred times and get sixty heads and forty tails, which is 60 percent to 40 percent, a gap of twenty throws. The gambler’s fallacy would argue that somewhere in there, ten more tails will show up to make up the difference. But you keep flipping and writing down the results. You get to a thousand flips, and the split is now 55 percent to 45 percent; the percentages are closer to even, but the gap in throws is no longer a mere twenty; it is now a hundred. This is the law of big numbers, and odd things not only happen but become commonplace as the numbers get larger.
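Run in code, the same experiment shows both halves of the claim at once (a rough sketch under the usual fair-coin assumption): the percentage of heads closes in on fifty, while the raw gap between heads and tails tends to wander wider, on the order of the square root of the number of throws.

    import random

    def heads_vs_tails(n):
        # Flip a fair coin n times; report the percentage of heads
        # and the absolute gap between the heads and tails counts.
        heads = sum(random.random() < 0.5 for _ in range(n))
        return 100 * heads / n, abs(heads - (n - heads))

    for n in (100, 10_000, 1_000_000):
        pct, gap = heads_vs_tails(n)
        print(n, round(pct, 2), gap)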
Two people meet each other on the streets of New York just after 9/11, and after further discussion find that their grandparents once had a similar encounter on the streets of London after the Blitz. Then they find out that their grandparents’ grandparents had a similar encounter on the streets of Paris after the First World War. What are the odds of that? With a big enough human population, it would be a doddle. And so, in a big enough universe, anything can happen. The only problem for really improbable things is whether the universe is big enough. And with a big enough universe, say some, you don’t need anything else.
At this point in Western intellectual history, the old division between materialism and religion seems unbridgeable. Religious people believe ever more deeply in the mystery that the Greeks first experienced in the glades and forests, the running streams with intimations of divinity haunting the field. As such, they are more directly tied to the long drive upward of the human race, and find ever more creative ways to express the new cosmology in theological terms. The old ways are never forgotten; they just find new ways to get into the papers. The materialists, however, are still holding to the notion that we would be better off without all that supernatural folderol, and so they spin ever more delightful theories to make that happen.
The cultural landscape still seems to be divided into two strategic types—the climbers and the sprawlers. The climbers live vertically, and find ever more wondrous and exotic mysteries in the everyday world, mysteries that somehow intimate God, just as the forests and glades once did. The sprawlers live horizontally. They find satisfaction in the idea of many chances and of big numbers. Anything can happen in the universe if the numbers are big enough. The vastly improbable becomes probable; the uncanny becomes ordinary. Eventually, with big enough numbers, those ten thousand monkeys could type out not only Shakespeare, but Milton, too, and half the Bible. Big numbers explain the coming of life and the evolution of intelligence. Of course, the old rhetorical project lurks behind the big numbers. Why have a God when you can have googolplexes? With big enough numbers, anything can happen.
The ultimate sprawler’s theory is the many-worlds multiverse proposed by Hugh Everett and championed by his adviser, John Archibald Wheeler. We can avoid the whole question of Providence, even in light of the big bang, by inventing a zillion zillion universes, separate from ours, where every conceivable quantum state can have its day. In this way, my dog who was hit by the car is alive in some other universe. The fact that such universes have no more empirical evidence for their existence than does God, and are no more falsifiable than is God’s existence, doesn’t seem to matter. In fact, the whole thing might balance on the edge of Occam’s razor. But which is the simpler explanation? If you have big enough numbers, you don’t need God, and that is the heart of it. But are multiple universes any simpler an explanation than God? It seems finally to come down to choice, perhaps even to the Two Standards: people who believe in God do so because they want to; people who don’t believe don’t because they want to. Almost makes one think of efficacious grace.