God and the Folly of Faith: The Incompatibility of Science and Religion

There is another profound implication of relativity that wreaks havoc with the traditional religious doctrine of creation. An absolute past and future that is the same for all observers cannot be defined. When two events are local (close enough in space and time that a light signal could pass between them), observers in every reference frame will agree on which occurred first. This will be true even though observers moving relative to one another will not agree on the spatial and temporal intervals. They will still agree on what is past and what is future.

However, different observers will not all agree on the time sequence of nonlocal events. Observer 1 might see event A before event B, while observer 2 might see event B before event A. Think of the implication. The two observers differ on which event is in the “past” and which is in the “future.” We already saw in the previous section that no fundamental direction of time can be found in classical or quantum physics. Now we see that there is no universal past and future.

Of course, we and our fellow humans agree on a past and future in our everyday lives. But that is because our relative speeds are low compared to c and so, for the purposes of relativity, we are all in the same reference frame. However, consider an event occurring here on Earth and another occurring in the Andromeda galaxy two million light years away. There is a four-million-year interval—two million years before our present and two million years after—for which we cannot specify past and future for events in Andromeda. Similarly, if we go back 13.7 billion years in our past to the beginning of the universe, and another 13.7 billion years into our future, we cannot distinguish past from future for any event farther than 13.7 billion light years from us.
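The four-million-year window can be checked with a line of arithmetic. The sketch below is illustrative only (the function name is mine, not the book's): for an event a distance d light years away, any event here within d years before or after "now" is spacelike separated from it, so differently moving observers can disagree on which came first.

```python
def ambiguity_window_years(distance_ly):
    """Total span of local time (in years) over which the ordering of an
    event at the given distance is observer-dependent: it runs from
    distance_ly years before our present to distance_ly years after."""
    return 2 * distance_ly

# Andromeda, about 2 million light years away:
andromeda_window = ambiguity_window_years(2_000_000)
print(andromeda_window)  # 4000000 years, the four-million-year interval above
```

The same function gives the 27.4-billion-year window quoted for an event 13.7 billion light years away.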

Now, in principle we can see out to 44 billion light years away, which is called our “light horizon.” The universe is 13.7 billion years old, which means that the light from the farthest object we can see, again in principle, has been traveling for 13.7 billion years. However, in that time the object has moved 44 billion light years away with the expansion of the universe. The upshot is that there are many more events in the universe with which we have no causal contact than events with which we do. And we cannot specify whether any of those events occurred in the absolute past or future since another observer might disagree.
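The 44-billion-light-year figure can be roughly reproduced by integrating the expansion history of the universe. The sketch below uses round illustrative parameters of my own choosing (H0 = 70 km/s/Mpc, matter fraction 0.3, dark-energy fraction 0.7, radiation ignored), not values from the book, and lands in the same ballpark.

```python
import math

H0 = 70.0                                       # Hubble constant, km/s/Mpc
C = 299792.458                                  # speed of light, km/s
HUBBLE_DIST_GLY = (C / H0) * 3.2616 / 1000.0    # c/H0 in billions of light years

def comoving_horizon_gly(omega_m=0.3, omega_l=0.7, steps=200_000):
    """Comoving distance light could have covered since the beginning, in a
    flat matter + dark-energy universe: (c/H0) * integral over the scale
    factor a of da / sqrt(omega_m*a + omega_l*a**4), by the midpoint rule."""
    total = 0.0
    da = 1.0 / steps
    for i in range(steps):
        a = (i + 0.5) * da
        total += da / math.sqrt(omega_m * a + omega_l * a**4)
    return HUBBLE_DIST_GLY * total

print(round(comoving_horizon_gly(), 1))  # roughly 46 Gly with these parameters
```

The exact number depends on the cosmological parameters adopted; the point is that it comes out near three and a half times the naive light-travel distance of 13.7 billion light years.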

GENERAL RELATIVITY

 

When two reference frames are accelerating with respect to each other we must use general relativity, published by Einstein in 1915. If you watch a clock go by that is accelerating with respect to you, it will appear to you to run more slowly.

In general relativity, Einstein also assumed the principle of equivalence, which says that if you are sitting on a tiny particle you can't tell whether it is accelerating or being acted on by a gravitational force. Thus general relativity became a theory of gravity. The equations of general relativity derived by Einstein made several predictions of phenomena, such as the bending of light by the sun and the precession of the perihelion of Mercury, that did not follow from an application of Newtonian gravity. When these were shortly verified,11 Einstein became the famous public figure we all know, who in 1999 was named Time magazine's “Person of the Century.”

The general theory of relativity has replaced Newton's theory of gravity as our current working model of gravity. Newton's theory still applies for most practical applications; however, the global positioning system (GPS) in your car corrects for gravitational time dilation using the equations of general relativity. It would not get you where you want to go if it did not.

 

The reductionist attitude provides a useful filter that saves scientists in all fields from wasting their time on ideas not worth pursuing. In this sense, we are all reductionists now.

—Steven Weinberg1

 

Matter is an illusion. Only consciousness is real.

—Deepak Chopra2

 

THE DEMATERIALIZATION OF MATTER

 

 

Religious apologists and quantum spiritualists have grossly distorted the developments of twentieth-century physics: relativity, quantum mechanics, and relativistic quantum field theory. They want us to believe that these great scientific achievements have demolished the reductionist, materialistic views of the past when, in fact, they have done quite the opposite. Reductionism and materialism are stronger now than they ever were.

William Grassie tells us, “The concept of materialism deconstructed itself with the advent of quantum mechanics.”3 According to University of Notre Dame philosopher Ernan McMullin, twentieth-century physics resulted in the “dematerialization” of matter.4 Theologian Philip Clayton concurs: “Physics in the twentieth century has produced weighty reasons to think that some of the tenets of materialism were mistaken.”5 He adds:

Somewhere near the beginning of the last century, the project of material reduction began to run into increasing difficulties. Special and general relativity, and especially the development of quantum mechanics, represented a series of setbacks to the dreams of reductionist materialism, and perhaps a permanent end to the materialist project in anything like its classical form.6

 

Before we get into the details of these philosophical claims, let us take a look at the actual science.7

THE HISTORY OF QUANTUM MECHANICS

 

We begin with a brief review of the early history of quantum mechanics that should be sufficient to enable us to see how spiritual implications are being wrongly inferred.

The quantum narrative begins in 1900 with Max Planck and his explanation of black-body radiation. A black body is an object that does not reflect light but is a perfect absorber and radiator of light.8 What is observed for a black body at room temperature is a spectrum that peaks at infrared frequencies. The peak moves to the visible region as the object is heated and becomes red-hot. Past the peak, at higher frequencies, the intensity of the light gradually drops to zero. This fall-off is not accounted for in the wave theory of light.

The nineteenth-century wave theory of light had predicted that the electromagnetic radiation from black bodies should go to infinity at high frequencies. This is because, as frequency increases, wavelength decreases, so more and more vibration modes can fit within the body. This “ultraviolet catastrophe” is not observed.

Planck solved the problem by proposing that light comes in discrete bundles called quanta, in which the energy of each bundle is proportional to the frequency of the light. This explained the shift of the spectrum to higher frequencies. A body can contain only so much energy in the motions of its constituent particles. These particles increase in kinetic energy as the temperature of the body increases. The jiggling of charged electrons in the body results in electromagnetic radiation. According to Planck, higher energy means higher frequency radiation. The dropoff at higher frequency occurs because conservation of energy limits how high the body's energy can go.
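The shift of the peak with temperature follows quantitatively from Planck's spectrum, through what is known as Wien's displacement law: the peak frequency is about 2.821 kT/h. A minimal numeric sketch (the temperatures chosen are mine, for illustration):

```python
H = 6.626e-34   # Planck's constant, joule-seconds
K = 1.381e-23   # Boltzmann's constant, joules per kelvin

def peak_frequency_hz(temperature_k):
    """Frequency at which a black body at this temperature radiates most
    strongly, per Wien's displacement law derived from Planck's spectrum."""
    return 2.821 * K * temperature_k / H

room = peak_frequency_hz(300)    # ~1.8e13 Hz: well into the infrared
hot = peak_frequency_hz(5800)    # ~3.4e14 Hz: near the red edge of visible
print(f"{room:.2e} Hz at 300 K, {hot:.2e} Hz at 5800 K")
```

Heating the body by a factor of about twenty moves the peak from the deep infrared to the threshold of visible light, which is why a heated object becomes red-hot.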

In 1905, the same year he published the special theory of relativity, Einstein proposed that Planck's quanta were composed of individual particles that were later dubbed photons. The energy of each photon is equal to the frequency of the light times Planck's constant, h. Planck had determined h from fitting observed black body spectra to his mathematical model. Einstein's proposal, using Planck's value of h, accounted quantitatively for the photoelectric effect, in which the electric current produced when light hits metal depends on the frequency (energy) of the light and not, as expected from wave theory, on the intensity of the light.
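Einstein's photoelectric relation can be sketched in a few lines: the ejected electron's maximum kinetic energy is hf minus the metal's work function, and below the threshold frequency no electrons come out at any intensity. The work function used here (2.28 eV, roughly that of sodium) is an illustrative value I have supplied, not one from the book.

```python
H_EV = 4.136e-15         # Planck's constant in electron-volt seconds
WORK_FUNCTION_EV = 2.28  # illustrative value, approximately sodium's

def max_kinetic_energy_ev(frequency_hz):
    """Maximum kinetic energy of a photoelectron, per Einstein's relation
    E = h*f - W.  Below the threshold frequency W/h, nothing is ejected,
    however intense the light."""
    energy = H_EV * frequency_hz - WORK_FUNCTION_EV
    return max(energy, 0.0)

blue = max_kinetic_energy_ev(7.0e14)  # above threshold: electrons ejected
red = max_kinetic_energy_ev(4.3e14)   # below threshold: no current at all
```

The dependence on frequency rather than intensity is exactly what the wave theory could not explain.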

In the wave theory, light is electromagnetic radiation in the visible and nearby spectral bands, infrared and ultraviolet. The particle nature of light was further verified in 1923 when Arthur Compton showed that X-rays, electromagnetic radiation in the spectral band just above ultraviolet light, decrease in frequency when scattered from electrons. Wave theory predicts that X-rays should not change in frequency, but just reduce in intensity as they lose energy to the electron. In the photon theory, the frequency of the scattered X-rays is lower than that of the incident rays since that frequency is proportional to the energy of the corresponding photons. In all these examples, the photon theory agreed quantitatively with measurements, with the same value of Planck's constant fitting the data in each case.
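The photon picture predicts the size of Compton's shift exactly: the change in wavelength is (h / m_e c)(1 − cos θ), depending only on the scattering angle θ. A short check using standard constants:

```python
import math

H = 6.626e-34    # Planck's constant, joule-seconds
M_E = 9.109e-31  # electron mass, kg
C = 2.998e8      # speed of light, m/s

def compton_shift_m(theta_rad):
    """Increase in an X-ray's wavelength after scattering off an electron
    through angle theta, per the photon theory."""
    return (H / (M_E * C)) * (1 - math.cos(theta_rad))

# At 90 degrees the shift equals the electron's "Compton wavelength":
shift_90 = compton_shift_m(math.pi / 2)  # ~2.43e-12 m
```

Since wavelength increases, frequency (and hence photon energy) decreases, just as Compton measured; the wave theory offers no angle-dependent shift at all.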

The identification of light as being composed of localized particles was a big step in making physics even more reductionist, contrary to the claim of theists and quantum spiritualists that quantum theory has weakened reductionism. The photon theory (along with special relativity) eliminated the holistic ether.

In 1911, Ernest Rutherford proposed a model of the chemical atom patterned after the solar system in which electrons orbit around a nucleus much smaller than the orbits themselves. This model was able to explain why alpha rays from radioactive nuclei scattered at abnormally large angles from gold foil. Rutherford inferred that the alpha rays were bouncing off highly localized, heavy chunks of matter at the center of the gold atoms. The electrons in the atoms were insufficiently massive to produce the observed effect. Rutherford concluded that the atom was mostly empty space, with almost all of its mass concentrated in a tiny nucleus.

In 1913, Niels Bohr applied Rutherford's model to the hydrogen atom, where a single electron orbits a proton. He postulated that the orbital angular momentum of the electron must be a multiple of Planck's constant h divided by 2π, a quantity physicists label as ħ and call the quantum of action. This enabled Bohr to calculate the frequencies of the very sharp spectral lines that were observed when an electric spark was sent through hydrogen gas and heated the gas to high temperature. Again, the wave theory provided no explanation. Interestingly, while Bohr obtained the correct spectrum of hydrogen (to first approximation), his hypothesis was actually wrong and would be improved upon a dozen years later by Heisenberg and Schrödinger (see below). This is not the only example in the history of science where theories that fit the data well turned out to be wrong.9 We have no way of knowing whether our physics models provide us with a reliable picture of ultimate reality. They just fit the data.
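Bohr's quantization rule yields hydrogen energy levels of −13.6 eV / n², and each sharp spectral line corresponds to a jump between two levels. A minimal sketch (the choice of the red Balmer line as the example is mine):

```python
RYDBERG_EV = 13.6  # hydrogen binding energy scale, electron volts
H_EV = 4.136e-15   # Planck's constant, eV-seconds

def energy_level_ev(n):
    """Energy of the nth Bohr orbit of hydrogen, to first approximation."""
    return -RYDBERG_EV / n**2

def line_frequency_hz(n_upper, n_lower):
    """Frequency of the light emitted when the electron drops between orbits:
    the energy difference divided by Planck's constant."""
    return (energy_level_ev(n_upper) - energy_level_ev(n_lower)) / H_EV

# The red Balmer line (n = 3 down to n = 2): about 4.6e14 Hz, visible light.
h_alpha = line_frequency_hz(3, 2)
```

These are the same levels, to first approximation, that Schrödinger's equation later reproduced from very different assumptions.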

While Einstein, following Planck, had shown that light is composed of particles traveling at the speed c, the wavelike behavior of light did not go away. In 1924, Louis-Victor-Pierre-Raymond, 7th duc de Broglie, proposed in his doctoral thesis that objects such as electrons that we are accustomed to calling particles also have wave properties. He noted that the wavelength of a photon—that is, that of the associated electromagnetic wave—equals Planck's constant h divided by the photon's momentum. De Broglie assumed that all particles have the same relationship between their momentum and the wavelength of their associated wave. Clinton Davisson and Lester Germer confirmed de Broglie's hypothesis in 1927 when they observed the wavelike diffraction of electrons in a crystal.
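De Broglie's relation λ = h/p makes the Davisson-Germer result quantitative. For the 54-electron-volt beam they used, the wavelength comes out near 1.7 angstroms, comparable to the atomic spacing in their nickel crystal, which is why diffraction appears at all. A short check:

```python
import math

H = 6.626e-34    # Planck's constant, joule-seconds
M_E = 9.109e-31  # electron mass, kg
EV = 1.602e-19   # joules per electron volt

def de_broglie_wavelength_m(kinetic_energy_ev):
    """De Broglie wavelength of a slow (nonrelativistic) electron:
    lambda = h / p, with p = sqrt(2 * m * E)."""
    momentum = math.sqrt(2 * M_E * kinetic_energy_ev * EV)
    return H / momentum

wavelength = de_broglie_wavelength_m(54)  # ~1.67e-10 m, about 1.7 angstroms
```

For everyday objects the same formula gives wavelengths absurdly smaller than an atom, which is why we never notice the wave nature of matter directly.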

These developments led to what is called the wave-particle duality, in which particles such as electrons and neutrons also behave like waves while waves such as light also behave like particles. The wave-particle duality is behind most of the spiritual claims that are made for quantum mechanics.

Werner Heisenberg, with the help of Max Born and Pascual Jordan, put quantum mechanics on a mathematical basis in 1925 in a formulation using matrix algebra. However, most people who know something about quantum theory are more familiar with the alternative formulation using less advanced mathematics published by Erwin Schrödinger the following year. Schrödinger utilized the calculus of partial differential equations that is covered in the sophomore or junior years by physics, math, and chemistry majors. Thus students in these disciplines are able to apply Schrödinger's theory to the hydrogen atom and derive its energy levels, which remarkably turn out to be the same, to a first approximation, as those derived by Bohr with his far cruder and, as I pointed out, ultimately incompatible model.

Heisenberg's and Schrödinger's theories were shown to be equivalent by Paul Dirac, who developed his own more elegant formulation of quantum mechanics employing linear vector algebra.10 Most physicists today use Dirac's method. Once you can do the math (about junior level), it is amazing how simple quantum mechanics is and how few assumptions are needed to derive its full structure.11 The so-called mysteries of quantum mechanics are in its philosophical interpretation, not in its mathematics.

Heisenberg is most famous for his uncertainty principle, published in 1927, which can be derived from standard quantum mechanics. Here is its simplest form: one cannot simultaneously measure both the momentum and position of a particle with unlimited precision. The product of the uncertainty (standard error) in momentum and the uncertainty in position must be greater than Planck's constant divided by 4π.

This result had the profound effect of demolishing the Newtonian world machine. Recall that Newton's laws imply that everything that happens is determined by prior events. In order to predict the motion of a particle you need to know its initial momentum and position and the forces acting on it. Ordinarily, on the macroscopic scale, this is no problem and is limited only by measuring precision. But on the submicroscopic scale it becomes serious.12 For example, suppose you start out with an electron that has no forces on it but is confined to a volume of empty space equal to the volume of a hydrogen atom. It is a free electron, not bound in an atom. You cannot predict the position of that electron six seconds later within a volume equal to that of Earth.
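The free-electron example can be checked numerically. Confining the electron to roughly atomic size forces, by the uncertainty principle, a minimum spread in its velocity; after six seconds that spread has smeared its possible position over a region of Earth scale. The rough figures below are my own back-of-the-envelope values.

```python
H_BAR = 1.055e-34     # reduced Planck's constant, joule-seconds
M_E = 9.109e-31       # electron mass, kg
ATOM_SIZE = 5.3e-11   # m, roughly the Bohr radius of hydrogen
EARTH_RADIUS = 6.4e6  # m, for comparison

# Minimum momentum spread from delta_x * delta_p >= h-bar / 2, converted
# to a velocity spread: about a million meters per second.
dv = H_BAR / (2 * M_E * ATOM_SIZE)

# After 6 seconds of free flight, the position uncertainty has grown to:
spread_after_6s = dv * 6  # ~6.5e6 m, comparable to Earth's radius
```

So the initial confinement alone, with no forces acting, is enough to defeat any Newtonian-style prediction of where the electron will be.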

To the delight of Christian theologians, this eliminates the Enlightenment deist god as a candidate for the creator. The deist god creates the universe and its laws but relies on determinism to make things come out the way he wants without intervening further. Some theologians think that quantum mechanics and the uncertainty principle open up a way for the Abrahamic God to act on the universe without performing miracles. But they have celebrated too soon. The uncertainty principle opens up the possibility of a different kind of deist god, one who “plays dice,” using chance to let the universe develop its own way. More about this later.

Continuing with our historical review, in 1928 Dirac developed what we now call the Dirac equation, which describes the electron fully relativistically, that is, at all speeds up to near the speed of light. There were two remarkable outcomes of this work. First, Dirac proved that the electron has intrinsic angular momentum, or spin, equal to ½, a fact that had been previously inferred from experiment but did not follow from the nonrelativistic Schrödinger scheme. In quantum mechanics, a particle is either a fermion with half integer spin or a boson with integer or zero spin. The electron is a fermion. The photon is a boson with spin 1.13 Today Dirac's equation is used to describe all spin-½ particles, such as muons, tauons, neutrinos, and quarks.

Second, Dirac showed that, in the scheme of things, the electron should be accompanied by an antielectron, a particle of the same mass and spin but with opposite charge. In 1932, Carl D. Anderson detected the antielectron in cosmic rays. He dubbed it the positron.

In 1948, Richard Feynman produced yet another formulation of quantum mechanics called path integrals that paved the way for the advances in fundamental physics that were to follow over the next thirty years. Feynman also showed that quantum physics was time-reversible, like classical physics. In Feynman's picture, the positron is indistinguishable from an electron going backward in time. This can be applied more generally for any particle and its corresponding antiparticle.

QUANTUM FIELDS

 

When in the early twentieth century light was recognized as being composed of photons, a theory of photons was needed that was consistent with Maxwell's equations of electromagnetism in the classical limit. Basically the electromagnetic field had to be “quantized.” A quantization procedure had already been developed for going from classical equations to quantum mechanics. Essentially, the same equations hold, with observables like momentum and energy represented by mathematical “operators” rather than by simple numbers.14
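The replacement of numbers by operators can be made concrete with a tiny numeric sketch (my illustration, in natural units where ħ = 1): the classical momentum p becomes the operator −iħ d/dx, and applied to a plane wave exp(ikx) it simply returns ħk times the wave, so the plane wave is a state of definite momentum.

```python
import cmath

HBAR = 1.0  # natural units for the illustration
K = 2.5     # wave number of the plane wave

def psi(x):
    """A plane wave exp(i*K*x), the wave of definite momentum HBAR*K."""
    return cmath.exp(1j * K * x)

def p_operator(f, x, h=1e-6):
    """The momentum operator -i*hbar d/dx, applied to a wave function f
    at the point x via a central finite difference."""
    return -1j * HBAR * (f(x + h) - f(x - h)) / (2 * h)

# Dividing the operator's output by the wave recovers the momentum itself:
x0 = 0.7
eigenvalue = p_operator(psi, x0) / psi(x0)  # ~ HBAR * K = 2.5
```

Quantizing the electromagnetic field extends this same move from the position of one particle to the value of the field at every point in space.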
