
Carnot’s work led directly to the formulation of the second law of thermodynamics. As phrased by the British physicist Lord Kelvin and the German physicist Max Planck, it states that an engine operating in a cycle cannot transform heat into work without some other effect on its environment. Thanks to the second law, not only can you not get a free lunch, you can’t even keep your lunch cool in the refrigerator for free. Refrigeration, it turns out, is nothing more complicated than the Carnot engine working in reverse.

 

‘An engine operating in a cycle cannot transfer heat from a cold reservoir to a hot reservoir without some other effect on its environment.’

 

RUDOLF CLAUSIUS

 

In 1850, the German physicist Rudolf Clausius rephrased the second law to read, ‘An engine operating in a cycle cannot transfer heat from a cold reservoir to a hot reservoir without some other effect on its environment.’ A refrigerator, in other words, needs energy put into it. This arises from the natural tendency of heat to flow ‘downhill’: from hot to cold. Keeping the inside of your refrigerator below the temperature of your kitchen involves the same process of expanding and contracting, heating and cooling gases as running your car engine, and it all takes energy. This time, though, you need a compressor rather than an expander for the gas.
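
To put rough numbers on what ‘needs energy put into it’ means, here is a minimal sketch of the ideal case – the Carnot engine run in reverse. The temperatures and the amount of heat to be removed are illustrative assumptions, not figures from this chapter.

```python
# Minimal sketch of an ideal refrigerator (the Carnot engine in reverse).
# Pumping heat Q_c out of a cold box at T_c into a kitchen at T_h needs at
# least W = Q_c * (T_h - T_c) / T_c of work. All values below are assumed
# for illustration only.

T_COLD = 278.0    # fridge interior, kelvin (about 5 C) -- assumed
T_HOT = 293.0     # kitchen air, kelvin (about 20 C) -- assumed
Q_COLD = 1000.0   # heat that must be removed from the cold box, joules -- assumed

cop_ideal = T_COLD / (T_HOT - T_COLD)   # best possible coefficient of performance
work_min = Q_COLD / cop_ideal           # minimum work the compressor must supply
heat_dumped = Q_COLD + work_min         # heat released into the kitchen

print(f"Ideal COP: {cop_ideal:.1f}")
print(f"Minimum work input: {work_min:.0f} J")
print(f"Heat dumped into the kitchen: {heat_dumped:.0f} J")
```

Even this ideal fridge must dump more heat into the kitchen than it removes from its interior; a real one, with friction and leaks, needs considerably more work.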

 
The march of entropy
 

As mentioned, Carnot’s work involved consideration of the pressure, temperature and volume of the gas. The process that Carnot uncovered led to another revelation for physicists: the notion of entropy. The whole universe, it turns out, is spiralling into ever-more disorder. It was Clausius who named this disorder ‘entropy’, a word derived from the Greek for ‘transformation’. In 1865, he wrote a mathematical treatise on the work that the atoms do on one another in a gas. The result, Clausius showed, is that the second law can be expressed in a new way: the entropy, or disorder, of a closed system either stays the same or increases – it never decreases.
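
In modern notation, Clausius’s entropy statement is usually written as an inequality; here it is in standard textbook symbols (the notation is today’s, not the book’s):

```latex
% Clausius's statement in modern form: the entropy change of a system
% receiving heat \delta Q at absolute temperature T satisfies
\[
  \mathrm{d}S \;\ge\; \frac{\delta Q}{T},
\]
% and for an isolated system ('closed', in the book's phrasing), where \delta Q = 0,
\[
  \Delta S \;\ge\; 0 .
\]
```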

 

That doesn’t mean you’ll never see entropy decrease on a small scale. Your lunch inside the refrigerator will get cold, for instance, decreasing the disorder in its constituent molecules. But don’t be fooled into thinking that this breaks the second law of thermodynamics. The inside of your fridge is not a closed system – the molecules of refrigerant gas take the heat away, and their disorder increases as they do. As the heat is transferred to the air in your kitchen, the disorder in your house increases too.
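
A rough bookkeeping example, with assumed temperatures and heat flows (and the temperatures treated as constant), shows how the ledger balances:

```python
# Rough entropy bookkeeping for the lunch-in-the-fridge example.
# All numbers are illustrative assumptions; a real fridge is far from ideal.

T_COLD = 278.0       # fridge interior, kelvin -- assumed
T_HOT = 293.0        # kitchen air, kelvin -- assumed
Q_REMOVED = 1000.0   # heat pulled out of the lunch, joules -- assumed
WORK_IN = 100.0      # electrical work driving the compressor, joules -- assumed

dS_lunch = -Q_REMOVED / T_COLD                # lunch gets more ordered: its entropy falls
dS_kitchen = (Q_REMOVED + WORK_IN) / T_HOT    # kitchen absorbs the heat: its entropy rises

print(f"Entropy change of the lunch:   {dS_lunch:+.2f} J/K")
print(f"Entropy change of the kitchen: {dS_kitchen:+.2f} J/K")
print(f"Total: {dS_lunch + dS_kitchen:+.2f} J/K  (never negative, per the second law)")
```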

 

This kind of thing is happening throughout the universe as the processes of nature unfold. It creates, in physicists’ view, the irreversibility of natural processes: the arrow of time is just another way of expressing the second law of thermodynamics. The wasted energy of Carnot’s engine cycle is the slow unravelling of the universe in microcosm.

 

Together, the first and second laws of thermodynamics put up a brick wall to any claims for the generation of a free lunch. So well proven are they, in fact, that the US Patent Office warns anyone submitting a patent for a perpetual motion machine that they should think carefully; they will most likely lose their money. ‘The views of the Patent Office are in accord with those scientists who have investigated the subject and are to the effect that such devices are physical impossibilities,’ the office’s official statement says. ‘The position of the Office can only be rebutted by a working model…. The Office hesitates to accept fees from applicants who believe they have discovered Perpetual Motion, and deems it only fair to give such applicants a word of warning that fees cannot be recovered after the case has been considered by the Examiner.’ So not only is there no such thing as a free lunch; even looking for one could end up costing you money.

 
IS EVERYTHING ULTIMATELY RANDOM?
 

Uncertainty, quantum reality and the probable role of statistics

 

We could start by turning the question on its head. Is everything predictable? Can we work out the rules that determine how the processes of the universe occur? That would give us extraordinary power over nature, the kind of power humanity has always dreamed of.

 

In many ways, the whole of human existence is wrapped up in this quest. We look at the world around us, and attempt to find regularities and correlations that let us reduce what we see to a set of rules or generalities. These rules enable us to make predictions about the things we might or might not encounter in future, and to adjust our expectations and our movements accordingly. We are, at heart, pattern-seekers.

A facility for pattern-spotting has served us well as a species. It is undoubtedly what enabled us to survive in the savannah. A predator might be camouflaged when still, but as soon as the beast moved, we would spot a change in the patterns in our surroundings, and take evasive action. Roots and berries grow in predictable geographical and temporal patterns (the seasons), enabling us to find and feast on them.

 

The evidence suggests that, because our lives depended on pattern recognition, the evolution of our brains took the process to extremes, forcing us to see patterns even when they are not there. For example, we over-interpreted the rustles of leaves and bushes as evidence for a world of invisible spirits. Modern research suggests this kind of oversensitivity to patterns in our environment has predisposed us to religious conviction; a tendency towards irrational thinking – the consideration of things we can’t touch, see or account for – is the price the human species has paid for its survival.

 

Ironically, though, scientists have only been able to draw conclusions about where irrational thinking comes from because of the mote in their own eye. Scientists are now painfully aware of their tendency to see patterns where there are none, and randomness where there is order. In order to combat this, and to determine whether there is any order, purpose or structure in the world around us, we needed the invention that exposes both how brilliant, and how foolish, the human mind can be. You might know it better as statistics.

 
The die is cast
 

Unlike many of the developments of modern science, statistics had nothing to do with the Greeks. That is remarkable when you consider how much they enjoyed gambling. The Greeks and Romans spent many hours throwing the ancient world’s dice. These were made from astragali, the small six-sided bones found in the heels of sheep and deer. Four of the sides were flat, and these were assigned the numbers. Craftsmen carved the numbers one and six into two opposing faces, and three and four into the other two flat faces. The shapes of the faces on which the numbers sat made one and six around four times less likely to be thrown than three or four.
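
As a rough illustration of how lopsided those odds were, here is a short simulation of an astragalus throw. The face probabilities are assumptions chosen only to match the ‘four times less likely’ figure above, not measured values.

```python
import random

# Toy simulation of an astragalus throw. The face weights are rough
# illustrative estimates (assumptions), consistent with one and six being
# about four times less likely than three or four.
FACES = [1, 6, 3, 4]
WEIGHTS = [0.1, 0.1, 0.4, 0.4]

def throw_astragalus(rng: random.Random) -> int:
    """Return the upward face of one simulated astragalus throw."""
    return rng.choices(FACES, weights=WEIGHTS, k=1)[0]

rng = random.Random(0)
throws = [throw_astragalus(rng) for _ in range(100_000)]
for face in sorted(FACES):
    share = throws.count(face) / len(throws)
    print(f"face {face}: thrown {share:.3f} of the time")
```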

 

An enterprising Greek mathematician, you might think, could have made a fortune in dice games involving astragali. However, there are reasons why no one did. First, the Greeks saw nothing as random chance: everything was part of the hidden plans of the gods. Second, the Greeks were actually rather clumsy with numbers. Greek mathematics was all to do with shape: they excelled at geometry. Dealing with randomness, however, involves arithmetic and algebra, and here the Greeks had limited abilities.

 

The invention of algebra was not the only breakthrough required for getting a grip on randomness. Apparently, it also needed the production of ‘fair’ dice that had an equal probability of landing on any of their six faces; the first probability theorems, which appeared in the 17th century alongside Newton’s celestial mechanics, were almost exclusively concerned with what happens when you roll dice.
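
For a flavour of the questions those early theorems tackled, here is the probability of each total from a pair of fair dice – a generic worked example, not a calculation from the book.

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# The kind of question the earliest probability theorems dealt with:
# for a pair of fair six-sided dice, how likely is each total?
totals = Counter(a + b for a, b in product(range(1, 7), repeat=2))

for total in sorted(totals):
    prob = Fraction(totals[total], 36)   # 36 equally likely ways two fair dice can land
    print(f"total {total:2d}: probability {prob}")
```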

 

‘Chance, that mysterious, much abused word, should be considered only a veil for our ignorance. It is a phantom which exercises the most absolute empire over the common mind’

 

ADOLPHE QUETELET

 

These theorems were predicated on the idea that the dice are fair and, though rather primitive, they laid the foundations for the first attempt to get a handle on whether processes in the natural world could be random. From dice, through coin tosses and card shufflings, we finally got to statistics, probability and the notion of randomness.

 
Probable cause
 

In the 1830s, the Belgian astronomer and mathematician Adolphe Quetelet began to apply probability to human affairs, developing statistical distributions for the physical and moral characteristics of human populations. It was Quetelet who invented the concept of the ‘average man’.

 

 

When he turned his attention to the notion of randomness in natural events, Quetelet was determined to take no prisoners. ‘Chance, that mysterious, much abused word, should be considered only a veil for our ignorance,’ he said. ‘It is a phantom which exercises the most absolute empire over the common mind, accustomed to consider events only as isolated, but which is reduced to naught before the philosopher, whose eye embraces a long series of events.’

 

MORE THAN NOISE

 

If the world is ultimately random, at least we can put that randomness to use. For starters, there’s the notion of dither. During the Second World War, aircrews noticed that their instruments worked better when the aircraft was in flight. The vibrations of the plane were moving the needles back and forth in tiny, random motions that overcame friction within the instruments’ mechanisms. Random noise seems to help creatures in the natural world too. Crayfish are more likely to be eaten by their predators when in still water. The addition of a little turbulent noise seems to give the crayfish an edge in detecting the disturbances associated with an approaching fish.

 

On the other side of the food equation, randomness is put to use when the paddlefish hunts out plankton. Plankton emit a weak electrical signal, and the paddlefish’s long snout is equipped with electric field sensors that can pick up this field. Most of the time, though, the plankton’s signal is too small for the paddlefish to detect. Evolution has equipped the paddlefish with neural cells that add some noise into the signal, however. The result is a phenomenon called ‘stochastic resonance’, which seems to amplify the weak signal to the point where the paddlefish is able to sense the plankton.
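
A toy simulation gives a feel for stochastic resonance. The weak periodic signal, the firing threshold and the noise levels below are all assumptions chosen for illustration; the point is only that an intermediate amount of noise lets a threshold detector track a signal that is otherwise too weak to register.

```python
import math
import random

# Toy illustration of stochastic resonance (not a model of a real paddlefish):
# a weak periodic "plankton" signal sits below a detector's firing threshold,
# and the detector fires only when signal + noise exceeds that threshold.
# With too little noise it never fires; with far too much it fires at random;
# at an intermediate noise level its firing tracks the hidden signal best.
# All amplitudes, thresholds and noise levels are illustrative assumptions.

THRESHOLD = 1.0
SIGNAL_AMPLITUDE = 0.6   # too weak to cross the threshold on its own
SAMPLES = 50_000
PERIOD = 100             # samples per cycle of the weak signal

def tracking_score(noise_level: float, seed: int = 1) -> float:
    """Correlation between the hidden weak signal and the detector's firing pattern."""
    rng = random.Random(seed)
    signal = [SIGNAL_AMPLITUDE * math.sin(2 * math.pi * i / PERIOD) for i in range(SAMPLES)]
    fired = [1.0 if s + rng.gauss(0.0, noise_level) > THRESHOLD else 0.0 for s in signal]
    mean_s = sum(signal) / SAMPLES
    mean_f = sum(fired) / SAMPLES
    cov = sum((s - mean_s) * (f - mean_f) for s, f in zip(signal, fired)) / SAMPLES
    var_s = sum((s - mean_s) ** 2 for s in signal) / SAMPLES
    var_f = sum((f - mean_f) ** 2 for f in fired) / SAMPLES
    return cov / math.sqrt(var_s * var_f) if var_f > 0 else 0.0

for noise in (0.05, 0.2, 0.8, 4.0):
    print(f"noise level {noise:4.2f}: signal-tracking score {tracking_score(noise):.2f}")
```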

 

A similar trick may be at work in our own brains. Fruit flies, whose neurons work in much the same way as those of vertebrates, have been shown to use stochastic resonance to improve their sense of smell. Studies show that eyesight, hearing and an elderly person’s senses of touch and balance can be improved by adding some random noise into the signals that the brain receives from the ear, the eye or the skin. Cochlear implants, for example, help deaf people hear better when they have some low-level random noise built in.

 
 

Though ancient civilizations might have been able to predict the motions of the planets, until Quetelet no one thought that there could be any pattern to the way rain falls on a windowpane or to the occurrence of meteor showers. Quetelet changed all that, revealing statistical patterns in things long thought to be random.

 

Not that the notion of randomness was killed off by Quetelet. His work showed that the ‘long series of events’ followed a statistical pattern more often than not. But that left open the idea that a single event could not be predicted. While a series of coin tosses will give a predictable distribution of heads and tails, the outcome of a single coin toss remains unpredictable in Quetelet’s science.
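
A quick simulation makes Quetelet’s distinction concrete: the aggregate of a long series of tosses is predictable, while the next single toss is not. The number of tosses is an arbitrary choice.

```python
import random

# The aggregate of many fair-coin tosses is highly predictable;
# any individual toss is not.
rng = random.Random(42)
tosses = [rng.choice("HT") for _ in range(100_000)]

heads_fraction = tosses.count("H") / len(tosses)
print(f"Fraction of heads over {len(tosses)} tosses: {heads_fraction:.3f}")  # close to 0.500
print(f"The next single toss: {rng.choice('HT')}")  # no better than a guess
```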

 
Lifting the veil of ignorance
 

Even here, though, science has now shown the perception of randomness to be a result of ignorance. Tossing a coin involves a complicated mix of factors. There is the initial position of the coin, the angular and linear momentum the toss imparts, the distance the coin is allowed to fall, and the air resistance during its flight. If you know all of these accurately enough, you can predict exactly how the coin will land.
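
A stripped-down model shows the idea. It is a sketch only: it ignores air resistance and bounces, assumes the coin is launched and caught at the same height while spinning about a horizontal axis, and the launch values are arbitrary assumptions.

```python
import math

# A minimal deterministic coin-flip model: given the launch conditions,
# the landing face is fully determined. Launch values are assumptions.

G = 9.81  # gravitational acceleration, m/s^2

def coin_outcome(upward_speed: float, spin_rate: float, starts_heads: bool = True) -> str:
    """Predict the landing face from launch speed (m/s) and spin rate (rad/s)."""
    flight_time = 2.0 * upward_speed / G        # time until the coin comes back down
    total_angle = spin_rate * flight_time       # how far it has rotated in that time
    same_face_up = math.cos(total_angle) > 0.0  # within a quarter-turn of the starting face
    if same_face_up:
        return "heads" if starts_heads else "tails"
    return "tails" if starts_heads else "heads"

print(coin_outcome(upward_speed=2.4, spin_rate=120.0))  # one assumed toss
print(coin_outcome(upward_speed=2.4, spin_rate=130.0))  # a modestly faster spin: the model predicts the other face
```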

 

A coin toss is therefore not random at all. More random – but still not truly random – is the throw of a die. Here the same rules apply: in principle, if you know all of the initial conditions and the precise dynamics of the throw, you can calculate which face will end up facing upwards. The problem here is the role of the die’s sharp corners. When a die’s corner hits the table, the result is chaotic (see Does Chaos Theory Spell Disaster?): the ensuing motion depends sensitively on the exact angle and velocity at which it hits. The result of any subsequent fall on a corner will ultimately depend even more sensitively on those initial conditions. So, while we could reasonably expect to compute the outcome of a coin toss from the pertinent information, our predictions of a die throw will be far less accurate. If the throw involves two or three chaotic collisions with the table, our predictions may turn out to be little better than random.
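
A back-of-the-envelope sketch shows why a few such bounces are enough. The factor by which each chaotic impact magnifies our uncertainty is an assumption chosen for illustration, not a measured property of real dice.

```python
# Crude error-growth sketch: suppose each corner impact magnifies our
# uncertainty in the die's orientation by some factor. The factor of 25 and
# the initial uncertainty are assumptions for illustration only.

GROWTH_PER_BOUNCE = 25.0
uncertainty_deg = 0.5   # assumed initial uncertainty in orientation, degrees

for bounce in range(1, 4):
    uncertainty_deg *= GROWTH_PER_BOUNCE
    status = "outcome effectively random" if uncertainty_deg >= 90 else "still predictable"
    print(f"after bounce {bounce}: uncertainty ~{uncertainty_deg:.0f} degrees -> {status}")
```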
