Imagine that you were given a choice between a gift of $25 for certain or an opportunity to play a game in which you stood a 50% chance of winning $50 and a 50% chance of winning nothing. The gamble has a mathematical expectation of $25, the same amount as the gift, but that expectation is uncertain. Risk-averse people would choose the gift over the gamble. Different people, however, are risk-averse in different degrees.

You can test your own degree of risk aversion by determining your "certainty equivalent." How high would the mathematical expectation of the game have to go before you would prefer the gamble to the gift? Thirty dollars from a 50% chance of winning $60 and a 50% chance of winning nothing? Then the $30 expectation from the gamble would be the equivalent of the $25 for certain. But perhaps you would take the gamble for an expectation of only $26. You might even discover that at heart you are a risk-seeker, willing to play the game even when the mathematical expectation of the payoff is less than the certain return of $25. That would be the case, for example, in a game where the payoffs differ from $50-or-nothing, so that you would win $40 if you toss tails and zero if you toss heads, for an expected value of only $20. But most of us would prefer a game in which the expected value is something in excess of the $25 available for certain in the example. The popularity of lottery games provides an interesting exception to this statement, because the state's skim off the top is so large that most lotteries are egregiously unfair to the players.
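For readers who like to see the arithmetic spelled out, the short Python sketch below restates the gift-versus-gamble choice. The square-root utility curve and the certainty-equivalent calculation are illustrative assumptions of mine, not anything the text prescribes; they simply show how a concave utility function makes a gamble with a $25 expectation worth less than $25 for certain.

import math

def expected_value(outcomes):
    # Expected value of a list of (probability, payoff) pairs.
    return sum(p * x for p, x in outcomes)

def expected_utility(outcomes):
    # Expected utility under an assumed square-root utility curve.
    return sum(p * math.sqrt(x) for p, x in outcomes)

def certainty_equivalent(outcomes):
    # The sure amount whose square-root utility equals the gamble's
    # expected utility (the inverse of the square root is squaring).
    return expected_utility(outcomes) ** 2

gift = 25.0
gamble = [(0.5, 50.0), (0.5, 0.0)]           # 50% chance of $50, 50% chance of nothing
smaller_gamble = [(0.5, 40.0), (0.5, 0.0)]   # the $40-or-nothing variant

print(expected_value(gamble))               # 25.0 -- matches the gift
print(certainty_equivalent(gamble) < gift)  # True -- this player keeps the sure $25
print(expected_value(smaller_gamble))       # 20.0 -- only a risk-seeker would still play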

A significant principle is at work here. Suppose your stockbroker recommends a mutual fund that invests in a cross section of the smallest stocks listed on the market. Over the past 69 years, the smallest 20% of the stock market has provided a total return, capital appreciation plus dividends, that has averaged 18% a year. That is a generous rate of return. But volatility in this sector has also been high: two-thirds of the returns have fallen between -23% and +59%; negative returns over twelve-month periods have occurred in almost one out of every three years and have averaged 20%. Thus, the outlook for any given year has been extremely uncertain, regardless of the high average rewards generated by these stocks over the long run.

As an alternative, suppose a different broker recommends a fund that buys and holds the 500 stocks that comprise the Standard & Poor's Composite Index. The average annual return on these stocks over the past 69 years has been about 13%, but two-thirds of the annual returns have fallen within the narrower range of -11% to +36%; negative returns have averaged 13%. Assuming the future will look approximately like the past, but also assuming that you do not have 70 years to find out how well you did, is the higher average expected return on the small-stock fund sufficient to justify its much greater volatility of returns? Which mutual fund would you buy?
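To make the comparison concrete, the sketch below treats the "two-thirds of returns" band quoted above as roughly one standard deviation on either side of the mean and assumes, purely for illustration, that annual returns are approximately normal; the function names are mine. Under those assumptions the small-stock fund loses money in roughly one year out of three, consistent with the figure in the text, while the index fund is only modestly safer.

from statistics import NormalDist

def chance_of_a_losing_year(mean_pct, two_thirds_band):
    # Probability of a negative annual return, assuming returns are normal
    # and the quoted two-thirds band spans one standard deviation each way.
    low, high = two_thirds_band
    sigma = (high - low) / 2
    return NormalDist(mean_pct, sigma).cdf(0.0)

small_stocks = chance_of_a_losing_year(18, (-23, 59))   # smallest 20% of the market
sp_500 = chance_of_a_losing_year(13, (-11, 36))          # S&P Composite Index

print(f"Small stocks: {small_stocks:.0%} chance of a down year")   # roughly 33%, about 1 in 3
print(f"S&P 500:      {sp_500:.0%} chance of a down year")          # roughly 29%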

Daniel Bernoulli transformed the stage on which the risk-taking
drama is played out. His description of how human beings employ
both measurement and gut in making decisions when outcomes are
uncertain was an impressive achievement. As he himself boasts in his
paper, "Since all our propositions harmonize perfectly with experience,
it would be wrong to neglect them as abstractions resting upon precarious hypotheses."

A powerful attack some two hundred years later ultimately revealed that Bernoulli's propositions fell short of harmonizing perfectly with experience, in large part because his hypotheses about human rationality were more precarious than he, as a man of the Enlightenment, might want to believe. Until that attack was launched, however, the concept of utility flourished in the philosophical debate over rationality that prevailed for nearly two hundred years after Bernoulli's paper was published. Bernoulli could hardly have imagined how long his concept of utility would survive, thanks largely to later writers who came upon it on their own, unaware of his pioneering work.

 

One winter night during one of the many German air raids on
Moscow in World War II, a distinguished Soviet professor of
statistics showed up in his local air-raid shelter. He had never
appeared there before. "There are seven million people in Moscow," he
used to say. "Why should I expect them to hit me?" His friends were
astonished to see him and asked what had happened to change his mind.
"Look," he explained, "there are seven million people in Moscow and
one elephant. Last night they got the elephant."

This story is a modern version of the thunderstorm phobias analyzed
in the Port-Royal Logic, but it differs at a critical point from the moral of
the example cited there. In this case, the individual involved was keenly
aware of the mathematical probability of being hit by a bomb. What the
professor's experience really illuminates, therefore, is the dual character
that runs throughout everything to do with probability: past frequencies
can collide with degrees of belief when risky choices must be made.

The story has more to it than that. It echoes the concerns of Graunt, Petty, and Halley. When complete knowledge of the future, or even of the past, is an impossibility, how representative is the information we have in hand? Which counts for more, the seven million humans or the elephant? How should we evaluate new information and incorporate it into degrees of belief developed from prior information? Is the theory of probability a mathematical toy or a serious instrument for forecasting?

Probability theory is a serious instrument for forecasting, but the devil, as they say, is in the details: the quality of information that forms the basis of probability estimates. This chapter describes a sequence of giant steps over the course of the eighteenth century that revolutionized the uses of information and the manner in which probability theory can be applied to decisions and choices in the modern world.

The first person to consider the linkages between probability and the quality of information was another and older Bernoulli, Daniel's uncle Jacob, who lived from 1654 to 1705.1 Jacob was a child when Pascal and Fermat performed their mathematical feats, and he died when his nephew Daniel was only five years old. Talented like all the Bernoullis, he was a contemporary of Isaac Newton and had sufficient Bernoullian ill temper and hubris to consider himself a rival of that great English scientist.

Merely raising the questions that Jacob raised was an intellectual feat in itself, quite apart from offering answers as well. Jacob undertook this task, he tells us, after having meditated on it for twenty years; he completed his work only toward the end of his life, shortly before he died in 1705.

Jacob was an exceptionally dour Bernoulli, especially toward the end of his life, though he lived during the bawdy and jolly age that followed the restoration of Charles II in 1660.*
One of Jacob's more distinguished contemporaries, for example, was John Arbuthnot, Queen Anne's doctor, a Fellow of the Royal Society, and an amateur mathematician with an interest in probability that he pepped up with a generous supply of off-color examples to illustrate his points. In one of his papers, Arbuthnot considered the odds on whether "a woman of twenty has her maidenhead" or whether "a town-spark of that age has not been clap'd."2

Jacob Bernoulli had first put the question of how to develop probabilities from sample data in 1703. In a letter to his friend Leibniz, he commented that he found it strange that we know the odds of throwing a seven instead of an eight with a pair of dice, but we do not know the probability that a man of twenty will outlive a man of sixty. Might we not, he asks, find the answer to this question by examining a large number of pairs of men of each age?

In responding to Bernoulli, Leibniz took a dim view of this approach. "[N]ature has established patterns originating in the return of events," he wrote, "but only for the most part. New illnesses flood the human race, so that no matter how many experiments you have done on corpses, you have not thereby imposed a limit on the nature of events so that in the future they could not vary."3 Although Leibniz wrote this letter in Latin, he put the expression "but only for the most part" into Greek: ὡς ἐπὶ τὸ πολύ. Perhaps this was to emphasize his point that a finite number of experiments such as Jacob suggested would inevitably be too small a sample for an exact calculation of nature's intentions.*
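Jacob's proposal and Leibniz's objection can both be seen in a small simulation: estimate an unknown probability by observing a finite number of pairs and watch how the estimate behaves as the sample grows. The "true" probability of 0.6 used below is an arbitrary assumption made purely for illustration; nothing in the correspondence specifies such a number.

import random

def estimated_probability(true_probability, n_pairs, seed):
    # Fraction of observed pairs in which the twenty-year-old outlives the
    # sixty-year-old: an empirical estimate of the unknown probability.
    rng = random.Random(seed)
    outlived = sum(rng.random() < true_probability for _ in range(n_pairs))
    return outlived / n_pairs

TRUE_P = 0.6   # assumed for illustration; the observer never sees this number

for n_pairs in (10, 100, 1_000, 10_000):
    estimates = [estimated_probability(TRUE_P, n_pairs, seed) for seed in range(200)]
    print(f"{n_pairs:>6} pairs: estimates range from {min(estimates):.3f} to {max(estimates):.3f}")

The spread of the estimates narrows as the number of pairs grows, which is the regularity Jacob was after, but it never closes entirely, which is the limit Leibniz insisted on.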
