Against the Gods: The Remarkable Story of Risk
by Peter L. Bernstein
Their solution to Paccioli's puzzle meant that people could for the
first time make decisions and forecast the future with the help of numbers. In the medieval and ancient worlds, even in preliterate and peasant
societies, people managed to make decisions, advance their interests, and carry on trade, but with no real understanding of risk or the nature
of decision-making. Today, we rely less on superstition and tradition than
people did in the past, not because we are more rational, but because
our understanding of risk enables us to make decisions in a rational
mode.
At the time Pascal and Fermat made their breakthrough into the
fascinating world of probability, society was experiencing an extraordinary wave of innovation and exploration. By 1654, the roundness of
the earth was an established fact, vast new lands had been discovered,
gunpowder was reducing medieval castles to dust, printing with movable type had ceased to be a novelty, artists were skilled in the use of
perspective, wealth was pouring into Europe, and the Amsterdam stock
exchange was flourishing. Some years earlier, in the 1630s, the famed
Dutch tulip bubble had burst as a result of the issuing of options whose
essential features were identical to the sophisticated financial instruments in use today.
These developments had profound consequences that put mysticism
on the run. By this time Martin Luther had had his say and halos had disappeared from most paintings of the Holy Trinity and the saints. William
Harvey had overthrown the medical teachings of the ancients with his
discovery of the circulation of blood, and Rembrandt had painted "The
Anatomy Lesson," with its cold, white, naked human body. In such an
environment, someone would soon have worked out the theory of probability, even if the Chevalier de Méré had never confronted Pascal with
his brainteaser.
As the years passed, mathematicians transformed probability theory
from a gamblers' toy into a powerful instrument for organizing, interpreting, and applying information. As one ingenious idea was piled on
top of another, quantitative techniques of risk management emerged
that have helped trigger the tempo of modern times.
By 1725, mathematicians were competing with one another in
devising tables of life expectancies, and the English government was
financing itself through the sale of life annuities. By the middle of the
century, marine insurance had emerged as a flourishing, sophisticated
business in London.
In 1703, Gottfried von Leibniz commented to the Swiss scientist
and mathematician Jacob Bernoulli that "[N]ature has established patterns originating in the return of events, but only for the most part,"1
thereby prompting Bernoulli to invent the Law of Large Numbers and methods of statistical sampling that drive modern activities as varied as opinion polling, wine tasting, stock picking, and the testing of new drugs.*
Leibniz's admonition, "but only for the most part," was more profound than he may have realized, for he provided the key to why there is such a thing as risk in the first place: without that qualification, everything would be predictable, and in a world where every event is identical to a previous event no change would ever occur.
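To make Bernoulli's insight concrete, here is a minimal sketch in Python (the biased coin and its probability of 0.6 are purely illustrative, not from the text): the average of many independent trials settles ever closer to the true underlying probability, which is what lets a pollster or a drug trial reason from a sample.

```python
import random

# Minimal sketch of the Law of Large Numbers: estimate the probability of
# heads for a biased coin by averaging more and more independent tosses.
# The "true" probability of 0.6 is an arbitrary illustrative assumption.
TRUE_P = 0.6

def sample_mean(num_tosses: int) -> float:
    """Average outcome of num_tosses simulated flips (1 = heads, 0 = tails)."""
    heads = sum(1 for _ in range(num_tosses) if random.random() < TRUE_P)
    return heads / num_tosses

for n in (10, 100, 10_000, 1_000_000):
    print(f"{n:>9} tosses -> estimated probability {sample_mean(n):.4f}")
# The estimate wanders when n is small and clusters ever more tightly
# around 0.6 as n grows.
```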
In 1730, Abraham de Moivre suggested the structure of the normal distribution, also known as the bell curve, and discovered the concept of standard deviation. Together, these two concepts make up what is popularly known as the Law of Averages and are essential ingredients of modern techniques for quantifying risk. Eight years later, Daniel Bernoulli, Jacob's nephew and an equally distinguished mathematician and scientist, first defined the systematic process by which most people make choices and reach decisions. Even more important, he propounded the idea that the satisfaction resulting from any small increase in wealth "will be inversely proportionate to the quantity of goods previously possessed." With that innocent-sounding assertion, Bernoulli explained why King Midas was an unhappy man, why people tend to be risk-averse, and why prices must fall if customers are to be persuaded to buy more. Bernoulli's statement stood as the dominant paradigm of rational behavior for the next 250 years and laid the groundwork for modern principles of investment management.
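Bernoulli's assertion can be restated as a small worked equation; the notation below is a modern gloss rather than his own. If each increment of satisfaction is proportional to the increment of wealth and inversely proportional to the wealth already held, utility grows only logarithmically, and risk aversion follows from the curvature.

```latex
% A modern gloss on Daniel Bernoulli's assertion, not his own notation:
% the gain in utility du from a small gain in wealth dw is inversely
% proportional to the wealth w already possessed.
\[
du = k\,\frac{dw}{w}
\qquad\Longrightarrow\qquad
u(w) = k\ln w + c .
\]
% Because the logarithm is concave, an extra thousand means less to the rich
% than to the poor, and a fair 50/50 gamble lowers expected utility,
% which is the mathematical face of risk aversion.
```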
Almost exactly one hundred years after the collaboration between Pascal and Fermat, a dissident English minister named Thomas Bayes made a striking advance in statistics by demonstrating how to make better-informed decisions by mathematically blending new information into old information. Bayes's theorem focuses on the frequent occasions when we have sound intuitive judgments about the probability of some event and want to understand how to alter those judgments as actual events unfold.
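A minimal sketch of the kind of revision Bayes described, with invented numbers: a prior belief about a hypothesis is blended with the likelihood of the new evidence to give an updated belief.

```python
# Minimal sketch of Bayes's rule with invented numbers: revise a prior belief
# about a hypothesis H after observing evidence E.
#   P(H | E) = P(E | H) * P(H) / P(E)

def posterior(prior: float, p_evidence_given_h: float,
              p_evidence_given_not_h: float) -> float:
    """Return P(H | E) given a prior P(H) and the two likelihoods."""
    p_evidence = (p_evidence_given_h * prior
                  + p_evidence_given_not_h * (1.0 - prior))
    return p_evidence_given_h * prior / p_evidence

# Example: we believe a coin is fair with prior 0.9 (H = "fair"); it then
# lands heads five times running, an outcome far likelier if it is two-headed.
prior_fair = 0.9
p_five_heads_if_fair = 0.5 ** 5        # 1/32
p_five_heads_if_two_headed = 1.0
print(posterior(prior_fair, p_five_heads_if_fair, p_five_heads_if_two_headed))
# ~0.22: the new information sharply lowers our belief that the coin is fair.
```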
All the tools we use today in risk management and in the analysis of decisions and choice, from the strict rationality of game theory to the challenges of chaos theory, stem from the developments that took place
between 1654 and 1760, with only two exceptions:
In 1875, Francis Galton, an amateur mathematician who was Charles
Darwin's first cousin, discovered regression to the mean, which explains
why pride goeth before a fall and why clouds tend to have silver linings.
Whenever we make any decision based on the expectation that matters
will return to "normal," we are employing the notion of regression to the
mean.
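A rough simulation of Galton's observation (the ability-plus-luck setup is an assumption for illustration, not Bernstein's): performers who land at the extremes in one round sit, on average, closer to the middle in the next, because the luck that put them there does not repeat.

```python
import random

# Rough sketch of regression to the mean: each "score" is true ability plus
# luck. The luckiest performers in round one are, on average, closer to the
# middle in round two, because their luck does not carry over.
random.seed(0)
abilities = [random.gauss(0, 1) for _ in range(10_000)]
round1 = [a + random.gauss(0, 1) for a in abilities]
round2 = [a + random.gauss(0, 1) for a in abilities]

# Take the top 5% of round-one scorers and compare their average scores.
cutoff = sorted(round1)[int(0.95 * len(round1))]
top = [i for i, s in enumerate(round1) if s >= cutoff]
avg1 = sum(round1[i] for i in top) / len(top)
avg2 = sum(round2[i] for i in top) / len(top)
print(f"round 1 average of the top group: {avg1:.2f}")
print(f"round 2 average of the same group: {avg2:.2f}  (closer to the mean)")
```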
In 1952, Nobel Laureate Harry Markowitz, then a young graduate
student studying operations research at the University of Chicago,
demonstrated mathematically why putting all your eggs in one basket is
an unacceptably risky strategy and why diversification is the nearest an
investor or business manager can ever come to a free lunch. That revelation touched off the intellectual movement that revolutionized Wall
Street, corporate finance, and business decisions around the world; its
effects are still being felt today.
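A toy calculation of the point, with made-up figures rather than Markowitz's own example: splitting money between two imperfectly correlated assets leaves the expected return unchanged while lowering the risk of the whole.

```python
import math

# Toy illustration of diversification with invented numbers: two assets with
# the same expected return and the same risk, but imperfectly correlated.
expected_return = 0.08   # both assets
sigma = 0.20             # standard deviation of each asset's return
correlation = 0.3        # assumed correlation between the two

# All eggs in one basket: the risk is simply sigma.
single_asset_risk = sigma

# 50/50 portfolio variance: w1^2*s1^2 + w2^2*s2^2 + 2*w1*w2*rho*s1*s2
w = 0.5
portfolio_variance = (2 * (w ** 2) * sigma ** 2
                      + 2 * (w ** 2) * correlation * sigma ** 2)
portfolio_risk = math.sqrt(portfolio_variance)

print(f"expected return either way: {expected_return:.1%}")
print(f"one-asset risk:       {single_asset_risk:.1%}")
print(f"50/50 portfolio risk: {portfolio_risk:.1%}")  # lower: the "free lunch"
```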
The story that I have to tell is marked all the way through by a persistent tension between those who assert that the best decisions are
based on quantification and numbers, determined by the patterns of the
past, and those who base their decisions on more subjective degrees of
belief about the uncertain future. This is a controversy that has never
been resolved.
The issue boils down to one's view about the extent to which the
past determines the future. We cannot quantify the future, because it is
an unknown, but we have learned how to use numbers to scrutinize
what happened in the past. But to what degree should we rely on the
patterns of the past to tell us what the future will be like? Which matters more when facing a risk, the facts as we see them or our subjective
belief in what lies hidden in the void of time? Is risk management a science or an art? Can we even tell for certain precisely where the dividing line between the two approaches lies?
It is one thing to set up a mathematical model that appears to explain
everything. But when we face the struggle of daily life, of constant trial
and error, the ambiguity of the facts as well as the power of the human
heartbeat can obliterate the model in short order. The late Fischer Black, a pioneering theoretician of modern finance who moved from M.I.T. to
Wall Street, said, "Markets look a lot less efficient from the banks of the
Hudson than from the banks of the Charles."2
Over time, the controversy between quantification based on observations of the past and subjective degrees of belief has taken on a deeper
significance. The mathematically driven apparatus of modern risk management contains the seeds of a dehumanizing and self-destructive technology. Nobel laureate Kenneth Arrow has warned, "[O]ur knowledge
of the way things work, in society or in nature, comes trailing clouds of
vagueness. Vast ills have followed a belief in certainty."3 In the process
of breaking free from the past we may have become slaves of a new religion, a creed that is just as implacable, confining, and arbitrary as the old.
Our lives teem with numbers, but we sometimes forget that numbers
are only tools. They have no soul; they may indeed become fetishes.
Many of our most critical decisions are made by computers, contraptions
that devour numbers like voracious monsters and insist on being nourished with ever-greater quantities of digits to crunch, digest, and spew
back.
To judge the extent to which today's methods of dealing with risk
are either a benefit or a threat, we must know the whole story, from its
very beginnings. We must know why people of past times did, or did
not, try to tame risk, how they approached the task, what modes of
thinking and language emerged from their experience, and how their
activities interacted with other events, large and small, to change the
course of culture. Such a perspective will bring us to a deeper understanding of where we stand, and where we may be heading.
Along the way, we shall refer often to games of chance, which have
applications that extend far beyond the spin of the roulette wheel.
Many of the most sophisticated ideas about managing risk and making
decisions have developed from the analysis of the most childish of
games. One does not have to be a gambler or even an investor to recognize what gambling and investing reveal about risk.
The dice and the roulette wheel, along with the stock market and
the bond market, are natural laboratories for the study of risk because
they lend themselves so readily to quantification; their language is the language of numbers. They also reveal a great deal about ourselves.
When we hold our breath watching the little white ball bounce about
on the spinning roulette wheel, and when we call our broker to buy or
sell some shares of stock, our heart is beating along with the numbers.
So, too, with all important outcomes that depend on chance.
The word "risk" derives from the early Italian risicare, which means
"to dare." In this sense, risk is a choice rather than a fate. The actions we
dare to take, which depend on how free we are to make choices, are
what the story of risk is all about. And that story helps define what it
means to be a human being.