Against the Gods: The Remarkable Story of Risk

For Leibniz, the difficulty in generalizing from samples of information arises from nature's complexity, not from its waywardness. He believed that there is too much going on for us to figure it all out by studying a finite set of experiments, but, like most of his contemporaries, he was convinced that there was an underlying order to the whole process, ordained by the Almighty. The missing part to which he alluded with "only for the most part" was not random but an invisible element of the whole structure.

Three hundred years later, Albert Einstein struck the same note. In a famous comment that appeared in a letter to his fellow-physicist Max Born, Einstein declared, "You believe in a God who plays with dice, and I in complete law and order in a world which objectively exists."2

Leibniz and Einstein may be correct that God does not play with dice, but, for better or for worse and in spite of all our efforts, human beings do not enjoy complete knowledge of the laws that define the order of the objectively existing world.

Leibniz and Einstein were scientists concerned with the behavior of the natural world, but human beings must contend with the behavior of something beyond the patterns of nature: themselves. Indeed, as civilization has pushed forward, nature's vagaries have mattered less and the decisions of people have mattered more.

Yet the growing interdependence of humanity was not a concern to any of the innovators in this story until we come to Knight and Keynes in the twentieth century. Most of these men lived in the late Renaissance, the Enlightenment, or the Victorian age, and so they thought about probability in terms of nature and visualized human beings as acting with the same degree of regularity and predictability as they found in nature.

Behavior was simply not part of their deliberations. Their emphasis was on games of chance, disease, and life expectancies, whose outcomes are ordained by nature, not by human decisions. Human beings were always assumed to be rational (Daniel Bernoulli describes rationality as "the nature of man"), which simplifies matters because it makes human behavior as predictable as nature's, perhaps more so. This view led to the introduction of terminology from the natural sciences to explain both economic and social phenomena. The process of quantifying subjective matters like preferences and risk aversion was taken for granted and above dispute. In all their examples, no decision by any single individual had any influence on the welfare of any other individual.

The break comes with Knight and Keynes, both writing in the aftermath of the First World War. Their "radically distinct notion" of uncertainty had nothing whatsoever to do with nature or with the debate between Einstein and Born. Uncertainty is a consequence of the irrationalities that Knight and Keynes perceived in human nature, which means that the analysis of decision and choice would no longer be limited to human beings in isolated environments like Robinson Crusoe's. Even von Neumann, with his passionate belief in rationality, analyzes risky decisions in a world where the decisions of each individual have an impact on others, and where each individual must consider the probable responses of others to his or her own decisions. From there, it is only a short distance to Kahneman and Tversky's inquiries into the failure of invariance and the behavioral investigations of the Theory Police.

Although the solutions to much of the mystery that Leibniz perceived in nature were well in hand by the twentieth century, we are still trying to understand the even more tantalizing mystery of how human beings make choices and respond to risk. Echoing Leibniz, G.K. Chesterton, a novelist and essayist rather than a scientist, described the modern view this way:

The real trouble with this world of ours is not that it is an unreasonable world, nor even that it is a reasonable one. The commonest kind of trouble is that it is nearly reasonable, but not quite. Life is not an illogicality; yet it is a trap for logicians. It looks just a little more mathematical and regular than it is; its exactitude is obvious, but its inexactitude is hidden; its wildness lies in wait.3

In such a world, are probability, regression to the mean, and diversification useless? Is it even possible to adapt the powerful tools that interpret the variations of nature to the search for the roots of inexactitude? Will wildness always lie in wait?

Proponents of chaos theory, a relatively new alternative to the ideas of Pascal and others, claim to have revealed the hidden source of inexactitude. According to chaos theorists, it springs from a phenomenon called "nonlinearity." Nonlinearity means that results are not proportionate to their causes. But chaos theory also joins with Laplace, Poincaré, and Einstein in insisting that every result has a cause, like the balanced cone that topples over in response to "a very slight tremor."
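
To make nonlinearity concrete, here is a minimal sketch in Python (our illustration, with an assumed equation; nothing in the text prescribes it) using the logistic map, a one-line model whose behavior is chaotic. Two starting values differing by one part in a billion soon yield trajectories that bear no resemblance to each other: an effect wildly out of proportion to its cause.

```python
# Toy model (assumed for illustration): the logistic map
# x_{n+1} = r * x_n * (1 - x_n), which is chaotic at r = 4.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the whole path."""
    path = [x0]
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
        path.append(x)
    return path

a = logistic_trajectory(0.400000000)  # baseline trajectory
b = logistic_trajectory(0.400000001)  # the "very slight tremor": 1e-9

for n in (0, 10, 20, 30, 40, 50):
    print(f"step {n:2d}: |a - b| = {abs(a[n] - b[n]):.6f}")
# The gap starts at 1e-9, reaches about 1e-3 by step 20, and is of
# order 1 by step 30 or so: the result is not proportionate to the cause.
```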

Students of chaos theory reject the symmetry of the bell curve as a description of reality. They hold in contempt linear statistical systems in which, for example, the magnitude of an expected reward is assumed to be consistent with the magnitude of the risks taken to achieve it, or, in general, where results achieved bear a systematic relationship to efforts expended. Consequently, they reject conventional theories of probability, finance, and economics. To them, Pascal's Arithmetic Triangle is a toy for children, Francis Galton was a fool, and Quetelet's beloved bell curve is a caricature of reality.

Dimitris Chorafas, an articulate commentator on chaos theory, describes chaos as "... a time evolution with sensitive dependence on initial conditions."4 The most popular example of this concept is the flutter of a butterfly's wings in Hawaii that is the ultimate cause of a hurricane in the Caribbean. According to Chorafas, chaos theorists see the world "in a state of vitality ... characterized by turbulence and volatility."5 This is a world in which deviations from the norm do not cluster symmetrically on either side of the average, as Gauss's normal distribution predicts; it is a craggy world in which Galton's regression to the mean makes no sense, because the mean is always in a state of flux. The idea of a norm does not exist in chaos theory.

Chaos theory carries Poincaré's notion of the ubiquitous nature of cause and effect to its logical extreme by rejecting the concept of discontinuity. What appears to be discontinuity is not an abrupt break with the past but the logical consequence of preceding events. In a world of chaos, wildness is always waiting to show itself.

Making chaos theory operational is something else again. According to Chorafas, "The signature of a chaotic time series ... is that prediction accuracy falls off with the increasing passage of time." This view leaves the practitioners of chaos theory caught up in a world of minutiae, in which all the signals are tiny and everything else is mere noise.
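
Chorafas's "signature" can be seen in the same toy logistic map sketched above (again our illustration; the one-part-in-a-million measurement error is an assumption). Even with the true equation in hand, a microscopic error in the measured initial state makes forecast accuracy decay as the horizon lengthens.

```python
# Forecasting a chaotic series with a perfect model but an imperfect
# measurement of the starting state (error of one part in a million).

def step(x, r=4.0):
    """One tick of the logistic map, standing in for the true dynamics."""
    return r * x * (1.0 - x)

true_state = 0.37               # the system's actual state (assumed)
forecast = 0.37 * (1 + 1e-6)    # the state as measured, slightly wrong

for horizon in range(1, 41):
    true_state = step(true_state)
    forecast = step(forecast)
    if horizon in (1, 10, 20, 30, 40):
        print(f"horizon {horizon:2d}: error {abs(true_state - forecast):.2e}")
# The error roughly doubles every step until it is as large as the
# series itself: near-term forecasts are serviceable, distant ones noise.
```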

As forecasters in financial markets who focus on volatility, practitioners of chaos theory have accumulated immense quantities of transactions data that have enabled them, with some success, to predict changes in security prices and exchange rates, as well as variations in risk, within the near future.6 They have even discovered that roulette wheels do not produce completely random results, though the advantage bestowed by that discovery is too small to make any gambler rich.

So far, the accomplishments of the theory appear modest compared to its claims. Its practitioners have managed to cup the butterfly in their hands, but they have not yet traced all the airflows impelled by the flutterings of its wings. But they are trying.

In recent years, other sophisticated innovations to foretell the future have surfaced, with strange names like genetic algorithms and neural networks.7 These methods focus largely on the nature of volatility; their implementation stretches the capability of the most high-powered computers.

The objective of genetic algorithms is to replicate the manner in which genes are passed from one generation to the next. The genes that survive create the models that form the most durable and effective offspring.*
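
A minimal sketch of the idea in Python may help; it is ours, not anything specified in the text, and the fitness criterion (counting 1-bits in a bit string) is a deliberately trivial stand-in. Fitter parents pass more of their genes, with occasional mutations, to the next generation.

```python
# Toy genetic algorithm: evolve bit strings toward all 1s.
import random

random.seed(1)
GENES, POP, GENERATIONS = 20, 30, 40

def fitness(genome):
    """How 'durable and effective' an offspring is: its count of 1-bits."""
    return sum(genome)

def crossover(mom, dad):
    """Child inherits a slice from one parent, the rest from the other."""
    cut = random.randrange(1, GENES)
    return mom[:cut] + dad[cut:]

def mutate(genome, rate=0.02):
    """Occasional copying errors flip individual bits."""
    return [1 - bit if random.random() < rate else bit for bit in genome]

pop = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
for _ in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:POP // 2]  # selection: only the fitter half reproduces
    children = [mutate(crossover(random.choice(survivors),
                                 random.choice(survivors)))
                for _ in range(POP - len(survivors))]
    pop = survivors + children

print("best fitness:", fitness(max(pop, key=fitness)), "out of", GENES)
```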
Neural networks are designed to simulate the behavior of the human brain by sifting out from the experiences programmed into them those inferences that will be most useful in dealing with the next experience. Practitioners of this procedure have uncovered behavior patterns in one system that they can use to predict outcomes in entirely different systems, the theory being that all complex systems like democracy, the path of technological development, and the stock market share common patterns and responses.
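
Here too a miniature example, ours rather than a description of any practitioner's system, may make the mechanism visible. A small network is shown four "experiences" (the XOR pattern, chosen because no single linear rule captures it) over and over, nudging its weights until its inferences fit them.

```python
# Tiny neural network learning XOR by backpropagation.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # experiences
y = np.array([[0], [1], [1], [0]], dtype=float)              # outcomes

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)  # hidden layer of 4 units
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)  # output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):            # repeated exposure to the experiences
    h = sigmoid(X @ W1 + b1)      # forward pass: form an inference
    out = sigmoid(h @ W2 + b2)
    err = out - y                 # how wrong the inference is
    # Backward pass: charge each weight with its share of the error.
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= h.T @ d_out
    b2 -= d_out.sum(axis=0)
    W1 -= X.T @ d_h
    b1 -= d_h.sum(axis=0)

print(np.round(out.ravel(), 2))   # should end up close to [0, 1, 1, 0]
```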

These models provide important insights into the complexity of reality, but there is no proof of cause and effect in the recognition of patterns that precede the arrival of other patterns in financial markets or in the spin of a roulette wheel. Socrates and Aristotle would be as skeptical about chaos theory and neural networks as the theorists of those approaches are about conventional approaches.

Likeness to truth is not the same as truth. Without any theoretical structure to explain why patterns seem to repeat themselves across time or across systems, these innovations provide little assurance that today's signals will trigger tomorrow's events. We are left with only the subtle sequences of data that the enormous power of the computer can reveal. Thus, forecasting tools based on nonlinear models or on computer gymnastics are subject to many of the same hurdles that stand in the way of conventional probability theory: the raw material of the model is the data of the past.

The past seldom obliges by revealing to us when wildness will break out in the future. Wars, depressions, stock-market booms and crashes, and ethnic massacres come and go, but they always seem to arrive as surprises. After the fact, however, when we study the history of what happened, the source of the wildness appears to be so obvious to us that we have a hard time understanding how people on the scene were oblivious to what lay in wait for them.

Surprise is endemic above all in the world of finance. In the late 1950s, for example, a relationship sanctified by over eighty years of experience suddenly came apart when investors discovered that a thousand dollars invested in low-risk, high-grade bonds would, for the first time in history, produce more income than a thousand dollars invested in risky common stocks.*
In the early 1970s, long-term interest rates rose above 5% for the first time since the Civil War and have dared to remain above 5% ever since.

Given the remarkable stability of the key relationships between bond yields and stock yields, and the trendless history of long-term interest rates over so many years, no one ever dreamed of anything different. Nor did people have any reason to do so before the development of contracyclical monetary and fiscal policy and before they had experienced a price level that only went up instead of rising on some occasions and falling on others. In other words, these paradigm shifts may not have been unpredictable, but they were unthinkable.

If these events were unpredictable, how can we expect the elaborate quantitative devices of risk management to predict them? How can we program into the computer concepts that we cannot program into ourselves, that are even beyond our imagination?

We cannot enter data about the future into the computer because such data are inaccessible to us. So we pour in data from the past to fuel the decision-making mechanisms created by our models, be they linear or nonlinear. But therein lies the logician's trap: past data from real life constitute a sequence of events rather than a set of independent observations, which is what the laws of probability demand. History provides us with only one sample of the economy and the capital markets, not with thousands of separate and randomly distributed numbers. Even though many economic and financial variables fall into distributions that approximate a bell curve, the picture is never perfect. Once again, resemblance to truth is not the same as truth. It is in those outliers and imperfections that the wildness lurks.
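
That last point can be simulated (our sketch, with assumed parameters). Draw returns mostly from a calm normal distribution and occasionally from a turbulent one: the mixture looks roughly bell-shaped, yet it produces moves beyond four standard deviations far more often than the bell curve allows.

```python
# Fat tails: a near-normal mixture versus what normal theory predicts.
import random
import statistics

random.seed(7)
N = 1_000_000
# 98% calm days drawn from N(0, 1), 2% turbulent days from N(0, 5).
returns = [random.gauss(0, 5) if random.random() < 0.02 else random.gauss(0, 1)
           for _ in range(N)]

sd = statistics.pstdev(returns)
extreme = sum(abs(r) > 4 * sd for r in returns)

print(f"sample standard deviation: {sd:.2f}")
print(f"moves beyond 4 sigma:      {extreme:,} in {N:,}")
print(f"normal theory expects:     ~{N * 6.3e-5:.0f}")  # P(|Z| > 4) ~ 6.3e-5
```

The mixture delivers thousands of four-sigma moves where the bell curve predicts a few dozen; it is exactly this kind of imperfection in which the wildness lurks.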
