The errors deriving from the gap between logical mathematical analysis and the intuitive way in which the brain handles problems of probability can sometimes carry a heavy penalty. In many articles and in a book listed in the bibliography at the end of this volume, the German psychologist Gerd Gigerenzer cites shocking examples of failed medical procedures and traumatized patients resulting from incorrect information about the outcomes of laboratory tests. The message of Gigerenzer and others is that decision makers, including physicians, economists, politicians, and the like, must be taught how to act in situations of uncertainty, in other words, to absorb and implement Bayes's thought process. Is that possible? Gigerenzer's opinion is that Bayesian logic can be absorbed if, instead of analyzing problems in terms of probabilities of events, we train ourselves to think in terms of repetitions of the situation. In other words, we should exchange events as in the Kolmogorov model for considerations of relative frequency. Thus, according to Gigerenzer, in the earlier example of the blood donor and his blood tests, if instead of considering the one individual we examine a series of many subjects, we would notice that many of them, more than half, are healthy subjects whom the tests incorrectly showed as carriers. Gigerenzer actually presented figures showing a great improvement in groups of physicians who had learned to analyze situations of randomness in this way.
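To make Gigerenzer's point concrete, here is a minimal sketch comparing Bayes's formula with the frequency framing. The prevalence, sensitivity, and false-positive rate below are illustrative assumptions, not the figures of the blood-donor example, which appear in an earlier section of the book.

```python
# Two equivalent ways of reasoning about a positive test result,
# with assumed, illustrative numbers.

prevalence = 0.001      # assumed fraction of true carriers in the population
sensitivity = 0.99      # assumed chance the test flags a real carrier
false_positive = 0.05   # assumed chance the test wrongly flags a healthy person

# Bayes's formula: P(carrier | positive test)
p_positive = prevalence * sensitivity + (1 - prevalence) * false_positive
p_carrier_given_positive = prevalence * sensitivity / p_positive
print(f"Bayes: P(carrier | positive) = {p_carrier_given_positive:.3f}")

# Gigerenzer's frequency framing: imagine 100,000 subjects and simply count.
population = 100_000
carriers = population * prevalence                            # 100 true carriers
true_positives = carriers * sensitivity                       # ~99 flagged correctly
false_positives = (population - carriers) * false_positive    # ~4,995 healthy but flagged
share = true_positives / (true_positives + false_positives)
print(f"Frequencies: {true_positives:.0f} of {true_positives + false_positives:.0f} "
      f"positives are real carriers ({share:.3f})")
# Both routes give the same answer: most positive results belong to healthy people.
```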
I find it difficult to accept Gigerenzer's conclusions. I think that the errors are basic and stem from intuitive thinking. People who err in a certain situation are less likely to make the same mistake if they encounter exactly the same situation again. Their decisions will not improve, however, if the uncertainty appears in a slightly different guise. Someone who notices that most of the subjects the tests record as carriers are actually healthy could just as well apply Bayes's original formula. The only solution I can think of to the problem of errors is that in those cases in which an error can cause serious damage (in medicine, economics, assessments of intelligence data, and so on), the situation must be analyzed rigorously with explicit mathematical tools, and intuitive thinking must be avoided. If reaching the right answer is not particularly important, that is, if the potential error is bearable, the way evolution has taught us to react may be acceptable, even preferable and more efficient. It may lead to errors in some instances, but it resolves other situations correctly and saves time and effort.
43. INTUITION VERSUS THE STATISTICS OF RANDOMNESS
Although evolution did not prepare us to analyze situations of uncertainty intuitively with logical tools, one might assume that we would at least react correctly to statistical situations; throughout the evolutionary process, humans have been exposed to random occurrences. Nevertheless, even in these cases errors related to statistical randomness are repeated again and again; we mentioned some of them in section 39 on the mathematics of predictions and errors. Some of these errors can themselves be explained by evolution. We will give a few examples.
The Ayalon Highway in Israel that traverses Tel Aviv and its suburbs is intended to enable vehicles to cross the city quickly. Shortly after a central section of the highway was opened with due pomp and ceremony, the Ayalon River overflowed due to very heavy rain, and the highway was flooded. This led to severe traffic jams, and the CEO of the Ayalon Highway Company was invited to appear on television to explain the reasons for the flooding. His explanation was convincing: to build a highway that would be immune to any possible flooding would be prohibitively expensive. The engineers therefore took a calculated risk and constructed a road with a wide margin of error, so that flooding was expected only once in twenty-five years. It was bad luck, he explained, that this flooding occurred only a short time after the highway was inaugurated, but that was the nature of randomness. He went on to reassure viewers that they could now look forward to a long period of flood-free driving on the highway. Exactly three weeks passed, and the highway was flooded again. The CEO was again invited to appear on television and with a crestfallen face mumbled something about independent and dependent events, without managing to persuade the interviewer that the engineers' calculations were accurate under the circumstances. The reason for the mistake is clear: in his first broadcast the CEO gave insufficient weight to the most important piece of information, that the highway had just been flooded. If a flood occurs because of extremely heavy downpours, the ground is saturated with water and even light rain may then cause a flood; in other words, the next flood is not an event independent of the first.
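A rough sketch of the arithmetic behind this point: the once-in-twenty-five-years figure follows the anecdote, but the weekly conversion and the conditional probability after the ground is saturated are illustrative assumptions, not the engineers' actual model.

```python
# Why a second flood three weeks later is not astonishing once the events
# are dependent. All numbers here are illustrative assumptions.

weeks_per_year = 52
p_flood_week = 1 / (25 * weeks_per_year)   # per-week flood chance if weeks were independent

# If weeks were independent, a second flood within the next 3 weeks would be very unlikely:
p_second_independent = 1 - (1 - p_flood_week) ** 3
print(f"Independent weeks: {p_second_independent:.4f}")

# After heavy rain the ground is saturated, so the conditional chance of a flood
# in each of the following weeks is far higher (assumed value for illustration):
p_flood_given_saturated = 0.15
p_second_dependent = 1 - (1 - p_flood_given_saturated) ** 3
print(f"Saturated ground: {p_second_dependent:.4f}")
```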
Attitudes toward the numerical values that probability assigns to various events are neither uniform nor consistent. Some years ago there was a danger that the Sea of Galilee would flood its shores. The executive responsible for Israel's water sector explained on television that the chance of such flooding was 60 percent and went on to say that a miracle would be needed to avoid it. Is an occurrence that has a 40 percent chance of taking place considered a miracle? I doubt it. And indeed, that year a “miracle” occurred, and the Sea of Galilee did not flood. To many doctors, an 80 percent chance of a patient's recovery and a 97 percent chance of recovery may seem similar, but to a patient who understands probability, the difference is huge. A 97 percent chance of recovery means that the treatment fails only in rare cases; a 20 percent chance of failure means that failure is a real, recurring possibility.
The attitude toward events with very low probability is also inconsistent. On the one hand, people buy lottery tickets even though the cost and effort involved outweigh the expected winnings. The reason is apparently the positive personal feeling of looking forward to a possible win, even though they know it is unlikely to be realized. On the other hand, the intuitive tendency is to ignore events that have very little chance of being realized. Sometimes this tendency is crucial, particularly in financial, economic, political, and similar matters. This tendency to ignore unlikely events may also be traced to evolutionary sources. In the broad framework of the struggle for survival, resources devoted to coping with occurrences that have only a small chance of happening come at the expense of the major efforts needed for day-to-day survival. For example, if dinosaurs had developed gills that enabled them to breathe dusty air, they would have survived the meteoric dust that, according to the generally accepted explanation, engulfed the Earth and caused their extinction. On the other hand, a species of dinosaur that had devoted effort to developing such gills at the expense of the daily struggle for survival might not have survived, and might have become extinct before the meteor collided with Earth. The evolutionary struggle is one of the here and now; that is, it takes into account only current conditions and ignores possible future events or events with a low probability of occurring. This fact has filtered down into the way we react to dangers that have a low probability of being realized.
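To make the opening remark about lottery tickets concrete, here is a back-of-the-envelope expected-value sketch; the prize, odds, and ticket price are invented for illustration only.

```python
# Expected value of a lottery ticket under assumed, illustrative numbers.
prize = 1_000_000
p_win = 1 / 5_000_000
ticket_price = 2.0

expected_value = p_win * prize
print(f"Expected winnings per ticket: ${expected_value:.2f} vs. price ${ticket_price:.2f}")
# The expected return is a fraction of the price, yet tickets sell:
# the anticipation of a possible win has a value the arithmetic ignores.
```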
Another error, which may be called a mental illusion, relates to the interpretation of statistical data, and it too may be traced back to evolutionary origins. As we explained in section 4, identifying patterns is an innate ability. Moreover, it is preferable to err on the side of overidentification: failure to identify an existing pattern may carry a heavy price compared with the damage suffered by identifying a nonexistent one. The psychologist and expert on decision making Amos Tversky (1937–1996), together with his colleagues Thomas Gilovich and Robert Vallone, decided to examine the “hot hand” belief in basketball. Every basketball fan knows of this phenomenon. When a player scores a number of baskets in successive throws, he, his coach, the opposing team, and the spectators all feel that he has a “hot hand” and that he can reasonably be expected to keep scoring. In terms of probability, the hot hand rule says that a number of successful shots at the basket increases the chances that the next throw will also be successful, in contrast with the case in which the same player under the same conditions did not score in his previous attempts. The phenomenon can be explained, and the usually accepted explanation combines increased self-confidence with the psychological effect of a run of successes.
To examine the hot hand concept, Tversky and his colleagues followed one of the most successful teams of that time, the Philadelphia 76ers, over a whole NBA season in the United States, recording every shot at the basket and monitoring the runs of successful shots. They discovered, to the surprise of many, that the hot hand was an illusion, a fallacy. In a random series, a run of successes can occur without the chances of success in the next attempt increasing, and the runs (or “streaks”) of successful shots in the 76ers’ games did not differ from those of a random series. The parameters of the series, that is, the percentage of successful shots, may change from player to player and from one game to another, but under the same conditions the chance of a successful shot does not increase after a run of successful shots.
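The statistical point, that streaks arise naturally in a purely random series, can be illustrated with a small simulation under the simplest assumption: every shot is an independent success with a fixed probability. The numbers below are arbitrary and are not the 76ers' data.

```python
# Simulate independent shots and check whether a streak predicts the next shot.
import random

random.seed(1)
p = 0.5            # assumed shooting percentage
n_shots = 2_000    # assumed number of recorded shots

shots = [random.random() < p for _ in range(n_shots)]

# Longest run of consecutive successes in the random series.
longest = run = 0
for made in shots:
    run = run + 1 if made else 0
    longest = max(longest, run)

# Success rate immediately after three consecutive successes vs. overall.
after_streak = [shots[i] for i in range(3, n_shots)
                if shots[i - 3] and shots[i - 2] and shots[i - 1]]
print(f"Longest streak in a random series: {longest}")
print(f"Overall success rate: {sum(shots) / n_shots:.3f}")
print(f"Success rate right after a 3-shot streak: {sum(after_streak) / len(after_streak):.3f}")
# Streaks do occur, but the conditional rate stays close to p:
# in this model the streak carries no predictive information.
```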
This finding should have immediate implications, because belief in a player's hot hand affects how the coach manages the team during the game, and if the hot hand does not exist, those decisions are misguided. Tversky and his colleagues’ findings met with a mixed reception. They had no effect on the spectators, the players, or the coaches, who continued to believe in the hot hand and to act accordingly. Opinions in the scientific community are divided: some accept the findings as they are, while others think that the hot hand phenomenon does exist but expresses itself differently. I do not know whether it exists or not, but there is a simple explanation for the illusion. The need to look for and find patterns is so deeply embedded in our genes that we interpret a run of successes, whether successful shots at the basket, a few consecutive years of hotter-than-usual weather, or a string of stock-exchange profits, as a genuine, nonrandom occurrence even when the run is entirely consistent with the statistics of random events.
Does the Consumer Price Index cause sunspots? • Are there optimal marriages? • Game theory or conflict theory? • How much would you pay for a lottery ticket that is expected to win a million dollars? • Is it irrational to throw money into the trash bin? • Is someone who believes everything “simple”? • Can one arrive at a decision without preconceptions? • What is evolutionary rationality?
44. MACRO-CONSIDERATIONS
Since the dawn of history man's behavior has been the subject of analysis and debate in various spheres: literature, art, law, and political and philosophical studies. Yet, the use of a mathematical approach to describe and analyze people's conduct and decisions began only toward the end of the eighteenth century. In this chapter we describe some of these developments.
Human conduct, particularly in economic matters, can be divided into individual behavior and group behavior. Clearly the two are connected, as individual behavior determines group behavior. Yet, in economic issues it is still difficult to find a mathematical model that can provide a quantitative prediction of how global economic parameters follow from the decisions of individuals. It was the Scottish philosopher and economist Adam Smith (1723–1790) who coined the phrase “the invisible hand.” He presented the concept in his book An Inquiry into the Nature and Causes of the Wealth of Nations, published in 1776. In this book Smith laid the foundations of the theory of capitalism: every individual tries to maximize his own welfare, without regard to the needs of the public, and an invisible hand translates those individual actions such that they improve the situation of the society. The nature of the invisible hand remained unexplained. The first explanations did not appear until the 1950s, when economists began systematically to base the theory of capitalism on defined fundamentals; however, this approach met with only very limited success.
On the face of it, behavior in which every individual is concerned only with himself is highly consistent with Darwin's concepts of evolution, as the evolutionary struggle leads to competitive conduct. A closer look, however, reveals that competition in nature is not between individuals but between species. The victorious species are those that survive for generations, and they are not necessarily the species in which every individual fends for itself. A species may be able to survive because its members are prepared to sacrifice themselves for the common good. Such an evolutionary analysis showing the link between individual behavior and the success of the group does not yet exist with regard to the economic conduct of communities. Furthermore, the performance of a large economy is largely the result of the decisions of many individual decision makers, each of whom has little or negligible effect. In this sense there is a similarity with the mathematical description of nature: just as no quantitative mechanism has been found that combines elementary particles, with their wave characteristics, into bodies that obey Newton's laws, no quantitative mechanism has been discovered for the invisible hand.