A Sense of the Enemy: The High Stakes History of Reading Your Rival's Mind

Author: Zachary Shore
Kurzweil asserts that the neocortex—the large frontal region of the brain where most such calculations are conducted—is composed of layers upon layers of hierarchical pattern recognizers. These pattern recognizers, he maintains, are constantly at work making and adjusting predictions. He offers the simple sentence:
Consider that we see what we expect to—
Most people will automatically complete that sentence based on their brain’s recognition of a familiar pattern of words. Yet the pattern recognizers extend far deeper than that. To recognize the word “apple,” for example, Kurzweil notes that our brains must not only anticipate the letter “e” after having read a-p-p-l; they must also recognize the letter “a” itself by identifying familiar curves and line strokes. Even when an image of an object is smudged or partially obscured, our brains are often able to complete the pattern and recognize the letter, word, or familiar face. Kurzweil believes that the brain’s most basic and indeed vital function is pattern recognition.
This ability is exceptionally advanced in mammals and especially in humans. It is an area where, for the moment, we retain a limited advantage over computers, though the technology for pattern recognition is rapidly improving, as evidenced by the Apple iPhone’s Siri speech-recognition software. For a quick example of your own brain’s gifts in this arena, try to place an ad on the website Craigslist. At the time of this writing, in order to prove that you are a human and not a nefarious robot, Craigslist requires users to input a random string of letters or numbers presented on the screen. The image, however, is intentionally blurred. Most likely, you will have no difficulty identifying the symbols correctly. Robots, in contrast, will be baffled, unable to make sense of these distorted shapes. For fun, try the option for an audio clue. Instead of typing the characters in the image, listen to the spoken representation of those letters and numbers. You will hear them spoken in a highly distorted manner amidst background noise. The word “three,” for example, might be elongated, stressed, or intoned in a very odd way: “Thaaaaaaaa-reeeeeeee.” It sounds as if the speaker is either drunk, on drugs, or just being silly. The point is that a computer program attempting to access the site cannot recognize the numbers and letters when they do not appear in their usual patterns. Our brains possess an amazing ability to detect patterns even under extremely confusing conditions. But before you start feeling too smug, note that Kurzweil predicts we have only until the year 2029, when computers will rival humans in this and other regards. So enjoy it while it lasts.
Let me sum up this section. Kurzweil’s theory suggests that pattern recognition is the brain’s most crucial function, and our sophisticated development of this ability is what gives human beings the edge over other animals and, for now, over computers as well. I suggest that the best strategic empaths are those who focus not only on enemy patterns but also on meaningful pattern breaks, and who correctly interpret what those breaks mean. Next, Claude Shannon’s information theory shows that new and surprising information is more valuable than routine data. I observe that pattern breaks are, in fact, markers of new and surprising information, possessed of greater value to leaders than the enemy’s routine actions. Finally, the theory-of-mind scholarship provides ways of thinking about how we mentalize, or enter another’s mind, which I employed throughout this book, especially when scrutinizing how Stalin tried to think like Hitler.
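Shannon’s point about surprise can be stated precisely: the information content of an event with probability p is −log₂ p bits, so improbable events (pattern breaks) carry far more information than routine ones. The following is a minimal sketch of that formula; the probabilities are illustrative numbers of my own, not figures from Shannon or from this book.

```python
import math

def self_information(p: float) -> float:
    """Shannon self-information in bits: I(x) = -log2(p).

    The rarer the event, the more bits of information its
    occurrence conveys to an observer.
    """
    return -math.log2(p)

# An enemy repeating its usual behavior (high probability) tells a
# leader almost nothing new; a surprising pattern break (low
# probability) is information-rich.
routine_move = self_information(0.99)   # ~0.0145 bits
pattern_break = self_information(0.01)  # ~6.64 bits

print(f"routine: {routine_move:.4f} bits, break: {pattern_break:.2f} bits")
```

On this measure the pattern break carries several hundred times the information of the routine move, which is one way to formalize why leaders should weight departures from habit so heavily.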
Before we grow too enamored of all these theories, we should remember that theories are not always right. Often their proponents, in their well-intentioned enthusiasm, exaggerate the scope and significance of their discoveries. This is particularly true of some recent works in social science—studies that bear directly on the nature of prediction.

How Silly Are We?

One of the striking features infusing much of the recent social science scholarship on prediction is its tendency to expose alleged human silliness. Across fields as diverse as behavioral economics, cognitive psychology, and even the science of happiness or intuition, studies consistently show how poor we are at rational decision-making, particularly when those choices involve our expectations of the future. Yet too often these studies draw sweeping conclusions about human nature from exceedingly limited data. In the process, they typically imply that their subjects in the lab will respond the same way in real life. Before we can apply the lessons of cognitive science to history, we must first be clear on the limits of those exciting new fields. We should temper our enthusiasm and must not be seduced by science.
Consider one daring experiment by the behavioral economist Dan Ariely. Ariely recruited male students at the University of California, Berkeley, to answer intimate questions about what they thought they might do in unusual sexual situations. After the subjects had completed the questionnaires, he then asked them to watch sexually arousing videos while masturbating—in the privacy of their dorm rooms, of course. The young men were then asked these intimate questions again, only this time their answers on average were strikingly different. Things that the subjects had previously thought they would not find appealing, such as having sex with a very fat person or slipping a date a drug to increase the chance of having sex with her, now seemed much more plausible in their excited state. Ariely concluded from these results that young people are not themselves when their emotions take control. “In every case, the participants in our experiment got it wrong,” Ariely explains. “Even the most brilliant and rational person, in the heat of passion seems to be absolutely and completely divorced from the person he thought he was.”
12
Ariely is one of America’s most intriguing and innovative investigators of behavioral psychology. His research has advanced our understanding of how poorly we all know ourselves. And yet there is a vast difference between what we imagine we would do in a situation and what we would actually do if we found ourselves in it. In other words, just because a young man in an aroused state says that he would drug his date does not guarantee that he truly would do it. He might feel very differently if the context changed from masturbating alone in his dorm room to being present with a woman on an actual date. Can we be so certain that he really would slip the drug from his pocket into her drink? Or would he truly have sex with a very overweight person if she were there before him? Would he have sex with a sixty-year-old woman or a twelve-year-old girl, or any of Ariely’s other scenarios, if he were presented with the opportunity in real life? Life is not only different from the lab; it also has a funny way of being rather different from the fantasy.
A great many recent studies suffer from a similar shortcoming. They suggest profound real-world implications from remarkably limited laboratory findings. In his wide-ranging book on cognitive psychology, Nobel Laureate Daniel Kahneman describes the priming experiments conducted by Kathleen Vohs in which subjects were shown stacks of Monopoly money on a desk or computers with screen savers displaying dollar bills floating in water. With these symbols priming their subconscious minds, the subjects were given difficult tests. The true test, however, came when one of the experimenters “accidentally” dropped a bunch of pencils on the floor. Apparently, those who were primed to think about money helped the experimenter pick up fewer pencils than those who were not primed. Kahneman asserts that the implications of this and many similar studies are profound. They suggest that “. . . living in a culture that surrounds us with reminders of money may shape our behavior and our attitudes in ways that we do not know about and of which we may not be proud.”
13
If the implications of such studies mean that American society is more selfish than other societies, then we would have to explain why Americans typically donate more of their time and more of their income to charities than do those of nearly any other nation.
14
We would also need to explain why some of the wealthiest Americans, such as Bill Gates, Warren Buffett, Mark Zuckerberg, and a host of billionaires, have pledged to donate half of their wealth within their lifetimes.
15
Surely these people were thinking hard about their money before they chose to give it away. We simply cannot draw sweeping conclusions from snapshots of data.
I want to mention one other curious study from psychology. Its underlying assumption has much to do with how we behave during pattern breaks. Gerd Gigerenzer is the highly sensible Director of the Max Planck Institute for Human Development and an expert on both risk and intuition. Some of his work, which he related in a book titled Gut Feelings, was popularized in Malcolm Gladwell’s Blink. Gigerenzer has never been shy about pointing out perceived weaknesses and shallow logic in his own field. He has written cogently on the flaws embedded in Daniel Kahneman’s and Amos Tversky’s heuristics-and-biases project.
16
Yet even Gigerenzer has occasionally fallen into the “how silly are we?” camp, though he certainly did not take the following topic lightly. Unfortunately, this particular study suggests that Americans behaved irrationally after 9/11, when their reactions may in fact have been perfectly sound.
Gigerenzer found that American fatalities from road accidents increased after 9/11.
17
Because many Americans were afraid to fly in the year following the attacks, they drove instead. Presumably, the increased number of drivers led to more collisions, producing roughly 1,500 more deaths than usual. Gigerenzer’s main aim is prudent and wise: governments should anticipate likely shifts in behavior following terrorist attacks and should take steps to reduce indirect damage, such as the rise in accidents that results from changed behavior. But the underlying assumption is that many Americans cannot think rationally about probability. Gigerenzer implies that the decision not to fly after 9/11 was based on irrational fears: had they continued to fly instead of drive, fewer Americans would have died.
The problem with such reasoning, as you’ve likely already guessed, is that it ignores the pattern-break problem. A statistician might argue that, despite the 9/11 hijackings, the odds of dying in a plane crash were still extremely low. But those odds are based on a prior pattern—prior to a meaningful and dramatic pattern break. After 9/11, Americans had to wonder whether other terrorist plots using airplanes were still to come. If the terrorists could defeat our security checks once, could they do it again? Given that these were the acts of an organization and not of a single, crazed individual, and given that the leader of that organization vowed to strike America again, it was wise to adopt a wait-and-see approach. The past odds of flying safely no longer mattered in light of a potentially ongoing threat. Without any means of determining how great that threat would be, driving was a perfectly rational alternative, even knowing that one’s odds of dying in a car crash might rise. Until a new pattern is established (or a prior one returns), the odds of dying in a hijacked plane might be even higher.
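The logic of choosing to drive can be sketched numerically. Every figure below is a hypothetical assumption for illustration only, not data from this book or from Gigerenzer’s study; the point is the structure of the decision, not the numbers.

```python
# Illustrative sketch: why driving can be rational after a pattern break.
# All risk figures are made-up placeholders, not real statistics.

drive_risk = 1e-7            # assumed per-trip fatality risk of driving
fly_risk_historical = 1e-8   # per-flight risk implied by the OLD pattern

# After a dramatic pattern break, the traveler can no longer trust the
# historical rate. Suppose she can only bound the new hijacking risk
# somewhere between the old estimate and a much larger plausible value:
fly_risk_worst_case = 1e-5   # assumed upper bound while the threat is unknown

# A cautious, wait-and-see traveler who weighs the worst case prefers
# driving, even though the historical odds clearly favored flying.
prefers_flying_on_old_odds = fly_risk_historical < drive_risk
prefers_driving_after_break = drive_risk < fly_risk_worst_case

print(prefers_flying_on_old_odds, prefers_driving_after_break)
```

Under these assumed numbers both conditions hold at once: the old pattern favored flying, while the post-break uncertainty favors driving. That is precisely the sense in which the statistician and the wary traveler can both be reasoning correctly from different premises.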
In his article, Gigerenzer did observe that following the Madrid train bombings in 2004, Spaniards reduced their ridership on trains, but those rates returned to normal within a few months. Gigerenzer speculates that one reason might have been that the Spanish are more accustomed than Americans to dealing with terrorist attacks. In other words, the Madrid train bombings represented less of a pattern break than did 9/11.
Another way of thinking about this problem is to compare it with the horrific movie theater shootings in Aurora, Colorado, on July 20, 2012, in which a lone gunman shot twelve people to death and wounded fifty-eight others. As frightening as this incident was, it would not have made sense for Americans across the country, or even in Aurora, to have stopped attending films in theaters. The incident marked no new breach in security and no innovation in killing techniques. The same risk had long been present. The assailant operated alone, not as part of an international terrorist network. While there is always the chance of copy-cat attacks, it remained valid to consider the odds of being murdered in a movie theater based on the pattern of past killings in theaters or in public spaces in general. The Aurora attacks did not represent a meaningful break in the pattern of American gun violence. Like Spaniards after the Madrid train bombings, Americans have sadly become accustomed to episodes like these.
Judging probability is an excellent way of assessing risk only when we focus on the right data and recognize when the old odds no longer matter. The famed English economist John Maynard Keynes is often quoted for his snappy remark, “When the facts change, I change my mind. And what do you do, sir?” I would offer a variation of Keynes’s quip.
When the pattern breaks, I change my behavior.
How about you?
My goal in this discussion is not to disparage the work of behavioral scientists. On the contrary, their work can help us challenge the assumptions we have too long taken for granted. My aim instead is to caution us against carrying the implications of such studies too far. The experiments of behavioral scientists can help guide our thinking about how we think, as long as we remain cognizant of the gulf between labs and real life.
18
And here is where I believe historians can add true value.
