The Folly of Fools

Robert Trivers


Disasters are always studied in retrospect; we will not have an experimental science of the subject anytime soon. Disasters range from the personal—your wife tells you she is leaving you for the mailman—to the global—your country invades the wrong nation, with catastrophic effects all around. Disasters, of course, are expected to be closely linked to self-deception: there is nothing like being unconscious of reality to make it intrude upon you in unexpected and painful ways. In this chapter we will concentrate on one kind of disaster—airplane and spacecraft crashes—because they are typically subject to intensive investigation immediately afterward to determine the causes and avoid repetition. For our purposes, these accidents let us study the cost of self-deception in depth under highly controlled circumstances: they produce a detailed and well-analyzed body of information on their causes, and they form a well-defined category. As we shall see, there are repeated ties to self-deception at various levels: the individual, pairs of individuals (pilot and copilot), institutions (NASA), and even countries (Egypt).

But there is one striking difference between space and aviation disasters. In the United States, aviation disasters are immediately and intensively studied by teams of experts on twenty-four-hour notice in an institution designed to be insulated from outside interference, the National Transportation Safety Board. The NTSB generally does a superb job and publicizes its findings quickly. It almost always discerns the key causes and then makes appropriate recommendations, which appear to have helped reduce the accident rate steadily for some thirty years, so that flying is, by far, the safest form of travel. I know of only one case of a delayed report (about three years), and this was because of interference at the international level, when Egypt fought the truth to the bitter end.

By contrast, NASA’s accidents are investigated by a committee appointed to study only a specific disaster, with no particular expertise, and sometimes with a preordained and expressed goal of exonerating NASA. Study of one disaster does not prevent another, even when the second has many of the same causes identified in the first. Of course, safety corners can more easily be cut when only the lives of a few astronauts are at stake, instead of the great flying public, including airline personnel.

Aviation disasters usually result from multiple causes, one of which may be self-deception on the part of one or more key actors. When the actors number more than one, we can also study processes of group self-deception. A relatively simple example of this is the crash of Air Florida Flight 90 in 1982, in which both pilot and copilot appear to have unconsciously “conspired” to produce the disaster.

AIR FLORIDA FLIGHT 90—DOOMED BY SELF-DECEPTION?

 

On the afternoon of January 13, 1982, Air Florida Flight 90 took off from Washington, D.C.’s National Airport in a blinding snowstorm on its way to Tampa, Florida. It never made it out of D.C., instead slamming into a bridge and landing in the Potomac River—seventy-four people died, and five survivors were fished out of the back of the plane. Perhaps because one of those who died was an old friend of mine from Harvard (Robert Silberglied), I was listening with unusual interest when soon thereafter the evening news played the audiotape of the cockpit conversation during takeoff. The copilot was flying the plane, and you could hear the fear in his voice as he also performed the role the pilot should have been playing, namely reading the instrument panel. Here is how it went:

Ten seconds after starting down the runway, the copilot responds to instrument readings that suggest the plane is traveling faster than it really is: “God, look at that thing!” Four seconds later: “That doesn’t seem right, does it?” Three seconds later: “Ah, that’s not right.” Two seconds later: “Well . . .”

Then the pilot, in a confident voice, offers a rationalization for the false reading: “Yes, it is, there’s 80,” apparently referring to an airspeed of 80 knots. This fails to satisfy the copilot, who says, “Naw, I don’t think that’s right.” Nine seconds later, he wavers: “Ah, maybe it is.” That is the last we hear from the copilot until a second before the crash when he says, “Larry, we’re going down, Larry,” and Larry says, “I know.”

And what was Larry doing all this time? Except for the rationalization mentioned above, he only started talking once the mistake had been made and the plane was past the point of no return—indeed when the device warning of a stall started to sound. He then appeared to be talking to the plane (“Forward, forward.” Three seconds later: “We only want five hundred.” Two seconds later: “Come on, forward.” Three seconds: “Forward.” Two seconds: “Just barely climb.”). Within three more seconds, they were both dead.

What is striking here is that moments before a disaster that will claim seventy-four lives, including those of both primary actors, we have an apparent pattern of reality evasion on the part of one key actor (the pilot) and insufficient resistance on the part of the other. On top of this, the typical roles were reversed, each man playing the other’s: pilot (ostensibly) as copilot and vice versa. Why was the copilot reading out the contradictory panel readings while the pilot offered only a rationalization? Why did the copilot speak while it mattered, whereas the pilot started talking only when it was too late?

The first thing to find out is whether these differences are specific to the final moments or whether we can find evidence of similar behavior earlier. The answer is clear. In the final forty-five minutes of discussion between the two men prior to takeoff, a clear dichotomy emerges: the copilot is reality-oriented; the pilot is not. Consider their discussion of snow on the wings, a critical variable. Pilot: “I got a little on mine.” Copilot: “This one’s got about a quarter to half inch on it all the way.” There were equal amounts of snow on both wings, but the pilot gave an imprecise and diminutive estimate, while the copilot gave an exact description.

And here is perhaps the most important exchange of all, one that occurred seven minutes before takeoff. Copilot: “Boy, this is a losing battle here on trying to de-ice those things. It gives you a false sense of security is all that it does” (!!). Pilot: “This, ah, satisfies the Feds.” Copilot: “Yeah—as good and crisp as the air is and no heavier than we are, I’d . . . ” Here is the critical moment in which the copilot timidly advanced his takeoff strategy, which presumably was to floor it—exactly the right strategy—but the pilot cut him off midsentence and said, “Right there is where the icing truck, they oughta have two of them, pull right.” The pilot and copilot then explored a fantasy together about how the plane should be de-iced just before takeoff.

Note that the copilot began with a true statement—they had a false sense of security based on a de-icing that did not work. The pilot noted that this satisfies the higher-ups but then switched the discussion to the way the system should work. Though not without its long-term value, this rather distracts from the problem at hand—and at exactly the moment when the copilot suggests his countermove. But he tried again. Copilot: “Slushy runway, do you want me to do anything special for this or just go for it?” Pilot: “Unless you got something special you would like to do.” No help at all.

The transcript suggests how easily the disaster could have been averted. Imagine that the earlier conversation about snow on the wings and slushy conditions underfoot had induced a spirit of caution in both parties. How easy it would have been for the pilot to say that they should go all-out but be prepared to abort if they felt their speed was insufficient.

A famous geologist once surveyed this story and commented: “You correctly blame the pilot for the crash, but maybe you do not bring out clearly enough that it was the complete insensitivity to the copilot’s doubts, and to his veiled and timid pleas for help, that was the root of all this trouble. The pilot, with much more experience, just sat there completely unaware and without any realization that the copilot was desperately asking for friendly advice and professional help. Even if he (the pilot) had gruffly grunted, ‘If you can’t handle it, turn it over to me,’ such a response would have probably shot enough adrenaline into the copilot so that he either would have flown the mission successfully or aborted it without incident.” It is this dreadful, veiled indecision that seems to seal the disaster: the copilot tentative, uncertain, questioning, as indeed he should be, yet trying to hide it, and ending up dead in the Potomac.

The geologist went on to say that in his limited experience in mountain-rescue work and in abandoned mines, the people who lead others into trouble are the hale and hearty, insensitive jocks trying to show off. “They cannot perceive that a companion is so terrified he is about to ‘freeze’ to the side of the cliff—and for very good reasons!” Those companions in turn freeze and are often the most difficult to rescue. In the case of Flight 90, it was not just the wings that froze, but the copilot as well, and then so did the pilot, who ended up talking to the airplane.

Earlier decisions infused with a similar spirit contributed to the disaster. The pilot authorized “reverse thrust” to power the airplane out of its departure place. It was ineffective in this role, but it apparently pushed ice and snow to the forward edge of the wing, where they would do the most damage, and at the same time blocked a key sensor, so that the instruments registered a higher speed than the plane had in fact attained. The pilot has been separately described as overconfident and inattentive to safety details. The presumed benefit of his style in daily life is the appearance of greater self-confidence and the success this sometimes brings, especially in interactions with others.

It is interesting that the pilot/copilot configuration on Flight 90 (copilot at the helm) is actually the safer of the two. Even though on average the pilot is flying about half the time, more than 80 percent of all accidents occur when he is doing so (in the United States, 1978–1990). Likewise, many more accidents occur when the pilot and copilot are flying together for the first time: 45 percent of all accidents involve such first pairings, while only 5 percent of safe flights do. The notion is that the copilot is even less likely to challenge mistakes of the pilot than vice versa, especially if the two are unfamiliar with each other. In our case, the pilot is completely unconscious, so he is not challenging anyone. The copilot is actually challenging himself but, getting no encouragement from the pilot, he lapses back into ineptitude.
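
To see how strong these associations are, here is a back-of-the-envelope calculation (a sketch only, assuming the figures above: pilots at the controls on roughly half of all flights, and first pairings on roughly 5 percent of safe flights):

\[
\frac{P(\text{accident}\mid\text{pilot flying})}{P(\text{accident}\mid\text{copilot flying})}
  = \frac{0.80/0.50}{0.20/0.50} = 4,
\qquad
\mathrm{OR}_{\text{first pairing}}
  = \frac{0.45/0.55}{0.05/0.95} \approx 15.5.
\]

On these numbers, a flight is about four times as likely to end in an accident when the pilot rather than the copilot is at the controls, and the odds of an accident are roughly fifteen times higher when the two men have never flown together before.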

Consider now an interesting case from a different culture. Fatal accident rates for Korean Air between 1988 and 1998 were about seventeen times higher than for a typical US carrier, so high that Delta and Air France suspended their flying partnerships with Korean Air, the US Army forbade its troops to fly with the airline, and Canada considered denying it landing rights. An outside group of consultants was brought in to evaluate the problem and concluded, among other factors, that Korea, a society relatively high in hierarchy and power dominance, was not preparing its copilots to act assertively enough. Several accidents could have been averted if the relatively conscious copilot had felt able to communicate effectively with the pilot to correct his errors. The culture in the cockpit was perhaps symbolized when a pilot backhanded a copilot across the face for a minor error, a climate that does not readily invite copilots to take strong stands against pilot mistakes. The consultants argued for emphasizing copilot independence and assertiveness. Even the insistence on better mastery of English—itself critical to communicating with ground control—improved equality in the cockpit, since English lacks the built-in hierarchical markers to which Koreans respond readily when speaking Korean. In any case, since the intervention, Korean Air has had a spotless safety record. The key point is that hierarchy may impede information flow: there are two people in the cockpit, but with sufficient dominance, there is effectively only one.

A similar problem was uncovered in hospitals where patients contract new infections during surgery, many of which turn out to be fatal and could be prevented simply by insisting that the surgeon wash his (or, occasionally, her) hands. A steep hierarchy—with the surgeon unchallenged at the top and the nurses carrying out orders at the bottom—was found to be the key factor. The surgeon practiced self-deception, denied the danger of not washing his hands, and used his seniority to silence any voices raised in protest. The solution was very simple: empower nurses to halt an operation if the surgeon has not washed his hands properly (until this rule was introduced, 65 percent had failed to do so). Rates of death from newly contracted infections have plummeted wherever this has been introduced.

DISASTER 37,000 FEET ABOVE THE AMAZON

 

Another striking case of pilot error occurred high above the Amazon in Brazil at 5:01 p.m. on September 29, 2006. A small private jet flying at the wrong altitude clipped a Boeing 737 (Gol Flight 1907) from underneath, sending it into a horrifying forty-two-second nosedive to the jungle below and killing all 154 people aboard. The small American executive jet, though damaged, landed safely at a nearby airport with its seven occupants alive. Again, the pilot of the small jet seemed less conscious than his copilot once the disaster was upon them, but neither was paying attention when the fatal error was made, nor for a long time afterward.
