The key facts are not in doubt. The large commercial jet was doing everything it was supposed to do. It was flying at the correct altitude and orientation (on autopilot); its Brazilian pilots were awake, alert, and in regular contact with their flight controllers. In addition, they were fully familiar with the plane they were flying and spoke the local language. The only mistake these pilots made was getting out of bed that morning. By contrast, the American crew was flying a plane of this kind for the first time. They were using the flight itself to master flying the craft by trial and error as they went along. Although they had had limited simulation training on this kind of airplane, they did not know how to read the instrument panel and, as they put it while in flight, were “still working out the kinks” on handling the flight management system. When attempting to do so, they could not compute time until arrival or weather ahead, much less notice whether their transponder was turned off, as soon enough it was. They tried to master the airplane display systems, toyed with a new digital camera, and planned the next day’s flight departure. They chatted with passengers wandering in and out of their cockpit. They did everything but pay attention to the task at hand—flying safely through airspace occupied by other airplanes.

They were, in fact, flying at the wrong altitude, contradicting both normal convention (even numbers in their direction) and the flight plan they had submitted (36,000 feet for the Brasilia–Manaus leg of their trip). But their own error was compounded by that of the Brasilia controller who okayed their incorrect altitude. They had managed to turn off their transponder (or it had done so on its own), so they were flying invisible to other planes and were blind themselves—a transponder warns both oncoming craft of your presence and you of theirs—yet they were completely unaware of this. They were barely in contact with the flight controllers, and when they were, the pilots showed little evidence of language comprehension or of interest in verifying what they thought the controllers were saying (“I have no idea what the hell he said”). They had spoken disparagingly of Brazilians and of the tasks asked of them, such as landing at Manaus.

Their flight plan was simplicity itself. They were to take off from near São Paulo on a direct leg to Brasilia at 37,000 feet; then they were to turn northwest toward Manaus at 36,000, since planes flying in the opposite direction would be coming at 37,000 feet. They then were to land at Manaus. Automatic pilots would attend to everything, and there was only one key step in the whole procedure: go down 1,000 feet when they made their turn high over Brasilia. This is precisely what the flight plan they submitted said they would do, it was the universal rule for flights in that direction, and it was assumed to be true by the flight bearing down on them from Manaus.

It was not, however, what they did. Instead, as they made their turn, they were at that moment busying themselves with more distant matters—trying to calculate the landing distance at Manaus and their takeoff duties the next day. This was part of their larger absorption in trying to master a new plane and its technology. For the next twenty minutes, the mistake was noticed by neither the pilots nor the Brazilian air controller who had okayed it, and by then the plane’s transponder was turned off, so there was no longer clear evidence to ground control of who and where they were. There is no evidence of deception, only of joking around as if jockeying for status while being completely oblivious to the real problem at hand. This is a recurring theme in self-deception and human disasters: overconfidence and its companion, unconsciousness. Incidentally, it was the copilot who seems to have been the first to realize what may have happened, and he took over control of the plane, later apologizing repeatedly to the pilot for this act of self-assertion. He was also the first, on arrival, to deny the cause of the accident and to provide a cover-up.

In the example of Air Florida Flight 90, the pilot’s self-deception—and copilot’s insufficient strength in the face of it—cost them their lives. In the case of Gol Flight 1907, both pilots who caused the tragedy survived their gross carelessness while 154 innocents perished. This is a distressing feature of self-deception and large-scale disasters more generally: the perpetrators may not experience strong, nor indeed any, adverse selection. As we shall see, it was not mistakes by astronauts or their own self-deception that caused the Challenger and Columbia disasters but rather self-deception and mistakes by men and women with no direct survival consequences from their decisions. The same can be said for wars launched by those who will suffer no ill effects on their own immediate inclusive fitness (long-term may be another matter), whatever the outcome, even though their actions may unleash mortality a thousand times more intense in various unpredictable directions.

ELDAR TAKES COMMAND—AEROFLOT FLIGHT 593

 

It is hard to know how to classify the 1994 crash of Aeroflot Flight 593 from Moscow to Seoul, Korea, an accident so absurd that its truth was covered up in Russia for months. The pilot was showing his children the cockpit and, against regulations, allowed each to sit in a seat and pretend to control the plane, which was actually on autopilot. His eleven-year-old daughter enjoyed the fantasy, but when his sixteen-year-old son, Eldar, took the controls, the teen promptly applied enough force to the steering wheel to deactivate most of the autopilot, allowing the plane to swerve at his whim.

Deactivation of the autopilot turned on a cockpit light (which was missed by the pilots), but more important, the pilot was trapped in a fantasy world in which he encouraged his children to turn the wheel this way and that and then to believe that this had an effect, while in fact the plane was (supposed to be) on autopilot. When his son actually controlled the plane’s movements, the pilot was slow to realize this was no fantasy; indeed, his son was the first to point out that the plane was actually turning on its own (due to forces unleashed by Eldar’s turning motions), but the plane then quickly banked at such an angle as to force everyone against their seats and the wall, so that the pilot could not wrest control of the plane from his son. After a harrowing vertical ascent, the copilot and Eldar managed to get the plane into a nosedive, which permitted control to be reestablished, but alas it was too late. The plane hurtled to the ground, losing all seventy-five aboard. Besides disobeying all standard rules for cockpit behavior, the pilot appeared blissfully unaware that he was doing this high in the air and was becoming trapped in the very fantasy he had created for his children. Of course, it is easy for adults to underestimate the special ability of children to seize control of electromechanical devices.

SIMPLE PILOT ERROR—OR PILOT FATIGUE?

 

We now turn to self-deception at higher levels of organization—within corporations or society at large—that impedes airline safety. That is, pilot error is compounded by higher-level error. For example, the major cause of fatal airline crashes is said to be pilot error—about 80 percent of all accidents in both 2004 and 2005. This is surely an overestimate, as airlines benefit from high estimates of pilot error. Still, evidence of pilot error is hardly lacking and is usually one of several factors in crashes. We do not know how much of this error is entrained by self-deception, but a common factor in pilot error is one we have already identified: overconfidence combined with unconsciousness of the danger at hand. Certainly this combination appears to have doomed John F. Kennedy Jr. (and his two companions) when he set out on a flight his experienced copilot was unwilling to take—into the gray, dangerous northeastern fog in which a pilot can easily become disoriented, mistake up for down, lose control, and enter a death spiral.

Consider a commercial example, documented by the flight recorder. On a cloudy day in October 2004 at 7:37 in the evening, a twin-engine turboprop approaching the airport at Kirksville, Missouri, was descending too low, too fast, though the pilots could not see the runway lights until they were below three hundred feet and soon were on top of trees. Both pilots and eleven of the thirteen passengers died in the crash. Below ten thousand feet, FAA rules require a so-called sterile cockpit, in which only pertinent communication is permitted, yet both pilots were sharing jokes and cursing frequently below this altitude. They discussed coworkers they did not like and how nice it would be to eat a Philly cheesesteak, but they did not attend to the usual rules regarding rate and timing of descent or to the plane’s warning system alerting them to the rapidly approaching ground below.

Of course, the usual human bias toward self-enhancement makes this negligence more likely: “rules that apply to the average pilot do not apply to better ones, such as me.” The pilot, whose job in this situation was to watch the instruments, said it was all right to descend because he could see the ground. The copilot—whose job was to look for the runway—said he could not see a thing, but he did not challenge the pilot, as rules required him to do. The pilot kept descending as if he could see the runway when he probably saw nothing at all until finally he spotted the landing lights and then immediately the tops of trees. Here we see familiar themes from the crash of Air Florida Flight 90: the irrelevant and distracting talk during takeoff in the first case and landing in this one, pilot overconfidence prevailing over the more reality-oriented but deferential copilot, and the pilot’s failure to read instruments, as was his duty.

It should be mentioned that the pilots could be heard yawning during their descent, and they had spent fourteen hours on the job, after modest sleep. This was their sixth landing that day. Had they followed proper procedure, they still should have been able to land safely, but surely fatigue contributed to their failure to follow procedure, as well as to their degree of unconscious neglect of the risks they were taking.

Now here comes the intervention of self-deception at the next level. In response to this crash, the NTSB recommended that the FAA tighten its work rules for pilots by requiring more rest time, the second time it had done so in twelve years, because the FAA did not act on the first recommendation. In response, the airline industry, represented by its lobbying organization, the Air Transport Association, argued that this was an isolated incident that did not require change in FAA rules. (If accidents were not isolated incidents, we would not get on airplanes.) “The current FAA rules . . . ensure a safe environment for our crews and the flying public.” Of course, they do no such thing: they save the airlines money by requiring fewer flight crews. And note the cute form of the wording: “our crews” comes first—we would hardly subject our own people to something dangerous—followed by reducing everyone else to “the flying public.” But neither management nor lobbyists are part of the flight crew, and predictably, the Air Line Pilots Association backed the rule change. True to form, in March 2009, seven airlines sued in federal court to overturn a recent FAA rule that imposed forty-eight-hour rest periods between twenty-hour flights (e.g., Newark to Hong Kong), a decision that followed earlier pioneering work by Delta Airlines to institute the rule and to provide proper sleeping quarters for the pilots during their nearly daylong flight. The fiction is that the FAA represents the so-called flying public; the truth is that it represents the financial interests of the airlines and represents the general public only reluctantly and in response to repeated failures.

ICE OVERPOWERS THE PILOTS; AIRLINES OVERPOWER THE FAA

 

Ice poses a special problem for airplanes. Ice buildup on the wings increases the plane’s weight while changing the pattern of airflow over both the main wings and the small rear control wings. This reduces lift and in some cases results in rapid loss of control, signaled by a sudden pitch and a sharp roll to one side. The controls move on their own, sometimes overpowering counterefforts by the pilots. Commuter planes are especially vulnerable because they commonly fly at lower altitudes, such as ten thousand feet, at which freezing drizzle is more common. When icing results in loss of control, the plane turns over and heads straight to the ground.

To take an example, on October 31, 1994, American Eagle Flight 4184 from Indianapolis had been holding at ten thousand feet in a cold drizzle for thirty-two minutes with its de-icing boot raised (to break some of the ice above it), when it was cleared by Chicago air traffic controllers to descend to eight thousand feet in preparation for landing. Unknown to the pilots, a dangerous ridge of ice had built up on the wings, probably just behind the de-icing boot, so that as the pilots dipped down, they almost immediately lost control. The plane’s controls moved on their own but on the right wing only, immediately tilting the plane almost perpendicular to the ground. The pilots managed to partly reverse the roll before the (top-heavy) plane flipped upside down and hit the ground at a 45-degree angle in a violent impact that left few recognizable pieces, including any of the sixty-eight people aboard.

This was an accident that did not need to happen. This kind of airplane (ATR 42 or 72 turboprops) had a long history of alarming behavior under icing conditions, including twenty near-fatal losses of control and one crash in the Alps in 1987 that killed thirty-seven people. Yet the problem kept recurring because safety recommendations were met by strong resistance from the airlines—which would have to pay for the necessary design changes—and the FAA ended up acting like a biased referee, approving relatively inexpensive patches that probably reduced (at least slightly) the chance of another crash but did not deal with the problem directly. As one expert put it, “Until the blood gets deep enough, there is a tendency to ignore a problem or live with it.” To wait until after a crash to institute even modest safety improvements is known as tombstone technology. The regulators and airline executives are, in effect, conscious of the personal cost—immediate cost to the airlines in mandated repairs and bureaucratic cost to any regulator seen as unfriendly to the airlines—while being unconscious of the cost to passengers.
