Voodoo Histories: The Role of the Conspiracy Theory in Shaping Modern History
David Aaronovitch
The idea that supposedly rational arguments about how big bad things happen originate not in the real world but in our internal selves—that they are the outer expression of inner problems—was explored by Elaine Showalter in her 1997 book Hystories: Hysterical Epidemics and Modern Culture. Showalter looked at a series of moral and health panics that had erupted at various times in Britain and America, including ritual child abuse, Gulf War syndrome, and CFS (chronic fatigue syndrome). In them, she discerned common patterns.
Skeptical about all these phenomena, Showalter argues that what we are in fact looking at is hysteria, the external manifestation of repressed feelings. “Hysteria has not died,” she writes. “It has simply been relabeled for a new era . . . Contemporary hysterical patients blame external sources—a virus, sexual molestation, chemical warfare, satanic conspiracy, alien infiltration—for psychic problems.”
34
In the era of mass media and instant communication, these hysterias find traction with other people looking for explanations for their feelings and symptoms and “multiply rapidly and uncontrollably.”
35
One of Showalter’s pieces of evidence may cast some light on an under-remarked aspect of conspiracism—its gender. She points out that a Harvard Medical School study discovered that 80 percent of chronic fatigue syndrome sufferers were women, as were 90 percent of those who, usually under hypnosis, supposedly recovered hidden memories of sexual abuse, and two-thirds of those reporting alien abduction.
36
This, together with the number of times I have been told “My husband/boyfriend will be very interested in your book,” prompts the thought that conspiracy theories may be hysterias for men.
Unsurprisingly, Showalter’s analysis was unpopular. In nonclinical usage, “hysteria” connotes a somewhat ridiculous lack of control rather than a genuine psychological condition. Furthermore, to the person who believes that her son’s all-too-real autism must be explicable, and that the explanation must be external interference in the shape of a state-sponsored measles jab, Showalter will seem insulting. Such psychologizing didn’t find favor among sociologists either. To Peter Knight, Showalter’s thesis was something of a conspiracy theory itself. “The figuration of the spread of paranoid thinking as an ‘epidemic’ or a ‘plague,’ ” he charged, “likewise renders it an inscrutable and virtually unstoppable force that infiltrates innocent minds.”
37
This, it seems to me, is a misreading of Showalter’s attempt to explain how we create or borrow stories for ourselves. In fact, Showalter was concerned to argue that this impulse to grasp half-baked and damaging but attractive notions of why the world is as it is could be replaced through emotional literacy. “Men and women, therapists and patients,” she concluded, “will need courage to face the hidden fantasies, myths, and anxieties that make up the current hysterical crucible; we must look into our own psyches rather than to invisible enemies, devils, and alien invaders for the answers.”
38
About Fashion
In her work on hysteria, Showalter argues that “like all narratives,” her mass hysterias have “their own conventions, stereotypes, and structures.”
39
The more the specific thesis is talked about, the more people feel that they have had the same experience or are suffering the same symptoms, the more voluminous the literature becomes, the more aggressive in defense of their illness the victims become, and the more the idea is accepted into the mainstream. Eventually, however, the panic dies down, to be replaced fairly soon by another, similar outbreak, though with a completely different focus. The same is true of conspiracy theories. The set of charges and allegations surrounding the 1984 Hilda Murrell death was, as we’ve seen, highly specific in type to the period between 1980 and 1987. During that time, a large number of theories, or beliefs, or dramas, focused precisely on a supposed matrix composed of the nuclear industry, American and British intelligence, and semi-corrupt politicians. After Mikhail Gorbachev’s summit with Ronald Reagan in Iceland in late 1986, the nuclear-matrix conspiracy theory all but ended.
Based, as they supposedly are, on the uncovering of hidden truths, conspiracy theories should not be subject to fashion, and yet they clearly are. As a result, one suspects that conspiracy theories also have a social function and that they could be classic examples of what the biologist Richard Dawkins has called “memes”: ideas that replicate themselves because of the utility of sharing them, even though they are genuinely felt. In the immediate aftermath of 9/11, it was extremely rare to find someone outside the Arab press arguing that there had been a cover-up. That had changed by 2006, and what had altered was not, I would argue, the presentation of any new facts but the widespread social acceptability of blaming the U.S. administration.
The Triumph of Narrative
While I was writing this book, I went on a visit to the University of Winchester to give a talk about conspiracism and The Da Vinci Code. At dinner, I found myself sitting with a senior member of the drama department. I told him that one of the things I found interesting about conspiracy theories was the need for a narrative that they suggested. “Ah, yes,” he said. “You should read Mamet.”
This was excellent advice. The American playwright and screenwriter’s fourth collection of essays almost starts with the words, “It is in our nature to dramatize.” By this, Mamet doesn’t mean that we are all a bit histrionic sometimes, but rather that we need to construct, or have constructed, dramas and stories for ourselves. Therapists and psychoanalysts know the truth of this. Their patients, like the rest of us, invariably have a story about inexplicable or mundane aspects of their lives. Our illnesses are due to stress or genetics or that day we went out for a walk and it was cold. Adopted children very often create a backstory of their real parents, and unadopted children have fantasies of their “real” mothers and fathers. As Mamet points out, we will have a story even if it means giving characteristics to the elemental. So “the weather is impersonal, and we both understand it and exploit it as dramatic, i.e., having a plot, in order to understand its meaning for the hero, which is to say, for ourselves.”
40
This is not some kind of occasional preference, done merely to keep ourselves entertained. Mamet observes that just as children use up the last of the day’s energy by jumping around, “the adult equivalent, when the sun goes down, is to create or witness drama—which is to say to order the universe into a comprehensible form. Our sundown play/film/gossip is the day’s last exercise of that survival mechanism . . . We will have drama in that spot, and if it’s not forthcoming we will cobble it together out of nothing.”
41
At first hearing, this may sound like a clever artist’s generalization—an observation and nothing more. But in 2006, the British human biologist Lewis Wolpert theorized that the compulsion to create a story, “to have drama in that spot,” might actually be biological—that it represented a “cognitive imperative,” an innate need to have the world organized cognitively. Wolpert speculated that the requirement to establish causality was a necessity for an animal that made tools in order to survive, and had thus become instinctive. “Once there were causal beliefs for tool use,” he argued, “then our ancestors developed causal beliefs about all key events.”
42
If we are impelled, therefore, to find causes, it follows that failure to do so creates discomfort or anxiety. Consequently, human beings evolved with “a strong tendency to make up a causal story to provide an explanation . . . ignorance about important causes is intolerable.”
43
Wolpert’s focus was on the universality of religious beliefs, a universality that prevailed even though the beliefs themselves were mutually incompatible. But his idea works rather well with conspiracism: “We construct apparently coherent stories about what happened . . . but where consistency and internal satisfaction have to compete with testing against the real world, we choose consistency.”
44
If Wolpert is right, then a religious conspiracy theorist like David Ray Griffin represents the ultimate in the triumph of narrative. Meanwhile, all of us who argue for a living, including this author, might do well to consider Wolpert’s observation of the tendency always to look for confirmation of preexisting stories rather than their falsification.
The Catastrophe of Indifference
So, we need story and may even be programmed to create it. But why are certain types and structures of story more successful, more satisfying than others? One possible answer is that a successful story either represents the way we think things should happen, or is the best explanation we can get of why they didn’t. A New York fire chief, asked to account for the various theories surrounding the collapse of buildings at the World Trade Center, attributed them to the disappointment of people’s belief in the omnipotence of the emergency services. “In the movies,” he said, “it’s always wrapped up in the end.” Or, as Norman Cohn puts it when discussing paranoid thought in his history of apocalyptic movements, people cannot accept “the ineluctable limitations and imperfections of human existence, such as transience, dissension, conflict, fallibility whether intellectual or moral.”
45
The paradox is that, seen this way, conspiracy theories are actually reassuring. They suggest that there is an explanation, that human agencies are powerful, and that there is order rather than chaos. This makes redemption possible. “After all,” argues Dr. Jeffrey M. Bale, an American academic specializing in the ideology of terrorism, “if evil conspirators are consciously causing undesirable changes, the implication is that others, perhaps through the adoption of similar techniques, may also consciously intervene to protect a threatened way of life or otherwise alter the historical process.”
46
There is, however, another possible form of reassurance, of an altogether more personal kind. The classic view of paranoia, the unwarranted belief that one is being persecuted, is that it is a wholly negative state. But what if paranoia is actually the sticking plaster that we fix to a very different kind of wound? That of feeling ourselves to be of no importance whatsoever, and our lives (and especially our deaths) of little real significance except to ourselves.
The London-based American psychoanalyst Stephen Grosz believes this may be the case. He argues, after twenty-five years of practice, that paranoia may often be a defense against indifference, against the far more terrible thought that no one cares about you. The elderly, at a time of their lives when no one very much wonders what they think, often become classically paranoid, believing that someone wishes to rob or hurt them. The lonely person fears that there is a burglar or a murderer in the empty house waiting for them. Indeed, they may often perceive the real symptoms of such threats—the noises, the shadows, the displaced objects. These fears disguise the truly obliterating disaster, the often well-founded fear that no one is thinking about them at all, what Grosz calls “the catastrophe of indifference.”
47
Everyone knows Oscar Wilde’s famous dictum “There is only one thing in the world worse than being talked about, and that is not being talked about.” Fewer will have heard Susan Sontag’s clever development of it: “I envy paranoids. They actually feel people are paying attention to them.” If conspiracism is a projection of paranoia, it may exist in order to reassure us that we are not the totally unconsidered objects of a blind process. If Marilyn was murdered, then she did not die, as we most fear and as we most often observe, alone and ingloriously. A catastrophe occurred, but not the greater catastrophe that awaits all of us.
But if conspiracy theories are paradoxically comforting, it doesn’t mean that they are not harmful. It is worth quoting at length here the judgment of the historian Stephen E. Ambrose:
We should care because conspiracy theories about past events usually carry with them a political agenda for today. Erroneous or downright mythical views of the past can have important, even crucial, influence on the present. The coming to power of the Nazis, German rearmament, ultimately World War II might not have happened without widespread German belief in the stab-in-the-back conspiracy. Widespread acceptance by the American people of the “merchants of death” conspiracy thesis about our entry into World War I was a prelude to the ill-fated, nearly disastrous neutrality legislation of the 1930s. The unhappy consequences of McCarthyism would not have come about had the American people rejected his conspiracy thesis about the triumph of Communism in China.
48
Ambrose could have added many more examples and, facing some dangerous challenges in the early twenty-first century, so could we. I am with John Maynard Keynes, whose view was that “the power of vested interests is vastly exaggerated compared to the gradual encroachment of ideas.” I have written this book because I believe that conspiracies aren’t powerful. It is instead the idea of conspiracies that has power.