We always want to believe that history happened only to “them,” “in the past,” and that somehow we are outside history, rather than enmeshed within it. Many aspects of history are unanticipated and unforeseen, predictable only in retrospect: the fall of the Berlin Wall is a single recent example. Yet in one vital area, the emergence and spread of new infectious diseases, we can already predict the future, and it is threatening and dangerous to us all.
The history of our time will be marked by recurrent eruptions of newly discovered diseases (most recently, hantavirus in the American West); epidemics of diseases migrating to new areas (for example, cholera in Latin America); diseases which become important through human technologies (as certain menstrual tampons favored toxic shock syndrome and water cooling towers provided an opportunity for Legionnaires' Disease); and diseases which spring from insects and animals to humans, through manmade disruptions in local habitats.
To some extent, each of these processes has been occurring throughout history. What is new, however, is the increased potential that at least some of these diseases will generate large-scale, even worldwide epidemics. The global epidemic of human immunodeficiency virus is the most powerful and recent example. Yet AIDS does not stand alone; it may well be just the first of the modern, large-scale epidemics of infectious disease.
The world has rapidly become much more vulnerable to the eruption and, most critically, to the widespread and even global spread of both new and old infectious diseases. This new and heightened vulnerability is not mysterious. The dramatic increase in worldwide movement of people, goods, and ideas is the driving force behind the globalization of disease. For not only do people travel increasingly, but they travel much more rapidly, and go to many more places than ever before. A person harboring a life-threatening microbe can easily board a jet plane and be on another continent when the symptoms of illness strike. The jet plane itself, and its cargo, can carry insects bringing infectious agents into new ecologic settings. Few habitats on the globe remain truly isolated or untouched, as tourists and other travelers penetrate into the most remote and previously inaccessible areas in their search for new vistas, business, or recreation.
This new global vulnerability is dramatically illustrated by the history of HIV/AIDS. While its geographical origins remain uncertain, it is clear that the global spread of HIV was underway by the mid-1970s. By 1980, about 100,000 people worldwide were infected with HIV. Yet the discovery of AIDS, in California in 1981, and the subsequent identification of the causative virus, HIV, in 1983, resulted from a series of very fortunate circumstances. Put another way, AIDS could have easily remained unrecognized for at least another five to ten years, with devastating global health consequences. Delay in discovering AIDS could have resulted from any or all of the following:
⢠if HIV took longer to cause detectable, clinical illness (AIDS);
⢠if the immunodeficiency of AIDS resulted in an increase of more typical infections rather than the easily recognized, unusual opportunistic infections (
Pneumocystis carinii
pneumonia) or cancers (Kaposi's sarcoma);
⢠if AIDS had not clustered among active, self-identified gay men, but rather had been spread more broadly within society;
⢠if AIDS had not occurred in a country (U.S.A.) with a highly developed disease surveillance system, capable of linking reports of cases from many different geographical areas;
⢠and if the science of human retrovirology had not been recently developed, including techniques for detection.
With AIDS, a combination of chance and circumstance relatively quickly led scientists to consider that a new health threat had arisen.
AIDS is trying to teach us a lesson. The lesson is that a health problem in any part of the world can rapidly become a health threat to many or all. A worldwide “early-warning system” is needed to detect quickly the eruption of new diseases or the unusual spread of old diseases. Without such a system, operating at a truly global level, we are essentially defenseless, relying on good luck to protect us.
Laurie Garrett has written a pioneering book. She provides us with a history, full of real people, sweat and grit, of the discoveries which have led us to realize that infectious diseases have not been vanquished; quite the contrary. It was in these places, in Bolivia, Sudan, Sierra Leone, and Zaire, that a group of highly trained, dedicated, and courageous people met the enemy on its own ground. Facing the unknown, at the frontiers of science, they struggled and wrested from nature an insight which Laurie Garrett shares with us: that diseases will remain a threat, that disease and human activity are inextricable, and that nature has many hidden places and surprises still in store.
The voyage that Ms. Garrett proposes is full of heart. I have been privileged to know many of the people in this book. They are heroes of a special kind: bonding science, curiosity, and humanitarian concern with a very practical, “let's get it done” attitude. Not everyone could go, as Joe McCormick has done, into the field armed only with his will, intelligence, and confidence that a way forward would be found.
They have pioneered on our behalf. We owe them our thanks. Laurie Garrett has done us the great service of introducing them and their work to a large audience. And to those who sleep peacefully, unaware of the emerging global threat of infectious disease, and to those who through this book will be introduced to the new global realities, it is important to meet these men and women who confront disease along its frontier with society.
This book sounds an alarm. The world needs, now, a global early-warning system capable of detecting and responding to new emerging infectious disease threats to health. There is no clearer warning than AIDS. Laurie Garrett has spelled it out clearly for us. Now we ignore it at our peril.
JONATHAN M. MANN, M.D., M.P.H.
François-Xavier Bagnoud Professor of Health and Human Rights
Professor of Epidemiology and International Health
Harvard School of Public Health
Director, International AIDS Center
Harvard AIDS Institute
Cambridge, Massachusetts
By the time my Uncle Bernard started his medical studies at the University of Chicago in 1932 he had already witnessed the great influenza pandemic of 1918–19. He was seven years old when he counted the funeral hearses that made their way down the streets of Baltimore. Three years earlier Bernard's father had nearly died of typhoid fever, acquired in downtown Baltimore. And shortly after, his grandfather died of tuberculosis.
In his twelfth year Bernard got what was called “summer sickness,” spending the long, hot Maryland days lying about the house, “acting lazy,” as his mother put it. It wasn't until 1938, when he volunteered as an X-ray guinea pig during his internship at the University of California's medical school in San Francisco, that Uncle Bernard discovered that the “summer sickness” was actually tuberculosis. He had no doubt acquired consumption from his grandfather, survived the disease, but for the rest of his life had telltale scars in his lungs that were revealed by chest X rays.
It seemed that everybody had TB in those days. When young Bernard Silber was struggling his way through medical studies in Chicago, incoming nursing students were routinely tested for antibodies against TB. The women who came from rural areas always tested negative for TB when they started their studies. With equal certainty, they all tested TB-positive after a year on the urban hospital wards. Any ailment in those days could light up a latent TB infection, and tuberculosis sanitariums were overflowing. Treatment was pretty much limited to bed rest and a variety of hotly debated diets, exercise regimens, fresh air, and extraordinary pneumothorax surgical procedures.
In 1939 Uncle Bernard started a two-year residency in medicine at Los Angeles County Hospital, where he met my Aunt Bernice, a medical social worker. Bernice limped and was deaf in one ear, the results of a childhood bacterial infection. When she was nine, the bacteria grew in her ear, eventually infecting the mastoid bone. A complication of that was osteomyelitis, which left her right leg about an inch shorter than her left, forcing Bernice to walk knock-kneed to keep her balance. Shortly after they met, Bernard got a nasty pneumococcal infection and, because he was a physician, received state-of-the-art treatment: tender loving care and oxygen.
For a month he languished as a patient in Los Angeles County Hospital hoping he would be among the 60 percent of Americans who, in the days before antibiotics, survived bacterial pneumonia.
Bacterial infections were both common and very serious before 1944, when the first antibiotic drugs became available. My Uncle Bernard could diagnose scarlet fever, pneumococcal pneumonia, rheumatic fever, whooping cough, diphtheria, or tuberculosis in a matter of minutes with little or no laboratory support. Doctors had to know how to work quickly because these infections could escalate rapidly. Besides, there wasn't much the lab could tell a physician in 1940 that a well-trained, observant doctor couldn't determine independently.
Viruses were a huge black box in those days, and though Bernard had no trouble differentiating between German measles, influenza, St. Louis encephalitis, and other viral diseases, he had neither treatments nor much of an understanding of what these tiniest of microbes did to the human body.
Uncle Bernard was introduced to tropical medicine during World War II, when he served in the Army Medical Corps at Guadalcanal and other battlefields of the Pacific. That's when he learned firsthand about diseases of which he'd heard very little in medical school: malaria, dengue (break-bone fever), and a variety of parasitic diseases. Quinine did a good job of curing malaria, but there was little he could do for GIs afflicted with the other tropical organisms that were rife in the Pacific theater.
Two years into the war the Army issued its first meager supplies of penicillin, instructing physicians to use the precious drug sparingly, in doses of about 5,000 units (less than a third of what would be considered a minimal penicillin dose for minor infections in 1993). In those early days before bacteria became resistant to antibiotics, such doses were capable of performing miracles, and the Army doctors were so impressed with the powers of penicillin that they collected the urine of patients who were on the drug and crystallized excreted penicillin for reuse on other GIs.
Years later, when I was studying immunology in graduate school at UC Berkeley, Uncle Bernard would regale me with tales of what sounded like medicine in the Dark Ages. I was preoccupied with such things as fluorescence-activated laser cell sorters that could separate different types of living cells of the immune system, the new technology of genetic engineering, monoclonal antibodies, and deciphering the human genetic code.
“I always liken the production of antibiotics to the Internal Revenue Service,” Uncle Bernard would say when I seemed less than interested in the pre-antibiotic plights of American physicians. “People are always looking for loopholes, but as soon as they find them, the IRS plugs them up. It's the same way with antibiotics: no sooner have you got one than the bacteria have become resistant.”
During the summer of 1976 I had reason to reconsider much of my Uncle Bernard's wisdom. As I tried to make sense of my graduate research project at Stanford University Medical Center, the news seemed overfull of infectious disease stories. The U.S. government was predicting a massive influenza epidemic that some said would surpass that of 1918, a global horror that claimed over 20 million lives. An American Legion group met in a hotel in Philadelphia on the Fourth of July, and something made 182 of them very sick, killing 29. Something else especially strange was going on in Africa, where, according to garbled press accounts of the day, people were dying from a terrifying new virus: in Zaire and the Sudan, something called Green Monkey Virus, or Marburg, or Ebola, or a mix of all three monikers was occupying the urgent attention of disease experts from all over the world.
In 1981 Dr. Richard Krause of the U.S. National Institutes of Health published a provocative book entitled The Restless Tide: The Persistent Challenge of the Microbial World,[1] which argued that diseases long thought to have been defeated could return to endanger the American people. In hearings a year later before the U.S. Congress, Krause was asked, “Why do we have so many new infectious diseases?”
“Nothing new has happened,” Krause replied. “Plagues are as certain as death and taxes.”[2]
But the shock of the AIDS epidemic prompted many more virus experts in the 1980s to ponder the possibility that something new was, indeed, happening. As the epidemic spread from one part of the world to another, scientists asked, “Where did this come from? Are there other agents out there? Will something worse emergeâsomething that can be spread from person to person in the air?”
The questioning grew louder as the 1980s dragged on. At a Rockefeller University cocktail party, a young virologist named Stephen Morse approached the institution's famed president, Nobel laureate Joshua Lederberg, and asked him what he thought of the mounting concern about emerging microbes. Lederberg characteristically responded in absolute terms: “The problem is serious, and it's getting worse.” With a sense of shared mission, Morse and Lederberg set out to poll their colleagues on the matter, gather evidence, and build a case.
By 1988 an impressive group of American scientists, primarily virologists and tropical medicine specialists, had reached the conclusion that it was time to sound an alarm. Led by Morse and Lederberg of Rockefeller University, Tom Monath of the U.S. Army's Medical Research Institute of Infectious Diseases, and Robert Shope of the Yale University Arbovirus Research Unit, the scientists searched for a way to make tangible their shared concern. Their greatest worry was that they would be perceived as crybabies, merely out to protest shrinking research dollars. Or that they would be accused of crying wolf.
On May 1, 1989, the scientists gathered in the Hotel Washington, located across the street from the White House, and began three days of discussions aimed at providing evidence that the disease-causing microbes of the planet, far from having been defeated, were posing ever-greater threats to humanity. Their gathering was co-sponsored by the National Institute of Allergy and Infectious Diseases, the Fogarty International Center, and Rockefeller University.
“Nature isn't benign,” Lederberg said at the meeting's opening. “The bottom lines: the units of natural selection (DNA, sometimes RNA elements) are by no means neatly packaged in discrete organisms. They all share the entire biosphere. The survival of the human species is not a preordained evolutionary program. Abundant sources of genetic variation exist for viruses to learn new tricks, not necessarily confined to what happens routinely, or even frequently.”
University of Chicago historian William McNeill outlined the reasons Homo sapiens had been vulnerable to microbial assaults over the millennia. He saw each catastrophic epidemic event in human history as the ironic result of humanity's steps forward. As humans improve their lots, McNeill warned, they actually increase their vulnerability to disease.
“It is, I think, worthwhile being conscious of the limits upon our powers,” McNeill said. “It is worth keeping in mind that the more we win, the more we drive infections to the margins of human experience, the more we clear a path for possible catastrophic infection. We'll never escape the limits of the ecosystem. We are caught in the food chain, whether we like it or not, eating and being eaten.”
For three days scientists presented evidence that validated McNeill's words of foreboding: viruses were mutating at rapid rates; seals were dying in great plagues as the researchers convened; more than 90 percent of the rabbits of Australia died in a single year following the introduction of a new virus to the land; great influenza pandemics were sweeping through the animal world; the Andromeda strain nearly surfaced in Africa in the form of Ebola virus; megacities were arising in the developing world, creating niches from which “virtually anything might arise”; rain forests were being destroyed, forcing disease-carrying animals and insects into areas of human habitation and raising the very real possibility that lethal, mysterious microbes would, for the first time, infect humanity on a large scale and imperil the survival of the human race.
As a member of a younger generation trained in an era of confident, curative medicine and minimal concern for infectious diseases, I experienced such discussion as the stuff of Michael Crichton novels rather than empiric scientific discourse. Yet I and thousands of young scientists also reared in the post-antibiotic, genetic engineering era had to concede that there was an impressive list of recently emergent viruses: the human immunodeficiency virus that caused AIDS, HTLV Types I and II which were linked to blood cancers, several types of recently discovered hepatitis-causing viruses, numerous hemorrhage-producing viruses discovered in Africa and Asia.
In February 1991 the Institute of Medicine (IOM), which is part of the U.S. National Academy of Sciences, convened a special panel with the task of exploring further the questions raised by the 1989 scientific gathering and advising the federal government on two points: the severity of the microbial threat to U.S. citizens and steps that could be taken to improve American disease surveillance and monitoring capabilities. In the fall of 1992 the IOM panel released its report, Emerging Infections: Microbial Threats to Health in the United States,[3] which concluded that the danger of the emergence of infectious diseases in the United States was genuine, and authorities were ill equipped to anticipate or manage new epidemics.
“Our message is that the problem is serious, it's getting worse, and we need to increase our efforts to overcome it,” Lederberg said on the day of the report's release.
After the release of the report, the U.S. Centers for Disease Control and Prevention in Atlanta began a soul-searching process that would, by the spring of 1994, result in a plan for heightened vigilance and rapid response to disease outbreaks. The slow response to the emergence of HIV in 1981 had allowed the epidemic to expand by 1993 to embrace 1.5 million Americans and cost the federal government more than $12 billion annually in research, drug development, education, and treatment efforts.
The CDC was determined that such a mistake would not be repeated.