
Every point in this process may be influenced by body biochemistry, diet, physical activity, medication, mood, and just about every other variable you can think of. Not only that: the so-called stages of genetic expression influence one another, too, feeding information backward and forward in an endlessly complex series of loops. These streams of events communicate with one another in many different ways, at every stage of this enormously complex process, as we saw with the series of enzymes (which are themselves one type of protein) in chapter seven. In addition, each change in activity rate can have more than one cause. The amounts of protein synthesized from DNA, for example, fluctuate according to how much is needed at any given moment. When there is enough of one protein, its formation slows. And that slowing can be controlled in multiple ways: the rate of DNA-to-RNA transcription, the rate of protein synthesis from RNA, or both can be altered.
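
To make the feedback idea concrete, here is a minimal sketch of one such loop, written as a toy simulation. The model and every rate constant in it are invented for illustration; real cellular kinetics are vastly more intricate:

```python
# Toy negative-feedback loop: the more protein has accumulated,
# the more transcription is repressed, so synthesis slows itself.
# All names and rate constants are illustrative, not biological data.

def simulate(steps=200, dt=0.1,
             k_transcription=1.0,  # mRNA produced per unit time
             k_translation=0.8,    # protein produced per mRNA per unit time
             k_decay=0.2,          # decay rate for both mRNA and protein
             feedback=0.5):        # strength of repression by the protein
    mrna, protein = 0.0, 0.0
    for _ in range(steps):
        # Feedback acts at the transcription step (DNA -> RNA)...
        transcription = k_transcription / (1.0 + feedback * protein)
        mrna += (transcription - k_decay * mrna) * dt
        # ...while the translation step (RNA -> protein) could be
        # throttled the same way, as the text notes.
        protein += (k_translation * mrna - k_decay * protein) * dt
    return mrna, protein

print(simulate())               # settles toward a steady level
print(simulate(feedback=5.0))   # stronger feedback, lower steady level
```

The point is not the particular numbers but the shape of the behavior: the output regulates its own production, and the throttling can act at either the transcription or the translation step.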

This is the system that we are now tampering with, as if it were a human-made machine. Sure, we’ve mapped the human genome.⁵ But that mapping is only the first step. We can label genes with cryptic names all we want; that doesn’t mean we’ll magically know what those labels mean or how emergent structures like personality, preferences, predispositions—or disease—arise from them... assuming it’s even possible to do so.

THE GENETICIST’S DREAM

Despite the unimaginable complexity of genetics, geneticists stubbornly persist in advocating and pursuing a genetic research agenda as the future of health care. To reductionists, complexity is simply an invitation to throw more time and money at the problem. All we need is faster processing, or smarter programming, or more research....

Geneticists are sure that we’ll crack the genetic basis of disease in a decade or two—if not sooner. And once we do, it will lead to a revolution in health care. Knowing the identity and function of genes involved in disease formation and treatment will let us refine drug development⁶ and economize clinical testing of the newly developed products. Drugs will be developed that target either specific disease-related events or, as recently announced, individuals whose genes define their likely drug responsiveness. In doing so, drug side effects would be minimized and the costs of clinical trials would be lessened. In fact, the Human Genome Project—the ambitious government-led research effort that mapped all 20,000 to 25,000 human genes between 1990 and 2003—claims a more streamlined drug development process would have “the potential to dramatically reduce the estimated 100,000 deaths and 2 million hospitalizations that occur each year in the United States as the result of adverse drug response.”⁷

But that’s only the start of the benefits. Here are a few other verbatim quotations from their website that reflect the U.S. government’s “official” enthusiasm:

  • “[A]dvance knowledge of a particular disease susceptibility will allow careful monitoring, and treatments can be introduced at the most appropriate stage to maximize their therapy.”⁸
  • “Vaccines made of genetic material [...] promise all the benefits of existing vaccines without all the risks.”⁹
  • “The cost and risk of clinical trials will be reduced by targeting only those persons capable of responding to a drug.”¹⁰
  • All of these benefits and more “will promote a net decrease in the cost of health care.”¹¹

NIH Director Dr. Francis Collins, who led the remarkable sequencing of the human genome alongside Dr. J. Craig Venter and who formerly directed NIH’s National Human Genome Research Institute, also talks frequently and with extraordinary enthusiasm about the promise of genetics research. He envisions a time when individuals’ unique DNA profiles will not only establish their disease risks but also permit customized programs of prevention and treatment for each person. One size will not fit all, according to Collins and his colleagues.

These promises all sound inspiring and are said to be ushering in a whole new medical practice paradigm: genetics as the centerpiece of medicine’s future! And in fact, many of the promised outcomes of genetics no doubt will be very good. I’m not saying that genetic research is a complete waste of time. I actually find the Human Genome Project to be endlessly fascinating science. There’s no way a curious species like ourselves could have left that stone of indeterminate complexity unturned, given sufficient technology. And there’s no doubt that genetic interventions will help the 0.01 percent of the population who suffer from rare conditions brought about by faulty genes.

What they won’t do, however, is solve the basic problem: our society’s failing health. What I object to is our focus on genetics to the near exclusion of everything else. Currently, hundreds of billions of dollars are being spent on genetic testing and sequencing every year in the United States, without getting us any closer to solving our health-care crisis. Our society’s multibillion-dollar investment in genetics will help only a very small portion of the population, and even then only at enormous expense.

Once we’ve eliminated 90 percent of human diseases via nutrition and ended the financial drain of reductionist health care on our economy, then we can avail ourselves of the luxury of genetic testing and sequencing. Right now we have much more urgent things we can do that would benefit a much larger percentage of the population. We’re facing a perfect-storm health-care crisis right now. When the hurricane is blowing, you don’t redecorate the foyer; you nail plywood over the windows.

Or maybe I’m just jealous. I’ll leave that for you to decide. After all, while this new Age of Genetics was rising over the horizon, an Age of Nutrition was sinking below it.

THE DECLINE OF THE AGE OF NUTRITION

In 1955, I was in my first year of veterinary school at the University of Georgia, where my biochemistry professor was enthralled by the recent discovery of the DNA double helix and what it might mean for the future. I, too, was captivated by this marvelous bit of biochemical and medical research—exactly what I’d envisioned as my cup of tea. When Cornell professor Clive McCay surprised me with an unsolicited offer, by telegram, to drop veterinary medicine and instead come to Cornell to study this new field of “biochemistry” (of which the emerging discipline of genetics was then a part), I jumped at the opportunity. In my graduate research program at Cornell, I formally combined nutrition as a major field of study with biochemistry as a minor. In retrospect, I realize I was witnessing not only the emergence of a new field, but a tectonic shift in the way science viewed human health.

From the early 1900s to the early 1950s, nutrition researchers were at the forefront of the struggle to improve human health. In the early twentieth century, scientists and medical professionals had begun investigating the causes of beriberi, scurvy, pellagra, rickets, and other maladies. These diseases appeared to be linked in some way to food, but the exact mechanism was unclear. Eventually, researchers identified specific nutrients and raised the possibility that inadequate intake of these nutrients might lead to these diseases. Around 1912, the word vitamin was coined to refer to a substance in food, present in very small quantities, that was thought to be vital for sustaining life.

During the 1920s and 1930s, nutrition researchers identified a number of specific vitamins and other nutrients, including the “letter vitamins,” A through K. Amino acids, the building blocks from which proteins are assembled according to the DNA template, were also being studied to determine how their sequence and arrangement within polypeptide chains affect a protein’s important, life-giving properties. In 1948, scientists stated with confidence that they had discovered the last vitamin, B₁₂, based on the observation that it was possible to grow laboratory rats on diets composed only of chemically synthesized versions of these newly discovered nutrients. Now that the elementary particles of nutrition had been found and catalogued, nutrition scientists believed, whole foods need not be eaten. Human beings could get everything they needed from pills, and hunger and malnutrition would be banished to the distant past.

The findings from this impressive period of basic nutrition research filled our lectures as I started my research program at Cornell University in 1956. But news of these exciting nutrient discoveries had filtered down to the popular imagination years earlier. I remember, when I was a child, my mother giving my siblings and me daily spoonfuls of cod liver oil because it contained the life-giving nutrient vitamin A (I can still taste that oil—ugh!). I also remember, at about that same time, my aunt telling my mother with considerable enthusiasm that someday we would not have to eat food because its main ingredients would come in the form of a few pills! Forget about the vegetables grown in my mom’s garden. (I remember my mother not taking kindly to that comment.) Protein was another nutrient independently gaining a reputation of epic proportions. On our dairy farm, we were certain that our milk was especially good for mankind (womankind had not yet been invented) because it was a source of high-quality protein that could make muscle and grow strong bones and teeth. Nutrition as a scientific discipline was riding high, although even then it was mostly focused on the discoveries and activities of individual nutrients.

Ironically, it was the reductionist nature of nutrition that provided the opening for the much more reductionist discipline of genetics to replace it as the best answer to the question of Why We Get Sick. All those fortified breakfast cereals and multivitamin pills weren’t turning us into a nation of decathletes and vigorous octogenarians. Nutrition as a reductionist science had hit a dead end. And genetics obligingly stepped up to replace it.

THE NATURE-NURTURE DEBATE

The power struggle between nutrition and genetics closely mimics that age-old debate concerning nature versus nurture. Does our initial “nature” at birth—our genes—predetermine which diseases we get later in life? Or are health and disease events a product of our environment, like the food we eat or toxins we’re exposed to—our “nurture”? Forms of the nature-nurture debate (or mindless shouting match) have been raging for millennia, at least since Aristotle characterized the human mind as a tabula rasa, or a blank slate to be filled by guidance and experience, in opposition to the prevailing view that humans were born with fixed “essential natures.”

Most health researchers agree that neither nature nor nurture acts alone in determining which diseases we get, if any. Both contribute. The debate centers on how much each contributes. But the truth is, it’s almost impossible to assign meaningful numbers to the relative contributions of genes and lifestyle, let alone the specific contribution of nutrition.

This uncertainty became clear to me many years ago when, from 1980 to 1982, I served on a thirteen-member expert committee of the National Academy of Sciences preparing a special report on diet, nutrition, and cancer,¹² the first reasonably official report of its kind. Among other objectives, we were asked to estimate the proportion of cancers caused by diet versus those caused by everything else, including genetics, environmental toxins, and lifestyle, and through that, to suggest how much cancer could be prevented by the food we eat.

Estimating the proportion of cancer preventable by diet was of considerable interest to those of us working on the project because, as had been noted in the media a year or so before, a report¹³ developed for the now-abolished Office of Technology Assessment of the U.S. Congress by two very distinguished scientists from the University of Oxford, Sir Richard Doll and Sir Richard Peto, had suggested that 35 percent of all cancers were preventable by diet. This surprisingly high estimate quickly became a politically charged issue, especially since it was even higher than the 30 percent of cancers estimated to be preventable by not smoking. Most people had no idea that diet might be this important.

Our committee’s task of creating its own specific estimate of diet-preventable cancers proved to be impossible. I was assigned to write a first draft of this risk assessment, and I quickly saw that the exercise made little or no sense. Any estimate of how much cancer could be prevented by diet that rested on a single number was likely to convey more certainty than it deserved. We also faced the dilemma of how to summarize the combined effects of the various factors that affect cancer risk. What were we to do, for example, if not smoking could prevent 90 percent of lung cancer (our current best guess), a proper diet could prevent 30 percent (there is such evidence), and avoiding air pollution could prevent 15 percent? Did we add these numbers together and conclude that 135 percent of lung cancer could be prevented?
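
One way to see why simple addition fails: the same case of cancer can be prevented by more than one factor, so the percentages overlap. If, purely for illustration, we pretend the three factors act independently, the combined preventable fraction is not the sum but one minus the product of what each factor leaves behind. A minimal sketch, using only the hypothetical percentages from the example above, not real estimates:

```python
# Combining preventable fractions under an illustrative independence
# assumption: each factor removes its share of whatever risk remains.
# The input numbers are the hypothetical figures from the text, not data.

factors = {"not smoking": 0.90, "proper diet": 0.30, "clean air": 0.15}

naive_sum = sum(factors.values())

remaining = 1.0
for fraction in factors.values():
    remaining *= (1.0 - fraction)  # risk left after this factor acts
combined = 1.0 - remaining         # total fraction prevented

print(f"naive sum: {naive_sum:.0%}")  # 135% -- nonsensical
print(f"combined:  {combined:.0%}")   # ~94% -- bounded below 100%
```

Even under this crude assumption, no combination of factors can exceed 100 percent, which is exactly the sanity check the naive sum fails.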

Aware of both of these somewhat contrasting difficulties (i.e., over-precision and the inappropriate summation of risks), our committee declined to include a chapter giving precise estimates of the reduced risk of cancer due to a healthy diet. We also knew that the previous report prepared for the Office of Technology Assessment¹⁴ did not fixate on a precise number for diet-preventable cancers; the 35 percent cited by the media was a result of sloppy reporting. In fact, the report’s authors had surveyed the relevant professional diet and health communities and found that the estimates ranged broadly, from 10 percent to 70 percent. The seemingly definitive figure of 35 percent was anything but conclusive. It was suggested mostly as a reasonable point within this range, because a range of 10 percent to 70 percent would only confuse the public and discourage people from taking seriously diet’s effect on cancer development. It is a generous range within which personal biases can play.
