Authors: T. Colin Campbell
So even if we accept that the MDR is an accurate representation of what we need to achieve total health (a very risky assumption on its own), when we consume the RDI amount for a nutrient, nearly 98 percent of us are theoretically exceeding our minimum nutrient requirements. In addition, most people, including most health professionals, incorrectly assume that these recommended allowances are minimum requirements. This assumption encourages us to consume more of these nutrients than we need, which benefits companies that sell nutrient-based products such as supplements, fortified foods, and nutraceuticals.
There’s more. These RDIs—as they are popularly interpreted—have in my experience long been biased on the high side for some nutrients, to the point where they encourage the consumption of animal-based foods. Have you heard the myth that we need to consume lots of calcium to have strong bones and prevent osteoporosis? The calcium recommendation in the United States (1,200-1,300 mg/day) considerably exceeds the intake in countries that consume no dairy and less calcium (400-600 mg/day) but experience much lower rates of osteoporosis.¹⁰
Convincing evidence favors a recommendation for lower calcium intake, but, suffice it to say, the dairy industry has long had a strangling influence on the committee making these recommendations, urging these “unbiased experts” (their words) to accept a high-calcium RDI.¹¹
The riboflavin (vitamin B₂) recommendation has long been set high as well, with the additional but false understanding that dairy is a rich source of this vitamin—a myth that started in the 1950s.¹² (In reality, dairy is not a rich source of riboflavin, at least as compared to certain plants.) In addition, the “daily value” for cholesterol is set at 300 mg/day. Cholesterol’s inclusion in this list implies that it is needed as a nutrient. It is not! Our bodies, on their own, produce all the cholesterol we need. Dietary cholesterol comes only from animal-based foods, and a far healthier recommendation would be zero!
Then there is the epic story of protein, a nutrient that has long been the government’s darling. The RDI for protein has for decades been 10-11 percent of calories, which is already more than enough (and, not coincidentally, the average amount of protein consumed in a WFPB diet). Many people believe that a dietary average of 17-18 percent of calories from protein, also the current average level of protein consumption among Americans, is a good health practice. In 2002, the Food and Nutrition Board of the National Academy of Sciences (FNB) concluded, based on no credible evidence, that we can consume protein up to an astounding 35 percent of calories without health risk¹³—a number three times the longstanding RDI! At the time of the report, the director of the FNB was a major dairy industry consultant, and the majority (six out of eleven) of the members of a companion policy committee (the USDA “Food Pyramid” Committee) also had well-hidden dairy industry ties. Dairy groups even helped to fund the report itself. At this rate, before long, the government may start recommending a milk faucet in your kitchen next to the one for water.
The current system of developing and interpreting RDIs and guidelines according to industry interests is nothing less than shameful, not least because these industry-favoring standards and their supporting documents form the basis of so many government programs. These supposedly official items provide the scientific and political rationales for the way the national school lunch program, hospital meals, and Women, Infants, and Children programs are run.¹⁴
As a member of the expert panel that wrote the 1982 report on diet, nutrition, and cancer for the NAS, I recall that one of our central debates focused on what we should suggest as the appropriate goal for dietary fat to reduce cancer risk, based on existing evidence. Should we suggest reducing it to 30 percent of total calories (from the then 35-37 percent average), when the evidence clearly pointed to a much lower number? The debate was not about the evidence. Instead, we were worried about the political palatability of an honest dietary fat recommendation as low as 20 percent (still twice the level suggested by a WFPB diet). It was a statement that, thirty years ago, likely would have doomed our report to oblivion just on its own. Ultimately, we chose not to go lower than 30 percent, in deference to a prominent member of our panel from the USDA, who convinced us that doing so might result in a decrease in the consumption of protein and animal-based foods. That number, 30 percent, set the definition for a low-fat diet that remained part of the public narrative for many years thereafter. It gave the Atkins enthusiasts, among others, a false benchmark to use as a straw man in their argument that so-called low-fat diets don’t work. Our committee’s shading of the evidence in the policy statement in effect protected the animal foods industry and did nothing to promote human health.
While real nutrition is marginalized as a potential source of health, the federal government ignores and even covers up the truth about the deadly effects of the American medical system. As we saw in chapter one, the public CDC website conveniently omits the misfortunes of the medical system from the list of leading causes of death in the United States, despite the fact that “physician error, medication error and adverse events from drugs and surgery”¹⁵ is the third leading cause of death, trailing just heart disease and cancer. These are deaths caused by the medical system, almost half of which result from the adverse effects of prescription drugs.
You might argue that the reason drug- and surgery-related deaths aren’t included in the CDC list is that the government has judged those death-by-health-care numbers to be incorrect; perhaps the researchers got it wrong. But this stark reality was summarized and reported in the prestigious Journal of the American Medical Association.¹⁶
A federal entity, the Agency for Healthcare Research and Quality of the U.S. Department of Health and Human Services, was given responsibility in 1999 for monitoring medical errors nationwide in most U.S. hospitals. It has been diligent in getting all U.S. hospitals to systematically monitor such information, and has accumulated data for about five years as of this writing. The trend so far suggests not only that these statistics are correct, but also that the number of “medical errors” is increasing. Further, this may only be “the tip of the iceberg” with respect to the total number of avoidable deaths. An analysis of a subset of all hospitalized Medicare patients, for example, concluded that from 2000 to 2002, “over 575,000 preventable deaths occurred” nationwide.¹⁷ This more recent report confirms that these errors remain a “leading” cause of death; in fact, the report’s authors agree that this number of deaths is so high that it should be considered an “epidemic.” How is it possible that this cause of death might be an epidemic in one government report and not even be listed on a separate government website as a leading cause of death? Of course, such publicity would be bad for the disease business—and if the U.S. government cares about one thing here, it’s the economic interests of the medical establishment, one of the leading donors to political candidates, parties, and political action committees.
As we’ve discussed, the NIH devotes a microscopic amount of money to nutrition research, and most of that money supports reductionist studies on the effects of individual supplements, not whole foods. The NIH doesn’t get a lot of public press, but its influence on the direction of medical research is huge. Its $28 billion annual budget provides somewhere between 68 and 82 percent of all biomedical research funding in the United States, and a considerable amount around the world. Its two biggest institutes, based on funding, are the NCI and the National Heart, Lung, and Blood Institute, corresponding to the two leading causes of death. Of course, there’s no Institute of Medical Error and Adverse Drug Effect Prevention, corresponding to the third leading cause! And, as I’ve mentioned, there’s no Institute of Nutrition.
The NIH is thought to be an objective research organization, but of course there’s no such thing as objectivity where funding priorities are concerned. Let’s take a moment and look, in brief, at the way taxpayer money is allocated by the U.S. Congress. After receiving testimony and a proposed budget from NIH officials, Congress provides money to the NIH in its general budget. The NIH then apportions the budget among the directors of its institutes, each of whom divides the money into different program areas. Since institutes at various levels in the appropriation process essentially compete against one another for funding, they tend to be highly sensitive to the interests of powerful members of Congress. Regardless of how enlightened any individual institute director might be, she or he still must devote the lion’s share of the money received to reductionist, profit-focused research, or else risk censure by Congressional representatives feeling their own financial pressure from industry lobbyists. There’s not much money available for the type of systems analysis that could help us reprioritize our health spending in more efficient and compassionate ways. And almost nothing remains for studies of the social impact of health policies—trivial stuff, such as how real people’s health is affected by RDIs and school lunch programs.
The NIH gives out money in the form of grants. The way they do this is by inviting qualified people to sit on grant application review panels and pass judgment on the many submitted proposals that are competing for the money. By “qualified,” the NIH means something more specific and pernicious than “professionally qualified to evaluate study design and research potential.” The people deemed qualified to pass judgment on research grant priorities are those who have been successful in getting NIH grant money in the past, a cycle that helps keep innovative wholistic research off the menu.
I have served on grant review panels both within the NIH and at nongovernmental cancer-research funding agencies. Several years ago, I was invited by two successive NCI directors to present my views on the link between cancer and nutrition in a Director’s Seminar that included the director and about fifteen members of his staff. My second presentation followed my then-recent proposal for a new research-grant review panel called “Nutrition and Cancer,” in hopes of giving some emphasis to this important topic. Although this new panel had been created, its name was changed to “Metabolic Pathology,” thus negating its purpose. In my presentation, I expressed concern that this new name would obscure the goal of studying nutrition and its ability to prevent and reverse cancer—a phenomenon that I was demonstrating in my lab at that point, and that had been corroborated in humans in the China Study. I asked then-director Sam Broder why the word nutrition could not be in the title. After some heated discussion, he snapped, “If you keep talking this way, you can just go back to Cornell where you came from.” Broder insisted that they were already funding nutrition research, but clearly our definitions of “nutrition research” were different. The NIH’s nutrition research at that point comprised, as it does now, only about 2 to 3 percent of the total NCI budget, most of which was devoted to clinical trials of supplements. Two hours of discussion (all right, argument) got me nowhere.¹⁸
You can see the NIH’s reductionist agenda clearly in what is and isn’t included in its public pronouncements about the causes and future treatment options for currently “incurable” diseases. To cite an especially pertinent example of an NIH-funded project laden with reductionist philosophy, I turn again to the supposed link between AF and liver cancer. The NIH website includes a page on this relationship, which I accessed in March 2012, almost four decades after Len Stoloff (then chief of the FDA branch studying mycotoxin) and I first published our doubts about AF being a human carcinogen. This NIH page begins:
For almost four decades, [National Institute of Environmental Health Sciences]-funded scientists have conducted research on the role in promoting liver cancer of aflatoxin, a naturally occurring toxin produced by mold. Their discovery of the genetic changes that result from aflatoxin exposure have led to a better understanding of the link between aflatoxin and cancer risk in humans. These discoveries are also being used in developing cancer prevention strategies....
NIEHS-funded scientists at the Massachusetts Institute of Technology were among the first to show that exposure to aflatoxin can lead to liver cancer. Their research also demonstrated that aflatoxin’s cancer-causing potential is due to its ability to produce altered forms of DNA called adducts.¹⁹
See the reductionist assumption: AF causes cancer by altering DNA—as if the process were that linear and uncomplicated and unmediated by thousands of other reactions and interactions! But let’s allow the NIH to continue (while continuing to ignore the dominating nutritional effect on the course of this disease):
The Johns Hopkins University researchers are [...] the first to test the effectiveness of chlorophyllin, a derivative of chlorophyll that is used as an over-the-counter dietary supplement and food colorant, in reducing the risk of liver cancer in aflatoxin-exposed individuals. Studies conducted in Qidong, People’s Republic of China, showed that consumption of chlorophyllin at each meal resulted in a 55% reduction in the urinary levels of aflatoxin-related DNA adducts. The researchers believe that chlorophyllin reduces aflatoxin levels by blocking the absorption of the compound into the gastrointestinal tract. The results suggest that taking chlorophyllin, or eating green vegetables that are rich in chlorophyllin, may be a practical and cost-effective way of reducing liver cancers in areas where aflatoxin exposures are high.²⁰
Researchers have identified a biomarker—something they can measure that supposedly relates to cancer development. In this case, the biomarker is the level of AF-related DNA adducts in the urine. And they’ve identified a single nutrient—chlorophyllin—that can, in a straightforwardly reductionist fashion, block absorption of these compounds in the gastrointestinal tract.