Deadly Harvest: The Intimate Relationship Between Our Health and Our Food
Author: Geoff Bond
Interestingly, when in recent years Western dairymen entered these untapped markets, they hit upon an unexpected difficulty. The new, potential consumers thought that dairy consumption was a strange practice and often found that it disagreed with them. We now understand that dairy products can be a problem. For example, the San are uniformly intolerant of the lactose in milk, and this applies to some degree to everyone on the planet. Lactose intolerance gives rise to allergies, headaches, bloating, colon diseases, and many other disorders.
The unhealthy properties of milk fat are now mostly accepted. We are told that fat-free milk is good for us and it is even better to stay away from cream, butter, and ice cream. For many years now, the connection between these foods and high cholesterol, heart disease, strokes, and hardening of the arteries has been well known. Scientific findings show that dairy consumption from any source (cow, goat, sheep) and in any form (including skimmed milk, cheese, and yogurt) is associated with a number of serious, slow-acting diseases, including osteoporosis, high cholesterol, cancers, allergies, heart disease, and obesity. The notion that dairy products cause osteoporosis is so contrary to conventional nutritional dogma that it needs solid justification. In Chapter 4, we will look at the scientific background to these assertions.
It has been noted that the Germanic peoples, the ones who adopted dairy farming early, seem to tolerate milk quite well in their early years. We find, however, that childhood tolerance to milk wears off. Germanic senior citizens are just as vulnerable to milk intolerance as everybody else. This is one of the few instances that we know of where a human tribe has evolved an adaptation to a new food. We now suspect that early dairy herders must have suffered a very high percentage of weanlings dying from a bad reaction to milk. The ones that survived had a genetic makeup that allowed them to live through the experience and pass their genes on to their descendants. Even so, such people still suffer, like the rest of the population, from the slower-acting diseases caused by dairy foods.
FATS AND OILS
The terms fat and oil mean essentially the same thing: a fat is simply an oil that is solid at room temperature. Fats (oils) fall into three classes: saturated, polyunsaturated, and monounsaturated. In nature, any particular fat (oil) is a cocktail of all three classes. As a rule of thumb, if it is solid (fat) at room temperature, then the chief component is saturated fat.
We have seen that the food supply of the African savanna was very low in fat. It was never available on its own and the foods themselves did not contain much. The San really loved to eat the warthog, which had a relatively high fat content of around 10% (but still a lot lower than red meat’s 25%). The other major source of fat was the mongongo nut. The situation remained much the same throughout history until well after the farming revolution. It was not until a few thousand years ago that domesticated animals, notably the pig, were bred porky enough to yield a fat that could be separated out. This kind of fat is lard, whereas fat from cows and sheep is known as tallow. Even so, it was only in certain places and at certain levels of prosperity that farming peoples had the luxury of free animal fat in cooking. Traditionally, Chinese, Indian, and Japanese cooking is done with water, not fat.
Butter is also an animal fat, so the first dairy farmers were among the first to have fat as a separate entity. Several thousand years later, it was the same people (mostly northern Europeans) who, in the Middle Ages, discovered more efficient ways to raise livestock. This was the first time that a large group of humans had an abundance of meat and fat throughout the year. Fatty cuisine, utilizing cream, lard, and butter, became the norm in Germany, Central Europe, and England. These same peoples then brought the animal fat habit to North America, Australia, and New Zealand. Animal fat consumption in the U.S. was already strong in 1909 at 34 pounds per person per year; by 2000, consumption had accelerated to 42 pounds annually.
Meanwhile, in the southern parts of Europe and in the Near East, early farmers had domesticated the olive. The earliest recorded occurrence is from the Greek island of Crete around 3500 b.c.[71] Its cultivation was important to the ancient Greeks and Romans and they spread it to all the countries bordering the Mediterranean. Fresh olives are extremely bitter and must be treated with lye (a strong alkali leached from wood ash) before they can be eaten. Today, olives are grown primarily for olive oil. The Greeks first extracted the oil simply by heaping the olives on the ground in the sunshine and collecting the oil as it dribbled out of the ripe fruit. Now it is pressed out, but in the first pressing not a lot of pressure is used so that the bitterness stays behind; the result is known as “extra virgin olive oil.” Greece remains the biggest consumer at about 42 pounds per person per year, while the tiny consumption in the U.S. has risen from 10 ounces to 1.5 pounds per person annually. Similar figures are seen in England, France, and Germany.
It is difficult to imagine, but just 100 years ago corn oil, peanut oil, sunflower oil, rapeseed oil (Canola oil), safflower oil, cottonseed oil, and other “vegetable” oils were virtually unknown to the ordinary consumer. They existed, of course, but only as an unwanted by-product of agricultural processes. The U.S. cooked with solid animal fats, as did northern Europe, including Britain and Germany. Then, in 1910, the first process was developed by the food giant Procter and Gamble, in Cincinnati, Ohio, for turning these waste vegetable oils into something useful: cooking fat. The process was “hydrogenation.” Thus, Crisco® vegetable shortening was born and swiftly commercialized as a replacement for lard. It was cheaper, more convenient, and more predictable in quality than the animal fat alternatives.
Gradually, vegetable fat became popular until, by World War II, farmers grew plants specifically to supply oil to the new vegetable fat industry. Beginning in the 1950s, the budding fast food industry discovered and liked these fats: they had a long shelf life and could be reheated and reused repeatedly without producing “off” flavors. Similar qualities endeared vegetable fats to the rapidly expanding snack food industry. It is remarkable to think that fast foods and snack foods have only been commonplace since the mid-1960s.
However, in the 1970s researchers made the connection between saturated fat and heart disease, and the spotlight fell on the practice of hydrogenation: it was turning a relatively harmless plant oil into a health-threatening saturated fat. The solution was straightforward: just use the oil in its original, unhydrogenated state. Supermarket shelves filled with a wide range of vegetable cooking oils. By this time, the extraction technology had become more sophisticated. Today, high temperatures and pressures double the yield, and petroleum solvents, such as hexane, extract the last drop out of the crushed oil seed. The raw oil is then bleached, deodorized, de-gummed, de-waxed, and refined with caustic soda. This produces vegetable oils that are clear, heat stable, bland, and odorless (some varieties can be used as engine oil).
Meanwhile, the fast food industry, expanding rapidly, continued using solid hydrogenated vegetable fat (commonly known as “shortening”) for its French fries until the 1990s. Recently, the concerns about hydrogenation encouraged them to convert to the original, liquid, unhydrogenated vegetable oil. This is a step in the right direction, but not the whole story, as we shall see.
The net result of the enthusiastic adoption of vegetable oils is a dramatic, 24-fold increase in U.S. consumption, from 1.5 pounds per person per year in 1909 to 36 pounds per person annually in 2000. Overall consumption of all fats and oils combined has more than doubled from 35 pounds per person per year in 1909 to 77 pounds annually in 2000.
We saw in Chapter 1 that humans are not designed to consume much fat and oil, and what little they do consume has to be of a certain kind. Today, we are consuming very high quantities of oils and fats—40% of calories for the average American—and these fats and oils are different from those found in our ancestral homeland. We can trace a range of diseases to this departure from the Savanna Model: artery plaque, thrombosis, osteoporosis, high blood pressure, arthritis, allergies, cancers, obesity, diabetes, asthma, menstrual cramps, and many more. What is going on? We’ve all heard the slogan “fat makes you fat”, but how can fat (oil) possibly be responsible for such a wide range of other illnesses? The answer lies in our hormones: many fats manipulate our hormones, others do nothing, and yet others block hormones altogether. In other words, like bulls in a china shop, we are blundering about, knocking over our hormones, blissfully unaware of how the fats and oils we eat are disrupting the fine balance of our bodies’ workings. This is a crucial, but neglected aspect of what we eat: it can affect our body in subtle, unseen, yet harmful ways.
SUGAR GROUP
In Chapter 2, we split the USDA’s “sweets” section from the Fats, Oils, and Sweets group and renamed it the “Sugar Group.” What the USDA means by “sweets” is sugar and foods with a high sugar content, such as candies, soft drinks, and some desserts. They are mainly thinking of the familiar sugar that we know as “table sugar,” although they also mention other sources of sugar, including honey, maple syrup, and corn syrup.
There are, in fact, several types of sugar. Fructose is a sugar that is commonly found concentrated in many fruits (from which it gets its name); another common sugar is glucose. The two frequently combine in equal parts to form another type of sugar, sucrose. Table sugar is 99% sucrose and comes either from sugar cane or sugar beets. As we have seen, sweet foods were a rare commodity in the ancestral diet. The main source was honey, which is composed of several different sugars, with glucose and fructose as the major components.
Honey
Even though most people today do not eat much honey, it has become a byword for innate goodness, sweetness, and even love. Winnie the Pooh said that “eating honey” was his favorite pastime. Shakespeare mentions honey 47 times: as endearments (“honey-love”), as flattery (“honeyed words”), as a sugar-coating for something unpleasant, as a delicacy, as something healing, and, by its association with bees, with industry and chasteness.
Our Pleistocene ancestors gave priority to finding honey, but they would not have found much. Australian anthropologist Betty Meehan lived for a year with the native Anbarra aboriginals of Northern Australia and she recorded an average honey consumption of around 4 pounds per person per year.[72] That contrasts with the current average consumption of sugar in the U.S. of about 160 pounds per person per year—40 times as much.
The situation would have remained much the same up until the first farmers learned how to “farm” bees. The first recorded instance of beekeeping is in Ancient Egypt around 2400 b.c. From that time on, it is clear that, for the ancient Egyptians at least, honey became more available. Even so, it is certain that honey consumption was limited to the affluent classes: in 2100 b.c., the 1,000 manual workers building a monument ate “bread, vegetables, and meat,” whereas the king’s messenger received in addition “oil, fat, wine, figs, and honey.”[73]
A marriage contract of around 1200 b.c. provides the bride with “12 jars of honey per year” (around 20 pounds), so honey was still precious and rare enough to form part of a marriage bargain. The boy-Pharaoh, Tutankhamen, had jars of honey buried with him. On the other hand, it seems that the ordinary populace had to make do with other sources of sweetness, which archaeologists have identified as syrups made from the juices of figs, dates, and grapes.[74]
The practice of beekeeping spread to ancient Greece and Rome, while the ancient Chinese imported honey from the Mediterranean area. In a.d. 500, one retired Peking bureaucrat was paid a quart of honey per month as pension. In late Bronze Age Britain (around 1000 b.c.), the production of beeswax was vital for the casting of bronze objects. We can suppose that the Ancient Britons enjoyed eating the honey that came with the wax.
In Europe’s Middle Ages, there are many records of honey production. In England, Dame Alice de Bryene recorded in her household accounts for the year 1412 to 1413 a consumption of 6-1/2 quarts of honey. In her 40-strong household, this works out to less than half a pound per person per year. By Shakespeare’s time, at the turn of the 1600s, just about every smallholder and cottager would have had a hive or two. Honey was commonplace but not available in large quantities, perhaps not even the 4 pounds per person annually that the Australian aboriginal was able to find by foraging. Even today, honey consumption in the U.S. languishes at around 1 pound per person per year, but that is because of the arrival of a powerful competitor—sugar.
Table Sugar
Common sugar (or table sugar) comes chiefly from either sugar cane or sugar beets. Sugar cane is native to New Guinea in Southeast Asia, and several thousand years ago its cultivation spread throughout tropical Asia, notably to India. Alexander the Great, in his conquest of the Indus region of India during the 4th century b.c., was one of the first Europeans to come into contact with sugar cane. He reported the existence of a “stiff grass yielding a kind of honey.” Mostly, Indians just chewed the cane, but around this time, in 400 b.c., they were trying to develop ways to extract the juice. The methods were rudimentary, but they were the first examples of sugar presses or “mills.”