Gluten encephalopathy shows itself as migraine headaches and stroke-like symptoms, such as loss of control over one arm or leg, difficulty speaking, or visual difficulties.13, 14 On MRI of the brain, there is characteristic evidence of damage surrounding blood vessels in cerebral tissue. Gluten encephalopathy will also show many of the same balance and coordination symptoms as those that occur with cerebellar ataxia.
In one particularly disturbing Mayo Clinic study of thirteen patients recently diagnosed with celiac disease, dementia was also diagnosed. Of those thirteen, frontal lobe biopsy (yes, brain biopsy) or postmortem examination of the brain failed to identify any pathology other than that associated with wheat gluten exposure.15
Prior to death or biopsy, the most common symptoms were memory loss, the inability to perform simple arithmetic, confusion, and change in personality. Of the thirteen, nine died due to progressive impairment of brain function. Yes: fatal dementia from wheat.
In what percentage of dementia sufferers can their deteriorating mind and memory be blamed on wheat? This question has not yet been satisfactorily answered. However, one British research group that has actively investigated this question has, to date, diagnosed sixty-one cases of encephalopathy, including dementia, due to wheat gluten.16
Wheat is therefore associated with dementia and brain dysfunction, triggering an immune response that infiltrates memory and mind. The research into the relationship of wheat, gluten, and brain damage is still preliminary, with many unanswered questions remaining, but what we do know is deeply troubling. I shudder to think what we might find next.
Gluten sensitivity can also show itself as seizures. The seizures that arise in response to wheat tend to occur in young people, often teenagers. The seizures are typically of the temporal lobe variety—i.e., originating in the temporal lobe of the brain, just beneath the ears. People with temporal lobe seizures experience hallucinations of smell and taste, odd and inappropriate emotional feelings such as overwhelming fear for no cause, and repetitive behaviors such as smacking the lips or hand movements. A peculiar syndrome of temporal lobe seizures unresponsive to seizure medications and triggered by calcium deposition in a part of the temporal lobe called the hippocampus (responsible for forming new memories) has been associated with both celiac disease and gluten sensitivity (positive antigliadin antibodies and HLA markers without intestinal disease).17
Of celiac sufferers, from 1 to 5.5 percent can be expected to be diagnosed with seizures.18, 19 Temporal lobe seizures triggered by wheat gluten improve after gluten elimination.20, 21 One study demonstrated that epileptics who experience the much more serious generalized (grand mal) seizures were nearly twice as likely (19.6 percent compared to 10.6 percent) to have gluten sensitivity in the form of increased levels of antigliadin antibodies without celiac disease.22
It’s a sobering thought that wheat has the capacity to reach into the human brain and cause changes in thought, behavior, and structure, occasionally to the point of provoking seizures.
Gluten is the component of wheat confidently linked with triggering destructive immune phenomena, whether expressed as celiac disease, cerebellar ataxia, or dementia. However, many health effects of wheat, including those on the brain and nervous system, have nothing to do with immune phenomena triggered by gluten. The addictive properties of wheat, for instance, expressed as overwhelming temptation and obsession and blocked by opiate-blocking drugs, are not directly due to gluten but to exorphins, the breakdown products of gluten. While the component of wheat responsible for behavioral distortions in people with schizophrenia and children with autism and ADHD has not been identified, it is likely that these phenomena are also due to wheat exorphins and not a gluten-triggered immune response. Unlike gluten sensitivity, which can usually be diagnosed with antibody tests, there is at present no marker that can be measured to assess exorphin effects.
Nongluten effects can add to gluten effects. The psychological influence of wheat exorphins on appetite and impulse, the glucose-insulin effects, and perhaps other effects of wheat yet to be described can occur independently of or in combination with immune effects. Someone suffering from undiagnosed intestinal celiac disease can have odd cravings for the food that damages their small intestine, but also show diabetic blood sugars with wheat consumption, along with wide mood swings. Someone else without celiac disease can accumulate visceral fat and show neurological impairment from wheat. Others may become helplessly tired, overweight, and diabetic, yet suffer neither intestinal nor nervous system immune effects of wheat gluten. The tangle of health consequences of wheat consumption is truly impressive.
The tremendous variation in how the neurological effects of wheat are experienced complicates making the “diagnosis.” Potential immune effects can be gauged with antibody blood tests. But nonimmune effects are not revealed by any blood test and are therefore more difficult to identify and quantify.
The world of the “wheat brain” has only just begun to yield to the light of day. The brighter the light shines, the uglier the situation gets.
IF WHEAT AND its effects can grasp hold of organs such as the brain, intestines, arteries, and bones, can it also affect the largest organ of the body, the skin?
Indeed it can. And it can display its peculiar effects in more ways than Krispy Kreme has donuts.
Despite its outwardly quiet facade, skin is an active organ, a hotbed of physiologic activity, a waterproof barrier fending off the attacks of billions of foreign organisms, regulating body temperature through sweat, enduring bumps and scrapes every day, regenerating itself to repel the constant barrage. Skin is the physical barrier separating you from the rest of the world. Each person’s skin provides a home to ten trillion bacteria, most of which assume residence in quiet symbiosis with their mammalian host.
Any dermatologist can tell you that skin is the outward reflection of internal body processes. A simple blush demonstrates this fact: the acute and intense facial vasodilatation (capillary dilation) that results when you realize the guy you flipped off in traffic was your boss. But the skin reflects more than our emotional states. It can also display evidence of internal physical processes.
Wheat can exert age-advancing skin effects, such as wrinkles and lost elasticity, through the formation of advanced glycation end products. But wheat has plenty more to say about your skin’s health than just making you age faster.
Wheat expresses itself—actually, the body’s reaction to wheat expresses itself—through the skin. Just as digestive by-products of wheat lead to joint inflammation, increased blood sugar, and brain effects, so too can they result in reactions in the skin, effects that range from petty annoyances to life-threatening ulcers and gangrene.
Skin changes do not generally occur in isolation: If an abnormality due to wheat is expressed on the skin surface, then it usually means that the skin is not the only organ experiencing an unwanted response. Other organs may be involved, from intestines to brain—though you may not be aware of it.
Acne: the common affliction of adolescents and young adults, responsible for more distress than prom night.
Nineteenth-century doctors called it “stone-pock,” while ancient physicians took note of its rash-like appearance minus the itching. The condition has been attributed to everything from emotional struggles, especially those involving shame or guilt, to deviant sexual behavior. Treatments were often dreadful, including powerful laxatives and enemas, foul-smelling sulfur baths, and prolonged exposure to X-rays.
Aren’t the teenage years already tough enough?
As if teenagers need any more reason to feel awkward, acne visits the twelve- to eighteen-year-old set with uncommon frequency.
It is, along with the onslaught of bewildering hormonal effects, a nearly universal phenomenon in Western cultures, affecting more than 80 percent of teenagers, up to 95 percent of sixteen- to eighteen-year-olds, sometimes to disfiguring degrees. Adults are not spared, with 50 percent of those over age twenty-five having intermittent bouts.1
While acne may be nearly universal in American teenagers, it is not a universal phenomenon in all cultures. Some cultures display no acne whatsoever. Cultures as wide ranging as the Kitavan Islanders of Papua New Guinea, the Aché hunter-gatherers of Paraguay, natives of the Purus Valley in Brazil, African Bantus and Zulus, Japan’s Okinawans, and Canadian Inuit are curiously spared the nuisance and embarrassment of acne.
Are these cultures spared the heartbreak of acne because of unique genetic immunity?
Evidence suggests that it is not a genetic issue, but one of diet. Cultures that rely only on foods provided by their unique location and climate allow us to observe the effects of foods added to or subtracted from the diet. Acne-free populations such as the Kitavans of New Guinea exist on a hunter-gatherer diet of vegetables, fruits, tubers, coconuts, and fish. The Paraguayan Aché hunter-gatherers follow a similar diet, adding land animals and cultivated manioc, peanuts, rice, and maize, and are also spared completely from acne.2
Japanese Okinawans, probably the most long-lived group on planet earth, until the 1980s consumed a diet rich in an incredible array of vegetables, sweet potatoes, soy, pork, and fish; acne was virtually unknown among them.3 The traditional Inuit diet, consisting of seal, fish, caribou, and whatever seaweed, berries, and roots can be found, likewise leaves Inuits acne-free. The diets of African Bantus and Zulus differ according to season and terrain, but are rich in indigenous wild plants such as guava, mangoes, and tomatoes, in addition to the fish and wild game they catch; once again, no acne.4
In other words, cultures without acne consume little to no wheat, sugar, or dairy products. As Western influence introduced processed starches such as wheat and sugars into groups such as the Okinawans, Inuits, and Zulus, acne promptly followed.5-7 In short, acne-free cultures had no special genetic protection from acne; they simply followed a diet that lacked the foods that provoke the condition. Introduce wheat, sugar, and dairy products, and Clearasil sales skyrocket.
Ironically, it was “common knowledge” in the early twentieth century that acne was caused or worsened by eating starchy foods such as pancakes and biscuits. This notion fell out of favor in the eighties after a single wrongheaded study that compared the effects of a chocolate bar versus a “placebo” candy bar. The study concluded that there was no difference in acne observed among the sixty-five participants regardless of which bar they consumed—the catch being that the placebo bar was virtually the same as the chocolate bar in calories, sugar, and fat content, just minus the cocoa.8 (Cocoa lovers have cause to rejoice: Cocoa does not cause acne. Enjoy your 85 percent cocoa dark chocolate.) This didn’t stop the dermatologic community, however, from pooh-poohing the relationship of acne and diet for many years, largely on the basis of this single, repeatedly cited study.
In fact, modern dermatology largely claims ignorance on just why so many modern teenagers and adults experience this chronic, sometimes disfiguring, condition. Though discussions center around infection with Propionibacterium acnes, inflammation, and excessive sebum production, treatments are aimed at suppressing acne eruptions, not at identifying causes. So dermatologists are quick to prescribe topical antibacterial creams and ointments, oral antibiotics, and anti-inflammatory drugs.
More recently, studies have once again pointed at carbohydrates as the trigger of acne formation, working their acne-promoting effects via increased levels of insulin.
The means by which insulin triggers acne formation is beginning to yield to the light of day. Insulin stimulates the release of a hormone called insulin-like growth factor-1, or IGF-1, within the skin. IGF-1, in turn, stimulates tissue growth in hair follicles and in the dermis, the layer of skin just beneath the surface.9 Insulin and IGF-1 also stimulate the production of sebum, the oily protective film produced by the sebaceous glands.10 Overproduction of sebum, along with skin tissue growth, leads to the characteristic upward-growing reddened pimple.
Indirect evidence for insulin’s role in causing acne also comes from other observations. Women with polycystic ovarian syndrome (PCOS), who demonstrate exaggerated insulin responses and higher blood sugars, are strikingly prone to acne.11 Medications that reduce insulin and glucose in women with PCOS, such as the drug metformin, reduce acne.12 And while oral diabetes medications are usually not administered to children, young people who do take such blood sugar- and insulin-reducing drugs have been observed to experience less acne.13