Sure enough, there is a link between blood glucose and hunger. As we know from people with diabetes, our bodies react severely when blood sugar is too low: shakiness, nausea, and sweating often result. But this does not mean that blood glucose can be used as a simple measure of hunger. It transpires that absolute glucose levels in the blood are actually a very weak indication of hunger and fullness. Glucose injections do not tend to reduce appetite, at least not much. High glucose in the blood, strangely enough, does not translate to high glucose in the small
intestine. What does induce hunger is when blood sugar rapidly drops. In a lab situation, a decline in blood glucose over a short time frame—a few minutes—will fairly reliably prompt human subjects to ask for a meal. Yet this is still not the whole story. In one study, fifteen overweight men were asked to stay in an isolated room free of clocks or any other indication of when mealtimes might be; meanwhile, their blood sugar was continuously monitored. They were free to ask for a meal whenever they liked. Between them, they asked for forty meals when they were in a “postabsorptive state,” meaning the point at which all the food they had eaten at their last meal had been absorbed from the digestive tract. Yet, in these men, the hungry moments never coincided with a drop in blood sugar.
Another “biomarker” for hunger is a hormone in the gut called CCK (for cholecystokinin), which is released into the blood when the body detects fat or protein. At least sixteen separate scholarly studies have shown that the presence of CCK suppresses hunger, but since it works primarily when the stomach is full, it hardly represents much progress as a measure for hunger. After twelve healthy men were given slow intravenous doses of CCK, they spontaneously reduced the amount of food they ate (including strawberry jelly on Ritz crackers) by an average of 122 grams. But that average conceals great variation. Three of the twelve men actually ate more food when they were dosed with CCK. A 2003 study found that the appetite-suppressing effects of CCK are greatly increased when the stomach is distended. But we already knew that people with distended stomachs tend to feel full, CCK or not.
There has been much excitement in scientific circles over two other hormones, leptin and ghrelin, that appear to work in tandem to influence our hunger. The theory is that leptin reduces hunger and ghrelin increases it. Leptin is what tells the brain how much body fat is being stored and is available for use. Certainly, a high level of it causes laboratory animals to eat less. With humans, though, it’s complicated. There have been rare cases of hugely obese children whose bodies lacked leptin and who were crazed with hunger, scavenging rotting food from rubbish bins or uncooked fish fingers from the freezer. Leptin injections got them back to normal levels of both hunger and weight. Among those of average weight, however, leptin concentrations do not change markedly after a meal; it is only after starving for longer periods, twenty-four hours or so, that leptin concentrations in the body fall significantly.
Given that leptin sends information about how much fat is available to the brain, its usefulness as a measure of hunger depends on what else is happening in the body. Binge eaters often have relatively high levels of leptin circulating in their systems, as do obese people, but this does not mean that they never experience hunger. It has been suggested that obesity may trigger leptin resistance in the body, such that it no longer works as a signal for reducing appetite.
Ghrelin—a hunger stimulant—may be a more promising biomarker for hunger. People who suffer from a rare condition called Prader-Willi syndrome, characterized by extreme and unassuageable hunger, have four and a half times more ghrelin in their systems than those without the condition. But the absence or presence of ghrelin is not enough to cause hunger. In people with regular mealtimes, the feeling of hunger appears to arrive before the amount of ghrelin in their systems increases.
The most common way scientists measure whether someone is hungry or not remains simply asking them. The answers are then mapped onto a scale. For example, you might be asked “How hungry are you?” and prompted to mark your answer on a line somewhere between “Not at all” and “As hungry as I have ever felt.” Our bodies give us some pretty clear signs that they want food, including tightness in the stomach, loud rumbling of the digestive tract, a feeling of hollowness, lightheadedness, dryness in the mouth or throat, and sometimes a weird excitability. In pigeons whose cerebrums have been removed, hunger leads to an intense, restless running about, which instantly stops with just a few grains of wheat. Humans are not pigeons. The problem with subjective reports of hunger is that different people experience lack of food in such vastly different ways.
My mother was a wartime baby—born in 1941—and many times told me and my sister of the hunger of the late 1940s, after the war, when rationing was still in force in England. She attributed her lifelong fear of small portions to this austerity. Rationing had taught her that no portion
could ever be too big. Years later, on long family car journeys, she could not bring herself to stop at the motorway restaurants called Little Chef, not because she thought the food was bad—which it was—but because the adjective “little” convinced her there wouldn’t be enough of it. A story she told repeatedly to illustrate the hunger of rationing was of the day as a child when she felt so ravenous that she burst into the larder and ate a whole block of margarine. Such hunger! But one day, her older brother, my uncle, was having lunch at our house when she told the margarine story again. Did he remember feeling terribly hungry, too, during those wartime years? she asked. Not particularly, no, he said, and changed the subject.
The most famous attempt to measure what hunger actually does to the human body was the Minnesota Starvation Experiment, conducted in 1944–1945 at the University of Minnesota. For twenty-four weeks, thirty-six strapping and healthy young men were put on a reduced diet of 1,560 calories a day, a similar amount to what many weight-loss diets now recommend. As well as losing, on average, a quarter of their bodyweight, they suffered intense psychological and physical distress. Some became obsessed with reading cookbooks. Many found that their sex drive declined, and they withdrew socially. It was common to feel dizzy, moody, and nervous. They bit their nails, chewed gum, and drank coffee in profusion. A few became paranoid that they were being given less food than others, or engaged in bizarre behavior, such as dousing their meals in water and spices. But while they all ate the same amount of food, they varied in the degree and quality of the hunger pangs they felt. Two-thirds reported feeling hungry all the time, but the other third did not. For some, hunger felt like a mild discomfort in the abdomen; for others it was a sharp and intolerable shooting sensation. In the months after the experiment ended, their ability to gauge hunger went haywire. They might eat as many as 6,000 calories, gorging until they were uncomfortably full and gassy, yet still feel unsatisfied.
Hunger—this mechanism that we suppose to be so basic—turns out to be one of the more intricate bodily impulses. Feeding it is not like putting petrol in a car. There is nothing straightforward about gauging its extent, either from the inside or from the outside. Nor is it an easy thing to terminate. The mere fact that we speak of hunger as simple may be a sign of how little we have understood what it would mean to master it.
“You’d eat it if you were really hungry,” I hear myself saying
to the child who is pushing the remains of the mashed potatoes around on his plate, angling to be offered something better. “If you were really hungry, you’d be happy with a slice of bread,” I snap at the ravenous teenager who has already eaten a full supper, plus pudding, plus a supplementary bowl of yogurt, fruit, and honey, plus a toasted cheese sandwich, and who now says he can’t sleep until he has one last snack. I hate the naggy way my voice sounds when I say this “if you were really hungry” line—which I only come out with at the end of my tether. It’s as if I am blaming my children for not being more like those other, more deserving children in the world, the “really” hungry ones. The implication is that “if you were really hungry,” you’d eat anything. Which isn’t actually true.
What is true is that hunger teaches people to accept a wider range of foods. In one experiment from 2009, people were either deprived of food for fifteen hours or given plenty to eat. Afterward, when they were shown pictures of unappetizing food—stews reminiscent of vomit, spinach pulp—the hungry people showed less activity in the levator muscle of the mouth that signals disgust. Conversely, when shown pictures of appealing food—pasta, pizza—the hungry subjects showed more activity in the zygomaticus muscle, which makes us smile. In short, hunger made the nasty food seem nicer and the nice food seem nicer still.
But there are limits. Even in a state of abject food deprivation, there are certain taboo things that people tend not to eat—witness the fact that cannibalism is so rare. Hungry or not, few in the developed world would resort to a dinner of insects, or eyeballs, or dog. In most situations involving unusual foods, disgust easily trumps hunger. It is false to think that there is a state of absolute hunger in which children would eat anything. Among the hungriest children of the world, hunger is still concrete and not abstract. It cannot be satisfied by just any old thing.
Over the past ten years, the treatment of acute child malnutrition—the sort of hunger that carries an imminent danger of death—has been
revolutionized by the invention and distribution of a peanut-based paste called Plumpy’Nut. This is an energy-dense mixture: a kind of supercharged peanut butter. Delivered in little foil-wrapped packages that children can squirt straight into their own mouths, it was the brainchild of André Briend, a French pediatric nutritionist. He came up with the idea after testing numerous less successful malnutrition foods that already existed, including doughnuts and pancakes. Briend supposedly had his “aha moment” about developing a nourishing paste while looking at a jar of Nutella chocolate spread. On a trip to Malawi, where both peanuts and hungry children are plentiful, he borrowed an electric blender from a local restaurant and whipped up a nutty cocktail of peanuts, milk powder, vitamins, minerals, sugar, and oil.
Before Plumpy’Nut, when children under the age of five arrived at feeding centers with suspected acute malnutrition, the safest option was to admit them to the hospital for tube-feeding. A high percentage—as many as 75 percent in some centers—died anyway. Because it was so hard for mothers to be separated from their children, they often delayed bringing them in for help until it was almost too late. Another option was to give families a dried fortified milk mixture (called F100) that could be administered at home, but that had to be diluted with water, a dangerous proposition in most of the developing world due to the lack of reliably safe drinking water. Also, the dosage and dilution of the milk powder were left up to the families, and many overdiluted it, to make it go further, so that it could be shared among all the children in the family, rather than just those at high risk of death.
One of the great advantages of Plumpy’Nut is that as a paste it does not have to be diluted, so it can safely be given at home, avoiding the need for a hospital stay. It is what is known as an RUTF: Ready to Use Therapeutic Food. The first trials with Plumpy’Nut yielded results that were miraculous. Mark Manary, an American pediatrician working in Malawi, field-tested it in 2001. In defiance of medical orthodoxy, he sent all the children on his ward home with a six-week course of peanut paste. Ninety-five percent of them made a full recovery, as against an average of just 25 percent of those being treated for malnutrition in a hospital. Six months on, the Plumpy’Nut children were still healthy. Peanut RUTFs
are now the main way that acute child malnutrition is treated around the world. In African countries, these sweet pastes are popular with both children and mothers. Children like the sticky nutty taste. Mothers like the convenience. And doctors and aid workers like the fact that rates of recovery are so high.
And yet Plumpy’Nut has not been so welcome everywhere. While it has been an unequivocal hit in Africa, in India and Bangladesh the mothers and children do not respond to it so well. It’s not that they are not hungry enough to appreciate it. There are around 8 million children in India at risk of death from severe acute malnutrition. Bangladesh has one of the worst levels of childhood hunger in the world, with 46 percent of under-fives “stunted,” according to a UNICEF report, and 15 percent “wasted.” “Stunting,” according to the definitions used by aid organizations, means not growing or developing properly owing to years of poor nutrition. “Wasting” is a more acute kind of hunger that can kill in a matter of weeks; it can be brought on by sudden food shortages or disease, and it’s exactly the kind of critical hunger that Plumpy’Nut is designed to combat. But in Bangladesh, a peanut-based paste does not fit with local ideas of what “food” is, and specifically, which foods will appease a child’s hunger.
Jose Luis Álvarez Morán works for the charity Action Against Hunger, which fights child malnutrition in more than forty countries. He has witnessed firsthand how successful Plumpy’Nut can be in the treatment of acute malnutrition. But India and Bangladesh are different: no matter how extreme the hunger, our cultural ideas about food do not go away. Indian mothers, on the whole, would rather feed a hungry child something made from lentils or rice than peanuts, which are not part of the everyday diet. “And in Bangladesh, they just don’t like it,” says Álvarez Morán. “They want only locally produced food.”
When researchers went into an urban Bangladeshi slum in Dhaka in 2011, they found a very low level of acceptance of Plumpy’Nut among parents and children. If it were true that a “really hungry” child would eat just any food, then the slum-dwellers of Bangladesh should be only too happy to receive free sachets of calorie-dense Plumpy’Nut. But this was not the case. Out of 149 Bangladeshi caregivers of malnourished children—mostly mothers—6 out of 10 said that Plumpy’Nut was not
acceptable as a food. Many hated the peanutty smell; others reviled the sweet taste and the sticky, thick texture. The dark brown color looked like excrement to three of the parents. Twenty parents said their children needed encouragement to eat it, and fifty said their children had to be forced. It was as if they refused to accept that this strange brown paste—so unlike food as they knew it—could satisfy a child’s hunger. Thirty-seven percent said it made their children vomit, and 13 percent said it gave them diarrhea, even though 112 of the parents also admitted that their children were gaining weight while consuming it.