Soup leaves us full because we believe it will. One of the most interesting experiments on soup and fullness was done by scientists at Purdue University in 2004. They found that when apple juice was heated and presented as “apple soup” in a bowl with a spoon, it left subjects much fuller than when they drank it cold, from a glass, as juice. The calories and the volume were identical, yet the soup outperformed the juice on satiety, both fifteen minutes later and an hour later. The researchers noted that soup may satisfy us more than juice because we perceive it as filling; they concluded that the main reason soup had such a positive effect on satiety was “cognitive.” We think of it as a food that will kill hunger, and so it does. The apple-soup experiment suggests that to search for “fullness” in this or that food is to look in the wrong place.
The search for special fullness-inducing foods assumes that when our bodies register that we are full, we will stop eating. This is a logical enough assumption, though as we’ve seen, very little about the way we eat is, in fact, logical. Professor Barry Popkin, a leading obesity researcher based at the Carolina Population Center, argues that outside of food labs, hunger and satiety are no longer the main forces driving our eating. Very few of us allow ourselves, or our children, to experience the sensation of hunger at all these days. We are semi-sated much of the time, preloaded with nibbles here and there that hardly seem to register as eating.
Popkin used survey data to track how meal patterns changed in the United States from 1977 to 2006. He found that for both children and adults, there was a 23 percent decrease in the amount of time between “eating opportunities” (aka meals and snacks). In 1977, the average time between meals was 4.1 hours for children and 4.4 hours for adults. (For a point of comparison, in eighteenth-century Europe, a six- or seven-hour gap between meals was standard.) By 2006, the gap had shrunk to 3.1 hours for children and 3.5 hours for adults. In other words, the time between “eating opportunities” shortened by roughly a full hour. This shrinkage is reflected in the rising number of calories eaten: in 1977, daily intake averaged 2,090 calories, as against 2,533 in 2003–2006 (across all American age groups over the age of two). Interestingly, the number of calories consumed at meals has actually declined slightly for children since 1977, by 62 calories a day. For both children and adults, the main increase in calories came from snacks. If Popkin’s data accurately reflect eating patterns (and he suspects they are on the conservative side), the average adult now consumes an extra 180 calories a day from snack foods compared to thirty years ago, not to mention the extra calories from beverages, which increased from 290 to 422 calories a day for adults. “Eating between meals” used to be frowned upon, but now it is actively encouraged by some diet gurus, who claim that our blood sugar will stay more regular if we snack every three or four hours. That may make sense if the snack in question is a handful of pecans or a pear; the argument starts to look a little shakier when you look at the snack foods that people most often buy: potato chips, sugary muffins, and confectionery.
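For readers who like to check the arithmetic, here is a minimal sketch using only the figures reported above; the variable names, and the idea of checking each age group separately, are mine, not Popkin’s method:

```python
# Meal-gap and calorie figures as reported above (Popkin, 1977 vs. 2006).
gap_1977 = {"children": 4.1, "adults": 4.4}  # hours between eating opportunities
gap_2006 = {"children": 3.1, "adults": 3.5}

for group in gap_1977:
    drop = gap_1977[group] - gap_2006[group]
    pct = 100 * drop / gap_1977[group]
    print(f"{group}: gap shrank by {drop:.1f} h ({pct:.0f}% decrease)")
# children: gap shrank by 1.0 h (24% decrease)
# adults: gap shrank by 0.9 h (20% decrease)
# Both are close to the reported "23 percent" and "roughly a full hour."

calories_1977, calories_2006 = 2090, 2533  # average daily intake, all ages 2+
print(f"extra calories per day: {calories_2006 - calories_1977}")  # 443
# Of the adult increase, the figures quoted above attribute about 180 kcal
# to snacks and 422 - 290 = 132 kcal to beverages.
```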
Sometimes, the hunger won’t go away no matter how much you eat. It is easy to confuse hunger with other emotional states. When it’s your birthday, it is almost impossible not to be hungry for cake. “Emotional eating” usually refers to sad emotions, but happy moods can also make us eat more: we are conditioned to use excess calories to celebrate. Researchers have found that it is possible to trigger celebratory binge eating simply by putting subjects in a good mood with a 2½-minute heartwarming film of a baby panda sneezing. In one test, the group who watched this film consumed about a hundred more calories of snack foods (M&Ms, peanuts, and wine gums) than a control group shown a dull film about birds in the desert.
Mostly, we eat so often and so much because we have more or less lost touch with the signals our body is sending us about hunger. We take our cues to eat from many places, very few of which have to do with the biomarkers of fullness in our brains and guts. Knowing which foods to eat, as we have seen, is a skill that develops with age and experience. But knowing how much to eat is something that infants are better at than older children or adults. Up until the age of three, children have a remarkable ability to stop eating when they are full. It doesn’t matter if you serve them a big portion or a small portion; they will eat until they are not hungry and then stop (assuming they are not force-fed). After that age, this ability to self-regulate hunger is partially lost, and sometimes never regained.
Young children’s accuracy in recognizing when they are full has been confirmed by numerous studies. In one, from 2000, thirty-two preschool children from Pennsylvania were served macaroni cheese. There were two age groups: three-year-olds and five-year-olds. The three-year-olds ate roughly the same amount of the macaroni whether they were served a small, medium, or large portion. They were paying attention not to the size of the portion they were offered but to what their own bodies were telling them. The five-year-olds, on the other hand, ate significantly more when the portion of macaroni was large. It was as if the sight of so much food in the bowl was telling them to ignore their own fullness and keep munching. The loss of hunger regulation after the age of three or four is a phenomenon that transcends cultures and continents. In 2013, the Pennsylvania findings were replicated in Kunming, China, with variable portions of rice, vegetables, and protein. This time, the two groups of children were aged four and six. The four-year-olds from Kunming actually ate slightly less when offered a large portion, as if they felt overwhelmed by all that food. But the six-year-olds ate substantially more. The scientists behind this Chinese experiment suggested that “in hunger and satiety, there is some point in the development process when children begin to respond to contextual cues such as portion size.”
Most of us continue to respond to these contextual cues, rather than actual hunger, for the rest of our lives. In one famous study, the “bottomless soup” experiment, adult diners were served tomato soup from bowls that were secretly refilled through hidden tubes as they ate. Others at the table ate from normal soup bowls. After the meal, researchers asked the diners how full they felt and how much soup they believed they had consumed. Those eating from the self-refilling bowls estimated that they had eaten only fractionally more than the others: an additional 4.8 calories’ worth. In fact, they had consumed 76 percent more soup than the subjects eating from normal bowls, the better part of an extra portion. Yet they did not rate themselves as any fuller than those eating from the regular bowls.
From childhood onward, our idea of fullness is heavily influenced by how much food we are offered. Large packages make it seem normal to eat large quantities. We are disposed to think that we will be full when we have eaten “one” of something: one sandwich, one apple, one cookie. Then, if we are extra-hungry, we might have a second helping. This was all well and good in the days when cookies tended to be the diameter of a coffee cup rather than a side plate. But the rise of vast portions, particularly in fast-food restaurants, means that to eat only the calories we need, we should often stop at half of something, or even a quarter. And no one (child or adult) seems to like the feeling of the glass, or the plate, being half empty.
My youngest son often demands not one but two cookies, one for each hand. This is fine at home, where I bake almond-butter crescents that
are not much bigger than a coin. But when we’re out in a café, where the baked goods are giant, I’ve been known to break one in half and say,
“Look, now you have two.” Which doesn’t fool him. Besides which, two halves of a giant cookie is still too much.
Professor Marion Nestle of New York University has spent decades decrying the “Law of Portion Size”: the more food in front of you, the more you will eat. One of her colleagues came to the office one day with the largest slice of pizza either of them had ever seen: fourteen inches long and a full pound in weight. It was equivalent to 2,000 calories, the entire recommended daily intake for a moderately active woman. The customers buying this pizza might tell themselves, “It’s only one slice,” and feel it must be okay to finish it; after all, they are not eating a whole pizza. We need new eating methods to take account of the new ways we are being supplied with food.
The work of Brian Wansink, an expert in both marketing and nutrition, has shown that both children and adults can be disturbingly impressionable when it comes to deciding how much to eat. We may believe we only eat until we are full, but there are countless triggers messing with the off-button for eating. Wansink has done a series of studies that involve manipulating the size of utensils to demonstrate what he calls the size-contrast illusion. A large bowl makes you eat too much ice cream; an oversize plate makes you serve too many potatoes; and a short, squat glass makes you pour too much juice. When judging the quantity of liquids, almost everyone focuses on the height and forgets about the width. This mistake is even made by experienced bartenders, who consistently overpour shots when using short tumblers as opposed to tall highball glasses.
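The height bias is easy to demonstrate with a little geometry. Here is a minimal sketch (the glass dimensions are illustrative assumptions of mine, not measurements from Wansink’s studies) showing how a short, wide tumbler can hold more than a taller, narrower glass that looks bigger:

```python
import math

def glass_volume_ml(diameter_cm: float, height_cm: float) -> float:
    """Approximate a glass as a cylinder: V = pi * r^2 * h (1 cm^3 = 1 ml)."""
    radius = diameter_cm / 2
    return math.pi * radius ** 2 * height_cm

# Illustrative dimensions: a tall highball glass vs. a short, squat tumbler.
tall_narrow = glass_volume_ml(diameter_cm=5.0, height_cm=15.0)
short_wide = glass_volume_ml(diameter_cm=8.0, height_cm=8.0)

print(f"tall narrow glass: {tall_narrow:.0f} ml")   # ~295 ml
print(f"short wide tumbler: {short_wide:.0f} ml")   # ~402 ml
# The tumbler is roughly half the height yet holds about a third more,
# because volume grows with the square of the width but only linearly
# with height; that is why pours into short glasses run large.
```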
It’s a chancy business, taking our cues about how much to consume from our surroundings. Wansink has found that sometimes just the sight of food is a powerful enough trigger to override sensations of fullness. When you have lost touch with the hunger signals of your own body, the prompts to eat are almost inescapable. We are like Alice in Wonderland, controlled by cakes that say “Eat Me” and bottles that say “Drink Me.” When dieters were asked why they stopped or started eating, some of them simply said, “I saw the food.”
Countless studies have shown—duh!—that we eat more when distracted by a screen, whether TV, tablet, or computer. A study of nine- to fourteen-year-old boys showed that not only did they eat more while
watching TV, but the larger quantities of food did not make them feel any fuller. What was happening on the screen was way more interesting to these boys than what was happening in real life.
Wansink has laid out some simple ways that we can reengineer our food environment to eat less. Avoid “distracting” meals such as TV dinners and computer lunches. Replace the cookie jar with a fruit bowl. Repackage food into smaller containers. Order half-size portions in restaurants. “Replace short wide glasses with tall narrow ones.” And get smaller plates. This last one has certainly worked for me. Sometimes I know I’m not hungry at the end of a meal, but yearn for something very sweet to punctuate my eating. I get my tiniest plate—the blue-and-white china kind you buy from Chinese supermarkets for dipping sauce—and fill it with whatever I crave: dark, dense chocolate cake, vanilla ice cream with caramelized almonds, sticky gingerbread. It doesn’t matter how full I make the plate, because it will still be a tiny portion, so I can eat it without guilt or remorse. The first time I did this, I was skeptical: Could I really be so childish that my brain would be fooled by the smaller plate? Yes. I could.
Another cue that keeps us eating more than we should is variety. When asked why they stopped eating, participants in food studies are just as likely to cite boredom as fullness. The waiter’s insistence that we have an extra stomach for sweet things turns out to be true. Sort of. Professor Barbara Rolls, working at Johns Hopkins University, coined the term “sensory-specific satiety” (SSS): as we eat a certain food, our hunger for that particular food declines, but our hunger for other, new foods remains fresh. This is why buffets are so dangerous. Just as your hunger for one food wanes, there is always something else there to tempt you to eat more. Rolls has argued that the original evolutionary purpose of SSS among our hunter-gatherer ancestors was to promote a good, varied diet. But it does not work so well in our modern food system, where variety might mean different colors of candy, or different flavors of popcorn.
If mindless eating is what makes us blind to our own fullness, the solution might be found in mindfulness, that current new-age buzzword. Training in mindful eating teaches you to pay greater attention both to the food and to the sensations in your own body. Before you sit down to
eat, you ask whether you are really hungry. You set the table nicely, with candles and napkins. You switch off distracting electronic devices. You take the time to savor the aroma and the flavor of the food, putting down your fork between mouthfuls. You notice whether you are enjoying the food or not, and if you are not, you stop eating. Obviously, all of this is a little tricky to achieve if you are a child sitting at a table with a parent bellowing in your ear to finish your breakfast or else you’ll be late for school.
The work of pediatrician Susan L. Johnson, however, has shown that it is possible to teach children to become better at responding to their own internal fullness. Many parents, especially those who struggle with their own weight, believe that a child is incapable of self-control when it comes to eating. Such parents may be so out of touch with their own hunger cues that they do not credit their children with the capacity to learn to eat only when they are hungry. Yet twin studies suggest that the ability to stop eating when you are full has little genetic basis: it is fundamentally a response to environment, and it can therefore be learned.
Johnson demonstrated that over six weeks of intensive intervention, it was possible to train children to improve their ability to self-regulate the amount of food they ate. The children were in a preschool, and most of them were four or five: just the stage when our natural ability to self-regulate portions begins to desert us. When Johnson first assessed the children, they varied wildly in their hunger regulation. Some overate, some underate, and some “regulated accurately.” The children whose mothers dieted and had difficulty managing their own food intake were the ones who were least skilled at regulating what they ate in response to hunger.