In her memoir of living with her then fourteen-year-old anorexic daughter Kitty, Brave Girl Eating, Harriet Brown, an advocate for FBT, describes how a refeeding meal might go. Brown lays out Kitty’s breakfast of a bowl of cereal with milk and strawberries. Kitty says she wants cottage cheese instead. Brown says there is none. Kitty complains that the cereal is soggy. Brown makes a fresh bowl but then insists as calmly as she can that Kitty “sit down and start eating.” This whole rigmarole goes on many times a day, with Kitty often sobbing that the food will make her
fat, and Brown protesting that food is her “medicine” and she must eat. Brown or her husband take it in turns to stay with Kitty for an hour after each meal to prevent her from running to the bathroom to purge. There are many such meals each day—Kitty needs to eat a snack every couple of hours. Four years on, Kitty has recovered to the point where her parents feel it is safe for her to go off to college by herself and take responsibility for her own meals. There are still relapses when the “demon” returns to the dinner table and Kitty’s weight drops, but at least they all feel they have done everything they can to normalize her relationship with food. More importantly, Kitty herself now has an approach to food that she can follow to turn her situation back around when she starts to slip into losing weight. Food is medicine.
One of the many hard aspects of refeeding is that it isn’t enough for the patient to eat an amount of food that would be healthy for a person of normal weight. Someone with anorexia needs vastly more calories than before to regain the weight needed for the body and brain to recover. Anorexics would never “choose” of their own accord to drink a 1,000-calorie milkshake, but after they recover, they often say that it was strangely liberating to have their parents telling them that they had no option but to eat, because it reduced the shame. Families need to become experts in which foods offer the most calories without filling their children up to the point where their stomachs hurt. It is the opposite of the way most of us try to eat, seeking out the maximum food for the minimum calories.
This refeeding process may be even harder for those anorexics who develop the disease later in life and are without parents at hand to help them. Some years ago, I wrote an article on women who were battling anorexia in their thirties, forties, and fifties. Among those I met was Jane, a reticent fifty-three-year-old teaching assistant who described the humiliation of being a middle-aged anorexic. For her, the misery of anorexia was compounded by a sense of shame that, at her age, she should have “known better,” as she put it. At her lowest point, Jane lost five stones (about seventy pounds) from an already slim body. Once, she felt so despairing that she took a hammer and smashed her own hand. She
was put in a therapy group with six “trendy” teenage girls and expected to open up about how she felt. How Jane felt was: “Why should I share my innermost feelings with a group of strangers?” Another obstacle to Jane’s recovery—in common with the other older anorexics I interviewed—was that she was the one in the family who provided for everyone else. Jane was very good at feeding others, but feeding herself was another matter. She prepared lavish, ambitious meals for her husband and two sons, while she nibbled on an apple or a yogurt. On the rare occasions that she ate out with her husband, she could be reduced to tears by the arrival of a bowl of soup. When I met her, Jane was slowly teaching herself to eat again. She had managed to edge up to 1,000 calories a day: not enough—she was still painfully thin—but just sufficient to keep her out of the hospital.
For some adult anorexics, the best course of treatment may be a residential program where the patients—of whatever age—can enact the role of children in the protective setting of a family meal once again. I visited Newmarket House in Norwich, England, a specialist treatment center for anorexia that felt more like a spacious home than a clinic, with colorful sofas and appetizing cooking smells in the air. I met Beth, a mother of four in her thirties. Like Jane, Beth was a confident cook, and took great pride in the birthday cakes she baked for her children, but she struggled to allow herself to eat anything but lettuce and tomatoes. She wished she could disappear, she said, and was still far from fully recovered. But at least the structured meals of Newmarket House—at which the nurses feel more like family members than therapists—gave Beth an environment where others took care of her eating for a change.
With some eating disorders, being older and more independent seems to be beneficial for recovery. Bulimia tends to strike at a later age than anorexia (in a review of 5,653 cases of bulimia, the average age of onset was seventeen, but often it starts in the twenties). A study of forty women who had fully recovered from bulimia found that they tended to be self-motivated about getting better and did not like the view that “one is powerless over one’s problems.” Eighty percent of these recovered bulimics had ultimately been motivated to change by their own desire for a better life and weariness at the symptoms (when they were ill, they vomited, on average, twenty-two times a week). Although most of them
benefited from professional help, nearly half of them backed this up by reading self-help books. Another study found that among a group of bulimics in Austria, more of the patients became symptom-free by using guided self-help, working through a manual by themselves, than by being given a course of cognitive behavioral therapy.
Learning a new, more balanced way of eating after bulimia or binge eating is very different from recovery through refeeding as an anorexic. Instead of boosting calories, a bulimic needs to find a reliable way to limit each day’s food, avoiding anything that might trigger an episode of bingeing. Unlike selective eaters, bulimics need to teach themselves to become less
omnivorous. One forty-five-year-old recovered bulimic described the strict regime she had created for herself as a way of keeping herself from backsliding into the bingeing and purging behavior. As a result of the methods she used, she had recently celebrated eighteen months without any symptoms. She shopped in very small quantities, to make bingeing impossible, and ate five small meals a day of fish, meat, fruit, and vegetables. She breakfasted on tinned tuna or cold chicken, because bread would remind her too much of bingeing and create a temptation to purge. Wheat and dairy were now eliminated from her diet. For an anorexic, such rigid food rules might be a dangerous path to take, but for a bulimic, there can be liberation in limits.
There is at least one respect, however, in which the situation of anorexics and bulimics is very similar. Before addressing what to eat, the most urgent matter is how to eat. Phase one of recovery from bulimia is the reintroduction of regular mealtimes: no bingeing, no starving. Slowly, the days regain a sense of rhythm. As anyone who has ever suffered jetlag knows, few things are more disorienting than a warped sense of time. Some of what makes bulimia nightmarish—in common with other eating disorders—is that it disrupts the daily tempo of meals. One recovered binge eater spoke of how she used to live in a “food-fuelled haze,” but had now found that by allowing herself regular, clearly defined meals, she had regained a feeling of certainty. Lunch becomes a meaningless concept when you have already eaten—and possibly purged—a whole box of
cereal by mid-morning. When you are eating all the time, food curiously loses much of its joy, along with its sense of ceremony and sociability.
Once again, the experiences of eating disorder sufferers are on a continuum with the rest of us. It is hard for anyone to live well when meals are not given the attention they are due. As New Yorker writer Adam Gopnik has asserted, “the table comes first,” meaning that before we can resolve our endless quandaries about food—such as “where the zucchini came from and how far it had to travel”—we should first establish the basic paradigm that at certain times, every day, we stop, we sit, and we eat.
In many ways, the needs of a bulimic or anorexic are not so
different from those of an adult picky eater like Diane, or just an averagely screwed-up human being with a desire to lose weight. An individual with anorexia “is disconnected from her internal experiences,” and cannot read her internal hunger signals effectively, notes one scholarly article on anorexia. But most of the population, as we have seen, is similarly disconnected from internal signals about when, what, and how much to eat. The difficulty is that those whose disordered eating is less extreme are likely to have less help: when you sit down to eat, you are both the parent and the child, the doctor and the patient. Like the anorexic patient faced with the 1,000-calorie milkshake, many of us would never “choose” to eat a plate of healthy food over a fast-food meal, but if we can give our bodies the food they need often enough, in a kind and persistent enough way, we may eventually start to recover. Eating is about the food. What all of us need is to find a way to eat regular meals, to take pleasure in a variety of foods, and to be able to eat them without being consumed by negative emotions.
It is startling to hear the message, from eating disorder therapists, that nourishing, health-giving family meals, eaten in loving company, are so important for a child’s well-being that everything else in life must be made secondary to them. Most families—most people—do not live like this. Gone—thankfully—are the days of a patriarch ruling the family from the head of the table. The breakdown of strict table manners—children should be seen and not heard!—has been emancipating in some ways. But as a society, we haven’t quite figured out what a new structure for meals would look like that isn’t just a hasty sandwich in the car on
the way to something more important. The experience of eating each day around a table is given second billing to everything else: homework, after-school activities, Instagram, and email. In a busy life, the organization required for regular, shared dinners can seem unattainable; even if they can manage the shopping and the cooking, parents often hesitate to assume the authority to gather everyone together to eat, never mind to insist that everyone eat the same food. But the experience of eating disorders shows that this is partly a question of priorities. When eating becomes a matter of life or death, and each new bite is a celebration, you may discover that none of the other stuff was quite as important as sitting and breaking bread together.
I knew a family
whose children collected—and then ate—potato chips from different countries. Whenever friends went abroad, the children asked them to bring back a package or two. They had eaten curry chips from Belgium and shrimp crisps from Thailand; crinkle-cut from Australia and paprika chips from Germany shaped like kangaroos. Their whole idea of global food was variations on fried potatoes.
People with selective eating disorders often find that potato chips (usually plain salted ones) are one of the “safe” foods they can best tolerate. “Bob K,” one of the founders of a support group for adult picky eaters, says that potato chips satisfy his two main requirements from food: they are plain-tasting, but they are also crisp and crunchy in texture. They are reassuringly beige in appearance, too. In 2012, a fifty-four-year-old woman dubbed by the newspapers “the world’s fussiest eater” told reporters she ate only three foods: milk, white bread, and fried potatoes, whether in the form of fries or chips. Of these, the chips were her favorite because they were “so salty and fresh and potato-y.”
Selective eaters are not alone in this love of chips. It’s tempting to feel that most of the planet is on the feeding disorder spectrum, judging from our chip habit. Some stack them up like Jenga blocks, to cram in as many as the mouth will hold; others nibble them one by one, licking off the salt before biting. Fried slivers of potato were once an aristocratic treat, used as a garnish to roasted game birds and eaten in small quantities. Those days are gone. In 1964, the British ate an average of 250 grams (about 9 ounces, a little more than half a pound) of chips per person per year; by
1984, this had jumped to 1.33 kilograms (about 47 ounces, or 2.9 pounds). It is now more than 3 kilograms (about 106 ounces, or 6.6 pounds), not counting all the other chip-like salty snacks and crackers that Brits devour.
How have we learned to eat quite so many chips? John S. Allen, author of The Omnivorous Mind, notes that crispiness is a nearly universally loved texture across different cultures. Part of the appeal is that chewing crispy foods activates our sense of hearing as well as smell and taste. Making that loud crunching sound is part of the pleasure: it staves off boredom and makes you eat more. Allen suggests that our penchant for crispy food may go back to our primate ancestors, for whom crunchy insects were a valuable source of protein.
But like so much about the way we eat, our instinct for crispiness has outlived its usefulness. Almost all of the commercially produced crispy foods—from chips to fried chicken to breadcrumbed nuggets—are ones we’d do well to eat less of. I can’t deny that salty fried food can be delicious. The way forward could be to get your hit of crispiness in vegetable form. It is possible to make vegetable fritters so inviting—cauliflower pakoras, eggplant tempura, sweetcorn pancakes—that chips seem dull by comparison. But it is still hard to replicate the crunch.