How We Learn
Benedict Carey

It’s no stretch to say, however, that learning—in school, at work, at practice—is equally crucial to the survival game. Mastering a subject or skill may not be as urgent as avoiding some saber-toothed cat, but over a lifetime our knowledge and skills become increasingly valuable—and need to be continually updated. Learning is how we figure out what we want to do, what we’re good at, how we might make a living when the time comes. That’s survival, too. Yet, especially when we’re young, we have a terrible time trying to sort out what’s important from what’s not. Life is confusing, it moves fast, we’re fielding all sorts of often conflicting messages and demands from parents, teachers, friends, and rivals. There aren’t enough hours in the day to think through what it all means.

That’s reason enough to suspect that what the brain does at night is about more than safety. The sleep-wake cycle may have evolved primarily to help us eat and not be eaten, but if that downtime can be put to good use, then evolutionary theory tells us it will. What better way to sift the day’s perceptions and flag those that seem most important? A tracking skill. A pattern of movement in the bushes. An odd glance from a neighbor. A formula for calculating the volume of a cone. A new batting stance. A confounding plot in a Kafka novel. To sort all that variety, sleep might well have evolved distinct stages to handle different categories of learning, whether retention or comprehension, thermodynamics or Thucydides. I am not arguing that each stage of sleep is specialized, that only REM can handle math and only deep sleep can help store Farsi verbs. Anyone who’s pulled an all-nighter or two knows that we don’t need any sleep at all to learn a pile of new material, at least temporarily. I am saying the research thus far suggests that each of sleep’s five stages helps us consolidate learning in a different way.

Siegel’s theory tells us that exhaustion descends when the costs of staying up outweigh its benefits. The Night Shift Theory gives us the reason why: because sleep has benefits, too—precisely for sorting through and consolidating what we’ve just been studying or practicing. Seen in this way, it’s yin and yang. Learning crests during waking hours, giving way to sleep at the moment of diminishing returns, when prolonged wakefulness is a waste of time. Sleep, then, finishes the job.

I’ve always loved my sleep, but in the context of learning I assumed it was getting in the way. Not so. The latest research says exactly the opposite: that unconscious downtime clarifies memory and sharpens skills—that it’s a necessary step to lock in both. In a fundamental sense, that is, sleep is learning.

No one is sure how the brain manages the sensory assault that is a day’s input, biologically. The science of sleep is still in its infancy. Yet one of its leading theorists, Giulio Tononi of the University of Wisconsin, has found evidence that sleep brings about a large-scale weakening of the neural connections made during the previous day. Remember all those linked neural networks forming every moment we’re awake? Tononi argues that the primary function of sleep is to shake off the trivial connections made during the day and “help consolidate the valuable inferences that were made.” The brain is separating the signal from the noise, by letting the noise die down, biologically speaking. Active consolidation is likely going on as well. Studies in animals have found direct evidence of “crosstalk” between distinct memory-related organs (the hippocampus and the neocortex, described in chapter 1) during sleep, as if the brain is reviewing, and storing, details of the most important events of the day—and integrating the new material with the old.

I sure don’t know the whole story. No one does, and maybe no one ever will. The properties of sleep that make it such an unreliable companion—often shallow, elusive when most needed, or arriving when least wanted—also make it difficult to study in a controlled way over time. It’s likely that the sleep stages, arbitrarily defined by brain wave changes, may be replaced by more precise measures, like the chemical cocktails circulating during sleep states, or different types of “crosstalk.” My bet, though, is that the vast promise of tweaking sleep as a means to deepen learning will tempt someone into longer-term experiments, comparing the effects of different schedules on specific topics. Those effects will likely be highly individual, like so many others described in this book. Some night owls may find early morning study sessions torturously unproductive, and some early birds get their chakras bent out of joint after 10 P.M. At least with the Night Shift Theory, we have some basis on which to experiment on our own, to adjust our sleep to our advantage where possible.

Put it this way: I no longer think of naps or knocking off early as evidence of laziness, or a waste of time, or, worst of all, a failure of will. I think of sleep as learning with my eyes closed.

Conclusion

The Foraging Brain

I began this book with the allegation that most of our instincts about learning are misplaced, incomplete, or flat wrong. That we invent learning theories out of whole cloth, that our thinking is rooted more in superstition than in science, and that we misidentify the sources of our frustration: that we get in our own way, unnecessarily, all the time. In the chapters that followed, I demonstrated as much, describing landmark experiments and some of the latest thinking about how remembering, forgetting, and learning are all closely related in ways that are neither obvious nor intuitive. I also showed how those unexpected relationships can be exploited by using specific learning techniques.

What I have not done is try to explain why we don’t know all this already.

If learning is so critical to survival, why do we remain so ignorant about when, where, and how it happens? We do it naturally, after all. We think about how best to practice, try new approaches, ask others we think are smarter for advice. The drive to improve never really ends, either. By all rights, we should have developed pretty keen instincts about how best to approach learning. But we haven’t, and the reasons why aren’t at all apparent. No one that I know of has come forward with a convincing explanation, and the truth is, there may not be one.

I do have one of my own, however, and it’s this: School was born yesterday. English class, Intro to Trig, study hall, soccer practice, piano lessons, social studies, art history, the Russian novel, organic chemistry, Zeno’s paradoxes, jazz trumpet, Sophocles and sophomore year, Josephus and gym class, Modern Poetry and Ancient Civilizations: All of it, every last component of what we call education, is a recent invention in the larger scheme of things. Those “ancient” civilizations we studied in middle school? They’re not so ancient, after all. They date from a few thousand years ago, no more. Humans have been around for at least a million, and for the vast majority of that time we’ve been preoccupied with food, shelter, and safety. We’ve been avoiding predators, ducking heavy weather, surviving by our wits, foraging. And life for foragers, as the Harvard psychologist Steven Pinker so succinctly puts it, “is a camping trip that never ends.”

Our foraging past had some not so obvious consequences for learning. Think for a moment about what it meant, that lifelong camping trip. Hunting and tracking were your reading and writing. Mapping the local environment—its every gully, clearing, and secret garden—was your geometry. The science curriculum included botany, knowing which plant had edible berries and which medicinal properties; and animal behavior, knowing the hunting routines of predators, the feeding habits of prey.

Over the years you’d get an education, all right. Some of it would come from elders and peers, but most of it would be accumulated through experience. Listening. Watching. Exploring the world in ever-widening circles. That is how the brain grew up learning, piecemeal and on the fly, at all hours of the day, in every kind of weather. As we foraged for food, the brain adapted to absorb—at maximum efficiency—the most valuable cues and survival lessons along the way.

It became a forager, too—for information, for strategies, for clever ways to foil other species’ defenses and live off the land. That’s the academy where our brains learned to learn, and it defines who we are and how we came to be human.

Humans fill what the anthropologists John Tooby and Irven DeVore called the “cognitive niche” in evolutionary history. Species thrive at the expense of others, each developing defenses and weapons to try to dominate the niche it’s in. The woodpecker evolved an extraordinary bone structure to pound holes in tough bark and feed on the insects hidden in trees. The brown bat evolved an internal sonar, called echolocation, allowing it to hunt insects at dusk. We evolved to outwit our competitors, by observing, by testing our intuitions, by devising tools, traps, fishhooks, theories, and more.

The modern institution of education, which grew out of those vestigial ways of learning, has produced generations of people with dazzling skills, skills that would look nothing less than magical to our foraging ancestors. Yet its language, customs, and schedules—dividing the day into chunks (classes, practices) and off-hours into “study time” (homework)—have come to define how we think the brain works, or should work. That definition is so well known that it’s taken for granted, never questioned. We all “know” we need to be organized, to develop good, consistent study routines, to find a quiet place and avoid distractions, to focus on one skill at a time, and above all, to concentrate on our work. What’s to question about that?

A lot, it turns out. Take “concentration,” for example, that most basic educational necessity, that mental flow we’re told is so precious to learning. What is concentration, exactly? We all have an idea of what it means. We know it when we see it, and we’d like more of it. Yet it’s an ideal, a mirage, a word that blurs the reality of what the brain actually does while learning.

I remember bringing my younger daughter to my newspaper office one weekend a few years ago when she was twelve. I was consumed with a story I had to finish, so I parked her at an empty desk near mine and logged her into the computer. And then I strapped in at my desk and focused on finishing—focused hard. Occasionally, I looked up and was relieved to see that she was typing and seemed engrossed, too. After a couple hours of intense work, I finished the story and sent it off to my editor. At which point, I asked my daughter what she’d been up to. She showed me. She’d been keeping a moment-to-moment log of my behavior as I worked. She’d been taking field notes, like Jane Goodall observing one of her chimpanzees:

10:46—types
10:46—scratches head
10:47—gets papers from printer
10:47—turns chair around
10:48—turns chair back around
10:49—sighs
10:49—sips tea
10:50—stares at computer
10:51—puts on headset
10:51—calls person, first word is “dude”
10:52—hangs up
10:52—puts finger to face, midway between mouth and chin, thinking pose?
10:53—friend comes to desk, he laughs
10:53—scratches ear while talking

And so on, for three pages. I objected. She was razzing me, naturally, but the phone call wasn’t true, was it? Did I make a call? Hadn’t I been focused the whole time, locked in, hardly looking away from my screen? Hadn’t I come in and cranked out my story without coming up for air? Apparently not, not even close. The truth was, she could never have invented all those entries, all that detail. I did the work, all right, and I’d had to focus on it. Except that, to an outside observer, I looked fidgety, distracted—unfocused.

The point is not that concentration doesn’t exist, or isn’t important. It’s that it doesn’t necessarily look or feel like we’ve been told it does. Concentration may, in fact, include any number of breaks, diversions, and random thoughts. That’s why many of the techniques described in this book might seem unusual at first, or out of step with what we’re told to expect. We’re still in foraging mode to a larger extent than we know. The brain has not yet adapted to “fit” the vocabulary of modern education, and the assumptions built into that vocabulary mask its true nature as a learning organ.

The fact that we can and do master modern inventions like Euclidean proofs, the intricacies of bond derivatives, and the fret board hardly means those ancient instincts are irrelevant or outmoded. On the contrary, many scientists suspect that the same neural networks that helped us find our way back to the campsite have been “repurposed” to help us find our way through the catacombs of academic and motor domains. Once central to tracking our location in physical space, those networks adjusted to the demands of education and training. We don’t need them to get home anymore. We know our address. The brain’s internal GPS—it long ago evolved internal communities of so-called grid cells and place cells, to spare us the death sentence of getting lost—has retuned itself. It has adapted, if not yet perfectly.
