The Social Animal
by David Brooks

She almost felt as if she were two different people: one of whom had seen the seduction in a mildly titillating way, and the other who had seen it as a disgrace. It was as it says in Genesis, after Adam and Eve were expelled from the Garden of Eden. Their eyes were opened up, and they saw that they were naked. Later, she looked at herself and was unable to explain her own actions: “What in God’s name was I thinking?”

Furthermore, the mistake with Mr. Make-Believe had left some sort of psychic scar. When similar circumstances arose in the years that were to follow, she didn’t even have to think about her response. There was no temptation to resist because the mere thought of committing adultery again produced an instant feeling of pain and aversion—the way a cat avoids a stove on which she has been burned. Erica didn’t feel more virtuous because of what she had learned about herself, but she reacted differently to that specific sort of situation.

Erica’s experience illustrates several of the problems with the rationalist folk theory of morality. In the first place, most of our moral judgments, like Erica’s anguished thrashing that night, are not cool, reasoned judgments; they are deep and often hot responses. We go through our days making instant moral evaluations about behavior, without really having to think about why. We see injustice and we are furious. We see charity and we are warmed.

 

Jonathan Haidt of the University of Virginia provides example after example of this sort of instant moral intuition in action. Imagine a man who buys a chicken from the grocery store, manages to bring himself to orgasm by penetrating it, then cooks and eats the chicken. Imagine eating your dead pet dog. Imagine cleaning your toilet with your nation’s flag. Imagine a brother and sister who are on a trip. One night they decide to have protected sex with each other. They enjoy it but decide never to do it again.

 

As Haidt has shown in a string of studies, most people have strong intuitive (and negative) reactions to these scenarios, even though nobody is harmed in any of them. Usually, Haidt’s research subjects cannot say why they found these things so repulsive or disturbing. They just do. The unconscious has made the call.

 

Furthermore, if the rationalist folk theory, with its emphasis on Level 2 moral reasoning, were correct, then you would expect people who do moral reasoning all day to be, in fact, more moral. Researchers have studied this, too. They’ve found there’s relatively little relationship between moral theorizing and noble behavior. As Michael Gazzaniga wrote in his book Human, “It has been hard to find any correlation between moral reasoning and proactive moral behavior, such as helping people. In fact, in most studies, none has been found.”

 

If moral reasoning led to more moral behavior, you would expect people who are less emotional to also be more moral. Yet at the extreme end, this is the opposite of the truth. As Jonah Lehrer has pointed out, when most people witness someone else suffering, or read about a murder or a rape, they experience a visceral emotional reaction. Their palms sweat and their blood pressure surges. But some people show no emotional reaction. These people are not hyper-rational moralists; they are psychopaths. Psychopaths do not seem to be able to process emotion about others’ pain. You can show them horrific scenes of death and suffering and they are unmoved. They can cause the most horrific suffering in an attempt to get something they want, and they will feel no emotional pain or discomfort. Research on wife batterers finds that as these men become more aggressive their blood pressure and pulse actually drop.

Finally, if reasoning led to moral behavior, then those who could reach moral conclusions would be able to apply their knowledge across a range of circumstances, based on these universal moral laws. But in reality, it has been hard to find this sort of consistency.

 

A century’s worth of experiments suggests that people’s actual behavior is not driven by permanent character traits that apply from one context to another. Back in the 1920s, Yale psychologists Hugh Hartshorne and Mark May gave ten thousand schoolchildren opportunities to lie, cheat, and steal in a variety of situations. Most students cheated in some situations and not in others. Their rate of cheating did not correlate with any measurable personality traits or assessments of moral reasoning. More recent research has found the same general pattern. Students who are routinely dishonest at home are not routinely dishonest at school. People who are courageous at work can be cowardly at church. People who behave kindly on a sunny day may behave callously the next day, when it is cloudy and they are feeling glum. Behavior does not exhibit what the researchers call “cross-situational stability.” Rather, it seems to be powerfully influenced by context.

The Intuitionist View

The rationalist assumptions about our moral architecture are now being challenged by a more intuitionist view. This intuitionist account puts emotion and unconscious intuition at the center of moral life, not reason; it stresses moral reflexes, alongside individual choice; it emphasizes the role perception plays in moral decision making, before logical deduction. In the intuitionist view, the primary struggle is not between reason and the passions. Instead, the crucial contest is within Level 1, the unconscious-mind sphere itself.

This view starts with the observation that we all are born with deep selfish drives—a drive to take what we can, to magnify our status, to appear superior to others, to exercise power over others, to satisfy lusts. These drives warp perception. It wasn’t as if Mr. Make-Believe consciously set out to use Erica, or attack her marriage. He merely saw her as an object to be used in his life quest. Similarly, murderers don’t kill people they regard as fully human like themselves. The unconscious has to first dehumanize the victim and change the way he is seen.

 

The French journalist Jean Hatzfeld interviewed participants in the Rwandan genocide for his book Machete Season. The participants were caught up in a tribal frenzy. They began to perceive their neighbors in radically perverse ways. One man Hatzfeld spoke with murdered a Tutsi who lived nearby: “I finished him off in a rush, not thinking anything of it, even though he was a neighbor, quite close on my hill. In truth, it came to me only afterward: I had taken the life of a neighbor. I mean, at the fatal instant I did not see in him what he had been before; I struck someone who was no longer either close or strange to me, who wasn’t exactly ordinary anymore, I’m saying like the people you meet every day. His features were indeed similar to those of the person I knew, but nothing firmly reminded me that I had lived beside him for a long time.”

These deep impulses treat conscious cognition as a plaything. They not only warp perception during sin; they invent justifications after it. We tell ourselves that the victim of our cruelty or our inaction had it coming; that the circumstances compelled us to act as we did; that someone else is to blame. The desire pre-consciously molds the shape of our thought.

But not all the deep drives are selfish ones, the intuitionists stress. We are all descended from successful cooperators. Our ancestors survived in families and groups.

 

Other animals and insects share this social tendency, and when we study them, we observe that nature has given them faculties that help them with bonding and commitment. In one study in the 1950s, rats were trained to press a lever for food. Then the experimenter adjusted the machine so that the lever sometimes provided food but sometimes delivered an electric shock to another rat in the next chamber. When the eating rats noticed the pain they were causing their neighbors, they adjusted their eating habits. They would not starve themselves, but they chose to eat less, to avoid causing undue pain to the other rats. Frans de Waal has spent his career describing the sophisticated empathy displays evident in primate behavior. Chimps console each other, nurse the injured, and seem to enjoy sharing. These are not signs that animals possess morality, but they do show that animals have the psychological building blocks for it.

 

Humans also possess a suite of emotions to help with bonding and commitment. We blush and feel embarrassed when we violate social norms. We feel instantaneous outrage when our dignity has been slighted. People yawn when they see others yawning, and those who are quicker to sympathetically yawn also rate higher on more complicated forms of sympathy.

 

Our natural empathy toward others is nicely captured by Adam Smith in The Theory of Moral Sentiments, in a passage that anticipates the theory of mirror neurons: “When we see a stroke aimed and just ready to fall upon the leg or arm of another person, we naturally shrink back our own leg or our own arm; and when it does fall, we feel it in some measure and are hurt by it as well as the sufferer.” We also feel a desire, Smith added, to be esteemed by our fellows. “Nature, when she formed man for society, endowed him with an original desire to please, and an original aversion to offend his brethren. She taught him to feel pleasure in their favorable, and pain in their unfavorable regard.”

 

In humans, these social emotions have a moral component, even at a very early age. Yale professor Paul Bloom and others conducted an experiment in which they showed babies a scene featuring one figure struggling to climb a hill, another figure trying to help it, and a third trying to hinder it. As early as six months, the babies showed a preference for the helper over the hinderer. In some plays, there was a second act, in which the hindering figure was either punished or rewarded. In this case, the eight-month-olds preferred the character who punished the hinderer over one who was nice to it. This reaction illustrates, Bloom says, that people have a rudimentary sense of justice from a very early age.

Nobody has to teach a child to demand fair treatment; children protest unfairness vigorously and as soon as they can communicate. Nobody has to teach us to admire a person who sacrifices for a group; the admiration for duty is universal. Nobody has to teach us to disdain someone who betrays a friend or is disloyal to a family or tribe. Nobody has to teach a child the difference between rules that are moral—”Don’t hit”—and rules that are not—”Don’t chew gum in school.” These preferences also emerge from somewhere deep inside us. Just as we have a natural suite of emotions to help us love and be loved, so, too, we have a natural suite of moral emotions to make us disapprove of people who violate social commitments, and approve of people who reinforce them. There is no society on earth where people are praised for running away in battle.

 

It’s true that parents and schools reinforce these moral understandings, but as James Q. Wilson argued in his book The Moral Sense, these teachings fall on prepared ground. Just as children come equipped to learn language, equipped to attach to Mom and Dad, so, too, they come equipped with a specific set of moral prejudices, which can be improved, shaped, developed, but never quite supplanted.

These sorts of moral judgments—admiration for someone who is loyal to a cause, contempt for someone who betrays a spouse—are instant and emotional. They contain subtle evaluations. If we see someone overcome by grief at the loss of a child, we register compassion and pity. If we see someone overcome by grief at the loss of a Maserati, we register disdain. Instant sympathy and complex judgment are all intertwined.

As we’ve seen so often in this story, the act of perception is a thick process. It is not just taking in a scene but, almost simultaneously, weighing its meaning, evaluating it, and generating an emotion about it. In fact, many scientists now believe that moral perceptions are akin to aesthetic or sensual perceptions, emanating from many of the same regions of the brain.

 

Think of what happens when you put a new food into your mouth. You don’t have to decide if it’s disgusting. You just know. Or when you observe a mountain scene. You don’t have to decide if a landscape is beautiful. You just know. Moral judgments are in some ways like that. They are rapid intuitive evaluations. Researchers at the Max Planck Institute for Psycholinguistics in the Netherlands have found that evaluative feelings, even on complicated issues like euthanasia, can be detected within 200 to 250 milliseconds after a statement is read. You don’t have to think about disgust, or shame, or embarrassment, or whether you should blush or not. It just happens.

In fact, if we had to rely on deliberative moral reasoning for our most elemental decisions, human societies would be pretty horrible places, since the carrying capacity of that reason is so low. Thomas Jefferson anticipated this point centuries ago:

 

He who made us would have been a pitiful bungler, if He had made the rules of our moral conduct a matter of science. For one man of science, there are thousands who are not. What would have become of them? Man was destined for society. His morality, therefore, was to be formed to this object. He was endowed with a sense of right and wrong merely relative to this. This sense is as much a part of nature, as the sense of hearing, seeing, feeling; it is the true foundation of morality.

Thus, it is not merely reason that separates us from the other animals, but the advanced nature of our emotions, especially our social and moral emotions.

Moral Concerns

Some researchers believe we have a generalized empathetic sense, which in some flexible way inclines us to cooperate with others. But there is a great deal of evidence to suggest that people are actually born with more structured moral foundations, a collection of moral senses that are activated by different situations.

 

Jonathan Haidt, Jesse Graham, and Craig Joseph have compared these foundations to the taste buds. Just as the human tongue has different sorts of receptors to perceive sweetness, saltiness, and so on, the moral modules have distinct receptors to perceive certain classic situations. Just as different cultures have created different cuisines based on a few shared flavor senses, so, too, have different cultures created diverse understandings of virtue and vice, based on a few shared concerns.
