In 2001, Henry L. Roediger III, Michelle L. Meade, and Erik T. Bergman at Washington University had students list ten items they would expect to see in a typical kitchen, toolbox, bathroom, and other common areas in most homes. Think about it yourself. What ten items would you expect to find in a modern kitchen? This idea, this imaginary place, is a schema. You have schemas for just about everything—pirates, football, microscopes—images and related ideas that orbit the archetypes for objects, scenarios, rooms, and so on. Those archetypes form over time as you see examples in life or in stories from other people. You also have schemas for places you’ve never been, like the bottom of the ocean or ancient Rome.
For instance, when you imagine the ancient Romans, do you see chariots and marble statues with bone-white columns stretching overhead? You probably do, because this is how ancient Rome is always depicted in movies and television. Would it surprise you to know those columns and sculptures were painted with a rainbow of colors that would be gaudy by today’s aesthetic standards? They were. Your schema is fast, but inaccurate. Schemas function as heuristics; the less you have to think about these concepts, the faster you can process thoughts that involve them. When a schema leads to a stereotype, a prejudice, or a cognitive bias, you trade an acceptable level of inaccuracy for more speed.
Back to the experiment. After the psychologists had the students list items they’d expect to find in various household locations, they brought in actors posing as a new batch of students and paired them up with the students who’d just made their lists. Together, the subjects and the confederates looked at slides depicting the familiar locations and were asked to pay close attention to what they saw so they could remember it later on. To clear their mental palates, the subjects did some math problems before moving on to the last part of the experiment. The students then returned with their partners and together recalled out loud what they remembered of the scenes, but the confederates included items that weren’t in the pictures. The kitchen scene, for example, didn’t feature a toaster or oven mitts, but the actors claimed to remember both. After the ruse, the subjects were handed a sheet of paper and asked to list all the things they could remember.
As you’ve deduced by now, the subjects easily formed false memories of items they expected to be in the scenes. They listed items that were never shown but had been suggested by their partners. Their schemas for kitchens already included toasters and oven mitts, so when the actors said they saw those things, it was no problem for their minds to go ahead and add them to the memory. If their partners had instead said they remembered seeing a toilet bowl in the kitchen, it would have been harder to accept.
In 1932, psychologist Frederic Bartlett presented a folktale from American Indian culture to subjects and then asked them to retell the story back to him every few months for a year. Over time, the story became less like the original and more like a story that sounded as though it came from the culture of the person recalling it.
In the original story, two men from Egulac are hunting seals along a river when they hear what they believe are war cries. They hide until a canoe with five men approaches. The men in the canoe ask them to join in a battle. One man agrees; the other goes home. After this, the story gets confusing, because in the battle someone hears someone else say the men are ghosts. The man who traveled with the warriors is hit, but it isn’t clear what, or who, hits him. When he gets home, he tells his people what happened, saying he fought with ghosts. In the morning, something black comes out of his mouth, and he dies.
The story is not only strange but written in an unusual way that makes it difficult to understand. Over time, the subjects reshaped it to make sense to them. Their versions became shorter and more linear, and details that didn’t make sense in the first place were left out. The ghosts became the enemy, or the allies, but usually remained a central feature of the tale. Many people interpreted them to be the undead, even though in the tale the word “ghost” identifies the name of the clan. The dying man is tended to. The seal hunters become fishermen. The river becomes a sea. The black substance becomes his soul escaping or a blood clot. After a year or so, the stories started to include new characters, totems, and ideas never present in the original, like the journey as a pilgrimage, or the death as a sacrifice.
Memory is imperfect, but also constantly changing. Not only do you filter your past through your present, but your memory is easily infected by social contagion. You incorporate the memories of others into your own head all the time. Studies suggest your memory is permeable, malleable, and evolving. It isn’t fixed and permanent, but more like a dream that pulls in information about what you are thinking about during the day and adds new details to the narrative. If you suppose it could have happened, you are far less likely to question yourself as to whether it did.
The shocking part of these studies is how easily memory gets tainted, how only a few iterations of an idea can rewrite your autobiography. Even stranger is how, as memories change, your confidence in them grows stronger. Considering the relentless bombardment of your thoughts and emotions by friends, family, and all media: How much of what you recall is accurate? How much of the patchwork is yours alone? What about the stories handed down through time or across a dinner table; what is the ratio of fiction to fact? Considering the misinformation effect not only requires you to be skeptical of eyewitness testimony and your own history, but it also means you can be more forgiving when someone is certain of something that is later revealed to be embellished or even complete fiction.
Consider the previous exercise when you falsely saw curtains in the list of things around a window. It took almost no effort to implant the memory because you were the one doing the implanting. Recognize the control you have over—wait, was it curtains?
33
Conformity
THE MISCONCEPTION:
You are a strong individual who doesn’t conform unless forced to.
THE TRUTH:
It takes little more than an authority figure or social pressure to get you to obey, because conformity is a survival instinct.
On April 9, 2004, a man calling himself Officer Scott called a McDonald’s in Mount Washington, Kentucky. He told Donna Jean Summers, the assistant manager who answered the phone, that there had been a report of theft and that Louise Ogborn was the suspect.
Ogborn, eighteen, worked at the McDonald’s in question, and the man on the other end of the line told Donna Jean Summers to take her into the restaurant’s office, lock the door, and strip her naked while another assistant manager watched. He then asked her to describe the naked teenager to him. This went on for more than an hour, until Summers told Officer Scott she had to return to the counter and continue her duties. He asked her if her fiancé could take over, and so she called him to the store. He arrived shortly after, took the phone, and then started following instructions. Officer Scott told him to tell Ogborn to dance, do jumping jacks, and stand on furniture in the room. He did. She did. Then Officer Scott’s requests became more sexual. He told Summers’s fiancé to make Ogborn sit in his lap and kiss him so he could smell her breath. When she resisted, Officer Scott told him to spank her naked bottom, which he did. More than three hours into the ordeal, Officer Scott eventually convinced Summers’s fiancé to force Ogborn to perform oral sex while he listened. He then asked for another man to take over, and when a maintenance worker was called in to take the phone, the worker asked what was going on. He was shocked and skeptical. Officer Scott hung up.
The call was one of more than seventy made over the course of four years by one man pretending to be a police officer. He called fast-food restaurants in thirty-two states and convinced people to shame themselves and others, sometimes in private, sometimes in front of customers. With each call he claimed to be working with the parent corporation, and sometimes he said he worked for the bosses of the individual franchises. He always claimed a crime had been committed. Often, he said investigators and other police officers were on their way. The employees dutifully did as he asked, disrobing, posing, and embarrassing themselves for his amusement. Police eventually captured David Stewart, a Florida prison security guard who had in his possession a calling card that was traced back to several fast-food restaurants, including one that had been hoaxed. Stewart went to court in 2006 but was acquitted. The jury said there wasn’t enough evidence to convict him. There were no more hoax phone calls after the trial.
What could have made so many people follow the commands of a person they had never met and who offered no proof that he was a police officer?
If I were to hand you a card with a single line on it, and then hand you another card with an identical line drawn near two others, one longer and one shorter, do you think you could match up the original to the copy? Could you tell which line in a group of three was the same length as the one on the first card?
You could. Just about anyone would be able to match up lines of equal length in just a few seconds. Now, what if you were part of a group trying to come to a consensus, and the majority of the people said a line that was clearly shorter than the original was the one that matched? How would you react?
In 1951, psychologist Solomon Asch performed an experiment in which he would get a group of people together and show them cards like the ones described above. He would then ask the group the same sort of questions. Without coercion, about 2 percent of people answered incorrectly. In the next run of the experiment, Asch added actors to the group who all agreed to answer his questions incorrectly. If he asked which line was the same, or longer, or shorter, or whatever, they would force one hapless subject to be alone in disagreement.
You probably think you would go against the grain and shake your head in disbelief. You think you might say to yourself, “How could these people be so stupid?” Well, I hate to break it to you, but the research says you would eventually break. In Asch’s experiments, 75 percent of the subjects caved in on at least one question. They looked at the lines, knew the answer everyone else was agreeing to was wrong, and went with it anyway. Not only did they conform without being pressured, but when questioned later they seemed oblivious to their own conformity. When the experimenter told them they had made an error, they came up with excuses as to why they made mistakes instead of blaming the others. Intelligent people just like you caved in, went with the group, and then seemed confused as to why.
Asch messed around with the conditions of the experiment, trying it with varying numbers of actors and unwitting subjects. He found that one or two actors had little effect, but three or more were all he needed to get a small percentage of people to start conforming. The percentage of people who conformed grew with the number of people who joined in consensus against them. Once the entire group other than the subject was replaced with actors, only 25 percent of his subjects answered every question correctly.
Most people, especially those in Western cultures, like to see themselves as individuals, as people who march to a different beat. You are probably the same sort of person. You value your individuality and see yourself as a nonconformist with unique taste, but ask yourself: How far does this nonconformity go? Do you live in an igloo made of boar tusks in the Arizona desert while refusing to drink the public water supply? Do you speak a language you and your sister created as children and lick strangers on the face during the closing credits of dollar-theater matinees? When other people applaud, do you clap your feet together and boo? To truly refuse to conform to the norms of your culture and the laws of the land would be a daunting exercise in futility. You may not agree with the zeitgeist, but you know conformity is part of the game of life. Chances are, you pick your battles and let a lot of things slide. If you travel to a foreign country, you look to others as guides on how to behave. When you visit someone else’s home, you do as that person does. In a college classroom you sit quietly and take notes. If you join a gym or start a new job, the first thing you do is look for clues as to how to behave. You shave your legs or your face. You wear deodorant. You conform.
As psychologist Noam Shpancer explains on his blog, “We are often not even aware when we are conforming. It is our home base, our default mode.” Shpancer says you conform because social acceptance is built into your brain. To thrive, you know you need allies. You get a better picture of the world when you can receive information from multiple sources. You need friends because outcasts are cut off from valuable resources. So when you are around others, you look for cues as to how to behave, and you use the information offered by your peers to make better decisions. When everyone you know tells you about an awesome app for your phone or a book you should read, it sways you. If all of your friends tell you to avoid a certain part of town or a brand of cheese, you take their advice. Conformity is a survival mechanism.
The most famous conformity experiment was performed by Stanley Milgram in 1963. He had people sit in a room and take commands from a scientist in a lab coat. He told them they would be teaching word pairs to another subject in the next room, and each time their partner got an answer wrong they were to deliver an electric shock. A control panel on a complicated-looking contraption clearly indicated the power of the shock. Switches along a single row were labeled with increasing voltages and a description. At the low end it read “slight shock.” In the middle the switch was labeled “intense shock.” At the end of the scale the switch read “XXX,” which implied death. The man in the lab coat would prompt the subject pressing the buttons to shock the partner in the next room. With each shock, screams emanated from next door. After the screams, the scientist in the lab coat asked the subject to increase the voltage. The screams would get louder, and eventually subjects could hear the guy in the other room pleading for his life and asking the psychologist to end the experiment. Most subjects asked if they could stop. They didn’t want to shock the poor man in the next room, but the scientist would urge them to continue, telling them not to worry. The scientist said things like “You have no other choice; you must go on” or “The experiment requires that you continue.” To everyone’s surprise, 65 percent of people could be prompted to go all the way to the switch right below “XXX.”

In reality, there were no shocks, and the other person was just an actor pretending to be in pain. Milgram’s experiment has been repeated many times with many variations. The percentage of people who go all the way can be dropped to zero just by removing the authority figure, or it can be raised above 90 percent by having someone else give the test while the subject has only to deliver the shocks. Again, with Milgram’s experiment there was no reward or punishment involved—just simple conformity.