Estelle responded to this emotionally charged situation with depression and a search for comfort food. Other children who faced a disappointing conversation with Kismet responded with aggression. When Kismet began an animated conversation that Edward, six, could not understand, he shoved objects into Kismet's mouth—a metal pin, a pencil, a toy caterpillar—things Edward found in the robotics laboratory. But at no point did Edward disengage from Kismet. He would not give up his chance for Kismet's recognition.

The important question here is not about the risks of broken robots. Rather, we should ask, “Emotionally, what positive thing would we have given to these children if the robots had been in top form?” Why do we propose machine companionship to children in the first place? For a lonely child, a conversational robot is a guarantee against rejection, a place to entrust confidences. But what children really need is not the guarantee that an inanimate object will simulate acceptance. They need relationships that will teach them real mutuality, caring, and empathy.

So, the problem doesn't start when the machine breaks down. Children are not well served even when the robots are working perfectly. In the case of a robot babysitter, you already have a problem when you have to explain to a child why there isn't a person available for the job.

Treating Machines as People; Treating People as Machines

In all of this, an irony emerges: Even as we treat machines as if they were almost human, we develop habits that have us treating human beings as almost-machines. To take a simple example, we regularly put people "on pause" in the middle of a conversation in order to check our phones. And when we talk to people who are not paying attention to us, it is a kind of preparation for talking to uncomprehending machines. When people give us less, talking to machines doesn't seem as much of a downgrade.

At a panel on “cyberetiquette,” I was onstage with a technology reporter and two “advice and manners” columnists. There was general agreement among the panelists on most matters: No texting at family dinners. No texting at restaurants. Don't bring your laptop to your children's sporting events, no matter how tempting.

And then came this question from the audience: A woman said that as a working mother she had very little time to talk to her friends, to email, to text, to keep up. “Actually,” she confessed, “the only time I have is at night, after I'm off work and before I go home, when I go family shopping at Trader Joe's. But the cashier, the guy at the checkout counter, he wants to talk. I just want to be on my phone, into my texts and Facebook. Do I have the right to just ignore him?” The two manners experts went first. Each said a version of the same thing: The man who does the checkout has a job to do. The woman who asked the question has a right to privacy and to her texting as he provides his service.

I listened uncomfortably. I thought of all the years I went shopping with my grandmother as I grew up and all the relationships she had with tradespeople at every store: the baker, the fishmonger, the fruit man, the grocery man (for this is what we called them). These days, we all know that the job the man at the checkout counter does could be done by a machine. In fact, down the street at another supermarket, it is done by a machine that automatically scans your groceries. And so I shared this thought: Until a machine replaces the man, surely he summons in us the recognition and respect you show a person. Sharing a few words at the checkout may make this man feel that in his job, this job that could be done by a machine, he is still seen as a human being.

This was not what the audience and my fellow panelists wanted to hear. As I took stock of their cool reaction to what I said, I saw a new symmetry: We want more from technology and less from each other. What once would have seemed like “friendly service” at a community market had become an inconvenience that keeps us from our phones.

It used to be that we imagined our mobile phones were there so that we could talk to each other. Now we want our mobile phones to talk to us. That's what the new commercials for Siri are really about: fantasies of these new conversations and a kind of tutelage in what they might sound like. We are at a moment of temptation, ready to turn to machines for companionship even as we seem pained or inconvenienced to engage with each other in settings as simple as a grocery store. We want technology to step up as we ask people to step back.

People are lonely and fear intimacy, and robots seem ready to hand. And we are ready for their company if we forget what intimacy is. And having nothing to forget, our children learn new rules for when it is appropriate to talk to a machine.

Stephanie is forty, a real estate agent in Rhode Island. Her ten-year-old daughter, Tara, is a perfectionist, always the “good girl,” sensitive to any suggestion of criticism. Recently, she has begun to talk to Siri. It is not surprising that children like to talk to Siri. There is just enough inventiveness in Siri's responses to make children feel that someone might be listening. And if children are afraid of judgment, Siri is safe. So Tara expresses anger to Siri that she doesn't show to her parents or friends—with them she plays the part of a “perfect child.” Stephanie overhears her daughter yelling at Siri and says, “She vents to Siri. She starts to talk but then becomes enraged.”

Stephanie wonders if this is "perhaps a good thing, certainly a more honest conversation" than Tara is having with others in her life. It's a thought worth looking at more closely. It is surely positive for Tara to discover feelings that she censors for other audiences. But talking to Siri leaves Tara vulnerable. She may get the idea that her feelings are something that people cannot handle. She may persist in her current idea that pretend perfection is all other people want from her or can accept from her. Instead of learning that people can value how she really feels, Tara is learning that it is easier not to deal with people at all.

If Tara can “be herself” only with a robot, she may grow up believing that only an object can tolerate her truth. What Tara is doing is not “training” for relating to people. For that, Tara needs to learn that you can attach to people with trust, make some mistakes, and risk open conversations. Her talks with the inanimate are taking her in another direction: to a world without risk and without caring.

Automated Psychotherapy

We create machines that seem human enough that they tempt us into conversation and then we treat them as though they can do the things humans do. This is the explicit strategy of a research group at MIT that is trying to build an automated psychotherapist by "crowdsourcing" collective emotional intelligence. How does this work? Imagine that a young man enters a brief (one- to three-sentence) description of a stressful situation or painful emotion into a computer program. In response, the program divides up the tasks of therapy among "crowd workers." The only requirement to be employed as a crowd worker is a command of basic English.

The authors of the program say they developed it because the conversations of psychotherapy are a good thing but are too expensive to be available to everyone who needs them. But in what sense is this system providing conversation? One worker sends a quick "empathic" response. Another checks if the problem statement distorts reality and may encourage a reframing of the problem. Or a reappraisal of the situation. These, too, are brief, no more than four sentences long. There are people in the system, but you can't talk to them. Each crowd worker is simply given an isolated piece of a puzzle to solve. And indeed, the authors of the program hope that someday the entire process—already a well-oiled machine—will be fully automated and you won't need people in the loop at all, not even piecemeal.

This automated psychotherapist, Tara's conversations with Siri, and the psychiatrist who looks forward to the day when a "smarter" Siri could take over his job say a lot about our cultural moment. Missing in all of them is the notion that, in psychotherapy, conversation cures because of the relationship with the therapist. In that encounter, what therapist and patient share is that they both live human lives. All of us were once children, small and dependent. We all grow up and face decisions about intimacy, generativity, work, and life purpose. We face losses. We consider our mortality. We ask ourselves what legacy we want to leave to a next generation. When we run into trouble with these things—and that kind of trouble is a natural part of every life—that is something a human being would know how to talk to us about. Yet as we become increasingly willing to discuss these things with machines, we prepare ourselves, as a culture, for artificial psychotherapists and children laying out their troubles to their iPhones.

When I voice my misgivings about pursuing such conversations, I often get the reaction “If people say they would be happy talking to a robot, if they want a friend they can never disappoint, if they don't want to face the embarrassment or vulnerability of telling their story to a person, why do you care?” But why not turn this question around and ask, “Why don't we all care?” Why don't we all care that when we pursue these conversations, we chase after a fantasy? Why don't we think we deserve more? Don't we think we can have more?

In part, we convince ourselves that we don't need more—that we're comfortable with what machines provide. And then we begin to see a life in which we never fear judgment or embarrassment or vulnerability as perhaps a good thing. Perhaps what machine talk provides is progress—on the path toward a better way of being in the world? Perhaps these machine "conversations" are not simply better than nothing but better than anything?

There Are No People for These Jobs

A cover story in Wired magazine, "Better than Human," celebrated both the inevitability and the advantages of robots substituting for people in every domain of life. Its premise: Whenever robots take over a human function, the next thing that people get to do is a more human thing. The story was authored by Kevin Kelly, a self-declared techno-utopian, but his argument echoes how I've found people talking about this subject for decades. The argument has two parts. First, robots make us more human by increasing our relational options because now we get to relate to them, considered as a new "species."

Second, whatever people do, if a robot can take over that role, it was, by definition, not specifically human. And over time, this has come to include the roles of conversation, companionship, and caretaking. We redefine what is human by what technology can't do. But as Alan Turing put it, computer conversation is "an imitation game." We declare computers intelligent if they can fool us into thinking they are people. But that doesn't mean they are.

I work at one of the world's great scientific and engineering institutions. This means that over the years, some of my most brilliant colleagues and students have worked on the problem of robot conversation and companionship. One of my students used his own two-year-old daughter's voice as the voice of My Real Baby, a robot doll that was advertised as so responsive it could teach your child socialization skills. More recently, another student developed an artificial dialogue partner with whom you could practice job interviews.

At MIT, researchers imagine sociable robots—when improved—as teachers, home assistants, best friends to the lonely, both young and old. But particularly to the old. With the old, the necessity for robots is taken as self-evident. Because of demography, roboticists explain, "there are no people for these jobs."

The trend line is clear: too many older people, not enough younger ones to take care of them. This is why, roboticists say, they need to produce "caretaker machines" or, as they are sometimes called, "caring machines."

In fairness, it's not only roboticists who talk this way. In the past twenty years, the years in which I've been studying sociable robotics, I've heard echoes of “There are no people for these jobs” in conversations with people who are not in the robot business at all—carpenters, lawyers, doctors, plumbers, schoolteachers, and office workers. When they say this, they often suggest that the people who are available for “these jobs” are not the right people. They might steal. They might be inept or even abusive. Machines would be less risky. People say things like, “I would rather have a robot take care of my mother than a high school dropout. I know who works in those nursing homes.” Or, “I would rather have a robot take care of my child than a teenager at some day-care center who really doesn't know what she's doing.”

So what are we talking about when we talk about conversations with machines? We are talking about our fears of each other, our disappointments with each other. Our lack of community. Our lack of time. People go straight from voicing reservations about a health-care worker who didn't finish high school to a dream of inventing a robot to care for them, just in time. Again, we live at the robotic moment, not because the robots are ready for us, but because we are counting on them.

One sixteen-year-old considered having a robot as a friend and said it wasn't for her, but thought she understood at least part of the appeal:

There are some people who have tried to make friends and stuff like that, but they've fallen through so badly that they give up. So when they hear this idea about robots being made to be companions, well, it's not going to be like a human and have its own mind to walk away or ever leave you or anything like that.

Relationship-wise, you're not going to be afraid of a robot cheating on you, because it's a robot. It's programmed to stay with you forever. So if someone heard the idea of this and they had past relationships where they'd always been cheated on and left, they're going to decide to go with the robot idea because they know that nothing bad is going to happen from it.
