“A ROBOT THAT EVEN SHERRY WILL LOVE”
 
I first heard about Nursebot at a fall 2004 robotics conference where I spoke about what sociable robotics may augur—the sanctioning of “relationships” that make us feel connected although we are alone. Most of my colleagues responded by arguing that performance is the currency of all social relationships and that, far from being a bad thing, this is simply how things are.[14]
People are always performing for other people. Now the robots, too, will perform. The world will be richer for having a new cast of performers and a new set of possible performances. At one dinner, a small group took up my reticence with good-natured enthusiasm. They thought there was a robot, benign and helpful, that I would like. Some versions of it were being tested in the United States, some in Japan. This was the Nursebot, which can help elderly people in their homes, reminding them to take their medications on schedule and to eat regular meals. Some models can bring medicine or oxygen if needed.[15]
In an institutional setting, a hospital or nursing home, it learns the terrain. It knows patients’ schedules and accompanies them where they need to go. That awful, lonely scramble in nursing homes when seniors shuffle from appointment to appointment, the waiting around in hospitals for attendants to pick you up: those days would soon be at an end. Feeling dizzy in the bedroom and frightened because you had left your medication in the kitchen: those days were almost over. These researchers wanted to placate the critic in their midst. One said, “This is a robot even Sherry can love.” And indeed, the next day, I saw a video presentation about the find-your-way-around-the-hospital-bot, peppered with interviews of happy patients, most of them elderly.
Only a few months later, after a fall on icy steps in Harvard Square, I was myself being wheeled from one test to another on a hospital stretcher. My companions in this journey were a changing collection of male orderlies. They knew how much it hurt when they had to lift me off the gurney and onto the radiology table. They were solicitous and funny. I was told that I had a “lucky fracture.” While inconvenient and painful, it would heal with no aftereffects. The orderly who took me to the discharge station knew I had received good news and gave me a high five. The Nursebot might have been capable of the logistics, but I was glad that I was there with people. For me, this experience does not detract from the virtues of the robots that provide assistance to the housebound—robots that dispense medication, provide surveillance, check vital signs, and signal for help in an emergency—but it reminds me of their limitations. Getting me around the hospital was a job that a robot could do but that would have been delegated at a cost. Between human beings, simple things reach you. When it comes to care, there may be no pedestrian jobs. I was no longer sure that I could love a Nursebot.
Yet, this story does not lead to any simple conclusions. We are sorting out something complicated. Some elderly people tell me that there are kinds of attendance for which they would prefer a robot to a person. Some would rather that a robot bathed them; it would feel less invasive of their privacy. Giving a bath is not something the Nursebot is designed to do, but nursebots of the future might well be designed for such tasks. The director of one of the nursing homes I have studied said, “We do not become children as we age. But because dependency can look childlike, we too often treat the elderly as though this were the case.” Sensing the vulnerability of the elderly, sometimes nurses compensate with curtness; sometimes they do the opposite, using improbable terms of endearment—“sweetie” or “honey”—things said in an attempt at warmth but sometimes experienced as demeaning. The director has great hopes for robots because they may be “neutral.”
By 2006, after the Nursebot had been placed in several retirement facilities, reactions to it, mostly positive, were being posted to online discussion groups. One report from the Longwood Retirement Community in Oakmont, Pennsylvania, was sentimental. It said the robot was “[winning] the hearts of elderly folks there.”[16]
Another describes the robot, called Pearl, as “escort[ing] and schmooz[ing] the elderly” and quotes an older gentleman as saying, “We’re getting along beautifully, but I won’t say whether she’s my kind of girl.”[17]
Other comments reveal the ambivalence that I so often find in my conversations with seniors and their families. One woman applauds how Pearl can take over “household chores” but is concerned about the robot’s assuming “certain social functions.” She writes, “I am worried that as technology advances even further, robots like Pearl may become so good at what they do that humans can delegate elderly care entirely to robots. It is really worrying. When u get old, would u like robots to be taking care of you? If however, robots are designed to complement humans and not replace them, then I am all for it! =).”
Another writer begins by insisting, “The human touch of care and love, lets just leave it to humans,” but then proclaims that love from robot pets, to “accompany” the lonely, would be altogether acceptable. In this online forum, as is so often the case, discussions that begin with the idea of a robot pet that would serve practical purposes (it could “alert relatives or the police in case of trouble”) turn into musings about robots that might ward off loneliness, robots that are, in the end, more loveable than any pet could be: “They will never complain and they are allegiant [sic].” I am moved by the conflation of allegiance and compliance, both of which imply control over others and both of which are, for the elderly, in short supply.
In another online discussion, no one is prepared to be romantic about the importance of human care because they have seen how careless it can be.[18]
The comments are dark. “Robots,” says one writer, “will not abuse the elderly like some humans do in convalescent care facilities.” Another dismisses the sentiment that “nurses need to be human” with the thought that most nurses just try to distance themselves from their jobs—that’s “how they keep from going crazy.” One writer complains that a robot would never be able to tell whether an elderly person was “bothered, sad, really sad, or devastated and wanting to die,” but that the “precious” people who could “are scarcely around.”
I find this discussion of Nursebot typical of conversations about robots and the elderly. It is among people who feel they have few moves left. There is a substantive question to be discussed: Why give objects that don’t understand a life to those who are trying to make sense of their own? But it is almost impossible to discuss this question because of the frame we have built around it—assuming that it has already been decided, irrevocably, that we have few resources to offer the elderly. With this framing, the robots are inevitable. We declare ourselves overwhelmed and lose a creative relationship to ourselves and our future. We learn a deference to what technology offers because we see ourselves as depleted. We give up on ourselves. From this perspective, it really doesn’t matter if I or anyone else can love Nursebot. If it can be made to do a job, it will be there.
To the objection that a robot can only seem to care or understand, it has become commonplace to get the reply that people, too, may only seem to care or understand. Or, as a recent New York Times article on Paro and other “caring machines” puts it, “Who among us, after all, has not feigned interest in another? Or abruptly switched off their affections, for that matter?” Here, the conversation about the value of “caring machines” is deflected with the idea that “seeming” or “pretending” behavior long predates robots. So the problem is not what we are asking machines to do, because people have always behaved like machines. The article continues, “In any case, the question, some artificial intelligence aficionados say, is not whether to avoid the feelings that friendly machines evoke in us, but to figure out how to process them.” An AI expert claims that humans “as a species” have to learn to deal with “synthetic emotions,” a way to describe the performances of emotion that come from objects we have made.[19]
For him, the production of synthetic emotion is taken as a given. And given that we are going to produce it, we need to adapt to it. The circle is complete. The only way to break the circle is to reframe the matter. One might say that people can pretend to care; a robot cannot care. So a robot cannot pretend because it can only pretend.
DO ROBOTS CURE CONSCIENCE?
 
When I first began studying people and computers, I saw programmers relating one-to-one with their machines, and it was clear that they felt intimately connected. The computer’s reactivity and interactivity—it seemed an almost-mind—made them feel they had “company,” even as they wrote code. Over time, that sense of connection became “democratized.” Programs became opaque: when we are at our computers, most of us deal only with surfaces. We summon screen icons to act as agents. We are pleased to lose track of the mechanisms behind them and take them “at interface value.” But as we summon them to life, our programs come to seem almost companions. Now, “almost” has almost left the equation. Online agents and sociable robots are explicitly designed to convince us that they are adequate companions.
Predictably, our emotional involvement ramps up. And we find ourselves comforted by things that mimic care and by the “emotions” of objects that have none. We put robots on a terrain of meaning, but they don’t know what we mean. And they don’t mean anything at all. When a robot’s program cues “disgust,” its face will look, in human terms, disgusted. These are “emotions” only for show. What if we start to see them as “real enough” for our purposes? And moral questions come up as robotic companions not only “cure” the loneliness of seniors but assuage the regrets of their families.
In the spring of 2009, I presented the case of robotic elder care to a class of Harvard undergraduates. Their professor, political theorist Michael Sandel, was surprised by how easily his students took to this new idea. Sandel asked them to think of a nursing home resident who felt comforted by Paro and then to put themselves in the place of her children, who might feel that their responsibility to their mother had been lessened, or even discharged, because a robot “had it covered.” Do plans to provide companion robots to the elderly make us less likely to look for other solutions for their care?
As Sandel tried to get his class to see how the promise of robotic companionship could lead to moral complacency, I thought about Tim, who took comfort in how much his mother enjoyed talking to Paro. Tim said it made “walk[ing] out that door” so much easier when he visited her at the nursing home.
In the short term, Tim’s case may look as though it charts a positive development. An older person seems content; a child feels less guilty. But in the long term, do we really want to make it easier for children to leave their parents? Does the “feel-good moment” provided by the robot deceive people into feeling less need to visit? Does it deceive the elderly into feeling less alone as they chat with robots about things they once would have talked through with their children? If you practice sharing “feelings” with robot “creatures,” you become accustomed to the reduced “emotional” range that machines can offer. As we learn to get the “most” out of robots, we may lower our expectations of all relationships, including those with people. In the process, we betray ourselves.
All of these things came up in Sandel’s class. But in the main, his students were positive as they worked through his thought experiment. In the hypothetical case of mother, child, and robot, they took three things as givens, repeated as mantras. First, the child has to leave his mother. Second, it is better to leave one’s mother content. Third, children should do whatever it takes to make a mother happy.
I left the class sobered, thinking of the fifth graders who, surrounded by a gaggle of peers talking about robots as babysitters and caretakers for their grandparents, began to ask, “Don’t we have people for these jobs?” I think of how little resistance this generation will offer to the placement of robots in nursing homes. And it was during that very spring that, fresh from his triumphant sale of a thousand Paros to the Danish government, their inventor came to MIT to announce that he was opening up shop in the United States.
CHAPTER 7
 
Communion
 
A handsome twenty-six-year-old, Rich, in dress shirt and tie, comes to call on Kismet. Rich is being taped with Kismet as part of a study to determine how well the robot manages adult “conversation.” Rich sits close to Kismet, his face directly across from the robot. He is not necessarily expecting much and engages in a spirit of good humor and curiosity.
Rich: I like you Kismet. You’re a pretty funny person.
Kismet: [nods and smiles in assent and recognition]
Rich: Do you laugh at all? I laugh a lot.
 
At first, the conversation between Rich and Kismet shows a bit of the ELIZA effect: Rich clearly wants to put the robot in its best light. Like the children who devote themselves to getting Kismet to say their names, Rich shows Kismet the courtesy of bending to what it does best. Rich seems to play at “gaming” the program, ramping up the illusion to the point that he can imagine believing it.
But with the emotionally expressive Kismet, it is easy for Rich to find moments when he senses the possibility of “more.” They can pass quickly, and this “more” is ill defined. But one moment, Rich plays at a conversation with Kismet, and the next, he is swept up in something that starts to feel real. He begins to talk to Kismet about his girlfriend Carol, and quickly things get personal. Rich tells Kismet that his girlfriend enjoys his laughter and that Rich tries not to laugh at her. When Kismet laughs and seems interested, Rich laughs as well and warms up: “Okay. You’re adorable. Who are you? What are you?”
