Shelter

by Susan Palwick


    He was her touchstone, the tale to which all else was footnote. Only later would she realize how firmly every significant event in the wider world was, for her, connected to some milestone or crisis of her son's. The morning that one-year-old Nicholas first correctly placed a round block in a shape sorter, ScoopNet reported that the first brainwipe sentence had been handed down for a convicted murderer. Meredith's pride in her son's accomplishment was inseparable from her relief that he was turning out so normally, that like millions of babies before him, he had been granted the everyday miracle of learning to know a circle from a square. He'd escaped CV for good; he would never need to have his brain scourged with the modified virus.

    The day Mexico joined Africa in granting citizenship to AIs, Nicholas yelled, "Nose!" with one hand on his own nose and one hand on Meredith's. He was a person; he had been born from a human womb, had a human nose, used human language to identify it and a human brain to recognize the abstract within the particular.

    Nicholas built his first block tower during Zephyr's first national netcast. Meredith found her son's performance infinitely preferable to the crazed gyrations of Zephyr's bots as she herded them through a barbwire labyrinth.

    Nicholas said his first two-word sentence, "More juice," the morning Kevin went to work for MacroCorp, designing AI-enhanced buildings. Meredith and Kevin had had a series of exceptionally tense discussions about the decision. "Kevin, you don't need to do this! So what if we don't have my paycheck anymore? I have trust funds! We can use them! You shouldn't do work you don't believe in and you shouldn't feel you have to work for my family."

    "Meredith, I'm an architect. Furthermore, I'm a damned good architect. MacroCorp's lucky to have me, and don't think they don't know it. And the work's interesting and I need to develop these skills, because AIs are going to be all over the place."

    "I thought you didn't like AIs! You never used to!"

    "Meredith, I'm an architect. This is what buildings are now. This is what they do. I'm not turning soulfreak on you. I'm designing buildings, all right? Just like our house. Our house is AI equipped, as you conveniently seem to keep forgetting."

    "Without social programming or autonomous individuality."

    "So what? You still have a smart machine making your coffee in the morning, even if the coffeemaker doesn't chat with you about the weather while it grinds the beans. So unless you want to try to find me a job designing thatched huts, don't yell at me for doing the work I've been trained for!"

    After their volley of furious speech, what a simple blessing it was for her to hear Nicholas stringing two words together to quench his simple thirst. No AI in the world would ever ask for apple juice.

    If his first two-word sentence was a moment of grace in a difficult morning, his first three-word one–"Scary monsters, Mommy!"–would later seem an unheeded omen. He had woken up screaming, bolt upright and eyes wide with terror, unable to recognize Meredith for several minutes. Over breakfast, he delivered his three-word synopsis through a face smeared with oatmeal.

    "Poor baby," Meredith said, and hugged him. "I know. It was scary for me too, when you yelled like that. But the monsters are all gone. They aren't here anymore. Did you dream about Ashputtle and Marzipan, Nicky? But they don't live here anymore. You don't have to be afraid."

    He took a long nap that day, no doubt because he'd woken up earlier than usual, and Meredith, unconcerned–all kids had nightmares–spent the time catching up on news. For once, she curled up in front of the screen in the living room, rather than just having the audio follow her around the house. Nicholas's nightmare had worn her out too, and she wanted to sit down.

    The top story was on the increasing popularity of brain wiping, both as a mandated therapy for criminals and as a voluntary procedure. Camille, a manic-depressive whose doctors had never managed to stabilize her medication, hoped that brainwiping would finally make her life normal again. "I'm so tired of up and down and up and down–and when I'm down I want to die and when I'm up I hear voices and the middle lasts five minutes. Five seconds. It's there and then it's gone, I blink and I miss it. I'm tired! I'm willing to try anything!"

    The reporter reminded her that she'd lose her entire life, all her memories, the ability even to feed herself with a spoon. "I can relearn all that," Camille said, tears welling on camera. "I can relearn what I need to know to live. I've already lost the people who loved me, because they couldn't stand being around me like this. I've never been able to hold down a job. I'm sick of hospitals, do you understand? I want a home! I want to be a normal human being again! Do you understand? Do you? Do you understand?"

    After a bland, optimistic closing quote from Camille's physician, the story moved to a cautionary interview with Holly O'Riley, a leading civil rights attorney who specialized in defending the mentally ill. "For every twenty violent criminals who have successfully been resocialized after this procedure," O'Riley said, staring at the camera grimly, "one has become permanently disabled: unable to grasp cause-and-effect, unable to tell time, unable to use the first-person pronoun because so much sense of self has been lost. Let me introduce you to Luke."

    Luke was a drooling wreck, a forty-five-year-old man with the mind of a toddler. He had to wear diapers, still crawled, and appeared to be incapable of learning to place a round block in a round hole; the news clip showed him struggling with the same shape sorter Nicholas had mastered. Before brainwiping, Luke had been a skilled carpenter and chess grand master facing life without parole for the brutal rape and murder of twin twelve-year-old girls. "What brainwiping has done to Luke," Holly O'Riley said, "is arguably as hideous as what he did to his victims. True, he had a choice: he chose the procedure, rather than being sentenced to it, and he knew the risks. But as a society, we need to think long and hard about the danger of placing other people in Luke's terrible position."

    Meredith, repelled, turned off the TV. As far as she was concerned, Luke looked like a crawling advertisement for capital punishment. How could anyone feel sorry for him? O'Riley had chosen the wrong poster child; Camille, who so desperately wanted the procedure, was infinitely more sympathetic. And none of it had anything to do with Meredith. She and Nicholas were both done with CV for good. The worst thing they had to worry about now was a few nightmares.

    The nightmares continued, though, growing gradually worse. It took Meredith a while to become concerned, since Nicholas developed normally otherwise. At two years and three months, he first ate an entire meal with a spoon; that same day, Canada, Sweden, and the Netherlands joined Mexico and Africa in the AI emancipation treaty. Three months later, as Nicholas opened his bedroom door by himself for the first time, Congress proposed the "born, not built" amendment to the Constitution, granting personhood—and citizenship—only to genetic, biological humans, to online entities constructed from the memories of genetic, biological humans, and to corporations controlled and staffed by entities defined as persons under the first two clauses. The proposal was, ironically, applauded by both Luddites and AI manufacturers, and vociferously opposed by AI activists.

    Meredith listened to Preston testifying at the California ratification hearings as she coached Nicholas through the momentous task of learning to put on his own shoes. He'd had not one but two nightmares the previous night, waking screaming, his sheets wet with sweat and urine. Meredith had thought bleakly of Hortense. Well, she'd atone for her insensitivity to the old woman by loving Nicholas. "Nicky, what is it? What did you dream?"

    "Scary monsters, Mommy!" All kids have nightmares, Meredith told herself, but she and Nicholas were both exhausted today, and listening to her father speaking calmly in favor of ratification soothed her.

    "There is no doubt that AIs are extremely sophisticated machines," Preston said, "but they are machines nonetheless. Even if the most advanced such machines can be said to possess autonomous individuality, that does not make them human, any more than cats or cows or crickets are human. We care for our pets and our livestock, after all, but we do not grant them citizenship. Citizenship is a human term, meaningful only within human society, as irrelevant to AIs as it would be to cats or cows."

    Meredith, already anticipating the rebuttal, winced. What PR idiot had written that speech? Surely not Jack! The analogy to organic life would only make the soulfreaks more likely to oppose ratification; any sensible argument would have pointed out that AIs weren't organic at all, that they were outside ecology rather than part of it. That was what "born, not built" meant.

    Meanwhile, Nicholas had put his left shoe on his right foot, and was crying because it didn't feel right. "No, Nicky, that's the wrong foot—try it on the other one, good boy, there you go!" She led him through the intricate task of fastening the Velcro straps while the moderator introduced the next speaker in the debate, eminent AI researcher Dan Willem.

    Dan Willem. Dan. Meredith blinked. Dan had been the head of Raji's AI lab. That meant he took grant money from MacroCorp, didn't it? But, no, the AI lab hadn't existed since Raji died. Willem was now arguing against the people who'd once supported his research.

    As Raji would have, surely, if he'd still been alive. Meredith wiped away tears, settled Nicky down with some blocks, and sat to watch the debate.

    After gently charging Preston with hypocrisy, since he was no longer organically human, either, Dan launched into an extremely odd comparison of AI emancipation with the Underground Railroad and the animal rights movement. "No," Meredith said aloud, "the slaves were already people, which is why other people didn't have the right to enslave them. And animals weren't created by humans to be useful to humans, which is why we can't use them like machines." She felt as if she were reliving that disastrous conversation with Zephyr at Cyberjus. That had been the last time she saw Raji. She shook her head to clear it, and said to the television, "Idiot." Raji had thought very highly of Dan. For Raji's sake, she wanted him to be more intelligent than this.

    Nicky looked up from his blocks. "What, Mommy?"

    "Nothing, honey. I'm sorry. I was talking to that man on the TV." She added silently, And animal rights are important because we all share the earth; we're all part of the food chain. AIs aren't part of the food chain at all. Not even Raji would have claimed that. You couldn't make ecological arguments for AIs.

    Food chain. Food. It was one o'clock already. "Want your lunch now, Nicky?"

    "Sticky butter and red jelly!"

    While Meredith made Nicholas's sandwich, Dan sorrowfully accused Preston of supporting the amendment only because of the money MacroCorp would lose if AIs were no longer commercial properties. "Aside from the military market, which you've very virtuously avoided, MacroCorp has a virtual global monopoly on AI technology. If your company couldn't sell AI systems, Mr. Walford—if they were legally acknowledged not simply as autonomous individuals, but as sentient beings who couldn't be traded on the market—MacroCorp would go broke, and you'd lose your computer space. Do you really care about sentience, or just about your own survival?"

    Meredith shook her head in annoyance. MacroCorp had gotten into the AI business because Preston did care about sentience. AIs had been designed to help people after the CV pandemic; didn't Dan know that? Preston said only, "May I remind my esteemed opponent that MacroCorp is a global, diversified conglomerate? AI technology represents only a small percentage of our business; the loss of that business would not even come close to hurting the corporation." Meredith smiled. Even if AIs were declared persons, MacroCorp's lawyers would get around the slavery issue anyhow. Instead of selling AIs, they'd demand salaries for them or something, but of course MacroCorp would still get the money. What would an AI do with a paycheck? The whole issue was ludicrous.

    Three months later, while the ratification hearings were still under way, Nicholas had three nightmares in one night. The next morning over breakfast, Meredith and Kevin heard on a newscast that Zephyr Expanding Cosmos had been arrested for smuggling stolen AI hardware into Mexico.

    Meredith shook her head. "The woman's crazy."

    "Mmmm. Well, she's an odd one, that's for sure. What were Nicky's bad dreams about?"

    "Monsters," Meredith said, around a bite of bagel. Nicholas was in bed, catching up on sleep after his rough night.

    "Don't you think Nicholas dreams about monsters an awful lot? Merry, maybe we should take him to a doctor or something."

    "He's fine. All kids have bad dreams. He'll grow out of it." Merry felt anxiety coiling in her stomach, and remembered Honoli saying that anxiety could trigger OCD. Was Nicky anxious too? "Hey, I heard that Massachusetts already ratified. That's great news, isn't it?"

    Kevin scowled. "I don't care about the amendment. It's stupid. I'm worried about Nicholas."

    "He's fine, Kevin!"

    "He's not fme. Waking up screaming three times a night isn't fine. Maybe it's that brain thing he has. Maybe he needs meds."

    Meredith remembered how Constance had wanted to medicate her out of her passion for cleaning after Raji died. But the cleaning had been better therapy than any pill would have been. "And maybe he's working through something he needs to work through. Speaking of which, don't you have to go to work?"

    "It's been months," Kevin said, standing up. "He hasn't worked through it yet. I think you should talk to somebody, that's all. So we can help him. It's cruel to watch him suffer."

    Meredith felt a pang of guilt; Kevin was right. "Okay, I'll talk to somebody. Relax. Have a nice day. I'll see you later."

    He left, and Meredith, with a sigh, leaned back into her chair. The newscast was still droning. There'd been a new outbreak of CV in China: five thousand dead. Colombia was in the throes of a new civil war. The House and Senate had passed a bill making brainwiping mandatory for anyone convicted of rape, murder, or terrorism,
