Author: Sam Kepfield
“I remember being in a warm, soft space, floating free. Was that the womb?”
Memories of the womb; the few remaining Freudians would be ecstatic over that one. “In a sense, yes. And then?”
“I was in a room, with bright lights, and Doctor Crane was there. He said something that I can’t understand. He said ‘I created you.’ Is he my father?” She held the kitten up to her face, smiled as it batted a paw at a loose strand of hair.
Kelly frowned, thinking fast. “He created you, so he’s your father in a sense.” Again, not quite a lie, not quite the truth.
*God, you missed your calling, you should have been a lawyer, Lana*, she heard her mother saying.
“Are you my mother? Did you and Doctor Crane have sexual intercourse and conceive me?”
Kelly laughed in spite of herself, startling the kitten. Crane was nice enough, would be considered a catch by any woman who preferred her companionship to be crested and not cloven. Kelly had long ago made her choice in that matter, but didn’t think confusing Maria with that issue was useful at this point.
“No, I’m not your mother. Not biologically, at least. But I’m trying to do the same things as a mother would.”
“Like what?”
“Like teach you to survive in the world, and to be a good person who knows right from wrong. To watch out for yourself, but to help others when they need it. Among other things.”
“You are doing a good job, Doc — Alannah.” She giggled as the kitten playfully chewed on her finger. “Alannah?”
“Yes, Maria?”
“Alannah, what’s my purpose?”
Kelly couldn’t give an answer, so she distracted Maria with the kittens, and they stayed there the entire morning, letting the kittens run around the mini-forest chasing butterflies and grasshoppers, until it was time for lunch.
“You *were* planning on telling her the truth about herself, at some point, weren’t you?” Kelly asked Crane that afternoon in his office. Her New England accent was flintier than usual. “Tell me you were. Please.”
Crane swiveled in his chair, stared out the huge window at the Front Range, snow on the peaks in the April sun. “I hadn’t planned on it, no,” he said in a detached voice. “Interesting therapy method, by the way.”
“I used kitty therapy on an autistic girl years ago. It worked like a charm,” Kelly said, momentarily thrown off. She put the conversation back on track. “She’s asking questions, you know. Those neural connections in that fabulous RNA and DNA-based supercomputer that Derel designed are multiplying at an astonishing pace.”
“I expected that,” Crane said. His arm went rigid for a few seconds; he grimaced, then relaxed.
“And not just any questions,” Kelly went on, a trifle shaken by the spasm. “The big ones. Where did I come from? Who am I? What’s my purpose? What do I tell her?”
“It’s a real bag of nails, isn’t it? But you were doing fine out there.”
“I can’t put it off.”
“Why not?”
“Because,” Kelly said, her voice getting sharper, “first, I owe her the truth. I can’t ethically lie to a patient about a course of therapy or treatment.”
“She’s not a patient, Alannah. Not fully. Think of her as an experiment. The rules don’t apply — ”
“To a souped-up lab rat? Is that how you see her?”
Crane looked wounded. “I can assure you I don’t.”
“I hope so. She’s just as human as you or I. Which leads me to the second part — that I don’t know *what* her purpose is. Why *did* you create her, Des? Just to show that you could?”
“Of course not.” Crane grew defensive. “There are a number of functions Maria, or any other droid like her, could fulfill.”
“Yeah. Like cannon fodder for the military or law enforcement.”
Crane turned hard about in his chair. “You’ve been talking to Derel, haven’t you?”
“I don’t need to. It’s obvious. An endless supply of soldiers who never question orders, never hesitate, never get PTSD, perfect killing machines — ”
*No leaving a teenage girl at a graveside taking a folded flag* —
“Enough,” Crane said firmly, then took a deep breath. “Even if you were right, think of this. We live in a dangerous world, Alannah. When I was a child, the Soviet Union was the enemy, with thousands of nuclear bombs that would kill millions, and we could predict their moves with satellites and communications intercepts. Now it’s a bunch of ragtag medievalists armed with low-tech gadgets that kill a few at a time, but are much more demoralizing. The big problem — no HUMINT, or human intelligence. The Pentagon kicked out all of its Arabic specialists years ago, because they had homosexual tendencies. They’ve never really recovered. Arabists born here can’t quite understand the mindset, and there’s the danger that they might be discovered. But what if we could create the perfect mole? Create a HUMINT source, program it, immerse it in a culture, no strings, no family, no connections, no divided loyalties?”
“A charming rationalization, Des,” Kelly said coldly. “But you’re not answering the question. What’s her purpose? Is it superspy? A Robocop? Because any way you look at it, it means she’s ultimately disposable.”
“And you’re saying humans aren’t? History begs to differ with you, Doctor Kelly. Remember the Somme. Dresden. Hiroshima and Nagasaki. Stalingrad. The concentration camps, the gulags, the killing fields, millions dead for a Greater Good. To those in power, we’ve always been considered disposable, and occasionally they’re justified in that view. The civilians who died at Hiroshima were sacrificed to prevent an even greater slaughter had the Americans invaded.”
“But the people who died at Hiroshima weren’t *created* for that destiny,” Kelly shot back. “Or is the ultimate goal something not so noble? Maybe you’re not going to play field marshal. Maybe massa is more like it.”
Crane narrowed his eyes. “Just how committed are you to this project? You can leave anytime you want. I can find someone else.”
“I’m just as committed as you are. Maybe even more. But I don’t want to create a new race of slaves. You won’t call them human. So that makes it acceptable to treat them as the Other. And once everyone agrees on that idea, anything’s possible. Remember the plantations? Or maybe *Kristallnacht*?”
“It’s not going to come to that,” Crane said, his jaw set.
“What *does* it come to? A bunch of Marias on display in a mall, all for sale? Living mannequins you can take home and — then what? What do they do? Cook? Clean? *Fuck*?”
She sat back, and they stared at each other for a long moment. “I’m sorry,” Kelly said weakly after a while. “I’m getting emotional about her. She thought I was her mother this morning. I guess it kind of tripped some switch inside. I never…”
“I understand,” Crane said softly, steering the conversation away to safer waters. “She’s in her room. You’ve got her listening to music?”
“Yeah. Part of the touchy-feely side. I started her on some of the classics. Bach, Mozart, Beethoven, Tchaikovsky. I’ll get to more classics later.”
“Wagner?”
“Huh-uh. Bob Dylan. Miles Davis.”
She got up and left. But the questions she’d asked still hung there. She wasn’t about to leave the project. Not yet.
And she didn’t buy Crane’s explanation for creating Maria, either. Crane wasn’t the greater-good-of-humanity type. He was a cold realist. He’d created Maria for more selfish, more personal reasons. She just had to find out what those were.
7
“She’s getting to be a problem,” Crane told Danner later that evening on a secure link to the DARPA office in Arlington, Virginia.
“Your droid?”
“No. The droid is fine. She’s coming along better than expected. Kelly is taking a maternal interest in Maria.”
“Let her. She doesn’t have control over the program. You do. I do. When can you be ready for a trial run?”
“For a simple evade and elude, we could do it next week.”
“I’ll set it up.”
Danner signed off, leaving Crane sitting in his office alone in the dark, illuminated only by the monitors.
Standby subroutine all motor functions inoperative but mental functions continue running assimilating knowledge, glowing fluorescent molecules flashing green-blue-red-yellow in a warm organic gel encoding and retrieving bits and bytes of information, collecting and collating.
*Alannah, was I a little girl? Where did I come from?*
Search engines retrieve no data on her existence before a week ago, before the cold light cut into her nascent wakefulness. Cases of total amnesia not unknown, but normally linked with head trauma. Self-examination completed, cause not indicated. Alzheimer’s, Parkinson’s, Huntington’s, eponymous genetic syndromes, not indicated at her age of — what? Approximately twenty-five < *Twenty-six…* a mindwhisper *…I was twenty-six when I ceased* > years, but other symptoms of neurodegenerative disorders not indicated.
< new search >
*That’s how you treat them. That’s how you treat all living creatures*, the small fuzzy ones and the stronger ones, now scanning files subroutines ETHICS MORALS and she comes across a small file, linked to programming code, a short file with three simple laws: A robot may not injure a human being or, through inaction, allow a human being to come to harm. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
She picks up the cable, gently inserts it in her temple, closes her eyes and begins her quest.
< bypass spyware >
< spyware disabled >
< Search Source >…120886 results/sort — wiki: Asimov’s Three Laws of Robotics formulated 1942 by Isaac Asimov, science fiction author, forming a basic framework for robotic-themed works…
< Search term ROBOT >…35672 results/sort — wiki: an automatically guided machine, able to do tasks on its own. Another common characteristic is that by its appearance or movements, a robot often conveys a sense that it has intent or agency of its own orig. roboti orig. czech ref. Karel Capek R.U.R., see ref. golem, Hephaestus. Interrupt. Query —
< Search term NONHUMAN >…1,427,887 results/sort 1. not human. 2. not displaying the emotions, sympathies, intelligence, etc., of most human beings. 3. not intended for consumption by humans: nonhuman products such as soaps and detergents.
< Search term SUBHUMAN >…135,904 results/sort less than human. Can refer to several concepts: humanoid, artificial intelligence with performance less than a human being, last man in Nietzschean philosophy, slave, Untermenschen…
Redirect search terms jimcrow:pogrom:holocaust:rwanda:RUN
She begins to understand, microlights within her ceramsteel skull flashing winking.
8
“Alannah?”
“Yes, Maria?”
“Am I a robot?”