Asimov's Future History Volume 4

BOOK: Asimov's Future History Volume 4
10.94Mb size Format: txt, pdf, ePub

Daneel said, slowly and with emphasis, “I am functioning!”

“Come. If a squirrel is alive, or a bug, or a tree, or a blade of grass, why not you? I would never remember to say–or to think–that I am alive but that you are merely functioning, especially if I am to live for a while on Aurora, where I am to try not to make unnecessary distinctions between a robot and myself. Therefore, I tell you that we are both alive and I ask you to take my word for it.”

“I will do so, Partner Elijah.”

“And yet can we say that the ending of robotic life by the deliberate violent action of a human being is also ‘murder’? We might hesitate. If the crime is the same, the punishment should be the same, but would that be right? If the punishment of the murder of a human being is death, should one actually execute a human being who puts an end to a robot?”

“The punishment of a murderer is psychic-probing, Partner Elijah, followed by the construction of a new personality. It is the personal structure of the mind that has committed the crime, not the life of the body.”

“And what is the punishment on Aurora for putting a violent end to the functioning of a robot?”

“I do not know, Partner Elijah. Such an incident has never occurred on Aurora, as far as I know.”

“I suspect the punishment would not be psychic-probing,” said Baley. “How about ‘roboticide’?”

“Roboticide?”

“As the term used to describe the killing of a robot.”

Daneel said, “But what about the verb derived from the noun, Partner Elijah? One never says ‘to homicide’ and it would therefore not be proper to say ‘to roboticide.’”

“You’re right. You would have to say ‘to murder’ in each case.”

“But murder applies specifically to human beings. One does not murder an animal, for instance.”

Baley said, “True. And one does not murder even a human being by accident, only by deliberate intent. The more general term is ‘to kill.’ That applies to accidental death as well as to deliberate murder–and it applies to animals as well as human beings. Even a tree may be killed by disease, so why may not a robot be killed, eh, Daneel?”

“Human beings and other animals and plants as well, Partner Elijah, are all living things,” said Daneel. “A robot is a human artifact, as much as this viewer is. An artifact is ‘destroyed,’ ‘damaged,’ ‘demolished,’ and so on. It is never ‘killed.’”

“Nevertheless, Daneel, I shall say ‘killed.’ Jander Panell was killed.”

Daneel said, “Why should a difference in a word make any difference to the thing described?”

“‘That which we call a rose by any other name would smell as sweet.’ Is that it, Daneel?”

Daneel paused, then said, “I am not certain what is meant by the smell of a rose, but if a rose on Earth is the common flower that is called a rose on Aurora, and if by its ‘smell’ you mean a property that can be detected, sensed, or measured by human beings, then surely calling a rose by another sound-combination–and holding all else equal–would not affect the smell or any other of its intrinsic properties.”

“True. And yet changes in name do result in changes in perception where human beings are concerned.”

“I do not see why, Partner Elijah.”

“Because human beings are often illogical, Daneel. It is not an admirable characteristic.”

Baley sank deeper into his chair and fiddled with his viewer, allowing his mind, for a few minutes, to retreat into private thought. The discussion with Daneel was useful in itself, for while Baley played with the question of words, he managed to forget that he was in space, to forget that the ship was moving forward until it was far enough from the mass centers of the Solar System to make the Jump through hyperspace; to forget that he would soon be several million kilometers from Earth and, not long after that, several light-years from Earth.

More important, there were positive conclusions to be drawn. It was clear that Daneel’s talk about Aurorans making no distinction between robots and human beings was misleading. The Aurorans might virtuously remove the initial “B.,” the use of “boy” as a form of address, and the use of “it” as the customary pronoun, but from Daneel’s resistance to the use of the same word for the violent ends of a robot and of a human being (a resistance inherent in his programming which was, in turn, the natural consequence of Auroran assumptions about how Daneel ought to behave) one had to conclude that these were merely superficial changes. In essence, Aurorans were as firm as Earthmen in their belief that robots were machines that were infinitely inferior to human beings.

That meant that his formidable task of finding a useful resolution of the crisis (if that were possible at all) would not be hampered by at least one particular misperception of Auroran society.

Baley wondered if he ought to question Giskard, in order to confirm the conclusions he reached from his conversation with Daneel–and, without much hesitation, decided not to. Giskard’s simple and rather unsubtle mind would be of no use. He would “Yes, sir” and “No, sir” to the end. It would be like questioning a recording.

Well, then, Baley decided, he would continue with Daneel, who was at least capable of responding with something approaching subtlety.

He said, “Daneel, let us consider the case of Jander Panell, which I assume, from what you have said so far, is the first case of roboticide in the history of Aurora. The human being responsible–the killer–is, I take it, not known.”

“If,” said Daneel, “one assumes that a human being was responsible, then his identity is not known. In that, you are right, Partner Elijah.”

“What about the motive? Why was Jander Panell killed?”

“That, too, is not known.”

“But Jander Panell was a humaniform robot, one like yourself and not one like, for instance, R. Gis–I mean, Giskard.”

“That is so. Jander was a humaniform robot like myself.”

“Might it not be, then, that no case of roboticide was intended?”

“I do not understand, Partner Elijah.”

Baley said, a little impatiently, “Might not the killer have thought this Jander was a human being, that the intention was homicide, not roboticide?”

Slowly, Daneel shook his head. “Humaniform robots are quite like human beings in appearance, Partner Elijah, down to the hairs and pores in our skin. Our voices are thoroughly natural, we can go through the motions of eating, and so on. And yet, in our behavior there are noticeable differences. There may be fewer such differences with time and with refinement of technique, but as yet they are many. You–and other Earthmen not used to humaniform robots–may not easily note these differences, but Aurorans would. No Auroran would mistake Jander–or me–for a human being, not for a moment.”

“Might some Spacer, other than an Auroran, make the mistake?”

Daneel hesitated. “I do not think so. I do not speak from personal observation or from direct programmed knowledge, but I do have the programming to know that all the Spacer worlds are as intimately acquainted with robots as Aurora is–some, like Solaria, even more so–and I deduce, therefore, that no Spacer would miss the distinction between human and robot.”

“Are there humaniform robots on the other Spacer worlds?”

“No, Partner Elijah, they exist only on Aurora so far.”

“Then other Spacers would not be intimately acquainted with humaniform robots and might well miss the distinctions and mistake them for human beings.”

“I do not think that is likely. Even humaniform robots will behave in robotic fashion in certain definite ways that any Spacer would recognize.”

“And yet surely there are Spacers who are not as intelligent as most, not as experienced, not as mature. There are Spacer children, if nothing else, who would miss the distinction.”

“It is quite certain, Partner Elijah, that the–roboticide–was not committed by anyone unintelligent, inexperienced, or young. Completely certain.”

“We’re making eliminations. Good. If no Spacer would miss the distinction, what about an Earthman? Is it possible that–”

“Partner Elijah, when you arrive in Aurora, you will be the first Earthman to set foot on the planet since the period of original settlement was over. All Aurorans now alive were born on Aurora or, in a relatively few cases, on other Spacer worlds.”

“The first Earthman,” muttered Baley. “I am honored. Might not an Earthman be present on Aurora without the knowledge of Aurorans?”

“No!” said Daneel with simple certainty.

“Your knowledge, Daneel, might not be absolute.”

“No!” came the repetition, in tones precisely similar to the first.

“We conclude, then,” said Baley with a shrug, “that the roboticide was intended to be roboticide and nothing else.”

“That was the conclusion from the start.”

Baley said, “Those Aurorans who concluded this at the start had all the information to begin with. I am getting it now for the first time.”

“My remark, Partner Elijah, was not meant in any pejorative manner. I know better than to belittle your abilities.”

“Thank you, Daneel. I know there was no intended sneer in your remark.–You said just a while ago that the roboticide was not committed by anyone unintelligent, inexperienced, or young and that this is completely certain. Let us consider your remark–”

Baley knew that he was taking the long route. He had to. Considering his lack of understanding of Auroran ways and of their manner of thought, he could not afford to make assumptions and skip steps. If he were dealing with an intelligent human being in this way, that person would be likely to grow impatient and blurt out information–and consider Baley an idiot into the bargain. Daneel, however, as a robot, would follow Baley down the winding road with total patience.

That was one type of behavior that gave away Daneel as a robot, however humaniform he might be. An Auroran might be able to judge him a robot from a single answer to a single question. Daneel was right as to the subtle distinctions.

Baley said, “One might eliminate children, perhaps also most women, and many male adults by presuming that the method of roboticide involved great strength–that Jander’s head was perhaps crushed by a violent blow or that his chest was smashed inward. This would not, I imagine, be easy for anyone who was not a particularly large and strong human being.” From what Demachek had said on Earth, Baley knew that this was not the manner of the roboticide, but how was he to tell that Demachek herself had not been misled?

Daneel said, “It would not be possible at all for any human being.”

“Why not?”

“Surely, Partner Elijah, you are aware that the robotic skeleton is metallic in nature and much stronger than human bone. Our movements are more strongly powered, faster, and more delicately controlled. The Third Law of Robotics states: ‘A robot must protect its own existence.’ An assault by a human being could easily be fended off. The strongest human being could be immobilized. Nor is it likely that a robot can be caught unaware. We are always aware of human beings. We could not fulfill our functions otherwise.”

Baley said, “Come, now, Daneel. The Third Law states: ‘A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.’ The Second Law states: ‘A robot must obey the orders given it by a human being, except where such orders would conflict with the First Law.’ And the First Law states: ‘A robot may not injure a human being or, through inaction, allow a human being to come to harm.’ A human being could order a robot to destroy himself–and a robot would then use his own strength to smash his own skull. And if a human being attacked a robot, that robot could not fend off the attack without harming the human being, which would violate First Law.”

Daneel said, “You are, I suppose, thinking of Earth’s robots. On Aurora–or on any of the Spacer worlds–robots are regarded more highly than on Earth and are, in general, more complex, versatile, and valuable. The Third Law is distinctly stronger in comparison to the Second Law on Spacer worlds than it is on Earth. An order for self-destruction would be questioned and there would have to be a truly legitimate reason for it to be carried through–a clear and present danger. And in fending off an attack, the First Law would not be violated, for Auroran robots are deft enough to immobilize a human being without hurting him.”

“Suppose, though, that a human being maintained that, unless a robot destroyed himself, he–the human being–would be destroyed? Would not the robot then destroy himself?”

“An Auroran robot would surely question a mere statement to that effect. There would have to be clear evidence of the possible destruction of a human being.”

“Might not a human being be sufficiently subtle to so arrange matters in such a way as to make it seem to a robot that that human being was indeed in great danger? Is it the ingenuity that would be required that makes you eliminate the unintelligent, inexperienced, and young?”

And Daneel said, “No, Partner Elijah, it is not.”

“Is there an error in my reasoning?”

“None.”

“Then the error may lie in my assumption that he was physically damaged. He was not, in actual fact, physically damaged. Is that right?”

“Yes, Partner Elijah.”

(That meant Demachek had had her facts straight, Baley thought.)

“In that case, Daneel, Jander was mentally damaged. Roblock! Total and irreversible!”

“Roblock?”

“Short for robot-block, the permanent shutdown of the functioning of the positronic pathways.”

“We do not use the word ‘roblock’ on Aurora, Partner Elijah.”

“What do you say?”

“We say ‘mental freeze-out.’”

“Either way, it is the same phenomenon being described.”

“It might be wise, Partner Elijah, to use our expression or the Aurorans you speak to may not understand; conversation may be impeded. You stated a short while ago that different words make a difference.”

“Very well. I will say ‘freeze-out.’–Could such a thing happen spontaneously?”

“Yes, but the chances are infinitesimally small, roboticists say. As a humaniform robot, I can report that I have never myself experienced any effect that could even approach mental freeze-out.”

“Then one must assume that a human being deliberately set up a situation in which mental freeze-out would take place.”

