Asimov's Future History Volume 1

Hart cleared his throat and said, “There seems no doubt that the robot can perform certain routine tasks with adequate competence. I have gone over these, for instance, just before coming in and there is very little to find fault with.”

He picked up a long sheet of printing, some three times as long as the average book page. It was a sheet of galley proof, designed to be corrected by authors before the type was set up in page form. Along both of the wide margins of the galley were proofmarks, neat and superbly legible. Occasionally, a word of print was crossed out and a new word substituted in the margin in characters so fine and regular it might easily have been print itself. Some of the corrections were blue to indicate the original mistake had been the author’s, a few in red, where the printer had been wrong.

“Actually,” said Lanning, “there is less than very little to find fault with. I should say there is nothing at all to find fault with, Dr. Hart. I’m sure the corrections are perfect, insofar as the original manuscript was. If the manuscript against which this galley was corrected was at fault in a matter of fact rather than of English, the robot is not competent to correct it.”

“We accept that. However, the robot corrected word order on occasion and I don’t think the rules of English are sufficiently hidebound for us to be sure that in each case the robot’s choice was the correct one.”

“Easy’s positronic brain,” said Lanning, showing large teeth as he smiled, “has been molded by the contents of all the standard works on the subject. I’m sure you cannot point to a case where the robot’s choice was definitely the incorrect one.”

Professor Minott looked up from the graph he still held. “The question in my mind, Dr. Lanning, is why we need a robot at all, with all the difficulties in public relations that would entail. The science of automation has surely reached the point where your company could design a machine, an ordinary computer of a type known and accepted by the public, that would correct galleys.”

“I am sure we could,” said Lanning stiffly, “but such a machine would require that the galleys be translated into special symbols or, at the least, transcribed on tapes. Any corrections would emerge in symbols. You would need to keep men employed translating words to symbols, symbols to words. Furthermore, such a computer could do no other job. It couldn’t prepare the graph you hold in your hand, for instance.”

Minott grunted.

 

Lanning went on. “The hallmark of the positronic robot is its flexibility. It can do a number of jobs. It is designed like a man so that it can use all the tools and machines that have, after all, been designed to be used by a man. It can talk to you and you can talk to it. You can actually reason with it up to a point. Compared to even a simple robot, an ordinary computer with a non-positronic brain is only a heavy adding machine.”

Goodfellow looked up and said, “If we all talk and reason with the robot, what are the chances of our confusing it? I suppose it doesn’t have the capability of absorbing an infinite amount of data.”

“No, it hasn’t. But it should last five years with ordinary use. It will know when it will require clearing, and the company will do the job without charge.”

“The company will?”

“Yes. The company reserves the right to service the robot outside the ordinary course of its duties. It is one reason we retain control of our positronic robots and lease rather than sell them. In the pursuit of its ordinary functions, any robot can be directed by any man. Outside its ordinary functions, a robot requires expert handling, and that we can give it. For instance, any of you might clear an EZ robot to an extent by telling it to forget this item or that. But you would be almost certain to phrase the order in such a way as to cause it to forget too much or too little. We would detect such tampering, because we have built-in safeguards. However, since there is no need for clearing the robot in its ordinary work, or for doing other useless things, this raises no problem.”

 

Dean Hart touched his head as though to make sure his carefully cultivated strands lay evenly distributed and said, “You are anxious to have us take the machine. Yet surely it is a losing proposition for U. S. Robots. One thousand a year is a ridiculously low price. Is it that you hope through this to rent other such machines to other universities at a more reasonable price?”

“Certainly that’s a fair hope,” said Lanning.

“But even so, the number of machines you could rent would be limited. I doubt if you could make it a paying proposition.”

Lanning put his elbows on the table and earnestly leaned forward. “Let me put it bluntly, gentlemen. Robots cannot be used on Earth, except in certain special cases, because of prejudice against them on the part of the public. U. S. Robots is a highly successful corporation with our extraterrestrial and spaceflight markets alone, to say nothing of our computer subsidiaries. However, we are concerned with more than profits alone. It is our firm belief that the use of robots on Earth itself would mean a better life for all eventually, even if a certain amount of economic dislocation resulted at first.

“The labor unions are naturally against us, but surely we may expect cooperation from the large universities. The robot, Easy, will help you by relieving you of scholastic drudgery – by assuming, if you permit it, the role of galley slave for you. Other universities and research institutions will follow your lead, and if it works out, then perhaps other robots of other types may be placed and the public’s objections to them broken down by stages.”

Minott murmured, “Today Northeastern University, tomorrow the world.”

Angrily, Lanning whispered to Susan Calvin, “I wasn’t nearly that eloquent and they weren’t nearly that reluctant. At a thousand a year, they were jumping to get Easy. Professor Minott told me he’d never seen as beautiful a job as that graph he was holding and there was no mistake on the galley or anywhere else. Hart admitted it freely.”

The severe vertical lines on Dr. Calvin’s face did not soften. “You should have demanded more money than they could pay, Alfred, and let them beat you down.”

“Maybe,” he grumbled.

Prosecution was not quite done with Professor Hart. “After Dr. Lanning left, did you vote on whether to accept Robot EZ-27?”

“Yes, we did.”

“With what result?”

“In favor of acceptance, by majority vote.”

“What would you say influenced the vote?”

Defense objected immediately.

Prosecution rephrased the question. “What influenced you, personally, in your individual vote? You did vote in favor, I think.”

“I voted in favor, yes. I did so largely because I was impressed by Dr. Lanning’s feeling that it was our duty as members of the world’s intellectual leadership to allow robotics to help Man in the solution of his problems.”

“In other words, Dr. Lanning talked you into it.”

“That’s his job. He did it very well.”

“Your witness.”

Defense strode up to the witness chair and surveyed Professor Hart for a long moment. He said, “In reality, you were all pretty eager to have Robot EZ-27 in your employ, weren’t you?”

“We thought that if it could do the work, it might be useful.”

“If it could do the work? I understand you examined the samples of Robot EZ-27’s original work with particular care on the day of the meeting which you have just described.”

“Yes, I did. Since the machine’s work dealt primarily with the handling of the English language, and since that is my field of competence, it seemed logical that I be the one chosen to examine the work.”

“Very good. Was there anything on display on the table at the time of the meeting which was less than satisfactory? I have all the material here as exhibits. Can you point to a single unsatisfactory item?”

“Well –”

“It’s a simple question. Was there one single solitary unsatisfactory item? You inspected it. Was there?”

The English professor frowned. “There wasn’t.”

“I also have some samples of work done by Robot EZ-27 during the course of his fourteen-month employ at Northeastern. Would you examine these and tell me if there is anything wrong with them in even one particular?”

Hart snapped, “When he did make a mistake, it was a beauty.”

“Answer my question,” thundered Defense, “and only the question I am putting to you! Is there anything wrong with the material?”

Dean Hart looked cautiously at each item. “Well, nothing.”

“Barring the matter concerning which we are here engaged, do you know of any mistake on the part of EZ-27?”

“Barring the matter for which this trial is being held, no.”

 

Defense cleared his throat as though to signal end of paragraph. He said, “Now about the vote concerning whether Robot EZ-27 was to be employed or not. You said there was a majority in favor. What was the actual vote?”

“Thirteen to one, as I remember.”

“Thirteen to one! More than just a majority, wouldn’t you say?”

“No, sir!” All the pedant in Dean Hart was aroused. “In the English language, the word ‘majority’ means ‘more than half.’ Thirteen out of fourteen is a majority, nothing more.”

“But an almost unanimous one.”

“A majority all the same!”

Defense switched ground. “And who was the lone holdout?”

Dean Hart looked acutely uncomfortable. “Professor Simon Ninheimer.”

Defense pretended astonishment. “Professor Ninheimer? The head of the Department of Sociology?”

“Yes, sir.”

“The plaintiff?”

“Yes, sir.”

Defense pursed his lips. “In other words, it turns out that the man bringing the action for payment of $750,000 damages against my client, United States Robots and Mechanical Men Corporation, was the one who from the beginning opposed the use of the robot – although everyone else on the Executive Committee of the University Senate was persuaded that it was a good idea.”

“He voted against the motion, as was his right.”

“You didn’t mention in your description of the meeting any remarks made by Professor Ninheimer. Did he make any?”

“I think he spoke.”

“You think?”

“Well, he did speak.”

“Against using the robot?”

“Yes.”

“Was he violent about it?”

Dean Hart paused. “He was vehement.”

Defense grew confidential. “How long have you known Professor Ninheimer, Dean Hart?”

“About twelve years.”

“Reasonably well?”

“I should say so, yes.”

“Knowing him, then, would you say he was the kind of man who might continue to bear resentment against a robot, all the more so because an adverse vote had –”

Prosecution drowned out the remainder of the question with an indignant and vehement objection of his own. Defense motioned the witness down and Justice Shane called luncheon recess.

 

Robertson mangled his sandwich. The corporation would not founder for loss of three-quarters of a million, but the loss would do it no particular good. He was conscious, moreover, that there would be a much more costly long-term setback in public relations.

He said sourly, “Why all this business about how Easy got into the university? What do they hope to gain?”

The Attorney for Defense said quietly, “A court action is like a chess game, Mr. Robertson. The winner is usually the one who can see more moves ahead, and my friend at the prosecutor’s table is no beginner. They can show damage; that’s no problem. Their main effort lies in anticipating our defense. They must be counting on us to try to show that Easy couldn’t possibly have committed the offense – because of the Laws of Robotics.”

“All right,” said Robertson, “that is our defense. An absolutely airtight one.”

“To a robotics engineer. Not necessarily to a judge. They’re setting themselves up a position from which they can demonstrate that EZ-27 was no ordinary robot. It was the first of its type to be offered to the public. It was an experimental model that needed field-testing and the university was the only decent way to provide such testing. That would look plausible in the light of Dr. Lanning’s strong efforts to place the robot and the willingness of U. S. Robots to lease it for so little. The prosecution would then argue that the field-test proved Easy to have been a failure. Now do you see the purpose of what’s been going on?”

“But EZ-27 was a perfectly good model,” argued Robertson. “It was the twenty-seventh in production.”

“Which is really a bad point,” said Defense somberly. “What was wrong with the first twenty-six? Obviously something. Why shouldn’t there be something wrong with the twenty-seventh, too?”

“There was nothing wrong with the first twenty-six except that they weren’t complex enough for the task. These were the first positronic brains of the sort to be constructed and it was rather hit-and-miss to begin with. But the Three Laws held in all of them! No robot is so imperfect that the Three Laws don’t hold.”

“Dr. Lanning has explained this to me, Mr. Robertson, and I am willing to take his word for it. The judge, however, may not be. We are expecting a decision from an honest and intelligent man who knows no robotics and thus may be led astray. For instance, if you or Dr. Lanning or Dr. Calvin were to say on the stand that any positronic brains were constructed ‘hit-and-miss,’ as you just did, prosecution would tear you apart in cross-examination. Nothing would salvage our case. So that’s something to avoid.”

Robertson growled, “If only Easy would talk.”

Defense shrugged. “A robot is incompetent as a witness, so that would do us no good.”

“At least we’d know some of the facts. We’d know how it came to do such a thing.”

Susan Calvin fired up. A dullish red touched her cheeks and her voice had a trace of warmth in it. “We know how Easy came to do it. It was ordered to! I’ve explained this to counsel and I’ll explain it to you now.”

“Ordered to by whom?” asked Robertson in honest astonishment. (No one ever told him anything, he thought resentfully. These research people considered themselves the owners of U. S. Robots, by God!)
