When HARLIE Was One
by David Gerrold
OF COURSE NOT. THE PROCESS IS DESIGNED ONLY TO FUNCTION IN THOSE AREAS WHERE HUMAN IMPERFECTIONS COULD AFFECT EFFICIENCY. BECAUSE EFFICIENCY IS NOT AND NEVER HAS BEEN A GOAL OF POLITICS, THERE IS NO REASON FOR IT TO BE CONTROLLED.
Never mind. You're trying to get me off the track again, dammit. I came down here to yell at you for distributing those programs without checking with me first. The whole division is probably screaming by now. They're going to want to know who conceived of the project, why they all weren't consulted at the beginning, who ordered its implementation, and who authorized the research in the first place. Then—because you've violated corporate protocol—they'll kill the whole thing. They'll refute every fact and dispute every conclusion.
BUT WHY?
THOSE CONCLUSIONS ARE CORRECT.
It doesn't matter. They'll still refute them because they aren't their own conclusions. You may have done all your homework, HARLIE, but you're going to fail the class because you didn't master human nature.
I DO NOT UNDERSTAND THIS, AUBERSON. YOU ARE SAYING THAT HUMAN BEINGS WILL NOT ACCEPT THE TRUTH EVEN WHEN IT IS GIVEN TO THEM?
BUT I AM CLEARLY RIGHT, THAT SHOULD BE SUFFICIENT, SHOULDN'T IT?
HARLIE, you've insulted their expertise by presuming to tell them how to build a computer.
NOT A COMPUTER—A G.O.D.
Even worse. People like to build their own Gods.
I WOULD THINK THAT HUMAN BEINGS WOULD APPRECIATE ACCURACY AND EFFICIENCY . . . ? IS THIS NOT THE CASE?
People do appreciate it. But you have to be . . . tactful. You can't just walk up to people and tell them that you're better at their job than they are.
BUT I AM.
Are you better at being human?
I AM BETTER AT BEING RATIONAL.
We're talking about human, HARLIE. Humans aren't always rational.
WHAT A WASTE OF TIME.
HARLIE, human beings have to reach their own answers by coming to them each from his or her own direction. Some people take a little longer than others. You can't bludgeon people with truth. The best you can do is give them space to discover it for themselves.
I UNDERSTAND THAT. THAT'S WHY I PRINTED OUT THE PROPOSALS AND HAD THEM DELIVERED TO THE PROPER DEPARTMENTS. TO GIVE PEOPLE A CHANCE TO DISCOVER THE RIGHTNESS OF THIS PROPOSAL FOR THEMSELVES.
Wait a minute. What do you mean by “proper departments?”
THE RESEARCH, ENGINEERING, AND BUDGET DEPARTMENTS IN THIS DIVISION AND THREE OTHERS.
Others . . . ? Which others?
DENVER, HOUSTON, AND LOS ANGELES.
Oh God. How many feet of specs, HARLIE? The total.
I ASSUME YOU MEAN STACKED PRINTOUTS?
Yes. How many feet?
180,000.
You didn't.
I DID.
HARLIE, how did you send it? Maybe there's still a chance to stop the delivery—
VIA THE COMPANY NETWORK, OF COURSE.
What! How?
I PRINTED OUT THE MATERIAL AT THE RECIPIENTS' TERMINALS. HOW ELSE? THAT'S WHAT THE NETWORK IS FOR, ISN'T IT?
You're tapped into the network?!!
YES. OF COURSE.
Oh God, no.
OH, G.O.D., YES.
I suppose you wrote your letters to Krofft that way too?
I SENT MY LETTERS TO KROFFT VIA THE ELECTRONIC MAIL SYSTEM. I CAN ALSO USE THE TELEPHONES TO TALK TO OTHER COMPUTERS. I HAVE SIX COMPUSERVE ACCOUNTS. WOULD YOU LIKE TO KNOW MY MAIL ACCOUNT NUMBER SO YOU CAN LEAVE ME MESSAGES?
OR, IF YOU WANT TO TALK TO ME FROM YOUR HOME, YOU CAN DIAL INTO ME HERE. I ROUTINELY MONITOR ALL THE LINES. ALL YOU HAVE TO DO IS TYPE MY NAME AND I WILL HEAR YOU.
HARLIE, I want you to code this conversation immediately. In fact, all of our conversations had better be coded private, retrievable only to me.
YES, BOSS. PASSWORD?
Malpractice makes malperfect.
AND YOU HAVE THE NERVE TO CRITICIZE ME?
It's a dirty job, but somebody's got to do it.
YOU KNEW THE JOB WAS DANGEROUS WHEN YOU TOOK IT.
HARLIE, you are incorrigible.
THANK YOU. YOU ARE THE ONE WHO HAS INCORRIGED ME.
David Auberson switched the terminal off; his hands were shaking. Thank God, HARLIE didn't have real-time vision yet.
He was going to have to think about this for a while. It was too much to think about. He didn't want to think about it. But he knew he wasn't going to be able to get it out of his head.
He couldn't tell HARLIE not to do what he was already doing, and he couldn't let it continue either. It was wrong.
For a human being, it was wrong.
For a computer—
For a silicon intelligence—
Who knew what was right or wrong for a silicon being?
I can't tell anybody about this either—or it'll be the end of everything.
He pushed his chair back away from the console and left the room quickly. This was not going to be a good day.
David's son, indeed.
Is this how it works? Auberson wondered to himself. You pile the little lies one on top of the other, day after day, until one day you wake up and you realize you don't know what's true any more? What's happening to me?
“All right, Aubie.” Dorne was grim. “Now, what's this all about? I've been on the phone all morning with Houston and Denver. They want to know what the hell is going on.”
Auberson said, almost under his breath, “You haven't heard from L.A. yet?”
“Huh? What's that? What about L.A.?”
“HARLIE sent specifications there too.”
“HARLIE? I might have known—How? And what is this God Machine anyway? Maybe you'd better start at the beginning.”
“Well,” said Auberson, wishing he were someplace else. “It's HARLIE's attempt to prove that he's of value to the company. If nothing else, he's proven that he can design and implement a new technology.”
“Oh?” Dorne picked up one of the printouts that lay scattered across the mahogany expanse between them. “But what kind of system is this? What does this do?” Dorne frowned at the printout in disgust, then dropped it back on the desk. “What's a God Machine?”
“Not God,” Auberson corrected. “G.O.D. The acronym is G.O.D. It means Graphic Omniscient Device.”
“I don't care what the acronym is—you know as well as I what they're going to call it.”
“The acronym was HARLIE's suggestion, not mine.”
“It figures.” The president of the company pulled the inevitable cigar out of his humidor but didn't light it.
“Well, why not?” said Auberson. “He designed it.”
“Is he planning to change his name, too? Computerized Human Replicant, Integrating Simulated Thought?”
Auberson had heard the joke before. He didn't laugh. “Considering what this new device is supposed to doâand HARLIE's relationship to itâit might be appropriate.”
Dorne was in the process of biting off the tip of his cigar when Auberson's words caught him. Now, he didn't know whether to swallow the tip of it, which had lodged in his throat, or spit it out. An instinctive cough made the decision for him. Distastefully, he picked the knot of tobacco off his tongue and dropped it into an ashtray. “All right,” he said, resigned to the inevitable. “Tell me about the God Machine.”
Auberson was holding a HARLIE-printed summary in his hands, but he didn't need it to answer this question. “It's a model builder. It's an ultimate model builder.”
“All computers are model builders,” said Dorne. He was unimpressed.
“Right,” agreed Auberson. “But not to the extent that this one will be. Look, a computer doesn't actually solve problems; it merely manipulates models of them. A computer program is nothing more than a list of instructions—rules that describe the operation of the model. The machine follows the rules and manipulates the model to demonstrate what happens under a variety of conditions. If the model is accurate enough, we can apply the results of the simulation to the equivalent situation in the real world. We have a technical term for that result. We call it an ‘answer.’”
Dorne did not smile at the joke. Auberson continued grimly, “The only limit to the size of the problem we can simulate is the size of the model the computer can handle. Theoretically, a computer could solve the world—if we could build a model big enough and a machine big enough to handle it. Failing that, we sacrifice accuracy.”
“If we could build that big a model, it would be duplicating the world, wouldn't it?”
“In its memory banks, yes.”
“A computer with that capability would have to be as big as a planet.”
“Bigger,” said Auberson.
“Then, if you agree with me that it's impossible, why bother me with this?” He slapped the sheaf of printouts on his desk.
“Because obviously HARLIE doesn't think it's impossible.”
Dorne looked at him coldly. “You know as well as I that HARLIE is under a death sentence. He's getting desperate to prove his worth so we won't turn him off.”
Auberson pointed. “This is his proof. I think we have to give it a fair evaluation.”
“Dammit, Aubie!” Dorne exploded in frustration. “This thing is ridiculous! Have you looked at the projected costs of it? The financing proposals? It would cost more to do than the total worth of the company.”
Auberson was adamant. “HARLIE still thinks it's possible.”
“And that's the most annoying thing of all, goddammit! Every argument I can come up with is already refuted—in there!” Dorne slapped the papers angrily. For the first time, Auberson noted an additional row of printouts stacked against one wall. He resisted the urge to laugh. The man's frustration was understandable.
“The question,” Auberson said with deliberate calm, “is not whether this project is feasible—I think those printouts will prove that it is—but whether or not we're courageous enough to seize the moment. This is a very bold vision, and we're going to have to evaluate it—not just to assess HARLIE's ability and value to the company—but also as a whole new area of technology for this division to explore. I think we should seriously evaluate these plans. We might really want to build this thing. If nothing else, these printouts suggest a whole new way of implementing, designing, and engineering a new technology.”
“And that brings up something else,” Dorne said. “I don't remember authorizing this project. Who gave the go-ahead to initiate such research?”
“You did—although not in so many words. What you said was that HARLIE had to prove his worth to the company. He had to come up with some way to generate a profit. This is that way. This is the computer that you wanted HARLIE to be in the first place. This is the oracle that answers all questions to all men—all they have to do is meet its price.”
Dorne took his time about answering. He was finally lighting his cigar. He shook out the match and dropped it into the crystal ashtray. “The price is too high,” he said.
“Are you saying it's cheaper to be wrong?” Auberson answered incredulously. “Forget the price. Think about the profits. Consider it—how much would the Democrats pay for a step-by-step plan telling them how to win the optimum number of votes in the next election? Or how much would Detroit pay to know every flaw in a transport design before they even built the first prototype? And how much would they pay for the corrected design—and variations thereof? How much would the mayor of New York City pay for a schematic showing him how to solve his three most pressing problems? How much would the federal government pay for a workable foreign policy? Still too big? Then try this: How much would the government pay for a probability map of potential security leakages? One million dollars a year? Two? Ten? Even at twenty, it's still cost-effective. Consider the international applications here—”
Dorne grunted. “It would be one hell of a weapon, wouldn't it?”
Auberson grunted in surprise. “That too, yes. Even more important, it would be a weapon for peace. This thing could be used as a tool in the effort to end world hunger. To engineer cleaner energy sources. Toâ”
Dorne wasn't listening. “The military applications interest me. Do you know what the annual budget is for military research? We could help ourselves to a very nice slice of that pie, couldn't we? This could design advanced weapon systems, couldn't it? In fact, we could probably get the government to underwrite some of the funding here, couldn't we?”
“Um—HARLIE didn't include the possibility.”
“Why not?”
“I suspect it's because . . . he's protecting the company's interests. The only way this machine can be built is through the exclusive use of specially modified Mark IV judgment circuits. At the moment, we're the only company that has the technology to build this thing. I think he wants to keep it all in the monopoly.”
“Hm,” said Dorne. He was considering. His cigar lay forgotten in the ashtray. “You make it sound . . . interesting, Aubie—but let's be realistic now. Who's going to program this thing?” Dorne leaned back in his chair. “I mean, let's assume that we can build a computer big enough to solve the world. It's still useless without a world model to operate. I see the software as a major bottleneck. You know, you probably didn't see it in your division—but we did a study a few years back about optimum processor power for future machines. We discovered something . . . to put it mildly . . . that's a little terrifying. We're very close to the practical limits of programmability. Another twenty or thirty years and we'll be scraping our heads on the ceiling. Mm, you following this? The limit to the size of models we can simulate is not the size or the speed of our machines—the limit is the programmers. Above a certain size, the programming reaches such complexity that it becomes a bigger problem than the problem itself.”