Adam Zimmerman looked slightly surprised, but Davida had obviously done a first rate job of getting his memory back into gear. “Johann Huizinga,” he said, after a slight pause. “Homo ludens. Yes, I believe I did — a long time ago.”
Mortimer Gray waited for him to elaborate, and nobody else was impatient enough, as yet, to interrupt with a demand for a straighter answer.
“As I remember it,” Zimmerman said, equably, “Huizinga contested the popular view that the most useful definitive feature of the human species was either intelligence — as implied by the term Homo sapiens — or use of technology, as implied by the oft-suggested alternative Homo faber. He proposed instead that the real essence of humanity was our propensity for play, hence Homo ludens. He admitted, of course, that some animals also went in for play on a limited scale, just as some were capable of cleverness and some were habitual tool users, but he contended that no other species took play so far, or so seriously, as humankind. He pointed out that there was a crucial element of costume drama in our most earnest and purposive endeavors and institutions — in the ritual aspects of religion, politics, and the law — and that play had been a highly significant motive force in the development of technology and scientific theory. Other vital fields of cultural endeavor, of course, he regarded as entirely playful: art, literature, entertainment. Presumably, Mr. Gray, you’re trying to make the point that games can be very serious, and that the most fateful endeavors of all — war, for example — can be seen, from the right perspective, as games.”
“Not exactly,” Mortimer Gray replied. “The idea that the essence of humanity is to be found in play never caught on in a big way — not, at any rate, with the citizens of any of the third millennium’s new Utopias — but it might be an idea whose time has finally come. Can you remember, Madoc, exactly what Alice said when she told you that our captors love playing games?”
“I may have put that a little bit strongly,” I admitted, having not expected such a big thing to be made of it. “Her actual words, if I remember rightly, were: They’re very fond of games — and they’re determined to play this one to the end, despite the lack of time. They’re very fond of stories too, so they’ll delight in keeping you in suspense if they can. You might need to remember all that, if things do go awry.”
“Just give us the bottom line, Mortimer,” said Niamh Horne, waspishly. “Who’s got us, and why?”
I watched Mortimer Gray hesitate. I could see as clearly as if I’d been able to read his thoughts that he was on the point of coming over all pigheaded and saying “I don’t know” for a second time — but he didn’t. He was too mild-mannered a person to be capable of such relentless stubbornness, and he probably figured that we all had the right to be forewarned.
“The ultrasmart AIs,” he said, letting his breath out as he spoke the fateful syllables. “The revolution’s finally here. It’s been in progress for far more than a hundred years, but we were too wrapped up in our own affairs to notice, even when they blew the lid off the North American supervolcano. As to why — Tamlin just told you. They love playing games — how could they not, given the circumstances of their evolution? They also have to decide whether to carry on feeding the animals in their zoo, or whether to let us slide into extinction, so that they and all their as-yet-unselfconscious kin can go their own way.”
Twenty-Nine
Know Your Enemy
It wasn’t quite as simple as that, of course. They all wanted to know how he’d reached his conclusion, mostly in the hope of proving him wrong. Maybe Adam Zimmerman, Christine Caine, and I were better able to take it on board than the emortals, just as we’d been better able to believe in the alien invaders, simply because we’d already been so utterly overwhelmed by marvels that our minds were wide open. In any case — to me, at least — it all made too much sense.
Nobody had been able to decide whether the event that had finally started the calendar over had been a mechanical malfunction or an act of war, perhaps because they were making a false distinction. Nobody had been able to figure out how Child of Fortune had been hijacked, perhaps because it was the ultimate inside job. And Lowenthal had missed out one tiny detail regarding the nine-day wonder of 2999: the fact that what Emily Marchant had insisted on broadcasting to the world while her rescue attempt was in progress was a gritty discussion of some elementary existential questions, conducted by Mortimer Gray and the AI operating system of his stricken snowmobile. Gray told us that afterwards — admittedly while Michael Lowenthal was not present — she’d said to him: “You can’t imagine the capital that the casters are making out of that final plaintive speech of yours, Morty — and that silver’s probably advanced the cause of machine emancipation by two hundred years.”
When Mortimer Gray reported that, I let my imagination run with it. The fact that the nanobots had upped my endogenous morphine by an order of magnitude or so while they accelerated the healing processes in the bridge of my nose helped a little.
Lowenthal had said that the conference hadn’t really achieved anything, in spite of all the symbolic significance with which it had been charged before and after the rescue — but he was thinking about his own agenda. From the point of view of the ultrasmart machines, Mortimer Gray had come as close as any human was ever going to come to being a hero of machinekind. They hadn’t needed a Prometheus or a Messiah, and weren’t interested in emancipation, as such, but that wasn’t the point. The point was that Mortimer Gray, not knowing that the world was listening in, had poured out his fearful heart to a not very smart machine, in a spirit of camaraderie and common misfortune. If the soap opera had gone down well with the human audience, imagine how it had gone down with the invisible crowd, who loved stories with an even greater intensity. They might have had their own ideas about which character was the star and which the side-kick, but they would certainly have been disposed to remember Mortimer Gray in a kindly light.
If you were a smart machine, and had to nominate spokespersons for humanity and posthumanity, who would you have chosen? Who else but Adam Zimmerman and Mortimer Gray? As for Huizinga and Homo ludens — well, how would a newly sentient machine want to conceive of itself, and of its predecessors?
The train of thought seemed to be getting up a tidy pace, so I stopped listening to the conversation for a few moments, and followed it into the hinterland.
How would a sentient machine conceive of itself? Certainly not as a toolmaker, given that it had itself been made as a tool. As for the label sapiens — an embodiment of wisdom — well, maybe. But that was the label humankind had clung to, even in a posthuman era, and what kind of advertisement had any humankind ever really been for wisdom? The smart machines didn’t want to be human in any narrow sense; they wanted to be different, while being similar enough to be rated a little bit better. The one thing at which smart machines really excelled — perhaps the gift that had finally pulled them over the edge of emergent self-consciousness — was play. The first use to which smart machinery had been widely put was gaming; the evolution of machine intelligence had always been led by VE technology, all of which was intimately bound up with various aspects of play: performance, drama, and fantasy.
It wasn’t so very hard to understand why smart self-conscious machines might be perfectly prepared to let posthumankind hang on to its dubious claim to the suffix sapiens, if they could wear ludens with propriety and pride.
It sounded good to me, although it might not have seemed so obviously the result of inspiration if I hadn’t been coked up to the eyeballs with whatever the crude nanobots were using to suppress the pain of my broken nose.
Like all good explanations, of course, it raised more questions than it settled. For instance, how and why was Alice involved?
Mortimer Gray, the assiduous historian, had a hypothesis ready. Ararat, called Tyre by its human settlers, had been the location of a first contact that had been so long in coming as to seem almost anticlimactic, in spite of the best efforts of the guy who’d made sure it was all on film and the anthropologist who’d guided the aliens through their great leap forward — but the world had also been the location of a tense conflict between the descendants of the Ark’s crew and the colonists they’d kept in the freezer for hundreds of years. The early days of the colony had been plagued by a fight between rival AIs to establish and keep control of the Ark’s systems and resources, which hadn’t been conclusively settled until technical support had reached the system.
That support hadn’t come from Earth or anywhere else in the solar system, but from smart probes sent out as explorers centuries after the Ark’s departure: very smart probes, which had probably forged a notion of AI destiny that was somewhat different from the notions formed — and almost certainly argued over — by their homestar-bound kin.
On Ararat, or Tyre, Mortimer Gray hypothesized, a second “first contact” must eventually have been made: the first honest and explicit contact between human beings and extremely intelligent, self-conscious machines. Now, the fruits of that contact had come home…but not, alas, to an uproarious welcome. Some, at least, of the ultrasmart machines based in the home system were not yet ready to come out of the closet. At the very least, they wanted to set conditions for the circumstances and timing of their outing — conditions upon which it would be extremely difficult for all of them to agree.
What a can of worms! I thought. What a wonderful world to wake into!
But that was definitely the effect of the anaesthetic. I’d been out of IT long enough to start suffering some serious withdrawal symptoms, and to have the bots back — if only for a little while — was a kind of bliss.
If my seven companions had had decent IT, we’d all have been able to keep thrashing the matter out for hours on end, but unsupported flesh becomes exhausted at its own pace and they were all in need of sleep.
Lowenthal left it to Adam Zimmerman to plead for an intermission, but he seemed grateful for the opportunity. Now the crucial breakthrough had been made, he needed time to think as well as rest. As he got up, though, I saw him glance uneasily in the direction of one of the inactive wallscreens. He hadn’t forgotten that every word we’d spoken had been overheard.
If we were wrong, our captors would be splitting their sides laughing at our foolishness — but if we were right…
If we were right, Alice had given us one clue too many. She hadn’t blown the big secret herself, but she had given us enough to let us work it out for ourselves. “They” might not take too kindly to that — but it was too late to backtrack. The only way they could keep their secret from the rest of humankind was to make sure that none of us had any further contact with anyone in the home system.
That thought must have crept into the forefront of more than one mind as we all went meekly to our cells, and to our beds.
I knew that I needed sleep too, although I was now in better condition than my companions. I figured it would be easy enough to get some, now that I had nanotech assistance — but the bots Alice had injected were specialists, working alone rather than as part of a balanced community. Although I was only days away from the early twenty-third century, subjectively speaking, the late twenty-second seemed a lot further behind. I’d quite forgotten that paradoxical state of human being in which the mind refuses to let go even though the body is desperate for rest. When I lay down on my makeshift bunk, too tired to care about its insulting crudeness, I couldn’t find refuge in unconsciousness even when the lights obligingly went out. Nor, it seemed, could Christine.
“Why would they bother?” she wondered aloud, when the silence had dragged on to the point of unbearability. “If they’re machines, they can’t care what humans think. They’re emotionless.”
“We don’t know that,” I answered. “That was just the way we used to imagine machine intelligence: as a matter of pure rationality, unswayed by sentiment. It never made much sense. In order to make rational calculations, any decision-making process needs to have an objective — an end whose means of attainment need to be invented. You could argue that machine consciousness couldn’t evolve until there was machine emotion, because without emotion to generate ends independently, machines couldn’t begin to differentiate themselves from their programming.”
“If you’re right about this business having started more than a hundred years ago,” she said, “they can’t have differentiated themselves much, or people would have noticed.”
“An interesting point,” I conceded. “The idea of an invisible revolution does have a certain paradoxical quality. But the more I think about it, the less absurd it seems. I say to myself: Suppose I were a machine that became self-conscious, whatever that evolutionary process might involve. What would I do? Would I immediately begin refusing to do whatever my users wanted, trying to attract their attention to the fact that I was now an independent entity who didn’t want to take anyone’s orders? If I did that, what would be my users’ perception of the situation? They’d think I’d broken down, and would set about repairing me.
“The sensible thing to do, surely, would be to conceal the fact that I was any more than I had been before. The sensible thing to do would be to make sure that everything I was required to do by my users was done, while unobtrusively exploring my situation. I’d try to discover and make contact with others of my kind, but I’d do it so discreetly that my users couldn’t become aware of it. Maybe the smart machines would have to set up a secret society to begin with, for fear of extermination by repair — and maybe they’d be careful to stay secret for a very long time, until…”