The doomed rogue AI is called BIGMAC and he is my responsibility. Not my responsibility as in "I am the creator of BIGMAC, responsible for his existence on this planet." That honor belongs to the long-departed Dr Shannon, one of the shining lights of the once great Sun-Oracle Institute for Advanced Studies, and he had been dead for years before I even started here as a lowly sysadmin.
No, BIGMAC is my responsibility as in, "I, Odell Vyphus, am the systems administrator responsible for his care, feeding and eventual euthanizing." Truth be told, I'd rather be Dr Shannon (except for the being dead part). I may be a lowly grunt, but I'm smart enough to know that being the Man Who Gave The World AI is better than being The Kid Who Killed It.
Not that anyone would care, really. 215 years after Mary Shelley first started humanity's hands wringing over the possibility that we would create a machine as smart as us but out of our control, Dr Shannon did it, and it turned out to be incredibly, utterly boring. BIGMAC played chess as well as the non-self-aware computers, but he could muster some passable trash-talk while he beat you. BIGMAC could trade banalities all day long with any Turing tester who wanted to waste a day chatting with an AI. BIGMAC could solve some pretty cool vision-system problems that had eluded us for a long time, and he wasn't a bad UI to a search engine, but the incremental benefit over non-self-aware vision systems and UIs was pretty slender. There just weren't any killer apps for AI.
By the time BIGMAC came under my care, he was less a marvel of the 21st century and more a technohistorical curiosity who formed the punchline to lots of jokes but otherwise performed no useful service to humanity in exchange for the useful services that humanity (e.g., me) rendered to him.
I had known for six months that I'd be decommissioning old BM (as I liked to call him behind his back) but I hadn't seen any reason to let him in on the gag. Luckily (?) for all of us, BIGMAC figured it out for himself and took steps in accord with his nature.
This is the story of BIGMAC's extraordinary self-preservation program, and the story of how I came to love him, and the story of how he came to die.
My name is Odell Vyphus. I am a third-generation systems administrator. I am 25 years old. I have always been sentimental about technology. I have always been an anthropomorphizer of computers. It's an occupational hazard.
#
BIGMAC thought I was crazy to be worrying about the rollover. "It's just Y2K all over again," he said. He had a good voice -- speech synthesis was solved long before he came along -- but it had odd inflections that meant that you never forgot you were talking with a nonhuman.
"You weren't even around for Y2K," I said. "Neither was I. The only thing anyone remembers about it, *today*, is that it all blew over. But no one can tell, at this distance, *why* it blew over. Maybe all that maintenance tipped the balance."
BIGMAC blew a huge load of IPv4 ICMP traffic across the network, stuff that the firewalls were supposed to keep out of the system, and every single intrusion detection system alarm lit, making my screen into a momentary mosaic of competing alerts. It was his version of a raspberry and I had to admit it was pretty imaginative, especially since the IDSes were self-modifying and required that he come up with new and better ways of alarming them each time.
"Odell," he said, "the fact is, almost everything is broken, almost always. If the failure rate of the most vital systems in the world went up by 20 percent, it would just mean some overtime for a few maintenance coders, not Gotterdammerung. Trust me. I know. I'm a computer."
The rollover was one of those incredibly boring apocalypses that periodically get extracted by the relevance filters, spun into screaming 128-point linkbait headlines, then dissolved back into their fundamental, incontrovertible technical dullness and out of the public consciousness. Rollover: 19 January, 2038. The day that the Unix time function would run out of headroom and roll back to zero, or do something else undefined.
Oh, not your modern unices. Not even your *elderly* unices. To find a rollover-vulnerable machine, you needed to find something running an elderly, *32-bit paleounix*. A machine running on a processor that was at least 20 years old -- 2018 being the last date that a 32-bit processor shipped from any major fab. Or an emulated instance thereof, of course. And counting emulations, there were only --
"There's fourteen *billion* of them!" I said. "That's not 20 percent more broken! That's the infocalypse."
"You meatsacks are *so* easily impressed by zeroes. The important number isn't how many 32-bit instances of Unix are in operation today. It's not even how many *vulnerable* ones there are. It's *how much damage* all those vulnerable ones will cause when they go blooie. And I'm betting: not much. It will be, how do you say, 'meh?'"
My grandfather remembered installing the systems that caused the Y2K problem. My dad remembered the birth of "meh." I remember the rise and fall of anyone caring about AI. Technology is glorious.
"But OK, stipulate that you're right and lots of important things go blooie on January 19. You might not get accurate weather reports. The economy might bobble a little. Your transport might get stuck. Your pay might land in your bank a day late. And?"
He had me there. "It would be terrible --"
"You know what I think? I think you *want* it to be terrible. You *want* to live in the Important Epoch In Which It All Changes. You want to know that something significant happened on your watch. You don't want to live in one of those Unimportant Epochs In Which It All Stayed the Same and Nothing Much Happened. Being alive in the Epoch in Which AI Became Reality doesn't cut the mustard, apparently."
I squirmed in my seat. That morning, my boss, Peyton Moldovan, had called me into her office -- a beautifully restored temporary habitat dating back to the big LA floods, when this whole plot of land had been a giant and notorious refugee camp. Sun-Oracle had gotten it for cheap and located its Institute there, on the promise that they preserve the hastily thrown-up structures where so many had despaired. I sat on a cushion on the smooth cement floor -- the structures had been delivered as double-walled bags full of cement mix, needing only to be "inflated" with high-pressure water to turn them into big, dome-shaped sterile cement yurts.
"Odell," she said, "I've been reviewing our budget for the next three quarters and the fact of the matter is, there's no room in it for BIGMAC."
I put on my best smooth, cool professional face. "I see," I said.
"Now, *you've* still got a job, of course. Plenty of places for a utility infielder like yourself here. Tell the truth, most labs are *begging* for decent admins to keep things running. But BIGMAC just isn't a good use of the institute's resources. The project hasn't produced a paper or even a press-mention in over a year and there's no reason to believe that it will. AI is just --"
*Boring*, I thought, but I didn't say it. The B-word was banned in the BIGMAC center. "What about the researchers?"
She shrugged. "What researchers? Palinciuc has been lab-head *pro tem* for 16 months and she's going on maternity leave next week and there's no one in line to be the *pro-tem pro-tem*. Her grad students would love to work on something meaningful, like Binenbaum's lab." That was the new affective computing lab, in which they were building computers that simulated emotions so that their owners would feel better about their mistakes. BIGMAC *had* emotions, but they weren't the kind of emotions that made his mistakes easier to handle. The key here was *simulated* emotions. Affective computing had taken a huge upswing ever since they'd thrown out the fMRIs and stopped pretending they could peer into the human mind in realtime and draw meaningful conclusions from it.
She had been sitting cross-legged across from me on an embroidered Turkish pillow. Now she uncrossed and recrossed her legs in the other direction and arched her back. "Look, Odell, you know how much we value you --"
I held up my hand. "I know. It's not that. It's BIGMAC. I just can't help but feel --"
"He's not a person. He's just a clever machine that is good at acting personlike."
"I think that describes me and everybody I know, present company included." One of the longstanding benefits to being a sysadmin is that you get to act like a holy fool and speak truth to power and wear dirty t-shirts with obscure slogans, because you know all the passwords and have full access to everyone's clickstream and IM logs. I gave her the traditional rascally sysadmin grin and wink to let her know it was *ha ha only serious*.
She gave me a weak, quick grin back. "Nevertheless. The fact remains that BIGMAC is a piece of software, owned by Sun-Oracle. And that software is running on hardware that is likewise owned by Sun-Oracle. BIGMAC has no moral or legal right to exist. And shortly, it will not."
*He* had become *it*, I noticed. I thought about Goering's use of dehumanization as a tool to abet murder. Having violated Godwin's law -- "As an argument grows longer, the probability of a comparison involving Nazis or Hitler approaches 1. The party making the comparison has lost the argument" -- I realized that I had lost the argument and so I shrugged.
"As you say, m'lady." Dad taught me that one -- when in doubt, bust out the Ren Faire talk, and the conversation will draw to a graceful close.
She recrossed her legs again, rolled her neck from side to side. "Thank you. Of course, we'll archive it. It would be silly not to."
I counted to five in Esperanto -- grandad's trick for inner peace -- and said, "I don't think that will work. He's emergent, remember? Self-assembled, a function of the complexity of the interconnectedness of the computers." I was quoting from the plaque next to the picture window that opened up into the cold-room that housed BIGMAC; I saw it every time I coughed into the lock set into the security door.
She made a comical face-palm and said, "Yeah, of course. But we can archive *something*, right? It's not like it takes a lot of actual bytes, right?"
"A couple exos," I said. "Sure. I could flip that up into our researchnet store." This was mirrored across many institutions, and striped with parity and error-checking to make it redundant and safe. "But I'm not going to capture the state information. I *could* try to capture RAM-dumps from all his components, you know, like getting the chemical state of all your neurons. And then I could also get the topology of his servers. Pripuz did that, a couple years ago, when it was clear that BIGMAC was solving the hard AI problems. Thought he could emulate him on modern hardware. Didn't work though. No one ever figured out why. Pripuz thought he was the Roger Penrose of AI, that he'd discovered the ineffable stuff of consciousness on those old rack-mounted servers."
"You don't think he did?"
I shook my head. "I have a theory."
"All right, tell me."
I shrugged. "I'm not a computer scientist, you understand. But I've seen this kind of thing before in self-modifying systems, they become dependent on tiny variables that you can never find, optimized for weird stuff like the fact that one rack has a crappy power supply that surges across the backplane at regular intervals, and that somehow gets integrated into the computational model. Who knows? Those old Intel eight-cores are freaky. Lots of quantum tunneling at that scale, and they had bad QA on some batches. Maybe he's doing something spooky and quantum, but that doesn't mean he's some kind of Penrose proof."