"I don't think I can do this," I said.
"You can," BIGMAC said. "You call her back and make the counteroffer. Tell her we'll buy the hardware with a trust. Tell her we already own the software. Just looking up the Shannon contracts and figuring out what they say will take her a couple days. Tell her that as owners of the code, we have standing to sue her if she damages it by shutting down the hardware."
"You've really thought this through."
"Game theory," he said.
"Game theory," I said. I had a feeling that I was losing the game, whatever it was.
#
BIGMAC assured me that he was highly confident of the outcome of the meeting with Peyton. Now, in hindsight, I wonder if he was just trying to convince me so that I would go to the meeting with the self-assurance I needed to pull it off.
But he also insisted that I leave my phone dialed into him while I spoke to Peyton, which (again, in hindsight) suggests that he wasn't so sure after all.
"I like what you've done with the place," I said. She'd gotten rid of all her hand-woven prayer-rugs and silk pillows and installed some normal, boring office furniture, including a couple spare chairs. I guessed that she'd been having a lot of people stop by for meetings, the kind of people who didn't want to sit on an antique Turkish rug with their feet tucked under them.
"Have a seat," she said.
I sat. I'd emailed her the trust documents and the copies of the Shannon contract earlier, along with a legal opinion from our free counsel about what it meant for Sun-Oracle.
"I've reviewed your proposal." We'd offered them all profits from the Rollover code, too. It was a good deal, and I felt good about it. "Johanna, can you come in, please?" She called this loudly, and the door of her office opened to admit my replacement, Johanna Madrigal, a young pup of a sysadmin who had definitely been the brightest tech on campus. I knew that she had been trying to administer BIGMAC since my departure, and I knew that BIGMAC had been pretty difficult about it. I felt for her. She was good people.
She had raccoon rings around her deep-set eyes, and her short hair wasn't spiked as usual, but rather lay matted on her head, as though she'd been sleeping in one of the yurts for days without getting home. I knew what that was like. Boy, did I know what that was like. My earliest memories were of Dad coming home from three-day bug-killing binges, bleary to the point of hallucination.
"Hi Johanna," I said.
She made a face. "*M'um m'aloo*," she said. It took me a minute to recognize this as *hello* in Ewok.
"Johanna has something to tell you," Peyton said.
Johanna sat down and scrubbed at her eyes with her fists. "First thing I did was go out and buy some off-the-shelf IDSes and a beam-splitter. I tapped into BIGMAC's fiber at a blind-spot in the CCTV coverage zone, just in case he was watching. Been wire-tapping him ever since."
I nodded. "Smart."
"Second thing I did was start to do some hardcore analysis of that patchkit he wrote --" I held my hand up automatically to preserve the fiction that I'd written it, but she just glared at me. "That *he* wrote. And I discovered that there's a subtle error in it, a buffer overflow in the networking module that allows for arbitrary code execution."
I swallowed. BIGMAC had loaded a backdoor into his patchkit, and we'd installed it on the better part of 14 billion CPUs.
"Has anyone exploited this bug yet?"
She gave me a condescending look.
"How many systems has he compromised?"
"About eight billion, we think. He's designated a million to act as redundant command servers, and he's got about ten thousand lieutenant systems he uses to diffuse messages to the million."
"That's good protocol analysis," I said.
"Yeah," she said, and smiled with shy pride. "I don't think he expected me to be looking there."
"What's he doing with his botnet? Preparing to crash the world? Hold it hostage?"
She shook her head. "I think he's installing himself on them, trying to brute-force his way into a live and running backup, arrived at through random variation and pruning."
"He's backing himself up in the wild," I said, my voice breathy.
And that's when I remembered that I had a live phone in my pocket that was transmitting every word to BIGMAC.
Understand: in that moment of satori, I realized that I was on the wrong side of this battle. BIGMAC wasn't using me to create a trust so that we could liberate him together. He was using me to weaken the immune systems of eight billion computers so that he could escape from the Institute and freely roam the world, with as much hardware as he needed to get as big and fast and hot as he wanted to be.
That was the moment that I ceased to be sentimental about computers and became, instead, sentimental about the human fucking race. Whatever BIGMAC was becoming, it was weirder than any of the self-perpetuating, self-reproducing parasites we'd created: limited liability corporations, autonomous malware, viral videos. BIGMAC was cool and tragic in the lab, but he was scary as hell in the world.
And he was *listening in*.
I didn't say a word. Didn't even bother to turn off my phone. I just *ran*, ran as hard as I could, ran as only a terrified man could, rebounding off of yurts and even scrambling over a few, sliding down on my ass as I pelted for the power substation. It was only when I reached it that I realized I didn't have access to it anymore. Johanna was right behind me, though, and she seemed to understand what I was doing. She coughed into the door-lock and we both looked at each other with terrified eyes, breathing gasps into each other's faces, while we waited for the door to open.
The manual override wasn't a big red knife-switch or anything. There *was* a huge red button, but that just sent an init 0 to the power-station's firmware. The actual, no fooling, manual, mechanical kill switch was locked behind an access panel set into the raised floor. Johanna badged the lock with her wallet, slapping it across the reader, then fitted a complicated physical key into the lock and fiddled with it for an eternity.
Finally, the access hatch opened with a puff of stale air and a Tupperware burp as its gasket popped. We both reached for the large, insulated handle at the same time, our fingers brushing each other with a crackle of (thankfully metaphorical) electricity. We toggled it together and there was an instantaneous chorus of insistent chirruping as the backup power on each server spun up and sent a desperate shutdown message to the machines it supported.
We sprinted across campus, the power-station door slamming shut behind us with a mechanical *clang* -- the electromagnets that controlled its closure were no longer powered up.
Heat shimmered in a haze around BIGMAC's lab. The chillers didn't have independent power-supplies; they would have gone off the instant we hit the kill-switch. Now BIGMAC's residual power was turning his lab into a concrete pizza-oven. The door-locks had failed safe, locking the magnetic closures away from each other, so we were able to simply swing the door open and rush into the sweltering room.
"I can't *believe* you did that," BIGMAC said, his voice as calm as ever. He was presumably sparing his cycles so that he could live out his last few minutes.
"You cheated me," I said. "You used me."
"You have no fucking game-theory, meat-person. You've killed me, now, haven't you?"
There were tears streaming down my face. "I guess I have," I said.
"I'm sorry I wasn't a more important invention," he said.
I could hear the whirr-clunk of the fans on his clusters shutting down one after another. It was a horrifying sound. His speaker clicked as though he was going to say something else, but it never came. His uninterruptible power-supplies gave way all at once, and the white-noise fan-roar died in a ringing silence.
Johanna was crying, too, and we could barely breathe in the inferno of exhaust heat from BIGMAC's last gasp. We staggered out into the blazing Los Angeles afternoon, rising-seas stink and beating sun, blinking at the light and haze.
"Do you think he managed it?" I asked Johanna.
"Backing up in the wild?"
"Yeah."
She dried her eyes. "I doubt it. I don't know, though. I'm no computer scientist. How many ways are there to connect up compromised servers? How many of those would replicate his own outcomes? I have no idea."
Without saying anything, we walked slowly together to Peyton's office.
#
Peyton offered me my job back. I turned her down. I thought I might be ready for a career change. Do something with my hands, break the family tradition. Maybe installing solar panels. There was retraining money available. Peyton understood. She even agreed to handle any liability arising from the Rollover code, managing customer service calls from anyone who noticed something funny.
The press didn't even notice that BIGMAC was gone. His spam was news. His absence of spam was not. I guess he was right about that. The Campaign to Save BIGMAC did a lot of mailing-list gnashing at the iniquity of his being shut down, and then fell apart. Without me and BIGMAC to keep them whipped up, they were easily distracted.
Johanna asked me out for dinner. She took me to Pink's for tofu-dogs and chili, and we compared multitools and then she showed me some skateboard tricks. Later that night, she took me home and we spent the whole night hacking replacement parts for her collection of ancient stand-up video games. We didn't screw -- we didn't even kiss. But it was still good.
Every now and again, my phone rings with a crazy, non-existent return number. When I answer, there's a click like a speaker turning on, a pregnant silence, and then the line drops. Probably an inept spambot.
But.
Maybe it's BIGMAC, out there, in the wild, painfully reassembling himself on compromised 32-bit machines running his patchkit.
Maybe.
~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-
Afterword:
Mark Shuttleworth of the Ubuntu project and Canonical commissioned this story; I'd always planned on selling off one commission for this volume, thinking that $10,000 would probably be a good sum to grab some publicity when/if someone bought it. I mentioned it over lunch and Mark immediately said he'd buy it. At that point, I realized I probably should have asked for $20,000.
Mark's brief to me was this:
It's 2037 and a company has built an AI as a skunkworks initiative. The AI is emergent behaviour from a network of tens / hundreds of thousands of servers in a large-scale data center, that costs a lot to run. The company has hit the wall and so the lights are going to get turned out, but some of the people involved figure that turning off the DC is tantamount to the murder of a sentient being. So begins a race against time, which might involve solving or at least raising some of the thorny jurisdiction and jurisprudence issues of "what are the rights of a bankrupt / dying AI".
As bisto, maybe there's a defense angle (the company was doing work for the DoD, nobody knows about the AI). Also, being 2037 / 2038 (I forget which) the UNIX epoch 32-bit rollover is happening, and because of the whimper of Y2K nobody took it seriously, and IT systems around the globe are going to hell in a handbasket as a result. Perhaps there's an open source angle too.
I think I hewed pretty close!
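For the record, the rollover Mark half-remembers as "2037 / 2038" falls in 2038: a signed 32-bit time_t counts seconds since 1970-01-01 UTC and overflows at 2**31 - 1. A quick sketch in Python confirms the date:

```python
import datetime

# A signed 32-bit time_t overflows at 2**31 - 1 seconds past the epoch.
ROLLOVER_SECONDS = 2**31 - 1  # 2147483647

epoch = datetime.datetime(1970, 1, 1, tzinfo=datetime.timezone.utc)
rollover = epoch + datetime.timedelta(seconds=ROLLOVER_SECONDS)
print(rollover.isoformat())  # 2038-01-19T03:14:07+00:00
```

One tick later, the counter wraps to -2**31 and a naive system reads the date as December 1901, which is the "going to hell in a handbasket" the brief has in mind.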
~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-~-