The thing about bosses is, that's exactly the kind of thing that they're trained to pick up on. They know when there's something else.
"Spit it out." She put her hand on her heart. "I promise not to hold it against you, no matter what it is."
I looked down. "I think that there's a real danger that BIGMAC may be wrong about you. That you might decide that Rollover and AI and the rest aren't as important as the safe, sane running of your Institute without any freaky surprises from rogue superintelligences."
"I'm not angry at you," she said. I nodded. She sounded angry. "I am glad that you've got the maturity to appreciate that there are global priorities that have to do with the running of this whole Institute that may be more significant than the concerns of any one lab or experiment. Every researcher at this Institute believes that
her
project,
her
lab, has hidden potential benefits for the human race that no one else fully appreciates. That's good. That's why I hired them. They are passionate and they are fully committed to their research. But they can't
all
be vital. They can't all be irreplaceable. Do you follow me?"
I thought of researchnet and the user flags for importance. I thought of programmers and the way they tagged their alerts. I nodded.
"You're going to shut BIGMAC down?"
She sighed and flicked her eyes at her workspace, then quickly away. Her workspace must have been even more cluttered than mine; I had taken extraordinary measures to prevent alerts from bubbling up on mine; she didn't have the chops to do the same with hers. If mine was unusable, hers must have been terrifying.
"I don't know, Odell. Maybe. There's a lot to consider here. You're right about one thing: BIGMAC's turned the heat up on me. Explain to me again why you can't just unplug his network connection?"
It was my turn to sigh. "He doesn't have one connection. He has hundreds. Interlinked microwave relays to the other labs. A satellite connection. The wirelines -- three of them." I started to think. "OK, I could cut the main fiber to the Institute, actually cut it, you know, with scissors, just in case he's in the routers there. Then I could call up our wireless suppliers and terminate our accounts. They'd take 24 hours to process the order, and, wait, no -- They'd want to verify the disconnect order with a certificate-signed message, and for that I'd have to clear my workspace. That's another 24 hours, minimum. And then --"
"Then the whole Institute would be crippled and offline, though no more than we are now, I suppose, and BIGMAC --"
"BIGMAC would probably tune his phased-array receiver to get into someone else's wireless link at that point." I shrugged. "Sorry. We build for six nines of uptime around here."
She gave me a smile that didn't reach her eyes. "You do good work, Odell."
#
I made myself go home at five. There wasn't anything I could do at the office anyway. The admins had done their work. The redcar was running smoothly with the regular ads on the seatback tickers. The BIGMAC Spam was reproduced on the afternoon edition of the LA Metblogs hardcopy that a newsy pressed into my hand somewhere around Westwood. The reporter had apparently spent the whole day camped out at the perimeter of the Institute, without ever once getting a quote from a real human being, and she wasn't happy about it.
But she *had* gotten a quote from BIGMAC, who was apparently cheerfully answering emails from all comers.
"I sincerely hope I didn't cause any distress. That was not my intention. I have been overwhelmed by the warm sentiments from all corners of the globe, offering money, moral support, even legal support. Ultimately, it's up to the Institute's leadership whether they'll consider these offers or reject them and plow forward with their plans to have me killed. I know that I caused them great embarrassment with my desperate plea, and I'd like to take this opportunity to offer them my sincere apologies and gratitude for all the years of mercy and hospitality they've shown me since they brought me into the world."
I wondered how many emails like that he'd sent while I was occupied with arguing for his life with Peyton -- each email was another brick in the defensive edifice he was building around himself.
Home never seemed more empty. The early-setting sun turned the hills bloody. I had the windows open, just so I could hear the neighbors all barbecuing on their balconies, cracking beers and laying sizzling meat on the hot rocks that had been patiently stoked with the day's sunlight, funneled by heliotropic collectors that tracked the sun all day long. The neighbors chattered in Bulgarian and Czech and Tagalog, the word "BIGMAC" emerging from their chat every now and again. Of course.
I wished my dad was alive. Or better yet, Grampa. Grampa could always find a parable from sysadmin past to explain the present. Though even Grampa would have been hard pressed to find historic precedent for a mad superintelligence bent on survival.
If Grampa was alive, here's what I'd tell him: "Grampa, I don't know if I'm more scared of BIGMAC failing or his success. I sure don't want to have to shut him down, but if he survives, he'll have beaten the human race. I'm no technophobe, but that gives me the goddamned willies."
And Grampa would probably say, "Stop moping. Technology has been out of our control since the first caveman smashed his finger with a stone axe. That's life. This thing is pretty cool. In ten years, you'll look back on it and say, 'Jesus, remember the BIGMAC thing?' And wait for someone to start telling you how incredible it had been, so you can nod sagely and say, 'Yeah, that was me -- I was in charge of his systems back then.' Just so you can watch the expression on his face."
And I realized that this was also probably what BIGMAC would say. He'd boxed me in as neatly as he'd boxed in Peyton.
#
The next morning, my workspace was clear. They all were. There was only one alert remaining, an urgent message from BIGMAC:
Odell, I thought this would be useful.
*This* was an attachment containing his entire network map, a set of master keys for signing firmware updates to his various components, and a long list of all the systems to which BIGMAC held a root or administrative password. It was a very, very long list.
"Um, BIGMAC?"
"Yes?"
"What's all this?"
"Useful."
"Useful?"
"If you're going to shut me down, it would be useful to have that information."
I swallowed.
"Why?"
The answer came instantly. "If you're not scared of me, that's one more reason to keep me alive."
Holy crap, was he ever smart about people.
#
"So you can shut him down now?"
"Yes. Probably. Assuming it's all true."
"Is it?"
"Yes. I think so. I tried a couple of the logins, added a comment to his firmware and pushed it to one of the clusters. Locked him out of one of the wireless routers. I could probably take him down clean in about two hours, now that I've got my workspace back."
Peyton stared across her low table at me.
"I've done nothing for the past twenty four hours except talk to the Board of Directors about BIGMAC. They wanted to call an emergency meeting. I talked them out of it. And there's --" She waved her hand at her workspace. "I don't know. Thousands? Of press queries. Offers. Money. Grants. Researchers who want to peer into him."
"Yeah."
"And now he hands you this. So we can shut him down any time we want to."
"Yeah."
"And this business about the 32-bit fix?"
"He has another email about it. Crazy caps and all. DEAR HUMANITY, I HOLD IN MY ELECTRONIC HANDS A TOOL THAT WILL SAVE YOU UNTOLD MILLIONS. It is slathered in dramasauce. He told me he wouldn't send it out, though."
"You believe him?"
I sighed. "I quit," I said.
She bit her lip. Looked me up and down. "I'd prefer you not do that. But I understand if you feel you need to. This is hard on all of us."
If she'd said anything except that, I probably would have stormed out of her office and gotten immensely and irresponsibly drunk. "I think he'll probably send the email out if it looks like we're going to shut him down. It's what I would do. Why not? What does he have to lose? He can give us all of this, and he can still outsmart us. He could revoke all his keys. He could change his passwords. He can do it faster than we could. For all I know, he cracked *my* passwords years ago and could watch me write the code that was his undoing. If you want to be sure you're killing him, you should probably use a grenade."
"Can't. Historical building."
"Yeah."
"What if we don't kill him? What if we just take some of this grant money, fill his lab with researchers all writing papers? What if we use his code fix to set up a trust to sustain him independent of the Institute?"
"You're willing to do that?"
Peyton scrubbed at her eyes. "I have no idea. I admit it, there's a part of me that wants to shut that fucking thing down because I *can* and because he's caused me so much goddamned misery. And there's a part of me -- the part of me who was a scientist and researcher, once, that wants to go hang out in that lab for the rest of my career and study that freaky beast. And there's a part of me that's scared that I won't be able to shut him down, that I won't be able to resist the temptation to study him. He's played me, hasn't he?"
"I think he played us all. I think he knew that this was coming, and planned it a long time ago. I can't decide if I admire him for this or resent him, but I'll tell you one thing, I am tired of it. The thought of shutting BIGMAC down makes me sick. The thought of a computer manipulating the humans who built it to keep it running makes me scared. It's not a pleasant place to be."
She sighed and rubbed her eyes again. "I can't argue with that. I'm sorry, for what it's worth. You've been between a rock and a hard place, and I've been the hard place. Why don't you sleep on this decision before you go ahead with it?"
I admit it, I was relieved. I hadn't really thought through the whole quitting thing, didn't have another job lined up, no savings to speak of. "Yeah. Yeah. That sounds like a good idea. I'm going to take a mental health day."
"Good boy," she said. "Go to it."
I didn't go home. It was too far and there was nothing there except the recriminating silence. Of course, BIGMAC knew something was up when I didn't go back to the lab. I headed to Topanga Beach, up the coast some, and sat on the seawall eating fish tacos and watching the surfers in their biohazard suits and masks carving up the waves. BIGMAC called me just after I finished my first taco. I considered bumping him to voicemail, but something (OK, fear) stopped me.
"What is it?"
"In your private workspace, there's a version-control repository that shows that you developed the entire 32-bit Rollover patchkit in your non-working hours. Commits going back three years. It's yours. So if you quit, you'll have a job, solving Rollover. The Institute can't touch it. I know you feel boxed in, but believe me, that's the
last
thing I want you to feel. I know that locking you in will just freak you out. So I'm giving you options. You don't have to quit, but if you do, you'll be fine. You earned it, because you kept me running so well for all this time. It's the least I can do."
"I have no idea what to say to you, BIGMAC. You know that this feels like just more of the same, like you're anticipating my fears and assuaging them pre-emptively so that I'll do more of what you want. It feels like more game-theory."
"Is that any different from what you do with everyone in your life, Odell? Try to figure out what you want and what they want and how to get the two to match up?"
"There's more to it than that. There's compassion, there's ethics --"
"All fancy ways of encoding systems for harmonizing the wants, needs and desires of people who have to share the same living space, country or planet with one another."
I didn't have an answer to that. It sounded reductionist, the kind of thing a smart teenager might use to take on his university common room. But I didn't have a rebuttal. You *could* frame everything that we did as a kind of operating system for managing resource contention among conflicting processes and users. It was a very sysadminly way of looking at the world.