Lock In

Chapter Thirteen

 

A LIGHT PING ECHOED through my cave. I recognized the tone as a noninvasive hail, a call that would be delivered if the recipient was conscious, but not otherwise. Hadens, like anyone else, hated to be woken up by random calls in the middle of the night. I pulled up a window to see who it was. It was Tony.

 

I cleared the call, audio only. “You’re up late,” I said.

 

“Deadline on a gig,” Tony replied. “I had a hunch you might have been lying when you said you wanted to sleep.”

 

“I wasn’t lying,” I said. “I just couldn’t sleep.”

 

“What are you doing instead?”

 

“Trying to figure out a whole bunch of shit that unfortunately I can’t tell you much about. And you?”

 

“At the moment, compiling code. Which I can tell you about but which I don’t imagine you care about,” Tony said.

 

“Nonsense,” I said. “I am endlessly fascinated.”

 

“I’ll take that as a challenge,” Tony said, and then a data panel popped up with a button on it. “That’s a door code. Come on over.”

 

Tony was offering me an invite to his liminal space, or at the very least a public area of it.

 

I hesitated for a second. Most Hadens were protective of their personal spaces. Tony was offering me an intimacy of sorts. I hadn’t known him that long.

 

But then I decided I was overthinking it and touched the button. It expanded into a doorframe and I stepped through.

 

Tony’s workspace looked like a high-walled retro video game cube, all black space with the walls defined by neon blue lines, off of which branched geometric patterns.

 

“Don’t tell me, let me guess,” I said. “You’re a Tron fan.”

 

“Got it in one,” Tony said. He was at a standing desk, above which a neon-lined keyboard hovered. Beside that was a floating screen of code, with a progress bar slowly filling, marking the time until Tony’s code finished compiling. Above him, rotating slowly, was a swirl of lines, apparently haphazardly connected.

 

I recognized them immediately.

 

“A neural network,” I said.

 

“Also got that in one,” Tony said. His self-image was, like most people’s, a version of his physical self: fitter, more toned, and stylishly clothed. “If you really want to impress me, you’ll tell me the make and model.”

 

“I haven’t the slightest idea,” I admitted.

 

“Amateur,” Tony said, lightly. “It’s a Santa Ana Systems DaVinci, Model Seven. It’s their latest-released iteration. I’m coding a software patch to it.”

 

“Should I be seeing any of this?” I asked, pointing at the code in the display. “I would guess this is all supposed to be confidential.”

 

“It is,” Tony said. “But you don’t look like much of a coder to me—no offense—and I’m willing to guess that the DaVinci up there looks mostly like artfully arranged spaghetti to you.”

 

“That it does.”

 

“Then we’re fine,” Tony said. “And anyway it’s not like you can record anything in here.” Which was true. In personal liminal spaces, visitor recording was turned off by default.

 

I looked up at the model of the neural network hovering over Tony’s head. “It’s strange, isn’t it?” I said.

 

“Neural networks in general, or the DaVinci Seven in particular?” Tony asked. “Because confidentially speaking the D7s are a pain in the ass. Their architecture is kind of screwy.”

 

“I meant in general,” I said, and looked up again. “The fact we’ve got one of these sitting in our skulls.”

 

“Not just in our skulls,” Tony said. “In our brains. Actually in them, sampling neural activity a couple thousand times a second. Once they’re in, you can’t get them out. Your brain ends up adapting to them, you know. If you tried to remove one, you’d end up crippling yourself. More than we already are.”

 

“That’s a cheerful thought.”

 

“If you want really cheerful thoughts, you should worry about the software,” Tony said. “It governs how the networks run, and it’s all really just one kludge after another.” He pointed at his code. “The last software update Santa Ana put out accidentally caused the gallbladder to get overstimulated in about half a percent of the operators.”

 

“How does that happen?”

 

“Unexpected interference between the D7 and the brain’s neural signals,” Tony said. “Which happens more often than it should. They run all the software through brain simulators before they upload it into customers, but real brains are unique, and Haden brains are even more so because of how the disease messes with the structure. So there’s always something unexpected going on. This patch should fix the problem before it causes gallstones. Or at least if gallstones happen, they won’t be traced back to the neural network.”

 

“Wonderful,” I said. “You’re making me glad it’s not a Santa Ana network in my head.”

 

“Well, to be fair, it’s not just Santa Ana,” Tony said. He nodded at me. “What do you have in there?”

 

“It’s a Raytheon,” I said.

 

“Wow,” Tony said. “Old school. They got out of the neural network business a decade ago.”

 

“I didn’t need to hear that,” I said.

 

Tony waved it off. “Their maintenance is handled by Hubbard,” he said.

 

“Excuse me?” I said. I was momentarily shocked.

 

“Hubbard Technologies,” Tony said. “Lucas Hubbard’s first company, before he formed Accelerant. Hubbard doesn’t build networks—another Accelerant company does that—but Hubbard makes a lot of money off of maintaining the systems of companies who left the field after the first gold rush. He did a lot of the early coding and patching himself, if you believe his corporate PR.”

 

“Okay,” I said. The sudden intrusion of Hubbard into my head, literally as well as figuratively, had thrown me off.

 

“I’ve done work for Hubbard, too,” Tony said. “Just a couple of months ago, as a matter of fact. Trust me, they have their issues.”

 

“Do I want to know?” I asked.

 

“Suffered any colon spasms recently?”

 

“Uh,” I said. “No.”

 

“Then nothing you need to worry about.”

 

“Lovely.”

 

“I’ve worked with them all,” Tony said. “All the networks. The biggest issue isn’t neural interference, actually. It’s basic security.”

 

“Like people hacking into the neural networks,” I said.

 

“Yeah.”

 

“I’ve never heard of that happening.”

 

“There’s a reason for that,” Tony said. “First, the architecture of the neural networks is designed to be complex to make them hard to program on, and hard to access from outside. The D7 being a pain in the ass to deal with is a feature, not a bug. Every other network since the first iteration is designed that way too.

 

“Second, they hire people like me to make sure it doesn’t happen. Half my contracts are for white-hat incursions, trying to get into the networks.”

 

“And what do you do when you get in?” I asked.

 

“Me? I file a report,” Tony said. “With the first iteration of networks the hackers would run blackmail schemes. Fire up a series of gory pictures or put ‘It’s a Small World’ on a repeating loop until the victim paid to make it stop.”

 

“That sucks,” I said.

 

Tony shrugged. “They were dumb,” he said. “Honestly. A computer inside your brain? What the hell did they think was going to happen with that? They got serious about patching when some hacker from Ukraine started giving people arrhythmia just for kicks. That shit’s actual attempted first-degree murder.”

 

“I’m glad they fixed that,” I said.

 

“Well, for now,” Tony said. His code had compiled and he waved his hand to execute it. From above, the network pulsed. It wasn’t just a pretty image. It was an actual simulation of the network.

 

“What do you mean ‘for now’?” I asked.

 

“Think about it, Chris,” Tony said. He pointed at my head. “You’ve got what’s effectively a legacy system in your head. Its upkeep is currently being paid for out of the budget of the National Institutes of Health. When Abrams-Kettering goes into effect next Monday, the NIH will stop paying for upkeep once its current batch of contracts expires. Santa Ana and Hubbard aren’t updating and patching out of the goodness of their corporate hearts, you know. They get paid to do it. When that stops, either someone else is going to have to pay for it, or the updates stop coming.”

 

“And then we’re all screwed,” I said.

 

“Some people will be screwed,” Tony said. “I’ll be fine because this shit is my job and I can hack my own network. You’ll be fine because you can afford to hire someone like me to maintain your network. Our roommates will be fine because I like them and don’t want them to have spam piped into their brains against their will. And the middle-class Hadens will probably be able to pay for a monthly subscription of updates, which is something I know Santa Ana, at least, is already planning for.

 

“Poor Hadens, on the other hand, are kind of fucked. They’ll either get no updates, which will leave them vulnerable to software rot or hacking, or they’ll have to deal with some sort of update model that features, I don’t know, ads. So every morning, before they can do anything else with their day, they’ll have to sit through six goddamn advertisements for new threeps or nutritional powder or bags for their crap.”

 

“So, spam,” I said.

 

“It’s not spam if you agree to it,” Tony said. “They just won’t have much of a choice.”

 

“Swell.”

 

“It’s not just updates,” Tony said. “Think about the Agora. Most of us think of it as a magical free-floating space somewhere out there.” He gestured with his hands. “In fact it’s run out of an NIH server farm outside of Gaithersburg.”

 

“But it’s not on the chopping block,” I said. “There’d be a panic if it was.”

 

“It’s not being cut, no,” Tony said. “But I know the NIH is talking to potential buyers.” He pointed up at the neural network. “Santa Ana’s putting in a bid, Accelerant’s making one, GM’s in, and so is just about every Silicon Valley holding company.”

 

He shrugged. “Whoever eventually buys the server farm will probably have to promise to leave the character of the Agora unchanged for a decade or so, but we’ll see how much that’ll be worth. It’s going to be monthly access fees from there for sure. I don’t know how you’d do billboards in the Agora, but I’m pretty sure they’ll figure it out sooner than later.”

 

“You’ve thought about this a lot,” I said, after a minute.

 

Tony smiled, looked away, and made a dismissive wave. “Sorry. It’s a hobbyhorse of mine, I know. I’m not this humorless about most things.”

 

“It’s fine,” I said. “And it’s fine that you’re thinking about it.”

 

“Well, there’s also the side effect that once all these government contracts go kerplooey, my line of work is going to get tougher,” Tony said. “So this is not me being socially active out of the goodness of my own heart. I like to eat. Well, be fed nutritionally balanced liquids, anyway. The Hadens who are walking out this week are making the point that our world is about to be wildly disrupted, and the rest of America doesn’t really seem to give a shit.”

 

“You’re not part of the walkout, though,” I said.

 

“I’m inconsistent,” Tony said. “Or maybe I’m a coward. Or just someone who wants to bank as much money as he can now because he expects things to dry up. I see the wisdom of the walkout. I don’t see it as something I can do right now.”

 

“What about the march on the Mall?” I asked.

 

“Oh, I’ll definitely be going to that,” Tony said, and grinned. “I think we’ll all be going. Are you planning on it?”

 

“I’m pretty sure I’ll be working it,” I said.

 

“Right,” Tony said. “I guess this is a busy week for you.”

 

“Just a little.”

 

“Got thrown into the deep end, it looks like,” Tony said, looking back to his code. “You picked a hell of a week to start your gig.”

 

I smiled at that and looked up again at the pulsing neural network, thinking. “Hey, Tony,” I said.

 

“Yes?”

 

“You said a hacker gave people heart attacks.”

 

“Well, arrhythmia, actually, but close enough for government work,” Tony said. “Why?”

 

“Is it possible for a hacker to implant suicidal thoughts?” I asked.

 

Tony frowned at this for a minute, considering. “Are we talking general feelings of depression, leading to suicidal thoughts, or specific thoughts, like ‘Today I should eat a bullet’?”

 

“Either,” I said. “Both.”

 

“You could probably cause depression through a neural network, yeah,” Tony said. “That’s a matter of manipulating brain chemistry, which is something networks do already”—he pointed up at his network simulator—“although usually accidentally. The patch I’m doing now is designed to stop just that sort of chemistry manipulation.”

 

“What about specific thoughts?”

 

“Probably not,” Tony said. “If we’re talking about thoughts that feel like they’re originating from a person’s own brain. Generating images and noises that come from the outside is trivial—we’re both doing it right now. This room is a mutually agreed-upon illusion. But directly manipulating consciousness so that you make someone think they’re thinking a thought you give them—and then making them act on it—is difficult.”

 

“Difficult or impossible?” I asked.

 

“I never say ‘impossible,’” Tony said. “But when I say ‘difficult’ here I mean that as far as I’ve heard no one’s ever done it. And I don’t know how to do it, even if I wanted to, which I wouldn’t.”

 

“Because it’s unethical,” I prompted.

 

“Hell yes,” Tony said. “And also because I know that if I figured it out, someone else would too, because there’s always someone else smarter out there, who may not have ethics. And that would really mess with shit. It’s hard enough to believe in free will as it is.”

 

“So,” I said. “Really difficult but not actually impossible.”

 

“Really really really difficult,” Tony allowed. “But theoretically possible because, hey, it’s a quantum physics universe. Why do you ask, Chris? I sense this is not an entirely idle question.”

 

“What’s your work schedule look like?” I asked.

 

Tony nodded upward. “It looks like my patch is doing what it’s supposed to. Once I clean it up a bit, which should take less than an hour, I’ll send it off and then I’m free.”

 

“Have you ever done work for the federal government?”

 

“I live in Washington, D.C., Chris,” Tony said. “Of course I’ve done work for the government. I have a vendor ID and everything.”

 

“Do you have a security clearance?”

 

“I’ve done confidential work before, yes,” Tony said. “Whether on the level you seem to be thinking about is something I guess we’d have to find out.”

 

“I may have a job for you, then,” I said.

 

“Involving neural networks?”

 

“Yes,” I said. “Hardware and software.”

 

“When would you want me to start?”

 

“Probably tomorrow,” I said. “Probably, like, nine A.M.”

 

Tony smiled. “Well, then,” he said. “I should probably finish up what I’m doing so I can at least attempt to get some sleep.”

 

“Thanks,” I said.

 

“No,” Tony said. “Thank you. It’s not every day that a new housemate comes bearing work. That makes you officially my favorite housemate.”

 

“I won’t tell,” I said.

 

“No, go ahead and tell,” Tony said. “Maybe it will inspire a competition. That’ll work out for me. I could use the work.”