Edmonds studied Kat. “I’m surprised she’s still breathing on her own. Unfortunately, I expect that function to deteriorate. Even if it doesn’t, for her long-term care, we’ll need to establish a nasogastric tube to feed her and intubate her to keep her from aspirating.”

Monk shook his head, not denying her this care, but refusing to accept this diagnosis. “So she is conscious for the most part, but unable to move or communicate.”

“Some locked-in patients learn to speak through eye movements, but in your wife’s case, she’s only showing minimal spontaneous eye movement. Not enough, we believe, to actively communicate.”

Monk stumbled back to the chair, sat down, and took Kat’s hand. “What’s the prognosis? With time, can she recover?”

“You asked me to be blunt, so I will. There is no treatment or cure. It is very rare for patients to recover or regain significant motor control. At best, some minimal arm and leg control, maybe improved eye movements.”

He squeezed her fingers. “She’s a fighter.”

“Still, ninety percent of locked-in patients die within four months.”

The doctor’s phone chimed from a holster on his belt. He tilted the screen to read the text message. “I must go,” he mumbled, distracted, and headed toward the door. “But I’ll write up orders for her intubation.”

Alone again, Monk lowered his forehead to the back of her hand. He pictured the ruins of Gray’s home, the broken crystal angel. She had fought fiercely to protect the girls. And he would do everything he could to get them back.

But in the meantime . . .

“Baby, you keep fighting,” he whispered to her. “This time, for yourself.”


2:02 A.M.

“How could that be?” Gray asked, dumbfounded by Painter’s claim. “Are you suggesting an AGI has already been created? That one already exists or existed?”

Painter lifted a palm toward Gray. “It’s possible. Back in the eighties, a researcher named Douglas Lenat created an early AI called Eurisko. It learned to create its own rules, adjusted to mistakes, even began to rewrite its own code. Most surprising of all, it began to break rules it didn’t like.”

Gray frowned. “Really?”

Painter nodded. “Lenat even tested his program against expert players of a military game. His AI defeated every opponent, three years in a row. During the later years, players changed the rules without informing the developer to better handicap the game in their favor. Still, Eurisko soundly defeated them. Following this, Lenat grew concerned at what his creation was becoming, how it was self-improving. Ultimately, he shut it down and refused to release its code. To this day, it’s still locked up. Many believe the program was on its way to becoming an AGI, all on its own.”

A trickle of dread traced through Gray. “Still, true or not, you believe there’s no stopping this from happening again in the near future.”

“That’s the consensus of the experts. But that’s not their ultimate fear.”

Gray could guess what scared them. “If the creation of an AGI is inevitable, then an ASI will not be far behind.” Before Kowalski could ask, he added, “ASI stands for artificial super intelligence.”

“Thanks for spelling that out,” Kowalski said sourly. “But what exactly is that?”

“Ever see the movie Terminator?” Gray asked. “Where robots destroy mankind in the future? That’s an ASI. A supercomputer that outgrows mankind and decides to get rid of us.”

“But it’s no longer science fiction,” Painter added. “If an AGI is right around the corner, most believe it will not stay a general intelligence for long. Such a self-aware system will seek to improve itself—and rapidly. Researchers call it a hard takeoff or intelligence explosion, where an AGI quickly grows into an ASI. With the speed of computer processing, it could be a matter of weeks, days, hours, if not minutes.”

“And then it’ll try to kill us?” Kowalski asked, sitting up.

Gray knew this was a possibility. We could be the creators of our own end.

“There is no saying for sure,” Painter cautioned. “Such a superintelligence would certainly be beyond our comprehension and understanding. We’d be little more than ants before a god.”

Gray had enough of these speculations. This threat could wait. He had more pressing and immediate concerns. “What does any of this have to do with the attack, with finding Seichan and Monk’s kids?”

Painter nodded, acknowledging Gray’s impatience. “I was about to get to that. Like I said from the beginning, DARPA has been pouring money into various projects. And by money, I mean billions. Last year’s budget devoted sixty million to machine-learning programs, fifty to cognitive computing, and four hundred to other projects. But what is significant—what is germane to the matter at hand—is the hundred million sent out this year under the category ‘Classified Programs.’”

“In other words,” Gray said, “covert projects.”

“DARPA has been secretly funding a handful of ventures that are not only close to developing the first AGI, but whose research is aimed at a specific goal.”

“And what’s that?”

“To make sure the first AGI to arrive on this planet is a benevolent one.”

Kowalski snorted with derision. “So, Casper the friendly robot.”

“More like ethical,” Gray corrected, well aware of this line of pursuit. “A machine that won’t try to kill us when it ascends to godhood.”

“DARPA has made this a priority,” Painter emphasized. “As have many other research groups. The Machine Intelligence Research Institute. The Center for Applied Rationality. But these organizations are vastly outnumbered by those pursuing the golden ring of an ordinary AGI.”

“That seems stupid,” Kowalski said.

“No, it’s simply cheaper. It’s much easier and faster to build the first AGI than it is to engineer the first safe AGI.”

“And with a prize this valuable,” Gray said, “caution takes a backseat to speed.”

“Knowing that, DARPA has been funding and nurturing talented individuals and projects, those that show promise of creating a friendly AGI.”

Gray sensed Painter was finally getting to his point. “And one of these programs has some bearing on what happened tonight?”

“Yes. A promising project at the University of Coimbra in Portugal.”

Gray frowned. Why did that sound familiar?

Painter reached over to the computer on his desk, tapped a few buttons, and brought up a video feed onto one of the wall monitors. The footage revealed a tabletop view into a stone room. Rows of books filled shelves to either side. A group of women stirred around the table, staring straight into the camera. Lips moved, but there was no sound.

The posturing struck Gray as familiar. He guessed the feed came from a computer’s built-in camera. It appeared the women were studying something on the monitor in that stone room.

“This footage was taken the night of December twenty-first,” Painter said.

Again, something nagged at Gray. The date. The location. Before he could dredge it up, one of the women leaned closer. He recognized her and gasped. He stood up and crossed to the screen.

“That’s Charlotte Carson,” he said, already guessing what would happen next.

“U.S. ambassador to Portugal. She headed a network of women scientists. Bruxas International. The group funded hundreds of female researchers around the world through grants, fellowships, and awards. To accomplish this goal, Bruxas was self-supported for a long time, mostly through the largesse of two founding members—Eliza Guerra and Professor Sato—who were from old and new money respectively. But even their pockets only went so deep. In order to help more women, the group sought out additional support, collecting capital from corporations and government agencies.”

Gray glanced over to Painter. “Let me guess. Including DARPA.”

“Yes, but only to finance a specific handful of their grant recipients. Like one woman’s project called Xénese. Or in English, Genesis.”

“One of DARPA’s friendly AGI projects.”

Painter nodded. “Only Dr. Carson knew of DARPA’s interest in this project. She was sworn to secrecy. Not even the young woman running the program, Mara Silviera—a veritable genius—knew of our involvement. That’s significant.”

“Why?”

“Watch.”

By now Kowalski had joined Gray at the screen. Gray knew what was about to happen, but clearly Kowalski did not. As a group of robed and blindfolded men burst into the room, Kowalski swore. The big man took a step back when the gunfire started. As the women’s bodies crashed to the stone floor, he turned away.

“Motherfuckers,” Kowalski mumbled.

Gray agreed with his characterization, but he kept staring. Charlotte Carson slumped to the floor, mortally wounded, blood pooling under her. Still, her face stared toward the camera, her brow bunched with confusion.

“What is she staring at?” Gray mumbled.

Answering him, Painter zoomed the view to a tiny corner of the screen. Focused on the horror of the attack, Gray had failed to note the small window open there. Painter replayed the last of the footage. A symbol of a pentagram filled the wall monitor. It started to spin wildly, then suddenly broke apart, leaving a single symbol glowing on the screen.

[Symbol illustration: designed by the author]
