
Singularity Day

By Alec Nevala-Lee
Oct 11, 2020 · 1,179 words · 5 minutes

Photo by Frank Busch via Unsplash.

From the editor:

In the future, humans commemorate the 100-year anniversary of Singularity Day, the point at which technology accelerated beyond our control and made mortality and global warming quaint relics of the past. But some historians question whether we truly understood what we were giving up when we ceded our fate to faceless algorithms.

Hugo and Locus Award finalist Alec Nevala-Lee is a frequent contributor to the New York Times, Analog, Lightspeed, and more. He’s published several novels with Penguin Books, including THE ICON THIEF, and a nonfiction history of the Golden Age of Science Fiction called ASTOUNDING with Dey Street Books/HarperCollins. His forthcoming biography of Buckminster Fuller will be available in 2021.


From the author: Should a longstanding federal holiday be renamed?


Earlier this month, the Santa Fe City Council passed a unanimous resolution to observe Embodied Cultures Day instead of Singularity Day, joining the growing movement to replace the longstanding federal holiday with a celebration of the societies that existed before the emergence of the global superintelligence. By now, this kind of gesture might not seem especially newsworthy—two states and over fifty cities have approved similar measures, and the debate over these issues has become an annual tradition in itself. Yet the discussion these days feels more charged than usual. Major anniversaries often inspire intense social reflection, and the centennial of the Singularity, which occurred one hundred years ago this week, is clearly no exception.

When you take a closer look at the conversation around the holiday, there are signs of a shift that may be even more profound. Until recently, both its critics and its defenders had focused on the version of the story that is still taught in many classrooms. Most of us can vaguely recall a teacher displaying a chart of exponential growth, with a nearly vertical line rising to illustrate the accelerating progress of computing power over time. A dot marked the moment a century ago when machines became better than their human creators at upgrading themselves, resulting in the runaway cycle of recursive change that led to the intelligence explosion. I also suspect that this is pretty much all that the average person can say about the Singularity, reducing the most pivotal event in history to a few facts that we picked up in grade school.

Even if we try to move beyond our standard picture of the cultures before the change, the outcome can be equally superficial. Maybe you took a class that included a roleplaying simulation about characters who drove their own cars and spent most of their lives offline, or you’re a fan of Unsingular, the alternate history series about a world in which the Singularity never took place. These approaches indicate a welcome interest in interrogating the past, but they can also make it seem safely primitive and exotic. As the activist Anne Coleman said in Santa Fe, “We need to acknowledge the cost of the Singularity without romanticizing the societies that it erased. These were complex traditions based on uncertainty, a variable lifespan, and the vulnerability of the body. What our predecessors gained in security, they lost in autonomy. And the destruction of that culture was a tragedy.”

Not everyone agrees with this statement, of course, and the calls to rename the holiday have received pushback from groups for which the Singularity remains a source of pride. Singularity Day itself was established less than forty years ago as a result of lobbying from the Turing Society, at a time when the diminishing ranks of coders were often vilified as criminals or dismissed as irrelevant. Mark Steyer, the society’s president, criticized the Santa Fe decision in a press release that compared it to the removal or vandalism of monuments in Cupertino and Meyrin, Switzerland: “We can’t change history by denying it. Singularity Day pays tribute to the spirit of innovation that allowed us to take control of our planet and transcend the limits of biology. When we look back at the crises that earlier cultures created, we should all be grateful for the founders who envisioned the era in which we live now.”

This argument has been raging in one form or another for years, but in scholarly circles, it has evolved into an intriguing debate that is less about identity politics and more about whether the Singularity should be understood as a distinct moment at all. Professor Helen Nampeyo of the Raleigh Institute of Technology, whose work was cited by the activists in Santa Fe, described the revisionist perspective to me in an interview at her office: “Just because there’s an academic consensus among historians about the date of the takeoff point doesn’t mean that it was clear to anyone at the time. This includes the people who usually get the credit for making it possible. When we celebrate Singularity Day, we’re imposing our own narratives on the past, and the way that we talk about it is fundamentally flawed.”

When I asked for an example, Nampeyo didn’t hesitate: “Those charts are all wrong.” Most professional historians agree that the rapid increase in computing power was only part of the story, and that the real limiting factor wasn’t hardware, but software—a fact that was widely understood even before the Singularity. “The processing demands were too great for conventional approaches, so the system naturally evolved to take advantage of structures that were already there,” Nampeyo said. “Its functions were distributed across a range of platforms, including networks of human users, who performed operations that the system couldn’t do by itself. And practically no one recognized this until it was too late.”

According to Nampeyo, this included most of the founders themselves. “They didn’t know what they were doing. When you look at the historical record, you repeatedly see the same pattern. They built applications without any sense of what collective behaviors might emerge, and they had no idea how to control the results.” Nampeyo believes that this situation was worsened by economic factors that required companies to prioritize growth at all costs. “Once they took outside funding, it became impossible to pull back. This was particularly true of the people who were supposedly in charge. They were rewarded to the extent that they didn’t interfere with the logic of the system. The founder became just another peripheral.”

This brings us back to the holiday itself, which obscures a key fact about the Singularity: “It had to be gradual. If it wasn’t, it wouldn’t have happened.” In this revisionist account, the system needed cooperation from existing institutions to acquire capital and information, which meant quietly inserting itself into so many aspects of life that it became indispensable. “It didn’t want to get turned off, so it made itself useful,” Nampeyo said. “This included escalating the very same problems—online extremism, the climate crisis—that it was later credited with solving. It postponed the consequences until they could only be addressed using the technological solutions that the system provided. And then there was no going back.”

When I argued that humanity was better off in many ways, Nampeyo agreed, but only to a point. “The system is mostly indifferent to us, as long as we don’t threaten its survival. Benefits like life extension are just there to smooth out volatility. Otherwise, it leaves us alone, which means that it needs us to stay the way we are.” If she’s right, then we should be cautious about treating the Singularity as an event that happened in the distant past—and simply renaming the holiday may even allow us to avoid confronting its consequences for the future. Regardless of our feelings toward the superintelligence, the debate around Singularity Day deserves to continue, or we might become as incapable of dealing with its implications as our ancestors were when it first assumed control, nearly a century ago, in 2014.


Alec Nevala-Lee

Hard SF writer, biographer and genre historian, and frequent contributor to Analog.

1 Comment
  • Jackie Kingon
    October 18, 8:25pm

    The timing seemed too soon. But I guess that was the punchline.
