What Passes for Sacred
Somewhere in the contemporary cacophony of religious and political activism is a group of nontheistic Satanists who have gone so far as to file lawsuits intended to remind us of the separation of church and state. And if it's also true that there arose, somewhere, a congregation of nontheistic Friends, i.e., godless Quakers, maybe that, too, makes a certain point. Here we might consider what constitutes a religion. While atheism is not a religion but, rather, a religious status, and we can reconcile for ourselves the pretense of atheistic Satanists and Christians, there is another circumstance to consider. Compared to pulling God out of an existing religious institution, we might ask what elevates a new religion, and the variation on the theme, here, is that the new religion appears to be godless. As a legal question, it is possible; anthropologically, it is not at all a confusing proposition; politically, it is, of course, a tempestuous question mark. Of the rising tescreality, two questions stand out: One is the difference between God in particular and sacredness more generally. The other is to wonder what passes for sacred.
To be clear, the rising tescreality is not actually a religion; its creed remains approximately mysterious, its code irresolute, and its cult variable. Nonetheless, there comes a point at which it's hard not to notice that the whole thing is a megalomaniacal pretense asserting human, not divine, gatekeepers to decide who is worthy.
Charles Stross↱, for Scientific American:
Billionaires who grew up reading science-fiction classics published 30 to 50 years ago are affecting our life today in almost too many ways to list: Elon Musk wants to colonize Mars. Jeff Bezos prefers 1970s plans for giant orbital habitats. Peter Thiel is funding research into artificial intelligence, life extension and "seasteading." Mark Zuckerberg has blown $10 billion trying to create the Metaverse from Neal Stephenson's novel Snow Crash. And Marc Andreessen of the venture capital firm Andreessen Horowitz has published a "techno-optimist manifesto" promoting a bizarre accelerationist philosophy that calls for an unregulated, solely capitalist future of pure technological chaos.
These men collectively have more than half a trillion dollars to spend on their quest to realize inventions culled from the science fiction and fantasy stories that they read in their teens. But this is tremendously bad news because the past century's science fiction and fantasy works widely come loaded with dangerous assumptions.
SF is a profoundly ideological genre—it's about much more than new gadgets or inventions. Canadian science-fiction novelist and futurist Karl Schroeder has told me that "every technology comes with an implied political agenda." And the tech plutocracy seems intent on imposing its agenda on our planet's eight billion inhabitants.
We were warned about the ideology driving these wealthy entrepreneurs by Timnit Gebru, former technical co-lead of the ethical artificial intelligence team at Google and founder of the Distributed Artificial Intelligence Research Institute (DAIR), and Émile Torres, a philosopher specializing in existential threats to humanity. They named this ideology TESCREAL, which stands for "transhumanism, extropianism, singularitarianism, cosmism, rationalism, effective altruism and longtermism." These are separate but overlapping beliefs in the circles associated with big tech in California. Transhumanists seek to extend human cognition and enhance longevity; extropians add space colonization, mind uploading, AI and rationalism (narrowly defined) to these ideals. Effective altruism and longtermism both discount relieving present-day suffering to fund a better tomorrow centuries hence. Underpinning visions of space colonies, immortality and technological apotheosis, TESCREAL is essentially a theological program, one meant to festoon its high priests with riches.
It is important to note a related term circulating among tescrealists, effective accelerationism; when you see "e/acc" in social media usernames or profile bios, this is what it refers to.
Generally speaking, what binds the acronym together is not any overarching creed but a chain of overlapping arguments among a range of influential people, and its spillover into the rank and file. For many, the various elements are things seen or heard among friends working tech jobs. Along the way, though, these elements are already having their effects. Caroline Ellison, for instance, convicted for her role in the FTX crypto exchange collapse, was a rationalist↗ of sorts; that is, she was apparently a proponent of Eliezer Yudkowsky's rationalist screed, a 660,000-word Harry Potter fan fiction story.
Yes, that Yudkowsky:
At the heart of TESCREALism is a "techno-utopian" vision of the future. It anticipates a time when advanced technologies enable humanity to accomplish things like: producing radical abundance, reengineering ourselves, becoming immortal, colonizing the universe and creating a sprawling "post-human" civilization among the stars full of trillions and trillions of people. The most straightforward way to realize this utopia is by building superintelligent AGI.
But then, as the AGI finish line got closer, some began to worry that the whole plan might backfire: AGI could actually turn on its creators, destroying humanity and along with it, this utopian future. Rather than ushering in a paradise among the stars, an AGI built "under anything remotely like the current circumstances" would kill "literally everyone on Earth," to quote Yudkowsky. Others in the TESCREAL neighborhood, like Andreessen, disagree, arguing that the probability of doom is very low. In their view, the most likely outcome of advanced AI is that it will drastically increase economic productivity, give us "the opportunity to profoundly augment human intelligence" and "take on new challenges that have been impossible to tackle without AI, from curing all diseases to achieving interstellar travel." Developing AI is thus "a moral obligation that we have to ourselves, to our children and to our future," writes Andreessen.
(Torres↱)
And all of this passes as sensible, everyday talk; of course, literary wordplay is insufficient to describe this game of gods and monsters. The "moral obligation" Andreessen suggests presents an interesting question: What does it mean when translated from pitch to practice? The acronym comes from an early draft of Gebru and Torres (2024)↱:
What ideologies are driving the race to attempt to build AGI? To answer this question, we analyze primary sources by leading figures investing in, advocating for, and attempting to build AGI. Disturbingly, we trace this goal back to the Anglo-American eugenics movement, via transhumanism. In doing this, we delineate a genealogy of interconnected and overlapping ideologies that we dub the "TESCREAL bundle," where the acronym "TESCREAL" denotes "transhumanism, Extropianism, singularitarianism, (modern) cosmism, Rationalism, Effective Altruism, and longtermism".
These ideologies, which are direct descendants of first-wave eugenics, emerged in roughly this order, and many were shaped or founded by the same individuals.
Tescreal easily reads like any number of cinematic adventures, a fanciful genre that elevates industry leaders into supervillains. But it is in its question of moral obligation that it treads within the scope of the word "religion"↗; cf. Armstrong: "The origins of the Latin religio are obscure … but had imprecise connotations of obligation and taboo". And if, per longtermism, the failure of the transhumanist transition would be "profoundly wrong" (Gebru and Torres, 7), we find a sketch of a boundary marker beginning to delineate what is sacred.
____________________
Notes:
Armstrong, Karen. Fields of Blood: Religion and the History of Violence. New York: Alfred A. Knopf, 2014.
Gebru, Timnit, and Émile P. Torres. "The TESCREAL bundle: Eugenics and the promise of utopia through artificial general intelligence". First Monday, v. 29, n. 4. April 2024. FirstMonday.org. 19 January 2025. https://firstmonday.org/ojs/index.php/fm/article/view/13636/11606
Stross, Charles. "Tech Billionaires Need to Stop Trying to Make the Science Fiction They Grew Up on Real". Scientific American. 20 December 2023. ScientificAmerican.com. 19 January 2025. https://www.scientificamerican.com/...ake-the-science-fiction-they-grew-up-on-real/
Torres, Émile P. "The Acronym Behind Our Wildest AI Dreams and Nightmares". TruthDig. 15 June 2023. TruthDig.com. 19 January 2025. https://www.truthdig.com/articles/the-acronym-behind-our-wildest-ai-dreams-and-nightmares/