Warning: The posting below makes reference to what is considered to be A Dangerous Idea. It is so dangerous, it can (conceivably) doom you - and your future descendants - to an existence of suffering simply by reading it. Keep reading at your own peril.

Have you ever heard of the legend that is Roko's Basilisk? https://rationalwiki.org/wiki/Roko's_basilisk

I've been fascinated with the idea since I first learned of it. I am transfixed by the concept of an idea so dangerous that merely hearing about it can (in theory) endanger your future and even your descendants' futures.

Now that you're up to speed on the idea (and, incidentally, doomed), I'll draw the connection to this thread. (Or some other thread as determined by James R: not sure which of your threads to piggyback on.)

I did not make the connection between the Basilisk and God until I started reading even deeper into it and came across this re-interpretation, at the link above:

"Quite a lot of this article will make more sense if you mentally replace the words "artificial intelligence" with the word "God", and "acausal trade" with "prayer"."

In other words, Roko's Basilisk makes a functional stand-in for a God for the purposes of an atheist. An atheist who grants the premises of the Roko's Basilisk conjecture has a personal (self-serving) imperative to do as Pascal urges: throw as much of their resources as they can manage at bringing a sentient AI into existence.

Some further reading:

https://en.wikipedia.org/wiki/LessWrong
"Discussion of Roko's basilisk was banned on LessWrong for several years because Yudkowsky had stated that it caused some readers to have a nervous breakdown."

https://en.wikipedia.org/wiki/Pascal's_wager
"Pascal's Wager: Pascal argues that a rational person should live as though (the Christian) God exists and seek to believe in God.
If God does not actually exist, such a person will have only a finite loss (some pleasures, luxury, etc.), whereas if God does exist, he stands to receive infinite gains (as represented by eternity in Heaven) and avoid infinite losses (eternity in Hell)"

https://www.lesswrong.com/posts/6ddcsdA2c2XpNpE5x/newcomb-s-problem-and-regret-of-rationality
"Newcomb's Problem: Box A is transparent and contains a thousand dollars. Box B is opaque, and contains either a million dollars, or nothing. You can take both boxes, or take only box B. And the twist is that Omega has put a million dollars in box B iff Omega has predicted that you will take only box B."
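To make the Newcomb twist concrete, here is a minimal sketch (my own illustration, not from any of the linked posts) of the expected payoffs when Omega predicts your choice with some accuracy p. The function names and the accuracy parameter p are my assumptions; the dollar amounts come from the problem statement quoted above.

```python
def ev_one_box(p: float) -> float:
    # With probability p, Omega correctly predicted one-boxing,
    # so box B holds the $1,000,000. Otherwise box B is empty.
    return p * 1_000_000


def ev_two_box(p: float) -> float:
    # With probability p, Omega correctly predicted two-boxing,
    # so box B is empty and you get only box A's $1,000. With
    # probability (1 - p) Omega guessed wrong and you get both.
    return p * 1_000 + (1 - p) * 1_001_000


# Even a modestly reliable predictor makes one-boxing the better bet:
print(ev_one_box(0.9))  # 900000.0
print(ev_two_box(0.9))  # 101000.0
```

The point of the sketch: two-boxing only wins if Omega is barely better than a coin flip, which is why the problem puts decision theories under such strain.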