Thu Sep 18 2025

If anyone builds it

This is my short review of Eliezer Yudkowsky and Nate Soares’ “If Anyone Builds It, Everyone Dies”, crossposted from my Goodreads account. TL;DR: you should read the book.

For the longest time, my pinned tweet read “Most things that cause me anxiety are coordination problems”. That tweet was about many things, but one of them was the idea behind this book: that superintelligence is dangerous, and we need to figure out how to mitigate its risks (challenge level: impossible), or maybe not build it at all.

Superintelligence is a coordination problem - it’s a massive carrot at the end of a stick, and someone will end up going for it. And if someone goes for it, it “had better be the good guys”. And because it had better be the good guys, they’d better race to it, because oh no, the baddies also have compute now, and “their” incentives are even worse than “our” incentives. You get where I’m going with this. And so all the alignment and safety work that needs to go into making sure we can understand and control self-replicating intelligences smarter than us goes out the window.

That’s also the argument Eliezer and Nate make in this book. They say superintelligence is extinction-level dangerous to build. And they’ve been thinking about this problem for much longer than most, so even if you’re skeptical of the extinction-level claim (most people do agree that superintelligence is at least dangerous to humanity), it makes sense to set aside the incentives and the carrot at the end of the stick, and engage with the argument.

This is an important book to read, mostly because we need more people to think about this problem, at all levels of society: from the people building these systems, to the people regulating said systems, to the people using them. You should read it.