“So you want to save the world. As it turns out, the world cannot be saved by caped crusaders with great strength and the power of flight. No, the world must be saved by mathematicians, computer scientists, and philosophers.”

These are the words of Luke Muehlhauser, currently a research analyst for the EA-aligned Open Philanthropy Project, which has given tens of thousands of dollars to organizations engaged in longtermist projects and research.[26] Silly as they are, they capture, I believe, the grandiosity of longtermist thinking. Longtermists are the prophets and messiahs of the Precipice, out to “save the world” through…


Longtermism is one of the three main cause areas of the Effective Altruism (EA) movement.[19] The other two major cause areas are alleviating global poverty and eliminating factory farming. Oddly enough, there is a direct tension between longtermism, on the one hand, and these other two cause areas, on the other. In some cases, the tension is resolved by explicitly saying, as Beckstead does in his dissertation, that saving rich lives is “substantially more important” than saving poor lives, for the sake of the greater good over the extremely long term.

Others appear to be more tentative in their endorsement of…


There are other reasons for worrying about longtermism gaining more clout. For example, consider Tyler Cowen’s observation that utilitarianism seems to “support the transfer of resources from the poor to the rich … if we have a deep concern for the distant future.” The reason pertains to the features of this ethical theory that we discussed above. The Oxford philosopher Andreas Mogensen echoes this idea in a more recent paper published by the Global Priorities Institute. “It has been assumed,” Mogensen writes, “that utilitarianism concretely directs us to maximize welfare within a generation by transferring resources to people currently living…


The dangers of utilitarian modes of moral reasoning and the utopian promise of eternal life in paradise are well known. As Pinker writes,

utopian ideologies invite genocide for two reasons. One is that they set up a pernicious utilitarian calculus. In a utopia, everyone is happy forever, so its moral value is infinite. Most of us agree that it is ethically permissible to divert a runaway trolley that threatens to kill five people onto a side track where it would kill only one. But suppose it were a hundred million lives one could save by diverting the trolley, or a…


We begin with the following argument:

(p1) Since an existential catastrophe would prevent astronomical numbers of people from coming into existence, and

(p2) since we have an overriding, profound obligation to ensure that astronomical numbers of people come to exist, it follows that

(c) we have an overriding, profound obligation to avoid an existential catastrophe.

As Bostrom wrote in 2013, the potential size of the future implies “that the loss in expected value resulting from an existential catastrophe is so enormous that the objective of reducing existential risks should be a dominant consideration whenever we act out of an impersonal…


The question then is: How many future people could there be? In short, a lot. The first to crunch the numbers was Carl Sagan in a 1983 article published in Foreign Affairs. He calculated that if humanity remains on Earth and survives “over a typical time period for the biological evolution of a successful species,” which he specified as 10 million years, and if the human population remains stable at 4.6 billion (the number of people in 1983), then some 500 trillion people may yet come into existence. …
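Sagan’s figure is easy to sanity-check. He did not, as far as I know, state the average lifespan his calculation assumed, but the numbers reconcile if each person lives roughly 92 years (my assumption for illustration, not Sagan’s):

```python
population = 4.6e9      # stable world population (the 1983 figure Sagan used)
timespan_years = 10e6   # Sagan's "typical time period" for a successful species
lifespan_years = 92     # assumed average lifespan -- my assumption, not Sagan's

# Total future people = population carried through successive lifetimes
future_people = population * (timespan_years / lifespan_years)
print(f"{future_people:.1e}")  # 5.0e+14, i.e. ~500 trillion
```

The point of the back-of-the-envelope check is simply that the “500 trillion” figure follows from multiplying a stable population by the number of lifetimes that fit into 10 million years.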


On the website utilitarianism.net, Will MacAskill and Darius Meissner write that “advocates of utilitarianism have argued that the theory has attractive theoretical virtues such as simplicity.” But I find this misleading: utilitarianism is actually a complex bundle of many different ideas.[4] In this short chapter, I will do my best to outline the features and properties of utilitarianism that are most relevant to the critique below.

The first thing to note is that the question “What is right and wrong?” is different from the question “What is good and bad?” You may have heard the old saying that “whereas deontology…


According to Bostrom’s paper “A History of Transhumanist Thought,” first published in 2005, the term “transhumanism” was coined by Julian Huxley in his 1927 book Religion without Revelation. But this is not true. The first time that Huxley used the term was in a 1951 lecture, although it was employed in roughly its contemporary sense even earlier, in 1940, by a Canadian historian and philosopher. Nonetheless, in his 1957 book New Bottles for New Wine, building upon ideas expounded in 1927, Huxley clearly stated that “the human species can, if it wishes, transcend itself — not just sporadically, an individual here…


If you’re the type of person who follows public “intellectuals” like Sam Harris, browses popular media like The New Yorker and Vox, or hopes to do the most good in the world, you have very likely heard about longtermism.[1] It is one of the central ideas in Toby Ord’s popular new book The Precipice, published in 2020, and is closely linked to the concept of existential risk. Not only has the term become more visible to the public over the past few years — and longtermists have big plans for this trend to continue — but projects associated with…


The Case Against Longtermism

Phil Torres


SUMMARY:

In this mini-book, I argue that “longtermism” — an ideology closely associated with existential risks and championed by leading Effective Altruists like Toby Ord and Will MacAskill — is extremely dangerous. …

Phil Torres is an author and scholar of existential threats to humanity and civilization. www.xriskology.com. @xriskology
