Even with a bigger picture viewpoint, longtermism is short-sighted in a cosmological context
Considered one of the most dangerous views in contemporary thought, longtermism might sound ambitious only if you forget that humans are not the center of the world.
A while ago, the essay "Understanding 'longtermism': Why this suddenly influential philosophy is so toxic" brought up the concept of "longtermism," a philosophy that invites people to think in the long term, with a bigger picture, about the potential of humanity. The idea resonates not only in academia, through authors such as Nick Bostrom and researchers at the Future of Humanity Institute (FHI), but also in industry, with figures such as Elon Musk.
But what does longtermism actually mean? Years ago, when I met the writer Alexey Dodsworth, I learned about his philosophy thesis on the ethics and metaphysics of transhumanism. In it, he discusses how Thomas Hobbes considered death, or more specifically violent death, the greatest of evils (summum malum). Hans Jonas later updated this concept, proposing that the supreme evil lies not in death itself but in the extinction of a species.
When longtermism speaks of existential risks, it is not necessarily concerned with wars or even climate change. From its bigger-picture viewpoint, these are just small details that, in the long run, wouldn't make much difference. Although the world wars were terrible events that caused the deaths of millions of civilians, from a longtermist perspective they were not necessarily an existential risk to humanity, since, after all, we are still here.
Émile Torres, author of the essay mentioned above, is very didactic in explaining why longtermism is problematic. Torres himself flirted with the concept in the past, even publishing a book about long-term visions of the future, but now considers longtermism dangerous and controversial because of its utilitarianism. In other words, from a utilitarian perspective, if we want to safeguard the human race, we should be investing in the protection of richer countries, as they are supposedly at a more "advanced stage" of development and, therefore, they can…