What is longtermism? All about Elon Musk's life philosophy
Longtermism is a philosophical doctrine based on focusing our actions on safeguarding the humanity of the future. According to critics of this philosophy, that obsession with the future makes longtermists willing to sacrifice the lives of those living in the present.
Elon Musk and other techno-billionaires love this philosophy because it justifies their projects, which often amount to spending millions of dollars on ventures that maybe, just maybe, will prove useful in the distant future. (While, at the same time, said techno-billionaires "economize" on wages for their workers or on real-life improvements for the people of today.)
One of the fathers of longtermism is William MacAskill, born in Glasgow in 1987, who began by advocating "effective altruism" in his writings. He proposes living with just enough money (in an almost monastic way) and dedicating the rest to truly high-priority causes, such as helping others. But how do you choose those causes?
Image: Sam Deere, Wikimedia Commons
"Effective altruism" tries to apply economic calculations to charity: you have to invest, argues William MacAskill, in causes that with little investment give a lot of benefit. A kind of application of the rules of stock market capitalism to human solidarity.
But little by little, William MacAskill has shifted toward longtermism, which he describes as "...an ethical stance which gives priority to improving the long-term future—and proposes that we can make the future better in two ways."
Image: Ryan Cheng/Unsplash
As he writes in his book 'What We Owe the Future,' the two ways we can make the future better are: "by averting permanent catastrophes, thereby ensuring civilization’s survival; or by changing civilization's trajectory to make it better while it lasts...Broadly, ensuring survival increases the quantity of future life; trajectory changes increase its quality."
William MacAskill maintains that the concerns of the present should not be abandoned (completely), but other longtermist authors, according to some scholars, have settled into a kind of millenarianism with a couple of fundamental obsessions: artificial intelligence and biological weapons.
Image: Fallon Michael/Unsplash
Climate change is a danger, yes, but longtermists do not give it much importance. They believe science can solve the climate catastrophe.
Image: Roya Ann Miller/Unsplash
William MacAskill himself wrote in an article for the BBC: "Future technology such as advanced artificial intelligence (AI) could give bad political actors far greater ability to entrench their power."
MacAskill continued, "The 20th Century saw totalitarian regimes rise out of democracies, and if that happened again, the freedom and equality we have gained over the last century could be lost. In the worst case, the future might not be controlled by humans at all – we could be outcompeted by super-intelligent AI systems whose goals conflict with our own."
Another longtermist author, retweeted by Elon Musk, is Nick Bostrom, who has written that space exploration must be prioritized so we can escape a planet that, sooner or later, will become uninhabitable.
Image: Pete Linforth/Pixabay
Underlying longtermism is a tendency to treat the extinction of the human race as certain; in the end, the goal would be to preserve an optimal surviving community that could carry civilization forward.
Image: Nicoli Afina/Unsplash
Longtermism fits the vision of the new millionaires, enriched by cryptocurrencies and donors to longtermist organizations. For these 21st-century billionaires, the present day is unpleasant, and a future (virtual or real) must be invented to replace the miseries of this planet.
Image: Cointelegraph, Wikimedia Commons
One of the problems with longtermist authors is that they try to turn what is pure speculation about the future into mathematics.
Gideon Lewis-Kraus, in a lengthy article for The New Yorker written after a week with some hard-core longtermists, wrote: "Depending on the probabilities one attaches to this or that outcome, something like a .0001-per-cent reduction in over-all existential risk might be worth more than the effort to save a billion people today."
Applying logic while avoiding any sentimentality or morality is what leads longtermist authors like Nick Beckstead to say: "It now seems more plausible to me that saving a life in a rich country is substantially more important than saving a life in a poor country."
That is why there are those who think that longtermists are brilliant minds who, deep down, practice escapism. They talk about the future and the threat of artificial intelligence so as not to face the present.
Image: Diane Picchiottino/Unsplash
What is more, in the aforementioned New Yorker article, Gideon Lewis-Kraus ends up conveying the impression that longtermism is almost like a religious sect.
Image: Timothy Eberly/Unsplash
All that is missing is for Elon Musk, in his incessant search for notoriety, power, and posterity, to found the Church of Longtermism and establish himself as its prophet... Stranger things have happened.