Long-termism: An Ethical Trojan Horse

Recently the philosopher William MacAskill, in his book What We Owe the Future, has popularized the idea that the fate of humanity should be our top moral priority. His core proposition is that today's 8 billion humans are vastly outweighed in importance by the hundreds of billions of humans who could live in future generations, provided we avoid wiping out humanity in the near term. MacAskill's argument goes by the slogan "longtermism" (often written "long-termism"), and it has already drawn sharp criticism. The columnist Christine Emba, for example, has written in The Washington Post: "It's compelling at first blush, but as a value system, its practical implications are worrisome." In practice, she explains, it implies treating "preventing existential threats to humanity as the most valuable philanthropic cause," which means investing far more in addressing risks to humanity's very long-term existence. As Emba notes, this can seem impossible to disagree with.