Transhumanist philosopher Nick Bostrom argues that there is about a 50% chance that humankind will be annihilated this century.
Bostrom, who is currently working with Milan Ćirković on a book about existential risks, has given the topic of human extinction considerable attention, most notably in his seminal 2002 paper, "Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards."
Writing in Global Agenda magazine, Bostrom notes that self-inflicted existential risks have appeared only recently in human history, the first being the advent of the atomic bomb. Unfortunately, writes Bostrom, it is merely the first of many new threats on the horizon. He lists such potential problems as deliberately engineered pathogens, particle accelerator mishaps, molecular nanotechnology run amok, and runaway superintelligence.
Bostrom also notes possible sociological catastrophes, such as the rise of a repressive totalitarian regime or a misguided form of transhumanism that could "lead to stagnation and thorough debasement of human life."
While pessimistic about our chances, Bostrom is not defeatist, and he suggests that serious foresight and energy be put into the problem. "The magnitude of existential risks is not a fixed quantity – it becomes larger or smaller depending on human action," he writes. Deliberate steps can be taken to reduce many existential risks, he says, pointing to such endeavors as the mapping of potentially dangerous asteroids. Moreover, Bostrom argues, some of the studies and countermeasures that would reduce existential risk would also help mitigate lesser hazards.
Ultimately, says Bostrom, the challenge is neither to ignore the risks nor to indulge in gloomy despondency, but to "seek understanding and to take the most cost-effective steps to make the world safer."
Tags: existential risks, catastrophes, foresight, nick bostrom, futurism, disaster prevention.