Paper Analyzes Human Extinction Scenarios

from the broad-scale-thinking dept.
Nick Bostrom writes "This is a beefed-up version of the presentation I gave in a SIG meeting at the recent Foresight gathering. Comments and suggestions would be welcome. Maybe it can develop into an FI white paper?

The aim is to get a clearer view of the threat picture of what I call "existential risks" – ways in which humankind could go extinct or have its potential permanently destroyed. The paper provides a classification of these risks and argues that they share a cluster of features that make ordinary risk management unworkable. It also argues that there is a substantial probability that we will fall victim to one of these risks. The point, of course, is not to wallow in gloom and doom, but to understand where the pitfalls are so that we can devise better strategies for avoiding them. The paper discusses several implications for policy and ethics, and gives particular attention to the potential for destructive uses of nanotechnology.

The text is available in two formats: on the web and as an MS Word document. (Footnotes and formatting are nicer in the M$ version.)"
