# Nick Bostrom


Nick Bostrom (born Niklas Boström on 10 March 1973[1]) is a Swedish philosopher at St. Cross College, University of Oxford, known for his work on existential risk, the anthropic principle, human enhancement ethics, the reversal test, and consequentialism. He holds a PhD from the London School of Economics (2000). He is the founding director of both the Future of Humanity Institute and the Oxford Martin Programme on the Impacts of Future Technology, part of the Oxford Martin School at the University of Oxford.[2]

He is the author of over 200 publications,[3] including Superintelligence: Paths, Dangers, Strategies and Anthropic Bias. He has been awarded the Eugene R. Gannon Award and has been listed in Foreign Policy's Top 100 Global Thinkers list.

## Early life and education

Bostrom was born in 1973 in Helsingborg, Sweden.[4] He pursued postgraduate studies in theoretical physics and philosophy at Stockholm University, and in computational neuroscience at King's College London. He received his PhD from the London School of Economics in 2000. He held a teaching position at Yale University (2000-2002) and was a British Academy Postdoctoral Fellow at the University of Oxford (2002-2005).[5]

## Philosophy

### Existential risk

An important strand of Bostrom's research concerns the future of humanity and long-term outcomes.[6][7] He introduced the concept of an existential risk, which he defines as one in which an "adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential." In the 2008 volume Global Catastrophic Risks, editors Bostrom and Ćirković characterize the relation between existential risk and the broader class of global catastrophic risks, and link existential risk to observer selection effects[8] and the Fermi paradox.[9] In a 2013 paper in the journal Global Policy, Bostrom offers a taxonomy of existential risk and proposes a reconceptualization of sustainability in dynamic terms, as a developmental trajectory that minimizes existential risk.[10]

Bostrom has argued that, from a consequentialist perspective, even small reductions in the cumulative amount of existential risk that humanity will face are extremely valuable, to the point where the traditional utilitarian imperative to maximize expected utility can be simplified to the Maxipok principle: maximize the probability of an OK outcome (where an OK outcome is any outcome that avoids existential catastrophe).[11][12]
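The arithmetic behind this claim can be sketched with purely illustrative numbers; the potential-value and risk figures below are assumptions chosen for the sketch, not estimates from Bostrom. The point is that when the value of an intact future is astronomically large, even a one-in-a-million reduction in existential risk carries enormous expected value.

```python
# Illustrative sketch of the expected-value reasoning behind Maxipok.
# All numbers below are hypothetical placeholders, not figures from Bostrom.

POTENTIAL_VALUE = 1e35   # assumed value of the future if existential catastrophe is avoided
P_CATASTROPHE = 0.20     # assumed baseline probability of existential catastrophe

def expected_value(p_catastrophe: float, potential_value: float = POTENTIAL_VALUE) -> float:
    """Expected value of the future given a probability of existential catastrophe."""
    return (1 - p_catastrophe) * potential_value

baseline = expected_value(P_CATASTROPHE)
# A reduction of one part in a million in catastrophe risk...
improved = expected_value(P_CATASTROPHE - 1e-6)

# ...still yields an enormous gain in expected value (1e-6 * 1e35 = 1e29 units),
# which is why the utilitarian imperative reduces to maximizing the
# probability of an OK outcome.
print(f"Gain from a one-in-a-million risk reduction: {improved - baseline:.3e}")
```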

He has suggested that technology policy aimed at reducing existential risk should seek to influence the order in which various technological capabilities are attained, proposing the Principle of Differential Technological Development, which states that we ought to retard the development of dangerous technologies, particularly ones that raise the level of existential risk, and accelerate the development of beneficial technologies, particularly those that protect against the existential risks posed by nature or by other technologies.

### Anthropic reasoning

Bostrom has published numerous articles on anthropic reasoning, as well as the book Anthropic Bias: Observation Selection Effects in Science and Philosophy. In the book, he criticizes previous formulations of the anthropic principle, including those of Brandon Carter, John Leslie, John Barrow, and Frank Tipler.[13]

Bostrom showed that problems in many areas of inquiry (including cosmology, philosophy, evolutionary theory, game theory, and quantum physics) involve a common set of issues related to the handling of indexical information, and argued that a theory of anthropics is needed to deal with them. He introduced the Self-Sampling Assumption (SSA) and the Self-Indication Assumption (SIA) and showed how they lead to different conclusions in a number of cases. He pointed out that each is affected by paradoxes or counterintuitive implications in certain thought experiments (the SSA in, for example, the Doomsday argument; the SIA in the Presumptuous Philosopher thought experiment). He suggested that a way forward may involve extending SSA into the Strong Self-Sampling Assumption (SSSA), which replaces "observers" in the SSA definition with "observer-moments". This could allow the reference class to be relativized, and he derived an expression for this in the "observation equation".
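A toy calculation illustrates how SSA and SIA diverge. The two-world setup, priors, and observer counts below are illustrative assumptions, not examples taken from Anthropic Bias: an observer who learns her birth rank updates toward the smaller world under SSA (a Doomsday-style shift), while SIA's weighting by observer count cancels that shift.

```python
# Toy comparison of the Self-Sampling Assumption (SSA) and the
# Self-Indication Assumption (SIA). All numbers are illustrative assumptions.

# Two hypotheses about the total number of observers that will ever exist,
# with equal prior probability.
hypotheses = {"small world": 100, "large world": 100_000}
prior = {h: 0.5 for h in hypotheses}

birth_rank = 50  # assumed datum: you learn you are the 50th observer

def normalize(weights):
    total = sum(weights.values())
    return {h: w / total for h, w in weights.items()}

# SSA: treat yourself as a random sample from the observers in each world,
# so P(my rank = r | N observers) = 1/N if r <= N, else 0.
ssa_post = normalize({
    h: prior[h] * (1 / n if birth_rank <= n else 0.0)
    for h, n in hypotheses.items()
})

# SIA: additionally weight each hypothesis by how many observers it contains,
# which cancels the 1/N factor whenever the rank is possible in both worlds.
sia_post = normalize({
    h: prior[h] * n * (1 / n if birth_rank <= n else 0.0)
    for h, n in hypotheses.items()
})

print("SSA posterior:", ssa_post)  # strongly favours the small world
print("SIA posterior:", sia_post)  # restores the 50/50 prior
```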

In later work, he has described the phenomenon of anthropic shadow, an observation selection effect that prevents observers from observing certain kinds of catastrophes in their recent geological and evolutionary past.[14] Catastrophe types that lie in the anthropic shadow are apt to be underestimated unless statistical corrections are made.

### Ethics of human enhancement

Bostrom is favorable towards "human enhancement", or "self-improvement and human perfectibility through the ethical application of science",[15][16] as well as a critic of bio-conservative views.[17] He has proposed the reversal test for reducing status quo bias in bioethical discussions of human enhancement.[18]

In 1998, Bostrom co-founded (with David Pearce) the World Transhumanist Association[15] (which has since changed its name to Humanity+). In 2004, he co-founded (with James Hughes) the Institute for Ethics and Emerging Technologies. In 2005 he was appointed Director of the newly created Future of Humanity Institute in Oxford. Bostrom is the 2009 recipient of the Eugene R. Gannon Award for the Continued Pursuit of Human Advancement[19][20] and was named in Foreign Policy's 2009 list of top global thinkers "for accepting no limits on human potential."[21]

### Simulation argument

Bostrom argues that at least one of the following statements is very likely to be true:

1. The fraction of human-level civilizations that reach a posthuman stage is very close to zero;
2. The fraction of posthuman civilizations that are interested in running ancestor-simulations is very close to zero;
3. The fraction of all people with our kind of experiences that are living in a simulation is very close to one.

To estimate the probability of at least one of those propositions holding, he offers the following equation:[22]

$f_\textrm{sim} = \frac{f_\textrm{p}NH} {(f_\textrm{p}NH)+H}$

where:

- $f_\textrm{p}$ is the fraction of all human civilizations that will reach a technological capability to program reality simulators.
- $N$ is the average number of ancestor-simulations run by the civilizations mentioned by $f_\textrm{p}$.
- $H$ is the average number of individuals who have lived in a civilization before it was able to perform reality simulation.
- $f_\textrm{sim}$ is the fraction of all humans who live in virtual realities.

$N$ can be calculated by multiplying the fraction of those civilizations that are interested in running such simulations ($f_\textrm{1}$) by the average number of ancestor-simulations run by each such civilization ($N_\textrm{1}$):

$N = f_\textrm{1}N_\textrm{1}$

Thus the formula becomes:

$f_\textrm{sim} = \frac{f_\textrm{p}f_\textrm{1}N_\textrm{1}} {(f_\textrm{p}f_\textrm{1}N_\textrm{1})+1}$

Because the computing power available to a posthuman civilization would make $N_\textrm{1}$ extremely large, at least one of the following three approximations must hold:

- $f_\textrm{p} \approx 0$
- $f_\textrm{1} \approx 0$
- $f_\textrm{sim} \approx 1$
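A brief numerical sketch shows how the formula yields this trilemma; the parameter values below are arbitrary assumptions chosen only to illustrate the limiting behaviour when $N_\textrm{1}$ is astronomically large.

```python
# Numerical sketch of the simulation-argument formula given above.
# The parameter values are arbitrary illustrations, not estimates from Bostrom.

def f_sim(f_p: float, f_1: float, n_1: float) -> float:
    """Fraction of observers with human-type experiences who are simulated.

    f_p : fraction of civilizations that reach simulation-capable technology
    f_1 : fraction of those civilizations that run ancestor-simulations
    n_1 : average number of ancestor-simulations run by an interested civilization
    """
    x = f_p * f_1 * n_1
    return x / (x + 1)

# If neither f_p nor f_1 is close to zero, an astronomically large n_1
# drives f_sim toward 1 (the third horn of the trilemma).
print(f_sim(f_p=0.1, f_1=0.1, n_1=1e12))    # approximately 1 - 1e-10

# If almost no civilization reaches the posthuman stage, f_sim stays near 0.
print(f_sim(f_p=1e-15, f_1=0.1, n_1=1e12))  # approximately 1e-4
```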

## Television

Bostrom appeared in a 2003 episode of Horizon. He has since appeared in two episodes of Closer to Truth, hosted by Robert Lawrence Kuhn, once to discuss the universe and once to discuss his agnosticism.

## References

1. nickbostrom.com
2. http://www.oxfordmartin.ox.ac.uk/people/22
3. https://blog.oup.com/2014/04/is-transcendence-possible-the-science-behind-the-film/
5. http://www.nickbostrom.com/cv.pdf
6. Bostrom, Nick (March 2002). "Existential Risks". Journal of Evolution and Technology 9.
7. http://aeon.co/magazine/world-views/ross-andersen-human-extinction/
8. Tegmark, Max; Bostrom, Nick (2005). "Astrophysics: is a doomsday catastrophe likely?" (PDF). Nature 438 (7069): 754. doi:10.1038/438754a.
9. Bostrom, Nick (May–June 2008). "Where are they? Why I Hope the Search for Extraterrestrial Life Finds Nothing" (PDF). MIT Technology Review: 72–77.
10. http://www.existential-risk.org/concept.pdf
11. http://www.nickbostrom.com/astronomical/waste.html
12. http://www.nickbostrom.com/existential/risks.html
13. Bostrom, Nick (2002). Anthropic Bias: Observation Selection Effects in Science and Philosophy. New York: Routledge. pp. 44–58. ISBN 0-415-93858-9. Retrieved 22 July 2014.