Nick Bostrom

Born: Niklas Boström, 10 March 1973 (age 41)
Occupation: Philosopher
Employer: University of Oxford
Known for: Existential risk and the anthropic principle
Movement: Transhumanism

Nick Bostrom (born Niklas Boström on 10 March 1973[1]) is a Swedish philosopher at St. Cross College, University of Oxford, known for his work on existential risk and the anthropic principle. He holds a PhD from the London School of Economics (2000). He is currently the director of both the Future of Humanity Institute and the Oxford Martin Programme on the Impacts of Future Technology, part of the Oxford Martin School at Oxford University.[2]

He is the author of over 200 publications,[3] including Superintelligence: Paths, Dangers, Strategies and Anthropic Bias. He has been awarded the Eugene R. Gannon Award and has been listed in Foreign Policy's Top 100 Global Thinkers list.

In addition to his writing for academic and popular press, Bostrom makes frequent media appearances in which he talks about transhumanism-related topics such as artificial intelligence, superintelligence, mind uploading, cryonics, nanotechnology, human enhancement, and the simulation argument.

Early life and education

Bostrom was born in 1973 in Helsingborg, Sweden.[4]

Philosophy

Ethics of human enhancement

Bostrom is favourable towards "human enhancement", or "self-improvement and human perfectibility through the ethical application of science",[5][6] as well as a critic of bio-conservative views.[7] He has proposed the reversal test for reducing status quo bias in bioethical discussions of human enhancement.[8]

In 1998, Bostrom co-founded (with David Pearce) the World Transhumanist Association[5] (which has since changed its name to Humanity+). In 2004, he co-founded (with James Hughes) the Institute for Ethics and Emerging Technologies. In 2005 he was appointed Director of the newly created Future of Humanity Institute in Oxford. Bostrom is the 2009 recipient of the Eugene R. Gannon Award for the Continued Pursuit of Human Advancement[9][10] and was named in Foreign Policy's 2009 list of top global thinkers "for accepting no limits on human potential."[11]

Existential risk

Bostrom has addressed the philosophical question of humanity's long-term survival.[12][13] He defines an existential risk as one in which an "adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential." In the 2008 volume "Global Catastrophic Risks", editors Bostrom and Milan Ćirković offer a detailed taxonomy of existential risk, and various papers link existential risk to observer selection effects[14] and the Fermi paradox.[15]

Simulation argument

Bostrom argues that at least one of the following statements is very likely to be true:

1. Human civilization is unlikely to reach a level of technological maturity capable of producing simulated realities, or such simulations are physically impossible.
2. A civilization that does reach this level of technological maturity will probably not run a significant number of simulated realities, for any of several reasons, such as the diversion of computational power to other tasks or ethical objections to holding conscious entities captive in simulations.
3. Any entities with our general set of experiences are almost certainly living in a simulation.

To quantify that tripartite disjunction, he offers the following equation:[16]

$f_\textrm{sim} = \frac{f_\textrm{p}NH} {(f_\textrm{p}NH)+H}$

where:

$f_\textrm{p}$ is the fraction of all human civilizations that will reach a technological capability to program reality simulators.
$N$ is the average number of ancestor-simulations run by the civilizations mentioned by $f_\textrm{p}$.
$H$ is the average number of individuals who have lived in a civilization before it was able to perform reality simulation.
$f_\textrm{sim}$ is the fraction of all humans who live in virtual realities.

N can be calculated by multiplying the fraction of technologically capable civilizations interested in running such simulations ($f_\textrm{1}$) by the average number of simulations run by each such civilization ($N_\textrm{1}$):

$N = f_\textrm{1} N_\textrm{1}$

Substituting for $N$ and cancelling the common factor $H$, the formula becomes:

$f_\textrm{sim} = \frac{f_\textrm{p}f_\textrm{1}N_\textrm{1}} {(f_\textrm{p}f_\textrm{1}N_\textrm{1})+1}$

Because the computing power available to a post-human civilization would make $N_\textrm{1}$ an extremely large value, at least one of the following three approximations must hold:

$f_\textrm{p}$ ≈ 0
$f_\textrm{1}$ ≈ 0
$f_\textrm{sim}$ ≈ 1
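Though not part of the article, the simplified formula lends itself to a quick numerical check. The sketch below (function name hypothetical) evaluates $f_\textrm{sim} = f_\textrm{p}f_\textrm{1}N_\textrm{1} / (f_\textrm{p}f_\textrm{1}N_\textrm{1} + 1)$ and shows how an astronomically large $N_\textrm{1}$ pushes $f_\textrm{sim}$ toward 1 unless $f_\textrm{p}$ or $f_\textrm{1}$ is negligible:

```python
def fraction_simulated(f_p: float, f_1: float, n_1: float) -> float:
    """Evaluate f_sim = (f_p * f_1 * N_1) / (f_p * f_1 * N_1 + 1).

    f_p: fraction of civilizations that become able to run simulations
    f_1: fraction of those civilizations interested in running them
    n_1: average number of ancestor-simulations each interested one runs
    """
    x = f_p * f_1 * n_1
    return x / (x + 1)

# If either fraction is ~0, f_sim is ~0 no matter how large N_1 is.
print(fraction_simulated(0.0, 0.5, 1e12))  # f_p ≈ 0 forces f_sim = 0
# With non-negligible fractions and a huge N_1, f_sim approaches 1.
print(fraction_simulated(0.5, 0.5, 1e12))
```

The illustrative parameter values here are arbitrary; the point is only that the three "≈" cases above exhaust the possibilities once $N_\textrm{1}$ is assumed to be very large.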

Anthropic reasoning

Bostrom has published numerous articles on anthropic reasoning, as well as the book Anthropic Bias: Observation Selection Effects in Science and Philosophy. In the book, he criticizes previous formulations of the anthropic principle, including those of Brandon Carter, John Leslie, John Barrow, and Frank Tipler.[17]

Television

Bostrom appeared in a 2003 episode of Horizon. More recently, he has appeared in two episodes of Closer to Truth, hosted by Robert Lawrence Kuhn, once to discuss the universe and once to discuss his agnosticism.

References

1. ^ nickbostrom.com
2. ^ http://www.oxfordmartin.ox.ac.uk/people/22
3. ^ https://blog.oup.com/2014/04/is-transcendence-possible-the-science-behind-the-film/
4. ^ "Nick Bostrom's Home Page". Retrieved 22 July 2014.
5. ^ a b
6. ^ — (2003). "Human Genetic Enhancements: A Transhumanist Perspective" (PDF). Journal of Value Inquiry 37 (4): 493–506. doi:10.1023/B:INQU.0000019037.67783.d5.
7. ^ — (2005). "In Defence of Posthuman Dignity". Bioethics 19 (3): 202–214. doi:10.1111/j.1467-8519.2005.00437.x. PMID 16167401.
8. ^ —; Ord, Toby (2006). "The reversal test: eliminating status quo bias in applied ethics" (PDF). Ethics 116 (4): 656–679. doi:10.1086/505233.
9. ^ http://gannonaward.org/The_Gannon_Award/The_Gannon_Group.html
10. ^ http://www.fhi.ox.ac.uk/archive/2009/eugene_r._gannon_award_for_the_continued_pursuit_of_human_advancement
11. ^ "73. Nick Bostrom". The FP Top 100 Global Thinkers. Foreign Policy. December 2009.
12. ^ — (March 2002). "Existential Risks". Journal of Evolution and Technology 9.
13. ^ http://aeon.co/magazine/world-views/ross-andersen-human-extinction/
14. ^ Tegmark, Max; Bostrom, Nick (2005). "Astrophysics: is a doomsday catastrophe likely?" (PDF). Nature 438 (7069): 754. doi:10.1038/438754a.
15. ^ — (May–June 2008). "Where are they? Why I Hope the Search for Extraterrestrial Life Finds Nothing" (PDF). MIT Technology Review: 72–77.
16. ^ — (19 January 2010). "Are You Living in a Computer Simulation?".
17. ^ Bostrom, Nick (2002). Anthropic Bias: Observation Selection Effects in Science and Philosophy. New York: Routledge. pp. 44–58. ISBN 0-415-93858-9. Retrieved 22 July 2014.