Nick Bostrom (born Niklas Boström on 10 March 1973) is a Swedish philosopher at St. Cross College, University of Oxford, known for his work on existential risk and the anthropic principle. He holds a PhD from the London School of Economics (2000). He is currently the director of both the Future of Humanity Institute and the Programme on the Impacts of Future Technology, part of the Oxford Martin School at the University of Oxford.
He is the author or editor of some 200 publications, including Anthropic Bias (Routledge, 2002), Global Catastrophic Risks (ed., OUP, 2008), and Human Enhancement (ed., OUP, 2009). He has been awarded the Eugene R. Gannon Award and has been listed in Foreign Policy's Top 100 Global Thinkers list. His work has been translated into more than 20 languages, and there have been some 100 translations or reprints of his works.
In addition to writing for the academic and popular press, Bostrom makes frequent media appearances in which he discusses transhumanism-related topics such as cloning, artificial intelligence, superintelligence, mind uploading, cryonics, nanotechnology, and the simulation argument.
Ethics of human enhancement
Bostrom is favourable towards "human enhancement", or "self-improvement and human perfectibility through the ethical application of science", as well as a critic of bio-conservative views. He has proposed the reversal test for reducing status quo bias in bioethical discussions of human enhancement.
In 1998, Bostrom co-founded (with David Pearce) the World Transhumanist Association (which has since changed its name to Humanity+). In 2004, he co-founded (with James Hughes) the Institute for Ethics and Emerging Technologies. In 2005 he was appointed Director of the newly created Future of Humanity Institute in Oxford. Bostrom is the 2009 recipient of the Eugene R. Gannon Award for the Continued Pursuit of Human Advancement and was named in Foreign Policy's 2009 list of top global thinkers "for accepting no limits on human potential."
Existential risk
Bostrom has addressed the philosophical question of humanity's long-term survival. He defines an existential risk as one in which an "adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential." In the 2008 volume Global Catastrophic Risks, editors Bostrom and Ćirković offer a detailed taxonomy of existential risk, and various papers in the volume link existential risk to observer selection effects and the Fermi paradox.
Simulation argument
Bostrom argues that at least one of the following statements is very likely to be true:
- Human civilization is unlikely to reach a level of technological maturity capable of producing simulated realities, or such simulations are physically impossible.
- A comparable civilization that reaches the aforementioned technological maturity will likely not produce a significant number of simulated realities, for any of a number of reasons, such as the diversion of computational processing power to other tasks or ethical objections to holding entities captive in simulated realities.
- Any entities with our general set of experiences are almost certainly living in a simulation.
To quantify that tripartite disjunction, he offers the following equation:

    f_sim = (f_p · N · H) / ((f_p · N · H) + H)

where:
- f_p is the fraction of all human civilizations that will reach a technological capability to program reality simulators.
- N is the average number of ancestor-simulations run by the civilizations mentioned by f_p.
- H is the average number of individuals who have lived in a civilization before it was able to perform reality simulation.
- f_sim is the fraction of all humans who live in virtual realities.

N can be calculated by multiplying the fraction of civilizations interested in performing such simulations (f_I) by the average number of simulations run by such civilizations (N_I):

    N = f_I × N_I

Thus the formula becomes:

    f_sim = (f_p · f_I · N_I) / ((f_p · f_I · N_I) + 1)

Because post-human computing power would make N_I such a large value, at least one of the following three approximations will be true:
- f_p ≈ 0
- f_I ≈ 0
- f_sim ≈ 1
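The limiting behaviour behind these three approximations can be illustrated numerically. The following sketch (the function name and the sample values are illustrative, not from Bostrom) computes f_sim from the simplified form of the equation and shows that when N_I is enormous, f_sim is driven toward 1 unless f_p or f_I is effectively zero:

```python
def fraction_simulated(f_p, f_i, n_i):
    """Fraction of human-like experiences that are simulated,
    per the simplified simulation-argument equation:
        f_sim = (f_p * f_I * N_I) / ((f_p * f_I * N_I) + 1)

    f_p: fraction of civilizations reaching simulation-capable maturity
    f_i: fraction of mature civilizations interested in running simulations
    n_i: average number of ancestor-simulations run by an interested civilization
    """
    x = f_p * f_i * n_i
    return x / (x + 1)

# With vast post-human computing power (large n_i), f_sim is close to 1
# unless one of the fractions is effectively zero.
print(fraction_simulated(0.1, 0.1, 1e12))    # close to 1
print(fraction_simulated(1e-15, 0.1, 1e12))  # close to 0 (maturity almost never reached)
print(fraction_simulated(0.1, 0.0, 1e12))    # exactly 0 (no civilization is interested)
```

The denominator's "+ 1" is what keeps f_sim a fraction: it represents the one unsimulated history per civilization alongside its N simulated ones.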
Selected works
- Anthropic Bias: Observation Selection Effects in Science and Philosophy, ISBN 0-415-93858-9
- Global Catastrophic Risks, edited by Nick Bostrom, ISBN 978-0-19-857050-9
- Human Enhancement, edited by Julian Savulescu and Nick Bostrom, ISBN 0-19-929972-2
References
- Sutherland, John (9 May 2006). "The ideas interview: Nick Bostrom; John Sutherland meets a transhumanist who wrestles with the ethics of technologically enhanced human beings". The Guardian.
- Bostrom, Nick (2003). "Human Genetic Enhancements: A Transhumanist Perspective" (PDF). Journal of Value Inquiry 37 (4): 493–506. doi:10.1023/B:INQU.0000019037.67783.d5.
- Bostrom, Nick (2005). "In Defence of Posthuman Dignity". Bioethics 19 (3): 202–214. doi:10.1111/j.1467-8519.2005.00437.x. PMID 16167401.
- Bostrom, Nick; Ord, Toby (2006). "The reversal test: eliminating status quo bias in applied ethics" (PDF). Ethics 116 (4): 656–679.
- "73. Nick Bostrom". The FP Top 100 Global Thinkers. Foreign Policy. December 2009.
- Bostrom, Nick (March 2002). "Existential Risks". Journal of Evolution and Technology 9.
- Tegmark, Max; Bostrom, Nick (2005). "Astrophysics: is a doomsday catastrophe likely?" (PDF). Nature 438 (7069): 754. doi:10.1038/438754a.
- Bostrom, Nick (May/June 2008). "Where are they? Why I Hope the Search for Extraterrestrial Life Finds Nothing" (PDF). MIT Technology Review: 72–77.
- Bostrom, Nick (19 January 2010). "Are You Living in a Computer Simulation?".
External links
- Nick Bostrom's homepage.
- Bostrom's Anthropic Principle page, containing information about the anthropic principle and the Doomsday argument.
- Online copy of book, "Anthropic Bias: Observation Selection Effects in Science and Philosophy" (HTML, PDF)
- Bostrom's Simulation Argument page.
- Bostrom's Existential Risk page.
- Oxford Future of Humanity Institute
- The Guardian interviews Bostrom about the World Transhumanist Association
- Interview on transhumanism
- TED Talks: Nick Bostrom on our biggest problems at TED Global in 2005