Eliezer Yudkowsky
| Eliezer Yudkowsky | |
|---|---|
| Born | September 11, 1979 |
| Nationality | American |
| Citizenship | American |
| Known for | Seed AI, Friendly AI, Harry Potter and the Methods of Rationality |
| Scientific career | |
| Fields | Artificial intelligence |
| Institutions | Singularity Institute |
Eliezer Shlomo Yudkowsky (born September 11, 1979[1]) is an American artificial intelligence researcher concerned with the Singularity and an advocate of Friendly artificial intelligence,[2] living in Redwood City, California.[3]
Biography
Yudkowsky did not attend high school and is an autodidact with no formal education in artificial intelligence.[4] He claims not to operate within the academic system.
He is a co-founder and research fellow of the Singularity Institute (SIAI).[5] Yudkowsky is the author of the SIAI publications "Creating Friendly AI" (2001), "Levels of Organization in General Intelligence" (2002), "Coherent Extrapolated Volition" (2004), and "Timeless Decision Theory" (2010).[6]
Work
Yudkowsky's research focuses on artificial intelligence designs which enable self-understanding, self-modification, and recursive self-improvement (seed AI), and on artificial-intelligence architectures and decision theories for stably benevolent motivational structures (Friendly AI, and Coherent Extrapolated Volition in particular).[7] Apart from his research work, Yudkowsky has written explanations of various philosophical topics in non-academic language, particularly on rationality, such as "An Intuitive Explanation of Bayes' Theorem".
Publications
Yudkowsky has not authored any peer-reviewed papers. He has written several works of science fiction and other fiction. His Harry Potter fan fiction story Harry Potter and the Methods of Rationality illustrates topics in cognitive science and rationality,[8] and has been favorably reviewed by author David Brin[9] and FLOSS programmer Eric S. Raymond.[10] Methods of Rationality is one of the most popular stories on FanFiction.net, but is also controversial among Potter fans.[11]
He contributed two chapters to Oxford philosopher Nick Bostrom's and Milan Ćirković's edited volume Global Catastrophic Risks.[12]
Yudkowsky was, along with Robin Hanson, one of the principal contributors to the blog Overcoming Bias sponsored by the Future of Humanity Institute of Oxford University. In early 2009, he helped to found Less Wrong, a "community blog devoted to refining the art of human rationality".[13]
References
- ^ Autobiography
- ^ "Singularity Institute for Artificial Intelligence: Team". Singularity Institute for Artificial Intelligence. Retrieved 2009-07-16.
- ^ Eliezer Yudkowsky: About
- ^ "GDay World #238: Eliezer Yudkowsky". The Podcast Network. Retrieved 2009-07-26.
- ^ Kurzweil, Ray (2005). The Singularity Is Near. New York, US: Viking Penguin. p. 599. ISBN 0-670-03384-7.
- ^ "Eliezer Yudkowsky Profile". Accelerating Future.
- ^ Kurzweil, Ray (2005). The Singularity Is Near. New York, US: Viking Penguin. p. 420. ISBN 0-670-03384-7.
- ^ "Harry Potter and the Methods of Rationality isn’t primarily interested in teaching readers the “what” of science, even though it is liberally sprinkled with interesting facts about genetics, game theory, quantum mechanics, and psychology, among other things. Instead, as the title suggests, it’s about the “how” of science, conceived of not in the narrow sense of research in a laboratory, but in the broader sense of the process of figuring out how anything in the world works." Julia Galef: Teaching the Scientific Method, with Magic. 3quarksdaily.com, March 21, 2011, retrieved July 19, 2011
- ^ http://davidbrin.blogspot.com/2010/06/secret-of-college-life-plus.html
- ^ http://esr.ibiblio.org/?p=2100
- ^ "Methods of Rationality caused uproar in the fan fiction community, drawing both condemnations and praise on Harry Potter message boards like DarkLordPotter for its blasphemous—or brilliant—treatment of the canon...Methods of Rationality remains one of the most popular stories on FanFiction.net, with more than 13,000 reviews." "'Harry Potter' and the Key to Immortality", Daniel Snyder, The Atlantic
- ^ Bostrom, Nick; Ćirković, Milan M., eds. (2008). Global Catastrophic Risks. Oxford, UK: Oxford University Press. pp. 91–119, 308–345. ISBN 978-0-19-857050-9.
- ^ "Overcoming Bias: About". Overcoming Bias. Retrieved 2009-07-26.
Further reading
- Our Molecular Future: How Nanotechnology, Robotics, Genetics and Artificial Intelligence Will Transform Our World by Douglas Mulhall, 2002, p. 321.
- The Spike: How Our Lives Are Being Transformed By Rapidly Advancing Technologies by Damien Broderick, 2001, pp. 236, 265–272, 289, 321, 324, 326, 337–339, 345, 353, 370.
External links
- Personal web site
- Less Wrong - "A community blog devoted to refining the art of human rationality" founded by Yudkowsky.
- Biography page at KurzweilAI.net
- Biography page at the Singularity Institute
- Downloadable papers and bibliography
- Predicting The Future :: Eliezer Yudkowsky, NYTA Keynote Address - Feb 2003
- Harry Potter and the Methods of Rationality at Fanfiction.net