Eliezer Yudkowsky
Revision as of 20:05, 11 April 2014
| Eliezer Yudkowsky | |
|---|---|
| Born | September 11, 1979 |
| Nationality | American |
| Known for | Friendly AI,[citation needed] Harry Potter fan fiction |
| Scientific career | |
| Fields | Machine ethics |
| Institutions | Machine Intelligence Research Institute |
Eliezer Shlomo Yudkowsky (born September 11, 1979[citation needed]) is an American blogger, writer, and advocate for Friendly artificial intelligence.
Biography
Yudkowsky is a resident of Berkeley, California. Largely self-educated,[1]: 38 he co-founded the nonprofit Machine Intelligence Research Institute (formerly the Singularity Institute for Artificial Intelligence) in 2000 and continues to be employed there as a full-time Research Fellow.[2]: 599
Work
Yudkowsky's interests focus on artificial-intelligence theory for self-understanding, self-modification, and recursive self-improvement, and on AI architectures and decision theories for stable motivational structures (Friendly AI and Coherent Extrapolated Volition in particular).[2]: 420 His most recent work is on decision theory for problems of self-modification and Newcomblike problems.[clarification needed]
Yudkowsky was, along with Robin Hanson, one of the principal contributors to the blog Overcoming Bias,[3][non-primary source needed] sponsored by the Future of Humanity Institute of Oxford University. In early 2009,[citation needed] he helped to found LessWrong, a "community blog devoted to refining the art of human rationality".[1]: 37
Yudkowsky contributed two chapters to Oxford philosopher Nick Bostrom's and Milan Ćirković's edited volume Global Catastrophic Risks.[4]
Fan fiction
Yudkowsky has also written several works[citation needed] of science fiction and other fiction. His lengthy Harry Potter fan fiction story Harry Potter and the Methods of Rationality illustrates topics in cognitive science and rationality;[1]: 37 The New Yorker described it as "recast[ing] the original story in an attempt to explain Harry's wizardry through the scientific method".[5] It has been reviewed by authors David Brin[6] and Rachel Aaron,[7][8] by Robin Hanson,[9] by Aaron Swartz,[10] and by programmer Eric S. Raymond.[11]
References
- ^ a b c Miller, James. Singularity Rising.
- ^ a b Kurzweil, Ray (2005). The Singularity Is Near. New York, US: Viking Penguin. ISBN 0-670-03384-7.
- ^ "Overcoming Bias: About". Robin Hanson. Retrieved 2012-02-01.
- ^ Bostrom, Nick; Ćirković, Milan M., eds. (2008). Global Catastrophic Risks. Oxford, UK: Oxford University Press. pp. 91–119, 308–345. ISBN 978-0-19-857050-9.
- ^ "No Death, No Taxes: The libertarian futurism of a Silicon Valley billionaire". The New Yorker. p. 54.
- ^ David Brin (2010-06-21). "CONTRARY BRIN: A secret of college life... plus controversies and science!". Davidbrin.blogspot.com. Retrieved 2012-08-31.
- "'Harry Potter' and the Key to Immortality", Daniel Snyder, The Atlantic
- David Brin (2012-01-20). "CONTRARY BRIN: David Brin's List of "Greatest Science Fiction and Fantasy Tales"". Davidbrin.blogspot.com. Retrieved 2012-08-31.
- Science Fiction and Our Duty to the Past
- ^ Authors (2012-04-02). "Rachel Aaron interview (April 2012)". Fantasybookreview.co.uk. Retrieved 2012-08-31.
- ^ "Civilian Reader: An Interview with Rachel Aaron". Civilian-reader.blogspot.com. 2011-05-04. Retrieved 2012-08-31.
- ^ Hanson, Robin (2010-10-31). "Hyper-Rational Harry". Overcoming Bias. Retrieved 2012-08-31.
- ^ Swartz, Aaron. "The 2011 Review of Books (Aaron Swartz's Raw Thought)". archive.org. Retrieved 2013-04-10.
- ^ "Harry Potter and the Methods of Rationality". Esr.ibiblio.org. 2010-07-06. Retrieved 2012-08-31.
Publications
- Creating Friendly AI (2001)
- Levels of Organization in General Intelligence (2002)
- Coherent Extrapolated Volition (2004)
- Timeless Decision Theory (2010)
- Complex Value Systems are Required to Realize Valuable Futures (2011)
- Tiling Agents for Self-Modifying AI, and the Löbian Obstacle (2013)
- A Comparison of Decision Algorithms on Newcomblike Problems (2013)
Further reading
- Our Molecular Future: How Nanotechnology, Robotics, Genetics and Artificial Intelligence Will Transform Our World by Douglas Mulhall, 2002, p. 321.
- The Spike: How Our Lives Are Being Transformed By Rapidly Advancing Technologies by Damien Broderick, 2001, pp. 236, 265–272, 289, 321, 324, 326, 337–339, 345, 353, 370.
External links
- Personal web site
- Less Wrong - "A community blog devoted to refining the art of human rationality" co-founded by Yudkowsky.
- Harry Potter and the Methods of Rationality
- RationalWiki article about Yudkowsky