Eliezer Yudkowsky at the 2006 Stanford Singularity Summit
Born: September 11, 1979
Institutions: Machine Intelligence Research Institute
Known for: Friendly AI, Harry Potter fan fiction
Influences: Judea Pearl, Vernor Vinge, E. T. Jaynes, I. J. Good
Yudkowsky is a resident of Berkeley, California. Largely self-educated, he co-founded the nonprofit Machine Intelligence Research Institute (formerly the Singularity Institute for Artificial Intelligence) in 2000 and remains employed there as a full-time Research Fellow.
Yudkowsky's research focuses on artificial intelligence theory for self-understanding, self-modification, and recursive self-improvement, and on AI architectures and decision theories for stable motivational structures (Friendly AI and Coherent Extrapolated Volition in particular). His most recent work concerns decision theory for problems of self-modification and Newcomb-like problems.
Yudkowsky was, along with Robin Hanson, one of the principal contributors to the blog Overcoming Bias, sponsored by the Future of Humanity Institute of Oxford University. In early 2009 he helped found Less Wrong, a "community blog devoted to refining the art of human rationality".
Yudkowsky has also written several works of science fiction and other fiction. His lengthy Harry Potter fan fiction Harry Potter and the Methods of Rationality illustrates topics in cognitive science and rationality; The New Yorker described it as "recast[ing] the original story in an attempt to explain Harry's wizardry through the scientific method". It has been reviewed by authors David Brin, Rachel Aaron, Robin Hanson, and Aaron Swartz, and by programmer Eric S. Raymond.
- Singularity Rising, by James Miller
- Kurzweil, Ray (2005). The Singularity Is Near. New York, US: Viking Penguin. ISBN 0-670-03384-7.
- "Overcoming Bias: About". Robin Hanson. Retrieved 2012-02-01.
- Bostrom, Nick; Ćirković, Milan M., eds. (2008). Global Catastrophic Risks. Oxford, UK: Oxford University Press. pp. 91–119, 308–345. ISBN 978-0-19-857050-9.
- "No Death, No Taxes: The libertarian futurism of a Silicon Valley billionaire", p. 54
- David Brin (2010-06-21). "CONTRARY BRIN: A secret of college life... plus controversies and science!". Davidbrin.blogspot.com. Retrieved 2012-08-31.
- "'Harry Potter' and the Key to Immortality", Daniel Snyder, The Atlantic
- David Brin (2012-01-20). "CONTRARY BRIN: David Brin's List of "Greatest Science Fiction and Fantasy Tales"". Davidbrin.blogspot.com. Retrieved 2012-08-31.
- Science Fiction and Our Duty to the Past
- Authors (2012-04-02). "Rachel Aaron interview (April 2012)". Fantasybookreview.co.uk. Retrieved 2012-08-31.
- "Civilian Reader: An Interview with Rachel Aaron". Civilian-reader.blogspot.com. 2011-05-04. Retrieved 2012-08-31.
- Hanson, Robin (2010-10-31). "Hyper-Rational Harry". Overcoming Bias. Retrieved 2012-08-31.
- Swartz, Aaron. "The 2011 Review of Books (Aaron Swartz's Raw Thought)". archive.org. Retrieved 2013-04-10.
- "Harry Potter and the Methods of Rationality". Esr.ibiblio.org. 2010-07-06. Retrieved 2012-08-31.
- Creating Friendly AI (2001)
- Levels of Organization in General Intelligence (2002)
- Coherent Extrapolated Volition (2004)
- Timeless Decision Theory (2010)
- Complex Value Systems are Required to Realize Valuable Futures (2011)
- Tiling Agents for Self-Modifying AI, and the Löbian Obstacle (2013)
- A Comparison of Decision Algorithms on Newcomblike Problems (2013)
- Our Molecular Future: How Nanotechnology, Robotics, Genetics and Artificial Intelligence Will Transform Our World by Douglas Mulhall, 2002, p. 321.
- The Spike: How Our Lives Are Being Transformed By Rapidly Advancing Technologies by Damien Broderick, 2001, pp. 236, 265–272, 289, 321, 324, 326, 337–339, 345, 353, 370.
Wikiquote has a collection of quotations related to Eliezer Yudkowsky.