Eliezer Yudkowsky

From Wikipedia, the free encyclopedia
Eliezer Yudkowsky at the 2006 Stanford Singularity Summit
Born September 11, 1979 (age 36)
Nationality American
Spouse Brienne Yudkowsky[1] (m. 2013)[2]

Eliezer Shlomo Yudkowsky (born September 11, 1979) is an American writer, blogger, and advocate for friendly artificial intelligence.

Personal life

Yudkowsky is a resident of the San Francisco Bay Area.[1] He is largely self-educated.[3]:38 He supports cryonics.[4]


Work

Yudkowsky's interests focus on artificial intelligence theory for self-awareness, self-modification, and recursive self-improvement, and on artificial-intelligence architectures and decision theories for stable motivational structures (Friendly AI and Coherent Extrapolated Volition in particular).[5]:420 His most recent work is on decision theory for problems of self-modification and Newcomblike problems.

He co-founded the nonprofit Machine Intelligence Research Institute (formerly the Singularity Institute for Artificial Intelligence) in 2000 and remains employed there as a full-time Research Fellow.[5]:599 Yudkowsky was, along with Robin Hanson, one of the principal contributors to the blog Overcoming Bias,[6] sponsored by the Future of Humanity Institute of Oxford University. In February 2009, he helped to found LessWrong,[7] a "community blog devoted to refining the art of human rationality".[3]:37 LessWrong has been covered in depth in Business Insider,[8] core concepts from it have been referenced in columns in The Guardian,[9][10] and it has been mentioned briefly in articles related to the technological singularity and the work of the Machine Intelligence Research Institute.[11]

Yudkowsky contributed two chapters to Oxford philosopher Nick Bostrom's and Milan Ćirković's edited volume Global Catastrophic Risks.[12]


Yudkowsky has also written several works[13] of science fiction and other fiction. His Harry Potter fan fiction story Harry Potter and the Methods of Rationality illustrates topics in cognitive science and rationality.[3]:37[14][15][16][17][18][19] The New Yorker described it as recasting "the original story in an attempt to explain Harry’s wizardry through the scientific method."[20]


  1. ^ a b www.yudkowsky.net
  2. ^ "Married Brienne Yudkowsky". Facebook. Retrieved 9 July 2015. 
  3. ^ a b c Miller, James. Singularity Rising.
  4. ^ "You Only Live Twice". Less Wrong. Retrieved 2015-09-14. 
  5. ^ a b Kurzweil, Ray (2005). The Singularity Is Near. New York, US: Viking Penguin. ISBN 0-670-03384-7. 
  6. ^ "Overcoming Bias: About". Robin Hanson. Retrieved 2012-02-01. 
  7. ^ "Where did Less Wrong come from? (LessWrong FAQ)". Retrieved September 11, 2014. 
  8. ^ Miller, James (July 28, 2011). "You Can Learn How To Become More Rational". Business Insider. Retrieved March 25, 2014. 
  9. ^ Burkeman, Oliver (July 8, 2011). "This column will change your life: Feel the ugh and do it anyway. Can the psychological flinch mechanism be beaten?". The Guardian. Retrieved March 25, 2014. 
  10. ^ Burkeman, Oliver (March 9, 2012). "This column will change your life: asked a tricky question? Answer an easier one. We all do it, all the time. So how can we get rid of this eccentricity?". The Guardian. Retrieved March 25, 2014. 
  11. ^ Tiku, Natasha (July 25, 2012). "Faith, Hope, and Singularity: Entering the Matrix with New York's Futurist Set. It's the end of the world as we know it, and they feel fine.". BetaBeat. Retrieved March 25, 2014.
  12. ^ Bostrom, Nick; Ćirković, Milan M., eds. (2008). Global Catastrophic Risks. Oxford, UK: Oxford University Press. pp. 91–119, 308–345. ISBN 978-0-19-857050-9. 
  13. ^ Eliezer S. Yudkowsky. "Fiction". Yudkowsky. Retrieved 2015-09-14. 
  14. ^ Brin, David (2010-06-21). "CONTRARY BRIN: A secret of college life... plus controversies and science!". Davidbrin.blogspot.com. Retrieved 2012-08-31; Snyder, Daniel. "'Harry Potter' and the Key to Immortality". The Atlantic.
  15. ^ "Rachel Aaron interview (April 2012)". Fantasybookreview.co.uk. 2012-04-02. Retrieved 2012-08-31.
  16. ^ "Civilian Reader: An Interview with Rachel Aaron". Civilian-reader.blogspot.com. 2011-05-04. Retrieved 2012-08-31. 
  17. ^ Hanson, Robin (2010-10-31). "Hyper-Rational Harry". Overcoming Bias. Retrieved 2012-08-31. 
  18. ^ Swartz, Aaron. "The 2011 Review of Books (Aaron Swartz's Raw Thought)". archive.org. Archived from the original on 2013-03-16. Retrieved 2013-04-10. 
  19. ^ "Harry Potter and the Methods of Rationality". fanfiction.net. 2010-02-28. Retrieved 2014-12-29. 
  20. ^ "No Death, No Taxes: The libertarian futurism of a Silicon Valley billionaire". The New Yorker. p. 54.


Further reading

  • Our Molecular Future: How Nanotechnology, Robotics, Genetics and Artificial Intelligence Will Transform Our World by Douglas Mulhall, 2002, p. 321.
  • The Spike: How Our Lives Are Being Transformed By Rapidly Advancing Technologies by Damien Broderick, 2001, pp. 236, 265–272, 289, 321, 324, 326, 337–339, 345, 353, 370.

External links