Eliezer Yudkowsky

From Wikipedia, the free encyclopedia
Eliezer Yudkowsky
Eliezer Yudkowsky at the 2006 Stanford Singularity Summit
Born September 11, 1979
Nationality American

Eliezer Shlomo Yudkowsky (born September 11, 1979[citation needed]) is an American blogger, writer, and advocate for friendly artificial intelligence.

Biography

Yudkowsky is a resident of the San Francisco Bay Area[citation needed]. Largely self-educated,[1]:38 he co-founded the nonprofit Machine Intelligence Research Institute (formerly the Singularity Institute for Artificial Intelligence) in 2000 and continues to be employed there as a full-time Research Fellow.[2]:599

Work

Yudkowsky's interests focus on artificial intelligence theory for self-awareness, self-modification, and recursive self-improvement, and on AI architectures and decision theories for stable motivational structures (Friendly AI and Coherent Extrapolated Volition in particular).[2]:420 His most recent work is on decision theory for problems of self-modification and Newcomb-like problems.

Yudkowsky was, along with Robin Hanson, one of the principal contributors to the blog Overcoming Bias,[3] sponsored by the Future of Humanity Institute of Oxford University. In February 2009, he helped to found LessWrong,[4] a "community blog devoted to refining the art of human rationality".[1]:37 LessWrong has been covered in depth by Business Insider,[5] and core concepts from it have been referenced in columns in The Guardian.[6][7] LessWrong has also been mentioned briefly in articles related to the technological singularity and the work of the Machine Intelligence Research Institute (formerly the Singularity Institute),[8] as well as in articles about online monarchists and neo-reactionaries.[9]

Yudkowsky contributed two chapters to Global Catastrophic Risks, a volume edited by Oxford philosopher Nick Bostrom and Milan Ćirković.[10]

Fan fiction

Yudkowsky has also written several works[11] of science fiction and other fiction. His wide-ranging Harry Potter fan fiction story Harry Potter and the Methods of Rationality illustrates topics in cognitive science and rationality.[1]:37[12][13][14][15][16][17] The New Yorker described it as "recast[ing] the original story in an attempt to explain Harry's wizardry through the scientific method."[18]

References

  1. ^ a b c Miller, James (2012). Singularity Rising. BenBella Books.
  2. ^ a b Kurzweil, Ray (2005). The Singularity Is Near. New York, US: Viking Penguin. ISBN 0-670-03384-7. 
  3. ^ "Overcoming Bias: About". Robin Hanson. Retrieved 2012-02-01. 
  4. ^ "Where did Less Wrong come from? (LessWrong FAQ)". Retrieved September 11, 2014. 
  5. ^ Miller, James (July 28, 2011). "You Can Learn How To Become More Rational". Business Insider. Retrieved March 25, 2014. 
  6. ^ Burkeman, Oliver (July 8, 2011). "This column will change your life: Feel the ugh and do it anyway. Can the psychological flinch mechanism be beaten?". The Guardian. Retrieved March 25, 2014. 
  7. ^ Burkeman, Oliver (March 9, 2012). "This column will change your life: asked a tricky question? Answer an easier one. We all do it, all the time. So how can we get rid of this eccentricity?". The Guardian. Retrieved March 25, 2014. 
  8. ^ Tiku, Natasha (July 25, 2012). "Faith, Hope, and Singularity: Entering the Matrix with New York's Futurist Set. It's the end of the world as we know it, and they feel fine". BetaBeat. Retrieved March 25, 2014. 
  9. ^ Finley, Klint (November 22, 2013). "Geeks for Monarchy: The Rise of the Neoreactionaries". TechCrunch. Retrieved March 25, 2014. 
  10. ^ Bostrom, Nick; Ćirković, Milan M., eds. (2008). Global Catastrophic Risks. Oxford, UK: Oxford University Press. pp. 91–119, 308–345. ISBN 978-0-19-857050-9. 
  11. ^ [1]
  12. ^ Brin, David (2010-06-21). "CONTRARY BRIN: A secret of college life... plus controversies and science!". Davidbrin.blogspot.com. Retrieved 2012-08-31; Snyder, Daniel. "'Harry Potter' and the Key to Immortality". The Atlantic.
  13. ^ Authors (2012-04-02). "Rachel Aaron interview (April 2012)". Fantasybookreview.co.uk. Retrieved 2012-08-31. 
  14. ^ "Civilian Reader: An Interview with Rachel Aaron". Civilian-reader.blogspot.com. 2011-05-04. Retrieved 2012-08-31. 
  15. ^ Hanson, Robin (2010-10-31). "Hyper-Rational Harry". Overcoming Bias. Retrieved 2012-08-31. 
  16. ^ Swartz, Aaron. "The 2011 Review of Books (Aaron Swartz's Raw Thought)". archive.org. Retrieved 2013-04-10. 
  17. ^ "Harry Potter and the Methods of Rationality". Esr.ibiblio.org. 2010-07-06. Retrieved 2012-08-31. 
  18. ^ "No Death, No Taxes: The libertarian futurism of a Silicon Valley billionaire". The New Yorker. p. 54.

Publications

Further reading

  • Our Molecular Future: How Nanotechnology, Robotics, Genetics and Artificial Intelligence Will Transform Our World by Douglas Mulhall, 2002, p. 321.
  • The Spike: How Our Lives Are Being Transformed by Rapidly Advancing Technologies by Damien Broderick, 2001, pp. 236, 265–272, 289, 321, 324, 326, 337–339, 345, 353, 370.

External links