Eliezer Yudkowsky

From Wikipedia, the free encyclopedia

{{multiple issues|
{{primary sources|date=February 2014}}
{{self-published|date=February 2014}}
{{unreliable sources|date=February 2014}}
}}
'''Eliezer Shlomo Yudkowsky''' (born September 11, 1979{{citation needed|date=February 2014}}) is an American [[blogger]], writer, and advocate for [[Friendly artificial intelligence]].

==Biography==
Yudkowsky, a resident of [[Berkeley, California]], has no formal education in computer science or artificial intelligence.<ref>{{cite book|last=Miller|first=James|title=Singularity Rising|year=2012|publisher=BenBella Books|location=Texas|isbn=1936661659|page=35}}</ref> He co-founded the nonprofit [[Machine Intelligence Research Institute]] (formerly the Singularity Institute for Artificial Intelligence) in 2000 and continues to be employed there as a full-time Research Fellow.<ref name = "SiNnote">{{cite book|author=Kurzweil, Ray|title=The Singularity Is Near|publisher=Viking Penguin|location=New York, US|year=2005|isbn=0-670-03384-7|page=599|authorlink=Ray_Kurzweil}}</ref>

==Work==
Yudkowsky's interests focus on artificial intelligence theory for [[self-awareness|self-understanding]], self-modification, and [[Seed AI|recursive self-improvement]], and on artificial-intelligence architectures and [[decision theory|decision theories]] for stable motivational structures ([[Friendly AI]] and [[Coherent Extrapolated Volition]] in particular).<ref name = "SiN1">{{cite book|author=Kurzweil, Ray|title=The Singularity Is Near|publisher=Viking Penguin|location=New York, US|year=2005|isbn=0-670-03384-7|page=420}}</ref> His most recent work is on decision theory for problems of self-modification and for Newcomblike problems, decision problems in which the outcome depends on a prediction of the agent's own choice.
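The following minimal sketch works through [[Newcomb's paradox]], the canonical Newcomblike problem, by comparing the expected payoffs of taking one box versus both boxes. It is an illustration only, not code from any of Yudkowsky's publications; the payoff amounts and the predictor-accuracy parameter <code>p</code> are the conventional illustrative values, chosen here as assumptions.

<syntaxhighlight lang="python">
# Illustrative sketch of Newcomb's problem: a predictor puts $1,000,000
# in an opaque box only if it predicts the agent will take that box
# alone; a transparent box always holds $1,000. "Newcomblike" problems
# share this structure: the outcome depends on a prediction of the
# agent's own choice. Payoffs and accuracies are assumptions for
# illustration, not values from Yudkowsky's papers.

def expected_value(action: str, p: float) -> float:
    """Expected payoff of an action given predictor accuracy p."""
    if action == "one-box":
        # With probability p the predictor foresaw one-boxing and
        # filled the opaque box with $1,000,000.
        return p * 1_000_000
    if action == "two-box":
        # With probability p the predictor foresaw two-boxing and left
        # the opaque box empty; otherwise the agent gets both prizes.
        return p * 1_000 + (1 - p) * 1_001_000
    raise ValueError(f"unknown action: {action!r}")

for p in (0.5, 0.51, 0.9, 0.99):
    one = expected_value("one-box", p)
    two = expected_value("two-box", p)
    print(f"p={p:.2f}  one-box=${one:>12,.0f}  two-box=${two:>12,.0f}")
</syntaxhighlight>

With these payoffs, one-boxing has the higher expected value for any accuracy above roughly 0.5005, even though causal decision theory recommends two-boxing because the boxes' contents are already fixed; this tension is what motivates alternative decision theories such as Yudkowsky's timeless decision theory.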

Yudkowsky was, along with [[Robin Hanson]], one of the principal contributors to the blog ''Overcoming Bias'',<ref>{{cite web|url=http://www.overcomingbias.com/about|title=Overcoming Bias: About|publisher=Robin Hanson|accessdate = 2012-02-01}}</ref>{{primary-inline}} sponsored by the [[Future of Humanity Institute]] of Oxford University. In early 2009, he helped to found ''Less Wrong'', a "community blog devoted to refining the art of human rationality".<ref>{{cite web|url=http://lesswrong.com/|title=Welcome to Less Wrong|publisher=Less Wrong|accessdate = 2012-02-01}}</ref>{{primary-inline}}

Yudkowsky contributed two chapters to [[Oxford University|Oxford]] philosopher [[Nick Bostrom]]'s and Milan Ćirković's edited volume ''Global Catastrophic Risks''.<ref name = "bostrom">{{cite book|editor1-last=Bostrom|editor1-first=Nick|editor1-link=Nick_Bostrom|editor2-last=Ćirković|editor2-first=Milan M.|title=Global Catastrophic Risks|publisher=Oxford University Press|location=Oxford, UK|year=2008|isbn=978-0-19-857050-9|pages=91–119, 308–345}}</ref>

Yudkowsky has also written several works of science fiction and other fiction. His [[Harry Potter]] [[fan fiction]] story ''Harry Potter and the Methods of Rationality'' illustrates topics in [[cognitive science]] and [[rationality]] (''[[The New Yorker]]'' described it as "a thousand-page online 'fanfic' text ... which recasts the original story in an attempt to explain Harry's wizardry through the scientific method"<ref>p. 54, [http://www.newyorker.com/reporting/2011/11/28/111128fa_fact_packer "No Death, No Taxes: The libertarian futurism of a Silicon Valley billionaire"]</ref>). It has been reviewed by authors [[David Brin]],<ref>{{cite web|author=David Brin |url=http://davidbrin.blogspot.com/2010/06/secret-of-college-life-plus.html |title=CONTRARY BRIN: A secret of college life... plus controversies and science! |publisher=Davidbrin.blogspot.com |date=2010-06-21 |accessdate=2012-08-31}}</ref><ref>[http://www.theatlantic.com/entertainment/archive/2011/07/harry-potter-and-the-key-to-immortality/241972/ "'Harry Potter' and the Key to Immortality"], Daniel Snyder, ''[[The Atlantic]]''</ref><ref>{{cite web|author=David Brin |url=http://davidbrin.blogspot.com/2012/01/david-brins-list-of-greatest-science.html |title=CONTRARY BRIN: David Brin's List of "Greatest Science Fiction and Fantasy Tales" |publisher=Davidbrin.blogspot.com |date=2012-01-20 |accessdate=2012-08-31}}</ref><ref>http://davidbrin.blogspot.com/2013/02/science-fiction-and-our-duty-to-past.html</ref> Rachel Aaron,<ref>{{cite web|author=Authors |url=http://www.fantasybookreview.co.uk/blog/2012/04/02/rachel-aaron-interview-april-2012/ |title=Rachel Aaron interview (April 2012) |publisher=Fantasybookreview.co.uk |date=2012-04-02 |accessdate=2012-08-31}}</ref><ref>{{cite web|url=http://civilian-reader.blogspot.com/2011/05/interview-with-rachel-aaron.html |title=Civilian Reader: An Interview with Rachel Aaron |publisher=Civilian-reader.blogspot.com |date=2011-05-04 |accessdate=2012-08-31}}</ref> [[Robin Hanson]],<ref>{{cite web|last=Hanson |first=Robin |url=http://www.overcomingbias.com/2010/10/hyper-rational-harry.html |title=Hyper-Rational Harry |publisher=Overcoming Bias |date=2010-10-31 |accessdate=2012-08-31}}</ref> and [[Aaron Swartz]],<ref>{{cite web|last=Swartz |first=Aaron |url=http://web.archive.org/web/20130316081659/http://www.aaronsw.com/weblog/books2011 |title=The 2011 Review of Books (Aaron Swartz's Raw Thought) |publisher=archive.org |date= |accessdate=2013-04-10}}</ref> and by programmer [[Eric S. Raymond]].<ref>{{cite web|url=http://esr.ibiblio.org/?p=2100 |title=Harry Potter and the Methods of Rationality |publisher=Esr.ibiblio.org |date=2010-07-06 |accessdate=2012-08-31}}</ref>

==References==
{{Reflist}}

==Publications==
*[http://www.singinst.org/upload/CFAI/index.html Creating Friendly AI]{{dead link|date=February 2014}} (2001)
*[http://www.singinst.org/upload/LOGI//LOGI.pdf Levels of Organization in General Intelligence]{{dead link|date=February 2014}} (2002)
*[http://singinst.org/upload/CEV.html Coherent Extrapolated Volition]{{dead link|date=February 2014}} (2004)
*[http://singinst.org/upload/TDT-v01o.pdf Timeless Decision Theory]{{dead link|date=February 2014}} (2010)
*[http://singinst.org/upload/complex-value-systems.pdf Complex Value Systems are Required to Realize Valuable Futures]{{dead link|date=February 2014}} (2011)
*[http://intelligence.org/files/TilingAgents.pdf Tiling Agents for Self-Modifying AI, and the Löbian Obstacle] (2013)
*[http://intelligence.org/files/Comparison.pdf A Comparison of Decision Algorithms on Newcomblike Problems] (2013)
*[http://intelligence.org/files/RobustCooperation.pdf Robust Cooperation in the Prisoner's Dilemma: Program Equilibrium via Provability Logic] (2014)

==Further reading==
* ''Our Molecular Future: How Nanotechnology, Robotics, Genetics and Artificial Intelligence Will Transform Our World'' by Douglas Mulhall, 2002, p. 321.
* ''The Spike: How Our Lives Are Being Transformed By Rapidly Advancing Technologies'' by Damien Broderick, 2001, pp. 236, 265–272, 289, 321, 324, 326, 337–339, 345, 353, 370.

==External links==
* [http://yudkowsky.net/ Personal web site]
* [http://lesswrong.com/ Less Wrong] - "A community blog devoted to refining the art of human rationality" co-founded by Yudkowsky.
* [http://intelligence.org/team/ Biography page at the Machine Intelligence Research Institute]
* [http://unjobs.org/authors/eliezer-yudkowsky Downloadable papers and bibliography]
* [http://hpmor.com/ Harry Potter and the Methods of Rationality]
* [http://itunes.apple.com/us/podcast/harry-potter-methods-rationality/id431784580 Harry Potter and the Methods of Rationality - The Podcast]

{{Persondata
| NAME              = Yudkowsky, Eliezer
| ALTERNATIVE NAMES =
| SHORT DESCRIPTION = American blogger, writer, and advocate for Friendly artificial intelligence
| DATE OF BIRTH     = September 11, 1979
| PLACE OF BIRTH    =
| DATE OF DEATH     =
| PLACE OF DEATH    =
}}
