LessWrong

From Wikipedia, the free encyclopedia
LessWrong
Type of site: Internet forum, blog
Available in: English
Created by: Eliezer Yudkowsky
URL: LessWrong.com
Registration: Optional, but required for contributing content
Launched: February 1, 2009
Current status: Active
Written in: JavaScript, CSS (powered by React and GraphQL)

LessWrong (also written Less Wrong) is a community blog and forum focused on discussion of cognitive biases, philosophy, psychology, economics, rationality, and artificial intelligence, among other topics.[1][2]

Purpose

LessWrong promotes lifestyle changes believed by its community to lead to increased rationality and self-improvement. Posts often focus on avoiding biases related to decision-making and the evaluation of evidence. One suggestion is the use of Bayes' theorem as a decision-making tool.[2] There is also a focus on psychological barriers that prevent good decision-making, including fear conditioning and cognitive biases that have been studied by the psychologist Daniel Kahneman.[3]
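The Bayesian updating that such posts advocate can be sketched in a few lines. The function and the disease-test numbers below are illustrative inventions, not drawn from LessWrong itself:

```python
def bayes_update(prior, likelihood, false_positive_rate):
    """Posterior probability of hypothesis H after observing evidence E.

    prior               -- P(H), belief before seeing the evidence
    likelihood          -- P(E | H), chance of the evidence if H is true
    false_positive_rate -- P(E | not H), chance of the evidence if H is false
    """
    # Total probability of the evidence, P(E)
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    # Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E)
    return likelihood * prior / evidence

# Hypothetical example: a condition with a 1% base rate, and a test that is
# 90% sensitive but false-positives 5% of the time.
posterior = bayes_update(prior=0.01, likelihood=0.9, false_positive_rate=0.05)
print(round(posterior, 3))  # 0.154 -- a positive test raises belief from 1% to ~15%
```

The point often stressed in this style of reasoning is that the posterior (~15%) is far lower than the test's 90% sensitivity would intuitively suggest, because the low base rate dominates.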

LessWrong is also concerned with transhumanism, existential threats and the singularity. The New York Observer noted that "Despite describing itself as a forum on 'the art of human rationality,' the New York Less Wrong group ... is fixated on a branch of futurism that would seem more at home in a 3D multiplex than a graduate seminar: the dire existential threat—or, with any luck, utopian promise—known as the technological Singularity ... Branding themselves as 'rationalists,' as the Less Wrong crew has done, makes it a lot harder to dismiss them as a 'doomsday cult'."[4]

History

LessWrong developed from Overcoming Bias, an earlier group blog focused on human rationality, which began in November 2006, with artificial intelligence theorist Eliezer Yudkowsky and economist Robin Hanson as the principal contributors. In February 2009, Yudkowsky's posts were used as the seed material to create the community blog LessWrong, and Overcoming Bias became Hanson's personal blog.[5] In 2013, a significant portion of the rationalist community shifted focus to Scott Alexander's Slate Star Codex.[6]

LessWrong and its surrounding movement are the subjects of the 2019 book The AI Does Not Hate You, written by former BuzzFeed science correspondent Tom Chivers.[7][8][9]

Roko's basilisk

In July 2010, LessWrong contributor Roko posted a thought experiment to the site in which an otherwise benevolent future AI system tortures people who heard of the AI before it came into existence and failed to work tirelessly to bring it into existence, in order to incentivise said work. Using Yudkowsky's timeless decision theory, the post claimed doing so would be beneficial for the AI even though it cannot causally affect people in the present. The idea came to be known as "Roko's basilisk", from Roko's suggestion that merely hearing about the idea would give the hypothetical AI system stronger incentives to employ blackmail. Yudkowsky deleted Roko's posts on the topic, saying that posting it was "stupid", as disseminating information that can be harmful merely to be aware of is itself a harmful act, and that the idea, while critically flawed, represented a space of thinking that could contain "a genuinely dangerous thought", something considered an information hazard. Discussion of Roko's basilisk was banned on LessWrong for several years because Yudkowsky stated that it had caused some readers to have nervous breakdowns.[10][11][4] The ban was lifted in October 2015.[12]

David Auerbach wrote in Slate "the combination of messianic ambitions, being convinced of your own infallibility, and a lot of cash never works out well, regardless of ideology, and I don't expect Yudkowsky and his cohorts to be an exception. I worry less about Roko's Basilisk than about people who believe themselves to have transcended conventional morality."[11]

Roko's basilisk was referenced in Canadian musician Grimes's music video for her 2015 song "Flesh Without Blood" through a character named "Rococo Basilisk" who was described by Grimes as "doomed to be eternally tortured by an artificial intelligence, but she's also kind of like Marie Antoinette". After thinking of this pun and finding that Grimes had already made it, Elon Musk contacted Grimes, which led to them dating.[13][14] The concept was also referenced in an episode of Silicon Valley titled "Facial Recognition".[15]

The Basilisk has been compared to Pascal's wager.[16]

Neoreaction

The neoreactionary movement first grew on LessWrong,[17][18] attracted by discussions on the site of eugenics and evolutionary psychology.[19] Yudkowsky has strongly rejected neoreaction.[18][20][21] In a survey among LessWrong users in 2016, 28 out of 3060 respondents, or 0.92%, identified as "neoreactionary".[22]

Effective altruism

LessWrong played a significant role in the development of the effective altruism (EA) movement,[23] and the two communities are closely intertwined.[24]: 227  In a survey of LessWrong users in 2016, 664 out of 3060 respondents, or 21.7%, identified as "effective altruists". A separate survey of effective altruists in 2014 revealed that 31% of respondents had first heard of EA through LessWrong,[24] though that number had fallen to 8.2% by 2020.[25] Two early proponents of effective altruism, Toby Ord and William MacAskill, met transhumanist philosopher Nick Bostrom at Oxford University. Bostrom's research influenced many effective altruists to work on existential risk reduction.[24]

References

  1. ^ "Less Wrong FAQ". LessWrong. Archived from the original on 30 April 2019. Retrieved 25 March 2014.
  2. ^ a b Miller, James (28 July 2011). "You Can Learn How To Become More Rational". Business Insider. Archived from the original on 10 August 2018. Retrieved 25 March 2014.
  3. ^ Burkeman, Oliver (9 March 2012). "This column will change your life: asked a tricky question? Answer an easier one". The Guardian. Archived from the original on 26 March 2014. Retrieved 25 March 2014.
  4. ^ a b Tiku, Nitasha (25 July 2012). "Faith, Hope, and Singularity: Entering the Matrix with New York's Futurist Set". Observer. Archived from the original on 12 April 2019. Retrieved 12 April 2019.
  5. ^ "Where did Less Wrong come from? (LessWrong FAQ)". Archived from the original on 30 April 2019. Retrieved 25 March 2014.
  6. ^ Lewis-Kraus, Gideon (9 July 2020). "Slate Star Codex and Silicon Valley's War Against the Media". The New Yorker. Archived from the original on 10 July 2020. Retrieved 4 August 2020.
  7. ^ Cowdrey, Katherine (21 September 2017). "W&N wins Buzzfeed science reporter's debut after auction". The Bookseller. Archived from the original on 27 November 2018. Retrieved 21 September 2017.
  8. ^ Chivers, Tom (2019). The AI Does Not Hate You. Weidenfeld & Nicolson. ISBN 978-1474608770.
  9. ^ Marriott, James (31 May 2019). "The AI Does Not Hate You by Tom Chivers review — why the nerds are nervous". The Times. ISSN 0140-0460. Archived from the original on 23 April 2020. Retrieved 3 May 2020.
  10. ^ Love, Dylan (6 August 2014). "WARNING: Just Reading About This Thought Experiment Could Ruin Your Life". Business Insider. Archived from the original on 18 November 2018. Retrieved 6 December 2014.
  11. ^ a b Auerbach, David (17 July 2014). "The Most Terrifying Thought Experiment of All Time". Slate. Archived from the original on 25 October 2018. Retrieved 18 July 2014.
  12. ^ RobbBB (5 October 2015). "A few misconceptions surrounding Roko's basilisk". LessWrong. Archived from the original on 15 March 2018. Retrieved 10 April 2016. The Roko's basilisk ban isn't in effect anymore
  13. ^ Paez, Danny (5 August 2018). "Elon Musk and Grimes: "Rococo Basilisk" Links the Two on Twitter". Inverse. Archived from the original on 24 July 2020. Retrieved 24 July 2020.
  14. ^ Oberhaus, Daniel (8 May 2018). "Explaining Roko's Basilisk, the Thought Experiment That Brought Elon Musk and Grimes Together". Vice. Archived from the original on 25 July 2020. Retrieved 24 July 2020.
  15. ^ Burch, Sean (23 April 2018). "'Silicon Valley' Fact Check: That Thought Experiment Is Real and Horrifying". TheWrap. Archived from the original on 12 November 2020. Retrieved 12 November 2020.
  16. ^ Paul-Choudhury, Sumit (2 August 2019). "Tomorrow's Gods: What is the future of religion?". BBC. Archived from the original on 1 September 2020. Retrieved 28 August 2020.
  17. ^ Riggio, Adam (23 September 2016). "The Violence of Pure Reason: Neoreaction: A Basilisk". Social Epistemology Review and Reply Collective. 5 (9): 34–41. ISSN 2471-9560. Archived from the original on 5 October 2016. Retrieved 5 October 2016. The embryo of the movement lived in the community pages of Yudkowsky’s blog LessWrong, a website dedicated to refining human rationality.
  18. ^ a b Siemons, Mark (14 April 2017). "Neoreaktion im Silicon Valley: Wenn Maschinen denken". Frankfurter Allgemeine Zeitung (in German). ISSN 0174-4909. Archived from the original on 13 June 2022. Retrieved 23 March 2019.
  19. ^ Keep, Elmo (22 June 2016). "The Strange and Conflicting World Views of Silicon Valley Billionaire Peter Thiel". Fusion. Archived from the original on 13 February 2017. Retrieved 5 October 2016. Thanks to LessWrong’s discussions of eugenics and evolutionary psychology, it has attracted some readers and commenters affiliated with the alt-right and neoreaction, that broad cohort of neofascist, white nationalist and misogynist trolls.
  20. ^ Riggio, Adam (23 September 2016). "The Violence of Pure Reason: Neoreaction: A Basilisk". Social Epistemology Review and Reply Collective. 5 (9): 34–41. ISSN 2471-9560. Archived from the original on 5 October 2016. Retrieved 5 October 2016. Land and Yarvin are openly allies with the new reactionary movement, while Yudkowsky counts many reactionaries among his fanbase despite finding their racist politics disgusting.
  21. ^ Eliezer Yudkowsky (8 April 2016). "Untitled". Optimize Literally Everything (blog). Archived from the original on 26 May 2019. Retrieved 7 October 2016.
  22. ^ Hermansson, Patrik; Lawrence, David; Mulhall, Joe; Murdoch, Simon (2020). "The Dark Enlightenment: Neoreaction and Silicon Valley". The International Alt-Right. Fascism for the 21st Century?. Abingdon-on-Thames, England, UK: Routledge. ISBN 9781138363861. Archived from the original on 13 June 2022. Retrieved 2 October 2020.
  23. ^ de Lazari-Radek, Katarzyna; Singer, Peter (27 September 2017). Utilitarianism: A Very Short Introduction. Oxford University Press. p. 110. ISBN 9780198728795.
  24. ^ a b c Chivers, Tom (2019). "Chapter 38: The Effective Altruists". The AI Does Not Hate You. Weidenfeld & Nicolson. ISBN 978-1474608770.
  25. ^ Moss, David (20 May 2021). "EA Survey 2020: How People Get Involved in EA". Effective Altruism Forum. Archived from the original on 28 July 2021. Retrieved 28 July 2021.
