
LessWrong


LessWrong
Type of site: Internet forum, blog
Available in: English
Created by: Eliezer Yudkowsky
URL: LessWrong.com
Registration: Optional, but is required for contributing content
Launched: February 1, 2009
Current status: Active
Written in: JavaScript, CSS (powered by React and GraphQL)

LessWrong (also written Less Wrong) is a community blog and forum focused on discussion of cognitive biases, philosophy, psychology, economics, rationality, and artificial intelligence, among other topics.[1][2]

Purpose


LessWrong promotes lifestyle changes believed by its community to lead to increased rationality and self-improvement. The best-known posts on LessWrong are "The Sequences", a series of essays which aim to describe how to avoid the typical failure modes of human reasoning, with the goal of improving decision-making and the evaluation of evidence.[3][4] One suggestion is the use of Bayes' theorem as a decision-making tool.[2] There is also a focus on psychological barriers that prevent good decision-making, including fear conditioning and cognitive biases that have been studied by the psychologist Daniel Kahneman.[5]
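A brief worked example (an illustration added here, not drawn from the site itself) shows how Bayes' theorem can serve as a decision-making tool. For a hypothesis H and observed evidence E:

P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E \mid H)\, P(H) + P(E \mid \neg H)\, P(\neg H)}

For instance, if a condition has a 1% base rate, and a test detects it 90% of the time with a 9% false-positive rate, a positive result yields P(H \mid E) = (0.9)(0.01) / ((0.9)(0.01) + (0.09)(0.99)) ≈ 0.09, so the condition remains unlikely despite the positive test.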

LessWrong is also concerned with artificial intelligence, transhumanism, existential threats and the singularity. The New York Observer in 2012 noted that "Despite describing itself as a forum on 'the art of human rationality,' the New York Less Wrong group... is fixated on a branch of futurism that would seem more at home in a 3D multiplex than a graduate seminar: the dire existential threat—or, with any luck, utopian promise—known as the technological Singularity... Branding themselves as 'rationalists,' as the Less Wrong crew has done, makes it a lot harder to dismiss them as a 'doomsday cult'."[6]

History

Eliezer Yudkowsky at Stanford University in 2006

LessWrong developed from Overcoming Bias, an earlier group blog focused on human rationality, which began in November 2006 with artificial intelligence researcher Eliezer Yudkowsky and economist Robin Hanson as the principal contributors. In February 2009, Yudkowsky's posts were used as the seed material to create the community blog LessWrong, and Overcoming Bias became Hanson's personal blog.[7] In 2013, a significant portion of the rationalist community shifted focus to Scott Alexander's Slate Star Codex.[8]

Artificial intelligence


Discussions of AI on LessWrong include AI alignment, AI safety,[9] and machine consciousness.[citation needed] Articles posted on LessWrong about AI have been cited in the news media.[9][10] LessWrong and its surrounding movement's work on AI are the subjects of the 2019 book The AI Does Not Hate You, written by former BuzzFeed science correspondent Tom Chivers.[11][12][13]

Effective altruism


LessWrong played a significant role in the development of the effective altruism (EA) movement,[14] and the two communities are closely intertwined.[15]: 227  In a survey of LessWrong users in 2016, 664 out of 3,060 respondents, or 21.7%, identified as "effective altruists". A separate survey of effective altruists in 2014 revealed that 31% of respondents had first heard of EA through LessWrong,[15] though that number had fallen to 8.2% by 2020.[16]

Roko's basilisk


In July 2010, LessWrong contributor Roko posted a thought experiment to the site in which an otherwise benevolent future AI system tortures people who heard of the AI before it came into existence and failed to work tirelessly to bring it into existence, in order to incentivise said work. This idea came to be known as "Roko's basilisk", based on Roko's idea that merely hearing about the idea would give the hypothetical AI system an incentive to try such blackmail.[17][18][6]

Neoreaction


The comment section of Overcoming Bias attracted prominent neoreactionaries such as Curtis Yarvin (pen name Mencius Moldbug), the founder of the neoreactionary movement,[19] and Hanson posted his side of a debate with Moldbug on futarchy.[20] After LessWrong split from Overcoming Bias, it too attracted some individuals affiliated with neoreaction through its discussions of eugenics and evolutionary psychology.[21] However, Yudkowsky has strongly rejected neoreaction.[22][23] In a 2016 survey of LessWrong users, 28 out of 3,060 respondents (0.92%) identified as "neoreactionary".[24]

Notable users


LessWrong has been associated with several influential contributors. Founder Eliezer Yudkowsky established the platform to promote rationality and raise awareness about potential risks associated with artificial intelligence.[25] Scott Alexander became one of the site's most popular writers before starting his own blog, Slate Star Codex, contributing discussions on AI safety and rationality.[25]

Further notable users of LessWrong include Paul Christiano, Wei Dai and Zvi Mowshowitz. A selection of posts by these and other contributors, chosen through a community review process,[26] was published as part of the essay collections "A Map That Reflects the Territory"[27] and "The Engines of Cognition".[28][26][29]

References

  1. ^"Less Wrong FAQ".LessWrong.Archivedfrom the original on 30 April 2019.Retrieved25 March2014.
  2. ^abMiller, James (28 July 2011)."You Can Learn How To Become More Rational".Business Insider.Archivedfrom the original on 10 August 2018.Retrieved25 March2014.
  3. ^Lewis-Kraus, Gideon (9 July 2020)."Slate Star Codex and Silicon Valley's War Against the Media".The New Yorker.Archivedfrom the original on 10 July 2020.Retrieved4 August2020.
  4. ^"Sequences Highlights".LessWrong.Archivedfrom the original on 6 July 2024.Retrieved12 July2024.
  5. ^Burkeman, Oliver (9 March 2012)."This column will change your life: asked a tricky question? Answer an easier one".The Guardian.Archivedfrom the original on 26 March 2014.Retrieved25 March2014.
  6. ^abTiku, Nitasha (25 July 2012)."Faith, Hope, and Singularity: Entering the Matrix with New York's Futurist Set".Observer.Archivedfrom the original on 12 April 2019.Retrieved12 April2019.
  7. ^"Where did Less Wrong come from? (LessWrong FAQ)".Archivedfrom the original on 30 April 2019.Retrieved25 March2014.
  8. ^Lewis-Kraus, Gideon (9 July 2020)."Slate Star Codex and Silicon Valley's War Against the Media".The New Yorker.Archivedfrom the original on 10 July 2020.Retrieved4 August2020.
  9. ^abChivers, Tom (22 November 2023)."What we've learned about the robot apocalypse from the OpenAI debacle".Semafor (website).Archivedfrom the original on 3 March 2024.Retrieved14 July2024.Since the late 1990s those worries have become more specific, and coalesced around Nick Bostrom's 2014 bookSuperintelligence: Paths, Dangers, Strategiesand Eliezer Yudkowsky's blog LessWrong.
  10. ^Newport, Cal (15 March 2024)."Can an A.I. Make Plans?".The New Yorker.ISSN0028-792X.Retrieved14 July2024.
  11. ^Cowdrey, Katherine (21 September 2017)."W&N wins Buzzfeed science reporter's debut after auction".The Bookseller.Archivedfrom the original on 27 November 2018.Retrieved21 September2017.
  12. ^Chivers, Tom (2019).The AI Does Not Hate You.Weidenfeld & Nicolson.ISBN978-1474608770.
  13. ^Marriott, James (31 May 2019)."The AI Does Not Hate You by Tom Chivers review — why the nerds are nervous".The Times.ISSN0140-0460.Archivedfrom the original on 23 April 2020.Retrieved3 May2020.
  14. ^de Lazari-Radek, Katarzyna;Singer, Peter(27 September 2017).Utilitarianism: A Very Short Introduction.Oxford University Press. p. 110.ISBN9780198728795.{{cite book}}:CS1 maint: date and year (link)
  15. ^abChivers, Tom (2019). "Chapter 38: The Effective Altruists".The AI Does Not Hate You.Weidenfeld & Nicolson.ISBN978-1474608770.
  16. ^Moss, David (20 May 2021)."EA Survey 2020: How People Get Involved in EA".Effective Altruism Forum.Archivedfrom the original on 28 July 2021.Retrieved28 July2021.
  17. ^Love, Dylan (6 August 2014)."WARNING: Just Reading About This Thought Experiment Could Ruin Your Life".Business Insider.Archivedfrom the original on 18 November 2018.Retrieved6 December2014.
  18. ^Auerbach, David(17 July 2014)."The Most Terrifying Thought Experiment of All Time".Slate.Archivedfrom the original on 25 October 2018.Retrieved18 July2014.
  19. ^Sandifer, Elizabeth (2018).Neoreaction a Basilisk: Essays On and Around the Alt-Right(2nd ed.). Eruditorum Press.one of the sites where [Moldbug] got his start as a commenter was on Overcoming Bias, i.e. where Yudkowsky was writing before LessWrong.
  20. ^Hanson, Robin."My Moldbug Debate".Overcoming Bias.Archived fromthe originalon 24 January 2010.Retrieved13 July2024.
  21. ^Keep, Elmo (22 June 2016)."The Strange and Conflicting World Views of Silicon Valley Billionaire Peter Thiel".Fusion.Archivedfrom the original on 13 February 2017.Retrieved5 October2016.Thanks to LessWrong's discussions of eugenics and evolutionary psychology, it has attracted some readers and commenters affiliated with the alt-right and neoreaction, that broad cohort of neofascist, white nationalist and misogynist trolls.
  22. ^Riggio, Adam (23 September 2016)."The Violence of Pure Reason: Neoreaction: A Basilisk".Social Epistemology Review and Reply Collective.5(9): 34–41.ISSN2471-9560.Archivedfrom the original on 5 October 2016.Retrieved5 October2016.Land and Yarvin are openly allies with the new reactionary movement, while Yudkowsky counts many reactionaries among his fanbase despite finding their racist politics disgusting.
  23. ^Eliezer Yudkowsky (8 April 2016)."Untitled".Optimize Literally Everything (blog).Archivedfrom the original on 26 May 2019.Retrieved7 October2016.
  24. ^Hermansson, Patrik; Lawrence, David; Mulhall, Joe; Murdoch, Simon (2020)."The Dark Enlightenment: Neoreaction and Silicon Valley".The International Alt-Right. Fascism for the 21st Century?.Abingdon-on-Thames, England, UK: Routledge.ISBN9781138363861.Archivedfrom the original on 13 June 2022.Retrieved2 October2020.
  25. ^abMiller, J.D. (2017). "Reflections on the Singularity Journey". In Callaghan, V.; Miller, J.; Yampolskiy, R.; Armstrong, S. (eds.).The Technological Singularity.The Frontiers Collection. Berlin, Heidelberg: Springer. pp. 225–226.ISBN978-3-662-54033-6.Yudkowsky helped create the Singularity Institute (now called the Machine Intelligence Research Institute) to help mankind achieve a friendly Singularity. (Disclosure: I have contributed to the Singularity Institute.) Yudkowsky then founded the community bloghttp://LessWrong.com,which seeks to promote the art of rationality, to raise the sanity waterline, and to in part convince people to make considered, rational charitable donations, some of which, Yudkowsky (correctly) hoped, would go to his organization.
  26. ^abGasarch, William (2022). "Review of" A Map that Reflects the Territory: Essays by the LessWrong Community "".ACM SIGACT News.53(1): 13–24.doi:10.1145/3532737.3532741.Users wrote reviews of the best posts of 2018, and voted on them using the quadratic voting system, popularized by Glen Weyl and Vitalik Buterin. From the 2000+ posts published that year, the Review narrowed down the 44 most interesting and valuable posts.
  27. ^Lagerros, J.; Pace, B.; LessWrong.com (2020).A Map That Reflects the Territory: Essays by the LessWrong Community.Center for Applied Rationality.ISBN9781736128503.
  28. ^Pace, B.; LessWrong (2021).The Engines of Cognition: Essays by the LessWrong Community.Center for Applied Rationality.ISBN9781736128510.
  29. ^Gasarch, William (2022). "Review of" The Engines of Cognition: Essays by the Less Wrong Community "".ACM SIGACT News.53(3): 6–16.doi:10.1145/3561066.3561064.