Eliezer Yudkowsky: Difference between revisions

Article snapshot taken from Wikipedia with creative commons attribution-sharealike license. Give it a read and then ask your questions in the chat. We can research this topic together.

Revision as of 06:22, 20 February 2014

Eliezer Shlomo Yudkowsky (born September 11, 1979) is an American blogger, writer, and advocate for Friendly artificial intelligence.

Biography

Yudkowsky, a resident of Berkeley, California, has no formal education in computer science or artificial intelligence. He co-founded the nonprofit Machine Intelligence Research Institute (formerly the Singularity Institute for Artificial Intelligence) in 2000 and remains employed there as a full-time Research Fellow. He scored 1410 on the SAT at age eleven and a perfect 1600 four years later.

Work

Yudkowsky's interests focus on artificial intelligence theory for self-understanding, self-modification, and recursive self-improvement (seed AI), and on AI architectures and decision theories for stable motivational structures (Friendly AI and Coherent Extrapolated Volition in particular). Apart from his research work, Yudkowsky has written explanations of mathematical and philosophical topics in non-academic language, particularly on rationality, such as "An Intuitive Explanation of Bayes' Theorem".
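
For illustration, Bayes' theorem states that P(H|E) = P(E|H) P(H) / P(E), where P(E) = P(E|H) P(H) + P(E|not H) P(not H). The short Python sketch below works through a hypothetical screening-test example of the kind that essay discusses; the numbers are illustrative assumptions, not quotations from the essay.

# A minimal worked example of Bayes' theorem (hypothetical numbers).
def bayes_posterior(prior, true_positive_rate, false_positive_rate):
    """Posterior probability of hypothesis H after observing positive evidence E."""
    p_evidence = (true_positive_rate * prior
                  + false_positive_rate * (1.0 - prior))
    return true_positive_rate * prior / p_evidence

if __name__ == "__main__":
    # 1% base rate, 80% sensitivity, 9.6% false-positive rate: the posterior
    # is roughly 7.8%, far below the intuitive guess of "about 80%".
    print(bayes_posterior(prior=0.01,
                          true_positive_rate=0.80,
                          false_positive_rate=0.096))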

Publications

Yudkowsky was, along with Robin Hanson, one of the principal contributors to the blog Overcoming Bias, sponsored by the Future of Humanity Institute of Oxford University. In early 2009, he helped found Less Wrong, a "community blog devoted to refining the art of human rationality". The Sequences on Less Wrong comprise over two years of blog posts on epistemology, artificial intelligence, and metaethics.

Yudkowsky's most recent work is on decision theory for self-modification and for Newcomblike problems, including "Tiling Agents for Self-Modifying AI, and the Löbian Obstacle" and "Robust Cooperation in the Prisoner's Dilemma: Program Equilibrium via Provability Logic". "A Comparison of Decision Algorithms on Newcomblike Problems" summarizes some of Yudkowsky's work on timeless decision theory.
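
For context, a Newcomblike problem involves a predictor that fills an opaque box with a large prize only if it forecasts that the agent will take that box alone, while a transparent box always holds a small prize. The Python sketch below uses standard illustrative payoffs (assumed values; it is not code from the cited papers) to show why a decision rule that treats the prediction as tracking the agent's actual choice favors one-boxing, whereas treating the box contents as already fixed favors two-boxing.

# Newcomb's problem with standard illustrative payoffs (assumed values).
PREDICTOR_ACCURACY = 0.99   # assumed accuracy of the predictor
BIG_PRIZE = 1_000_000       # opaque box, filled only on a predicted one-box
SMALL_PRIZE = 1_000         # transparent box, always present

def expected_payoff(one_box):
    """Expected payoff when the prediction tracks the agent's actual choice."""
    p_opaque_full = PREDICTOR_ACCURACY if one_box else 1.0 - PREDICTOR_ACCURACY
    small = 0 if one_box else SMALL_PRIZE
    return small + p_opaque_full * BIG_PRIZE

if __name__ == "__main__":
    print("one-box:", expected_payoff(True))    # ~990,000
    print("two-box:", expected_payoff(False))   # ~ 11,000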

Yudkowsky contributed two chapters to Global Catastrophic Risks, a volume edited by Oxford philosopher Nick Bostrom and Milan Ćirković, and contributed the paper "Complex Value Systems are Required to Realize Valuable Futures" to the AGI-11 conference.

Yudkowsky is the author of the Singularity Institute publications "Creating Friendly AI" (2001), "Levels of Organization in General Intelligence" (2002), "Coherent Extrapolated Volition" (2004), and "Timeless Decision Theory" (2010).

Yudkowsky played the role of the AI in the first AI-box experiments and wrote a page describing the rules he used for the game.

Yudkowsky has also written several works of science fiction and other fiction. His Harry Potter fan fiction Harry Potter and the Methods of Rationality illustrates topics in cognitive science and rationality; The New Yorker described it as "a thousand-page online 'fanfic' text called 'Harry Potter and the Methods of Rationality', which recasts the original story in an attempt to explain Harry's wizardry through the scientific method". The story has been reviewed by authors David Brin and Rachel Aaron, by Robin Hanson, by Aaron Swartz, and by programmer Eric S. Raymond.

References

  1. Goodreads author page
  2. Miller, James (2012). Singularity Rising. Texas: BenBella Books. pp. 35–44. ISBN 1936661659.
  3. "Singularity Institute for Artificial Intelligence: Team". Singularity Institute for Artificial Intelligence. Retrieved 2009-07-16.
  4. Miller, James (2012). Singularity Rising. Texas: BenBella Books. p. 35. ISBN 1936661659.
  5. Kurzweil, Ray (2005). The Singularity Is Near. New York, US: Viking Penguin. p. 599. ISBN 0-670-03384-7.
  6. Miller, James (2012). Singularity Rising. Texas: BenBella Books. p. 38. ISBN 1936661659.
  7. Broderick, Damien (2001). The Spike: How Our Lives Are Being Transformed By Rapidly Advancing Technologies. p. 265.
  8. Kurzweil, Ray (2005). The Singularity Is Near. New York, US: Viking Penguin. p. 420. ISBN 0-670-03384-7.
  9. Yudkowsky, Eliezer. "An Intuitive Explanation of Bayes' Theorem".
  10. "Overcoming Bias: About". Robin Hanson. Retrieved 2012-02-01.
  11. "Welcome to Less Wrong". Less Wrong. Retrieved 2012-02-01.
  12. "Sequences-Lesswrongwiki". Retrieved 2012-02-01.
  13. "Tiling Agents for Self-Modifying AI, and the Löbian Obstacle" (PDF). Machine Intelligence Research Institute. Retrieved 2013-08-26.
  14. "Robust Cooperation in the Prisoner's Dilemma: Program Equilibrium via Provability Logic" (PDF). Machine Intelligence Research Institute. Retrieved 2013-08-26.
  15. "A Comparison of Decision Algorithms on Newcomblike Problems" (PDF). Machine Intelligence Research Institute. Retrieved 2013-08-26.
  16. Bostrom, Nick; Ćirković, Milan M., eds. (2008). Global Catastrophic Risks. Oxford, UK: Oxford University Press. pp. 91–119, 308–345. ISBN 978-0-19-857050-9.
  17. Yudkowsky, Eliezer (2011). "Complex Value Systems are Required to Realize Valuable Futures" (PDF). AGI-11.
  18. Yudkowsky, Eliezer. "Creating Friendly AI". Singularity Institute for Artificial Intelligence. Retrieved 2012-02-01.
  19. Yudkowsky, Eliezer. "Levels of Organization in General Intelligence" (PDF). Singularity Institute for Artificial Intelligence. Retrieved 2012-02-01.
  20. Yudkowsky, Eliezer. "Coherent Extrapolated Volition". Singularity Institute for Artificial Intelligence. Retrieved 2012-02-01.
  21. Yudkowsky, Eliezer. "Timeless Decision Theory" (PDF). Singularity Institute for Artificial Intelligence. Retrieved 2012-02-01.
  22. "The AI-Box Experiment". Retrieved 2013-08-26.
  23. "Yudkowsky- Fiction". Eliezer Yudkowsky.
  24. "No Death, No Taxes: The libertarian futurism of a Silicon Valley billionaire". The New Yorker. p. 54.
  25. David Brin (2010-06-21). "CONTRARY BRIN: A secret of college life... plus controversies and science!". Davidbrin.blogspot.com. Retrieved 2012-08-31.
  26. "'Harry Potter' and the Key to Immortality", Daniel Snyder, The Atlantic
  27. David Brin (2012-01-20). "CONTRARY BRIN: David Brin's List of "Greatest Science Fiction and Fantasy Tales"". Davidbrin.blogspot.com. Retrieved 2012-08-31.
  28. David Brin (2013-02). Davidbrin.blogspot.com. http://davidbrin.blogspot.com/2013/02/science-fiction-and-our-duty-to-past.html
  29. Authors (2012-04-02). "Rachel Aaron interview (April 2012)". Fantasybookreview.co.uk. Retrieved 2012-08-31.
  30. "Civilian Reader: An Interview with Rachel Aaron". Civilian-reader.blogspot.com. 2011-05-04. Retrieved 2012-08-31.
  31. Hanson, Robin (2010-10-31). "Hyper-Rational Harry". Overcoming Bias. Retrieved 2012-08-31.
  32. Swartz, Aaron. "The 2011 Review of Books (Aaron Swartz's Raw Thought)". archive.org. Retrieved 2013-04-10.
  33. "Harry Potter and the Methods of Rationality". Esr.ibiblio.org. 2010-07-06. Retrieved 2012-08-31.

Further reading

  • Our Molecular Future: How Nanotechnology, Robotics, Genetics and Artificial Intelligence Will Transform Our World by Douglas Mulhall, 2002, p. 321.
  • The Spike: How Our Lives Are Being Transformed By Rapidly Advancing Technologies by Damien Broderick, 2001, pp. 236, 265-272, 289, 321, 324, 326, 337-339, 345, 353, 370.
