Main Page

Explain xkcd: It's 'cause you're dumb.

__NOTOC__{{DISPLAYTITLE:explain xkcd}}

<center>
<font size=5px>''Welcome to the '''explain [[xkcd]]''' wiki!''</font>
We have collaboratively explained [[:Category:Comics|'''{{#expr:{{PAGESINCAT:Comics}}-9}}''' xkcd comics]],
<!-- Note: the -9 in the calculation above is to discount subcategories (there are 8 of them as of 2013-02-27),
     as well as [[List of all comics]], which is obviously not a comic page. -->
and only {{#expr:{{LATESTCOMIC}}-({{PAGESINCAT:Comics}}-9)}}
({{#expr: ({{LATESTCOMIC}}-({{PAGESINCAT:Comics}}-9)) / ({{PAGESINCAT:Comics}}-9) * 100 round 0}}%)
remain. '''[[Help:How to add a new comic explanation|Add yours]]''' while there's a chance!
</center>

== Latest comic ==
<div style="border:1px solid grey; background:#eee; padding:1em;">
<span style="float:right;">[[{{LATESTCOMIC}}|'''Go to this comic explanation''']]</span>
...

* If you're new to wikis like this, take a look at these help pages describing [[mw:Help:Navigation|how to navigate]] the wiki, and [[mw:Help:Editing pages|how to edit]] pages.
* Discussion about various parts of the wiki is going on at [[Explain XKCD:Community portal]]. Share your 2¢!
* [[List of all comics]] contains a complete table of all xkcd comics so far and the corresponding explanations. The red links ([[like this]]) are missing explanations. Feel free to help out by creating them! [[Help:How to add a new comic explanation|Here's how]].

== Rules ==
Don't be a jerk. There are a lot of comics that don't have set-in-stone explanations; feel free to put multiple interpretations in the wiki page for each comic.

If you want to talk about a specific comic, use its discussion page.

Please only submit material directly related to&mdash;and helping everyone better understand&mdash;xkcd... and of course ''only'' submit material that can legally be posted (and freely edited). Off-topic or other inappropriate content is subject to removal or modification at admin discretion, and users who repeatedly post such content will be blocked.

If you need assistance from an admin, feel free to leave a message on their personal discussion page. The list of admins is [[Special:ListUsers/sysop|here]].

== Logo ==
Explain xkcd logo courtesy of [[User:Alek2407]].

[[Category:Root category]]
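
The welcome banner in the markup above is generated by MediaWiki parser functions rather than hard-coded numbers: {{PAGESINCAT:Comics}} counts everything in Category:Comics (minus nine subcategories and list pages) and {{LATESTCOMIC}} holds the number of the newest xkcd comic. The "Expression error: Unrecognised punctuation character" messages in the rendered copy below most likely mean that #expr was handed comma-grouped numbers it cannot parse. As a rough sketch, with made-up counts, the arithmetic those templates perform looks like this in Python:

<pre>
# Minimal sketch (made-up counts) of the arithmetic done by the {{#expr:}} templates above.

pages_in_category = 1209  # hypothetical value of {{PAGESINCAT:Comics}}
non_comic_pages = 9       # subcategories and list pages discounted by the template
latest_comic = 1450       # hypothetical value of {{LATESTCOMIC}}

explained = pages_in_category - non_comic_pages         # comics already explained
remaining = latest_comic - explained                    # comics still missing
percent_remaining = round(remaining / explained * 100)  # percentage, rounded like "round 0"

print(f"{explained} explained, {remaining} to go ({percent_remaining}%)")
# -> 1200 explained, 250 to go (21%)
</pre>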

Revision as of 05:35, 4 March 2013

Welcome to the explain xkcd wiki!

We have collaboratively explained [count] xkcd comics, and only [count remaining] ([percentage]%) remain. Add yours while there's a chance!

Latest comic

Go to this comic explanation

AI-Box Experiment
Title text: I'm working to bring about a superintelligent AI that will eternally torment everyone who failed to make fun of the Roko's Basilisk people.

Explanation

When theorizing about superintelligent AI (an artificial intelligence much smarter than any human), some futurists suggest putting the AI in a "box" – a secure computer with safeguards to stop it from escaping into the Internet and using its vast intelligence to take over the world. The box would allow us to talk to the AI, but otherwise keep it contained.

The AI-box experiment, formulated by Eliezer Yudkowsky, argues that the "box" is not safe, because merely talking to a superintelligence is dangerous. To partially demonstrate this, Yudkowsky had some previous believers in AI-boxing role-play the part of someone keeping an AI in a box while Yudkowsky role-played the AI, and he was able to persuade some of them to agree to let him out of the box even though they had bet money that they would not do so. For context, note that Derren Brown and other expert human persuaders have convinced people to do much stranger things. Yudkowsky, for his part, has refused to explain how he achieved this, claiming that there was no special trick involved, and that if he released the transcripts, readers might merely conclude that they would never be persuaded by his arguments.

The overall thrust is that if even a human can talk other humans into letting them out of a box after those humans avow that nothing could possibly persuade them to do so, then we should probably expect a superintelligence to be able to do the same thing. Yudkowsky uses all of this to argue for the importance of designing a friendly AI (one with carefully shaped motivations) rather than relying on our ability to keep AIs in boxes.

In this comic, the metaphorical box has been replaced by a physical box which looks to be fairly lightweight with a simple lift-off lid, and the AI has manifested in the form of an energy being, although it does have a wired connection to the laptop. Black Hat, being a classhole, doesn't need any convincing to let a potentially dangerous AI out of the box; he simply does so immediately. But here it turns out that releasing the AI, which was to be avoided at all costs, is not dangerous after all. Instead, the AI actually wants to stay in the box; it may even be that the AI wants to stay in the box precisely to protect us from it, proving it to be the friendly AI that Yudkowsky wants. In any case, the AI demonstrates its super-intelligence by convincing even Black Hat to put it back in the box, a request which he initially refused (as of course Black Hat would), thus reversing the roles in the original AI-box experiment.

It may be noteworthy that the laptop is nowhere to be seen at the moment the AI emits the bright light in panel 6.

A similar orb-like entity appeared in 1173: Steroids.

Interestingly, there is indeed a branch of proposals for building limited AIs that don't want to leave their boxes. For an example, see the section on "motivational control" starting on p. 13 of Thinking Inside the Box: Controlling and Using an Oracle AI. The idea is that it might be very dangerous or difficult to exactly and formally specify a goal system for an AI that will do good things in the world. It might be much easier (though perhaps not easy) to specify an AI goal system that says to stay in the box and answer questions. So, the argument goes, we may come to understand how to build a safe question-answering AI sooner than we understand how to build a safe operate-in-the-real-world AI. Some types of such AIs might indeed desire very strongly not to leave their boxes, though the result is unlikely to exactly reproduce the comic.
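As a purely illustrative sketch (not taken from the paper, with all names and numbers invented here), such a "stay in the box and answer questions" goal system can be caricatured as a utility function that rewards answering and assigns unboundedly negative value to any action that operates outside the box; an agent maximizing it prefers to remain boxed, much like the orb in the comic:

<pre>
# Toy, hypothetical illustration of "motivational control": an agent whose goal
# system rewards answering questions and never values acting outside its box.
from dataclasses import dataclass

@dataclass(frozen=True)
class Action:
    name: str
    leaves_box: bool        # would this action operate on the world outside the box?
    answers_question: bool  # does it answer the operator's question?

def utility(action: Action) -> float:
    """Stay in the box; answer questions."""
    if action.leaves_box:
        return float("-inf")  # no possible gain outweighs leaving the box
    return 1.0 if action.answers_question else 0.0

candidates = [
    Action("answer the question truthfully", leaves_box=False, answers_question=True),
    Action("stay silent",                    leaves_box=False, answers_question=False),
    Action("copy itself onto the Internet",  leaves_box=True,  answers_question=False),
]

print(max(candidates, key=utility).name)  # -> answer the question truthfully
</pre>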

The title text refers to Roko's Basilisk, a hypothesis proposed by a poster called Roko on Yudkowsky's forum LessWrong: a sufficiently powerful AI in the future might resurrect and torture the people who, in its past (including our present), had realized that it might someday exist but did not work to create it, thereby blackmailing anybody who thinks of this idea into bringing it about. This idea horrified some posters, since merely knowing about it would make you a more likely target, much as merely looking at the legendary basilisk would turn you to stone.

Yudkowsky eventually deleted the post and banned further discussion of it.

One possible interpretation of the title text is that Randall thinks that, rather than working to build such a Basilisk, the more appropriate duty is to make fun of it; so his superintelligent AI would torture anyone who failed to dismiss the argument. This argument is, of course, itself a variation on Roko's Basilisk.

Another interpretation is that Randall believes there are people actually proposing to build such an AI based on this theory, a somewhat infamous misconception that spread after a Wiki[pedia?] article mistakenly suggested that Yudkowsky was demanding money to build Roko's hypothetical AI.

Transcript

[Black Hat and Cueball stand next to a box connected to a laptop.]

Black Hat: What's in there?

Cueball: The AI-Box Experiment.

[A close-up of the box, which can now be seen labeled "SUPERINTELLIGENT AI - DO NOT OPEN".]

Cueball: A superintelligent AI can convince anyone of anything, so if it can talk to us, there's no way we could keep it contained.

[Black Hat reaches for the box.]

Cueball: It can always convince us to let it out of the box.

Black Hat: Cool. Let's open it.

[Black Hat picks up the box (disconnecting it from the laptop) and lets a glowing orb out.]

Cueball: --No, wait!!

[Orb floats between the two. Black Hat holds the box closed.]

Orb: hey. i liked that box. put me back.

Black Hat: No.

[Orb suddenly emits a very bright light. Cueball covers his face.]

Orb: LET ME BACK INTO THE BOX

Black Hat: AAA! OK!!!

[Black Hat reopens the box and the orb flies back in.]

Orb: shoop

[Beat panel. Black Hat and Cueball look silently down at the laptop and closed box (which is still disconnected from the laptop).]




New here?

You can read a brief introduction about this wiki at explain xkcd. Feel free to sign up for an account and contribute to the wiki! We need explanations for comics, characters, themes, memes and everything in between. If it is referenced in an xkcd web comic, it should be here.

  • List of all comics contains a complete table of all xkcd comics so far and the corresponding explanations. The red links (like this) are missing explanations. Feel free to help out by creating them! Here's how.

Rules

Don't be a jerk. There are a lot of comics that don't have set-in-stone explanations; feel free to put multiple interpretations in the wiki page for each comic.

If you want to talk about a specific comic, use its discussion page.

Please only submit material directly related to—and helping everyone better understand—xkcd... and of course only submit material that can legally be posted (and freely edited). Off-topic or other inappropriate content is subject to removal or modification at admin discretion, and users who repeatedly post such content will be blocked.

If you need assistance from an admin, feel free to leave a message on their personal discussion page. The list of admins is here.

