1696: AI Research
In this comic, [[Randall]]/[[Cueball]] jokingly suggests that, to accomplish this goal, one can give him an AI that is already as smart as an adult and let him teach it childish and silly things. He is shown teaching it dumb jokes, much like the ones a sassy six-year-old would make, starting with a {{w|Flatulence humor|"fart" joke}} in which ''art''ificial is changed to ''fart''ificial.
The humor in the comic is that Randall is essentially accomplishing the present goal of a six-year-old-equivalent AI by starting with the final goal, a full human intelligence, and making it dumber just by teaching it poor humor. This is not unlike the old joke, "The easiest way to make a small fortune on Wall Street [or similar] is to start with a large one."
The specific situation may also be a reference to {{w|Tay (bot)|Tay}}, a Microsoft chatbot that was taught to {{w|internet troll|troll}} within hours of its exposure to the public.