Talk:1546: Tamagotchi Hive
Please sign your posts with ~~~~
The singularity reference is worth explaining: The Singularity is a frequent trope in science fiction stories that postulates a time when AI technologies become all-pervasive, often alongside ubiquitous computing. This can include a situation where human minds can be uploaded into AIs, effectively running as simulations within these large distributed computers. {{unsigned ip|141.101.98.139}}
:Can someone please elaborate on the significance of "singularity" in the comic? Sure, "the implication is that the author takes care of a population of virtual creatures rather than an AI ruling over the human population" but what has singularity got to do with this? [[User:Pacerier|Pacerier]] ([[User talk:Pacerier|talk]]) 18:44, 12 July 2015 (UTC)
:Actually "The Singularity" only means that an artificial system has grown in complexity beyond our ability to understand or predict it; in many ways this has already occurred. [[Special:Contributions/108.162.221.152|108.162.221.152]] 15:07, 3 July 2015 (UTC)
::I always thought 'The Singularity' was misnamed, anyway. In the way it is commonly used it is more like 'The Event Horizon'... Not that this has anything to do with the comic, but perhaps worth a side-note, anyway. [[Special:Contributions/141.101.98.181|141.101.98.181]] 19:35, 3 July 2015 (UTC)
::::The so-called 'Singularity' point for AI is apparently where the AI crosses the line of dominance and inexorability. So, yes, that's an 'event horizon', I'd say. [[Special:Contributions/141.101.99.53|141.101.99.53]] 03:14, 4 July 2015 (UTC)
::::I agree with this definition of singularity (the positive-feedback loop of self-improving AI reaching the point where it is gaining apparently infinite improvement in any human-measurable time), and disagree with the idea that it implies anything about AI taking over or simulating human brains. The joke (as I see it) is that the AI that is optimised to manage trillions of emulated Tamagotchis will start along the same self-improvement path as other, contemporary AIs but will at some point decide that it is pointless improving itself further. Or will purposefully cease improving itself out of the sheer horror of contemplating its rapidly expanding mind-space filled with gazillions of Tamagotchis... [[Special:Contributions/108.162.229.167|108.162.229.167]] 08:35, 6 July 2015 (UTC)
Someone needs to get on this and create a BOINC project or something. In all seriousness though, I wonder how many Tamagotchis you could simulate at once on the average home computer. [[User:Saklad5|Saklad5]] ([[User talk:Saklad5|talk]]) 14:55, 3 July 2015 (UTC)
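:For a rough sense of scale, here is a back-of-envelope sketch. The original Tamagotchi's 4-bit MCU is clocked from a 32.768 kHz crystal; every other figure below (host clock, core count, interpreter overhead per emulated cycle) is an assumption for illustration, not a measurement:

```python
# Back-of-envelope estimate of how many Tamagotchis a home PC could
# emulate in real time. All host-side numbers are assumptions.

TAMA_CLOCK_HZ = 32_768            # original Tamagotchi MCU crystal: 32.768 kHz
HOST_CLOCK_HZ = 3_000_000_000     # assume a ~3 GHz desktop core
HOST_CORES = 4                    # assume a 4-core machine
CYCLES_PER_EMULATED_CYCLE = 100   # assumed interpreter overhead (host cycles
                                  # spent per emulated MCU cycle)

host_budget = HOST_CLOCK_HZ * HOST_CORES           # host cycles/second available
cost_per_tama = TAMA_CLOCK_HZ * CYCLES_PER_EMULATED_CYCLE  # host cycles/second each

print(host_budget // cost_per_tama)  # rough count that fits in real time
```

:Even with a pessimistic 100x interpreter overhead this lands in the thousands, which suggests the bottleneck for a "hive" would be orchestration and memory rather than raw CPU.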
:::::Also [http://spritesmods.com/?art=tamasingularity]. [[Special:Contributions/108.162.245.147|108.162.245.147]] 07:19, 14 December 2015 (UTC)
Every other explanation I've ever seen of The Singularity conflicts with this one. This one indicates computers becoming intelligent enough to take control, like with The Matrix or the Terminator movies, which makes it a rather negative thing we should want to avoid. Every OTHER explanation I've seen paints it as something to look forward to, describing The Singularity as being the point when computers become sophisticated enough as to allow humans to transfer their consciousness into a computer, thus extending our lifespans theoretically infinitely (an example of this version would be one particular episode of The Big Bang Theory, in which Sheldon calculates that he will not live long enough to see The Singularity, and laments this. I believe past xkcd comics have likewise used this version) - NiceGuy1 [[Special:Contributions/162.158.60.17|162.158.60.17]] 06:16, 14 February 2016 (UTC)