Talk:1546: Tamagotchi Hive

 
::::: Also [http://spritesmods.com/?art=tamasingularity]. [[Special:Contributions/108.162.245.147|108.162.245.147]] 07:19, 14 December 2015 (UTC)
 
Every other explanation I've ever seen of The Singularity conflicts with this one. This one indicates computers becoming intelligent enough to take control, like with The Matrix or the Terminator movies, which makes it a rather negative thing we should want to avoid. Every OTHER explanation I've seen paints it as something to look forward to, describing The Singularity as being the point when computers become sophisticated enough as to allow humans to transfer their consciousness into a computer, thus extending our lifespans theoretically infinitely (an example of this version would be one particular episode of Big Bang Theory, in which Sheldon calculates that he will not live long enough to see The Singularity, and laments this). I believe past xkcd comics have likewise used this version. - NiceGuy1 [[Special:Contributions/162.158.60.17|162.158.60.17]] 06:16, 14 February 2016 (UTC) I finally signed up! This comment is mine. [[User:NiceGuy1|NiceGuy1]] ([[User talk:NiceGuy1|talk]]) 06:17, 9 June 2017 (UTC)
 
:There are a number of sci-fi concepts conflated together when people talk about "The Singularity". Technically, the Singularity is the point at which the behaviour of AI becomes so sophisticated (relative to a human brain) that we are no longer able to predict its behaviour, motives, etc. Thus it could then dominate history from that point onward in such a way that we can't predict anything that happens afterward (quite frankly, I think it is a pretty silly idea that makes some grandiose assumptions about the nature of intelligence, and is usually written by people with little idea of how history has proceeded in the past, but it is an interesting source of sci-fi stories at least). This is often combined, in sci-fi, with the ideas of "post-humanism", where people link themselves into technology to a degree that fundamentally changes what "the human experience" is. One manifestation of that is the idea that everyone will link themselves into a virtual reality and history will effectively end via a different method (the logic is that if you have AIs capable of outsmarting human brains, then you have AIs capable of containing simulated human brains within themselves, hence: virtual world). Whether the Singularity is good or bad is irrelevant - the point of the metaphor is just that it is *unpredictable*. "Blindsight" by Peter Watts shows a future where the Singularity is disturbing and probably bad overall. The Culture novels by Iain M. Banks, in contrast, show a future where the Singularity is overwhelmingly good, and in fact the machines help us to achieve galactic communism. So the bad/good aspect of it just depends on what kind of novel you feel like writing. For a third perspective, check out Kim Stanley Robinson's "2312"... he is skeptical of the concept of The Singularity, and his super-sophisticated AIs impact humanity in quite a different way. [[Special:Contributions/108.162.249.155|108.162.249.155]] 01:09, 10 March 2016 (UTC)
 
