Talk:1838: Machine Learning
Apparently, there is the issue of people "training" intelligent systems out of their gut feeling: Let's say, for example, a system should determine whether or not a person should be promoted to fill a currently vacant business position. If the system is taught by the humans currently in charge of that very decision, and it strengthens the candidates those humans would approve and weakens the ones they would decline, all these people might be doing is feeding the machine their own irrational biases. Then, down the road, some candidate may be declined because "computer says so". One could argue that this, if it happens, is just bad usage and no inherent issue of machine learning itself, so I'm not sure whether this thought can be connected to the comic. In my head, it's close to "stirring the pile until the answers look right". What do you people think? 184.108.40.206 05:39, 17 May 2017 (UTC)
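The mechanism described above can be sketched in a few lines of Python. This is a toy illustration with made-up data and a deliberately trivial "model" (a per-group decision rate), not any real HR system: the point is only that a model trained on biased human decisions reproduces the bias.

```python
from collections import defaultdict

# Hypothetical past decisions: (group, skill_score, promoted_by_human).
# The humans promote skilled candidates from group A but decline
# equally skilled candidates from group B -- an irrational bias.
history = [
    ("A", 9, True), ("A", 8, True), ("A", 3, False),
    ("B", 9, False), ("B", 8, False), ("B", 3, False),
]

# "Training": record the humans' promotion rate per group.
rates = defaultdict(lambda: [0, 0])  # group -> [promotions, total]
for group, _skill, promoted in history:
    rates[group][0] += promoted
    rates[group][1] += 1

def model_says_promote(group):
    """Promote if the humans usually promoted this group."""
    promoted, total = rates[group]
    return promoted / total > 0.5

# Two equally skilled new candidates; the model has learned the bias.
print(model_says_promote("A"))  # True
print(model_says_promote("B"))  # False -- "computer says so"
```

A real classifier would look at the skill score too, but with labels this skewed the group membership is still the strongest predictor of the human decision, so the outcome is the same in spirit.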
It's a good point but I don't think it's relevant to the comic. 220.127.116.11 13:55, 17 May 2017 (UTC)
Up the creek *with* a paddle. 18.104.22.168 07:52, 17 May 2017 (UTC)
It's a compost pile! Stir it and keep it moist until something useful comes out. 22.214.171.124 11:40, 17 May 2017 (UTC)
Actually, I don't think the paddle has anything to do with canoes - paddles like that are often used when stirring large quantities. In Louisiana it's called a crawfish or gumbo paddle.
Maybe one day bots will learn to create entire explanations for xkcd. 126.96.36.199 12:38, 17 May 2017 (UTC)
- Good, then maybe we won't have over-thought explanations anymore.