Talk:1345: Answers
It's long been a fundamental technique of artificial neural network learning, to alternate between "learning" and "sleep" modes. I've heard (but cannot find the citation, sigh) that when running neural networks, it turns out that they lose the ability to learn after running a long time. But you can avoid this effect if you periodically bathe the neural network with completely random input. [[User:Jorgbrown|Jorgbrown]] ([[User talk:Jorgbrown|talk]]) 07:35, 23 January 2015 (UTC)
:Makes sense, and reminds me of some optimization algorithms. Interesting! [[Special:Contributions/108.162.219.125|108.162.219.125]] 03:01, 4 March 2015 (UTC)
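:The alternation described above can be sketched as a toy training loop. Everything concrete here is an assumption made for illustration, not from any citation: a single logistic neuron, a made-up OR task, and a "sleep" phase implemented as low-learning-rate updates on uniformly random inputs and targets — one possible reading of "bathe the network with completely random input", which here acts like mild noise injection.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_step(w, b, x, y, lr):
    # One gradient step of logistic regression: dL/dw = (p - y) * x
    p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
    err = p - y
    return ([w[0] - lr * err * x[0], w[1] - lr * err * x[1]], b - lr * err)

# Toy task (an assumption for this sketch): learn logical OR.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

random.seed(0)
w, b = [0.0, 0.0], 0.0
for epoch in range(300):
    # "Learning" mode: train on the real data.
    for x, y in data:
        w, b = train_step(w, b, x, y, lr=0.5)
    # "Sleep" mode: bathe the network in completely random input
    # (random targets and a small learning rate are assumptions here).
    for _ in range(4):
        x = [random.random(), random.random()]
        y = random.random()
        w, b = train_step(w, b, x, y, lr=0.02)

preds = [round(sigmoid(w[0] * x[0] + w[1] * x[1] + b)) for x, _ in data]
print(preds)
```

:The point of the sketch is only the loop structure: the random-input phase is weak enough (small learning rate) that it perturbs rather than erases what the "learning" phase acquired, which is roughly the regularizing effect the comment above describes. [[Special:Contributions/108.162.219.125|108.162.219.125]] 03:01, 4 March 2015 (UTC)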