2318: Dynamic Entropy
 
This is another of [[Randall|Randall's]] [[:Category:Tips|Tips]], in this case a [[:Category:Science tip|Science Tip]]. It is a bit special, as it came less than three weeks after another Science Tip, [[2311: Confidence Interval]] (which was itself the first time a non-Protip tip type had been re-used). This makes it the first time a tip type other than a [[:Category:Protip|Protip]] has been used for two "tips comics" in a row.
 
This Science Tip suggests that if you have a cool new concept, you should call it ''dynamic entropy'', hence the title.
 
{{w|Dynamic programming}} is a mathematical optimization and computer programming method developed by {{w|Richard Bellman}} in the 1950s. The {{w|Dynamic programming#History|History section}} of the Wikipedia article contains the full paragraph from Bellman's autobiography that includes the quote in the comic strip. Bellman describes how he was doing mathematical research funded by the military at a time when the Secretary of Defense had a pathological fear of the word "research", and by extension, "mathematical". Bellman borrowed the word "dynamic" from physics as being both accurate for his work and a word that in plain English has positive connotations and is never used in a pejorative sense (expressing contempt or disapproval). The word "dynamic" itself comes from the Greek ''dynamikos'', "powerful", which is a positive meaning in itself; it has been applied to topics in physics related to motion and forces, and is used in ordinary English to refer to things that exert power, force, growth, and change (as in "dynamo", "dynamite", and the adjective "dynamic"). Even though those things aren't always good, when they're bad we use other words instead (e.g. cancer undergoes {{w|metastasis}}, not "dynamism").
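As an illustrative sketch (not part of the comic or of Bellman's original work), the core idea behind dynamic programming is to break a problem into overlapping subproblems and solve each one only once, caching the results. The classic toy example is computing Fibonacci numbers:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n: int) -> int:
    """Fibonacci via dynamic programming: each subproblem
    fib(k) is computed once and cached, turning an exponential
    recursion into a linear-time computation."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040
```

Without the cache, the naive recursion recomputes the same subproblems exponentially many times; memoization is the simplest form of the technique Bellman named.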
{{w|Entropy}} is a term from physics, specifically statistical mechanics, describing a property of a thermodynamic system. When {{w|Claude Shannon}} developed a mathematical framework for studying signal processing and communications systems, which became known as {{w|Information theory}}, he struggled to come up with a proper name for one mathematical concept in his theory that quantified the amount of noise or uncertainty in a signal. Mathematician {{w|John von Neumann}} noticed the similarity of the equations to some in thermodynamics and suggested, "You should {{w|Entropy (information theory)|call it entropy}}, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage." (see {{w|History of information theory#Entropy in statistical mechanics|History of information theory}}).
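Shannon's uncertainty function has a simple concrete form: for a message whose symbols occur with probabilities ''p''<sub>''i''</sub>, the entropy is H = &minus;&Sigma; ''p''<sub>''i''</sub> log<sub>2</sub> ''p''<sub>''i''</sub> bits per symbol. A minimal sketch (illustrative, not tied to the comic):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits per symbol:
    H = -sum(p_i * log2(p_i)) over the symbol frequencies."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values())

print(shannon_entropy("aaaa"))  # 0.0 -- a fully predictable signal
print(shannon_entropy("abab"))  # 1.0 -- one bit of uncertainty per symbol
```

A constant signal carries no uncertainty (entropy 0), while a signal of two equally likely symbols carries exactly one bit per symbol.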
The naming of dynamic programming and of entropy in information theory are both examples of scientists choosing a name for reasons that were at least partly non-scientific: in one case because the word has only positive and no negative connotations in plain English, and in the other because there is so much confusion over the meaning of the word that Shannon would be free to adopt it in a new context. [[Randall]] claims that this makes the two words great to combine when naming a new concept: the combination will mean whatever its creator wants it to mean (and can even change mid-debate), and will never sound bad the way that, e.g., {{w|cold fusion}} has come to.
 
Even though the caption implies that "dynamic entropy" would be available as a new name, it has actually been used in physics<ref>Allegrini, P., Douglas, J. F., & Glotzer, S. C. (1999). Dynamic entropy as a measure of caging and persistent particle motion in supercooled liquids. Physical Review E, 60(5), 5714, doi: 10.1103/physreve.60.5714.</ref>, probability<ref>Asadi, M., Ebrahimi, N., Hamedani, G., & Soofi, E. (2004). Maximum Dynamic Entropy Models. Journal of Applied Probability, 41(2), 379-390. Retrieved June 11, 2020, from www.jstor.org/stable/3216023</ref>, computer science<ref>S. Satpathy et al., "An All-Digital Unified Static/Dynamic Entropy Generator Featuring Self-Calibrating Hierarchical Von Neumann Extraction for Secure Privacy-Preserving Mutual Authentication in IoT Mote Platforms," 2018 IEEE Symposium on VLSI Circuits, Honolulu, HI, 2018, pp. 169-170, doi: 10.1109/VLSIC.2018.8502369.</ref>, and even the term "dynamical entropy" in physics<ref>Green, J. R., Costa, A. B., Grzybowski, B. A., & Szleifer, I. (2013). Relationship between dynamical entropy and energy dissipation far from thermodynamic equilibrium. Proceedings of the National Academy of Sciences, 110(41), 16339-16343.</ref><ref>Słomczyński, W., & Szczepanek, A. (2017). Quantum dynamical entropy, chaotic unitaries and complex Hadamard matrices. IEEE Transactions on Information Theory, 63(12), 7821-7831, doi: 10.1109/TIT.2017.2751507.</ref> and bioscience<ref>Chakrabarti, C. G., & Ghosh, K. (2013). Dynamical entropy via entropy of non-random matrices: Application to stability and complexity in modelling ecosystems. Mathematical biosciences, 245(2), 278-281, doi: 10.1016/j.mbs.2013.07.016.</ref>.
