# Difference between revisions of "Talk:2318: Dynamic Entropy"


## Revision as of 09:08, 11 June 2020

Can confirm, have never lost an argument. Dynamic Entropy (talk) 00:45, 11 June 2020 (UTC)

- Allegrini, P., Douglas, J. F., & Glotzer, S. C. (1999). Dynamic entropy as a measure of caging and persistent particle motion in supercooled liquids. Physical Review E, 60(5), 5714. doi: 10.1103/physreve.60.5714.

- Asadi, M., Ebrahimi, N., Hamedani, G., & Soofi, E. (2004). Maximum Dynamic Entropy Models. Journal of Applied Probability, 41(2), 379-390. Retrieved June 11, 2020, from www.jstor.org/stable/3216023

- Green, J. R., Costa, A. B., Grzybowski, B. A., & Szleifer, I. (2013). Relationship between dynamical entropy and energy dissipation far from thermodynamic equilibrium. Proceedings of the National Academy of Sciences, 110(41), 16339-16343.

- Satpathy, S., et al. (2018). An All-Digital Unified Static/Dynamic Entropy Generator Featuring Self-Calibrating Hierarchical Von Neumann Extraction for Secure Privacy-Preserving Mutual Authentication in IoT Mote Platforms. 2018 IEEE Symposium on VLSI Circuits, Honolulu, HI, pp. 169-170. doi: 10.1109/VLSIC.2018.8502369.
- Bugstomper (talk) 01:28, 11 June 2020 (UTC)

Well bugger me (METAPHOR! METAPHOR!) but my current Master's thesis in Computer Science could use that term without much shoehorning. (tl;dr: Binary search trees that adapt (=dynamic) can serve a query sequence faster than static ones, and the gain depends on the structure of the query sequence (=entropy). I prefer the good old "instance optimality", though...) 162.158.159.122 08:58, 11 June 2020 (UTC)

This seems to tie in with the recent comic 2315: *Eventual Consistency*, which is also about entropy (in the thermodynamic sense), but I guess that, like the rest of the world, I don't know what entropy really is: if entropy is a measure of how "surprising" a variable is, why is everything being flat and spread out evenly called a state of maximum entropy? Everything being the same doesn't sound very surprising to me... --IByte (talk) 09:08, 11 June 2020 (UTC)
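On the question above: the "surprise" in Shannon entropy is an *average* over outcomes, and it is maximized when no single outcome is predictable, i.e. when the distribution is flat. "Everything being the same" would actually be a *peaked* distribution (one outcome nearly certain), which has low entropy. A minimal sketch, with the example distributions chosen purely for illustration:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Flat ("spread out evenly"): every outcome equally hard to predict.
uniform = [0.25, 0.25, 0.25, 0.25]

# Peaked ("everything the same"): one outcome is almost certain.
peaked = [0.97, 0.01, 0.01, 0.01]

print(shannon_entropy(uniform))  # 2.0 bits -- the maximum for 4 outcomes
print(shannon_entropy(peaked))   # ~0.24 bits -- the usual outcome is unsurprising
```

So the flat distribution is "maximum entropy" because on average each draw carries the most information, not because any particular outcome is individually dramatic.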