====== Entropy ======
<tabbox Intuitive>
Entropy is a measure of chaos or randomness in a system. An important property of entropy is that it increases over time. In practice, every system gets more chaotic over time unless we use energy to bring it into order.

An explanation for this steady increase of entropy is that there are far more possible states of high entropy than there are states of low entropy.
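
To make this counting argument concrete, here is a minimal sketch (in Python; not part of the original text) that counts the microstates of N coin flips. A macrostate is "k heads out of N", and the number of microstates it comprises is the binomial coefficient C(N, k):

<code python>
from math import comb, log

N = 100  # number of coins; a microstate is one specific heads/tails sequence

# A macrostate "k heads out of N" comprises C(N, k) microstates.
# Boltzmann entropy (with k_B = 1): S = ln(number of microstates).
for k in (0, 10, 25, 50):
    omega = comb(N, k)  # number of microstates in this macrostate
    print(f"k = {k:>2}: Omega = {omega:.3e},  S = {log(omega):7.3f}")
</code>

Already for N = 100 the balanced macrostate (k = 50) comprises roughly 10^29 microstates, while the perfectly ordered one (k = 0) comprises exactly one. A system that wanders randomly through its microstates is therefore overwhelmingly likely to be found in a high-entropy macrostate.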
  * https://gravityandlevity.wordpress.com/2009/04/01/entropy-and-gambling/
  * [[https://arxiv.org/abs/1705.02223|Entropy? Honest!]] by Toffoli
<tabbox Concrete>
  * [[https://philpapers.org/archive/FRIEA.pdf|Entropy: A Guide for the Perplexed]] by Roman Frigg and Charlotte Werndl

-----

The observation that the entropy of a system always increases was originally a purely empirical one, but, as Jaynes demonstrated, it can be deduced from general logical arguments.

Entropy is a macroscopic notion like temperature and is used when we do not have complete knowledge of the exact microscopic configuration of the system.
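
This point of view can be made concrete with the Gibbs/Shannon formula S = - Σ p_i ln(p_i), where p_i is the probability we assign to microstate i. The following sketch (Python; not part of the original text, and assuming only this standard formula with k_B = 1) shows that S vanishes when the microstate is known exactly and is maximal for the uniform distribution, i.e. when we know nothing at all. This is Jaynes' reading of entropy as a measure of missing information:

<code python>
import numpy as np

def gibbs_entropy(p):
    """Gibbs/Shannon entropy S = -sum_i p_i ln(p_i), with k_B = 1.
    Microstates with p_i = 0 contribute nothing and are masked out."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return float(-np.sum(p[nz] * np.log(p[nz])))

# Three states of knowledge about a system with 4 microstates:
distributions = {
    "exact microstate known": [1.0, 0.0, 0.0, 0.0],     # S = 0
    "partial knowledge":      [0.7, 0.1, 0.1, 0.1],     # 0 < S < ln 4
    "no knowledge (uniform)": [0.25, 0.25, 0.25, 0.25], # S = ln 4, the maximum
}

for label, p in distributions.items():
    print(f"{label:<24}  S = {gibbs_entropy(p):.4f}")
print(f"{'maximum ln(4)':<24}    = {np.log(4):.4f}")
</code>
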
</blockquote>
<--
<tabbox Abstract>
A great discussion of common misconceptions, by E. T. Jaynes himself, can be found here: https://pdfs.semanticscholar.org/d7ff/97069799d3a912803ddd2266cdf573c2461d.pdf

See also "[[http://bayes.wustl.edu/etj/articles/stand.on.entropy.pdf|Where do we stand on maximum entropy]]" by Jaynes.
<tabbox Why is it interesting?>
</tabbox>