====== Entropy ======
<tabbox Intuitive>
Entropy is a measure of chaos or randomness in a system. An important property of entropy is that it increases over time. In practice, every system gets more chaotic over time, unless we use energy to bring it into order.
An explanation for this steady increase of entropy is that there are far more possible states of high entropy than there are states of low entropy. Therefore, it is much more likely that a system will end up in a state of higher entropy.
A familiar example is sand. It is far more likely to find sand lying randomly around than in an ordered form like a sand castle.
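
This counting argument can be made concrete with a toy model. Below is a minimal sketch (a hypothetical stand-in that uses coin flips instead of sand grains): it compares how many arrangements realize a perfectly ordered state with how many realize a maximally mixed one.

<code python>
from math import comb

N = 100  # number of coins, a stand-in for grains of sand

# Each macrostate "k coins show heads" can be realized by comb(N, k)
# different arrangements (microstates).
ordered = comb(N, 0)      # perfectly ordered: all tails -> exactly 1 arrangement
mixed = comb(N, N // 2)   # maximally mixed: half heads -> astronomically many

print(ordered)  # 1
print(mixed)    # roughly 1e29 arrangements
</code>

Already for 100 coins the mixed macrostate is realized by roughly $10^{29}$ times more arrangements than the ordered one, which is why a randomly evolving system is overwhelmingly likely to be found in a high-entropy state.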
----

  * https://gravityandlevity.wordpress.com/2009/04/01/entropy-and-gambling/
  * [[https://arxiv.org/abs/1705.02223|Entropy? Honest!]] by Toffoli

<tabbox Concrete>
  * [[https://philpapers.org/archive/FRIEA.pdf|ENTROPY: A GUIDE FOR THE PERPLEXED]] by Roman Frigg and Charlotte Werndl

-----

The observation that the entropy of a system always increases, originally a purely empirical one, can be deduced from general logical arguments, as Jaynes demonstrated.
Entropy is a macroscopic notion like temperature and is used when we do not have absolute knowledge about the exact micro configurations of the system.
If this point of view is correct, an obvious question pops up: **Why then is the principle of maximum entropy so uniformly successful?**
The reason is that the multiplicity of the macroscopic configurations, i.e. the number of ways in which they can be realized in terms of microscopic configurations, has an extremely sharp maximum. This can be calculated explicitly, as shown, for example, on page 7 [[https://pdfs.semanticscholar.org/d7ff/97069799d3a912803ddd2266cdf573c2461d.pdf|here]]. There it is found for a simple system that "not only is $E'$ the value of $E_1$ that can happen in the greatest number of ways for given total energy $E$; the vast majority of all possible microstates with total energy $E$ have $E_1$ very close to $E'$. Less than 1 in $10^8$ of all possible states have $E_1$ outside the interval ($E' \pm 6\sigma$), far too narrow to measure experimentally".
If it were otherwise, for example, if the maximum were broad or if there were many local maxima, the principle of maximum entropy wouldn't be so powerful.

Thus even

<blockquote>
if we had more information we would seldom do better in prediction of reproducible phenomena, because those are the same for virtually all microstates in an enormously large class C; and therefore also in virtually any subset of C. [...] Knowledge of the "data" E alone would not enable us to choose among the different values of $E_1$ allowed by [energy conservation]; the additional information contained in the entropy functions, nevertheless leads us to make one definite choice as far more likely than any other, on the information supposed.

<cite>https://pdfs.semanticscholar.org/d7ff/97069799d3a912803ddd2266cdf573c2461d.pdf</cite>
</blockquote>
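
The sharpness of the maximum can also be checked numerically. The sketch below is a toy calculation, assuming a pair of Einstein solids rather than the system treated in Jaynes' paper: when every microstate of the combined system is counted as equally likely, virtually all of them correspond to an energy split very close to the most probable one.

<code python>
from math import comb, sqrt

# Toy model: two subsystems A and B (Einstein solids) share q_total energy quanta.
N_A, N_B = 300, 300    # oscillators in each subsystem
q_total = 600          # total number of energy quanta

def multiplicity(N, q):
    """Number of microstates of an Einstein solid with N oscillators and q quanta."""
    return comb(q + N - 1, q)

# Multiplicity of the combined system for every possible energy split q_A.
omega = [multiplicity(N_A, qA) * multiplicity(N_B, q_total - qA)
         for qA in range(q_total + 1)]
total = sum(omega)

# Probability of each split when all microstates are counted as equally likely.
p = [w / total for w in omega]

mean = sum(qA * pi for qA, pi in enumerate(p))
sigma = sqrt(sum((qA - mean) ** 2 * pi for qA, pi in enumerate(p)))

# Fraction of ALL microstates whose energy split lies within 6 sigma of the mean.
inside = sum(pi for qA, pi in enumerate(p) if abs(qA - mean) <= 6 * sigma)

print(f"most probable split: q_A = {mean:.0f} of {q_total} quanta")
print(f"relative width sigma/q_total = {sigma / q_total:.3f}")   # a few percent
print(f"fraction of microstates within 6 sigma: {inside:.10f}")  # essentially 1
</code>

Even for this small toy system the distribution of the energy split is sharply peaked around the most probable value, and the relative width shrinks roughly like $1/\sqrt{N}$ as the subsystems grow, which is why predictions based on the maximum are so reliable for macroscopic systems.
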
-->Shannon Entropy#
<--
--> Gibbs Entropy#
In the beginning there was just Clausius' weak statement that the entropy of a system tends to increase:
</blockquote>
<--
<tabbox Abstract>
A great discussion of common misconceptions by E. T. Jaynes can be found here: https://pdfs.semanticscholar.org/d7ff/97069799d3a912803ddd2266cdf573c2461d.pdf

See also "[[http://bayes.wustl.edu/etj/articles/stand.on.entropy.pdf|Where do we stand on maximum entropy]]" by Jaynes.
<tabbox Why is it interesting?>

</tabbox>