====== Entropy ======
  
<tabbox Intuitive>
Entropy is a measure of chaos or randomness in a system. An important property of entropy is that it increases over time. In practice, every system gets more chaotic over time, unless we use energy to bring it into order.
  
An explanation for this steady increase of entropy is that there are far more possible states of high entropy than there are states of low entropy.
  
Therefore, it is much more likely that a system will end up in a state of higher entropy.
  
A familiar example is sand. It is far more likely to find sand lying randomly around than in an ordered form like a sand castle.
  
----

  * https://gravityandlevity.wordpress.com/2009/04/01/entropy-and-gambling
  * [[https://arxiv.org/abs/1705.02223|Entropy? Honest!]] by Toffoli

<tabbox Concrete>

  * [[https://philpapers.org/archive/FRIEA.pdf|Entropy: A Guide for the Perplexed]] by Roman Frigg and Charlotte Werndl

----

The initially purely empirical observation that the entropy of a system always increases can be deduced from general logical arguments, as was demonstrated by Jaynes.
  
Entropy is a macroscopic notion like temperature and is used when we do not have absolute knowledge about the exact micro-configurations of the system.
  
In this sense, the maximization of entropy is a general principle for reasoning in situations where we do not know all the details.
  
For a nice simple example of this kind of reasoning, see http://www-mtl.mit.edu/Courses/6.050/2003/notes/chapter9.pdf
</note>
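
To make this concrete, here is a minimal numerical sketch of such maximum-entropy reasoning (an illustration assuming NumPy and SciPy, not taken from the linked notes): given only that a die's average roll is 4.5, we look for the least biased distribution over the six faces consistent with that single fact. This is Jaynes' well-known "Brandeis dice" problem.

<code python>
# Maximum-entropy inference: among all distributions over the faces 1..6
# with mean 4.5, find the one with the largest Shannon entropy.
import numpy as np
from scipy.optimize import minimize

faces = np.arange(1, 7)      # possible outcomes of the die
known_mean = 4.5             # the only "macroscopic" information we have

def neg_entropy(p):
    p = np.clip(p, 1e-12, 1.0)          # avoid log(0)
    return np.sum(p * np.log(p))        # minimizing -H maximizes H

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},                # normalization
    {"type": "eq", "fun": lambda p: np.dot(p, faces) - known_mean},  # fixed mean
]
p0 = np.full(6, 1.0 / 6.0)   # start from the uniform distribution
result = minimize(neg_entropy, p0, bounds=[(0.0, 1.0)] * 6, constraints=constraints)
print(result.x.round(4))     # probabilities rise monotonically toward face 6
</code>

The numerical solution has the exponential form $p_i \propto e^{\lambda i}$, i.e. exactly the structure of a Boltzmann distribution, with the Lagrange multiplier $\lambda$ fixed by the mean constraint.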
  
If this point of view is correct, an obvious question pops up: **Why then is the principle of maximum entropy so uniformly successful?**

The reason is that the multiplicity of the macroscopic configurations, i.e. the number of ways in which they can be realized in terms of microscopic configurations, has an extremely sharp maximum. This can be calculated explicitly, as shown, for example, on page 7 [[https://pdfs.semanticscholar.org/d7ff/97069799d3a912803ddd2266cdf573c2461d.pdf|here]]. There it is found for a simple system that "not only is $E'$ the value of $E_1$ that can happen in the greatest number of ways for given total energy $E$; the vast majority of all possible microstates with total energy $E$ have $E_1$ very close to $E'$. Less than 1 in $10^8$ of all possible states have $E_1$ outside the interval ($E' \pm 6 \sigma$), far too narrow to measure experimentally".
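
This sharpness is easy to reproduce numerically. The following sketch (a toy model chosen for illustration, not the exact system treated by Jaynes; it assumes NumPy and SciPy) counts the multiplicity $\Omega(m) = \binom{N}{m}\binom{N}{M-m}$ of the macrostate in which the first of two coupled two-level systems holds $m$ of the $M$ shared excitations:

<code python>
# Multiplicity of macrostates for two subsystems, each with N two-level
# "atoms", sharing M excitations in total. Subsystem 1 holds m of them.
import numpy as np
from scipy.special import gammaln

N, M = 100_000, 100_000
m = np.arange(max(0, M - N), min(N, M) + 1)   # allowed values of m

def log_binom(n, k):
    # log of the binomial coefficient C(n, k), stable for large n
    return gammaln(n + 1) - gammaln(k + 1) - gammaln(n - k + 1)

log_omega = log_binom(N, m) + log_binom(N, M - m)
p = np.exp(log_omega - log_omega.max())
p /= p.sum()                                  # fraction of microstates per macrostate

mean = np.sum(m * p)
sigma = np.sqrt(np.sum((m - mean) ** 2 * p))
outside = p[np.abs(m - mean) > 6 * sigma].sum()
print(f"peak at m* = {mean:.0f}, width sigma = {sigma:.0f}")
print(f"fraction of microstates outside m* +/- 6 sigma: {outside:.1e}")
</code>

With these (arbitrarily chosen) numbers the relative width $\sigma/m^*$ is of order $10^{-3} \sim 1/\sqrt{N}$, and the fraction of microstates outside $m^* \pm 6\sigma$ comes out around $10^{-9}$, of the same order as the "less than 1 in $10^8$" quoted above.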

If it were otherwise, for example if the maximum were broad or if there were many local maxima, the principle of maximum entropy wouldn't be so powerful.

Thus even

<blockquote>
if we had more information we would seldom do better in prediction of reproducible phenomena, because those are the same for virtually all microstates in an enormously large class C; and therefore also in virtually any subset of C. [...] Knowledge of the "data" E alone would not enable us to choose among the different values of $E_1$ allowed by [energy conservation]; the additional information contained in the entropy functions, nevertheless leads us to make one definite choice as far more likely than any other, on the information supposed.

<cite>https://pdfs.semanticscholar.org/d7ff/97069799d3a912803ddd2266cdf573c2461d.pdf</cite>
</blockquote>

--> Shannon Entropy#
  
<blockquote>
<cite>https://adamcaulton.files.wordpress.com/2013/05/thermo51.pdf2</cite>
</blockquote>
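
As a concrete anchor for this fold, here is a minimal sketch (assuming NumPy) of the standard definition, $H(p) = -\sum_i p_i \log_2 p_i$, measured in bits:

<code python>
# Shannon entropy of a discrete distribution, in bits.
import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # the term 0 * log(0) is taken as 0
    return -np.sum(p * np.log2(p))

print(shannon_entropy([0.5, 0.5]))    # 1.0  -- a fair coin carries one bit
print(shannon_entropy([0.9, 0.1]))    # ~0.47 -- a biased coin is less uncertain
print(shannon_entropy([1.0]))         # 0.0  -- a certain outcome carries no information
</code>
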
<--
  
--> Boltzmann Entropy#
  
<blockquote>
</blockquote>
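
As a concrete anchor for this fold: Boltzmann's formula reads $S = k_B \ln W$, where $W$ is the number of microstates realizing the macrostate. A minimal sketch (a toy two-level system chosen for illustration, assuming SciPy):

<code python>
# Boltzmann entropy S = k_B ln W for a toy macrostate: N two-level atoms
# of which m are excited, so that W = C(N, m).
from scipy.special import gammaln

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def boltzmann_entropy(N, m):
    # ln C(N, m) via log-gamma, since C(N, m) itself overflows for large N
    log_W = gammaln(N + 1) - gammaln(m + 1) - gammaln(N - m + 1)
    return K_B * log_W

# The evenly mixed macrostate can be realized in vastly more ways,
# hence it has the larger entropy:
print(boltzmann_entropy(1e20, 5e19))  # ~9.6e-4 J/K (maximally disordered)
print(boltzmann_entropy(1e20, 1e18))  # ~7.7e-5 J/K (much more ordered)
</code>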
  
<--

--> Gibbs Entropy#
  
In the beginning there was just Clausius' weak statement that the entropy of a system tends to increase:
<cite>https://pdfs.semanticscholar.org/d7ff/97069799d3a912803ddd2266cdf573c2461d.pdf</cite>
</blockquote>
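
One standard consistency check (a textbook identity, not part of the quoted passage): for an ensemble that is uniform over $W$ microstates, i.e. $p_i = 1/W$, the Gibbs entropy reduces to the Boltzmann entropy,

$$ S_G = -k_B \sum_{i=1}^{W} p_i \ln p_i = -k_B \sum_{i=1}^{W} \frac{1}{W} \ln \frac{1}{W} = k_B \ln W = S_B. $$
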
<--
<tabbox Abstract>
  
A great discussion of common misconceptions by E. T. Jaynes can be found here: https://pdfs.semanticscholar.org/d7ff/97069799d3a912803ddd2266cdf573c2461d.pdf
  
See also "[[http://bayes.wustl.edu/etj/articles/stand.on.entropy.pdf|Where do we stand on maximum entropy]]" by Jaynes.

<tabbox Why is it interesting?>
  
</tabbox>
  
  