<tabbox Intuitive>
Entropy is a measure of chaos or randomness in a system. An important property of entropy is that it increases over time. In practice, every system gets more chaotic over time unless we use energy to bring it into order.
  
An explanation for this steady increase of entropy is that there are far more possible states of high entropy than there are states of low entropy.
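
To make this counting argument concrete, here is a minimal sketch (Python; the coin model and all numbers are illustrative assumptions, not from this page) for a toy system of N fair coins. A microstate is one particular heads/tails sequence, a macrostate is the total number of heads, and the near-balanced, high-entropy macrostates contain almost all of the microstates.

<code python>
from math import comb

N = 100  # number of coins; a microstate is one specific heads/tails sequence

# The macrostate "k heads" is realized by C(N, k) microstates.
multiplicity = {k: comb(N, k) for k in range(N + 1)}
total = 2 ** N  # total number of microstates

near_balanced = sum(multiplicity[k] for k in range(45, 56)) / total
near_ordered = sum(multiplicity[k] for k in range(0, 6)) / total
print(f"45-55 heads (high entropy): {near_balanced:.3f} of all microstates")
print(f" 0-5  heads (low entropy):  {near_ordered:.2e} of all microstates")
</code>

Started in a low-entropy macrostate, a randomly shuffled system almost certainly wanders into a high-entropy one, simply because that is where nearly all microstates live.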
  
  * https://gravityandlevity.wordpress.com/2009/04/01/entropy-and-gambling/
  * [[https://arxiv.org/abs/1705.02223|Entropy? Honest!]] by Toffoli
<tabbox Concrete>
  * [[https://philpapers.org/archive/FRIEA.pdf|ENTROPY: A GUIDE FOR THE PERPLEXED]] by Roman Frigg and Charlotte Werndl

-----

The observation that the entropy of a system always increases, which began as a purely empirical one, can be deduced from general logical arguments, as was demonstrated by Jaynes.
  
Entropy is a macroscopic notion like temperature and is used when we do not have absolute knowledge about the exact micro configurations of the system.
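
The following is a hypothetical sketch (Python; the probability distributions are made-up examples, not from this page) of how entropy quantifies this missing knowledge: the Gibbs/Shannon entropy S = -k_B sum_i p_i ln(p_i) vanishes when the microstate is known with certainty and reaches its maximum k_B ln W when all W microstates are equally likely.

<code python>
import numpy as np

def gibbs_entropy(p, k_B=1.0):
    """S = k_B * sum_i p_i ln(1/p_i) = -k_B * sum_i p_i ln(p_i),
    with the convention 0 * ln(0) = 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return k_B * np.sum(nz * np.log(1.0 / nz))

# Exact knowledge of the microstate: nothing is missing, S = 0.
print(gibbs_entropy([1.0, 0.0, 0.0, 0.0]))   # -> 0.0

# Complete ignorance over W = 4 microstates: S = k_B ln W (Boltzmann's value).
print(gibbs_entropy([0.25] * 4), np.log(4))  # -> 1.386..., 1.386...
</code>
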
<tabbox Abstract>
  
A great discussion of common misconceptions by E. T. Jaynes can be found here: https://pdfs.semanticscholar.org/d7ff/97069799d3a912803ddd2266cdf573c2461d.pdf
  
See also "[[http://bayes.wustl.edu/etj/articles/stand.on.entropy.pdf|Where do we stand on maximum entropy]]" by Jaynes.
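
As an illustration of Jaynes's maximum-entropy reasoning, the sketch below (Python with NumPy/SciPy; the four energy levels and the target mean energy are invented numbers) maximizes S = -sum_i p_i ln(p_i) subject to a fixed mean energy. The constrained maximization gives p_i proportional to exp(-beta E_i), and the Lagrange multiplier beta is then fixed numerically by the constraint.

<code python>
import numpy as np
from scipy.optimize import brentq

E = np.array([0.0, 1.0, 2.0, 3.0])  # hypothetical energy levels
E_target = 1.2                      # assumed known mean energy <E>

def mean_energy(beta):
    """Mean energy of the maximum-entropy distribution p_i ~ exp(-beta*E_i)."""
    w = np.exp(-beta * E)
    p = w / w.sum()
    return p @ E

# Find the beta for which the Boltzmann distribution reproduces <E> = E_target.
beta = brentq(lambda b: mean_energy(b) - E_target, -10.0, 10.0)
p = np.exp(-beta * E)
p /= p.sum()
print(f"beta = {beta:.4f}, p = {np.round(p, 4)}, <E> = {p @ E:.4f}")
</code>

Any other distribution with the same mean energy has strictly lower entropy, which is the sense in which the maximum-entropy distribution encodes only the given constraint and nothing more.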
  
  