====== Entropy ======
  
<tabbox Intuitive>

Entropy is a measure of chaos or randomness in a system. An important property of entropy is that it increases over time. In practice, every system gets more chaotic over time, unless we use energy to bring it into order.
  
An explanation for this steady increase of entropy is that there are far more possible states of high entropy than there are states of low entropy.
  
Therefore, it is much more likely that a system will end up in a state of higher entropy.
  
A familiar example is sand: it is far more likely to find sand scattered around randomly than arranged in an ordered form like a sand castle.
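
To see this counting argument in action, here is a minimal Python sketch (an illustration added for concreteness; the toy model and the numbers are assumptions, not part of the discussion above). It models N particles that can each sit in the left or right half of a box and counts how many microstates are compatible with the macrostate "k particles on the left":

<code python>
# Toy model: N particles, each independently in the left or right half
# of a box. The macrostate "k particles on the left" is realized by
# C(N, k) microstates (the binomial coefficient).
from math import comb

N = 100  # number of particles; an arbitrary choice for this demo

for k in (0, 10, 25, 50):
    multiplicity = comb(N, k)  # microstates behind this macrostate
    print(f"k = {k:3d}: {multiplicity:.3e} microstates")

# k =   0: 1.000e+00 microstates  (all particles on one side: "ordered")
# k =  50: 1.009e+29 microstates  (evenly spread: "disordered")
</code>

Since a randomly evolving system is equally likely to visit any microstate, it will almost certainly be found near the macrostates with the most microstates, i.e. the high-entropy ones.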
  
----

  * https://gravityandlevity.wordpress.com/2009/04/01/entropy-and-gambling
  * [[https://arxiv.org/abs/1705.02223|Entropy? Honest!]] by Toffoli

<tabbox Concrete>

  * [[https://philpapers.org/archive/FRIEA.pdf|Entropy: A Guide for the Perplexed]] by Roman Frigg and Charlotte Werndl

----

The observation that the entropy of a system always increases was at first purely empirical, but, as Jaynes demonstrated, it can be deduced from general logical arguments.
  
Entropy is a macroscopic notion like temperature and is used when we do not have absolute knowledge about the exact micro configurations of the system.
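
As a rough illustration of this (a sketch added here, with assumed example numbers), one can compute the Gibbs/Shannon entropy S = -∑ p_i ln p_i of the probabilities p_i we assign to the microstates; it measures how much information about the exact microconfiguration we are missing:

<code python>
# Gibbs/Shannon entropy S = -sum_i p_i * ln(p_i) of a discrete
# probability distribution over microstates (natural units, "nats").
import math

def entropy(p):
    """Entropy of a distribution; the 0 * ln(0) terms are taken as 0."""
    return -sum(q * math.log(q) for q in p if q > 0)

uniform = [0.25, 0.25, 0.25, 0.25]  # no knowledge beyond "four microstates"
peaked  = [0.97, 0.01, 0.01, 0.01]  # nearly complete knowledge

print(entropy(uniform))  # ln(4) ≈ 1.386, the maximum for four states
print(entropy(peaked))   # ≈ 0.168, little information is missing
</code>

The uniform distribution, i.e. maximal ignorance, gives the largest entropy; assigning the least-informative distribution compatible with what we do know is exactly the maximum-entropy prescription of Jaynes mentioned above.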
<tabbox Abstract>
  
A great discussion of common misconceptions by E. T. Jaynes can be found here: https://pdfs.semanticscholar.org/d7ff/97069799d3a912803ddd2266cdf573c2461d.pdf

See also "[[http://bayes.wustl.edu/etj/articles/stand.on.entropy.pdf|Where do we stand on maximum entropy]]" by Jaynes.
  
  
<tabbox Why is it interesting?>
  
</tabbox>
  
  