basic_notions:entropy

  
<tabbox Intuitive>
Entropy is a measure of chaos or randomness in a system. An important property of entropy is that it increases over time. In practice, every system gets more chaotic over time, unless we use energy to bring it into order.
  
An explanation for this steady increase of entropy is that there are far more possible states of high entropy than there are states of low entropy.
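This counting argument can be made concrete with coin tosses: take a "macrostate" to be the total number of heads and a "microstate" to be one specific heads/tails sequence. The sketch below is an illustrative model, not part of the original text; it shows that the disordered macrostate contains astronomically more microstates than the ordered one.

```python
from math import comb

N = 100  # number of coins; a microstate is one specific heads/tails sequence

# The macrostate "k heads" contains comb(N, k) microstates.
ordered = comb(N, 0)     # perfectly ordered macrostate (all tails): 1 microstate
mixed = comb(N, N // 2)  # maximally mixed macrostate (half heads, half tails)

print(ordered)           # 1
print(mixed)             # roughly 1e29 microstates
print(mixed / 2 ** N)    # this one macrostate alone holds ~8% of all 2**N microstates
```

A system wandering randomly through its microstates is therefore overwhelmingly likely to be found in, and to move toward, a high-entropy macrostate.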
  * [[https://philpapers.org/archive/FRIEA.pdf|ENTROPY: A GUIDE FOR THE PERPLEXED]] by Roman Frigg and Charlotte Werndl
  
----
  
The observation that the entropy of a system always increases, which was at first a purely empirical finding, can be deduced from general logical arguments, as was demonstrated by Jaynes.
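Jaynes's argument rests on the information-theoretic notion of entropy: among all probability distributions consistent with what we know, the one with maximum Shannon entropy is the least biased choice. A minimal sketch (the example distributions are assumptions for illustration, not taken from the text above):

```python
from math import log

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * ln(p_i), in nats."""
    return -sum(pi * log(pi) for pi in p if pi > 0)

uniform = [0.25, 0.25, 0.25, 0.25]  # maximum-ignorance distribution over 4 outcomes
peaked = [0.97, 0.01, 0.01, 0.01]   # strongly biased ("ordered") distribution

print(shannon_entropy(uniform))  # ln(4) = 1.386..., the maximum for 4 outcomes
print(shannon_entropy(peaked))   # about 0.17, far lower
```

The uniform distribution attains the maximum ln(4); any extra structure in the distribution lowers the entropy, which is why high-entropy states correspond to maximal ignorance about the microstate.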
basic_notions/entropy.txt · Last modified: 2020/06/19 13:04 by 91.89.151.102