<tabbox Intuitive>
Entropy is a measure of chaos or randomness in a system. An important property of entropy is that it increases over time. In practice, every system gets more chaotic over time, unless we use energy to bring it into order.
  
An explanation for this steady increase of entropy is that there are far more possible states of high entropy than there are states of low entropy.
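As a rough, minimal sketch of this counting argument (plain Python; the coin model and the choice N = 100 are illustrative, not taken from this page): treat a sequence of N coin flips as a microstate and the number of heads as the macrostate. The ordered macrostates (all heads or all tails) are realized by a single microstate each, while the mixed macrostates near N/2 heads contain almost all of the 2^N microstates.

<code python>
from math import comb

N = 100  # number of coins, standing in for the particles of a system

# Multiplicity of the macrostate "k heads": the number of distinct
# head/tail sequences (microstates) that realize it is C(N, k).
multiplicity = {k: comb(N, k) for k in range(N + 1)}

total = 2 ** N  # total number of microstates

# The perfectly ordered macrostates have exactly one microstate each:
print(multiplicity[0], multiplicity[N])    # 1 1

# The maximally mixed macrostate dominates by many orders of magnitude:
print(multiplicity[N // 2])                # roughly 1.01e29

# Nearly all microstates lie close to the mixed macrostate:
near_mixed = sum(multiplicity[k] for k in range(40, 61))
print(near_mixed / total)                  # roughly 0.96
</code>

A system that wanders randomly through its microstates is therefore almost always found in a macrostate of huge multiplicity, which is why entropy, which grows with multiplicity, almost always increases.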
-----
  
The observation that the entropy of a system always increases was originally purely empirical, but it can be deduced from general logical arguments, as was demonstrated by Jaynes.
  
Entropy is a macroscopic notion, like temperature, and is used when we do not have complete knowledge of the exact microconfiguration of the system.
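To make this "missing knowledge" reading concrete, here is a small sketch (plain Python; the four-state system and the probability assignments are made-up examples) of the Shannon entropy $H = -\sum_i p_i \log_2 p_i$: it vanishes when we know the exact microconfiguration with certainty and is maximal when we know nothing at all.

<code python>
from math import log2

def shannon_entropy(probs):
    """Average missing information (in bits) about which
    microconfiguration of the system is actually realized."""
    return sum(-p * log2(p) for p in probs if p > 0)

# Complete knowledge: the system is certainly in one configuration.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0 bits

# No knowledge: all four configurations are equally likely.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
</code>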