====== Entropy ======

<tabbox Intuitive>

Entropy is a measure of chaos or randomness in a system. An important property of entropy is that it increases over time. In practice, every system gets more chaotic over time, unless we use energy to bring it into order.

An explanation for this steady increase of entropy is that there are far more possible states of high entropy than there are states of low entropy.

Therefore, it is much more likely that a system will end up in a state of higher entropy.

A familiar example is sand. It is far more likely to find sand lying randomly around than in an ordered form like a sand castle.

----

  * https://gravityandlevity.wordpress.com/2009/04/01/entropy-and-gambling/
  * [[https://arxiv.org/abs/1705.02223|Entropy? Honest!]] by Toffoli

<tabbox Concrete>

  * [[https://philpapers.org/archive/FRIEA.pdf|ENTROPY: A GUIDE FOR THE PERPLEXED]] by Roman Frigg and Charlotte Werndl

----
The observation that the entropy of a system always increases, which was originally purely empirical, can be deduced from general logical arguments, as Jaynes demonstrated.

Entropy is a macroscopic notion, like temperature, and is used when we do not have complete knowledge of the exact microscopic configuration of the system.

Boltzmann's interpretation of entropy is that it is a measure of the "number of ways" in which a macrostate can be realized in terms of microstates. If many microstates yield the same macrostate, this macrostate has a high probability.

In this sense, entropy gets maximized because the macrostate of maximal entropy is the most probable macroscopic configuration of the system.

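As a toy illustration of this counting (not taken from the references on this page), consider $N$ coin tosses: the macrostate is the number of heads $n$, each head/tail sequence is a microstate, and the multiplicity $W(n)$ counts how many microstates realize the macrostate. The minimal Python sketch below shows that the macrostate with the largest $W$, and hence the largest $S = \ln W$, is by far the most probable one.

<code python>
from math import comb, log

# Toy system: N fair coin tosses.
# Macrostate  = number of heads n
# Microstates = individual head/tail sequences (2**N of them, all equally likely)
# Multiplicity W(n) = C(N, n), "Boltzmann entropy" S(n) = ln W(n)

N = 100
for n in (0, 25, 50, 75, 100):
    W = comb(N, n)          # number of microstates realizing macrostate n
    S = log(W)              # entropy of this macrostate (in units of k_B)
    P = W / 2**N            # probability of observing macrostate n
    print(f"n = {n:3d}   W = {W:10.3e}   S = {S:6.2f}   P = {P:10.3e}")
</code>

The macrostate $n = 50$ maximizes both the multiplicity and the entropy, and the probability of finding the system far away from it (say $n = 0$ or $n = 100$) is astronomically small, exactly as the argument above suggests.
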
Jaynes goes one step further and argues that entropy is a tool that explicitly takes into account that what we can predict depends on our state of knowledge. While what happens in the real world is governed by the physical laws, what matters for us is what we can actually predict, and predictions necessarily rely on what we know about the given system.

In his own words:

<blockquote>
Instead of asking, "What do the laws of physics require the system to do?", which cannot be answered without knowledge of the exact microstate, Gibbs asked a more modest question, which can be answered: "What is the best guess we can make, from the partial information that we have?"

<cite>https://pdfs.semanticscholar.org/d7ff/97069799d3a912803ddd2266cdf573c2461d.pdf</cite>
</blockquote>

In combination with Boltzmann's interpretation, a nice logical picture emerges:
  
<note tip>
The fact that we always observe entropy to increase is not something that nature **does**, but rather a consequence of how we make predictions for physical systems with limited knowledge. For a macroscopic system the microscopic details are usually far too complicated to know exactly. Nevertheless, we want to make predictions. Our best guess in such a situation is the most probable outcome, i.e. the macroscopic configuration that can be realized by the largest number of microscopic configurations, which is precisely the configuration of maximal entropy.

In this sense, the maximization of entropy is a general principle for reasoning in situations where we do not know all the details.

For a nice, simple example of this kind of reasoning, see http://www-mtl.mit.edu/Courses/6.050/2003/notes/chapter9.pdf; a small numerical sketch is also given directly after this note.
</note>
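
The following is a minimal numerical sketch of such maximum-entropy reasoning (a made-up example, not the one worked out in the linked notes): suppose the only thing we know about a six-sided die is its average value. Maximizing the Shannon entropy under that single constraint gives an exponential-family distribution $p_i \propto e^{\lambda x_i}$, and the multiplier $\lambda$ can be found by simple bisection.

<code python>
import numpy as np

# Maximum-entropy guess for a die whose only known property is its mean.
# The entropy-maximizing distribution under a mean constraint has the form
# p_i ∝ exp(lam * x_i); we fix lam so that the mean comes out right.

x = np.arange(1, 7)       # faces of the die
target_mean = 4.5         # assumed piece of partial information (a fair die gives 3.5)

def mean_for(lam):
    w = np.exp(lam * (x - x.max()))   # shift the exponent for numerical stability
    p = w / w.sum()
    return (p * x).sum()

lo, hi = -20.0, 20.0                  # bracket for the Lagrange multiplier
for _ in range(100):                  # bisection: mean_for is increasing in lam
    mid = 0.5 * (lo + hi)
    if mean_for(mid) < target_mean:
        lo = mid
    else:
        hi = mid

lam = 0.5 * (lo + hi)
w = np.exp(lam * (x - x.max()))
p = w / w.sum()
S = -(p * np.log(p)).sum()
print("p        =", np.round(p, 4))
print("mean     =", round(float((p * x).sum()), 3))
print("S (nats) =", round(float(S), 3))
</code>

With no constraint at all the answer would be the uniform distribution (maximal entropy); the constraint tilts the distribution exponentially towards the larger faces, but no more than the given information forces it to.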

If this point of view is correct, an obvious question pops up: **Why then is the principle of maximum entropy so uniformly successful?**

The reason is that the multiplicity of the macroscopic configurations, i.e. the number of ways in which they can be realized in terms of microscopic configurations, has an extremely sharp maximum. This can be calculated explicitly, as shown, for example, on page 7 [[https://pdfs.semanticscholar.org/d7ff/97069799d3a912803ddd2266cdf573c2461d.pdf|here]]. There it is found for a simple system that "not only is $E'$ the value of $E_1$ that can happen in the greatest number of ways for given total energy $E$; the vast majority of all possible microstates with total energy $E$ have $E_1$ very close to $E'$. Less than 1 in $10^8$ of all possible states have $E_1$ outside the interval ($E' \pm 6 \sigma$), far too narrow to measure experimentally".

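Here is a minimal sketch of this sharpness (an illustrative model, not the calculation from the cited paper): two Einstein solids with $N_1 = N_2 = 300$ oscillators share $E = 200$ energy quanta. The multiplicity of a split $(E_1, E - E_1)$ is the product of the two single-solid multiplicities, and almost all of the total weight sits in a narrow window around the most probable $E_1$.

<code python>
from math import comb

# Two Einstein solids sharing E energy quanta.  For a split (E1, E - E1)
# the multiplicity is the product of the single-solid multiplicities
#   Omega(E1) = C(E1 + N1 - 1, E1) * C(E - E1 + N2 - 1, E - E1).
# We check how much of the total weight lies close to the peak.

N1, N2, E = 300, 300, 200

omega = [comb(e1 + N1 - 1, e1) * comb(E - e1 + N2 - 1, E - e1) for e1 in range(E + 1)]
total = sum(omega)

e_peak = max(range(E + 1), key=lambda e1: omega[e1])
for width in (5, 10, 20):
    window = sum(omega[max(0, e_peak - width): e_peak + width + 1])
    print(f"E1 within {width:2d} of the peak ({e_peak}): "
          f"{window / total:.6f} of all microstates")
</code>

Even for this tiny system most of the weight is concentrated in a narrow interval around the peak, and the relative width of that interval shrinks further as the numbers of oscillators and quanta grow towards macroscopic values, which is why the maximum-entropy estimate is essentially never contradicted.
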
If it were otherwise, for example if the maximum were broad or if there were many local maxima, the principle of maximum entropy would not be so powerful.

Thus, even

<blockquote>
if we had more information we would seldom do better in prediction of reproducible phenomena, because those are the same for virtually all microstates in an enormously large class C; and therefore also in virtually any subset of C. [...] Knowledge of the "data" E alone would not enable us to choose among the different values of $E_1$ allowed by [energy conservation]; the additional information contained in the entropy functions, nevertheless leads us to make one definite choice as far more likely than any other, on the information supposed.

<cite>https://pdfs.semanticscholar.org/d7ff/97069799d3a912803ddd2266cdf573c2461d.pdf</cite>
</blockquote>

--> Shannon Entropy#
  
<blockquote>
<cite>https://adamcaulton.files.wordpress.com/2013/05/thermo51.pdf2</cite>
</blockquote>
<--

--> Boltzmann Entropy#

<blockquote>
an entropy difference ($S_B - S_A$) corresponding to one microcalorie at room temperature indicates a ratio $W_B/W_A > \exp(10^{15})$. Thus violations are so improbable that Carnot's principle, or the equivalent Clausius statement (14), appear in the laboratory as absolutely rigid "stone wall" constraints suggesting a law of physics rather than a matter of probability
</blockquote>
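
The size of the quoted ratio can be checked with a quick back-of-the-envelope estimate (my own arithmetic, using $\Delta S = Q/T$ and standard constants; the only inputs taken from the quote are "one microcalorie" and "room temperature"):

<code python>
from math import log10

# Order-of-magnitude check: entropy difference corresponding to one
# microcalorie of heat exchanged at room temperature.
k_B = 1.380649e-23        # Boltzmann constant in J/K
Q   = 1e-6 * 4.184        # one microcalorie in joules
T   = 300.0               # room temperature in kelvin

delta_S = Q / T                   # entropy difference in J/K
exponent = delta_S / k_B          # W_B/W_A = exp(delta_S / k_B)
print(f"delta_S  = {delta_S:.2e} J/K")
print(f"W_B/W_A  = exp({exponent:.2e})  ~  exp(10^{log10(exponent):.0f})")
</code>

So an entropy difference far below anything measurable calorimetrically already corresponds to a multiplicity ratio of roughly $e^{10^{15}}$, which is why violations are never observed in practice.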
  
<--

--> Gibbs Entropy#
  
In the beginning there was just Clausius' weak statement that the entropy of a system tends to increase:
<cite>https://pdfs.semanticscholar.org/d7ff/97069799d3a912803ddd2266cdf573c2461d.pdf</cite>
</blockquote>
<--
<tabbox Abstract>

A great discussion of common misconceptions by E. T. Jaynes can be found here: https://pdfs.semanticscholar.org/d7ff/97069799d3a912803ddd2266cdf573c2461d.pdf

See also "[[http://bayes.wustl.edu/etj/articles/stand.on.entropy.pdf|Where do we stand on maximum entropy]]" by Jaynes.
  
  
<tabbox Why is it interesting?>
  
</tabbox>
  
  