
Entropy

Why is it interesting?

Layman

Explanations in this section should contain no formulas, but instead colloquial explanations of the kind you would hear during a coffee break or at a cocktail party.

Student

Shannon Entropy

The Shannon entropy is often taken as codifying the amount of information given in a probability distribution. The idea is that the most informative distribution gives probability 1 to some value (and 0 to all the others), and the least informative gives equal probability to all values (in this case $1/m$ for each of the $m$ values).
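The formula itself is not spelled out above, so for reference (the standard definition): for a distribution $p_1, \dots, p_m$ the Shannon entropy is

$$ H(p) = - \sum_{i=1}^{m} p_i \log p_i .$$

For the two extremes just mentioned: if some $p_j = 1$ and all other $p_i = 0$, every term vanishes and $H = 0$; if $p_i = 1/m$ for all $i$, then $H = \log m$, the maximum possible value.

A minimal numerical check (a sketch using NumPy; the function name is ours, and the convention $0 \log 0 = 0$ is handled explicitly):

<code python>
import numpy as np

def shannon_entropy(p):
    # Shannon entropy H = -sum_i p_i log(p_i), with the convention 0*log(0) = 0
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

print(shannon_entropy([1.0, 0.0, 0.0]))  # 0.0          -> most informative distribution
print(shannon_entropy([1/3, 1/3, 1/3]))  # log(3) ~ 1.1 -> least informative distribution
</code>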

https://adamcaulton.files.wordpress.com/2013/05/thermo51.pdf

Boltzmann Entropy

$$ S = k \log W $$ This is such a strikingly simple relation that one can hardly avoid jumping to the conclusion that it must be true in general; i.e., the entropy of any macroscopic thermodynamic state A is a measure of the phase volume $W_A$ occupied by all microstates compatible with A. It is convenient verbally to say that S measures the "number of ways" in which the macrostate A can be realized. This is justified in quantum theory, where we learn that a classical phase volume W does correspond to a number of global quantum states $n = W/h^{3N}$. So if we agree, as a convention, that we shall measure classical phase volume in units of $h^{3N}$, then this manner of speaking will be appropriate in either classical or quantum theory.
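A toy illustration of this "number of ways" counting (our example, not from the quoted text): take $N$ independent spins and define the macrostate by the number $n$ of spins pointing up. Then

$$ W(n) = \binom{N}{n}, \qquad S(n) = k \log \binom{N}{n} .$$

For $N = 100$ the macrostate "all spins down" has $W(0) = 1$ and hence $S = 0$, while the balanced macrostate has $W(50) \approx 10^{29}$ and $S \approx 29\, k \ln 10 \approx 67\, k$: even for such a small system, the entropy ranks macrostates by an enormous spread of multiplicities.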

We feel quickly that the conjectured generalization of the relation $S = k \log W$ must be correct, because of the light that this throws on our problem. Suddenly, the mysteries evaporate; the meaning of Carnot's principle, the reason for the second law, and the justification for Gibbs' variational principle all become obvious. Let us survey quickly the many things that we can learn from this remarkable discovery. Given a "choice" between going into two macrostates A and B, if $S_A < S_B$, a system will appear to show an overwhelmingly strong preference for B, not because it prefers any particular microstate in B, but only because there are so many more of them.
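The quantitative form of this "preference" (implicit in the quote, spelled out here): if all microstates compatible with the data are treated as equally likely, the odds of finding the system in macrostate B rather than A are just the ratio of the corresponding phase volumes,

$$ \frac{P(B)}{P(A)} = \frac{W_B}{W_A} = e^{(S_B - S_A)/k} ,$$

so even a tiny entropy difference translates into astronomically lopsided odds, as the next paragraph quantifies.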

As noted in Appendix C, an entropy difference ($S_B - S_A$) corresponding to one microcalorie at room temperature indicates a ratio $W_B / W_A > \exp(10^{15})$. Thus violations are so improbable that Carnot's principle, or the equivalent Clausius statement, appear in the laboratory as absolutely rigid "stone wall" constraints, suggesting a law of physics rather than a matter of probability.
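The quoted number can be checked with a one-line estimate (standard values, not part of the quote): one microcalorie is about $4.2 \times 10^{-6}\, \mathrm{J}$, so at $T \approx 300\, \mathrm{K}$

$$ \Delta S = \frac{Q}{T} \approx \frac{4.2 \times 10^{-6}\, \mathrm{J}}{300\, \mathrm{K}} \approx 1.4 \times 10^{-8}\, \mathrm{J/K}, \qquad \frac{W_B}{W_A} = e^{\Delta S / k} \approx e^{1.4 \times 10^{-8} / 1.4 \times 10^{-23}} = e^{10^{15}} .$$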

https://pdfs.semanticscholar.org/d7ff/97069799d3a912803ddd2266cdf573c2461d.pdf

Gibbs Entropy

In the beginning there was just Clausius' weak statement that the entropy of a system tends to increase:

$$ S_{initial} \leq S_{final} .$$

This old statement has been replaced by the modern view of Gibbs and Jaynes:

Instead of Clausius' weak statement that the total entropy of all bodies involved "tends" to increase, Gibbs made the strong prediction that it will increase, up to the maximum value permitted by whatever constraints (conservation of energy, volume, mole numbers, etc.) are imposed by the experimental arrangement and the known laws of physics. Furthermore, the systems for which this is predicted can be more complicated than those envisaged by Clausius; they may consist of many different chemical components, free to distribute themselves over many phases. Gibbs' variational principle resolved the ambiguity: Given the initial macroscopic data defining a nonequilibrium state, there are millions of conceivable final equilibrium macrostates to which our system might go, all permitted by the conservation laws. Which shall we choose as the most likely to be realized?

Although he gave a definite answer to this question, Gibbs noted that his answer was not found by deductive reasoning. Indeed, the problem had no deductive solution because it was ill-posed. There are initial microstates, allowed by the data and the laws of physics, for which the system will not go to the macrostate of maximum entropy. There may be additional constraints, unknown to us, which make it impossible for the system to get to that state; for example, new "constants of the motion". So on what grounds could he justify making that choice in preference to all others?

At this point thermodynamics takes on a fundamentally new character. We have to recognize the distinction between two different kinds of reasoning; deduction from the laws of physics, and human inference from whatever information you or I happen to have. Instead of asking, "What do the laws of physics require the system to do?", which cannot be answered without knowledge of the exact microstate, Gibbs asked a more modest question, which can be answered: "What is the best guess we can make, from the partial information that we have?"
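To make "maximizing entropy under constraints" concrete, here is the standard textbook calculation (a sketch, not part of the quoted passage): maximize $S = -k \sum_i p_i \ln p_i$ subject to normalization $\sum_i p_i = 1$ and a fixed mean energy $\sum_i p_i E_i = U$. Introducing Lagrange multipliers and setting the variation with respect to each $p_i$ to zero gives $-k(\ln p_i + 1) - \lambda - k\beta E_i = 0$, i.e.

$$ p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i} ,$$

the canonical (Boltzmann) distribution, with $\beta$ fixed by the energy constraint. In Gibbs' and Jaynes' reading, this is the "best guess" distribution compatible with the partial information we actually have.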

[…]

Gibbs said almost nothing about what entropy really means. He showed, far more than anyone else, how much we can accomplish by maximizing entropy. Yet we cannot learn from Gibbs: "What are we actually doing when we maximize entropy?" For this we must turn to Boltzmann.

https://pdfs.semanticscholar.org/d7ff/97069799d3a912803ddd2266cdf573c2461d.pdf

Researcher

The motto in this section is: the higher the level of abstraction, the better.

A great discussion of common misconceptions by Jaynes himself can be found here https://pdfs.semanticscholar.org/d7ff/97069799d3a912803ddd2266cdf573c2461d.pdf

Common Question 1
Common Question 2

Examples

Example 1
Example 2

History
