The traditional approach of theoreticians, going back to the foundation of quantum mechanics, is to run to Schrödinger’s equation when confronted by a problem in atomic, molecular or solid state physics! One establishes the Hamiltonian, makes some (hopefully) sensible approximations and then proceeds to attempt to solve for the energy levels, eigenstates and so on. However, for truly complicated systems in what, these days, is much better called “condensed matter physics,” this is a hopeless task; furthermore, in many ways it is not even a very sensible one!
The modern attitude is, rather, that the task of the theorist is to understand what is going on and to elucidate which are the crucial features of the problem. For instance, if it is asserted that the exponent β depends on the dimensionality, d, and on the symmetry number, n, but on no other factors, then the theorist’s job is to explain why this is so and subject to what provisos. If one had a large enough computer to solve Schrödinger’s equation and the answer came out that way, one would still have no understanding of why this was the case!
Thus the need is to gain understanding, not just numerical answers: that does not necessarily mean going back to Schrödinger’s equation which, in any case, should be really regarded just as an approximation to some sort of gauge field theory. So the crucial change of emphasis of the last 20 or 30 years that distinguishes the new era from the old one is that when we look at the theory of condensed matter nowadays we inevitably talk about a “model”. As a matter of fact even Schrödinger’s equation and gauge field theories themselves are just models of the physical world, albeit pretty good ones as far as we can presently judge.
We should be prepared to look even at rather crude models, and, in particular, to study the relations between different models. We may well try to simplify the nature of a model to the point where it represents a ‘mere caricature’ of reality. But notice that when one looks at a good political cartoon one can recognise the various characters even though the artist has portrayed them with but a few strokes. Those well-chosen strokes tell one all one really needs to know about the individual, his expression, his intentions and his character. So, accepting Frenkel’s guidance, […] a good theoretical model of a complex system should be like a good caricature: it should emphasise those features which are most important and should downplay the inessential details. Now the only snag with this advice is that one does not really know which are the inessential details until one has understood the phenomena under study. Consequently, one should investigate a wide range of models and not stake one’s life (or one’s theoretical insight) on one particular model alone.

Nevertheless, one model which, historically, has been of particular importance and which has given us a great deal of confidence in the phenomenological descriptions of critical exponents and scaling presented earlier deserves special attention: this is the so-called Ising model. Even today its study continues to provide us with new insights.
Michael E. Fisher. Scaling, universality and renormalization group theory. In F. J. W. Hahne, editor, Critical Phenomena, volume 186 of Lecture Notes in Physics. Springer-Verlag, Berlin, 1983. Summer School held at the University of Stellenbosch, South Africa, January 18–29, 1982.