Hierarchy Puzzle

Intuitive

According to the Higgs mechanism, space is filled with a uniform density of Higgs substance. But because of the complexity of the quantum vacuum, the calm sea of Higgs substance is continually disturbed by the rapid production and annihilation of all sorts of virtual particles. The constant buzz of virtual particles affects the density of the space-filling Higgs substance. Just as ghosts flickering in and out of the netherworld leave vivid memories in the minds of impressionable individuals, so virtual particles leave their indelible imprint in the Higgs substance. The swirling of virtual particles effectively gives a gigantic contribution to the density of the Higgs substance, which becomes extremely thick. Theoretical calculations show that this contribution to the density of the Higgs substance is proportional to the maximum energy carried by virtual particles. Since virtual particles can carry huge amounts of energy, the molasses-like Higgs substance becomes thicker than mud or even as hard as rock when quantum-mechanical effects are taken into account. Ordinary particles moving inside this medium should feel a tremendous resistance or, in more precise physical terms, should acquire enormous masses. Calculations based on a simple extrapolation of the Standard Model all the way down to the Planck length yield the result that electrons should be one million billion times more massive than what we observe – as heavy as prokaryote bacteria. But since electrons are obviously not as heavy as bacteria, we are confronted with a puzzle: why is the Higgs substance so dilute in spite of the natural tendency of virtual particles to make it grow thicker? This dilemma is usually referred to as the naturalness problem. The density of the Higgs substance determines how far the W and Z particles can propagate the weak force; in other words, it determines the weak length. 
Therefore, when virtual particles make the Higgs substance thicker, they effectively make the weak length shorter. The naturalness problem then refers to the conflict between, on one side, the tendency of virtual particles to make the weak length as small as the Planck length and, on the other side, the observation that the two length scales differ by the enormous factor of $10^{17}$. Let us rephrase the problem with an analogy. Suppose that you insert a piece of ice into a hot oven. After waiting for a while, you open the oven and you discover that the ice is perfectly solid and hasn't melted at all. Isn't it puzzling? The air molecules inside the hot oven should have conveyed their thermal energy to the piece of ice, quickly raising its temperature and melting it. But they did not. The naturalness problem is equally puzzling. The energetic virtual particles are like the hot air molecules of the oven analogy, and the Higgs substance is like the piece of ice. The frenzied motion of virtual particles is communicated to the Higgs substance, which should become as hard as rock. And yet, it remains very dilute. The weak length should become as small as the Planck length. And yet, the two lengths differ by a factor of $10^{17}$. Just as inside a hot oven nothing can remain much cooler than the ambient temperature, so in the quantum vacuum virtual particles do not tolerate that the weak length remain much larger than the Planck length. Thus, the real puzzle is that no hierarchy between the weak and gravitational forces should exist at all, let alone a difference by a factor of $10^{17}$. The essence of the naturalness problem is that the anarchic behaviour of virtual particles does not tolerate hierarchies. At this point, a very important warning should be issued. The naturalness problem is not a question of logical consistency. As the word says, it is only a problem of naturalness. Virtual particles provide one part of the energy stored in the Higgs substance.
Nothing forbids the possibility of nature carefully choosing the initial density of the Higgs substance in such a way as to nearly compensate the effect from virtual particles. Under these circumstances, the enormous disparity between the weak and Planck lengths could be just the result of a precise compensation among various effects. Although this possibility cannot be logically excluded, it seems very contrived. Most physicists have difficulties accepting such accurate compensations between unrelated effects, and regard them as extremely unnatural.

"A Zeptospace Odyssey" by Guidice

Why is it interesting?

Since long ago [1, 2] physicists have been reluctant to accept small (or large) numbers without an underlying dynamical explanation, even when the smallness of a parameter is technically natural in the sense of ’t Hooft [3]. One reason for this reluctance is the belief that all physical quantities must eventually be calculable in a final theory with no free parameters. It would be strange for small numbers to pop up accidentally from the final theory without a reason that can be inferred from a low-energy perspective.

https://arxiv.org/pdf/1610.07962.pdf

Look at the Higgs field $\varphi$ responsible for breaking electroweak symmetry. We don't know its renormalized or physical mass precisely, but we do know that it is of order $M_{EW}$. Imagine calculating the bare perturbation series in some grand unified theory (the precise theory does not enter into the discussion), starting with some bare mass $\mu_0$ for $\varphi$. The Weisskopf phenomenon tells us that quantum corrections shift $\mu_0^2$ by a huge, quadratically cutoff-dependent amount $\delta\mu_0^2 \sim f^2\Lambda^2 \sim f^2 M_{GUT}^2$, where we have substituted for $\Lambda$ the only natural mass scale around, namely $M_{GUT}$, and where $f$ denotes some dimensionless coupling. To have the physical mass squared $\mu^2 = \mu_0^2 + \delta\mu_0^2$ come out to be of order $M_{EW}^2$, something like 28 orders of magnitude smaller than $M_{GUT}^2$, would require an extremely fine-tuned and highly unnatural cancellation between $\mu_0^2$ and $\delta\mu_0^2$. How this could happen "naturally" poses a severe challenge to theoretical physicists.
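Zee's 28 orders of magnitude can be made concrete: the bare $\mu_0^2$ must cancel the quantum shift $\delta\mu_0^2 \sim f^2 M_{GUT}^2$ down to one part in $10^{28}$. A minimal numerical sketch, assuming the standard scale choices $M_{GUT} \sim 10^{16}$ GeV, $M_{EW} \sim 10^2$ GeV, and $f \sim 1$ (Python's `decimal` module is used because floats cannot resolve a 28-digit cancellation):

```python
# Sketch of the fine-tuning in mu^2 = mu_0^2 + delta mu_0^2, using exact decimals.
# Scale choices below are conventional assumptions, not unique predictions.
from decimal import Decimal, getcontext

getcontext().prec = 40        # enough digits to resolve a 1-in-10^28 cancellation

M_GUT = Decimal("1e16")       # GeV, typical grand-unification scale
M_EW = Decimal("1e2")         # GeV, electroweak scale

delta_mu0_sq = M_GUT ** 2     # quantum shift ~ f^2 * M_GUT^2 with f ~ 1
mu_sq_target = M_EW ** 2      # desired physical mass squared, ~ M_EW^2

# The bare mass squared must be chosen to cancel the shift almost exactly:
mu0_sq = mu_sq_target - delta_mu0_sq

# Fractional precision of the required cancellation:
tuning = abs(mu0_sq / delta_mu0_sq + 1)
print(f"required cancellation: 1 part in {1 / tuning:.1e}")
```

The bare and loop contributions must agree in their first 28 digits and differ only afterwards, which is exactly the "extremely fine-tuned" compensation the text describes.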

Naturalness The hierarchy problem is closely connected with the notion of naturalness dear to the theoretical physics community. We naturally expect that dimensionless ratios of parameters in our theories should be of order unity, where the phrase "order unity" is interpreted liberally between friends, say anywhere from $10^{-2}$ or $10^{-3}$ to $10^2$ or $10^3$. Following 't Hooft, we can formulate a technical definition of naturalness: the smallness of a dimensionless parameter $\eta$ would be considered natural only if a symmetry emerges in the limit $\eta \to 0$. Thus, fermion masses could be naturally small, since, as you will recall from chapter II.1, a chiral symmetry emerges when a fermion mass is set equal to zero. On the other hand, no particular symmetry emerges when we set either the bare or renormalized mass of a scalar field equal to zero. This represents the essence of the hierarchy problem.
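The contrast behind 't Hooft's criterion shows up directly in the one-loop corrections (standard textbook estimates, not part of the excerpt): the fermion correction is proportional to the fermion mass itself, because chiral symmetry must be restored as $m_f \to 0$, while the scalar correction has no symmetry to protect it and grows with the cutoff:

```latex
\delta m_f \sim \frac{\alpha}{\pi}\, m_f \ln\frac{\Lambda}{m_f}
\qquad \text{(protected: } \delta m_f \to 0 \text{ as } m_f \to 0\text{)}

\delta\mu^2 \sim f^2 \Lambda^2
\qquad \text{(unprotected: quadratic in the cutoff, independent of } \mu\text{)}
```

So a light fermion stays light under radiative corrections, while a light scalar does not; this is why the Higgs mass, and not the electron mass, poses the hierarchy problem.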

page 419 in QFT in a Nutshell by A. Zee