nLab naturalness


This entry is about the notion of naturalness in (particle-)physics. For the notion in mathematics see at natural transformation.


Idea

In (particle-)physics the term “naturalness” [’t Hooft 1980, Gell-Mann 1983] refers to the vague idea that a model of physics is expected to work without requiring unlikely-looking ad-hoc coincidences or “fine-tuning” of its parameters (such as the choice of renormalization constants), and that instead all such “conspiracies” among parameter values are to have some systematic cause, such as some (broken) symmetry of the model.

An archetypical example, which at times has been referred to as the naturalness problem of the standard model of particle physics, is that the mass of the Higgs boson is many orders of magnitude below the Planck scale, while at the same time it receives potentially very large quantum corrections from the presence of (possibly undetected) heavy particles, which must therefore cancel out coincidentally. This is also called the hierarchy problem.
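
Schematically, and assuming for concreteness a regularization with a hard momentum cutoff $\Lambda$ (itself part of a choice of renormalization scheme, see below), the observed Higgs mass is the sum of a bare mass term and a quantum correction of the form

$$
  m^2_{H,\mathrm{phys}}
  \;=\;
  m^2_{H,\mathrm{bare}} + \delta m^2_H
  \,,
  \qquad
  \delta m^2_H \;\sim\; \frac{\lambda^2}{16 \pi^2} \Lambda^2
  \,,
$$

where $\lambda$ is the coupling to a heavy particle. Taking $\Lambda$ near the Planck scale $\sim 10^{19}\,\mathrm{GeV}$ while $m_{H,\mathrm{phys}} \approx 125\,\mathrm{GeV}$ requires the two terms on the right to cancel to a relative precision of order $(m_{H,\mathrm{phys}}/\Lambda)^2 \sim 10^{-34}$; it is this cancellation that is felt to be “unnatural”.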

Accordingly, a popular suggestion has been that there should be a symmetry in nature which naturally explains this otherwise coincidental cancellation, namely low-energy supersymmetry. The failure to observe such a symmetry at the LHC experiment is leading the high energy physics community to re-examine the naturalness paradigm and/or to focus on alternative mechanisms, such as a composite Higgs boson.

The other main example of (lack of) naturalness often considered is the cosmological constant (dark energy), whose value seems tiny when compared to the quantum corrections which it naively receives from the vacuum energy of all fields in the observable universe.
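
Schematically again: summing the zero-point energies of field modes up to a hard cutoff $\Lambda$ yields a vacuum energy density of order

$$
  \rho_{\mathrm{vac}} \;\sim\; \frac{\Lambda^4}{16 \pi^2}
  \,,
$$

which for $\Lambda$ near the Planck scale is of order $(10^{19}\,\mathrm{GeV})^4$, while the observed dark energy density is of order $(10^{-12}\,\mathrm{GeV})^4$, smaller by roughly 120 orders of magnitude.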

Problems with the concept

A key problem is arguably that the principle of naturalness has been used in a vague and ambiguous sense. Even apart from the question of which large or small numbers are to be regarded as “unlikely” (see the quote from Wilson 04 below), there is technical fine print.

1) Dependence on renormalization scheme

Both the mass of the Higgs particle and the cosmological constant are subject to renormalization in perturbative quantum field theory, whence their “quantum corrections”.

(For more on the renormalization freedom in the cosmological constant, see there.)

A key fact, which seems to be mostly overlooked in discussions of naturalness/fine-tuning, is that renormalization parameters a priori take values in an affine space (see this theorem).

This means that without choosing an arbitrary origin and then a set of coordinates, there is no sense in which a quantum correction is “large” or “small”. An origin and a set of coordinates are, however, provided by a choice of renormalization scheme, and discussions in the literature typically take such a choice for granted implicitly.
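
Stated informally: if $c$ denotes such a renormalization parameter, then a change of renormalization scheme acts on it by an affine shift

$$
  c \;\mapsto\; c + \delta
  \,,
$$

where $\delta$ depends on the pair of schemes, so that the absolute magnitude $|c|$ is not scheme-invariant; only differences of parameter values, or values measured relative to a fixed chosen scheme, are well-defined.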

2) General viability

Even if one assumes that the parameters under consideration are invariantly defined numbers, independent of choices such as renormalization schemes (see above), it has been questioned whether it is fruitful to consider the “naturalness” of the values they take.

The following is from Kane 17 (there in a discussion of G₂-MSSM model building):

Until recently there were no theories predicting the values of superpartner masses. The arguments based on ‘naturalness’ are basically like saying the weather tomorrow should be the same as today. The opposite of naturalness is having a theory. [...] Claims [superpartners] should have been seen would be valid given so called naturalness arguments, but are wrong in actual theories. Many of us think that is a misuse of the idea of naturalness, but it is the fashionable use.

(p. 33 (3-2))

Some arguments (‘naturalness’) can be used to estimate what values [MSSM parameters] might have. If those arguments were correct some superpartners would already have been discovered at the CERN LHC. It would have been nice if the naturalness arguments had worked, but they did not. Since they were not predictions from a theory it is not clear how to interpret that.

(p. 39 (4-3))

The failure of naïve naturalness to describe the world tells us we should look harder for a theory that does, an ‘ultraviolet completion’. Compactified string/M-theories appear to be strong candidates for such a theory.

The alternative to naturalness, often neglected as an alternative, is having a theory.

(p. 57 (6-1))

Similarly Wilson 04, p. 10:

[...] The claim was that it would be unnatural for such particles to have masses small enough to be detectable soon. But this claim makes no sense when one becomes familiar with the history of physics. There have been a number of cases where numbers arose that were unexpectedly small or large. An early example was the very large distance to the nearest star as compared to the distance to the Sun, as needed by Copernicus, because otherwise the nearest stars would have exhibited measurable parallax as the Earth moved around the Sun. Within elementary particle physics, one has unexpectedly large ratios of masses, such as the large ratio of the muon mass to the electron mass. There is also the very small value of the weak coupling constant. In the time since my paper was written, another set of unexpectedly small masses was discovered: the neutrino masses. There is also the riddle of dark energy in cosmology, with its implication of possibly an extremely small value for the cosmological constant in Einstein’s theory of general relativity. This blunder was potentially more serious, if it caused any subsequent researchers to dismiss possibilities for very large or very small values for parameters that now must be taken seriously.


References

Early discussion:

  • Gerard ’t Hooft, Naturalness, Chiral Symmetry, and Spontaneous Chiral Symmetry Breaking, in: G. ’t Hooft et al., Recent Developments in Gauge Theories, NATO Advanced Study Institutes Series 59, Springer (1980) [doi:10.1007/978-1-4684-7571-5_9]

    [p. 135:] “it is unlikely that the microscopic equations contain various free parameters that are carefully adjusted by Nature to give cancelling effects such that the macroscopic systems have some special properties. This is a philosophy which we would like to apply to the unified gauge theories: the effective interactions at a large length scale, corresponding to a low energy scale $\mu_1$, should follow from the properties at a much smaller length scale, or higher energy scale $\mu_2$, without the requirement that various different parameters at the energy scale $\mu_2$ match with an accuracy of the order of $\mu_1/\mu_2$. That would be unnatural.”

  • Murray Gell-Mann, introductory talk at Shelter Island II (1983) [pdf, pdf]

    in: Shelter Island II: Proceedings of the 1983 Shelter Island Conference on Quantum Field Theory and the Fundamental Problems of Physics. MIT Press. pp. 301–343. ISBN 0-262-10031-2.

    “we seem to have to dial various renormalized quantities to small values. The situation of having numerous arbitrary dimensional parameters is even more humiliating when some of them are very small. […] First of all, we’d like to interpret small or nearly symmetric quantities as coming from a slightly broken symmetry; otherwise they don’t make sense to us. […] When we can’t avoid dialing some renormalized quantity to a small value […] that situation has recently been described as a problem of ‘naturalness’.”

Survey and further discussion:

An attempt to make the notion precise is due to

Critical comments are in

  • Kenneth G. Wilson, The origins of lattice gauge theory (2004) [arXiv:hep-lat/0412043]

  • Gordon Kane, String theory and the real world, Morgan & Claypool (2017)

See also
