This entry is about the notion of naturalness in (particle-)physics. For the notion in mathematics see at natural transformation.







In particle physics the term “naturalness” refers to the vague idea that a model of physics should work without requiring unlikely-looking ad-hoc coincidences or “fine-tuning” of its parameters, and that instead any such “conspiracy” among parameter values should have a systematic cause, such as a symmetry of the model.

An archetypical example, which has at times been referred to as the naturalness problem of the standard model of particle physics, is that the mass of the Higgs boson is many orders of magnitude below the Planck scale, while at the same time potentially receiving very large quantum corrections from the presence of (potentially undetected) heavy particles; these corrections must therefore cancel out coincidentally against the bare mass. This is also called the hierarchy problem.
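To indicate the orders of magnitude involved: in the standard textbook estimate (using a hard momentum cutoff $\Lambda$, which is one particular regularization choice, in line with the scheme-dependence discussed below), a heavy Dirac fermion $F$ with Yukawa coupling $\lambda_F$ to the Higgs field contributes to the Higgs mass-squared schematically as

```latex
% One-loop correction to the Higgs mass-squared from a heavy Dirac
% fermion F with Yukawa coupling \lambda_F, in the schematic textbook
% estimate with a hard momentum cutoff \Lambda (this is a
% regularization-dependent statement, not a scheme-independent one):
\[
  \delta m_H^2
  \;\simeq\;
  - \frac{ {\vert \lambda_F \vert}^2 }{ 8 \pi^2 }
    \left[
      \Lambda^2
      \;-\;
      6\, m_F^2 \, \ln \frac{\Lambda}{m_F}
      \;+\;
      \cdots
    \right]
\]
% With the observed m_H of about 125 GeV and \Lambda taken near the
% Planck scale of about 10^{19} GeV, the bare mass-squared and this
% correction would have to cancel to roughly 30 significant digits.
```

Taken at face value, this cancellation to dozens of digits is the “unlikely-looking coincidence” that the naturalness paradigm asks to be explained by a symmetry.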

Accordingly, a popular suggestion has been that there should be a symmetry which naturally explains this otherwise coincidental cancellation, namely low-energy supersymmetry. The failure to observe such a symmetry at the LHC experiment has led to a re-examination of the naturalness paradigm.

The other main example of (lack of) naturalness often considered is the cosmological constant (dark energy), which seems tiny when compared to the quantum corrections it naively receives from the vacuum energy of all fields in the observable universe.
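The naive estimate here (again a cutoff-dependent, order-of-magnitude statement) compares the vacuum energy density obtained by cutting off field-theoretic vacuum fluctuations at the Planck scale with the observed value:

```latex
% Naive dimensional estimate of the vacuum energy density with a
% Planck-scale cutoff, versus the observed dark-energy density
% (orders of magnitude only):
\[
  \rho_{vac}^{naive}
  \;\sim\;
  M_{Pl}^4
  \;\sim\;
  \left( 10^{19}\,\mathrm{GeV} \right)^4
  \,,
  \qquad
  \rho_{\Lambda}^{obs}
  \;\sim\;
  \left( 10^{-3}\,\mathrm{eV} \right)^4
\]
% The ratio is (10^{28} eV / 10^{-3} eV)^4 = 10^{124}, the origin of
% the often-quoted mismatch of "about 120 orders of magnitude".
```

This discrepancy of roughly 120 orders of magnitude is what makes the observed cosmological constant look “unnaturally” small in this naive counting.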

Problems with the concept

A key problem is arguably that the principle of naturalness has been used in a very vague and very ambiguous sense. Even apart from the problem of which large or small numbers are to be regarded as “unlikely” (see the quote below from Wilson 04), there is technical fine print.

1) Dependence on renormalization scheme

Both the mass of the Higgs particle as well as the cosmological constant are subject to renormalization in perturbative quantum field theory, whence “quantum corrections”.

(For more on the renormalization-freedom in the cosmological constant see there.)

A key fact, that seems to be mostly overlooked in discussion of naturalness/fine-tuning, is that renormalization parameters a priori take values in an affine space (see this theorem).

This means that without choosing an arbitrary origin and then a set of coordinates, there is no sense in which a quantum correction is “large” or “small”. An origin and a set of coordinates is however provided by a choice of renormalization scheme, and discussions in the literature typically implicitly take such a choice for granted.
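Schematically (in the spirit of the Stückelberg–Petermann picture of renormalization referred to above; the notation $\mathcal{R}$, $\mathcal{V}$ here is for illustration only):

```latex
% The set \mathcal{R} of renormalization choices is a torsor over a
% vector space \mathcal{V} of finite counterterm redefinitions: the
% difference of any two choices is canonically an element of \mathcal{V},
\[
  S, S' \;\in\; \mathcal{R}
  \quad\Longrightarrow\quad
  S' - S \;\in\; \mathcal{V}
  \,,
\]
% but \mathcal{R} itself has no canonical zero element. Hence asking
% whether a single choice S (a "quantum correction") is "large" or
% "small" only makes sense after arbitrarily fixing an origin S_0 and
% coordinates on \mathcal{V}, i.e. after fixing a renormalization scheme.
```

So statements about the size of a quantum correction implicitly measure a difference relative to a chosen scheme, not an intrinsic magnitude.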

2) General viability

Even if one assumes that the parameters under consideration are invariantly defined numbers, independent of choices such as renormalization schemes (see above), it has been questioned whether it is fruitful to consider the “naturalness” of the values they take.

The following is from Kane 17 (there in a discussion of G2-MSSM model building):

Until recently there were no theories predicting the values of superpartner masses. The arguments based on ‘naturalness’ are basically like saying the weather tomorrow should be the same as today. The opposite of naturalness is having a theory. […] Claims [superpartners] should have been seen would be valid given so called naturalness arguments, but are wrong in actual theories. Many of us think that is a misuse of the idea of naturalness, but it is the fashionable use.

(p. 33 (3-2))

Some arguments (‘naturalness’) can be used to estimate what values [MSSM parameters] might have. If those arguments were correct some superpartners would already have been discovered at the CERN LHC. It would have been nice if the naturalness arguments had worked, but they did not. Since they were not predictions from a theory it is not clear how to interpret that.

(p. 39 (4-3))

The failure of naïve naturalness to describe the world tells us we should look harder for a theory that does, an ‘ultraviolet completion’. Compactified string/ M-theories appear to be strong candidates for such a theory.

The alternative to naturalness, often neglected as an alternative, is having a theory.

(p. 57 (6-1))

Similarly Wilson 04, p. 10:

[…] The claim was that it would be unnatural for such particles to have masses small enough to be detectable soon. But this claim makes no sense when one becomes familiar with the history of physics. There have been a number of cases where numbers arose that were unexpectedly small or large. An early example was the very large distance to the nearest star as compared to the distance to the Sun, as needed by Copernicus, because otherwise the nearest stars would have exhibited measurable parallax as the Earth moved around the Sun. Within elementary particle physics, one has unexpectedly large ratios of masses, such as the large ratio of the muon mass to the electron mass. There is also the very small value of the weak coupling constant. In the time since my paper was written, another set of unexpectedly small masses was discovered: the neutrino masses. There is also the riddle of dark energy in cosmology, with its implication of possibly an extremely small value for the cosmological constant in Einstein’s theory of general relativity. This blunder was potentially more serious, if it caused any subsequent researchers to dismiss possibilities for very large or very small values for parameters that now must be taken seriously.



An attempt to make the notion precise is due to

  • Greg Anderson, Diego Castano, Measures of fine tuning, Phys. Lett. B 347:300-308, 1995 (arXiv:hep-ph/9409419)


Last revised on April 15, 2019 at 05:20:08. See the history of this page for a list of all contributions to it.