In physics, naturalness is the property that the dimensionless ratios between free parameters or physical constants appearing in a physical theory should take values "of order 1" and that free parameters are not fine-tuned. That is, a natural theory would have parameter ratios with values like 2.34 rather than 234000 or 0.000234.
The requirement that satisfactory theories should be "natural" in this sense is a current of thought initiated around the 1960s in particle physics. It is a criterion that arises from the seeming non-naturalness of the Standard Model and the broader topics of the hierarchy problem, fine-tuning, and the anthropic principle. Naturalness tends to point to a possible area of weakness or future development in current theories such as the Standard Model, whose parameters vary by many orders of magnitude and require extensive "fine-tuning" of their current values. The concern is that it is not yet clear whether these seemingly exact values have arisen by chance (perhaps explained by the anthropic principle or similar reasoning) or whether they follow from a more advanced theory, not yet developed, in which they turn out to be expected and well explained because of factors not yet part of particle physics models.
The concept of naturalness is not always compatible with Occam's razor, since many instances of "natural" theories have more parameters than "fine-tuned" theories such as the Standard Model. Naturalness in physics is closely related to the issue of fine-tuning, and many scientists have argued that the principle of naturalness is a specific application of Bayesian statistics.
A simple example:
Suppose a physics model requires four parameters which allow it to produce a very high quality working model, calculations, and predictions of some aspect of our physical universe. Suppose we find through experiments that the parameters have values:
- 1.2, 1.31, 0.9, and
- 404,331,557,902,116,024,553,602,703,216.58 (roughly 4 × 10^29).
We might wonder how such figures arise. But in particular we might be especially curious about a theory where three values are close to one and the fourth is so different; in other words, about the huge disproportion we seem to find between the first three parameters and the fourth. We might also wonder: if one force is so much larger than the others that it needs a factor of 4 × 10^29 to relate it to them in terms of effects, how did our universe come to be so exactly balanced when its forces emerged? In current particle physics the differences between some parameters are much larger than this, so the question is even more noteworthy.
One answer given by some physicists is the anthropic principle. If the universe came to exist by chance, and perhaps vast numbers of other universes exist or have existed, then life capable of physics experiments only arose in universes that by chance had very balanced forces. All the universes where the forces were not balanced, didn't develop life capable of the question. So if a lifeform like human beings asks such a question, it must have arisen in a universe having balanced forces, however rare that might be. So when we look, that is what we would expect to find, and what we do find.
A second answer is that perhaps there is a deeper understanding of physics, which, if we discovered and understood it, would make clear these aren't really fundamental parameters and there is a good reason why they have the exact values we have found, because they all derive from other more fundamental parameters that are not so unbalanced.
In particle physics, the assumption of naturalness means that, unless a more detailed explanation exists, all conceivable terms in the effective action that preserve the required symmetries should appear in this effective action with natural coefficients. For a field operator $\mathcal{O}_d$ of dimension $d$, the expected term is

$$c\,\Lambda^{4-d}\,\mathcal{O}_d,$$

where $d$ is the dimension of the field operator, $\Lambda$ is the cut-off scale of the effective theory, and $c$ is a dimensionless number which should be "random" and smaller than 1 at the scale where the effective theory breaks down. Further renormalization group running can reduce the value of $c$ at an energy scale $E$, but only by a small factor proportional to $\ln(E/\Lambda)$.
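As an illustration, the natural size of a coefficient can be checked by stripping off the expected power of the cut-off. The helper below is hypothetical (not from any standard library); a minimal sketch in Python:

```python
def natural_coefficient(coeff, d, cutoff):
    """Extract the dimensionless number c from an operator coefficient
    expected to scale as c * cutoff**(4 - d) for operator dimension d."""
    return coeff / cutoff ** (4 - d)

# Hypothetical dimension-6 operator suppressed by a 1 TeV (1000 GeV) cutoff:
c = natural_coefficient(coeff=5e-7, d=6, cutoff=1000.0)
# c = 5e-7 / 1000**(-2) = 0.5, an order-1 value, hence "natural"
```

A coefficient yielding $c$ many orders of magnitude away from 1 would signal either fine-tuning or a symmetry suppressing the term.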
Some parameters in the effective action of the Standard Model seem to have far smaller coefficients than required by consistency with the assumption of naturalness, leading to some of the fundamental open questions in physics. In particular:
- The smallness of the QCD "theta parameter" leads to the strong CP problem, because it is very small (experimentally consistent with zero) rather than of order of magnitude unity.
- The smallness of the Higgs mass leads to the hierarchy problem, because it is 17 orders of magnitude smaller than the Planck mass that characterizes gravity. (Equivalently, the Fermi constant characterizing the strength of the weak force is very large compared to the gravitational constant characterizing the strength of gravity.)
- The smallness of the cosmological constant leads to the cosmological constant problem, because it is at least 40 and perhaps as much as 100 or more orders of magnitude smaller than naively expected.
In addition, the coupling of the electron to the Higgs, which sets the mass of the electron, is abnormally small, as are, to a lesser extent, the couplings of the light quarks.
In models with large extra dimensions, the assumption of naturalness is violated for operators which multiply field operators that create objects which are localized at different positions in the extra dimensions.
Naturalness and the gauge hierarchy problem
A more practical definition of naturalness is that for any observable $\mathcal{O}$ which consists of $n$ independent contributions,

$$\mathcal{O} = o_1 + o_2 + \cdots + o_n,$$

all independent contributions to $\mathcal{O}$ should be comparable to or less than $\mathcal{O}$. Otherwise, if one contribution, say $o_1 \gg \mathcal{O}$, then some other independent contribution would have to be fine-tuned to a large opposite-sign value so as to maintain $\mathcal{O}$ at its measured value. Such fine-tuning is regarded as unnatural and indicative of some missing ingredient in the theory.
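This criterion can be made quantitative with a simple fine-tuning measure, here taken (as an illustrative assumption) to be the largest single contribution divided by the total:

```python
def fine_tuning(contributions):
    """Delta = max_i |o_i| / |sum_i o_i|.
    Delta of order 1 is natural; Delta >> 1 signals fine-tuning."""
    total = sum(contributions)
    return max(abs(o) for o in contributions) / abs(total)

# All contributions comparable to the total: natural.
natural = fine_tuning([0.9, -0.3, 0.5])        # about 0.8
# Two large terms cancelling to a small remainder: fine-tuned.
tuned = fine_tuning([1.0e4, -0.9999e4, 0.5])   # several thousand
```

Measures of this general type (maximum sensitivity of the observable to its individual contributions) underlie the quantitative naturalness bounds discussed below.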
For instance, in the Standard Model with Higgs potential given by

$$V = -\mu^2\,\phi^\dagger\phi + \lambda\,(\phi^\dagger\phi)^2,$$

the physical Higgs boson mass is calculated to be

$$m_h^2 = 2\mu^2 + \delta m_h^2,$$

where the quadratically divergent radiative correction is given by

$$\delta m_h^2 \simeq \frac{3}{8\pi^2}\left(-\lambda_t^2 + \lambda + \frac{g^2}{4} + \frac{g^2 + g'^2}{8}\right)\Lambda^2,$$

where $\lambda_t$ is the top-quark Yukawa coupling, $g$ is the SU(2) gauge coupling, $g'$ is the U(1) gauge coupling, and $\Lambda$ is the energy cut-off to the divergent loop integrals. As $\Lambda$ increases, $\delta m_h^2$ grows (depending on the chosen cut-off), and $\mu^2$ can be freely dialed so as to maintain $m_h$ at its measured value (now known to be 125 GeV). By insisting on naturalness, one requires $|\delta m_h^2| \lesssim m_h^2$. Solving for $\Lambda$, one finds $\Lambda \lesssim 1$ TeV. This then implies that the Standard Model as a natural effective field theory is only valid up to the 1 TeV energy scale.
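The resulting bound on $\Lambda$ can be estimated numerically. The sketch below assumes representative coupling values ($\lambda_t \approx 0.95$, $g \approx 0.65$, $g' \approx 0.36$, for illustration only) in the quadratic-divergence formula:

```python
import math

# Representative Standard Model inputs (GeV where dimensionful); the
# coupling values are approximate, chosen for illustration only.
mh, v = 125.0, 246.0
lam_t = 0.95               # top-quark Yukawa coupling
g, gp = 0.65, 0.36         # SU(2)_L and U(1)_Y gauge couplings
lam = mh**2 / (2 * v**2)   # Higgs quartic coupling

# delta_mh2 = k * Lambda**2, with k from the one-loop quadratic divergence
k = (3 / (8 * math.pi**2)) * (-lam_t**2 + lam + g**2 / 4 + (g**2 + gp**2) / 8)

# Naturalness |delta_mh2| <= mh**2 implies Lambda <= mh / sqrt(|k|)
Lambda_max = mh / math.sqrt(abs(k))   # roughly 0.8 TeV, i.e. of order 1 TeV
```

With these inputs the cut-off comes out just below 1 TeV, reproducing the order-of-magnitude statement in the text.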
Sometimes it is objected that this argument depends on the regularization scheme introducing the cut-off, and that the problem might disappear under dimensional regularization. In that case, however, if new particles which couple to the Higgs are introduced, one once again regains the quadratic divergence, now in terms of the new particles' squared masses. For instance, if see-saw neutrinos are included in the Standard Model, then $\delta m_h^2$ would blow up to near the see-saw scale, typically expected to lie around $10^{14}$ GeV.
Naturalness, supersymmetry and the little hierarchy
By supersymmetrizing the Standard Model, one arrives at a solution to the gauge hierarchy, or big hierarchy, problem in that supersymmetry guarantees cancellation of quadratic divergences to all orders in perturbation theory. The simplest supersymmetrization of the SM leads to the Minimal Supersymmetric Standard Model, or MSSM. In the MSSM, each SM particle has a partner particle known as a super-partner or sparticle. For instance, the left- and right-electron helicity components have scalar partner selectrons $\tilde{e}_L$ and $\tilde{e}_R$ respectively, whilst the eight colored gluons have eight colored spin-1/2 gluino superpartners. The MSSM Higgs sector must necessarily be expanded to include two doublets rather than one, leading to five physical Higgs particles $h$, $H$, $A$ and $H^\pm$, whilst three of the eight Higgs component fields are absorbed by the $W^\pm$ and $Z$ bosons to make them massive. The MSSM is actually supported by three different sets of measurements which test for the presence of virtual superpartners: 1. the celebrated weak scale measurements of the three gauge coupling strengths are just what is needed for gauge coupling unification at a scale $m_{GUT} \simeq 2 \times 10^{16}$ GeV, 2. the value of $m_t \simeq 173$ GeV falls squarely in the range needed to trigger a radiatively-driven breakdown of electroweak symmetry, and 3. the measured value of $m_h \simeq 125$ GeV falls within the narrow window of allowed values for the MSSM.
Nonetheless, verification of weak scale SUSY (WSS, SUSY with superpartner masses at or around the weak scale as characterized by $m_{weak} \sim 100$ GeV) requires the direct observation of at least some of the superpartners at sufficiently energetic colliding beam experiments. As of 2017, the CERN Large Hadron Collider, a collider operating at a center-of-mass energy of 13 TeV, had not found any evidence for superpartners. This has led to mass limits on the gluino of $m_{\tilde{g}} \gtrsim 2$ TeV and on the lighter top squark of $m_{\tilde{t}_1} \gtrsim 1$ TeV (within the context of certain simplified models which are assumed to make the experimental analysis more tractable). Along with these limits, the rather large measured value of $m_h \simeq 125$ GeV seems to require TeV-scale highly mixed top squarks. These combined measurements have raised concern about an emerging Little Hierarchy problem, characterized by $m_{sparticle} \gg m_{weak}$. Under the Little Hierarchy, one might expect the now log-divergent light Higgs mass to blow up to the sparticle mass scale unless one fine-tunes. The Little Hierarchy problem has led to concern that WSS is perhaps not realized in nature, or at least not in the manner typically expected by theorists in years past.
Status of naturalness and the little hierarchy
In the MSSM, the light Higgs mass is calculated to be

$$m_h^2 \simeq \mu^2 + m_{H_u}^2(\text{weak}) + (\text{mixing and loop contributions}),$$

where the mixing and loop contributions are $\lesssim m_h^2$, but where in most models the soft SUSY breaking up-Higgs mass $m_{H_u}^2$ is driven to large, TeV-scale negative values (in order to break electroweak symmetry). Then, to maintain the measured value of $m_h = 125$ GeV, one must tune the superpotential mass term $\mu^2$ to some large positive value. Alternatively, for natural SUSY, one may expect that $m_{H_u}^2$ runs to small negative values, in which case both $\mu$ and $\sqrt{|m_{H_u}^2|}$ are of order 100–200 GeV. This already leads to a prediction: since $\mu$ is supersymmetric and feeds mass to both SM particles ($W$, $Z$, $h$) and superpartners (higgsinos), it is expected from the natural MSSM that light higgsinos exist nearby to the 100–200 GeV scale. This simple realization has profound implications for WSS collider and dark matter searches.
Naturalness in the MSSM has historically been expressed in terms of the $Z$ boson mass, and indeed this approach leads to more stringent upper bounds on sparticle masses. By minimizing the (Coleman–Weinberg) scalar potential of the MSSM, one may relate the measured value of $m_Z = 91.2$ GeV to the SUSY Lagrangian parameters:

$$\frac{m_Z^2}{2} = \frac{(m_{H_d}^2 + \Sigma_d^d) - (m_{H_u}^2 + \Sigma_u^u)\tan^2\beta}{\tan^2\beta - 1} - \mu^2.$$

Here, $\tan\beta \equiv v_u/v_d$ is the ratio of Higgs field vacuum expectation values and $m_{H_d}^2$ is the down-Higgs soft breaking mass term. The $\Sigma_u^u$ and $\Sigma_d^d$ contain a variety of loop corrections, labelled by indices $i$ and $j$, the most important of which typically comes from the top squarks.
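Numerically, the minimization condition can be used to check which parameter choices reproduce $m_Z = 91.2$ GeV. The sketch below implements the tree-level relation with the loop terms $\Sigma$ set to zero; all input values are illustrative, not fits:

```python
import math

def mz_from_susy(mHu2, mHd2, mu2, tanb, sigma_u=0.0, sigma_d=0.0):
    """Tree-level electroweak minimization condition of the MSSM:
    m_Z^2/2 = [(m_Hd^2 + Sigma_d) - (m_Hu^2 + Sigma_u) * tan^2(beta)]
              / (tan^2(beta) - 1) - mu^2."""
    t2 = tanb**2
    mz2 = 2 * (((mHd2 + sigma_d) - (mHu2 + sigma_u) * t2) / (t2 - 1) - mu2)
    return math.sqrt(mz2)

# Natural-SUSY-like inputs (GeV^2): small negative m_Hu^2 and mu ~ 150 GeV.
mz = mz_from_susy(mHu2=-16392.0, mHd2=1.0e6, mu2=150.0**2, tanb=10.0)
# mz comes out close to the measured 91.2 GeV
```

With $\mu$ instead pushed to the TeV scale, maintaining $m_Z$ at its measured value would require a compensating large negative $m_{H_u}^2$, which is the fine-tuning the naturalness argument disfavors.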
In the renowned review by H. P. Nilles, "Supersymmetry, Supergravity and Particle Physics", published in Physics Reports 110 (1984) 1–162, one finds the sentence: "Experiments within the next five to ten years will enable us to decide whether supersymmetry as a solution of the naturalness problem of the weak interaction scale is a myth or a reality."
- Fowlie, Andrew; Balazs, Csaba; White, Graham; Marzola, Luca; Raidal, Martti (17 August 2016). "Naturalness of the relaxion mechanism". Journal of High Energy Physics. 2016 (8): 100. arXiv:1602.03889. Bibcode:2016JHEP...08..100F. doi:10.1007/JHEP08(2016)100. S2CID 119102534.
- Fowlie, Andrew (10 July 2014). "CMSSM, naturalness and the 'fine-tuning price' of the Very Large Hadron Collider". Physical Review D. 90 (1): 015010. arXiv:1403.3407. Bibcode:2014PhRvD..90a5010F. doi:10.1103/PhysRevD.90.015010. S2CID 118362634.
- Fowlie, Andrew (15 October 2014). "Is the CNMSSM more credible than the CMSSM?". The European Physical Journal C. 74 (10). arXiv:1407.7534. doi:10.1140/epjc/s10052-014-3105-y. S2CID 119304794.
- Cabrera, Maria Eugenia; Casas, Alberto; Austri, Roberto Ruiz de (2009). "Bayesian approach and naturalness in MSSM analyses for the LHC". Journal of High Energy Physics. 2009 (3): 075. arXiv:0812.0536. Bibcode:2009JHEP...03..075C. doi:10.1088/1126-6708/2009/03/075. S2CID 18276270.
- Fichet, S. (18 December 2012). "Quantified naturalness from Bayesian statistics". Physical Review D. 86 (12): 125029. arXiv:1204.4940. Bibcode:2012PhRvD..86l5029F. doi:10.1103/PhysRevD.86.125029. S2CID 119282331.
- N. Seiberg (1993). "Naturalness versus supersymmetric non-renormalization theorems". Physics Letters B. 318 (3): 469–475. arXiv:hep-ph/9309335. Bibcode:1993PhLB..318..469S. doi:10.1016/0370-2693(93)91541-T. S2CID 14683964.
- N. Arkani-Hamed, M. Schmaltz (2000). "Hierarchies without Symmetries from Extra Dimensions". Physical Review D. 61 (3): 033005. arXiv:hep-ph/9903417. Bibcode:2000PhRvD..61c3005A. doi:10.1103/PhysRevD.61.033005. S2CID 18030407.
- 't Hooft, G. (1980). "Naturalness, Chiral Symmetry and Spontaneous Chiral Symmetry Breaking". In 't Hooft, G. (ed.). Recent Developments in Gauge Theories. Plenum Press. ISBN 978-0-306-40479-5.
- Giudice, G. (2008). "Naturally Speaking: The Naturalness Criterion and Physics at the LHC". In Kane, G. (ed.). Perspectives on LHC physics. World Scientific. arXiv:0801.2562. Bibcode:2008plnc.book..155G. doi:10.1142/9789812779762_0010. ISBN 978-9812833891. S2CID 15078813.
- Sabine Hossenfelder (2018). Lost in Math: How Beauty Leads Physics Astray, Basic Books.
- Burton Richter, Is "naturalness" unnatural? Invited talk presented at SUSY06: 14th International Conference On Supersymmetry And The Unification Of Fundamental Interactions 6/12/2006—6/17/2006