In physics, naturalness is the aesthetic property that the dimensionless ratios between free parameters or physical constants appearing in a physical theory should take values "of order 1" and that free parameters should not be fine-tuned. That is, a natural theory would have parameter ratios with values like 2.34 rather than 234000 or 0.000234. The requirement that satisfactory theories should be "natural" in this sense is a current of thought initiated around the 1960s in particle physics. It is a criterion that arises from the seeming non-naturalness of the Standard Model and the broader topics of the hierarchy problem, fine-tuning, and the anthropic principle.

However, naturalness does tend to suggest possible areas of weakness or future development for current theories such as the Standard Model, where some parameters vary by many orders of magnitude and which require extensive "fine-tuning" to their current values. The concern is that it is not yet clear whether the seemingly exact values we currently recognize have arisen by chance (based upon the anthropic principle or something similar) or whether they arise from a more advanced theory not yet developed, in which they would turn out to be expected and well explained because of other factors not yet part of particle physics models.

The concept of naturalness is not always compatible with Occam's razor, since many instances of "natural" theories have more parameters than "fine-tuned" theories such as the Standard Model. Naturalness in physics is closely related to the issue of fine-tuning, and over the past decade many scientists have argued that the principle of naturalness is a specific application of Bayesian statistics. In the history of particle physics, the naturalness principle has given correct predictions three times: in the case of the electron self-energy, the pion mass difference and the kaon mass difference.


Overview

A simple example: Suppose a physics model requires four parameters to produce a very high-quality working model, calculations, and predictions of some aspect of our physical universe. Suppose we find through experiments that the parameters have the values:

* 1.2
* 1.31
* 0.9
* 404,331,557,902,116,024,553,602,703,216.58 (roughly 4 × 10^29)

We might wonder how such figures arise. But in particular we might be especially curious about a theory where three values are close to one and the fourth is so different; in other words, about the huge disproportion we seem to find between the first three parameters and the fourth. We might also wonder, if these values represent the strengths of forces and one force is so much larger than the others that it needs a factor of 4 × 10^29 to relate it to them in terms of effects, how our universe came to be so exactly balanced when its forces emerged. In current particle physics the differences between some parameters are much larger than this, so the question is even more noteworthy.
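The disproportion can be stated numerically. The following Python sketch (the toy parameter values are the ones above; the loop and formatting choices are ours, purely for illustration) prints the order of magnitude of each pairwise ratio:

```python
# Illustrative sketch: how far each dimensionless ratio of the toy parameters is from order 1.
import math
from itertools import combinations

params = [1.2, 1.31, 0.9, 4.04e29]  # the four toy parameter values from the text

for a, b in combinations(params, 2):
    ratio = a / b
    print(f"{a:g} / {b:g} = {ratio:.3g}  (~10^{math.log10(ratio):+.1f})")
# The first three ratios are of order one; any ratio involving the fourth
# parameter is roughly 30 orders of magnitude away from 1 -- the naturalness puzzle.
```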
One answer given by some physicists is the anthropic principle. If the universe came to exist by chance, and perhaps vast numbers of other universes exist or have existed, then life capable of performing physics experiments only arose in universes that, by chance, had very balanced forces. All the universes where the forces were not balanced did not develop life capable of asking the question. So if a lifeform such as a human being asks such a question, it must have arisen in a universe having balanced forces, however rare that might be; when we look, that is what we would expect to find, and what we do find. A second answer is that perhaps there is a deeper understanding of physics which, if we discovered and understood it, would make clear that these are not really fundamental parameters and that there is a good reason why they have the exact values we have found: they all derive from other, more fundamental parameters that are not so unbalanced.


Introduction

In particle physics, the assumption of naturalness means that, unless a more detailed explanation exists, all conceivable terms in the effective action that preserve the required symmetries should appear in this effective action with natural coefficients. In an effective field theory, \Lambda is the cutoff scale, an energy or length scale at which the theory breaks down. Due to dimensional analysis, natural coefficients have the form

:h = c \Lambda^{4-d},

where d is the dimension of the field operator and c is a dimensionless number which should be "random" and smaller than 1 at the scale where the effective theory breaks down. Further renormalization group running can reduce the value of c at an energy scale E, but only by a small factor proportional to \ln(E/\Lambda).

Some parameters in the effective action of the Standard Model seem to have far smaller coefficients than required by consistency with the assumption of naturalness, leading to some of the fundamental open questions in physics. In particular:

* The naturalness of the QCD "theta parameter" leads to the strong CP problem, because it is very small (experimentally consistent with zero) rather than of order of magnitude unity.
* The naturalness of the Higgs mass leads to the hierarchy problem, because it is 17 orders of magnitude smaller than the Planck mass that characterizes gravity. (Equivalently, the Fermi constant characterizing the strength of the weak force is very large compared to the gravitational constant characterizing the strength of gravity.)
* The naturalness of the cosmological constant leads to the cosmological constant problem, because it is at least 40 and perhaps as much as 100 or more orders of magnitude smaller than naively expected.

In addition, the coupling of the electron to the Higgs, i.e. the mass of the electron, is abnormally small, as are, to a lesser extent, the masses of the light quarks. In models with large extra dimensions, the assumption of naturalness is violated for operators which multiply field operators that create objects localized at different positions in the extra dimensions.
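The dimensional-analysis estimate can be made concrete with a short numerical sketch. In the following Python snippet the helper function and example operators are ours, chosen purely for illustration, and the Planck-scale value is approximate:

```python
# Illustrative sketch (not from the article): natural coefficient h ~ c * Lambda^(4-d).
def natural_coefficient(c, operator_dim, cutoff_gev):
    """Natural size of the coefficient of an operator of mass dimension operator_dim,
    given an O(1) dimensionless number c and a cutoff Lambda in GeV."""
    return c * cutoff_gev ** (4 - operator_dim)

planck_gev = 1.2e19  # Planck scale in GeV (rough value, assumed here)

# Dimension-2 operator (a scalar mass term): natural coefficient ~ Lambda^2.
natural_mass_sq = natural_coefficient(1.0, 2, planck_gev)
observed_higgs_mass_sq = 125.0 ** 2  # GeV^2
print("implied |c| for the Higgs mass term:",
      observed_higgs_mass_sq / natural_mass_sq)   # ~1e-34, very far from order 1

# Dimension-6 operator: natural coefficient ~ 1/Lambda^2 (suppressed, not fine-tuned).
print("dimension-6 coefficient at Lambda = 1 TeV:",
      natural_coefficient(1.0, 6, 1000.0), "GeV^-2")
```

Taking the cutoff at the Planck scale, an order-one c would give a scalar mass squared some 34 orders of magnitude larger than the observed Higgs value, which is the naturalness tension of the hierarchy problem restated numerically.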


Naturalness and the gauge hierarchy problem

A more practical definition of naturalness is that for any observable O which consists of n independent contributions

:O = a_1 + \cdots + a_n,

all ''independent'' contributions to O should be comparable to or less than O. Otherwise, if one contribution, say a_1 \gg O, then some other independent contribution would have to be fine-tuned to a large opposite-sign value so as to maintain O at its measured value. Such fine-tuning is regarded as unnatural and indicative of some missing ingredient in the theory.

For instance, in the Standard Model with the Higgs potential given by

: V = -\mu^2\phi^\dagger\phi + \lambda (\phi^\dagger\phi)^2

the physical Higgs boson mass is calculated to be

: m_h^2 = 2\mu^2 + \delta m_h^2

where the quadratically divergent radiative correction is given by

: \delta m_h^2 \simeq \frac{3}{8\pi^2}\Bigl( -\lambda_t^2 + \frac{g^2}{4} + \frac{g^2}{8\cos^2\theta_W} + \lambda \Bigr)\Lambda^2 ,

where \lambda_t is the top-quark Yukawa coupling, g is the SU(2) gauge coupling and \Lambda is the energy cut-off on the divergent loop integrals. As \delta m_h^2 increases (depending on the chosen cut-off \Lambda), \mu^2 can be freely dialed so as to maintain m_h at its measured value (now known to be m_h \simeq 125 GeV). By insisting on naturalness, one requires |\delta m_h^2| \lesssim m_h^2. Solving for \Lambda, one finds \Lambda \lesssim 1 TeV. This then implies that the Standard Model, regarded as a natural effective field theory, is only valid up to the 1 TeV energy scale.

Sometimes it is objected that this argument depends on the regularization scheme that introduces the cut-off \Lambda, and that the problem might disappear under dimensional regularization. However, if new particles which couple to the Higgs are introduced, one once again regains the quadratic sensitivity, now expressed in terms of the new particles' squared masses. For instance, if one includes see-saw neutrinos in the Standard Model, then \delta m_h^2 would blow up to near the see-saw scale, typically expected to be around 10^{14} GeV.
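As a rough numerical check of the statement that naturalness caps the cutoff near 1 TeV, the following Python sketch evaluates the quadratic correction quoted above; the input masses and couplings are approximate measured values assumed for illustration, and the helper name is ours:

```python
import math

# Approximate inputs in GeV (assumptions for illustration only).
v, m_t, m_h, m_W, m_Z = 246.0, 173.0, 125.0, 80.4, 91.2

lam_t   = math.sqrt(2) * m_t / v      # top Yukawa coupling
g_sq    = 4 * m_W**2 / v**2           # SU(2) gauge coupling squared
cos2_tw = m_W**2 / m_Z**2             # cos^2(theta_W)
lam     = m_h**2 / (2 * v**2)         # Higgs quartic coupling

def delta_mh_sq(cutoff_gev):
    """One-loop quadratically divergent correction to m_h^2 for cutoff Lambda."""
    bracket = -lam_t**2 + g_sq / 4 + g_sq / (8 * cos2_tw) + lam
    return 3.0 / (8 * math.pi**2) * bracket * cutoff_gev**2

print(f"delta m_h^2 at Lambda = 10 TeV: {delta_mh_sq(1e4):.2e} GeV^2")

# Naturalness: require |delta m_h^2| <~ m_h^2, then solve for the cutoff.
lambda_natural = math.sqrt(m_h**2 / abs(delta_mh_sq(1.0)))
print(f"cutoff where |delta m_h^2| = m_h^2: about {lambda_natural/1e3:.1f} TeV")
```

With these inputs the top-quark term dominates, the correction is negative, and the cutoff at which the correction equals the physical Higgs mass squared comes out a little below 1 TeV, in line with the estimate in the text.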


MSSM and the little hierarchy


Overview

By supersymmetrizing the Standard Model, one arrives at a solution to the gauge hierarchy, or big hierarchy, problem in that supersymmetry guarantees cancellation of quadratic divergences to all orders in perturbation theory. The simplest supersymmetrization of the SM leads to the Minimal Supersymmetric Standard Model, or MSSM. In the MSSM, each SM particle has a partner particle known as a super-partner or sparticle. For instance, the left and right electron helicity components have scalar partner selectrons \tilde{e}_L and \tilde{e}_R respectively, whilst the eight colored gluons have eight colored spin-1/2 gluino superpartners. The MSSM Higgs sector must necessarily be expanded to include two doublets rather than one, leading to five physical Higgs particles h, H, A and H^\pm, whilst three of the eight Higgs component fields are absorbed by the W^\pm and Z bosons to make them massive.

The MSSM is actually supported by three different sets of measurements which test for the presence of virtual superpartners: 1. the celebrated weak scale measurements of the three gauge coupling strengths are just what is needed for gauge coupling unification at a scale Q \simeq 2\times 10^{16} GeV, 2. the value of m_t \simeq 173 GeV falls squarely in the range needed to trigger a radiatively-driven breakdown of electroweak symmetry, and 3. the measured value of m_h \simeq 125 GeV falls within the narrow window of values allowed by the MSSM.

Nonetheless, verification of weak scale SUSY (WSS, SUSY with superpartner masses at or around the weak scale as characterized by m(W,Z,h) \sim 100 GeV) requires the direct observation of at least some of the superpartners at sufficiently energetic colliding beam experiments. As of 2017, the CERN Large Hadron Collider, a pp collider operating at a center-of-mass energy of 13 TeV, had not found any evidence for superpartners. This has led to mass limits on the gluino, m_{\tilde g} > 2 TeV, and on the lighter top squark, m_{\tilde t_1} > 1 TeV (within the context of certain simplified models which are assumed to make the experimental analysis more tractable). Along with these limits, the rather large measured value of m_h \simeq 125 GeV seems to require TeV-scale, highly mixed top squarks. These combined measurements have raised concern about an emerging Little Hierarchy problem characterized by m(W,Z,h) \ll m(\mathrm{sparticle}). Under the Little Hierarchy, one might expect the now log-divergent light Higgs mass to blow up to the sparticle mass scale unless one fine-tunes. The Little Hierarchy problem has led to concern that WSS is perhaps not realized in nature, or at least not in the manner typically expected by theorists in years past.
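The first of these indirect hints, gauge coupling unification, can be illustrated with a short one-loop running sketch. The following Python snippet uses the standard one-loop MSSM beta-function coefficients and approximate measured couplings at the Z mass; all numerical inputs are assumptions chosen for illustration, not values quoted in the article:

```python
import math

m_Z = 91.2                      # GeV
alpha_em_inv = 127.9            # 1/alpha_em at m_Z (approx.)
sin2_tw      = 0.231            # weak mixing angle sin^2(theta_W) (approx.)
alpha_s      = 0.118            # strong coupling at m_Z (approx.)

# Inverse couplings at m_Z, with the SU(5)/GUT normalization for U(1)_Y.
a1_inv = (3.0 / 5.0) * alpha_em_inv * (1.0 - sin2_tw)
a2_inv = alpha_em_inv * sin2_tw
a3_inv = 1.0 / alpha_s

# One-loop MSSM beta-function coefficients (b1, b2, b3).
b = (33.0 / 5.0, 1.0, -3.0)

def run(alpha_inv, b_i, q_gev):
    """One-loop running of 1/alpha_i from m_Z up to scale Q."""
    return alpha_inv - b_i / (2 * math.pi) * math.log(q_gev / m_Z)

# Scale where alpha_1 and alpha_2 meet, and the values of the couplings there.
log_q = 2 * math.pi * (a1_inv - a2_inv) / (b[0] - b[1])
q_unif = m_Z * math.exp(log_q)
print(f"alpha_1 = alpha_2 at Q ~ {q_unif:.2e} GeV")
print(f"1/alpha_1 there: {run(a1_inv, b[0], q_unif):.1f}, "
      f"1/alpha_3 there: {run(a3_inv, b[2], q_unif):.1f}")
```

With these inputs the two electroweak couplings meet near 2 × 10^16 GeV and the strong coupling arrives at essentially the same value, which is the unification pattern referred to in the text.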


Status

In the MSSM, the light Higgs mass is calculated to be

: m_h^2 \simeq \mu^2 + m_{H_u}^2 + \mbox{mixing} + \mbox{loop corrections},

where the mixing and loop contributions are < m_h^2 but where, in most models, the soft SUSY breaking up-Higgs mass m_{H_u}^2 is driven to large, TeV-scale negative values (in order to break electroweak symmetry). Then, to maintain the measured value of m_h = 125 GeV, one must tune the superpotential mass term \mu^2 to a compensating large positive value. Alternatively, for natural SUSY, one may expect that m_{H_u}^2 runs only to small negative values, in which case both \mu and |m_{H_u}| are of order 100-200 GeV. This already leads to a prediction: since \mu is supersymmetric and feeds mass to both SM particles (W, Z, h) and superpartners (higgsinos), it is expected from the natural MSSM that light higgsinos exist near the 100-200 GeV scale. This simple realization has profound implications for WSS collider and dark matter searches.

Naturalness in the MSSM has historically been expressed in terms of the Z boson mass, and indeed this approach leads to more stringent upper bounds on sparticle masses. By minimizing the (Coleman-Weinberg) scalar potential of the MSSM, one may relate the measured value of m_Z = 91.2 GeV to the SUSY Lagrangian parameters:

: \frac{m_Z^2}{2} = \frac{m_{H_d}^2 + \Sigma_d^d - (m_{H_u}^2 + \Sigma_u^u)\tan^2\beta}{\tan^2\beta - 1} - \mu^2 \simeq -m_{H_u}^2 - \Sigma_u^u(i) - \mu^2 .

Here, \tan\beta \sim 5-50 is the ratio of Higgs field vacuum expectation values v_u/v_d and m_{H_d}^2 is the down-Higgs soft breaking mass term. The \Sigma_d^d(i) and \Sigma_u^u(j) contain a variety of loop corrections labelled by indices i and j, the most important of which typically come from the top squarks.

In the renowned review work of H. P. Nilles, "Supersymmetry, Supergravity and Particle Physics", published in Phys. Rept. 110 (1984) 1-162, one finds the sentence "Experiments within the next five to ten years will enable us to decide whether supersymmetry as a solution of the naturalness problem of the weak interaction scale is a myth or a reality".
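The practical naturalness criterion from the earlier sections, applied to the m_Z relation above, is often packaged as an electroweak fine-tuning measure: the largest individual contribution to m_Z^2/2, divided by m_Z^2/2 itself. A minimal Python sketch follows; the function name and the sample parameter points are ours and purely illustrative, and the loop corrections \Sigma are set to zero for simplicity:

```python
m_Z = 91.2  # GeV

def delta_EW(m_Hu_sq, m_Hd_sq, mu, tan_beta, sigma_u=0.0, sigma_d=0.0):
    """Largest |contribution| to m_Z^2/2, divided by m_Z^2/2 (smaller = more natural)."""
    t2 = tan_beta**2
    contributions = {
        "m_Hd^2 term": (m_Hd_sq + sigma_d) / (t2 - 1.0),
        "m_Hu^2 term": -(m_Hu_sq + sigma_u) * t2 / (t2 - 1.0),
        "-mu^2 term":  -mu**2,
    }
    return max(abs(c) for c in contributions.values()) / (m_Z**2 / 2.0)

# A "natural SUSY" point: small |m_Hu| and mu near the weak scale.
print(delta_EW(m_Hu_sq=-150.0**2, m_Hd_sq=300.0**2, mu=150.0, tan_beta=10.0))
# A point with mu and |m_Hu| at the TeV scale: strongly fine-tuned.
print(delta_EW(m_Hu_sq=-2000.0**2, m_Hd_sq=300.0**2, mu=2000.0, tan_beta=10.0))
```

The first point returns a fine-tuning of order a few, while the TeV-scale point returns a value near a thousand, illustrating why natural SUSY expects higgsinos (whose mass is set by \mu) in the 100-200 GeV range.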


See also

* Fine-tuning
* Hierarchy problem
* Large extra dimensions
* Split supersymmetry
* Weak gravity conjecture


References


Further reading

* Sabine Hossenfelder (2018). ''Lost in Math: How Beauty Leads Physics Astray'', Basic Books.
* Burton Richter, "Is 'naturalness' unnatural?", invited talk presented at SUSY06: 14th International Conference on Supersymmetry and the Unification of Fundamental Interactions, June 12-17, 2006.