In physics, naturalness is the aesthetic property that the dimensionless ratios between free parameters or physical constants appearing in a physical theory should take values "of order 1" and that free parameters are not fine-tuned. That is, a natural theory would have parameter ratios with values like 2.34 rather than 234000 or 0.000234.
The requirement that satisfactory theories should be "natural" in this sense is a current of thought initiated around the 1960s in particle physics. It is a criterion that arises from the seeming non-naturalness of the Standard Model and the broader topics of the hierarchy problem, fine-tuning, and the anthropic principle. It also tends to suggest a possible area of weakness or future development for current theories such as the Standard Model, where some parameters vary by many orders of magnitude and require extensive "fine-tuning" of their current values. The concern is that it is not yet clear whether these seemingly exact values have arisen by chance (for example, via the anthropic principle) or whether they follow from a more advanced theory not yet developed, in which they turn out to be expected and well explained because of factors not yet part of particle-physics models.
The concept of naturalness is not always compatible with Occam's razor, since many instances of "natural" theories have more parameters than "fine-tuned" theories such as the Standard Model. Naturalness in physics is closely related to the issue of fine-tuning, and many scientists have argued that the principle of naturalness is a specific application of Bayesian statistics.
In the history of particle physics, the naturalness principle has given correct predictions three times: in the cases of the electron self-energy, the pion mass difference, and the kaon mass difference.
Overview
A simple example:
Suppose a physics model requires four parameters which allow it to produce a very high quality working model, calculations, and predictions of some aspect of our physical universe. Suppose we find through experiments that the parameters have values:
* 1.2
* 1.31
* 0.9 and
* 404,331,557,902,116,024,553,602,703,216.58 (roughly 4 x 10
29).
We might wonder how such figures arise. In particular, we might be especially curious about a theory where three values are close to one and the fourth is so different; in other words, about the huge disproportion between the first three parameters and the fourth. We might also wonder: if these values represent the strengths of forces, and one force is so much larger than the others that it needs a factor of 4 × 10^29 to be related to them in terms of effects, how did our universe come to be so exactly balanced when its forces emerged? In current particle physics the differences between some parameters are much larger than this, so the question is even more noteworthy.
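The notion of "order 1" ratios in the example above can be sketched numerically. This is a minimal illustration only: the parameter values come from the example, while the naturalness threshold of 10^2 is an arbitrary assumption chosen for demonstration.

```python
# Flag parameter ratios that are far from "order 1". The threshold is an
# illustrative assumption, not a physical criterion.
params = [1.2, 1.31, 0.9, 4.04331557902116e29]

def is_natural(ratio: float, threshold: float = 1e2) -> bool:
    """A ratio counts as 'natural' if both it and its inverse are below the threshold."""
    return max(ratio, 1.0 / ratio) < threshold

for i, a in enumerate(params):
    for j, b in enumerate(params):
        if i < j and not is_natural(a / b):
            print(f"unnatural ratio: p{i+1}/p{j+1} = {a / b:.3e}")
```

Running this flags exactly the three ratios involving the fourth parameter, mirroring the disproportion discussed above.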
One answer given by some physicists is the anthropic principle. If the universe came to exist by chance, and perhaps vast numbers of other universes exist or have existed, then life capable of physics experiments only arose in universes that by chance had very balanced forces. All the universes where the forces were not balanced did not develop life capable of asking the question. So if a lifeform such as a human being asks such a question, it must have arisen in a universe having balanced forces, however rare that might be. So when we look, that is what we would expect to find, and what we do find.
A second answer is that perhaps there is a deeper understanding of physics which, if discovered and understood, would make clear that these are not really fundamental parameters, and that there is a good reason why they have the exact values we have found: they all derive from other, more fundamental parameters that are not so unbalanced.
Introduction
In particle physics, the assumption of naturalness means that, unless a more detailed explanation exists, all conceivable terms in the effective action that preserve the required symmetries should appear in this effective action with natural coefficients.
In an effective field theory, Λ is the cutoff scale, an energy or length scale at which the theory breaks down. Due to dimensional analysis, natural coefficients have the form
: c Λ^(4−d),
where d is the dimension of the field operator and c is a dimensionless number which should be "random" and smaller than 1 at the scale where the effective theory breaks down. Further renormalization group running can reduce the value of c at an energy scale E, but only by a small factor proportional to ln(E/Λ).
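The dimensional-analysis estimate above can be sketched numerically. This is a rough illustration: the cutoff value of 1 TeV and the choice c = 1 are assumptions made only to show the scaling with operator dimension.

```python
# Natural size of an effective-operator coefficient: c * Lambda**(4 - d),
# with c a dimensionless number of order 1. Values here are illustrative.
CUTOFF = 1e3  # cutoff scale Lambda in GeV (assumed, ~1 TeV)

def natural_coefficient(dim: int, c: float = 1.0) -> float:
    """Natural coefficient (in GeV**(4 - dim)) of an operator of dimension dim."""
    return c * CUTOFF ** (4 - dim)

# A dimension-6 operator is naturally suppressed by 1/Lambda**2, while a
# dimension-2 operator (e.g. a scalar mass-squared term) is enhanced by Lambda**2.
print(natural_coefficient(6))  # ~1e-06 GeV^-2
print(natural_coefficient(2))  # ~1e+06 GeV^2
```

The dimension-2 case is the heart of the hierarchy problem discussed below: a scalar mass term naturally wants to sit at the cutoff scale.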
Some parameters in the effective action of the Standard Model seem to have far smaller coefficients than required by consistency with the assumption of naturalness, leading to some of the fundamental open questions in physics. In particular:
* The naturalness of the QCD "theta parameter" leads to the strong CP problem, because it is very small (experimentally consistent with zero) rather than of order of magnitude unity.
* The naturalness of the Higgs mass leads to the hierarchy problem, because it is 17 orders of magnitude smaller than the Planck mass that characterizes gravity. (Equivalently, the Fermi constant characterizing the strength of the weak force is very large compared to the gravitational constant characterizing the strength of gravity.)
* The naturalness of the cosmological constant leads to the cosmological constant problem, because it is at least 40 and perhaps as much as 100 or more orders of magnitude smaller than naively expected.
In addition, the coupling of the electron to the Higgs, which sets the mass of the electron, is abnormally small, as are, to a lesser extent, the couplings of the light quarks.
In models with large extra dimensions, the assumption of naturalness is violated for operators that multiply field operators creating objects localized at different positions in the extra dimensions.
Naturalness and the gauge hierarchy problem
A more practical definition of naturalness is that for any observable o which consists of n independent contributions
: o = o_1 + o_2 + ... + o_n,
then all ''independent'' contributions to o should be comparable to or less than o. Otherwise, if one contribution is much larger, say o_1 ≫ o, then some other independent contribution would have to be fine-tuned to a large opposite-sign value so as to maintain o at its measured value. Such fine-tuning is regarded as unnatural and indicative of some missing ingredient in the theory.
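This definition suggests a simple numerical fine-tuning measure: the largest independent contribution divided by the observable itself, so that values near 1 are natural and large values signal delicate cancellations. The measure shown here follows that common convention, and the sample numbers are purely illustrative assumptions.

```python
# Fine-tuning measure for an observable o built from independent contributions:
# Delta = max_i |o_i| / |o|. Natural: Delta of order 1 or less.
def fine_tuning_measure(contributions: list[float]) -> float:
    total = sum(contributions)
    return max(abs(c) for c in contributions) / abs(total)

natural = [0.4, 0.35, 0.25]        # all pieces comparable to the total
tuned = [1.0e6, -9.99999e5, 0.1]   # two huge opposite-sign pieces nearly cancel

print(fine_tuning_measure(natural))  # below 1: natural
print(fine_tuning_measure(tuned))    # very large: severely fine-tuned
```

In the second case the observable is tiny only because two enormous contributions cancel to many decimal places, which is exactly the situation the naturalness criterion disfavors.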
For instance, in the Standard Model with Higgs potential given by
: V = −μ² φ†φ + λ (φ†φ)²,
the physical Higgs boson mass is calculated to be
: m_h² = 2μ² + δm_h²,
where the quadratically divergent radiative correction is given by
: δm_h² ≃ (3/(4π²)) [ −λ_t² + g²/4 + g²/(8 cos²θ_W) + λ ] Λ²,
where λ_t is the top-quark Yukawa coupling, g is the SU(2) gauge coupling, θ_W is the weak mixing angle, and Λ is the energy cut-off to the divergent loop integrals. As δm_h² increases (depending on the chosen cut-off Λ), μ² can be freely dialed so as to maintain m_h at its measured value (now known to be 125 GeV). By insisting on naturalness, δm_h² should instead be comparable to or smaller than m_h², which limits the cut-off Λ to roughly the TeV scale.
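The size of this quadratically divergent correction can be sketched numerically using the one-loop form δm_h² ≃ (3/(4π²))[−λ_t² + g²/4 + g²/(8 cos²θ_W) + λ]Λ². All coupling values below are rough weak-scale numbers assumed for illustration; the point is only the quadratic growth of the required tuning with the cutoff.

```python
import math

# Rough weak-scale coupling values (assumptions for illustration only).
LAMBDA_T = 0.95      # top-quark Yukawa coupling
G_SU2 = 0.65         # SU(2) gauge coupling
LAMBDA_H = 0.13      # Higgs quartic coupling
COS2_THETA_W = 0.77  # cos^2 of the weak mixing angle
M_H = 125.0          # measured Higgs boson mass in GeV

def delta_mh2(cutoff_gev: float) -> float:
    """One-loop quadratically divergent Higgs mass correction (GeV^2)."""
    bracket = (-LAMBDA_T**2 + G_SU2**2 / 4
               + G_SU2**2 / (8 * COS2_THETA_W) + LAMBDA_H)
    return 3.0 / (4 * math.pi**2) * bracket * cutoff_gev**2

def fine_tuning(cutoff_gev: float) -> float:
    """|delta m_h^2| / m_h^2 -- naturalness asks for this to be of order 1 or less."""
    return abs(delta_mh2(cutoff_gev)) / M_H**2

for cutoff in (1e3, 1e4, 1e16):  # ~1 TeV, ~10 TeV, roughly a GUT-scale cutoff
    print(f"Lambda = {cutoff:.0e} GeV: fine-tuning ~ {fine_tuning(cutoff):.2e}")
```

Because the correction scales as Λ², pushing the cutoff from 1 TeV to 10 TeV already multiplies the required cancellation by a factor of 100, which is why naturalness arguments point to new physics near the TeV scale.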