In physics, naturalness is the aesthetic property that the dimensionless ratios between free parameters or physical constants appearing in a physical theory should take values "of order 1" and that free parameters are not fine-tuned. That is, a natural theory would have parameter ratios with values like 2.34 rather than 234000 or 0.000234.
The requirement that satisfactory theories should be "natural" in this sense is a current of thought initiated around the 1960s in particle physics. It is a criterion that arises from the seeming non-naturalness of the Standard Model and the broader topics of the hierarchy problem, fine-tuning, and the anthropic principle. However, it does tend to suggest a possible area of weakness or future development for current theories such as the Standard Model, where some parameters vary by many orders of magnitude, and which require extensive "fine-tuning" of their current values in the models concerned. The concern is that it is not yet clear whether these seemingly exact values have arisen by chance (based upon the anthropic principle or similar) or whether they arise from a more advanced theory, not yet developed, in which they turn out to be expected and well-explained because of other factors not yet part of particle physics models.
The concept of naturalness is not always compatible with Occam's razor, since many instances of "natural" theories have more parameters than "fine-tuned" theories such as the Standard Model. Naturalness in physics is closely related to the issue of fine-tuning, and over the past decade many scientists have argued that the principle of naturalness is a specific application of Bayesian statistics.
In the history of particle physics, the naturalness principle has given correct predictions three times: in the case of the electron self-energy, the pion mass difference, and the kaon mass difference.
Overview
A simple example:
Suppose a physics model requires four parameters which allow it to produce a very high quality working model, calculations, and predictions of some aspect of our physical universe. Suppose we find through experiments that the parameters have values:
* 1.2
* 1.31
* 0.9 and
* 404,331,557,902,116,024,553,602,703,216.58 (roughly 4 × 10^29).
We might wonder how such figures arise. But in particular we might be especially curious about a theory where three values are close to one and the fourth is so different; in other words, the huge disproportion we seem to find between the first three parameters and the fourth. We might also wonder: if these values represent the strengths of forces, and one force is so much larger than the others that it needs a factor of 4 × 10^29 to relate it to them in terms of effects, how did our universe come to be so exactly balanced when its forces emerged? In current particle physics the differences between some parameters are much larger than this, so the question is even more noteworthy.
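The disproportion described above can be made concrete with a few lines of arithmetic. The sketch below uses the hypothetical parameter values from the example; the order-of-magnitude cutoff is a simple illustrative heuristic, not a formal criterion.

```python
import math

# Hypothetical parameter values from the example above.
params = [1.2, 1.31, 0.9, 4e29]

# A simple heuristic: a ratio is "natural" if its base-10 logarithm
# is near zero, i.e. the ratio itself is of order 1.
for i, a in enumerate(params):
    for j, b in enumerate(params):
        if i < j:
            ratio = a / b
            order = math.log10(ratio)
            natural = abs(order) < 2  # within a couple of orders of magnitude
            print(f"p{i+1}/p{j+1} = {ratio:.3g} (10^{order:+.1f}) natural? {natural}")
```

Every ratio among the first three parameters is of order 1, while any ratio involving the fourth differs by nearly thirty orders of magnitude.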
One answer given by some physicists is the anthropic principle. If the universe came to exist by chance, and perhaps vast numbers of other universes exist or have existed, then life capable of physics experiments only arose in universes that by chance had very balanced forces. All the universes where the forces were not balanced did not develop life capable of asking the question. So if a lifeform like a human being asks such a question, it must have arisen in a universe having balanced forces, however rare that might be. So when we look, that is what we would expect to find, and what we do find.
A second answer is that perhaps there is a deeper understanding of physics which, if we discovered and understood it, would make clear that these are not really fundamental parameters, and that there is a good reason why they have the exact values we have found: they all derive from other, more fundamental parameters that are not so unbalanced.
Introduction
In particle physics, the assumption of naturalness means that, unless a more detailed explanation exists, all conceivable terms in the effective action that preserve the required symmetries should appear in this effective action with natural coefficients.
In an effective field theory, Λ is the cutoff scale, an energy or length scale at which the theory breaks down. Due to dimensional analysis, natural coefficients have the form
:c = d Λ^(4 − D),
where D is the dimension of the field operator and d is a dimensionless number which should be "random" and smaller than 1 at the scale where the effective theory breaks down. Further renormalization group running can reduce the value of c at an energy scale E, but only by a small factor proportional to ln(E/Λ).
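The dimensional-analysis estimate above can be illustrated numerically. The sketch below evaluates c = d Λ^(4 − D) for a few operator dimensions, taking d = 1; the cutoff value of 1 TeV is an arbitrary choice made only for illustration.

```python
# Natural size of a coefficient c = d * Lambda**(4 - D) for an operator
# of mass dimension D in an effective field theory with cutoff Lambda.
# Units: GeV throughout; Lambda = 1000 GeV (1 TeV) is an arbitrary choice.

def natural_coefficient(D, Lambda=1000.0, d=1.0):
    """Dimensional-analysis estimate of a natural coefficient."""
    return d * Lambda ** (4 - D)

for D in (2, 4, 6):
    c = natural_coefficient(D)
    print(f"operator dimension D={D}: natural coefficient ~ {c:g} GeV^{4 - D}")
```

A dimension-2 operator (such as a scalar mass term) naturally carries a coefficient of order Λ², a dimension-4 operator a dimensionless coefficient of order 1, and a dimension-6 operator a coefficient suppressed by 1/Λ².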
Some parameters in the effective action of the Standard Model seem to have far smaller coefficients than required by consistency with the assumption of naturalness, leading to some of the fundamental open questions in physics. In particular:
* The naturalness of the QCD "theta parameter" leads to the strong CP problem, because it is very small (experimentally consistent with "zero") rather than of order of magnitude unity.
* The naturalness of the Higgs mass leads to the hierarchy problem, because it is 17 orders of magnitude smaller than the Planck mass that characterizes gravity. (Equivalently, the Fermi constant characterizing the strength of the weak force is very large compared to the gravitational constant characterizing the strength of gravity.)
* The naturalness of the cosmological constant leads to the cosmological constant problem, because it is at least 40 and perhaps as much as 100 or more orders of magnitude smaller than naively expected.
In addition, the coupling of the electron to the Higgs (which sets the mass of the electron) is abnormally small, as are, to a lesser extent, the masses of the light quarks.
In models with large extra dimensions, the assumption of naturalness is violated for operators which multiply field operators that create objects localized at different positions in the extra dimensions.
Naturalness and the gauge hierarchy problem
A more practical definition of naturalness is that for any observable O which consists of n independent contributions
:O = a_1 + a_2 + ... + a_n,
then all ''independent'' contributions to O should be comparable to or less than O. Otherwise, if one contribution, say a_1, is far larger in magnitude than O, then some other independent contribution would have to be fine-tuned to a large opposite-sign value so as to maintain O at its measured value. Such fine-tuning is regarded as unnatural and indicative of some missing ingredient in the theory.
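This definition suggests a simple numerical fine-tuning measure: the ratio of the largest independent contribution to the observable itself. The sketch below uses made-up contribution values purely for illustration.

```python
def fine_tuning(contributions):
    """Ratio of the largest |contribution| to |sum|: a value near 1 is
    natural; a value much greater than 1 signals large opposite-sign
    cancellations, i.e. fine-tuning."""
    total = sum(contributions)
    return max(abs(a) for a in contributions) / abs(total)

# A natural observable: all contributions comparable to the total.
print(fine_tuning([1.0, 0.5, -0.3]))        # modest ratio, natural

# A fine-tuned observable: two huge contributions almost cancel.
print(fine_tuning([1.0e8, -1.0e8 + 1.2]))   # huge ratio, fine-tuned
```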
For instance, in the Standard Model with Higgs potential given by
:V = −μ² φ†φ + λ(φ†φ)²,
the physical Higgs boson mass is calculated to be
:m_h² = 2μ² + δm_h²,
where the quadratically divergent radiative correction is given by
:δm_h² ≃ (3/(4π²)) (−λ_t² + g²/4 + g²/(8 cos²θ_W) + λ) Λ²,
where λ_t is the top-quark Yukawa coupling, g is the SU(2) gauge coupling and Λ is the energy cut-off to the divergent loop integrals. As δm_h² increases (depending on the chosen cut-off Λ), then μ² can be freely dialed so as to maintain m_h² at its measured value (now known to be m_h ≃ 125 GeV).
By insisting on naturalness, then δm_h² ≲ m_h².
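A numerical sketch of this naturalness argument follows. All inputs below (the coupling values and the resulting cutoff bound) are ballpark figures inserted purely for illustration; requiring the quadratically divergent correction to be no larger than the measured Higgs mass squared bounds the cutoff near the TeV scale.

```python
import math

# Rough Standard Model inputs (GeV units); ballpark values for illustration.
m_h = 125.0           # Higgs boson mass
lam_t = 0.94          # top-quark Yukawa coupling, ~ sqrt(2) * m_t / v
g = 0.65              # SU(2) gauge coupling
cos2_thetaW = 0.77    # cos^2 of the weak mixing angle
lam = 0.13            # Higgs quartic coupling, ~ m_h^2 / (2 v^2)

def delta_mh2(Lambda):
    """One-loop quadratically divergent correction to m_h^2 (schematic)."""
    pref = 3.0 / (4.0 * math.pi ** 2)
    bracket = -lam_t**2 + g**2 / 4 + g**2 / (8 * cos2_thetaW) + lam
    return pref * bracket * Lambda**2

# Naturalness demands |delta_mh2| <~ m_h^2; solve for the maximal cutoff.
coeff = abs(delta_mh2(1.0))              # correction per GeV^2 of Lambda^2
Lambda_max = math.sqrt(m_h**2 / coeff)
print(f"naturalness bound: Lambda <~ {Lambda_max:.0f} GeV")
```

With these inputs the bound comes out below about a TeV, which is why naturalness arguments long suggested new physics at LHC-accessible energies; the top-quark term dominates, making the correction negative.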