Renormalization is a collection of techniques in quantum field theory, the statistical mechanics of fields, and the theory of self-similar geometric structures, that are used to treat infinities arising in calculated quantities by altering values of these quantities to compensate for effects of their self-interactions. But even if no infinities arose in loop diagrams in quantum field theory, it could be shown that it would be necessary to renormalize the mass and fields appearing in the original Lagrangian.
For example, an electron theory may begin by postulating an electron with an initial mass and charge. In quantum field theory a cloud of virtual particles, such as photons, positrons, and others surrounds and interacts with the initial electron. Accounting for the interactions of the surrounding particles (e.g. collisions at different energies) shows that the electron-system behaves as if it had a different mass and charge than initially postulated. Renormalization, in this example, mathematically replaces the initially postulated mass and charge of an electron with the experimentally observed mass and charge. Mathematics and experiments prove that positrons and more massive particles like protons exhibit precisely the same observed charge as the electron – even in the presence of much stronger interactions and more intense clouds of virtual particles.
Renormalization specifies relationships between parameters in the theory when parameters describing large distance scales differ from parameters describing small distance scales. Physically, the pileup of contributions from an infinity of scales involved in a problem may then result in further infinities. When describing spacetime as a continuum, certain statistical and quantum mechanical constructions are not well-defined. To define them, or make them unambiguous, a continuum limit must carefully remove "construction scaffolding" of lattices at various scales. Renormalization procedures are based on the requirement that certain physical quantities (such as the mass and charge of an electron) equal observed (experimental) values. That is, the experimental value of the physical quantity yields practical applications, but due to its empirical nature the observed measurement represents areas of quantum field theory that require deeper derivation from theoretical bases.
Renormalization was first developed in quantum electrodynamics (QED) to make sense of infinite integrals in perturbation theory. Initially viewed as a suspect provisional procedure even by some of its originators, renormalization eventually was embraced as an important and self-consistent actual mechanism of scale physics in several fields of physics and mathematics.
Today, the point of view has shifted: on the basis of the breakthrough renormalization group insights of Nikolay Bogolyubov and Kenneth Wilson, the focus is on variation of physical quantities across contiguous scales, while distant scales are related to each other through "effective" descriptions. All scales are linked in a broadly systematic way, and the actual physics pertinent to each is extracted with the suitable specific computational techniques appropriate for each. Wilson clarified which variables of a system are crucial and which are redundant.
Renormalization is distinct from regularization, another technique to control infinities by assuming the existence of new unknown physics at new scales.
Self-interactions in classical physics
The problem of infinities first arose in the classical electrodynamics of point particles in the 19th and early 20th century.
The mass of a charged particle should include the mass–energy in its electrostatic field (electromagnetic mass). Assume that the particle is a charged spherical shell of radius r_e. The mass–energy in the field is
:m_{\mathrm{em}} = \frac{1}{c^2} \int \frac{\epsilon_0}{2} E^2 \, dV = \int_{r_e}^{\infty} \frac{\epsilon_0}{2 c^2} \left( \frac{q}{4\pi\epsilon_0 r^2} \right)^2 4\pi r^2 \, dr = \frac{q^2}{8\pi\epsilon_0 r_e c^2},
which becomes infinite as r_e \to 0. This implies that the point particle would have infinite inertia and thus cannot be accelerated. Incidentally, the value of r_e that makes m_{\mathrm{em}} equal to the electron mass is called the classical electron radius, which (setting q = e and restoring factors of c and \epsilon_0) turns out to be
:r_e = \frac{e^2}{4\pi\epsilon_0 m_e c^2} = \alpha \, \frac{\hbar}{m_e c} \approx 2.8 \times 10^{-15}\,\mathrm{m},
where \alpha is the fine-structure constant, and \hbar/(m_e c) is the reduced Compton wavelength of the electron.
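As a quick numerical sanity check of the two expressions for r_e above (a minimal sketch using scipy.constants; any table of physical constants would do):

    # Classical electron radius two ways: Coulomb self-energy form and
    # alpha times the reduced Compton wavelength.
    from scipy.constants import c, e, epsilon_0, fine_structure, hbar, m_e, pi

    r_e = e**2 / (4 * pi * epsilon_0 * m_e * c**2)
    r_compton = hbar / (m_e * c)  # reduced Compton wavelength

    print(r_e)                         # ~ 2.818e-15 m
    print(fine_structure * r_compton)  # same number, via r_e = alpha * hbar/(m_e c)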
Renormalization: The total effective mass of a spherical charged particle includes the actual bare mass of the spherical shell (in addition to the mass mentioned above associated with its electric field). If the shell's bare mass is allowed to be negative, it might be possible to take a consistent point limit. This was called ''renormalization'', and Lorentz and Abraham attempted to develop a classical theory of the electron this way. This early work was the inspiration for later attempts at regularization and renormalization in quantum field theory.
(See also regularization (physics) for an alternative way to remove infinities from this classical problem, assuming new physics exists at small scales.)
When calculating the electromagnetic interactions of charged particles, it is tempting to ignore the ''back-reaction'' of a particle's own field on itself. (Analogous to the back-EMF of circuit analysis.) But this back-reaction is necessary to explain the friction on charged particles when they emit radiation. If the electron is assumed to be a point, the value of the back-reaction diverges, for the same reason that the mass diverges, because the field is inverse-square.
The Abraham–Lorentz theory had a noncausal "pre-acceleration": sometimes an electron would start moving ''before'' the force is applied. This is a sign that the point limit is inconsistent.
The trouble was worse in classical field theory than in quantum field theory, because in quantum field theory a charged particle experiences Zitterbewegung due to interference with virtual particle–antiparticle pairs, thus effectively smearing out the charge over a region comparable to the Compton wavelength. In quantum electrodynamics at small coupling, the electromagnetic mass only diverges as the logarithm of the radius of the particle.
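For a sense of how mild this logarithm is, the standard one-loop QED estimate of the electron mass shift (quoted here for illustration, with the inverse particle radius playing the role of a cutoff \Lambda) is
:\delta m \simeq \frac{3\alpha}{2\pi}\, m \ln\frac{\Lambda}{m},
which stays a modest fraction of m even for \Lambda at the Planck scale.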
Divergences in quantum electrodynamics
When developing quantum electrodynamics in the 1930s, Max Born, Werner Heisenberg, Pascual Jordan, and Paul Dirac discovered that in perturbative corrections many integrals were divergent (see The problem of infinities).
One way of describing the perturbation theory corrections' divergences was discovered in 1947–49 by Hans Kramers, Hans Bethe, Julian Schwinger, Richard Feynman, and Shin'ichiro Tomonaga, and systematized by Freeman Dyson in 1949. The divergences appear in radiative corrections involving Feynman diagrams with closed ''loops'' of virtual particles in them.
While virtual particles obey conservation of energy and momentum, they can have any energy and momentum, even one that is not allowed by the relativistic energy–momentum relation for the observed mass of that particle (that is, E^2 - p^2 is not necessarily the squared mass of the particle in that process, e.g. for a photon it could be nonzero). Such a particle is called off-shell. When there is a loop, the momentum of the particles involved in the loop is not uniquely determined by the energies and momenta of incoming and outgoing particles. A variation in the energy of one particle in the loop can be balanced by an equal and opposite change in the energy of another particle in the loop, without affecting the incoming and outgoing particles. Thus many variations are possible. So to find the amplitude for the loop process, one must integrate over ''all'' possible combinations of energy and momentum that could travel around the loop.
These integrals are often ''divergent'', that is, they give infinite answers. The divergences that are significant are the "ultraviolet" (UV) ones. An ultraviolet divergence can be described as one that comes from
* the region in the integral where all particles in the loop have large energies and momenta,
* very short wavelengths and high-frequency fluctuations of the fields, in the path integral for the field,
* very short proper-time between particle emission and absorption, if the loop is thought of as a sum over particle paths.
So these divergences are short-distance, short-time phenomena.
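Schematically, the power counting behind these statements (a generic illustration, not tied to a particular diagram) is
:\int^{\Lambda} \frac{d^4 k}{(2\pi)^4}\, \frac{1}{k^4} \sim \ln \Lambda, \qquad \int^{\Lambda} \frac{d^4 k}{(2\pi)^4}\, \frac{1}{k^2} \sim \Lambda^2,
so the infinities appear only when the cutoff \Lambda on energies and momenta is removed, i.e. they come from distances of order 1/\Lambda.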
Shown in the pictures at the right margin, there are exactly three one-loop divergent loop diagrams in quantum electrodynamics:
:(a) A photon creates a virtual electron–positron pair, which then annihilates. This is a vacuum polarization diagram.
:(b) An electron quickly emits and reabsorbs a virtual photon, called a self-energy.
:(c) An electron emits a photon, emits a second photon, and reabsorbs the first. This process is shown in the section below in figure 2, and it is called a ''vertex renormalization''. The Feynman diagram for this is also called a "penguin diagram" due to its shape remotely resembling a penguin.
The three divergences correspond to the three parameters in the theory under consideration:
# The field normalization Z.
# The mass of the electron.
# The charge of the electron.
The second class of divergence, called an infrared divergence, is due to massless particles, like the photon. Every process involving charged particles emits infinitely many coherent photons of infinite wavelength, and the amplitude for emitting any finite number of photons is zero. For photons, these divergences are well understood. For example, at the 1-loop order, the vertex function has both ultraviolet and ''infrared'' divergences. In contrast to the ultraviolet divergence, the infrared divergence does not require the renormalization of a parameter in the theory involved. The infrared divergence of the vertex diagram is removed by including a diagram similar to the vertex diagram with the following important difference: the photon connecting the two legs of the electron is cut and replaced by two on-shell (i.e. real) photons whose wavelengths tend to infinity; this diagram is equivalent to the bremsstrahlung process. This additional diagram must be included because there is no physical way to distinguish a zero-energy photon flowing through a loop as in the vertex diagram and zero-energy photons emitted through bremsstrahlung. From a mathematical point of view, the IR divergences can be regularized by assuming fractional differentiation with respect to a parameter; for example, the expression
:(p^2 - a^2)^{1/2}
is well defined at p^2 = a^2 but is UV divergent; if we take the 3/2-th fractional derivative with respect to a^2, we obtain (up to a constant) the IR divergence
:\frac{1}{p^2 - a^2},
so we can cure IR divergences by turning them into UV divergences.
A loop divergence
The diagram in Figure 2 shows one of the several one-loop contributions to electron–electron scattering in QED. The electron on the left side of the diagram, represented by the solid line, starts out with 4-momentum p and ends up with 4-momentum p'. It emits a virtual photon carrying p - p' to transfer energy and momentum to the other electron. But in this diagram, before that happens, it emits another virtual photon carrying 4-momentum q, and it reabsorbs this one after emitting the other virtual photon. Energy and momentum conservation do not determine the 4-momentum q uniquely, so all possibilities contribute equally and we must integrate.
This diagram's amplitude ends up with, among other things, a factor from the loop of
:-ie^3 \int \frac{d^4 q}{(2\pi)^4}\, \gamma^\mu\, \frac{i\big(\gamma^\alpha (p'-q)_\alpha + m\big)}{(p'-q)^2 - m^2 + i\varepsilon}\, \gamma^\rho\, \frac{i\big(\gamma^\beta (p-q)_\beta + m\big)}{(p-q)^2 - m^2 + i\varepsilon}\, \gamma^\nu\, \frac{-i g_{\mu\nu}}{q^2 + i\varepsilon}.
The various factors in this expression are gamma matrices as in the covariant formulation of the Dirac equation; they have to do with the spin of the electron. The factors of e are the electric coupling constant, while the i\varepsilon provide a heuristic definition of the contour of integration around the poles in the space of momenta. The important part for our purposes is the dependency on q of the three big factors in the integrand, which are from the propagators of the two electron lines and the photon line in the loop.
This has a piece with two powers of q on top that dominates at large values of q (Pokorski 1987, p. 122):
:e^3 \gamma^\mu \gamma^\alpha \gamma^\rho \gamma^\beta \gamma_\mu \int \frac{d^4 q}{(2\pi)^4}\, \frac{q_\alpha q_\beta}{(p'-q)^2\, (p-q)^2\, q^2}.
This integral is divergent and infinite, unless we cut it off at finite energy and momentum in some way.
Similar loop divergences occur in other quantum field theories.
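To see the logarithmic growth concretely, here is a numerical sketch (a Euclidean toy version of the loop factor above, in hypothetical units with m = 1, not the actual QED integral):

    # After angular integration the loop factor above behaves like
    #   integral of k^3 * k^2 / (k^2 + m^2)^3 dk,
    # which is ~ 1/k at large k and therefore grows like ln(cutoff).
    import numpy as np

    def loop_radial(cutoff, m=1.0):
        k = np.logspace(-3, np.log10(cutoff), 200_000)  # resolve every momentum decade
        return np.trapz(k**5 / (k**2 + m**2)**3, k)

    for cutoff in (1e2, 1e4, 1e6):
        print(f"cutoff {cutoff:.0e}:  integral = {loop_radial(cutoff):.3f},  "
              f"ln(cutoff) = {np.log(cutoff):.3f}")

Each hundredfold increase of the cutoff adds ln(100) to the result, so the answer is finite only as long as the cutoff is kept finite.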
Renormalized and bare quantities
The solution was to realize that the quantities initially appearing in the theory's formulae (such as the formula for the Lagrangian), representing such things as the electron's electric charge and mass, as well as the normalizations of the quantum fields themselves, did ''not'' actually correspond to the physical constants measured in the laboratory. As written, they were ''bare'' quantities that did not take into account the contribution of virtual-particle loop effects to ''the physical constants themselves''. Among other things, these effects would include the quantum counterpart of the electromagnetic back-reaction that so vexed classical theorists of electromagnetism. In general, these effects would be just as divergent as the amplitudes under consideration in the first place; so finite measured quantities would, in general, imply divergent bare quantities.
To make contact with reality, then, the formulae would have to be rewritten in terms of measurable, ''renormalized'' quantities. The charge of the electron, say, would be defined in terms of a quantity measured at a specific kinematic ''renormalization point'' or ''subtraction point'' (which will generally have a characteristic energy, called the ''renormalization scale'' or simply the energy scale). The parts of the Lagrangian left over, involving the remaining portions of the bare quantities, could then be reinterpreted as counterterms, involved in divergent diagrams exactly ''canceling out'' the troublesome divergences for other diagrams.
Renormalization in QED
For example, in the Lagrangian of QED
:\mathcal{L} = \bar\psi_B \left( i\gamma^\mu \partial_\mu - m_B \right) \psi_B - e_B\, \bar\psi_B \gamma^\mu \psi_B A_{B\mu} - \frac{1}{4} F_{B\mu\nu} F_B^{\mu\nu}
the fields and coupling constant are really ''bare'' quantities, hence the subscript ''B'' above. Conventionally the bare quantities are written so that the corresponding Lagrangian terms are multiples of the renormalized ones:
:\left( \bar\psi\, m\, \psi \right)_B = Z_0\, \bar\psi\, m\, \psi
:\left( \bar\psi\, i\gamma^\mu \partial_\mu\, \psi \right)_B = Z_2\, \bar\psi\, i\gamma^\mu \partial_\mu\, \psi
:\left( e\, \bar\psi \gamma^\mu \psi A_\mu \right)_B = Z_1\, e\, \bar\psi \gamma^\mu \psi A_\mu
Gauge invariance, via a Ward–Takahashi identity, turns out to imply that we can renormalize the two terms of the covariant derivative piece
:\bar\psi \left( i\gamma^\mu \partial_\mu - e \gamma^\mu A_\mu \right) \psi
together (Pokorski 1987, p. 115), which is what happened to Z_2; it is the same as Z_1.
A term in this Lagrangian, for example, the electron–photon interaction pictured in Figure 1, can then be written
:\mathcal{L}_I = -e\, \bar\psi \gamma^\mu \psi A_\mu \, - \, (Z_1 - 1)\, e\, \bar\psi \gamma^\mu \psi A_\mu
The physical constant e, the electron's charge, can then be defined in terms of some specific experiment: we set the renormalization scale equal to the energy characteristic of this experiment, and the first term gives the interaction we see in the laboratory (up to small, finite corrections from loop diagrams, providing such exotica as the high-order corrections to the magnetic moment). The rest is the counterterm. If the theory is ''renormalizable'' (see below for more on this), as it is in QED, the ''divergent'' parts of loop diagrams can all be decomposed into pieces with three or fewer legs, with an algebraic form that can be canceled out by the second term (or by the similar counterterms that come from Z_0 and Z_3).
The diagram with the counterterm's interaction vertex placed as in Figure 3 cancels out the divergence from the loop in Figure 2.
Historically, the splitting of the "bare terms" into the original terms and counterterms came before the renormalization group insight due to Kenneth Wilson. According to such renormalization group insights, detailed in the next section, this splitting is unnatural and actually unphysical, as all scales of the problem enter in continuous systematic ways.
Running couplings
To minimize the contribution of loop diagrams to a given calculation (and therefore make it easier to extract results), one chooses a renormalization point close to the energies and momenta exchanged in the interaction. However, the renormalization point is not itself a physical quantity: the physical predictions of the theory, calculated to all orders, should in principle be ''independent'' of the choice of renormalization point, as long as it is within the domain of application of the theory. Changes in renormalization scale will simply affect how much of a result comes from Feynman diagrams without loops, and how much comes from the remaining finite parts of loop diagrams. One can exploit this fact to calculate the effective variation of physical constants with changes in scale. This variation is encoded by beta-functions, and the general theory of this kind of scale-dependence is known as the renormalization group.
Colloquially, particle physicists often speak of certain physical "constants" as varying with the energy of interaction, though in fact, it is the renormalization scale that is the independent quantity. This ''running'' does, however, provide a convenient means of describing changes in the behavior of a field theory under changes in the energies involved in an interaction. For example, since the coupling in quantum chromodynamics becomes small at large energy scales, the theory behaves more like a free theory as the energy exchanged in an interaction becomes large – a phenomenon known as asymptotic freedom. Choosing an increasing energy scale and using the renormalization group makes this clear from simple Feynman diagrams; were this not done, the prediction would be the same, but would arise from complicated high-order cancellations.
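As an illustration of asymptotic freedom, here is a one-loop sketch of the running strong coupling (the coefficient b_0 = 11 - 2n_f/3 and the input \alpha_s(M_Z) \approx 0.118 are standard textbook values; the code ignores flavour thresholds and higher orders):

    import math

    def alpha_s(mu, alpha_mz=0.118, mz=91.19, nf=5):
        # One-loop QCD running: 1/alpha(mu) = 1/alpha(M_Z) + (b0 / 2pi) ln(mu/M_Z)
        b0 = 11.0 - 2.0 * nf / 3.0
        return alpha_mz / (1.0 + alpha_mz * b0 / (2.0 * math.pi) * math.log(mu / mz))

    for mu in (10.0, 91.19, 1000.0, 1e6):
        print(f"mu = {mu:g} GeV   alpha_s ~ {alpha_s(mu):.4f}")

The coupling shrinks as the scale grows, which is the running behind asymptotic freedom described above.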
For example,
:I = \int_0^a \frac{dz}{z} - \int_0^b \frac{dz}{z}
is ill-defined.
To eliminate the divergence, simply change the lower limits of the integrals to \varepsilon_a and \varepsilon_b:
:I = \ln\frac{a}{b} - \ln\frac{\varepsilon_a}{\varepsilon_b}.
Making sure \varepsilon_a / \varepsilon_b \to 1, then I = \ln\frac{a}{b}.
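A symbolic check of this cancellation (a minimal sketch using sympy, regulating both integrals with a common cutoff \varepsilon):

    import sympy as sp

    z, a, b, eps = sp.symbols('z a b epsilon', positive=True)

    # Regulate both divergent integrals with the same lower cutoff epsilon
    I = sp.integrate(1/z, (z, eps, a)) - sp.integrate(1/z, (z, eps, b))
    print(sp.simplify(I))   # log(a) - log(b): the regulator has cancelled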
Regularization
Since the quantity \infty - \infty is ill-defined, in order to make this notion of canceling divergences precise, the divergences first have to be tamed mathematically using the theory of limits, in a process known as regularization (Weinberg, 1995).
An essentially arbitrary modification to the loop integrands, or ''regulator'', can make them drop off faster at high energies and momenta, in such a manner that the integrals converge. A regulator has a characteristic energy scale known as the
cutoff; taking this cutoff to infinity (or, equivalently, the corresponding length/time scale to zero) recovers the original integrals.
With the regulator in place, and a finite value for the cutoff, divergent terms in the integrals then turn into finite but cutoff-dependent terms. After canceling out these terms with the contributions from cutoff-dependent counterterms, the cutoff is taken to infinity and finite physical results recovered. If physics on scales we can measure is independent of what happens at the very shortest distance and time scales, then it should be possible to get cutoff-independent results for calculations.
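A toy model of this procedure (illustrative only, not a QED diagram): regulate a logarithmically divergent integral with a cutoff, subtract a cutoff-dependent counterterm defined at a reference scale mu, and watch the result become cutoff-independent:

    import math

    def bare_loop(cutoff, m=1.0):
        # Closed form of the regulated toy integral
        #   \int_0^cutoff k dk / (k^2 + m^2) = (1/2) ln((cutoff^2 + m^2)/m^2)
        return 0.5 * math.log((cutoff**2 + m**2) / m**2)

    mu = 10.0  # renormalization point (hypothetical units)
    for cutoff in (1e3, 1e6, 1e12):
        counterterm = -0.5 * math.log(cutoff**2 / mu**2)  # cutoff-dependent subtraction
        print(f"cutoff {cutoff:.0e}:  renormalized = {bare_loop(cutoff) + counterterm:.6f}")

The printed value settles to ln(mu/m) as the cutoff grows: the divergence has been absorbed into the counterterm, leaving a finite answer tied to the reference scale.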
Many different types of regulator are used in quantum field theory calculations, each with its advantages and disadvantages. One of the most popular in modern use is ''dimensional regularization'', invented by Gerardus 't Hooft and Martinus J. G. Veltman, which tames the integrals by carrying them into a space with a fictitious fractional number of dimensions. Another is ''Pauli–Villars regularization'', which adds fictitious particles to the theory with very large masses, such that loop integrands involving the massive particles cancel out the existing loops at large momenta.
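To see what a fictitious fractional number of dimensions buys, one can take the standard d-dimensional Euclidean result \int \frac{d^d k}{(2\pi)^d} (k^2 + \Delta)^{-2} = \frac{\Gamma(2 - d/2)}{(4\pi)^{d/2}} \Delta^{d/2-2} and expand around d = 4 - \varepsilon; the divergence reappears as a 1/\varepsilon pole (a sketch using sympy; the formula is textbook-standard, the code is illustrative):

    import sympy as sp

    eps = sp.symbols('epsilon', positive=True)
    Delta = sp.symbols('Delta', positive=True)
    d = 4 - eps

    # d-dimensional value of the integral, finite for non-integer d
    I = sp.gamma(2 - d/2) / ((4*sp.pi)**(d/2) * Delta**(2 - d/2))

    # Laurent expansion around d = 4: the UV divergence is the 1/epsilon pole
    print(sp.series(I, eps, 0, 1))  # leading term ~ 1/(8 pi^2 epsilon) + finite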
Yet another regularization scheme is ''lattice regularization'', introduced by Kenneth Wilson, which pretends that a hyper-cubical lattice constructs our spacetime with fixed grid size. This size is a natural cutoff for the maximal momentum that a particle could possess when propagating on the lattice. After doing a calculation on several lattices with different grid size, the physical result is extrapolated to grid size 0, or our natural universe. This presupposes the existence of a scaling limit.
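A minimal sketch of the extrapolation step (the data are hypothetical, with assumed O(a^2) discretization errors; real lattice computations are far more involved):

    import numpy as np

    # Hypothetical measurements of an observable at several lattice spacings a,
    # assuming artifacts of the form O(a) = O_0 + c * a**2
    a = np.array([0.4, 0.2, 0.1, 0.05])
    obs = np.array([1.162, 1.041, 1.010, 1.0025])

    # Fit obs as a linear function of a^2 and read off the a -> 0 intercept
    c, O_0 = np.polyfit(a**2, obs, 1)
    print(f"continuum extrapolation: {O_0:.4f}")  # ~ 1.0 for this fake data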
A rigorous mathematical approach to renormalization theory is the so-called causal perturbation theory, where ultraviolet divergences are avoided from the start in calculations by performing well-defined mathematical operations only within the framework of distribution theory. In this approach, divergences are replaced by ambiguity: corresponding to a divergent diagram is a term which now has a finite, but undetermined, coefficient. Other principles, such as gauge symmetry, must then be used to reduce or eliminate the ambiguity.
Zeta function regularization
Julian Schwinger discovered a relationship between zeta function regularization and renormalization, using the asymptotic relation:
:I(n, \Lambda) = \int_0^{\Lambda} dp\, p^n \to 1 + 2^n + 3^n + \cdots + \Lambda^n \to \zeta(-n)
as the regulator \Lambda \to \infty. Based on this, he considered using the values of \zeta(-n) to get finite results. Although he reached inconsistent results, an improved formula studied by Hartle, J. Garcia, and based on the works by E. Elizalde includes the technique of the zeta regularization algorithm
:
where the ''B''s are the Bernoulli numbers and
:
So every I(m, \Lambda) can be written as a linear combination of \zeta(-1), \zeta(-3), \zeta(-5), \ldots, \zeta(-m).
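The finite values being assigned are ordinary zeta values, which are easy to inspect (a quick check using mpmath):

    from mpmath import zeta

    # Values assigned by zeta regularization to the divergent sums
    # 1 + 2 + 3 + ..., 1 + 2^3 + 3^3 + ..., 1 + 2^5 + 3^5 + ...
    for n in (1, 3, 5):
        print(f"zeta(-{n}) = {zeta(-n)}")
    # zeta(-1) = -1/12, zeta(-3) = 1/120, zeta(-5) = -1/252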
Or simply using the Abel–Plana formula we have for every divergent integral:
:
valid when , here the zeta function is the Hurwitz zeta function and \beta is a positive real number.
The "geometric" analogy is given by, (if we use
rectangle method
In mathematics, a Riemann sum is a certain kind of approximation of an integral by a finite sum. It is named after nineteenth century German mathematician Bernhard Riemann. One very common application is approximating the area of functions or lin ...
) to evaluate the integral so:
:
Using Hurwitz zeta regularization plus the rectangle method with step h (not to be confused with the
Planck constant
The Planck constant, or Planck's constant, is a fundamental physical constant of foundational importance in quantum mechanics. The constant gives the relationship between the energy of a photon and its frequency, and by the mass-energy equivale ...
).
The logarithmically divergent integral has the regularization
:
since in the limit we must recover the harmonic series.
For multi-loop integrals that depend on several variables, we can make a change of variables to polar coordinates and then replace the integral over the angles by a sum, so that only a divergent integral over the modulus remains, to which the zeta regularization algorithm can be applied. The main idea for multi-loop integrals is to replace the measure factor, after a change to hyperspherical coordinates, so that the UV overlapping divergences are encoded in the radial variable. In order to regularize these integrals one needs a regulator; for the case of multi-loop integrals, this regulator can be taken as
:
so the multi-loop integral will converge for a big enough exponent, and using the zeta regularization we can analytically continue it to the physical limit and then regularize any UV integral, by replacing a divergent integral by a linear combination of divergent series, which can be regularized in terms of the negative values of the Riemann zeta function.
Attitudes and interpretation
The early formulators of QED and other quantum field theories were, as a rule, dissatisfied with this state of affairs. It seemed illegitimate to do something tantamount to subtracting infinities from infinities to get finite answers.
Freeman Dyson argued that these infinities are of a basic nature and cannot be eliminated by any formal mathematical procedures, such as the renormalization method.
Dirac's criticism was the most persistent. As late as 1975, he was saying:
: Most physicists are very satisfied with the situation. They say: 'Quantum electrodynamics is a good theory and we do not have to worry about it any more.' I must say that I am very dissatisfied with the situation because this so-called 'good theory' does involve neglecting infinities which appear in its equations, ignoring them in an arbitrary way. This is just not sensible mathematics. Sensible mathematics involves disregarding a quantity when it is small – not neglecting it just because it is infinitely great and you do not want it!
Another important critic was Feynman. Despite his crucial role in the development of quantum electrodynamics, he wrote the following in 1985:
: The shell game that we play is technically called 'renormalization'. But no matter how clever the word, it is still what I would call a dippy process! Having to resort to such hocus-pocus has prevented us from proving that the theory of quantum electrodynamics is mathematically self-consistent. It's surprising that the theory still hasn't been proved self-consistent one way or the other by now; I suspect that renormalization is not mathematically legitimate.
Feynman was concerned that all field theories known in the 1960s had the property that the interactions become infinitely strong at short enough distance scales. This property, called a Landau pole, made it plausible that quantum field theories were all inconsistent. In 1974, Gross, Politzer and Wilczek showed that another quantum field theory, quantum chromodynamics, does not have a Landau pole. Feynman, along with most others, accepted that QCD was a fully consistent theory.
The general unease was almost universal in texts up to the 1970s and 1980s. Beginning in the 1970s, however, inspired by work on the renormalization group and effective field theory, and despite the fact that Dirac and various others—all of whom belonged to the older generation—never withdrew their criticisms, attitudes began to change, especially among younger theorists.
Kenneth G. Wilson and others demonstrated that the renormalization group is useful in statistical field theory applied to condensed matter physics, where it provides important insights into the behavior of phase transitions. In condensed matter physics, a ''physical'' short-distance regulator exists: matter ceases to be continuous on the scale of atoms. Short-distance divergences in condensed matter physics do not present a philosophical problem since the field theory is only an effective, smoothed-out representation of the behavior of matter anyway; there are no infinities since the cutoff is always finite, and it makes perfect sense that the bare quantities are cutoff-dependent.
If QFT holds all the way down past the Planck length (where it might yield to string theory, causal set theory or something different), then there may be no real problem with short-distance divergences in particle physics either; ''all'' field theories could simply be effective field theories. In a sense, this approach echoes the older attitude that the divergences in QFT speak of human ignorance about the workings of nature, but also acknowledges that this ignorance can be quantified and that the resulting effective theories remain useful.
Be that as it may, Salam's remark in 1972 seems still relevant
: Field-theoretic infinities – first encountered in Lorentz's computation of electron self-mass – have persisted in classical electrodynamics for seventy and in quantum electrodynamics for some thirty-five years. These long years of frustration have left in the subject a curious affection for the infinities and a passionate belief that they are an inevitable part of nature; so much so that even the suggestion of a hope that they may, after all, be circumvented — and finite values for the renormalization constants computed – is considered irrational. Compare
Russell's postscript to the third volume of his autobiography ''The Final Years, 1944–1969'' (George Allen and Unwin, Ltd., London 1969), p. 221:
:: In the modern world, if communities are unhappy, it is often because they have ignorances, habits, beliefs, and passions, which are dearer to them than happiness or even life. I find many men in our dangerous age who seem to be in love with misery and death, and who grow angry when hopes are suggested to them. They think hope is irrational and that, in sitting down to lazy despair, they are merely facing facts.
In QFT, the value of a physical constant, in general, depends on the scale that one chooses as the renormalization point, and it becomes very interesting to examine the renormalization group running of physical constants under changes in the energy scale. The coupling constants in the Standard Model of particle physics vary in different ways with increasing energy scale: the coupling of quantum chromodynamics and the weak isospin coupling of the electroweak force tend to decrease, and the weak hypercharge coupling of the electroweak force tends to increase. At the colossal energy scale of 10^15 GeV (far beyond the reach of our current particle accelerators), they all become approximately the same size (Grotz and Klapdor 1990, p. 254), a major motivation for speculations about grand unified theory. Instead of being only a worrisome problem, renormalization has become an important theoretical tool for studying the behavior of field theories in different regimes.
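The one-loop running behind this statement fits in a few lines of code. The sketch below uses the textbook one-loop beta coefficients and approximate boundary values at the Z mass; the numbers are illustrative only, ignoring particle thresholds and higher-loop effects:

<syntaxhighlight lang="python">
import numpy as np

# One-loop running of the three Standard Model gauge couplings:
#   alpha_i^{-1}(mu) = alpha_i^{-1}(M_Z) - b_i/(2*pi) * ln(mu/M_Z).
# Boundary values at M_Z are approximate; hypercharge is GUT-normalized.
M_Z = 91.19                                   # Z-boson mass in GeV
alpha_inv_MZ = np.array([59.0, 29.6, 8.5])    # U(1)_Y, SU(2)_L, SU(3)_c
b = np.array([41 / 10, -19 / 6, -7.0])        # one-loop beta coefficients

def alpha_inv(mu):
    """Inverse couplings at scale mu in GeV (one loop, no thresholds)."""
    return alpha_inv_MZ - b / (2 * np.pi) * np.log(mu / M_Z)

for mu in (1e2, 1e8, 1e15):
    print(f"mu = {mu:.0e} GeV -> alpha^-1 =", np.round(alpha_inv(mu), 1))
# Near 10^15 GeV the three inverse couplings crowd together,
# illustrating the near-unification quoted in the text.
</syntaxhighlight>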
If a theory featuring renormalization (e.g. QED) can only be sensibly interpreted as an effective field theory, i.e. as an approximation reflecting human ignorance about the workings of nature, then the problem remains of discovering a more accurate theory that does not have these renormalization problems. As
Lewis Ryder has put it, "In the Quantum Theory, these [classical] divergences do not disappear; on the contrary, they appear to get worse. And despite the comparative success of renormalisation theory, the feeling remains that there ought to be a more satisfactory way of doing things."
Renormalizability
From this philosophical reassessment, a new concept follows naturally: the notion of renormalizability. Not all theories lend themselves to renormalization in the manner described above, with a finite supply of counterterms and all quantities becoming cutoff-independent at the end of the calculation. If the Lagrangian contains combinations of field operators of high enough dimension in energy units, the counterterms required to cancel all divergences proliferate to infinite number, and, at first glance, the theory would seem to gain an infinite number of free parameters and therefore lose all predictive power, becoming scientifically worthless. Such theories are called ''nonrenormalizable''.
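The power counting behind this classification is compact (natural units, four spacetime dimensions): the Lagrangian density has mass dimension 4, so a coupling <math>g</math> multiplying an operator of mass dimension <math>\Delta</math> carries

:<math>[g] = 4 - \Delta.</math>

For <math>\Delta > 4</math> the coupling has negative mass dimension, each insertion of the operator must be compensated by positive powers of energy, and the divergences worsen order by order; Fermi's constant, multiplying a dimension-6 four-fermion operator so that <math>[G_F] = -2</math>, is the classic example.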
The Standard Model of particle physics contains only renormalizable operators, but the interactions of general relativity become nonrenormalizable operators if one attempts to construct a field theory of quantum gravity in the most straightforward manner (treating the metric in the Einstein–Hilbert Lagrangian as a perturbation about the Minkowski metric), suggesting that perturbation theory is not satisfactory in application to quantum gravity.
However, in an effective field theory, "renormalizability" is, strictly speaking, a misnomer. In a nonrenormalizable effective field theory, terms in the Lagrangian do multiply to infinity, but have coefficients suppressed by ever-more-extreme inverse powers of the energy cutoff. If the cutoff is a real, physical quantity—that is, if the theory is only an effective description of physics up to some maximum energy or minimum distance scale—then these additional terms could represent real physical interactions. Assuming that the dimensionless constants in the theory do not get too large, one can group calculations by inverse powers of the cutoff, and extract approximate predictions to finite order in the cutoff that still have a finite number of free parameters. It can even be useful to renormalize these "nonrenormalizable" interactions.
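A toy estimate makes this grouping by inverse powers of the cutoff concrete; the cutoff value and energies in the sketch below are hypothetical numbers chosen only to show how quickly the suppression sets in (the Fermi theory discussed next is the canonical real-world instance):

<syntaxhighlight lang="python">
# In an effective field theory with cutoff Lambda, an operator of mass
# dimension Delta > 4 contributes at relative order (E/Lambda)**(Delta - 4).
# Lambda and the energies below are hypothetical, for illustration only.

def relative_size(E, Lambda, Delta):
    """Rough relative weight of a dimension-Delta operator at energy E."""
    return (E / Lambda) ** (Delta - 4)

Lambda = 1000.0  # assumed cutoff in GeV
for Delta in (6, 8):
    for E in (1.0, 10.0, 100.0):
        print(f"Delta={Delta}, E={E:>5} GeV: "
              f"suppression ~ {relative_size(E, Lambda, Delta):.1e}")
</syntaxhighlight>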
Nonrenormalizable interactions in effective field theories rapidly become weaker as the energy scale becomes much smaller than the cutoff. The classic example is the Fermi theory of the weak nuclear force, a nonrenormalizable effective theory whose cutoff is comparable to the mass of the W particle. This fact may also provide a possible explanation for ''why'' almost all of the particle interactions we see are describable by renormalizable theories. It may be that any others that may exist at the GUT or Planck scale simply become too weak to detect in the realm we can observe, with one exception: gravity, whose exceedingly weak interaction is magnified by the presence of the enormous masses of stars and planets.
Renormalization schemes
In actual calculations, the counterterms introduced to cancel the divergences in Feynman diagram calculations beyond tree level must be ''fixed'' using a set of ''renormalization conditions''. The common renormalization schemes in use include:
* Minimal subtraction (MS) scheme and the related modified minimal subtraction (MS-bar) scheme
* On-shell scheme
In addition, there exists a "natural" definition of the renormalized coupling (combined with the photon propagator) as a propagator of dual free bosons, which does not explicitly require introducing counterterms.
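Schematically (a standard textbook pattern, not tied to any one theory), a typical one-loop divergence in dimensional regularization with <math>d = 4 - 2\epsilon</math> appears as

:<math>\frac{1}{\epsilon} - \gamma_E + \ln 4\pi + \text{finite terms}.</math>

Minimal subtraction absorbs only the <math>1/\epsilon</math> pole into the counterterms, the MS-bar scheme absorbs the whole combination <math>1/\epsilon - \gamma_E + \ln 4\pi</math>, and the on-shell scheme instead fixes the counterterms so that propagator poles sit at the physical, measured masses.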
Renormalization in statistical physics
History
A deeper understanding of the physical meaning and generalization of the
renormalization process, which goes beyond the dilatation group of conventional ''renormalizable'' theories, came from condensed matter physics.
Leo P. Kadanoff's paper in 1966 proposed the "block-spin" renormalization group.
[L.P. Kadanoff (1966): "Scaling laws for Ising models near ''T_c''", ''Physics (Long Island City, N.Y.)'' 2, 263.] The ''blocking idea'' is a way to define the components of the theory at large distances as aggregates of components at shorter distances.
This approach covered the conceptual point and was given full computational substance in the extensive and important contributions of Kenneth Wilson. The power of Wilson's ideas was demonstrated by a constructive iterative renormalization solution of a long-standing problem, the Kondo problem, in 1974, as well as the preceding seminal developments of his new method in the theory of second-order phase transitions and critical phenomena in 1971. He was awarded the Nobel Prize for these decisive contributions in 1982.
Principles
In more technical terms, let us assume that we have a theory described by a certain function <math>Z</math> of the state variables <math>\{s_i\}</math> and a certain set of coupling constants <math>\{J_k\}</math>. This function may be a partition function, an action, a Hamiltonian, etc. It must contain the whole description of the physics of the system.

Now we consider a certain blocking transformation of the state variables <math>\{s_i\} \to \{\tilde s_i\}</math>; the number of <math>\tilde s_i</math> must be lower than the number of <math>s_i</math>. Now let us try to rewrite the <math>Z</math> function ''only'' in terms of the <math>\tilde s_i</math>. If this is achievable by a certain change in the parameters, <math>\{J_k\} \to \{\tilde J_k\}</math>, then the theory is said to be renormalizable. Iterating the blocking transformation generates a flow in the space of couplings, and the fixed points of this flow are the scale-invariant theories. The possible macroscopic states of the system, at a large scale, are given by this set of fixed points.
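As a minimal illustration of the blocking idea (not Kadanoff's full construction, which also tracks the induced flow of the couplings <math>\{J_k\}</math>), the following sketch coarse-grains a two-dimensional lattice of ±1 spins with a majority rule on 2×2 blocks:

<syntaxhighlight lang="python">
import numpy as np

# Majority-rule block-spin step: replace each 2x2 block of Ising spins
# (+1/-1) by the sign of its sum, breaking ties at random.  A full RG
# analysis would also follow how the couplings flow under this map;
# this sketch only performs the coarse-graining of the state variables.
rng = np.random.default_rng(0)

def block_spin(spins):
    """One blocking step: a (2n, 2n) spin array -> an (n, n) array."""
    n = spins.shape[0] // 2
    block_sums = spins.reshape(n, 2, n, 2).sum(axis=(1, 3))
    tie_break = rng.choice([-1, 1], size=block_sums.shape)
    return np.where(block_sums != 0, np.sign(block_sums), tie_break).astype(int)

lattice = rng.choice([-1, 1], size=(64, 64))  # random (high-temperature) start
for step in range(3):
    lattice = block_spin(lattice)
    print(f"step {step + 1}: {lattice.shape}, m = {lattice.mean():+.3f}")
</syntaxhighlight>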
Renormalization group fixed points
The most important information in the RG flow is its fixed points. A fixed point is defined by the vanishing of the beta function associated to the flow. Then, fixed points of the renormalization group are by definition scale invariant. In many cases of physical interest scale invariance enlarges to conformal invariance. One then has a conformal field theory at the fixed point.
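A standard concrete example is the one-loop flow of <math>\varphi^4</math> theory in <math>4 - \epsilon</math> dimensions; the sketch below (with an illustrative value of <math>\epsilon</math> and starting coupling) integrates the flow toward the infrared and watches the coupling settle onto the zero of the beta function, the Wilson–Fisher fixed point:

<syntaxhighlight lang="python">
import math

# Textbook one-loop beta function of phi^4 theory in d = 4 - eps dimensions:
#   beta(g) = -eps*g + 3*g**2/(16*pi**2),
# whose nontrivial zero g* = 16*pi**2*eps/3 is the Wilson-Fisher fixed point.
# eps and the starting coupling are illustrative choices.
eps = 0.1
g_star = 16 * math.pi**2 * eps / 3

def beta(g):
    return -eps * g + 3 * g**2 / (16 * math.pi**2)

g, dt = 1.0, 0.01
for _ in range(10_000):   # integrate dg/dt = -beta(g), i.e. flow to the IR
    g -= beta(g) * dt
print(f"IR coupling g = {g:.4f}   (fixed point g* = {g_star:.4f})")
</syntaxhighlight>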
The ability of several theories to flow to the same fixed point leads to universality. If these fixed points correspond to free field theory, the theory is said to exhibit quantum triviality. Numerous fixed points appear in the study of lattice Higgs theories, but the nature of the quantum field theories associated with these remains an open question.
See also
* History of quantum field theory
* Quantum triviality
* Zeno's paradoxes
References
Further reading
General introduction
* DeDeo, Simon; ''Introduction to Renormalization'' (2017). Santa Fe Institute Complexity Explorer MOOC. Renormalization from a complex systems point of view, including Markov chains, cellular automata, the real-space Ising model, the Krohn–Rhodes theorem, QED, and rate distortion theory.
* Baez, John; ''Renormalization Made Easy'' (2005). A qualitative introduction to the subject.
* Blechman, Andrew E.; ''Renormalization: Our Greatly Misunderstood Friend'' (2002). Summary of a lecture; has more information about specific regularization and divergence-subtraction schemes.
* Shirkov, Dmitry; ''Fifty Years of the Renormalization Group'', CERN Courier 41(7) (2001). Full text available at ''I.O.P Magazines''.
* E. Elizalde; ''Zeta regularization techniques with Applications''.
Mainly: quantum field theory
* N. N. Bogoliubov, D. V. Shirkov (1959): ''The Theory of Quantized Fields''. New York, Interscience. The first textbook on renormalization group theory.
* Ryder, Lewis H.; ''Quantum Field Theory'' (Cambridge University Press, 1985). Highly readable textbook, certainly the best introduction to relativistic Q.F.T. for particle physics.
* Zee, Anthony; ''Quantum Field Theory in a Nutshell'', Princeton University Press (2003). Another excellent textbook on Q.F.T.
* Weinberg, Steven; ''The Quantum Theory of Fields'' (3 volumes), Cambridge University Press (1995). A monumental treatise on Q.F.T. written by a leading expert, ''Nobel laureate 1979''.
* Pokorski, Stefan; ''Gauge Field Theories'', Cambridge University Press (1987).
* 't Hooft, Gerard; ''The Glorious Days of Physics – Renormalization of Gauge theories'', lecture given at Erice (August/September 1998). Full text available at ''hep-th/9812203''.
* Rivasseau, Vincent; ''An introduction to renormalization'', Poincaré Seminar (Paris, Oct. 12, 2002), published in: Duplantier, Bertrand; Rivasseau, Vincent (Eds.); ''Poincaré Seminar 2002'', Progress in Mathematical Physics 30, Birkhäuser (2003). Full text available in ''PostScript''.
* Rivasseau, Vincent; ''From perturbative to constructive renormalization'', Princeton University Press (1991). Full text available in ''PostScript'' and in PDF (draft version).
* Iagolnitzer, Daniel & Magnen, J.; ''Renormalization group analysis'', Encyclopaedia of Mathematics, Kluwer Academic Publisher (1996). Full text available in PostScript and PDF ''here''.
* Scharf, Günter; ''Finite quantum electrodynamics: The causal approach'', Springer Verlag Berlin Heidelberg New York (1995).
* A. S. Švarc (Albert Schwarz), Математические основы квантовой теории поля (''Mathematical Foundations of Quantum Field Theory''), Atomizdat, Moscow, 1975. 368 pp.
Mainly: statistical physics
* A. N. Vasil'ev; ''The Field Theoretic Renormalization Group in Critical Behavior Theory and Stochastic Dynamics'' (Routledge Chapman & Hall, 2004).
* Goldenfeld, Nigel; ''Lectures on Phase Transitions and the Renormalization Group'', Frontiers in Physics 85, Westview Press (June 1992). Covering the elementary aspects of the physics of phase transitions and the renormalization group, this popular book emphasizes understanding and clarity rather than technical manipulations.
* Zinn-Justin, Jean; ''Quantum Field Theory and Critical Phenomena'', Oxford University Press (4th edition – 2002). A masterpiece on applications of renormalization methods to the calculation of critical exponents in statistical mechanics, following Wilson's ideas (Kenneth Wilson was ''Nobel laureate 1982'').
* Zinn-Justin, Jean; ''Phase Transitions & Renormalization Group: from Theory to Numbers'', Poincaré Seminar (Paris, Oct. 12, 2002), published in: Duplantier, Bertrand; Rivasseau, Vincent (Eds.); ''Poincaré Seminar 2002'', Progress in Mathematical Physics 30, Birkhäuser (2003). Full text available in ''PostScript''.
* Domb, Cyril; ''The Critical Point: A Historical Introduction to the Modern Theory of Critical Phenomena'', CRC Press (March 1996).
* Brown, Laurie M. (Ed.); ''Renormalization: From Lorentz to Landau (and Beyond)'', Springer-Verlag (New York, 1993).
* Cardy, John; ''Scaling and Renormalization in Statistical Physics'', Cambridge University Press (1996).
Miscellaneous
* Shirkov, Dmitry; ''The Bogoliubov Renormalization Group'', JINR Communication E2-96-15 (1996). Full text available at ''hep-th/9602024''.
* Zinn-Justin, Jean; ''Renormalization and renormalization group: From the discovery of UV divergences to the concept of effective field theories'', in: de Witt-Morette C., Zuber J.-B. (eds), Proceedings of the NATO ASI on ''Quantum Field Theory: Perspective and Prospective'', June 15–26, 1998, Les Houches, France, Kluwer Academic Publishers, NATO ASI Series C 530, 375–388 (1999). Full text available in ''PostScript''.
* Connes, Alain; ''Symétries Galoisiennes & Renormalisation'' (''Galois Symmetries and Renormalization''), Poincaré Seminar (Paris, Oct. 12, 2002), published in: Duplantier, Bertrand; Rivasseau, Vincent (Eds.); ''Poincaré Seminar 2002'', Progress in Mathematical Physics 30, Birkhäuser (2003). French mathematician ''Alain Connes'' (Fields medallist 1982) describes the mathematical structure underlying renormalization (the Hopf algebra) and its link to the Riemann–Hilbert problem. Full text available in French.
External links
* {{wikiquote-inline}}