Renormalisation
Renormalization is a collection of techniques in quantum field theory, the statistical mechanics of fields, and the theory of self-similar geometric structures, that are used to treat infinities arising in calculated quantities by altering values of these quantities to compensate for effects of their self-interactions. But even if no infinities arose in loop diagrams in quantum field theory, it could be shown that it would be necessary to renormalize the mass and fields appearing in the original Lagrangian. For example, an electron theory may begin by postulating an electron with an initial mass and charge. In quantum field theory a cloud of virtual particles, such as photons, positrons, and others, surrounds and interacts with the initial electron. Accounting for the interactions of the surrounding particles (e.g. collisions at different energies) shows that the electron-system behaves as if it had ...
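The screening by the cloud of virtual particles can be illustrated with the standard one-loop QED result for the running coupling, \alpha_{\text{eff}}(Q^2) = \alpha / (1 - \frac{\alpha}{3\pi} \ln(Q^2/m^2)) for Q^2 \gg m^2: the effective charge seen by a probe grows with the probe's energy. The sketch below is a toy model that keeps only the electron loop (a full calculation sums all charged fermions), so the numbers are illustrative only.

```python
import math

ALPHA = 1 / 137.035999  # fine-structure constant at low energy
M_E = 0.000511          # electron rest energy in GeV

def alpha_eff(q_gev):
    """One-loop QED running coupling keeping only the electron loop
    (a toy illustration; real QED includes all charged fermions)."""
    log = math.log(q_gev**2 / M_E**2)
    return ALPHA / (1 - (ALPHA / (3 * math.pi)) * log)

for q in (0.001, 1.0, 91.19):  # probe energies in GeV
    print(f"Q = {q:8.3f} GeV  ->  1/alpha_eff = {1 / alpha_eff(q):.2f}")
```

The effective coupling rises as the probe penetrates deeper into the screening cloud, which is the physical picture behind renormalizing the bare charge.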
Quantum Field Theory
In theoretical physics, quantum field theory (QFT) is a theoretical framework that combines classical field theory, special relativity, and quantum mechanics. QFT is used in particle physics to construct physical models of subatomic particles and in condensed matter physics to construct models of quasiparticles. QFT treats particles as excited states (also called quanta) of their underlying quantum fields, which are more fundamental than the particles. The equation of motion of the particle is determined by minimization of the Lagrangian, a functional of fields associated with the particle. Interactions between particles are described by interaction terms in the Lagrangian involving their corresponding quantum fields. Each interaction can be visually represented by Feynman diagrams according to perturbation theory in quantum mechanics.

History

Quantum field theory emerged from the wo ...
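The claim that the equation of motion follows from extremizing the Lagrangian can be made concrete with the Euler–Lagrange equations. A minimal sketch using SymPy's `euler_equations` helper for a harmonic oscillator, a mechanical stand-in for a field (field theory applies the same variational principle to a Lagrangian density):

```python
import sympy as sp
from sympy.calculus.euler import euler_equations

t = sp.symbols('t')
m, k = sp.symbols('m k', positive=True)
x = sp.Function('x')(t)

# Lagrangian L = kinetic - potential for a harmonic oscillator
L = m * sp.diff(x, t)**2 / 2 - k * x**2 / 2

# Euler-Lagrange equation d/dt(dL/dx') - dL/dx = 0  ->  m x'' + k x = 0
eom, = euler_equations(L, [x], [t])
print(eom)
```

Solving the resulting equation for the acceleration gives x'' = -(k/m) x, the familiar oscillator equation of motion obtained purely from the variational principle.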
Renormalization Group
In theoretical physics, the term renormalization group (RG) refers to a formal apparatus that allows systematic investigation of the changes of a physical system as viewed at different scales. In particle physics, it reflects the changes in the underlying force laws (codified in a quantum field theory) as the energy scale at which physical processes occur varies, energy/momentum and resolution distance scales being effectively conjugate under the uncertainty principle. A change in scale is called a scale transformation. The renormalization group is intimately related to ''scale invariance'' and ''conformal invariance'', symmetries in which a system appears the same at all scales (so-called self-similarity). As the scale varies, it is as if one is changing the magnifying power of a notional microscope viewing the system. In so-called renormalizable theories, the system at one scale will generally be seen to consist of self-similar copies of itself when viewed at a smaller sca ...
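The idea of viewing a system at successively coarser scales can be made concrete with the exactly solvable decimation RG of the one-dimensional Ising chain: summing out every second spin leaves a chain of the same form with a renormalized dimensionless coupling K' = \frac{1}{2} \ln \cosh 2K. A minimal sketch of the resulting flow:

```python
import math

def decimate(K):
    """One decimation RG step for the 1D Ising chain: tracing out
    every second spin renormalizes the dimensionless coupling
    K = J/(kT) to K' = (1/2) ln cosh(2K)."""
    return 0.5 * math.log(math.cosh(2.0 * K))

K = 1.0  # start at a moderately strong coupling
flow = [K]
for _ in range(8):
    K = decimate(K)
    flow.append(K)
print(flow)  # the coupling flows toward the fixed point K* = 0
```

Every finite starting coupling flows to the high-temperature fixed point K* = 0, which is the RG expression of the fact that the 1D Ising chain has no finite-temperature phase transition.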
Physics
Physics is the natural science that studies matter, its fundamental constituents, its motion and behavior through space and time, and the related entities of energy and force. "Physical science is that department of knowledge which relates to the order of nature, or, in other words, to the regular succession of events." Physics is one of the most fundamental scientific disciplines, with its main goal being to understand how the universe behaves. "Physics is one of the most fundamental of the sciences. Scientists of all disciplines use the ideas of physics, including chemists who study the structure of molecules, paleontologists who try to reconstruct how dinosaurs walked, and climatologists who study how human activities affect the atmosphere and oceans. Physics is also the foundation of all engineering and technology. No engineer could design a flat-screen TV, an interplanetary spacecraft, or even a better mousetrap without first understanding the basic laws of physic ...
Compton Wavelength
The Compton wavelength is a quantum mechanical property of a particle. The Compton wavelength of a particle is equal to the wavelength of a photon whose energy is the same as the rest energy of that particle (see mass–energy equivalence). It was introduced by Arthur Compton in 1923 in his explanation of the scattering of photons by electrons (a process known as Compton scattering). The standard Compton wavelength of a particle is given by \lambda = \frac{h}{mc}, while its frequency is given by f = \frac{mc^2}{h}, where h is the Planck constant, m is the particle's proper mass, and c is the speed of light. The significance of this formula is shown in the derivation of the Compton shift formula. It is equivalent to the de Broglie wavelength with v = \frac{c}{\sqrt{2}}. The CODATA 2018 value for the Compton wavelength of the electron is 2.426 310 23867(73) \times 10^{-12} m. Other particles have different Compton wavelengths.

Reduced Compton wavelength

The reduced Compton wavelength (barred lambda) is defined as the Compton wavelength divided by 2\pi ...
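As a numerical check of \lambda = h/(mc), a short sketch computing the electron's Compton wavelength from CODATA 2018 constants:

```python
# Electron Compton wavelength: lambda = h / (m c)
H = 6.62607015e-34       # Planck constant, J s (exact in the 2019 SI)
M_E = 9.1093837015e-31   # electron mass, kg (CODATA 2018)
C = 299792458.0          # speed of light, m/s (exact)
PI = 3.141592653589793

lam = H / (M_E * C)
lam_reduced = lam / (2 * PI)  # reduced form, hbar / (m c)
print(f"Compton wavelength:         {lam:.6e} m")
print(f"Reduced Compton wavelength: {lam_reduced:.6e} m")
```

The first printed value reproduces the CODATA figure of about 2.4263 \times 10^{-12} m quoted above.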
Fine-structure Constant
In physics, the fine-structure constant, also known as the Sommerfeld constant, commonly denoted by \alpha (the Greek letter ''alpha''), is a fundamental physical constant which quantifies the strength of the electromagnetic interaction between elementary charged particles. It is a dimensionless quantity, independent of the system of units used, which is related to the strength of the coupling of an elementary charge ''e'' with the electromagnetic field, by the formula \alpha = \frac{e^2}{4\pi\varepsilon_0 \hbar c}. Its numerical value is approximately 1/137.036, with a relative uncertainty of about 1.5 \times 10^{-10}. The constant was named by Arnold Sommerfeld, who introduced it in 1916 (''"rund 7 \times 10^{-3}"'', about 1/137) when extending the Bohr model of the atom. The constant quantified the gap in the fine structure of the spectral lines of the hydrogen atom, which had been measured precisely by Michelson and Morley in 1887.

Definition

In terms of other fundamental physical constants, \alpha may be defined as: \alpha = \frac{e^2}{4\pi\varepsilon_0 \hbar c} = \frac{e^2}{2\varepsilon_0 h c}, where e is the elementary char ...
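A quick numerical check of the definition above, using CODATA 2018 values (the elementary charge, Planck constant, and speed of light are exact in the 2019 SI; the vacuum permittivity is measured):

```python
import math

E = 1.602176634e-19      # elementary charge, C (exact)
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m (CODATA 2018)
HBAR = 1.054571817e-34   # reduced Planck constant, J s
C = 299792458.0          # speed of light, m/s (exact)

# alpha = e^2 / (4 pi eps0 hbar c) -- dimensionless
alpha = E**2 / (4 * math.pi * EPS0 * HBAR * C)
print(f"alpha   = {alpha:.10f}")
print(f"1/alpha = {1 / alpha:.6f}")
```

The inverse comes out at approximately 137.036, matching the value quoted in the text; being dimensionless, the same number results in any consistent unit system.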
Classical Electron Radius
The classical electron radius is a combination of fundamental physical quantities that define a length scale for problems involving an electron interacting with electromagnetic radiation. It links the classical electrostatic self-interaction energy of a homogeneous charge distribution to the electron's relativistic mass–energy. According to modern understanding, the electron is a point particle with a point charge and no spatial extent. Nevertheless, it is useful to define a length that characterizes electron interactions in atomic-scale problems. The classical electron radius is given as :r_\text{e} = \frac{1}{4\pi\varepsilon_0}\frac{e^2}{m_\text{e} c^2} = 2.817 940 3227(19) \times 10^{-15} \text{ m} = 2.817 940 3227(19) \text{ fm}, where e is the elementary charge, m_\text{e} is the electron mass, c is the speed of light, and \varepsilon_0 is the permittivity of free space. This numerical value is several times larger than the radius of the proton. In cgs units, the permittivity factor and \frac{1}{4\pi} do not enter, but the classical electron ...
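The formula can be verified numerically, together with the standard relation r_\text{e} = \alpha \, \bar\lambda: the classical radius is the electron's reduced Compton wavelength scaled down by the fine-structure constant.

```python
import math

E = 1.602176634e-19      # elementary charge, C
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m
M_E = 9.1093837015e-31   # electron mass, kg
HBAR = 1.054571817e-34   # reduced Planck constant, J s
C = 299792458.0          # speed of light, m/s

# r_e = (1 / 4 pi eps0) * e^2 / (m_e c^2)
r_e = E**2 / (4 * math.pi * EPS0 * M_E * C**2)
print(f"r_e = {r_e:.6e} m")  # ~2.8 fm, several times the proton radius

# Cross-check: r_e = alpha * (reduced Compton wavelength)
alpha = E**2 / (4 * math.pi * EPS0 * HBAR * C)
lambda_bar = HBAR / (M_E * C)
print(f"alpha * lambda_bar = {alpha * lambda_bar:.6e} m")
```

Both lines print the same length, making explicit how the three characteristic electron length scales (Bohr radius, Compton wavelength, classical radius) are related by successive powers of \alpha.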
Inertia
Inertia is the idea that an object will continue its current motion until some force causes its speed or direction to change. The term is properly understood as shorthand for "the principle of inertia" as described by Newton in his first law of motion. In that law, the word "perseveres" is a direct translation from Newton's Latin. Other, less forceful terms such as "to continue" or "to remain" are commonly found in modern textbooks. The modern use follows from some changes in Newton's original mechanics (as stated in the ''Principia'') made by Euler, d'Alembert, and other Cartesians. The term inertia comes from the Latin word ''iners'', meaning idle, sluggish. The term inertia may also refer to the resistance of any physical object to a change in its velocity. This includes changes to the object's speed or direction of motion. An aspect of this property is the tendency of objects to keep moving in a straight li ...
Electromagnetic Mass
Electromagnetic mass was initially a concept of classical mechanics, denoting how much the electromagnetic field, or the self-energy, contributes to the mass of charged particles. It was first derived by J. J. Thomson in 1881 and was for some time also considered as a dynamical explanation of inertial mass ''per se''. Today, the relation of mass, momentum, velocity, and all forms of energy – including electromagnetic energy – is analyzed on the basis of Albert Einstein's special relativity and mass–energy equivalence. As to the cause of mass of elementary particles, the Higgs mechanism in the framework of the relativistic Standard Model is currently used. However, some problems concerning the electromagnetic mass and self-energy of charged particles are still studied.

Charged particles

Rest mass and energy

It was recognized by J. J. Thomson in 1881 that a charged sphere moving in a space filled with a medium of a specific inductive capacity (the electromagnetic a ...
Elementary Particle
In particle physics, an elementary particle or fundamental particle is a subatomic particle that is not composed of other particles. Particles currently thought to be elementary include electrons, the fundamental fermions (quarks, leptons, antiquarks, and antileptons, which generally are matter particles and antimatter particles), as well as the fundamental bosons (gauge bosons and the Higgs boson), which generally are force particles that mediate interactions among fermions. A particle containing two or more elementary particles is a composite particle. Ordinary matter is composed of atoms, once presumed to be elementary particles – ''atomos'' meaning "unable to be cut" in Greek – although the atom's existence remained controversial until about 1905, as some leading physicists regarded molecules as mathematical illusions, and matter as ultimately composed of energy. Subatomic constituents of the atom were first identified in the early 1930s; the electron and the proto ...
Classical Electrodynamics
Classical electromagnetism or classical electrodynamics is a branch of theoretical physics that studies the interactions between electric charges and currents using an extension of the classical Newtonian model; it is, therefore, a classical field theory. The theory provides a description of electromagnetic phenomena whenever the relevant length scales and field strengths are large enough that quantum mechanical effects are negligible. For small distances and low field strengths, such interactions are better described by quantum electrodynamics, which is a quantum field theory. Fundamental physical aspects of classical electrodynamics are presented in many texts, such as those by Feynman, Leighton and Sands, Griffiths, Panofsky and Phillips, and Jackson.

History

The physical phenomena that electromagnetism describes have been studied as separate fields since antiquity. For example, there were many advances in the field of optics centuries before light was understood to be ...
Regularization (physics)
In physics, especially quantum field theory, regularization is a method of modifying observables which have singularities in order to make them finite by the introduction of a suitable parameter called the regulator. The regulator, also known as a "cutoff", models our lack of knowledge about physics at unobserved scales (e.g. scales of small size or large energy levels). It compensates for (and requires) the possibility that "new physics" may be discovered at those scales which the present theory is unable to model, while enabling the current theory to give accurate predictions as an "effective theory" within its intended scale of use. It is distinct from renormalization, another technique to control infinities without assuming new physics, by adjusting for self-interaction feedback. Regularization was for many decades controversial even amongst its inventors, as it combines physical and epistemological claims into the same equations. However, it is now well understood and ...
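A minimal numerical sketch of the idea: the toy integral \int_0^\Lambda \frac{dk}{k+m} diverges logarithmically as the cutoff \Lambda \to \infty, but the difference between the same regulated quantity at two masses stays finite and cutoff-independent, approaching \ln(m_2/m_1). This is the pattern renormalization exploits: divergent regulated quantities, finite physical differences.

```python
import math

def regulated_integral(mass, cutoff):
    """Toy logarithmically divergent integral with a hard cutoff:
    I(m, L) = integral_0^L dk / (k + m) = ln((L + m) / m)."""
    return math.log((cutoff + mass) / mass)

for cutoff in (1e3, 1e6, 1e9):
    i1 = regulated_integral(1.0, cutoff)
    i2 = regulated_integral(2.0, cutoff)
    # Each integral grows without bound, but the difference converges
    print(f"L = {cutoff:9.0e}:  I(1) = {i1:7.3f}   I(1) - I(2) = {i1 - i2:.6f}")
```

As the cutoff is raised the individual integrals keep growing, while the difference settles to ln 2 ≈ 0.693, independent of the regulator.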