In information theory and statistics, negentropy is used as a measure of distance to normality. The concept and phrase "negative entropy" was introduced by Erwin Schrödinger in his 1944 popular-science book ''What is Life?'' Later, Léon Brillouin shortened the phrase to ''negentropy''. In 1974, Albert Szent-Györgyi proposed replacing the term ''negentropy'' with ''syntropy''. That term may have originated in the 1940s with the Italian mathematician Luigi Fantappiè, who tried to construct a unified theory of biology and physics. Buckminster Fuller tried to popularize this usage, but ''negentropy'' remains common.
In a note to ''What is Life?'' Schrödinger explained his use of this phrase.
Information theory
In information theory and statistics, negentropy is used as a measure of distance to normality. Out of all distributions with a given mean and variance, the normal or Gaussian distribution is the one with the highest entropy. Negentropy measures the difference in entropy between a given distribution and the Gaussian distribution with the same mean and variance. Thus, negentropy is always nonnegative, is invariant under any invertible linear change of coordinates, and vanishes if and only if the signal is Gaussian.
Negentropy is defined as
: J(p_x) = S(\varphi_x) - S(p_x)
where S(\varphi_x) is the differential entropy of the Gaussian density with the same mean and variance as p_x, and S(p_x) is the differential entropy of p_x:
: S(p_x) = -\int p_x(u) \log p_x(u)\,du
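For intuition, the definition can be evaluated numerically on sampled data. The sketch below is illustrative only (it is not part of the source text and assumes SciPy's nonparametric estimator ''scipy.stats.differential_entropy'' for S(p_x)); the Gaussian term is computed in closed form as \tfrac{1}{2}\log(2\pi e \sigma^2).

```python
# A minimal sketch (not from the article): estimate J(p_x) = S(phi_x) - S(p_x)
# from a one-dimensional sample. Assumes SciPy >= 1.6 for the nonparametric
# differential-entropy estimator; the Gaussian term uses the closed form
# 0.5 * ln(2*pi*e*sigma^2).
import numpy as np
from scipy.stats import differential_entropy

def negentropy(samples):
    var = np.var(samples, ddof=1)
    s_gauss = 0.5 * np.log(2 * np.pi * np.e * var)  # entropy of the matching Gaussian
    s_p = differential_entropy(samples)             # estimate of S(p_x), natural log
    return s_gauss - s_p

rng = np.random.default_rng(0)
print(negentropy(rng.normal(size=100_000)))   # close to 0 for Gaussian data
print(negentropy(rng.uniform(size=100_000)))  # about 0.18 for non-Gaussian data
```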
Negentropy is used in statistics and signal processing. It is related to network entropy, which is used in independent component analysis.
The negentropy of a distribution is equal to the Kullback–Leibler divergence between p_x and a Gaussian distribution with the same mean and variance as p_x (see ''Differential entropy'' for a proof; a short sketch is also given below). In particular, it is always nonnegative.
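The identity can be sketched directly from the definitions above (using the notation introduced for J(p_x)): because \log \varphi_x(u) is a quadratic polynomial in u, its expectation depends only on the mean and variance, which p_x and \varphi_x share, so
: \int p_x(u) \log \varphi_x(u)\,du = \int \varphi_x(u) \log \varphi_x(u)\,du = -S(\varphi_x),
and therefore
: D_{\text{KL}}(p_x \parallel \varphi_x) = -S(p_x) - \int p_x(u) \log \varphi_x(u)\,du = S(\varphi_x) - S(p_x) \ge 0.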
Correlation between statistical negentropy and Gibbs' free energy
There is a physical quantity closely linked to free energy (free enthalpy), with a unit of entropy and isomorphic to the negentropy known in statistics and information theory. In 1873, Willard Gibbs created a diagram illustrating the concept of free energy corresponding to free enthalpy. On the diagram one can see the quantity called the capacity for entropy. This quantity is the amount by which the entropy of a system may be increased without changing its internal energy or increasing its volume; in other words, it is the difference between the maximum possible entropy, under the assumed conditions, and the actual entropy. It corresponds exactly to the definition of negentropy adopted in statistics and information theory. A similar physical quantity was introduced in 1869 by
Massieu for the isothermal process (the two quantities differ just by a sign) and then by Planck for the isothermal-isobaric process. More recently, the Massieu–Planck thermodynamic potential, known also as ''free entropy'', has been shown to play a great role in the so-called entropic formulation of statistical mechanics, applied among others in molecular biology and in thermodynamic non-equilibrium processes.
:: J = S_\max - S = -\Phi = -k \ln Z
::where:
:: S is entropy
:: J is negentropy (Gibbs' "capacity for entropy")
:: \Phi is the Massieu potential
:: Z is the partition function
:: k is the Boltzmann constant
In particular, mathematically the negentropy (the negative entropy function, in physics interpreted as free entropy) is the convex conjugate of LogSumExp (in physics interpreted as the free energy).
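For a discrete distribution the conjugate pair can be written out explicitly (a standard illustration, not drawn from the sources above): taking the negative entropy f(p) = \sum_i p_i \log p_i on the probability simplex, its convex conjugate is
: f^*(x) = \sup_{p} \left( \sum_i x_i p_i - \sum_i p_i \log p_i \right) = \log \sum_i e^{x_i},
the LogSumExp function, which in the physical reading plays the role of the (dimensionless) free energy \log Z.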
Brillouin's negentropy principle of information
In 1953, Léon Brillouin derived a general equation stating that changing an information bit value requires at least kT \ln 2 of energy. This is the same energy as the work produced by Leó Szilárd's engine in the idealized case. In his book ''Science and Information Theory'' (Dover, 1956), he further explored this problem, concluding that any cause of this bit value change (measurement, decision about a yes/no question, erasure, display, etc.) will require the same amount of energy.
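As a rough numerical illustration (not taken from Brillouin's book; room temperature T = 300 K is assumed), the bound kT \ln 2 comes to roughly 2.9 × 10^{-21} J:

```python
# A rough numerical illustration (not from Brillouin's book): the minimum energy
# k*T*ln(2) needed to change one bit value, assuming room temperature T = 300 K.
import math
from scipy.constants import k  # Boltzmann constant in J/K (CODATA)

T = 300.0                       # kelvin (assumed)
e_min = k * T * math.log(2)     # lower bound on the energy per bit operation
print(f"{e_min:.2e} J")         # ~2.87e-21 J, roughly 0.018 eV
```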
See also
* Exergy
* Free entropy
* Entropy in thermodynamics and information theory
Notes