In physics, the Tsallis entropy is a generalization of the standard
Boltzmann–Gibbs entropy.
Overview
The concept was introduced in 1988 by
Constantino Tsallis as a basis for generalizing the standard
statistical mechanics and is identical in form to
Havrda–Charvát structural α-entropy,
introduced in 1967 within
information theory. In scientific literature, the physical relevance of the Tsallis entropy has been debated.
However, from the years 2000 on, an increasingly wide spectrum of natural, artificial and social
complex systems have been identified which confirm the predictions and consequences that are derived from this nonadditive entropy, such as nonextensive statistical mechanics,
which generalizes the Boltzmann–Gibbs theory.
Among the various experimental verifications and applications presently available in the literature, the following deserve special mention:
# The distribution characterizing the motion of cold atoms in dissipative
optical lattices, predicted in 2003 and observed in 2006.
# The fluctuations of the magnetic field in the
solar wind enabled the calculation of the q-triplet (or Tsallis triplet).
# The velocity distributions in a driven dissipative
dusty plasma.
# Spin glass relaxation.
# Trapped ion interacting with a classical buffer gas.
# High energy collisional experiments at LHC/CERN (CMS, ATLAS and ALICE detectors) and RHIC/Brookhaven (STAR and PHENIX detectors).
Among the various available theoretical results which clarify the physical conditions under which Tsallis entropy and associated statistics apply, the following can be highlighted:
# Anomalous diffusion.
# Uniqueness theorem.
# Sensitivity to initial conditions and entropy production at the edge of chaos.
# Probability sets that make the nonadditive Tsallis entropy extensive in the thermodynamical sense.
# Strongly quantum entangled systems and thermodynamics.
# Thermostatistics of overdamped motion of interacting particles.
# Nonlinear generalizations of the Schrödinger, Klein–Gordon and Dirac equations.
# Black-hole entropy calculation.
For further details, a bibliography is available at http://tsallis.cat.cbpf.br/biblio.htm
Given a discrete set of probabilities \{p_i\} with the condition \sum_i p_i = 1, and q any real number, the Tsallis entropy is defined as
: S_q(p) = \frac{k}{q-1}\left(1 - \sum_i p_i^q\right),
where q is a real parameter sometimes called the ''entropic-index'' and k a positive constant.
In the limit as q \to 1, the usual Boltzmann–Gibbs entropy is recovered, namely
: S_{BG} = -k \sum_i p_i \ln p_i,
where one identifies k with the Boltzmann constant k_B.
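As a minimal numerical sketch (with k = 1 and a hypothetical helper name), the definition above and its q → 1 limit can be checked directly:

```python
import math

def tsallis_entropy(probs, q, k=1.0):
    """S_q = k/(q-1) * (1 - sum_i p_i^q); reduces to Boltzmann-Gibbs as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        # q -> 1 limit: the usual Boltzmann-Gibbs (Shannon) entropy
        return -k * sum(p * math.log(p) for p in probs if p > 0)
    return k / (q - 1.0) * (1.0 - sum(p ** q for p in probs))

p = [0.5, 0.3, 0.2]
s_bg = tsallis_entropy(p, 1.0)            # Boltzmann-Gibbs limit
s_near = tsallis_entropy(p, 1.0 + 1e-6)   # approaches the limit for q near 1
```

Evaluating S_q for q slightly above 1 agrees with the Boltzmann–Gibbs value to several decimal places, illustrating the continuity of the family in q.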
For continuous probability distributions, we define the entropy as
: S_q = \frac{k}{q-1}\left(1 - \int \left(p(x)\right)^q \, dx\right),
where p(x) is a probability density function.
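As an illustrative check (hypothetical helper, simple midpoint Riemann sum, k = 1): for the uniform density p(x) = 1/2 on [0, 2], the integral \int p^q\,dx equals 2^{1-q}, so S_q = (1 - 2^{1-q})/(q-1).

```python
def tsallis_continuous(pdf, a, b, q, k=1.0, n=10000):
    """Continuous Tsallis entropy via a midpoint Riemann sum (illustrative only)."""
    dx = (b - a) / n
    integral = sum(pdf(a + (i + 0.5) * dx) ** q * dx for i in range(n))
    return k / (q - 1.0) * (1.0 - integral)

uniform = lambda x: 0.5   # density of the uniform distribution on [0, 2]
s2 = tsallis_continuous(uniform, 0.0, 2.0, q=2.0)   # analytic value: (1 - 2^{-1})/1 = 0.5
```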
The Tsallis entropy has been used along with the principle of maximum entropy to derive the Tsallis distribution.
Various relationships
The discrete Tsallis entropy satisfies
: S_q = -\lim_{x \to 1} D_q \sum_i p_i^x,
where ''D''''q'' is the q-derivative with respect to ''x''. This may be compared to the standard entropy formula:
: S = -\lim_{x \to 1} \frac{d}{dx} \sum_i p_i^x
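This relationship is exact at x = 1, since the Jackson q-derivative there is D_q f(1) = (f(q) - f(1))/(q - 1). A small sketch (helper names hypothetical, k = 1) verifies it numerically:

```python
def q_derivative(f, x, q):
    """Jackson q-derivative: D_q f(x) = (f(q*x) - f(x)) / ((q - 1) * x)."""
    return (f(q * x) - f(x)) / ((q - 1.0) * x)

p = [0.5, 0.3, 0.2]
q = 2.0
g = lambda x: sum(pi ** x for pi in p)   # f(x) = sum_i p_i^x

s_via_Dq = -q_derivative(g, 1.0, q)                        # -D_q f at x = 1
s_direct = (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)    # definition with k = 1
```

The two values coincide, because -D_q f(1) = (1 - \sum_i p_i^q)/(q-1) term by term.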
Non-additivity
Given two independent systems ''A'' and ''B'', for which the joint probability density satisfies
: p(A, B) = p(A)\, p(B),
the Tsallis entropy of this system satisfies
: S_q(A, B) = S_q(A) + S_q(B) + \frac{(1-q)}{k}\, S_q(A)\, S_q(B).
From this result, it is evident that the parameter |1 - q| is a measure of the departure from additivity. In the limit when ''q'' = 1,
: S(A, B) = S(A) + S(B),
which is what is expected for an additive system. This property is sometimes referred to as "pseudo-additivity".
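Pseudo-additivity can be checked numerically for any pair of independent marginals (a minimal sketch, helper name hypothetical):

```python
def tsallis(probs, q, k=1.0):
    # discrete Tsallis entropy, S_q = k/(q-1) * (1 - sum_i p_i^q)
    return k / (q - 1.0) * (1.0 - sum(p ** q for p in probs))

pa = [0.6, 0.4]        # marginal distribution of system A
pb = [0.7, 0.2, 0.1]   # marginal distribution of system B
q, k = 1.5, 1.0

pab = [a * b for a in pa for b in pb]   # independence: p(A, B) = p(A) p(B)

lhs = tsallis(pab, q, k)
rhs = (tsallis(pa, q, k) + tsallis(pb, q, k)
       + ((1.0 - q) / k) * tsallis(pa, q, k) * tsallis(pb, q, k))
```

The two sides agree to machine precision, since the identity holds exactly for product distributions.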
Exponential families
Many common distributions, like the normal distribution, belong to the statistical exponential families. The Tsallis entropy for an exponential family can be written as
: S_q = \frac{1}{1-q}\left(e^{F(q\theta) - qF(\theta)}\, E_p\!\left[e^{(q-1)k(x)}\right] - 1\right),
where ''F'' is the log-normalizer and ''k'' the term indicating the carrier measure. For the multivariate normal, the term ''k'' is zero, and therefore the Tsallis entropy is in closed form.
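For a univariate normal with variance σ², the Gaussian integral gives \int p(x)^q\,dx = (2\pi\sigma^2)^{(1-q)/2}/\sqrt{q}, so the closed form (with k = 1) is S_q = \left(1 - (2\pi\sigma^2)^{(1-q)/2} q^{-1/2}\right)/(q-1). A sketch comparing this against brute-force numerical integration (helper names hypothetical):

```python
import math

def tsallis_normal_closed(sigma, q):
    # closed form: integral of p^q for a univariate normal is (2*pi*sigma^2)^((1-q)/2) / sqrt(q)
    integral = (2.0 * math.pi * sigma ** 2) ** ((1.0 - q) / 2.0) / math.sqrt(q)
    return (1.0 - integral) / (q - 1.0)

def tsallis_normal_numeric(sigma, q, n=200000, half_width=12.0):
    # midpoint Riemann sum of p^q over [-12*sigma, 12*sigma]; tails are negligible
    dx = 2.0 * half_width * sigma / n
    norm = math.sqrt(2.0 * math.pi * sigma ** 2)
    total = 0.0
    for i in range(n):
        x = -half_width * sigma + (i + 0.5) * dx
        total += (math.exp(-x * x / (2.0 * sigma ** 2)) / norm) ** q * dx
    return (1.0 - total) / (q - 1.0)
```

The closed form and the numerical integral agree to high precision, consistent with the closed-form claim for the (multivariate) normal.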
Generalized entropies
Several interesting physical systems abide by entropic functionals that are more general than the standard Tsallis entropy. Therefore, several physically meaningful generalizations have been introduced. The two most general of these are notably: Superstatistics, introduced by C. Beck and E. G. D. Cohen in 2003,
and Spectral Statistics, introduced by G. A. Tsekouras and
Constantino Tsallis in 2005.
Both these entropic forms have Tsallis and Boltzmann–Gibbs statistics as special cases; Spectral Statistics has been proven to at least contain Superstatistics and it has been conjectured to also cover some additional cases.
See also
*
Rényi entropy
*
Tsallis distribution
External links
Tsallis Statistics, Statistical Mechanics for Non-extensive Systems and Long-Range Interactions