In quantum information theory, strong subadditivity of quantum entropy (SSA) is the relation among the
von Neumann entropies of various quantum subsystems of a larger quantum system consisting of three subsystems (or of one quantum system with three degrees of freedom). It is a basic theorem in modern
quantum information theory. It was conjectured by
D. W. Robinson and
D. Ruelle in 1966 and
O. E. Lanford III and D. W. Robinson in 1968 and proved in 1973 by
E. H. Lieb and
M. B. Ruskai, building on results obtained by Lieb in his proof of the Wigner–Yanase–Dyson conjecture.
The classical version of SSA was long known and appreciated in classical probability theory and information theory. The proof of this relation in the classical case is quite easy, but the quantum case is difficult because of the non-commutativity of the
reduced density matrices describing the quantum subsystems.
Some useful references here include:
*''Quantum Computation and Quantum Information''
*''Quantum Entropy and Its Use''
*''Trace Inequalities and Quantum Entropy: An Introductory Course'' [E. Carlen, Trace Inequalities and Quantum Entropy: An Introductory Course, Contemp. Math. 529 (2009).]
Definitions
We use the following notation throughout: a Hilbert space is denoted by <math>\mathcal{H}</math>, and <math>\mathcal{B}(\mathcal{H})</math> denotes the bounded linear operators on <math>\mathcal{H}</math>. Tensor products are denoted by superscripts, e.g., <math>\mathcal{H}^{12} = \mathcal{H}^{1} \otimes \mathcal{H}^{2}</math>. The trace is denoted by <math>\operatorname{Tr}</math>.
Density matrix
A density matrix <math>\rho</math> is a Hermitian, positive semi-definite matrix of trace one. It allows for the description of a quantum system in a mixed state. Density matrices on a tensor product are denoted by superscripts, e.g., <math>\rho^{12}</math> is a density matrix on <math>\mathcal{H}^{12}</math>.
Entropy
The von Neumann quantum entropy of a density matrix <math>\rho</math> is
:<math>S(\rho) := -\operatorname{Tr}(\rho \log \rho).</math>
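Since the entropy depends only on the eigenvalues of <math>\rho</math>, it is straightforward to evaluate numerically. A minimal sketch in Python (the helper name `entropy` is our own; entropy is in nats):

```python
import numpy as np

def entropy(rho, tol=1e-12):
    # von Neumann entropy S(rho) = -Tr(rho log rho), natural logarithm,
    # computed from the eigenvalues of the Hermitian matrix rho
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > tol]          # convention: 0 log 0 = 0
    return float(-np.sum(ev * np.log(ev)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # a pure state has zero entropy
mixed = np.eye(2) / 2                        # maximally mixed qubit: S = log 2
print(entropy(pure), entropy(mixed))
```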
Relative entropy
Umegaki's quantum relative entropy of two density matrices <math>\rho</math> and <math>\sigma</math> is
:<math>S(\rho \parallel \sigma) := \operatorname{Tr}(\rho \log \rho - \rho \log \sigma).</math>
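For full-rank states the relative entropy can be computed from eigendecompositions; by Klein's inequality it is non-negative and vanishes when the two states coincide. A sketch (the helper names `mlog` and `rel_entropy` are our own):

```python
import numpy as np

def mlog(a):
    # matrix logarithm of a positive-definite Hermitian matrix
    w, v = np.linalg.eigh(a)
    return (v * np.log(w)) @ v.conj().T

def rel_entropy(rho, sigma):
    # Umegaki relative entropy S(rho || sigma) = Tr[rho (log rho - log sigma)]
    return float(np.trace(rho @ (mlog(rho) - mlog(sigma))).real)

rho = np.array([[0.75, 0.25], [0.25, 0.25]])
sigma = np.eye(2) / 2
print(rel_entropy(rho, sigma))    # strictly positive since rho != sigma
print(rel_entropy(sigma, sigma))  # zero
```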
Joint concavity
A function <math>f</math> of two variables is said to be jointly concave if for any <math>0 \le \lambda \le 1</math> the following holds:
:<math>f\big(\lambda A_1 + (1-\lambda)A_2,\ \lambda B_1 + (1-\lambda)B_2\big) \ge \lambda f(A_1, B_1) + (1-\lambda)f(A_2, B_2).</math>
Subadditivity of entropy
Ordinary subadditivity concerns only two spaces <math>\mathcal{H}^{12}</math> and a density matrix <math>\rho^{12}</math>. It states that
:<math>S(\rho^{12}) \le S(\rho^{1}) + S(\rho^{2}).</math>
This inequality is true, of course, in classical probability theory, but the latter also contains the theorem that the conditional entropies <math>S(\rho^{12}) - S(\rho^{1})</math> and <math>S(\rho^{12}) - S(\rho^{2})</math> are both non-negative. In the quantum case, however, both can be negative, e.g. <math>S(\rho^{12})</math> can be zero while <math>S(\rho^{1}) = S(\rho^{2}) > 0</math>. Nevertheless, the subadditivity upper bound on <math>S(\rho^{12})</math> continues to hold. The closest thing one has to <math>S(\rho^{12}) - S(\rho^{1}) \ge 0</math> is the Araki–Lieb triangle inequality
:<math>S(\rho^{12}) \ge \left|S(\rho^{1}) - S(\rho^{2})\right|,</math>
which is derived from subadditivity by a mathematical technique known as purification.
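Both inequalities can be checked numerically. A sketch (helper names ours) using the Bell state, for which the joint entropy is zero while both marginals are maximally mixed:

```python
import numpy as np

def entropy(rho, tol=1e-12):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > tol]
    return float(-np.sum(ev * np.log(ev)))

# Bell state (|00> + |11>)/sqrt(2): joint state pure, marginals maximally mixed
psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho12 = np.outer(psi, psi.conj())
t = rho12.reshape(2, 2, 2, 2)        # indices (i1, i2, j1, j2)
rho1 = np.einsum('abcb->ac', t)      # partial trace over system 2
rho2 = np.einsum('abac->bc', t)      # partial trace over system 1

s12, s1, s2 = entropy(rho12), entropy(rho1), entropy(rho2)
assert s12 <= s1 + s2 + 1e-9         # subadditivity
assert s12 >= abs(s1 - s2) - 1e-9    # Araki-Lieb triangle inequality
print(s12, s1, s2)                   # ~0, log 2, log 2
```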
Strong subadditivity (SSA)
Suppose that the Hilbert space of the system is a tensor product of three spaces: <math>\mathcal{H} = \mathcal{H}^{1} \otimes \mathcal{H}^{2} \otimes \mathcal{H}^{3}</math>. Physically, these three spaces can be interpreted as the spaces of three different systems, or else as three parts or three degrees of freedom of one physical system.
Given a density matrix <math>\rho^{123}</math> on <math>\mathcal{H}^{123}</math>, we define a density matrix <math>\rho^{12}</math> on <math>\mathcal{H}^{12}</math> as a partial trace: <math>\rho^{12} = \operatorname{Tr}_{\mathcal{H}^{3}}\,\rho^{123}</math>. Similarly, we can define the density matrices <math>\rho^{23}</math>, <math>\rho^{13}</math>, <math>\rho^{1}</math>, <math>\rho^{2}</math> and <math>\rho^{3}</math>.
Statement
For any tri-partite state <math>\rho^{123}</math> the following holds:
:<math>S(\rho^{123}) + S(\rho^{2}) \le S(\rho^{12}) + S(\rho^{23}),</math>
where <math>S(\rho^{12}) = -\operatorname{Tr}\big(\rho^{12}\log\rho^{12}\big)</math>, for example.
Equivalently, the statement can be recast in terms of conditional entropies to show that for a tripartite state <math>\rho^{ABC}</math>,
:<math>S(A\mid BC) \le S(A\mid B).</math>
This can also be restated in terms of quantum mutual information,
:<math>I(A:BC) \ge I(A:B).</math>
These statements run parallel to classical intuition, except that quantum conditional entropies can be negative, and quantum mutual informations can exceed the classical bound of the marginal entropy.
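SSA can be spot-checked numerically on random states. A sketch for three qubits (helper names ours):

```python
import numpy as np

rng = np.random.default_rng(0)

def entropy(rho, tol=1e-12):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > tol]
    return float(-np.sum(ev * np.log(ev)))

# random full-rank density matrix on three qubits (dimension 8)
g = rng.normal(size=(8, 8)) + 1j * rng.normal(size=(8, 8))
rho123 = g @ g.conj().T
rho123 /= np.trace(rho123).real

t = rho123.reshape(2, 2, 2, 2, 2, 2)                 # indices (i1,i2,i3,j1,j2,j3)
rho12 = np.einsum('abcdec->abde', t).reshape(4, 4)   # trace out system 3
rho23 = np.einsum('abcade->bcde', t).reshape(4, 4)   # trace out system 1
rho2 = np.einsum('abcadc->bd', t)                    # trace out systems 1 and 3

ssa_gap = entropy(rho12) + entropy(rho23) - entropy(rho123) - entropy(rho2)
assert ssa_gap >= -1e-9    # strong subadditivity: the gap is non-negative
print(ssa_gap)
```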
The strong subadditivity inequality was improved by Carlen and Lieb, who found an explicit error term with the optimal constant <math>2</math>; see the Carlen–Lieb extension below.
J. Kiefer
proved a peripherally related convexity result in 1959, which is a corollary of an operator Schwarz inequality proved by E. H. Lieb and M. B. Ruskai.
However, these results are comparatively simple, and the proofs do not use the results of Lieb's 1973 paper on convex and concave trace functionals.
It was this paper that provided the mathematical basis of the proof of SSA by Lieb and Ruskai. The extension from a Hilbert space setting to a von Neumann algebra setting, where states are not given by density matrices, was done by
Narnhofer and Thirring.
The theorem can also be obtained by proving numerous equivalent statements, some of which are summarized below.
Wigner–Yanase–Dyson conjecture
E. P. Wigner and M. M. Yanase
proposed a different definition of entropy, which was generalized by
Freeman Dyson.
The Wigner–Yanase–Dyson ''p''-skew information
The Wigner–Yanase–Dyson <math>p</math>-skew information of a density matrix <math>\rho</math> with respect to an operator <math>K</math> is
:<math>I_p(\rho, K) = -\frac{1}{2}\operatorname{Tr}\big([\rho^{p}, K][\rho^{1-p}, K^{*}]\big),</math>
where <math>[A, B] = AB - BA</math> is a commutator, <math>K^{*}</math> is the adjoint of <math>K</math> and <math>0 \le p \le 1</math> is fixed.
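As a numerical sketch (helper names ours): for a Hermitian <math>K</math> the skew information is non-negative, and it vanishes when <math>K</math> commutes with <math>\rho</math>:

```python
import numpy as np

rng = np.random.default_rng(1)

def mpow(rho, p):
    # matrix power of a positive semi-definite Hermitian matrix
    w, v = np.linalg.eigh(rho)
    w = np.clip(w, 0.0, None)
    return (v * w**p) @ v.conj().T

def skew_info(rho, K, p):
    # Wigner-Yanase-Dyson p-skew information
    # I_p(rho, K) = -1/2 Tr([rho^p, K] [rho^(1-p), K*])
    c1 = mpow(rho, p) @ K - K @ mpow(rho, p)
    Ka = K.conj().T
    c2 = mpow(rho, 1 - p) @ Ka - Ka @ mpow(rho, 1 - p)
    return float(-0.5 * np.trace(c1 @ c2).real)

g = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
rho = g @ g.conj().T
rho /= np.trace(rho).real

h = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
K = (h + h.conj().T) / 2                  # Hermitian observable

print(skew_info(rho, K, 0.3))             # non-negative
print(skew_info(rho, mpow(rho, 2), 0.3))  # zero: rho^2 commutes with rho
```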
Concavity of ''p''-skew information
It was conjectured by E. P. Wigner and M. M. Yanase that the <math>p</math>-skew information is convex as a function of a density matrix <math>\rho</math> for a fixed <math>K</math>. Since the term <math>\tfrac{1}{2}\operatorname{Tr}\big(\rho(KK^{*} + K^{*}K)\big)</math> appearing in <math>I_p(\rho, K)</math> is linear in <math>\rho</math>, the conjecture reduces to the problem of concavity of <math>\operatorname{Tr}\big(K^{*}\rho^{p}K\rho^{1-p}\big)</math>. This conjecture (for all <math>0 \le p \le 1</math>) implies SSA; it was proved for <math>p = \tfrac{1}{2}</math> by Wigner and Yanase, and for all <math>0 \le p \le 1</math> by Lieb in the following more general form: the function of two matrix variables
:<math>A, B \mapsto \operatorname{Tr}\big(K^{*}A^{p}KB^{q}\big)</math>
is jointly concave in <math>A</math> and <math>B</math> when <math>0 \le p + q \le 1</math> and <math>p, q \ge 0</math>.
This theorem is an essential part of the proof of SSA by Lieb and Ruskai.
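Lieb's joint concavity can be spot-checked numerically in its midpoint form on random positive definite matrices; a sketch (helper names ours):

```python
import numpy as np

rng = np.random.default_rng(2)

def mpow(a, p):
    # matrix power of a positive-definite Hermitian matrix
    w, v = np.linalg.eigh(a)
    return (v * w**p) @ v.conj().T

def lieb_f(K, A, B, p, q):
    # Lieb's functional f(A, B) = Tr(K* A^p K B^q), real for A, B >= 0
    return float(np.trace(K.conj().T @ mpow(A, p) @ K @ mpow(B, q)).real)

def rand_pd(d):
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    return g @ g.conj().T + 0.1 * np.eye(d)   # positive definite

d, p, q = 4, 0.3, 0.5                          # p + q <= 1
K = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
A1, A2, B1, B2 = rand_pd(d), rand_pd(d), rand_pd(d), rand_pd(d)

mid = lieb_f(K, (A1 + A2) / 2, (B1 + B2) / 2, p, q)
avg = (lieb_f(K, A1, B1, p, q) + lieb_f(K, A2, B2, p, q)) / 2
assert mid >= avg - 1e-9      # midpoint form of joint concavity
print(mid, avg)
```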
In their paper, E. P. Wigner and M. M. Yanase also conjectured the subadditivity of the <math>p</math>-skew information for <math>p = \tfrac{1}{2}</math>, which was disproved by Hansen by giving a counterexample.
First two statements equivalent to SSA
It was pointed out that the first statement below is equivalent to SSA, and A. Uhlmann in
[A. Uhlmann, Endlich Dimensionale Dichtematrizen, II, Wiss. Z. Karl-Marx-University Leipzig 22 Jg. H. 2., 139 (1973).] showed the equivalence between the second statement below and SSA.
* <math>S(\rho^{1}) + S(\rho^{3}) \le S(\rho^{12}) + S(\rho^{23})</math>. Note that the conditional entropies <math>S(\rho^{12}) - S(\rho^{1})</math> and <math>S(\rho^{23}) - S(\rho^{3})</math> do not have to be both non-negative.
* The map <math>\rho^{12} \mapsto S(\rho^{1}) - S(\rho^{12})</math> is convex.
Both of these statements were proved directly by Lieb and Ruskai.
Joint convexity of relative entropy
As noted by Lindblad and Uhlmann, if, in the joint concavity of <math>(A, B) \mapsto \operatorname{Tr}\big(K^{*}A^{p}KB^{1-p}\big)</math>, one takes <math>K = 1</math>, <math>A = \rho</math> and <math>B = \sigma</math> and differentiates in <math>p</math> at <math>p = 1</math>, one obtains the joint convexity of relative entropy: i.e., if <math>\rho = \sum_k \lambda_k \rho_k</math> and <math>\sigma = \sum_k \lambda_k \sigma_k</math>, then
:<math>S(\rho \parallel \sigma) \le \sum_k \lambda_k S(\rho_k \parallel \sigma_k),</math>
where <math>\lambda_k \ge 0</math> with <math>\sum_k \lambda_k = 1</math>.
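A numerical spot-check of joint convexity on random states (a sketch; helper names ours):

```python
import numpy as np

rng = np.random.default_rng(3)

def mlog(a):
    w, v = np.linalg.eigh(a)
    return (v * np.log(w)) @ v.conj().T

def rel_entropy(rho, sigma):
    # Umegaki relative entropy S(rho || sigma)
    return float(np.trace(rho @ (mlog(rho) - mlog(sigma))).real)

def rand_rho(d):
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T
    return m / np.trace(m).real

lam = [0.2, 0.5, 0.3]                         # convex weights
rhos = [rand_rho(3) for _ in lam]
sigmas = [rand_rho(3) for _ in lam]

lhs = rel_entropy(sum(l * r for l, r in zip(lam, rhos)),
                  sum(l * s for l, s in zip(lam, sigmas)))
rhs = sum(l * rel_entropy(r, s) for l, r, s in zip(lam, rhos, sigmas))
assert lhs <= rhs + 1e-9      # joint convexity of relative entropy
print(lhs, rhs)
```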
Monotonicity of quantum relative entropy
The relative entropy decreases monotonically under completely positive trace preserving (CPTP) operations <math>\mathcal{N}</math> on density matrices:
:<math>S\big(\mathcal{N}(\rho) \parallel \mathcal{N}(\sigma)\big) \le S(\rho \parallel \sigma).</math>
This inequality is called monotonicity of quantum relative entropy. Owing to the Stinespring factorization theorem, this inequality is a consequence of a particular choice of the CPTP map: the partial trace map described below.
The most important and basic class of CPTP maps is the partial trace operation <math>T : \mathcal{B}(\mathcal{H}^{12}) \to \mathcal{B}(\mathcal{H}^{1})</math>, given by <math>T = 1_{\mathcal{H}^{1}} \otimes \operatorname{Tr}_{\mathcal{H}^{2}}</math>. Then
:<math>S(\rho^{1} \parallel \sigma^{1}) \le S(\rho^{12} \parallel \sigma^{12}),</math>
which is called monotonicity of quantum relative entropy under partial trace.
To see how this follows from the joint convexity of relative entropy, observe that the partial trace over <math>\mathcal{H}^{2}</math> can be written in Uhlmann's representation as
:<math>\rho^{1} \otimes \frac{1}{d_2}\,1_{\mathcal{H}^{2}} = \frac{1}{N}\sum_{j=1}^{N} U_j\,\rho^{12}\,U_j^{*}</math>
for some finite <math>N</math> and some collection of unitary matrices on <math>\mathcal{H}^{2}</math>, where <math>d_2 = \dim\mathcal{H}^{2}</math> (alternatively, integrate over the Haar measure). Since the trace (and hence the relative entropy) is unitarily invariant, the monotonicity under partial trace now follows from the joint convexity of relative entropy. This theorem is due to Lindblad and Uhlmann, whose proof is the one given here.
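Monotonicity under partial trace can likewise be checked numerically; a sketch on two qubits (helper names ours):

```python
import numpy as np

rng = np.random.default_rng(4)

def mlog(a):
    w, v = np.linalg.eigh(a)
    return (v * np.log(w)) @ v.conj().T

def rel_entropy(rho, sigma):
    return float(np.trace(rho @ (mlog(rho) - mlog(sigma))).real)

def rand_rho(d):
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T
    return m / np.trace(m).real

def tr2(rho12):
    # partial trace over the second qubit
    return np.einsum('abcb->ac', rho12.reshape(2, 2, 2, 2))

rho12, sigma12 = rand_rho(4), rand_rho(4)
full = rel_entropy(rho12, sigma12)
reduced = rel_entropy(tr2(rho12), tr2(sigma12))
assert reduced <= full + 1e-9     # monotonicity under partial trace
print(reduced, full)
```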
SSA is obtained from the monotonicity under partial trace with <math>\rho^{12}</math> replaced by <math>\rho^{123}</math> and <math>\sigma^{12}</math> replaced by <math>\tfrac{1}{d_1}\,1_{\mathcal{H}^{1}} \otimes \rho^{23}</math>, taking the partial trace over <math>\mathcal{H}^{3}</math>. It then becomes
:<math>S\!\left(\rho^{12} \,\Big\|\, \tfrac{1}{d_1}\,1_{\mathcal{H}^{1}} \otimes \rho^{2}\right) \le S\!\left(\rho^{123} \,\Big\|\, \tfrac{1}{d_1}\,1_{\mathcal{H}^{1}} \otimes \rho^{23}\right).</math>
Evaluating both sides (the <math>\log d_1</math> terms cancel), we get
:<math>S(\rho^{123}) + S(\rho^{2}) \le S(\rho^{12}) + S(\rho^{23}),</math>
which is SSA. Thus, the monotonicity of quantum relative entropy implies SSA.
Relationship among inequalities
All of the above important inequalities are equivalent to each other, and can also be proved directly. The following are equivalent:
* Monotonicity of quantum relative entropy (MONO);
* Monotonicity of quantum relative entropy under partial trace (MPT);
* Strong subadditivity (SSA);
* Joint convexity of quantum relative entropy (JC).
The following implications show the equivalence between these inequalities.
* MONO <math>\Rightarrow</math> MPT: follows since MPT is a particular case of MONO;
* MPT <math>\Rightarrow</math> MONO: was shown by Lindblad, using a representation of stochastic maps as a partial trace over an auxiliary system;
* MPT <math>\Rightarrow</math> SSA: follows by taking a particular choice of tri-partite states in MPT, described in the section above, "Monotonicity of quantum relative entropy";
* SSA <math>\Rightarrow</math> MPT: by choosing <math>\rho^{12}</math> to be block diagonal, one can show that SSA implies that the map <math>\rho^{12} \mapsto S(\rho^{1}) - S(\rho^{12})</math> is convex, and it was observed that this convexity yields MPT;
* MPT <math>\Rightarrow</math> JC: as mentioned above, by choosing <math>\rho^{12}</math> (and similarly <math>\sigma^{12}</math>) to be a block diagonal matrix with blocks <math>\lambda_k \rho_k</math> (and <math>\lambda_k \sigma_k</math>), the partial trace is a sum over blocks, so that <math>\rho^{1} = \sum_k \lambda_k \rho_k</math>, and from MPT one can obtain JC;
* JC <math>\Rightarrow</math> SSA: using the purification technique, Araki and Lieb observed that one could obtain new useful inequalities from the known ones. By purifying <math>\rho^{123}</math> to a pure state <math>\rho^{1234}</math> it can be shown that SSA is equivalent to
:<math>S(\rho^{1}) + S(\rho^{3}) \le S(\rho^{12}) + S(\rho^{23}).</math>
Moreover, if <math>\rho^{123}</math> is pure, then <math>S(\rho^{12}) = S(\rho^{3})</math> and <math>S(\rho^{23}) = S(\rho^{1})</math>, so equality holds in the above inequality. Since the extreme points of the convex set of density matrices are pure states, SSA follows from JC.
See [erratum 46, 019901 (2005)] for a discussion.
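The purification technique used in these arguments is easy to illustrate numerically: any mixed state is the partial trace of a pure state whose two marginals have equal entropy. A sketch (helper names ours):

```python
import numpy as np

rng = np.random.default_rng(5)

def entropy(rho, tol=1e-12):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > tol]
    return float(-np.sum(ev * np.log(ev)))

g = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
rho = g @ g.conj().T
rho /= np.trace(rho).real

# standard purification: |psi> = sum_i sqrt(lambda_i) |u_i> (x) |i>
w, v = np.linalg.eigh(rho)
psi = (v * np.sqrt(np.clip(w, 0, None))).reshape(-1)   # vector on H (x) H
pure = np.outer(psi, psi.conj())

t = pure.reshape(3, 3, 3, 3)
marg1 = np.einsum('abcb->ac', t)      # recovers rho exactly
marg2 = np.einsum('abac->bc', t)

assert np.allclose(marg1, rho)
assert abs(entropy(marg1) - entropy(marg2)) < 1e-8   # equal marginal entropies
```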
The case of equality
Equality in monotonicity of quantum relative entropy inequality
In
[D. Petz, Sufficiency of Channels over von Neumann Algebras, Quart. J. Math. Oxford 35, 475–483 (1986).] D. Petz showed that the only case of equality in the monotonicity relation is to have a proper "recovery" channel:
For all states <math>\rho</math> and <math>\sigma</math> on a Hilbert space <math>\mathcal{H}</math> and all quantum operators <math>T</math>,
:<math>S(T\rho \parallel T\sigma) = S(\rho \parallel \sigma)</math>
if and only if there exists a quantum operator <math>\hat{T}</math> such that
:<math>\hat{T}T\sigma = \sigma</math> and <math>\hat{T}T\rho = \rho.</math>
Moreover, <math>\hat{T}</math> can be given explicitly by the formula
:<math>\hat{T}\omega = \sigma^{\frac{1}{2}}\, T^{*}\!\left((T\sigma)^{-\frac{1}{2}}\,\omega\,(T\sigma)^{-\frac{1}{2}}\right)\sigma^{\frac{1}{2}},</math>
where <math>T^{*}</math> is the adjoint map of <math>T</math>.
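For the partial trace channel <math>T = \operatorname{Tr}_{\mathcal{H}^{2}}</math>, whose adjoint is <math>T^{*}(X) = X \otimes 1</math>, the recovery map can be written down explicitly. A sketch (helper names ours) verifying the defining property <math>\hat{T}T\sigma = \sigma</math>:

```python
import numpy as np

rng = np.random.default_rng(6)

def mpow(a, p):
    # matrix power of a positive-definite Hermitian matrix
    w, v = np.linalg.eigh(a)
    return (v * w**p) @ v.conj().T

def tr2(x):
    # the channel T: partial trace over the second qubit
    return np.einsum('abcb->ac', x.reshape(2, 2, 2, 2))

g = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
sigma = g @ g.conj().T
sigma /= np.trace(sigma).real

Ts = tr2(sigma)

def petz(omega):
    # Petz recovery map for T = tr2 and reference state sigma:
    # omega -> sigma^(1/2) T*((T sigma)^(-1/2) omega (T sigma)^(-1/2)) sigma^(1/2)
    core = mpow(Ts, -0.5) @ omega @ mpow(Ts, -0.5)
    return mpow(sigma, 0.5) @ np.kron(core, np.eye(2)) @ mpow(sigma, 0.5)

recovered = petz(Ts)
assert np.allclose(recovered, sigma)   # the Petz map recovers sigma exactly
```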
D. Petz also gave another condition for when equality holds in the monotonicity of quantum relative entropy: the first statement below. Differentiating it at <math>t = 0</math>, we have the second condition. Moreover, M. B. Ruskai gave another proof of the second statement.
For all states <math>\rho</math> and <math>\sigma</math> on <math>\mathcal{H}</math> and all quantum operators <math>T</math>,
:<math>S(T\rho \parallel T\sigma) = S(\rho \parallel \sigma)</math>
if and only if the following equivalent conditions are satisfied:
* <math>T^{*}\big((T\rho)^{it}(T\sigma)^{-it}\big) = \rho^{it}\sigma^{-it}</math> for all real <math>t</math>;
* <math>\log\rho - \log\sigma = T^{*}\big(\log T\rho - \log T\sigma\big)</math>,
where <math>T^{*}</math> is the adjoint map of <math>T</math>.
Equality in strong subadditivity inequality
P. Hayden, R. Jozsa, D. Petz and
A. Winter described the states for which the equality holds in SSA.
[ P. Hayden, R. Jozsa, D. Petz, A. Winter, Structure of States which Satisfy Strong Subadditivity of Quantum Entropy with Equality, Comm. Math. Phys. 246, 359–374 (2003).]
A state <math>\rho^{ABC}</math> on a Hilbert space <math>\mathcal{H}^{A} \otimes \mathcal{H}^{B} \otimes \mathcal{H}^{C}</math> satisfies strong subadditivity with equality if and only if there is a decomposition of the second system as
:<math>\mathcal{H}^{B} = \bigoplus_{j} \mathcal{H}^{b^{L}_{j}} \otimes \mathcal{H}^{b^{R}_{j}}</math>
into a direct sum of tensor products, such that
:<math>\rho^{ABC} = \bigoplus_{j} q_j\, \rho^{A b^{L}_{j}} \otimes \rho^{b^{R}_{j} C},</math>
with states <math>\rho^{A b^{L}_{j}}</math> on <math>\mathcal{H}^{A} \otimes \mathcal{H}^{b^{L}_{j}}</math> and <math>\rho^{b^{R}_{j} C}</math> on <math>\mathcal{H}^{b^{R}_{j}} \otimes \mathcal{H}^{C}</math>, and a probability distribution <math>\{q_j\}</math>.
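A simple instance of this structure is a product state <math>\rho^{ABC} = \rho^{AB} \otimes \rho^{C}</math> (a one-term decomposition with trivial <math>b^{R}</math>), for which SSA holds with equality; a numerical sketch (helper names ours):

```python
import numpy as np

rng = np.random.default_rng(7)

def entropy(rho, tol=1e-12):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > tol]
    return float(-np.sum(ev * np.log(ev)))

def rand_rho(d):
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T
    return m / np.trace(m).real

# a Markov state of the simplest kind: rho_ABC = rho_AB (x) rho_C
rho_abc = np.kron(rand_rho(4), rand_rho(2))   # qubits A, B, C

t = rho_abc.reshape(2, 2, 2, 2, 2, 2)
s_abc = entropy(rho_abc)
s_ab = entropy(np.einsum('abcdec->abde', t).reshape(4, 4))
s_bc = entropy(np.einsum('abcade->bcde', t).reshape(4, 4))
s_b = entropy(np.einsum('abcadc->bd', t))

gap = s_ab + s_bc - s_abc - s_b
assert abs(gap) < 1e-8        # SSA holds with equality for this state
print(gap)
```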
Carlen-Lieb Extension
E. H. Lieb and E. A. Carlen have found an explicit error term in the SSA inequality.
If <math>S(\rho^{12}) \ge S(\rho^{1})</math> and <math>S(\rho^{23}) \ge S(\rho^{3})</math>, as is always the case for the classical Shannon entropy, this inequality has nothing to say. For the quantum entropy, on the other hand, it is quite possible that the conditional entropies satisfy <math>S(\rho^{12}) < S(\rho^{1})</math> or <math>S(\rho^{23}) < S(\rho^{3})</math> (but never both!). Then, in this "highly quantum" regime, the inequality provides additional information.
The constant 2 is optimal, in the sense that for any constant larger than 2, one can find a state for which the inequality is violated with that constant.
Operator extension of strong subadditivity
In his paper
[I. Kim, Operator Extension of Strong Subadditivity of Entropy, (2012).] I. Kim studied an operator extension of strong subadditivity, proving an operator inequality for a tri-partite state (density matrix) <math>\rho^{123}</math> on <math>\mathcal{H}^{1} \otimes \mathcal{H}^{2} \otimes \mathcal{H}^{3}</math>.
The proof of this inequality is based on Effros's theorem, for which particular functions and operators are chosen to derive the inequality. M. B. Ruskai describes this work in detail in
[M. B. Ruskai, Remarks on Kim’s Strong Subadditivity Matrix Inequality: Extensions and Equality Conditions, (2012).] and discusses how to prove a large class of new matrix inequalities in the tri-partite and bi-partite cases by taking a partial trace over all but one of the spaces.
Extensions of strong subadditivity in terms of recoverability
A significant strengthening of strong subadditivity was proved in 2014 and subsequently improved. In 2017,
it was shown that the recovery channel can be taken to be the original Petz recovery map. These improvements of strong subadditivity have physical interpretations in terms of recoverability, meaning that if the conditional mutual information <math>I(A;B|E)</math> of a tripartite quantum state <math>\rho_{ABE}</math> is nearly equal to zero, then it is possible to perform a recovery channel <math>\mathcal{R}_{E \to AE}</math> (from system <math>E</math> to <math>AE</math>) such that <math>\rho_{ABE} \approx \mathcal{R}_{E \to AE}(\rho_{BE})</math>. These results thus generalize the exact equality conditions mentioned above.
See also
* Von Neumann entropy
* Conditional quantum entropy
* Quantum mutual information
* Kullback–Leibler divergence
References
{{reflist
Quantum mechanical entropy
Quantum mechanics