In quantum information theory, quantum mutual information, or von Neumann mutual information, after John von Neumann, is a measure of correlation between subsystems of a quantum state. It is the quantum mechanical analog of Shannon mutual information.


Motivation

For simplicity, it will be assumed that all objects in the article are finite-dimensional.

The definition of quantum mutual information is motivated by the classical case. For a joint probability distribution of two variables ''p''(''x'', ''y''), the two marginal distributions are

:p(x) = \sum_{y} p(x,y), \qquad p(y) = \sum_{x} p(x,y).

The classical mutual information ''I''(''X'':''Y'') is defined by

:I(X:Y) = S(p(x)) + S(p(y)) - S(p(x,y)),

where ''S''(''q'') denotes the Shannon entropy of the probability distribution ''q''. One can calculate directly

:\begin{align}
S(p(x)) + S(p(y)) &= -\left(\sum_x p(x) \log p(x) + \sum_y p(y) \log p(y)\right) \\
&= -\left(\sum_x \left(\sum_{y'} p(x,y') \log \sum_{y'} p(x,y')\right) + \sum_y \left(\sum_{x'} p(x',y) \log \sum_{x'} p(x',y)\right)\right) \\
&= -\sum_{x,y} p(x,y) \left(\log \sum_{y'} p(x,y') + \log \sum_{x'} p(x',y)\right) \\
&= -\sum_{x,y} p(x,y) \log p(x)p(y).
\end{align}

So the mutual information is

:I(X:Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)p(y)},

where the logarithm is taken in base 2 to obtain the mutual information in bits. But this is precisely the relative entropy between ''p''(''x'', ''y'') and ''p''(''x'')''p''(''y''). In other words, if we assume the two variables ''x'' and ''y'' to be uncorrelated, the mutual information is the ''discrepancy in uncertainty'' resulting from this (possibly erroneous) assumption. It follows from the properties of relative entropy that ''I''(''X'':''Y'') ≥ 0, with equality if and only if ''p''(''x'', ''y'') = ''p''(''x'')''p''(''y'').
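
As a concrete illustration (an addition of ours, not from the original text), the following Python sketch computes the classical mutual information of a discrete joint distribution exactly as in the derivation above, with base-2 logarithms so the answer is in bits:

    # Illustrative sketch; the function names are ours.
    import numpy as np

    def shannon_entropy(p):
        """Shannon entropy in bits; terms with p = 0 contribute nothing."""
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def mutual_information(p_xy):
        """I(X:Y) = S(p_X) + S(p_Y) - S(p_XY) for a 2-D joint distribution."""
        p_x = p_xy.sum(axis=1)                  # marginal over y
        p_y = p_xy.sum(axis=0)                  # marginal over x
        return (shannon_entropy(p_x) + shannon_entropy(p_y)
                - shannon_entropy(p_xy.ravel()))

    # Perfectly correlated bits, p(0,0) = p(1,1) = 1/2: I(X:Y) = 1 bit.
    p = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
    print(mutual_information(p))  # 1.0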


Definition

The quantum mechanical counterparts of classical probability distributions are density matrices. Consider a quantum system that can be divided into two parts, ''A'' and ''B'', such that independent measurements can be made on either part. The state space of the entire quantum system is then the tensor product of the spaces for the two parts:

:H_{AB} := H_A \otimes H_B.

Let ''ρ^{AB}'' be a density matrix acting on states in ''H_{AB}''. The von Neumann entropy of a density matrix ''ρ'',

:S(\rho) = -\operatorname{Tr} \rho \log \rho,

is the quantum mechanical analog of the Shannon entropy. For a probability distribution ''p''(''x'', ''y''), the marginal distributions are obtained by summing over the variable ''x'' or ''y''. The corresponding operation for density matrices is the partial trace. So one can assign to ''ρ^{AB}'' a state on the subsystem ''A'' by

:\rho^A = \operatorname{Tr}_B \, \rho^{AB},

where Tr_''B'' is the partial trace with respect to system ''B''. This is the reduced state of ''ρ^{AB}'' on system ''A''. The reduced von Neumann entropy of ''ρ^{AB}'' with respect to system ''A'' is ''S''(''ρ^A''), and ''S''(''ρ^B'') is defined in the same way.

It can now be seen that the definition of quantum mutual information, corresponding to the classical definition, should be

:I(A\!:\!B) := S(\rho^A) + S(\rho^B) - S(\rho^{AB}).

Quantum mutual information can be interpreted in the same way as in the classical case: it can be shown that

:I(A\!:\!B) = S(\rho^{AB} \,\|\, \rho^A \otimes \rho^B),

where S(\cdot \,\|\, \cdot) denotes the quantum relative entropy.

Note that there is an alternative generalization of mutual information to the quantum case. The difference between the two quantities for a given state is called quantum discord, a measure of the quantum correlations of the state in question.
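
To make the definition concrete, here is a minimal numerical sketch (not part of the original article; the helper names are ours) that computes ''I''(''A'':''B'') for a two-qubit density matrix by forming the reduced states with an explicit partial trace. Logarithms are taken in base 2, so the result is in bits:

    # Illustrative sketch; helper names are ours, not a standard API.
    import numpy as np

    def von_neumann_entropy(rho):
        """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues."""
        evals = np.linalg.eigvalsh(rho)
        evals = evals[evals > 1e-12]            # drop numerical zeros
        return -np.sum(evals * np.log2(evals))

    def partial_trace(rho_ab, dim_a, dim_b, keep):
        """Reduced state of a bipartite density matrix on H_A (x) H_B."""
        r = rho_ab.reshape(dim_a, dim_b, dim_a, dim_b)  # indices (a, b, a', b')
        if keep == "A":
            return np.einsum("ijkj->ik", r)     # Tr_B: sum over b = b'
        return np.einsum("ijil->jl", r)         # Tr_A: sum over a = a'

    def quantum_mutual_information(rho_ab, dim_a, dim_b):
        """I(A:B) = S(rho_A) + S(rho_B) - S(rho_AB)."""
        s_a = von_neumann_entropy(partial_trace(rho_ab, dim_a, dim_b, "A"))
        s_b = von_neumann_entropy(partial_trace(rho_ab, dim_a, dim_b, "B"))
        return s_a + s_b - von_neumann_entropy(rho_ab)

    # Maximally entangled Bell state (|00> + |11>)/sqrt(2): I(A:B) = 2 bits.
    phi = np.zeros(4)
    phi[0] = phi[3] = 1 / np.sqrt(2)
    print(quantum_mutual_information(np.outer(phi, phi), 2, 2))  # ~2.0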


Properties

When the state \rho^{AB} is pure (and thus S(\rho^{AB}) = 0), the mutual information is twice the entanglement entropy of the state:

:I(A\!:\!B) = S(\rho^A) + S(\rho^B) - S(\rho^{AB}) = S(\rho^A) + S(\rho^B) = 2S(\rho^A).

A positive quantum mutual information is not necessarily indicative of entanglement, however. A classical mixture of separable states always has zero entanglement, but can have nonzero quantum mutual information. For example, for

:\rho^{AB} = \frac{1}{2}\left(|00\rangle\langle 00| + |11\rangle\langle 11|\right),

one finds

:\begin{align}
I(A\!:\!B) &= S(\rho^A) + S(\rho^B) - S(\rho^{AB}) \\
&= S\left(\tfrac{1}{2}(|0\rangle\langle 0| + |1\rangle\langle 1|)\right) + S\left(\tfrac{1}{2}(|0\rangle\langle 0| + |1\rangle\langle 1|)\right) - S\left(\tfrac{1}{2}(|00\rangle\langle 00| + |11\rangle\langle 11|)\right) \\
&= \log 2 + \log 2 - \log 2 = \log 2.
\end{align}

In this case, the state is merely classically correlated.
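
Continuing the numerical sketch from the Definition section (and reusing its von_neumann_entropy and quantum_mutual_information helpers), one can check this value directly; with base-2 logarithms, log 2 comes out as 1 bit:

    # Classically correlated mixture (|00><00| + |11><11|)/2: separable,
    # with zero entanglement but nonzero quantum mutual information.
    import numpy as np  # the helpers from the earlier sketch are assumed

    rho_cc = np.zeros((4, 4))
    rho_cc[0, 0] = rho_cc[3, 3] = 0.5
    print(quantum_mutual_information(rho_cc, 2, 2))  # ~1.0, i.e. log 2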


Multiparty generalization

Suppose a system is composed of ''n'' subsystems A_1, \dots, A_n. Then (see https://arxiv.org/abs/1504.07176):

:I(A_1\!:\!A_2\!:\!\dots\!:\!A_n) = \sum_{\{X_1, \dots, X_{n-1}\}} S(X_1 X_2 \cdots X_{n-1}) - (n-1)\, S(A_1 A_2 \cdots A_n),

where each X_k \in \{A_1, A_2, \dots, A_n\} and the sum runs over all distinct (''n'' − 1)-element combinations of the subsystems, without repetition. For example, take ''n'' = 3:

:I(A\!:\!B\!:\!C) = S(AB) + S(AC) + S(BC) - 2S(ABC).

Take now ''n'' = 4:

:I(A_1\!:\!A_2\!:\!A_3\!:\!A_4) = S(A_1A_2A_3) + S(A_1A_2A_4) + S(A_1A_3A_4) + S(A_2A_3A_4) - 3S(A_1A_2A_3A_4).

Note that each term in the sum is obtained by taking the partial trace over one subsystem at a time: in the ''n'' = 4 example, the first term traces out A_4, the second traces out A_3, and so on.
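
As a numerical sketch of this formula (again our addition; the reduced_state helper and the GHZ test state are illustrative), the following computes the multiparty mutual information for ''n'' qubits by tracing out one subsystem at a time, and checks the ''n'' = 3 case:

    # Illustrative, self-contained sketch; base-2 logarithms throughout.
    import numpy as np
    from itertools import combinations

    def von_neumann_entropy(rho):
        evals = np.linalg.eigvalsh(rho)
        evals = evals[evals > 1e-12]
        return -np.sum(evals * np.log2(evals))

    def reduced_state(rho, dims, keep):
        """Partial trace keeping only the subsystems listed in `keep`."""
        n = len(dims)
        r = rho.reshape(dims + dims)            # axes (rows..., cols...)
        for count, i in enumerate(j for j in range(n) if j not in keep):
            axis = i - count                    # axes shift after each trace
            r = np.trace(r, axis1=axis, axis2=axis + r.ndim // 2)
        d = int(np.prod([dims[i] for i in keep]))
        return r.reshape(d, d)

    def multiparty_mutual_information(rho, dims):
        """Sum of all (n-1)-party entropies minus (n-1) times the total entropy."""
        n = len(dims)
        parts = sum(von_neumann_entropy(reduced_state(rho, dims, keep))
                    for keep in combinations(range(n), n - 1))
        return parts - (n - 1) * von_neumann_entropy(rho)

    # GHZ state (|000> + |111>)/sqrt(2): S(AB) = S(AC) = S(BC) = 1 and
    # S(ABC) = 0, so I(A:B:C) = 3 bits.
    ghz = np.zeros(8)
    ghz[0] = ghz[7] = 1 / np.sqrt(2)
    print(multiparty_mutual_information(np.outer(ghz, ghz), (2, 2, 2)))  # ~3.0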


References

* https://arxiv.org/abs/1504.07176