Von Neumann entropy

In physics, the von Neumann entropy, named after John von Neumann, is an extension of the concept of Gibbs entropy from classical statistical mechanics to quantum statistical mechanics. For a quantum-mechanical system described by a density matrix \rho, the von Neumann entropy is

: S = - \operatorname{Tr}(\rho \ln \rho),

where \operatorname{Tr} denotes the trace and \ln denotes the (natural) matrix logarithm. If \rho is written in terms of its eigenvectors \left|1\right\rangle, \left|2\right\rangle, \left|3\right\rangle, \dots as

: \rho = \sum_j \eta_j \left|j\right\rangle \left\langle j\right| ,

then the von Neumann entropy is merely

: S = -\sum_j \eta_j \ln \eta_j .

In this form, ''S'' can be seen as the information-theoretic Shannon entropy. The von Neumann entropy is also used in different forms (conditional entropies, relative entropies, etc.) in the framework of quantum information theory to characterize the entropy of entanglement.
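Numerically, ''S''(''ρ'') is computed from the spectrum of ''ρ'' rather than by forming the matrix logarithm directly. A minimal sketch in Python with NumPy (the function name `von_neumann_entropy` and the `1e-12` cutoff are illustrative choices, not a standard API):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    # rho is Hermitian, so its eigenvalues are real; eigvalsh exploits this.
    eigenvalues = np.linalg.eigvalsh(rho)
    # Discard (numerically) zero eigenvalues, using the limit p ln p -> 0 as p -> 0.
    eigenvalues = eigenvalues[eigenvalues > 1e-12]
    return float(-np.sum(eigenvalues * np.log(eigenvalues)))

# Maximally mixed qubit: S = ln 2
rho_mixed = np.eye(2) / 2
print(von_neumann_entropy(rho_mixed))  # 0.693... (= ln 2)

# Pure state |0><0|: S = 0
rho_pure = np.array([[1.0, 0.0], [0.0, 0.0]])
print(von_neumann_entropy(rho_pure))   # 0.0
```

Working with the eigenvalues sidesteps the ill-conditioning of log of a singular matrix: zero eigenvalues contribute nothing to the sum.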


Background

John von Neumann established a rigorous mathematical framework for quantum mechanics in his 1932 work ''Mathematical Foundations of Quantum Mechanics''. In it, he provided a theory of measurement, where the usual notion of wave-function collapse is described as an irreversible process (the so-called von Neumann or projective measurement).

The density matrix was introduced, with different motivations, by von Neumann and by Lev Landau. The motivation that inspired Landau was the impossibility of describing a subsystem of a composite quantum system by a state vector. On the other hand, von Neumann introduced the density matrix in order to develop both quantum statistical mechanics and a theory of quantum measurements. The density matrix formalism, thus developed, extended the tools of classical statistical mechanics to the quantum domain. In the classical framework, the probability distribution and partition function of the system allow us to compute all possible thermodynamic quantities. Von Neumann introduced the density matrix to play the same role in the context of quantum states and operators in a complex Hilbert space. The knowledge of the statistical density matrix operator would allow us to compute all average quantum quantities in a conceptually similar, but mathematically different, way.

Let us suppose we have a set of wave functions |''Ψ''〉 that depend parametrically on a set of quantum numbers ''n''1, ''n''2, ..., ''n''''N''. The natural variable which we have is the amplitude with which a particular wavefunction of the basic set participates in the actual wavefunction of the system. Let us denote the square of this amplitude by ''p''(''n''1, ''n''2, ..., ''n''''N''). The goal is to turn this quantity ''p'' into the classical density function in phase space. We have to verify that ''p'' goes over into the density function in the classical limit, and that it has ergodic properties. After checking that ''p''(''n''1, ''n''2, ..., ''n''''N'') is a constant of motion, an ergodic assumption for the probabilities makes ''p'' a function of the energy only.

After this procedure, one finally arrives at the density matrix formalism when seeking a form that is invariant with respect to the representation used. In the form it is written, it will only yield the correct expectation values for quantities which are diagonal with respect to the quantum numbers ''n''1, ''n''2, ..., ''n''''N''. Expectation values of operators which are not diagonal involve the phases of the quantum amplitudes. Suppose we encode the quantum numbers ''n''1, ''n''2, ..., ''n''''N'' into the single index ''i'' or ''j''. Then our wave function has the form

: \left|\Psi\right\rangle = \sum_i a_i \left|\psi_i\right\rangle .

The expectation value of an operator ''B'' which is not diagonal in these wave functions is

: \left\langle B \right\rangle = \sum_{i,j} a_i^* a_j \left\langle i \right| B \left| j \right\rangle .

The role which was originally reserved for the quantities \left|a_i\right|^2 is thus taken over by the density matrix of the system ''S'':

: \left\langle j \right| \rho \left| i \right\rangle = a_j a_i^* .

Therefore, 〈''B''〉 reads

: \left\langle B \right\rangle = \operatorname{Tr}(\rho B) .

The invariance of the above term is described by matrix theory: the expectation value of a quantum operator, as described by a matrix, is obtained by taking the trace of the product of the density operator \hat{\rho} and the operator \hat{B} (the Hilbert scalar product between operators). The matrix formalism here is set in the statistical mechanics framework, although it applies as well to finite quantum systems, which is usually the case, where the state of the system cannot be described by a pure state, but as a statistical operator \hat{\rho} of the above form. Mathematically, \hat{\rho} is a positive-semidefinite Hermitian matrix with unit trace.
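The identity 〈''B''〉 = Tr(''ρB'') can be checked directly for a small system. A sketch (the amplitudes `a` and the operator `B` are arbitrary illustrative choices): build ''ρ''''ji'' = ''a''''j''''a''''i''* for a pure state and compare the double sum over amplitudes with the trace formula.

```python
import numpy as np

# Arbitrary (illustrative) amplitudes of a pure state in a 3-dim basis, normalized.
a = np.array([0.6, 0.8j, 0.0])
a = a / np.linalg.norm(a)

# Density matrix of the pure state: rho_{ji} = a_j * conj(a_i)
rho = np.outer(a, a.conj())

# Some Hermitian operator B that is not diagonal in this basis.
B = np.array([[1.0, 0.5, 0.0],
              [0.5, 2.0, 1.0],
              [0.0, 1.0, 3.0]])

# <B> as the double sum over amplitudes: sum_{i,j} conj(a_i) a_j <i|B|j> ...
expval_sum = sum(a[i].conj() * a[j] * B[i, j]
                 for i in range(3) for j in range(3))

# ... equals the trace formula Tr(rho B).
expval_trace = np.trace(rho @ B)

print(np.isclose(expval_sum, expval_trace))  # True
```

The trace formula is basis-independent, which is exactly the representation invariance the text describes.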


Definition

Given the density matrix ''ρ'', von Neumann defined the entropy as

:S(\rho) = -\operatorname{Tr}(\rho \ln \rho),

which is a proper extension of the Gibbs entropy (up to a factor ''k''B) and the Shannon entropy to the quantum case. To compute ''S''(''ρ'') it is convenient (see logarithm of a matrix) to compute the eigendecomposition

:\rho = \sum_j \eta_j \left|j\right\rangle \left\langle j\right| .

The von Neumann entropy is then given by

:S(\rho) = - \sum_j \eta_j \ln \eta_j .

Since, for a pure state, the density matrix is idempotent, ''ρ'' = ''ρ''2, the entropy ''S''(''ρ'') for it vanishes. Thus, if the system is finite (finite-dimensional matrix representation), the entropy ''S''(''ρ'') quantifies ''the departure of the system from a pure state''. In other words, it codifies the degree of mixing of the state describing a given finite system. Measurement decoheres a quantum system into something noninterfering and ostensibly classical; so, e.g., the vanishing entropy of the pure state \Psi = ( \left|0\right\rangle + \left|1\right\rangle ) / \sqrt{2}, corresponding to the density matrix

:\rho = \frac{1}{2}\begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix},

increases to ''S'' = ln 2 for the measurement outcome mixture

:\rho = \frac{1}{2}\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix},

as the quantum interference information is erased.
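The decoherence example above can be verified numerically. A short sketch (the helper `entropy` simply implements ''S''(''ρ'') = −Σ''j'' ''η''''j'' ln ''η''''j''):

```python
import numpy as np

def entropy(rho):
    # S(rho) = -sum eta ln eta over the nonzero eigenvalues of rho.
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log(ev)))

# Pure superposition (|0> + |1>)/sqrt(2): coherent, zero entropy.
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(psi, psi)           # (1/2) [[1, 1], [1, 1]]
print(entropy(rho_pure))                # 0.0

# After measurement: off-diagonal (interference) terms are erased.
rho_mixed = np.diag(np.diag(rho_pure))  # (1/2) [[1, 0], [0, 1]]
print(entropy(rho_mixed))               # 0.693... (= ln 2)
```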


Properties

Some properties of the von Neumann entropy:
* ''S''(''ρ'') is zero if and only if ''ρ'' represents a pure state.
* ''S''(''ρ'') is maximal and equal to \ln N for a maximally mixed state, ''N'' being the dimension of the Hilbert space.
* ''S''(''ρ'') is invariant under changes in the basis of ''ρ'', that is, ''S''(''ρ'') = ''S''(''UρU''†), with ''U'' a unitary transformation.
* ''S''(''ρ'') is concave, that is, given a collection of positive numbers ''λ''''i'' which sum to unity (\textstyle\sum_i \lambda_i = 1) and density operators ''ρ''''i'', we have
:: S\bigg(\sum_{i=1}^k \lambda_i \rho_i \bigg) \geq \sum_{i=1}^k \lambda_i S(\rho_i).
* ''S''(''ρ'') satisfies the bound
:: S\bigg(\sum_{i=1}^k \lambda_i \rho_i \bigg) \leq \sum_{i=1}^k \lambda_i S(\rho_i) - \sum_{i=1}^k \lambda_i \log \lambda_i,
:where equality is achieved if the ''ρ''''i'' have orthogonal support; as before, the ''ρ''''i'' are density operators and the ''λ''''i'' are a collection of positive numbers which sum to unity (\textstyle\sum_i \lambda_i = 1).
* ''S''(''ρ'') is additive for independent systems. Given two density matrices ''ρ''''A'', ''ρ''''B'' describing independent systems ''A'' and ''B'', we have
:: S(\rho_A \otimes \rho_B) = S(\rho_A) + S(\rho_B).
* ''S''(''ρ'') is strongly subadditive for any three systems ''A'', ''B'', and ''C'':
:: S(\rho_{ABC}) + S(\rho_B) \leq S(\rho_{AB}) + S(\rho_{BC}).
: This automatically means that ''S''(''ρ'') is subadditive:
:: S(\rho_{AB}) \leq S(\rho_A) + S(\rho_B).

Below, the concept of subadditivity is discussed, followed by its generalization to strong subadditivity.
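Two of these properties, additivity and unitary invariance, are easy to check numerically. A sketch (`random_density_matrix` is an illustrative construction, not a library routine):

```python
import numpy as np

def entropy(rho):
    # S(rho) over the nonzero eigenvalues of rho.
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log(ev)))

def random_density_matrix(n, rng):
    # A random positive-semidefinite Hermitian matrix normalized to unit trace.
    m = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    rho = m @ m.conj().T
    return rho / np.trace(rho).real

rng = np.random.default_rng(0)
rho_a = random_density_matrix(2, rng)
rho_b = random_density_matrix(3, rng)

# Additivity: S(rho_A (x) rho_B) = S(rho_A) + S(rho_B)
lhs = entropy(np.kron(rho_a, rho_b))
rhs = entropy(rho_a) + entropy(rho_b)
print(np.isclose(lhs, rhs))  # True

# Unitary invariance: S(U rho U†) = S(rho)
q, _ = np.linalg.qr(rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2)))
print(np.isclose(entropy(q @ rho_a @ q.conj().T), entropy(rho_a)))  # True
```

Additivity follows because the eigenvalues of a Kronecker product are the pairwise products of the factors' eigenvalues; unitary conjugation leaves the spectrum unchanged.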


Subadditivity

If ''ρ''''A'', ''ρ''''B'' are the reduced density matrices of the general state ''ρ''''AB'', then

: \left| S(\rho_A) - S(\rho_B) \right| \leq S(\rho_{AB}) \leq S(\rho_A) + S(\rho_B) .

The right-hand inequality is known as ''subadditivity''. The two inequalities together are sometimes known as the ''triangle inequality''. They were proved in 1970 by Huzihiro Araki and Elliott H. Lieb. While in Shannon's theory the entropy of a composite system can never be lower than the entropy of any of its parts, in quantum theory this is not the case, i.e., it is possible that ''S''(''ρ''''AB'') = 0 while ''S''(''ρ''''A'') = ''S''(''ρ''''B'') > 0.

Intuitively, this can be understood as follows: in quantum mechanics, the entropy of the joint system can be less than the sum of the entropies of its components because the components may be entangled. For instance, the Bell state of two spin-½ particles,

: \left| \psi \right\rangle = \frac{1}{\sqrt{2}}\left( \left| \uparrow\downarrow \right\rangle + \left| \downarrow\uparrow \right\rangle \right),

is a pure state with zero entropy, but each spin has maximum entropy when considered individually in its reduced density matrix. The entropy in one spin can be "cancelled" by being correlated with the entropy of the other. The left-hand inequality can be roughly interpreted as saying that entropy can only be cancelled by an equal amount of entropy. If system ''A'' and system ''B'' have different amounts of entropy, the smaller can only partially cancel the greater, and some entropy must be left over. Likewise, the right-hand inequality can be interpreted as saying that the entropy of a composite system is maximized when its components are uncorrelated, in which case the total entropy is just a sum of the sub-entropies. This may be more intuitive in the phase space formulation, instead of the Hilbert space one, where the von Neumann entropy amounts to minus the expected value of the ★-logarithm of the Wigner function, up to an offset shift. Up to this normalization offset shift, the entropy is majorized by that of its classical limit.
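The Bell-state example can be made concrete with a partial trace. A sketch (`partial_trace_b` is an illustrative helper written for this two-qubit case, not a library function):

```python
import numpy as np

def entropy(rho):
    # S(rho) over the nonzero eigenvalues of rho.
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log(ev)))

def partial_trace_b(rho_ab, dim_a, dim_b):
    # Trace out subsystem B of a (dim_a*dim_b)-dimensional joint state:
    # reshape to (a, b, a', b') and contract b with b'.
    return np.trace(rho_ab.reshape(dim_a, dim_b, dim_a, dim_b), axis1=1, axis2=3)

# Bell state (|01> + |10>)/sqrt(2) of two qubits.
psi = np.array([0.0, 1.0, 1.0, 0.0]) / np.sqrt(2)
rho_ab = np.outer(psi, psi)

rho_a = partial_trace_b(rho_ab, 2, 2)  # reduced state of the first spin: I/2

print(entropy(rho_ab))  # 0.0     (the joint state is pure)
print(entropy(rho_a))   # 0.693...(= ln 2: each spin alone is maximally mixed)
```

Here S(ρ_AB) = 0 < S(ρ_A) + S(ρ_B) = 2 ln 2, the quantum violation of the classical intuition described above.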


Strong subadditivity

The von Neumann entropy is also ''strongly subadditive''. Given three Hilbert spaces ''A'', ''B'', ''C'',

:S(\rho_{ABC}) + S(\rho_B) \leq S(\rho_{AB}) + S(\rho_{BC}).

This is a more difficult theorem and was proved first by J. Kiefer in 1959 and independently by Elliott H. Lieb and Mary Beth Ruskai in 1973, using a matrix inequality of Elliott H. Lieb proved in 1973. By using the proof technique that establishes the left side of the triangle inequality above, one can show that the strong subadditivity inequality is equivalent to the following inequality:

:S(\rho_A) + S(\rho_C) \leq S(\rho_{AB}) + S(\rho_{BC}),

where ''ρ''''AB'', etc. are the reduced density matrices of a density matrix ''ρ''''ABC''. If we apply ordinary subadditivity to the left side of this inequality, and consider all permutations of ''A'', ''B'', ''C'', we obtain the ''triangle inequality'' for ''ρ''''ABC'': each of the three numbers ''S''(''ρ''''AB''), ''S''(''ρ''''BC''), ''S''(''ρ''''AC'') is less than or equal to the sum of the other two.
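Strong subadditivity can be checked numerically on a random three-qubit state. A sketch (the `partial_trace` helper is an illustrative implementation, not a library routine):

```python
import numpy as np

def entropy(rho):
    # S(rho) over the nonzero eigenvalues of rho.
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log(ev)))

def partial_trace(rho, dims, keep):
    """Reduced density matrix of the subsystems listed in `keep`."""
    rho = rho.reshape(dims + dims)
    # Contract each traced-out axis with its primed partner, largest index first
    # so that the remaining axis positions stay stable.
    for ax in sorted((i for i in range(len(dims)) if i not in keep), reverse=True):
        rho = np.trace(rho, axis1=ax, axis2=ax + rho.ndim // 2)
    d = int(np.prod([dims[i] for i in keep]))
    return rho.reshape(d, d)

# Random pure three-qubit state and its density matrix.
rng = np.random.default_rng(1)
psi = rng.normal(size=8) + 1j * rng.normal(size=8)
psi = psi / np.linalg.norm(psi)
rho_abc = np.outer(psi, psi.conj())

dims = (2, 2, 2)
s_abc = entropy(rho_abc)                               # 0 (pure state)
s_b = entropy(partial_trace(rho_abc, dims, [1]))
s_ab = entropy(partial_trace(rho_abc, dims, [0, 1]))
s_bc = entropy(partial_trace(rho_abc, dims, [1, 2]))

# Strong subadditivity: S(ABC) + S(B) <= S(AB) + S(BC)
print(s_abc + s_b <= s_ab + s_bc + 1e-10)  # True
```

For a pure ''ρ''''ABC'' the inequality reduces to ordinary subadditivity, since S(ρ_ABC) = 0 and S(ρ_AB) = S(ρ_C), S(ρ_BC) = S(ρ_A); a single random state is of course a spot check, not a proof.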


See also

* Entropy (information theory)
* Linear entropy
* Partition function (mathematics)
* Quantum conditional entropy
* Quantum mutual information
* Quantum entanglement
* Strong subadditivity of quantum entropy
* Wehrl entropy

