The joint quantum entropy generalizes the classical joint entropy to the context of quantum information theory. Intuitively, given two quantum states ρ and σ, represented as density operators that are subparts of a quantum system, the joint quantum entropy is a measure of the total uncertainty or entropy of the joint system. It is written S(ρ,σ) or H(ρ,σ), depending on the notation being used for the von Neumann entropy. Like other entropies, the joint quantum entropy is measured in bits, i.e. the logarithm is taken in base 2. In this article, we will use S(ρ,σ) for the joint quantum entropy.
Background
In information theory, for any classical random variable X, the classical Shannon entropy H(X) is a measure of how uncertain we are about the outcome of X. For example, if X is a probability distribution concentrated at one point, the outcome of X is certain and therefore its entropy H(X) = 0. At the other extreme, if X is the uniform probability distribution with n possible values, intuitively one would expect X to be associated with the most uncertainty. Indeed, such uniform probability distributions have the maximum possible entropy H(X) = log₂(n).
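The two extremes above can be checked numerically. The following is a minimal sketch in Python with NumPy (the function name `shannon_entropy` is our own illustration, not part of the article):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy of a discrete probability distribution p, in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # by convention, 0 * log(0) = 0
    # max(0.0, ...) clamps the IEEE -0.0 artifact for certain distributions
    return max(0.0, float(-np.sum(p * np.log2(p))))

# A distribution concentrated at one point: the outcome is certain.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0
# The uniform distribution on n = 4 values attains the maximum log2(4) = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```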
In quantum information theory, the notion of entropy is extended from probability distributions to quantum states, or density matrices. For a state ρ, the von Neumann entropy is defined by

:S(ρ) = −Tr(ρ log₂ ρ).

Applying the spectral theorem, or the Borel functional calculus for infinite-dimensional systems, we see that it generalizes the classical entropy. The physical meaning remains the same. A maximally mixed state, the quantum analog of the uniform probability distribution, has maximum von Neumann entropy. On the other hand, a pure state, or a rank-one projection, has zero von Neumann entropy. We write the von Neumann entropy S(ρ) (or sometimes H(ρ)).
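A direct way to evaluate S(ρ) numerically is to diagonalize ρ, exactly as the spectral theorem suggests. A hedged Python/NumPy sketch (our own illustration; the function name is an assumption, not from the article):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), in bits, computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)  # rho is Hermitian
    evals = evals[evals > 1e-12]     # drop numerically zero eigenvalues: 0 * log(0) = 0
    # max(0.0, ...) clamps the IEEE -0.0 artifact for pure states
    return max(0.0, float(-np.sum(evals * np.log2(evals))))

# A pure state (rank-one projection) has zero entropy:
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])
print(von_neumann_entropy(pure))          # 0.0
# The maximally mixed qubit state I/2 has the maximum entropy, log2(2) = 1 bit:
print(von_neumann_entropy(np.eye(2) / 2))  # 1.0
```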
Definition
Given a quantum system with two subsystems ''A'' and ''B'', the term joint quantum entropy simply refers to the von Neumann entropy of the combined system. This distinguishes it from the entropy of the subsystems. In symbols, if the combined system is in state ρ^{AB}, the joint quantum entropy is then

:S(ρ^{AB}) = S(A,B) = −Tr(ρ^{AB} log₂ ρ^{AB}).

Each subsystem has its own entropy. The states of the subsystems are given by the partial trace operation.
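As a sketch of these definitions (Python with NumPy; the helper names `von_neumann_entropy` and `partial_trace` are our own, not from the article), the reduced states are obtained by tracing out one subsystem. For a product state, the joint entropy is simply the sum of the subsystem entropies:

```python
import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return max(0.0, float(-np.sum(evals * np.log2(evals))))

def partial_trace(rho_ab, dim_a, dim_b, keep):
    """Reduced density matrix of subsystem 'A' or 'B' of a bipartite state rho_ab."""
    r = rho_ab.reshape(dim_a, dim_b, dim_a, dim_b)
    if keep == "A":
        return np.einsum("ijkj->ik", r)  # trace out B
    return np.einsum("ijil->jl", r)      # trace out A

# For a product state rho_A (x) rho_B, entropies add: S(A,B) = S(A) + S(B).
rho_a = np.diag([0.5, 0.5])   # maximally mixed qubit, S = 1 bit
rho_b = np.diag([1.0, 0.0])   # pure qubit, S = 0
rho_ab = np.kron(rho_a, rho_b)
print(von_neumann_entropy(rho_ab))                            # 1.0
print(von_neumann_entropy(partial_trace(rho_ab, 2, 2, "A")))  # 1.0
print(von_neumann_entropy(partial_trace(rho_ab, 2, 2, "B")))  # 0.0
```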
Properties
The classical joint entropy is always at least equal to the entropy of each individual system. This is not the case for the joint quantum entropy. If the quantum state ρ^{AB} exhibits quantum entanglement, then the entropy of each subsystem may be larger than the joint entropy. This is equivalent to the fact that the conditional quantum entropy may be negative, while the classical conditional entropy may never be.
Consider a maximally entangled state such as a Bell state. If ρ^{AB} is a Bell state, say,

:|Ψ⟩ = (1/√2)(|00⟩ + |11⟩),

then the total system is a pure state, with entropy 0, while each individual subsystem is a maximally mixed state, with maximum von Neumann entropy log₂ 2 = 1. Thus the joint entropy of the combined system is less than that of its subsystems. This is because for entangled states, definite states cannot be assigned to subsystems, resulting in positive entropy.
Notice that the above phenomenon cannot occur if a state is a separable pure state. In that case, the reduced states of the subsystems are also pure. Therefore all entropies are zero.
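The Bell-state example can be verified numerically. In this Python/NumPy sketch (our own illustration, under the same conventions as above), the joint state is pure while each reduced state is maximally mixed:

```python
import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return max(0.0, float(-np.sum(evals * np.log2(evals))))

# Bell state |Psi> = (|00> + |11>) / sqrt(2) as a 4x4 density matrix:
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_ab = np.outer(psi, psi)

# Reduced state of A: trace out B from the (2,2,2,2)-reshaped density matrix.
rho_a = np.einsum("ijkj->ik", rho_ab.reshape(2, 2, 2, 2))

print(von_neumann_entropy(rho_ab))  # 0.0 -- the joint state is pure
print(von_neumann_entropy(rho_a))   # 1.0 -- the subsystem is maximally mixed
```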
Relations to other entropy measures
The joint quantum entropy S(ρ^{AB}) can be used to define the conditional quantum entropy:

:S(A|B) ≡ S(A,B) − S(B)

and the quantum mutual information:

:I(A:B) ≡ S(A) + S(B) − S(A,B).

These definitions parallel the use of the classical joint entropy to define the conditional entropy and mutual information.
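For the Bell state of the previous section, these formulas give a negative conditional entropy S(A|B) = 0 − 1 = −1 and a mutual information of 2 bits, twice the classical maximum for a pair of bits. A self-contained Python/NumPy sketch (our own illustration) checks both values:

```python
import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return max(0.0, float(-np.sum(evals * np.log2(evals))))

def reduced(rho_ab, keep):
    """Reduced state of one qubit of a two-qubit density matrix."""
    r = rho_ab.reshape(2, 2, 2, 2)
    return np.einsum("ijkj->ik", r) if keep == "A" else np.einsum("ijil->jl", r)

psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)  # Bell state
rho_ab = np.outer(psi, psi)
S_AB = von_neumann_entropy(rho_ab)
S_A = von_neumann_entropy(reduced(rho_ab, "A"))
S_B = von_neumann_entropy(reduced(rho_ab, "B"))

print(S_AB - S_B)        # S(A|B) = -1.0, negative -- impossible classically
print(S_A + S_B - S_AB)  # I(A:B) = 2.0
```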
See also
* Quantum relative entropy
* Quantum mutual information
References
* Nielsen, Michael A. and Isaac L. Chuang, ''Quantum Computation and Quantum Information''. Cambridge University Press, 2000.