Generalized Relative Entropy

Generalized relative entropy (\epsilon-relative entropy) is a measure of dissimilarity between two quantum states. It is a "one-shot" analogue of quantum relative entropy and shares many properties of the latter quantity.

In the study of quantum information theory, we typically assume that information processing tasks are repeated multiple times, independently. The corresponding information-theoretic notions are therefore defined in the asymptotic limit. The quintessential entropy measure, von Neumann entropy, is one such notion. In contrast, the study of one-shot quantum information theory is concerned with information processing when a task is conducted only once. New entropic measures emerge in this scenario, as traditional notions cease to give a precise characterization of resource requirements. \epsilon-relative entropy is one such particularly interesting measure.

In the asymptotic scenario, relative entropy acts as a parent quantity for other measures besides being an important measure itself. Similarly, \epsilon-relative entropy functions as a parent quantity for other measures in the one-shot scenario.


Definition

To motivate the definition of the \epsilon-relative entropy D^{\epsilon}(\rho\|\sigma), consider the information processing task of hypothesis testing. In hypothesis testing, we wish to devise a strategy to distinguish between two density operators \rho and \sigma. A strategy is a POVM with elements Q and I - Q. The probability that the strategy produces a correct guess on input \rho is given by \operatorname{Tr}(\rho Q), and the probability that it produces a wrong guess is given by \operatorname{Tr}(\sigma Q). \epsilon-relative entropy captures the minimum probability of error when the state is \sigma, given that the success probability for \rho is at least \epsilon.

For \epsilon \in (0,1), the \epsilon-relative entropy between two quantum states \rho and \sigma is defined as

::: D^{\epsilon}(\rho\|\sigma) = - \log \frac{1}{\epsilon} \min \{ \operatorname{Tr}(\sigma Q) \mid 0 \leq Q \leq I ~\text{and}~ \operatorname{Tr}(Q\rho) \geq \epsilon \} ~.

From the definition, it is clear that D^{\epsilon}(\rho\|\sigma) \geq 0: the operator Q = \epsilon I is feasible and yields \operatorname{Tr}(\sigma Q) = \epsilon, so the minimum is at most \epsilon. This inequality is saturated if and only if \rho = \sigma, as shown below.
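
The defining minimization is a semidefinite program, so D^{\epsilon}(\rho\|\sigma) can be computed numerically for small examples. Below is a minimal Python sketch (ours, not from the original article), assuming the numpy and cvxpy packages are available; the helper name epsilon_relative_entropy is ours, and logarithms are taken base 2.

    import numpy as np
    import cvxpy as cp

    def epsilon_relative_entropy(rho, sigma, eps):
        # D^eps(rho || sigma) = -log2( (1/eps) * min Tr(sigma Q) ),
        # minimized over 0 <= Q <= I with Tr(Q rho) >= eps.
        d = rho.shape[0]
        Q = cp.Variable((d, d), hermitian=True)
        constraints = [
            Q >> 0,                             # Q >= 0
            np.eye(d) - Q >> 0,                 # Q <= I
            cp.real(cp.trace(Q @ rho)) >= eps,  # success probability on rho
        ]
        problem = cp.Problem(cp.Minimize(cp.real(cp.trace(Q @ sigma))), constraints)
        problem.solve()
        return -np.log2(problem.value / eps)

    # Example: a biased qubit state against the maximally mixed state.
    rho = np.array([[0.9, 0.0], [0.0, 0.1]])
    sigma = np.eye(2) / 2
    print(epsilon_relative_entropy(rho, sigma, eps=0.8))  # ~0.85 bits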


Relationship to the trace distance

Suppose the trace distance between two density operators \rho and \sigma is

::: \|\rho - \sigma\|_1 = \delta ~.

For 0 < \epsilon < 1, it holds that

:::a) \log \frac{\epsilon}{\epsilon - (1-\epsilon)\delta} \quad \leq \quad D^{\epsilon}(\rho\|\sigma) \quad \leq \quad \log \frac{\epsilon}{\epsilon - \delta} ~,

where the upper bound is nontrivial when \delta < \epsilon. In particular, this implies the following analogue of the Pinsker inequality:

:::b) \frac{1-\epsilon}{\epsilon \ln 2}\, \|\rho-\sigma\|_1 \quad \leq \quad D^{\epsilon}(\rho\|\sigma) ~.

Furthermore, the proposition implies that for any \epsilon \in (0,1), D^{\epsilon}(\rho\|\sigma) = 0 if and only if \rho = \sigma, inheriting this property from the trace distance. This result and its proof can be found in Dupuis et al.
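
As a sanity check (ours, not from the source), both bounds can be verified numerically in Python, reusing the hypothetical epsilon_relative_entropy helper sketched in the Definition section. Here \delta is computed, as in the proof below, as the maximum of \operatorname{Tr}(Q(\rho - \sigma)) over 0 \leq Q \leq I, i.e. the sum of the positive eigenvalues of \rho - \sigma.

    import numpy as np

    rho = np.array([[0.9, 0.0], [0.0, 0.1]])
    sigma = np.eye(2) / 2
    eps = 0.8

    eigs = np.linalg.eigvalsh(rho - sigma)
    delta = eigs[eigs > 0].sum()                      # 0.4 for this example

    d_eps = epsilon_relative_entropy(rho, sigma, eps)
    lower = np.log2(eps / (eps - (1 - eps) * delta))  # left side of a)
    upper = np.log2(eps / (eps - delta))              # right side of a); needs eps > delta
    pinsker = (1 - eps) * delta / (eps * np.log(2))   # left side of b)

    assert lower - 1e-6 <= d_eps <= upper + 1e-6
    assert pinsker <= d_eps + 1e-6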


Proof of inequality a)

''Upper bound'': Trace distance can be written as

::: \|\rho - \sigma\|_1 = \max_{0 \leq Q \leq I} \operatorname{Tr}(Q(\rho - \sigma)) ~.

This maximum is achieved when Q is the orthogonal projector onto the positive eigenspace of \rho - \sigma. For any POVM element Q we have

:::\operatorname{Tr}(Q(\rho - \sigma)) \leq \delta

so that if \operatorname{Tr}(Q\rho) \geq \epsilon, we have

:::\operatorname{Tr}(Q\sigma) ~\geq~ \operatorname{Tr}(Q\rho) - \delta ~\geq~ \epsilon - \delta~.

From the definition of the \epsilon-relative entropy, we get

::: 2^{-D^{\epsilon}(\rho\|\sigma)} \geq \frac{\epsilon - \delta}{\epsilon} ~,

i.e., D^{\epsilon}(\rho\|\sigma) \leq \log \frac{\epsilon}{\epsilon - \delta}.

''Lower bound'': Let Q be the orthogonal projection onto the positive eigenspace of \rho - \sigma, and let \bar Q be the following convex combination of I and Q:

::: \bar Q = (\epsilon - \mu)I + (1 - \epsilon + \mu)Q

where

:::\mu = \frac{(1-\epsilon)\operatorname{Tr}(Q\rho)}{1 - \operatorname{Tr}(Q\rho)} ~.

This means

:::\mu = (1-\epsilon + \mu)\operatorname{Tr}(Q\rho)

and thus

::: \operatorname{Tr}(\bar Q \rho) ~=~ (\epsilon - \mu) + (1-\epsilon + \mu)\operatorname{Tr}(Q\rho) ~=~ \epsilon ~.

Moreover,

:::\operatorname{Tr}(\bar Q \sigma) ~=~ \epsilon - \mu + (1-\epsilon + \mu)\operatorname{Tr}(Q\sigma) ~.

Using \mu = (1-\epsilon + \mu)\operatorname{Tr}(Q\rho), our choice of Q (so that \operatorname{Tr}(Q(\rho - \sigma)) = \delta), and finally the definition of \mu, we can re-write this as

:::\operatorname{Tr}(\bar Q \sigma) ~=~ \epsilon - (1 - \epsilon + \mu)\operatorname{Tr}(Q\rho) + (1 - \epsilon + \mu)\operatorname{Tr}(Q\sigma)

:::::: ~=~ \epsilon - (1 - \epsilon + \mu)\delta ~=~ \epsilon - \frac{(1-\epsilon)\delta}{1 - \operatorname{Tr}(Q\rho)} ~\leq~ \epsilon - (1-\epsilon)\delta ~.

Since \operatorname{Tr}(\bar Q \rho) = \epsilon, the operator \bar Q is feasible, and hence

:::D^{\epsilon}(\rho\|\sigma) \geq \log \frac{\epsilon}{\epsilon - (1-\epsilon)\delta} ~.
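
The lower-bound construction can be traced step by step in code. A small numpy sketch with qubit states of our choosing (picked so that \operatorname{Tr}(Q\rho) < \epsilon, which keeps the mixing weights of \bar Q nonnegative, i.e. a genuine convex combination):

    import numpy as np

    rho = np.array([[0.6, 0.0], [0.0, 0.4]])
    sigma = np.eye(2) / 2
    eps = 0.8

    # Q projects onto the positive eigenspace of rho - sigma.
    w, v = np.linalg.eigh(rho - sigma)
    P = v[:, w > 0]
    Q = P @ P.conj().T
    delta = np.trace(Q @ (rho - sigma)).real          # 0.1 for this example

    p = np.trace(Q @ rho).real                        # Tr(Q rho) = 0.6
    mu = (1 - eps) * p / (1 - p)                      # so mu = (1 - eps + mu) * p
    Qbar = (eps - mu) * np.eye(2) + (1 - eps + mu) * Q

    assert np.isclose(np.trace(Qbar @ rho).real, eps)                     # Tr(Qbar rho) = eps
    assert np.trace(Qbar @ sigma).real <= eps - (1 - eps) * delta + 1e-9  # as derived above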


Proof of inequality b)

To derive this ''Pinsker-like inequality'', observe that

:::\log \frac{\epsilon}{\epsilon - (1-\epsilon)\delta} ~=~ -\log\left( 1 - \frac{(1-\epsilon)\delta}{\epsilon} \right) ~\geq~ \delta\, \frac{1-\epsilon}{\epsilon \ln 2} ~,

where the last step uses -\ln(1-x) \geq x for x \in [0,1).
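
The elementary logarithm inequality used in the last step, -\log_2(1-x) \geq x/\ln 2, can be spot-checked numerically:

    import numpy as np

    # -log2(1 - x) >= x / ln 2, applied above with x = (1 - eps) * delta / eps
    for x in np.linspace(0.0, 0.95, 20):
        assert -np.log2(1 - x) >= x / np.log(2) - 1e-12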


Alternative proof of the data processing inequality

A fundamental property of von Neumann entropy is strong subadditivity. Let S(\sigma) denote the von Neumann entropy of the quantum state \sigma, and let \rho_{ABC} be a quantum state on the tensor product Hilbert space \mathcal{H}_A \otimes \mathcal{H}_B \otimes \mathcal{H}_C. Strong subadditivity states that

:::S(\rho_{ABC}) + S(\rho_{B}) \leq S(\rho_{AB}) + S(\rho_{BC})

where \rho_{AB}, \rho_{BC}, \rho_{B} refer to the reduced density matrices on the spaces indicated by the subscripts. When re-written in terms of mutual information, this inequality has an intuitive interpretation; it states that the information content in a system cannot increase by the action of a local quantum operation on that system. In this form, it is better known as the data processing inequality, and is equivalent to the monotonicity of relative entropy under quantum operations:

:::S(\rho\|\sigma) - S(\mathcal{N}(\rho)\|\mathcal{N}(\sigma)) \geq 0

for every CPTP map \mathcal{N}, where S(\omega\|\tau) denotes the relative entropy of the quantum states \omega, \tau.

It is readily seen that \epsilon-relative entropy also obeys monotonicity under quantum operations:

:::D^{\epsilon}(\rho\|\sigma) \geq D^{\epsilon}(\mathcal{N}(\rho)\|\mathcal{N}(\sigma))

for any CPTP map \mathcal{N}. To see this, suppose we have a POVM (R, I - R) to distinguish between \mathcal{N}(\rho) and \mathcal{N}(\sigma) such that \langle R, \mathcal{N}(\rho)\rangle = \langle \mathcal{N}^{\dagger}(R), \rho \rangle \geq \epsilon. We construct a new POVM (\mathcal{N}^{\dagger}(R), I - \mathcal{N}^{\dagger}(R)) to distinguish between \rho and \sigma. Since the adjoint of any CPTP map is positive and unital, this is a valid POVM. Note that \langle R, \mathcal{N}(\sigma)\rangle = \langle \mathcal{N}^{\dagger}(R), \sigma\rangle \geq \langle Q, \sigma\rangle, where (Q, I - Q) is the POVM that achieves D^{\epsilon}(\rho\|\sigma); minimizing over all such R then yields the claimed inequality.

Not only is this interesting in itself, but it also gives us the following alternative method to prove the data processing inequality. By the quantum analogue of the Stein lemma,

:::\lim_{n\rightarrow\infty}\frac{1}{n} D^{\epsilon}(\rho^{\otimes n}\|\sigma^{\otimes n}) = \lim_{n\rightarrow\infty}\frac{-1}{n}\log \left( \frac{1}{\epsilon} \min \operatorname{Tr}(\sigma^{\otimes n} Q) \right)

::::::::::: = D(\rho\|\sigma) - \lim_{n\rightarrow\infty}\frac{1}{n}\log\frac{1}{\epsilon}

::::::::::: = D(\rho\|\sigma) ~,

where the minimum is taken over 0 \leq Q \leq I such that \operatorname{Tr}(Q\rho^{\otimes n}) \geq \epsilon. Applying the monotonicity of the \epsilon-relative entropy established above to the states \rho^{\otimes n} and \sigma^{\otimes n} with the CPTP map \mathcal{N}^{\otimes n}, we get

:::D^{\epsilon}(\rho^{\otimes n}\|\sigma^{\otimes n}) ~\geq~ D^{\epsilon}(\mathcal{N}(\rho)^{\otimes n}\|\mathcal{N}(\sigma)^{\otimes n}) ~.

Dividing by n on either side and taking the limit as n \rightarrow \infty, we get the desired result.
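
The one-shot monotonicity above can also be checked numerically. A minimal sketch, reusing the hypothetical epsilon_relative_entropy helper from the Definition section, with a single-qubit depolarizing channel standing in for \mathcal{N}:

    import numpy as np

    def depolarize(state, p):
        # N(rho) = (1 - p) * rho + p * Tr(rho) * I/2, a CPTP map on qubits
        return (1 - p) * state + p * np.trace(state) * np.eye(2) / 2

    rho = np.array([[0.9, 0.0], [0.0, 0.1]])
    sigma = np.eye(2) / 2
    eps = 0.8

    d_before = epsilon_relative_entropy(rho, sigma, eps)
    d_after = epsilon_relative_entropy(depolarize(rho, 0.3), depolarize(sigma, 0.3), eps)
    assert d_after <= d_before + 1e-6  # D^eps(N(rho)||N(sigma)) <= D^eps(rho||sigma)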


See also

* Entropic value at risk
* Quantum relative entropy
* Strong subadditivity
* Classical information theory
* Min-entropy


References

* F. Dupuis, L. Kraemer, P. Faist, J. M. Renes, and R. Renner, "Generalized Entropies", Proceedings of the XVIIth International Congress on Mathematical Physics, 2013. arXiv:1211.3141