Generalized Relative Entropy
Generalized relative entropy (\epsilon-relative entropy) is a measure of dissimilarity between two quantum states. It is a "one-shot" analogue of quantum relative entropy and shares many of its properties. In the study of quantum information theory, we typically assume that information-processing tasks are repeated many times, independently, and the corresponding information-theoretic notions are therefore defined in the asymptotic limit. The quintessential entropy measure, the von Neumann entropy, is one such notion. In contrast, one-shot quantum information theory is concerned with information processing when a task is carried out only once. New entropic measures emerge in this setting, since the traditional notions cease to give a precise characterization of resource requirements; the \epsilon-relative entropy is one particularly interesting such measure. In the asymptotic scenario, relative entropy acts as a parent quantity for other measures besides being an import ...
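For concreteness, one common convention (the hypothesis-testing formulation; some authors include an extra 1/(1-\epsilon) normalization) defines the \epsilon-relative entropy of a state \rho with respect to a state \sigma as
:\; D^{\epsilon}_{H}(\rho \| \sigma) := -\log \min \{ \operatorname{Tr}(Q\sigma) : 0 \le Q \le I,\ \operatorname{Tr}(Q\rho) \ge 1-\epsilon \},
where the minimization is over measurement operators Q. Operationally, it is minus the logarithm of the smallest probability of mistaking \sigma for \rho achievable by a test that accepts \rho with probability at least 1-\epsilon.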


Quantum States
In quantum physics, a quantum state is a mathematical entity that provides a probability distribution for the outcomes of each possible measurement on a system. Knowledge of the quantum state together with the rules for the system's evolution in time exhausts all that can be predicted about the system's behavior. A mixture of quantum states is again a quantum state. Quantum states that cannot be written as a mixture of other states are called pure quantum states, while all other states are called mixed quantum states. A pure quantum state can be represented by a ray in a Hilbert space over the complex numbers, while mixed states are represented by density matrices, which are positive semidefinite operators that act on Hilbert spaces. Pure states are also known as state vectors or wave functions, the latter term applying particularly when they are represented as fu ...
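As a minimal, illustrative sketch (not taken from the source), the following numpy snippet builds a pure and a mixed single-qubit density matrix and checks the defining properties of unit trace and positive semidefiniteness; the purity Tr(\rho^2) distinguishes the two cases.

    import numpy as np

    # Pure state |+> = (|0> + |1>)/sqrt(2), written as a density matrix
    plus = np.array([1.0, 1.0]) / np.sqrt(2.0)
    rho_pure = np.outer(plus, plus.conj())

    # Mixed state: equal classical mixture of |0><0| and |1><1|
    rho_mixed = 0.5 * np.diag([1.0, 0.0]) + 0.5 * np.diag([0.0, 1.0])

    for name, rho in [("pure", rho_pure), ("mixed", rho_mixed)]:
        eigs = np.linalg.eigvalsh(rho)
        print(name,
              "trace:", round(np.trace(rho).real, 3),
              "PSD:", bool(np.all(eigs >= -1e-12)),
              "purity Tr(rho^2):", round(np.trace(rho @ rho).real, 3))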


Mutual Information
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of the entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable. Not limited to real-valued random variables and linear dependence like the correlation coefficient, MI is more general and determines how different the joint distribution of the pair (X,Y) is from the product of the marginal distributions of X and Y. MI is the expected value of the pointwise mutual information (PMI). The quantity was defined and analyzed by Claude Shannon in his landmark paper "A Mathemati ...
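A small, self-contained Python sketch (the 2x2 joint distribution is made up for illustration) computes I(X;Y) directly from its definition as the divergence between the joint distribution and the product of the marginals:

    import numpy as np

    # Illustrative joint distribution of (X, Y); rows index X, columns index Y
    p_xy = np.array([[0.4, 0.1],
                     [0.1, 0.4]])
    p_x = p_xy.sum(axis=1)
    p_y = p_xy.sum(axis=0)

    # I(X;Y) = sum_{x,y} p(x,y) log2( p(x,y) / (p(x) p(y)) )
    mi = sum(p_xy[i, j] * np.log2(p_xy[i, j] / (p_x[i] * p_y[j]))
             for i in range(2) for j in range(2) if p_xy[i, j] > 0)
    print("I(X;Y) =", round(mi, 4), "bits")   # ~0.278 bits for this table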


Information Theory
Information theory is the scientific study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley in the 1920s, and Claude Shannon in the 1940s. The field is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome of a roll of a die (with six equally likely outcomes). Some other important measures in information theory are mutual informat ...
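The coin-versus-die comparison can be made concrete with a few lines of Python (an illustrative sketch, not part of the original text):

    import math

    def shannon_entropy(probs):
        # H(X) = -sum_x p(x) log2 p(x), measured in bits
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print("fair coin:", shannon_entropy([0.5, 0.5]), "bits")             # 1.0 bit
    print("fair die: ", round(shannon_entropy([1 / 6] * 6), 3), "bits")  # ~2.585 bits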


Strong Subadditivity
In quantum information theory, strong subadditivity is a fundamental inequality obeyed by the von Neumann entropy of a tripartite quantum state \rho_{ABC}. It states that :\; S(\rho_{AB}) + S(\rho_{BC}) \ge S(\rho_{ABC}) + S(\rho_{B}), where the entropies on the left-hand side are those of the reduced density matrices of \rho_{ABC}. The inequality was proved by Lieb and Ruskai and is equivalent to the monotonicity of quantum relative entropy under the partial trace; many other entropy inequalities and data-processing statements in quantum information theory follow from it.
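As an illustrative numerical check (a sketch assuming a randomly generated three-qubit state, not code from the source), the inequality can be verified with numpy by tracing out subsystems and comparing von Neumann entropies:

    import numpy as np

    def random_density_matrix(d, seed=1):
        # Normalize G G^dagger for a complex Gaussian G to obtain a random state
        rng = np.random.default_rng(seed)
        g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
        rho = g @ g.conj().T
        return rho / np.trace(rho)

    def entropy(rho):
        # von Neumann entropy S(rho) = -Tr(rho log2 rho)
        ev = np.linalg.eigvalsh(rho)
        ev = ev[ev > 1e-12]
        return float(-np.sum(ev * np.log2(ev)))

    # Random state on three qubits A, B, C; index order (a, b, c, a', b', c')
    rho_abc = random_density_matrix(8)
    R = rho_abc.reshape(2, 2, 2, 2, 2, 2)

    rho_ab = np.einsum('abcdec->abde', R).reshape(4, 4)   # trace out C
    rho_bc = np.einsum('abcaef->bcef', R).reshape(4, 4)   # trace out A
    rho_b = np.einsum('abcaec->be', R)                    # trace out A and C

    lhs = entropy(rho_ab) + entropy(rho_bc)
    rhs = entropy(rho_abc) + entropy(rho_b)
    print("S(AB) + S(BC) >= S(ABC) + S(B):", lhs >= rhs - 1e-9)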


Quantum Relative Entropy
In quantum information theory, quantum relative entropy is a measure of distinguishability between two quantum states. It is the quantum mechanical analog of relative entropy. Motivation For simplicity, it will be assumed that all objects in the article are finite-dimensional. We first discuss the classical case. Suppose the probabilities of a finite sequence of events are given by the probability distribution ''P'' = \{p_1, \ldots, p_n\}, but somehow we mistakenly assumed it to be ''Q'' = \{q_1, \ldots, q_n\}. For instance, we can mistake an unfair coin for a fair one. According to this erroneous assumption, our uncertainty about the ''j''-th event, or equivalently, the amount of information provided after observing the ''j''-th event, is :\; - \log q_j. The (assumed) average uncertainty of all possible events is then :\; - \sum_j p_j \log q_j. On the other hand, the Shannon entropy of the probability distribution ''P'', defined by :\; - \sum_j p_j \log p_j, is the real amount of uncertainty befor ...
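Both quantities can be evaluated numerically. The snippet below (an illustrative sketch with made-up full-rank states, using only numpy) computes the classical D(P\|Q) for an unfair coin mistaken for a fair one, and the quantum S(\rho\|\sigma) = \operatorname{Tr}[\rho(\log\rho - \log\sigma)] for a pair of qubit states:

    import numpy as np

    def matrix_log2(m):
        # Base-2 matrix logarithm of a full-rank Hermitian matrix via eigendecomposition
        w, v = np.linalg.eigh(m)
        return (v * np.log2(w)) @ v.conj().T

    # Classical relative entropy D(P||Q) = sum_j p_j log2(p_j / q_j)
    p = np.array([0.7, 0.3])   # actual (unfair) coin
    q = np.array([0.5, 0.5])   # mistakenly assumed fair coin
    d_classical = float(np.sum(p * np.log2(p / q)))

    # Quantum relative entropy S(rho||sigma) = Tr[rho (log2 rho - log2 sigma)]
    rho = np.diag(p)
    sigma = np.array([[0.5, 0.25],
                      [0.25, 0.5]])   # a full-rank comparison state
    d_quantum = float(np.trace(rho @ (matrix_log2(rho) - matrix_log2(sigma))).real)

    print("classical D(P||Q):      ", round(d_classical, 4), "bits")
    print("quantum   S(rho||sigma):", round(d_quantum, 4), "bits")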


Entropic Value At Risk
In financial mathematics and stochastic optimization, the concept of a risk measure is used to quantify the risk involved in a random outcome or risk position. Many risk measures have been proposed, each with certain characteristics. The entropic value at risk (EVaR) is a coherent risk measure introduced by Ahmadi-Javid, which is an upper bound for the value at risk (VaR) and the conditional value at risk (CVaR), obtained from the Chernoff inequality. The EVaR can also be represented using the concept of relative entropy. Because of its connection with the VaR and the relative entropy, this risk measure is called "entropic value at risk". The EVaR was developed to tackle some computational inefficiencies of the CVaR. Drawing inspiration from the dual representation of the EVaR, Ahmadi-Javid developed a wide class of coherent risk measures, called g-entropic risk measures. Both the CVaR and the EVaR are members of this class. Definition Let (\Omega,\mathcal{F},P) be a pro ...
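As a rough numerical sketch (not from the source; it assumes X is normally distributed and uses scipy for the one-dimensional minimization), the EVaR can be computed from its definition EVaR_{1-\alpha}(X) = \inf_{z>0} z^{-1} \ln( \mathbb{E}[e^{zX}] / \alpha ) and compared with the known Gaussian closed form \mu + \sigma\sqrt{2\ln(1/\alpha)}:

    import numpy as np
    from scipy.optimize import minimize_scalar

    mu, sigma, alpha = 0.0, 1.0, 0.05

    def objective(z):
        # (1/z) * ln( M_X(z) / alpha ) with the Gaussian MGF M_X(z) = exp(mu z + sigma^2 z^2 / 2)
        log_mgf = mu * z + 0.5 * sigma ** 2 * z ** 2
        return (log_mgf + np.log(1.0 / alpha)) / z

    numeric = minimize_scalar(objective, bounds=(1e-6, 50.0), method="bounded").fun
    closed_form = mu + sigma * np.sqrt(2.0 * np.log(1.0 / alpha))
    print("EVaR numeric:    ", round(numeric, 4))
    print("EVaR closed form:", round(closed_form, 4))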




Quantum Channel
In quantum information theory, a quantum channel is a communication channel which can transmit quantum information, as well as classical information. An example of quantum information is the state of a qubit. An example of classical information is a text document transmitted over the Internet. More formally, quantum channels are completely positive (CP) trace-preserving maps between spaces of operators. In other words, a quantum channel is just a quantum operation viewed not merely as the reduced dynamics of a system but as a pipeline intended to carry quantum information. (Some authors use the term "quantum operation" to also include trace-decreasing maps while reserving "quantum channel" for strictly trace-preserving maps.) Memoryless quantum channel We will assume for the moment that all state spaces of the systems considered, classical or quantum, are finite-dimensional. The memoryless in the section title carries the same meaning as in classical information theory: the ...
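A minimal sketch (illustrative, assuming the standard Kraus operators of the single-qubit depolarizing channel) that applies a channel to a state and checks the completeness relation guaranteeing trace preservation:

    import numpy as np

    # Kraus operators of the depolarizing channel with error probability p
    p = 0.2
    I2 = np.eye(2, dtype=complex)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Y = np.array([[0, -1j], [1j, 0]])
    Z = np.array([[1, 0], [0, -1]], dtype=complex)
    kraus = [np.sqrt(1 - 3 * p / 4) * I2,
             np.sqrt(p / 4) * X, np.sqrt(p / 4) * Y, np.sqrt(p / 4) * Z]

    # Completeness sum_k K_k^dagger K_k = I  <=>  the map is trace-preserving
    completeness = sum(K.conj().T @ K for K in kraus)
    print("trace preserving:", np.allclose(completeness, I2))

    # Apply the channel to |0><0|
    rho = np.diag([1.0, 0.0]).astype(complex)
    rho_out = sum(K @ rho @ K.conj().T for K in kraus)
    print(np.round(rho_out.real, 3))   # diag(1 - p/2, p/2)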


Data Processing Inequality
The data processing inequality is an information theoretic concept which states that the information content of a signal cannot be increased via a local physical operation. This can be expressed concisely as 'post-processing cannot increase information'. Definition Let three random variables form the Markov chain X \rightarrow Y \rightarrow Z, implying that the conditional distribution of Z depends only on Y and is conditionally independent of X. Specifically, we have such a Markov chain if the joint probability mass function can be written as :p(x,y,z) = p(x)p(y|x)p(z|y) = p(y)p(x|y)p(z|y) In this setting, no processing of Y, deterministic or random, can increase the information that Y contains about X. Using the mutual information, this can be written as :I(X;Y) \geqslant I(X;Z) with equality I(X;Y) = I(X;Z) if and only if I(X;Y\mid Z)=0, i.e. Z and Y contain the same information about X, and X \rightarrow Z \rightarrow Y also forms a Markov chain. Proof One can ...
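A short Python check of the inequality for a simple two-stage noisy channel (illustrative values; not from the source):

    import numpy as np

    def mutual_information(p_xy):
        # I(X;Y) in bits from a joint probability table
        px = p_xy.sum(axis=1, keepdims=True)
        py = p_xy.sum(axis=0, keepdims=True)
        mask = p_xy > 0
        return float(np.sum(p_xy[mask] * np.log2((p_xy / (px * py))[mask])))

    # Markov chain X -> Y -> Z with p(x,y,z) = p(x) p(y|x) p(z|y)
    p_x = np.array([0.5, 0.5])
    p_y_given_x = np.array([[0.9, 0.1],   # first noisy stage (flip prob 0.1)
                            [0.1, 0.9]])
    p_z_given_y = np.array([[0.8, 0.2],   # second noisy stage (flip prob 0.2)
                            [0.2, 0.8]])

    p_xy = p_x[:, None] * p_y_given_x   # joint distribution of (X, Y)
    p_xz = p_xy @ p_z_given_y           # joint distribution of (X, Z), summing over Y

    i_xy, i_xz = mutual_information(p_xy), mutual_information(p_xz)
    print("I(X;Y) =", round(i_xy, 4), " I(X;Z) =", round(i_xz, 4), " DPI holds:", i_xy >= i_xz)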


Quantum Operation
In quantum mechanics, a quantum operation (also known as quantum dynamical map or quantum process) is a mathematical formalism used to describe a broad class of transformations that a quantum mechanical system can undergo. This was first discussed as a general stochastic transformation for a density matrix by George Sudarshan. The quantum operation formalism describes not only unitary time evolution or symmetry transformations of isolated systems, but also the effects of measurement and transient interactions with an environment. In the context of quantum computation, a quantum operation is called a quantum channel. Note that some authors use the term "quantum operation" to refer specifically to completely positive (CP) and non-trace-increasing maps on the space of density matrices, and the term "quantum channel" to refer to the subset of those that are strictly trace-preserving. Quantum operations are formulated in terms of the density operator description of a quantum mechanica ...
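To illustrate the trace-decreasing case mentioned above, the sketch below (illustrative, not from the source) applies a single measurement operator to a qubit; the trace of the unnormalized output is the probability of that outcome:

    import numpy as np

    # Selective measurement of a qubit in the computational basis:
    # rho -> M0 rho M0^dagger with M0 = |0><0| is completely positive but trace-decreasing.
    plus = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2.0)
    rho = np.outer(plus, plus.conj())            # the state |+><+|

    M0 = np.diag([1.0, 0.0]).astype(complex)     # projector onto |0>
    post = M0 @ rho @ M0.conj().T

    prob = np.trace(post).real                   # probability of outcome 0 (here 0.5)
    print("outcome probability:", prob)
    print(np.round((post / prob).real, 3))       # normalized post-measurement state |0><0|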


Reduced Density Matrices
In quantum mechanics, the reduced density matrix of a subsystem describes the state of that subsystem when the rest of a composite system is ignored. It is obtained from the density matrix of the whole system by taking the partial trace over the degrees of freedom of the other subsystems: for a bipartite state \rho_{AB}, the reduced state of A is \rho_A = \operatorname{Tr}_B(\rho_{AB}). The reduced density matrix reproduces the statistics of every measurement performed on the subsystem alone, and for an entangled pure state the reduced states are mixed.
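A brief numpy sketch (illustrative, not from the source) computes the reduced state of one qubit of a Bell pair by a partial trace; the result is the maximally mixed state I/2:

    import numpy as np

    # Bell state (|00> + |11>)/sqrt(2) on the two-qubit system AB
    bell = np.zeros(4, dtype=complex)
    bell[0] = bell[3] = 1.0 / np.sqrt(2.0)
    rho_ab = np.outer(bell, bell.conj())

    # Partial trace over B: reshape indices as (a, b, a', b') and sum over b = b'
    rho_a = np.einsum('abcb->ac', rho_ab.reshape(2, 2, 2, 2))
    print(np.round(rho_a.real, 3))   # [[0.5, 0. ], [0. , 0.5]]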






Hilbert Space
In mathematics, Hilbert spaces (named after David Hilbert) allow generalizing the methods of linear algebra and calculus from (finite-dimensional) Euclidean vector spaces to spaces that may be infinite-dimensional. Hilbert spaces arise naturally and frequently in mathematics and physics, typically as function spaces. Formally, a Hilbert space is a vector space equipped with an inner product that defines a distance function for which the space is a complete metric space. The earliest Hilbert spaces were studied from this point of view in the first decade of the 20th century by David Hilbert, Erhard Schmidt, and Frigyes Riesz. They are indispensable tools in the theories of partial differential equations, quantum mechanics, Fourier analysis (which includes applications to signal processing and heat transfer), and ergodic theory (which forms the mathematical underpinning of thermodynamics). John von Neumann coined the term ''Hilbert space'' for the abstract concept that under ...