Strong Subadditivity Of Quantum Entropy
In quantum information theory, strong subadditivity of quantum entropy (SSA) is the relation among the von Neumann entropies of various quantum subsystems of a larger quantum system consisting of three subsystems (or of one quantum system with three degrees of freedom). It is a basic theorem in modern quantum information theory. It was conjectured by D. W. Robinson and D. Ruelle in 1966, and by O. E. Lanford III and D. W. Robinson in 1968, and proved in 1973 by E. H. Lieb and M. B. Ruskai, building on results obtained by Lieb in his proof of the Wigner–Yanase–Dyson conjecture. The classical version of SSA was long known and appreciated in classical probability theory and information theory. The proof of this relation in the classical case is quite easy, but the quantum case is difficult because of the non-commutativity of the reduced density matrices describing the quantum subsystems. Some useful references include "Quantum Computation and Quantum Information" and "Quantum Entr ...
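Concretely, SSA states that for a tripartite state \rho_{ABC}, S(\rho_{ABC}) + S(\rho_B) \le S(\rho_{AB}) + S(\rho_{BC}), where S is the von Neumann entropy and \rho_B, \rho_{AB}, \rho_{BC} are reduced density matrices. The following Python sketch (not part of the original article; the helper functions are ad hoc and assume only numpy) checks the inequality numerically for a randomly generated three-qubit state.

# Numerical check of strong subadditivity for a random three-qubit state
# (illustrative sketch only; the helpers below are not a standard API).
import numpy as np

def random_density_matrix(dim, seed=0):
    # rho = G G^dagger / Tr(G G^dagger) for a complex Gaussian G is a valid state.
    rng = np.random.default_rng(seed)
    g = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    rho = g @ g.conj().T
    return rho / np.trace(rho)

def von_neumann_entropy(rho):
    # S(rho) = -sum_i p_i ln p_i over the eigenvalues of rho (0 ln 0 taken as 0).
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

def partial_trace(rho, keep, dims):
    # Reduced density matrix on the subsystems listed in `keep`.
    n = len(dims)
    keep = sorted(keep)
    rho_t = rho.reshape(dims + dims)
    row = list(range(n))                                   # row (ket) indices
    col = [i + n if i in keep else i for i in range(n)]    # traced columns reuse row labels
    out = keep + [i + n for i in keep]
    reduced = np.einsum(rho_t, row + col, out)
    d = int(np.prod([dims[i] for i in keep]))
    return reduced.reshape(d, d)

dims = [2, 2, 2]                                           # subsystems A, B, C
rho_abc = random_density_matrix(8)
S = von_neumann_entropy
lhs = S(rho_abc) + S(partial_trace(rho_abc, [1], dims))                                   # S(ABC) + S(B)
rhs = S(partial_trace(rho_abc, [0, 1], dims)) + S(partial_trace(rho_abc, [1, 2], dims))   # S(AB) + S(BC)
print(lhs <= rhs + 1e-10)                                  # SSA predicts True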



Freeman Dyson
Freeman John Dyson (15 December 1923 – 28 February 2020) was an English-American theoretical physicist and mathematician known for his work in quantum field theory, astrophysics, random matrices, the mathematical formulation of quantum mechanics, condensed matter physics, nuclear physics, and engineering. He was Professor Emeritus at the Institute for Advanced Study in Princeton and a member of the Board of Sponsors of the Bulletin of the Atomic Scientists. Dyson originated several concepts that bear his name, such as Dyson's transform, a fundamental technique in additive number theory, which he developed as part of his proof of Mann's theorem; the Dyson tree, a hypothetical genetically engineered plant capable of growing in a comet; the Dyson series, a perturbative series in which each term is represented by Feynman diagrams; and the Dyson sphere, a thought experiment that attempts to explain how a spacefaring civilization would meet its energy requirements with ...


Von Neumann Entropy
In physics, the von Neumann entropy, named after John von Neumann, is an extension of the concept of Gibbs entropy from classical statistical mechanics to quantum statistical mechanics. For a quantum-mechanical system described by a density matrix \rho, the von Neumann entropy is S = -\operatorname{Tr}(\rho \ln \rho), where \operatorname{Tr} denotes the trace and \ln denotes the (natural) matrix logarithm. If \rho is written in terms of its eigenvectors |1\rangle, |2\rangle, |3\rangle, \dots as \rho = \sum_j \eta_j |j\rangle\langle j|, then the von Neumann entropy is merely S = -\sum_j \eta_j \ln \eta_j. In this form, ''S'' can be seen as the information-theoretic Shannon entropy. The von Neumann entropy is also used in different forms (conditional entropies, relative entropies, etc.) in the framework of quantum information theory to characterize the entropy of entanglement. Background: John von Neumann established a rigorous mathematical framework for quantum me ...
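As a minimal numerical sketch (not from the article; plain numpy, natural logarithm), the entropy can be computed directly from the eigenvalues of \rho:

import numpy as np

def von_neumann_entropy(rho):
    # S = -sum_j eta_j ln eta_j over the eigenvalues eta_j of the density matrix.
    eta = np.linalg.eigvalsh(rho)
    eta = eta[eta > 1e-12]          # discard numerical zeros; 0 ln 0 is taken as 0
    return float(-np.sum(eta * np.log(eta)))

# Example: the maximally mixed qubit rho = I/2 has entropy ln 2.
print(von_neumann_entropy(np.eye(2) / 2), np.log(2))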


Schrödinger–HJW Theorem
In quantum information theory and quantum optics, the Schrödinger–HJW theorem is a result about the realization of a mixed state of a quantum system as an ensemble of pure quantum states and the relation between the corresponding purifications of the density operators. The theorem is named after physicists and mathematicians Erwin Schrödinger, Lane P. Hughston, Richard Jozsa and William Wootters. The result was also found independently (albeit partially) by Nicolas Gisin, and by Nicolas Hadjisavvas building upon work by Ed Jaynes, while a significant part of it was likewise independently discovered by N. David Mermin. Thanks to its complicated history, it is also known by various other names such as the GHJW theorem, the HJW theorem, and the purification theorem. Purification of a mixed quantum state: Let \mathcal H_S be a finite-dimensional Hilbert space, and consider a generic (possibly mixed) quantum state \rho defined on \mathcal H_S, and admitting a decomposition of the ...
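As an illustration of the purification construction behind the theorem (a sketch under simplifying assumptions, not the article's general statement): if \rho = \sum_i p_i |i\rangle\langle i| on \mathcal H_S, then |\psi\rangle = \sum_i \sqrt{p_i}\, |i\rangle_S \otimes |i\rangle_A is a pure state on \mathcal H_S \otimes \mathcal H_A whose reduced state on S is \rho. A small numpy check:

import numpy as np

def purify(rho):
    # One purification built from the spectral decomposition of rho.
    p, vecs = np.linalg.eigh(rho)
    d = rho.shape[0]
    psi = np.zeros(d * d, dtype=complex)
    for i in range(d):
        if p[i] > 0:
            psi += np.sqrt(p[i]) * np.kron(vecs[:, i], np.eye(d)[:, i])
    return psi

rho = np.array([[0.75, 0.0], [0.0, 0.25]])
psi = purify(rho)
# Trace out the ancilla A (ancilla dimension equals d = 2 here).
full = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
rho_S = np.trace(full, axis1=1, axis2=3)
print(np.allclose(rho_S, rho))                     # True: the reduced state is rho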




Andreas Winter
Andreas J. Winter (born 14 June 1971, Mühldorf, Germany) is a German mathematician and mathematical physicist at the Universitat Autònoma de Barcelona (UAB) in Spain. He received his Ph.D. in 1999 under Rudolf Ahlswede and Friedrich Götze at the Universität Bielefeld in Germany before moving to the University of Bristol and then to the Centre for Quantum Technologies (CQT) at the National University of Singapore. In 2013 he was appointed ICREA Research Professor at UAB. Winter's research focuses on quantum information theory. Some of his main contributions concern the understanding of quantum communication protocols, coding theory for quantum channels, and the theory of quantum entanglement. Together with Michał Horodecki and Jonathan Oppenheim, he discovered quantum state merging and used this primitive to show that quantum information could be negative. Together with Marcin Pawlowski, Tomasz Paterek, Dagomir Kaszlikowski, and Valerio Scarani, he disc ...


Patrick Hayden (scientist)
Patrick Hayden is a physicist and computer scientist active in the fields of quantum information theory and quantum computing. He is currently a professor in the Stanford University physics department and a distinguished research chair at the Perimeter Institute for Theoretical Physics. Prior to that, he held a Canada Research Chair in the physics of information at McGill University. He received a B.Sc. (1998) from McGill University and won a Rhodes Scholarship to study for a D.Phil. (2001) at the University of Oxford under the supervision of Artur Ekert. In 2007 he was awarded a Sloan Research Fellowship in Computer Science. He was a Canadian Mathematical Society Public Lecturer in 2008 and received a Simons Investigator Award in 2014. Hayden has contributed substantially to quantum information theory, with work ranging from quantum information approaches to the theory of black holes to the study of quantum entanglement. Hayden and John Preskill considered informatio ...


Hermitian Adjoint
In mathematics, specifically in operator theory, each linear operator A on a Euclidean vector space defines a Hermitian adjoint (or adjoint) operator A^* on that space according to the rule \langle Ax,y \rangle = \langle x,A^*y \rangle, where \langle \cdot,\cdot \rangle is the inner product on the vector space. The adjoint may also be called the Hermitian conjugate or simply the Hermitian, after Charles Hermite. It is often denoted by A^\dagger in fields like physics, especially when used in conjunction with bra–ket notation in quantum mechanics. In finite dimensions where operators are represented by matrices, the Hermitian adjoint is given by the conjugate transpose (also known as the Hermitian transpose). The above definition of an adjoint operator extends verbatim to bounded linear operators on Hilbert spaces H. The definition has been further extended to include unbounded ''densely defined'' operators whose domain is topologically dense in—but not necessarily equal to— ...
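A quick finite-dimensional check (a sketch, not from the article; it assumes the standard inner product \langle x, y \rangle = x^\dagger y, conjugate-linear in the first argument): the conjugate transpose satisfies the defining relation \langle Ax, y \rangle = \langle x, A^*y \rangle.

import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
x = rng.normal(size=3) + 1j * rng.normal(size=3)
y = rng.normal(size=3) + 1j * rng.normal(size=3)

A_adj = A.conj().T                                  # Hermitian adjoint = conjugate transpose
lhs = np.vdot(A @ x, y)                             # <Ax, y>  (vdot conjugates its first argument)
rhs = np.vdot(x, A_adj @ y)                         # <x, A* y>
print(np.isclose(lhs, rhs))                         # True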


Haar Measure
In mathematical analysis, the Haar measure assigns an "invariant volume" to subsets of locally compact topological groups, consequently defining an integral for functions on those groups. This measure was introduced by Alfréd Haar in 1933, though its special case for Lie groups had been introduced by Adolf Hurwitz in 1897 under the name "invariant integral". Haar measures are used in many parts of analysis, number theory, group theory, representation theory, statistics, probability theory, and ergodic theory. Preliminaries: Let (G, \cdot) be a locally compact Hausdorff topological group. The \sigma-algebra generated by all open subsets of G is called the Borel algebra. An element of the Borel algebra is called a Borel set. If g is an element of G and S is a subset of G, then we define the left and right translates of S by g as follows: the left translate gS = \{g \cdot s : s \in S\} and the right translate Sg = \{s \cdot g : s \in S\}. Left and right translates map Borel sets onto Borel sets. A measure \mu on th ...
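In the quantum-information setting, the Haar measure on the unitary group is what "uniformly random unitary" means. A common sampling recipe (a sketch under that assumption, not part of the article) is the QR decomposition of a complex Gaussian matrix with a diagonal phase correction:

import numpy as np

def haar_unitary(n, seed=None):
    # Sample an n x n unitary distributed according to Haar measure on U(n).
    rng = np.random.default_rng(seed)
    z = (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    # Fix the phases of R's diagonal so the distribution is exactly Haar.
    phases = np.diag(r) / np.abs(np.diag(r))
    return q * phases                               # multiplies column j of q by phases[j]

U = haar_unitary(4, seed=1)
print(np.allclose(U.conj().T @ U, np.eye(4)))       # unitarity check: True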




Stinespring Factorization Theorem
In mathematics, Stinespring's dilation theorem, also called Stinespring's factorization theorem, named after W. Forrest Stinespring, is a result from operator theory that represents any completely positive map on a C*-algebra ''A'' as a composition of two completely positive maps, each of which has a special form: (1) a *-representation of ''A'' on some auxiliary Hilbert space ''K'', followed by (2) an operator map of the form ''T'' ↦ ''V*TV''. Moreover, Stinespring's theorem is a structure theorem for completely positive maps from a C*-algebra into the algebra of bounded operators on a Hilbert space: completely positive maps are shown to be simple modifications of *-representations (sometimes called *-homomorphisms). Formulation: In the case of a unital C*-algebra, the result is as follows. Theorem. Let ''A'' be a unital C*-algebra, ''H'' be a Hilbert space, and ''B''(''H'') be the bounded operators on ''H''. For every completely positive map \Phi : A \to B(H), there exists a Hilbert space ''K'' and a unit ...
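A finite-dimensional sketch (an illustration under simplifying assumptions, not the article's general C*-algebraic statement): for a quantum channel given by Kraus operators K_i, stacking them into the isometry V|\psi\rangle = \sum_i |i\rangle \otimes K_i|\psi\rangle gives a dilation with \Phi(a) = V^*(I \otimes a)V in the Heisenberg picture.

import numpy as np

def stinespring_isometry(kraus):
    # Stack Kraus operators into V : C^d -> C^m (x) C^d (environment index first).
    return np.concatenate(kraus, axis=0)            # shape (m*d, d)

# Example: a single-qubit bit-flip channel with flip probability p.
p = 0.25
K = [np.sqrt(1 - p) * np.eye(2), np.sqrt(p) * np.array([[0, 1], [1, 0]])]
V = stinespring_isometry(K)

print(np.allclose(V.conj().T @ V, np.eye(2)))       # V is an isometry: V*V = I
a = np.array([[1, 2j], [-2j, 3]])                   # a test observable
phi_a = sum(k.conj().T @ a @ k for k in K)          # Heisenberg-picture action of the channel
dilated = V.conj().T @ np.kron(np.eye(len(K)), a) @ V
print(np.allclose(phi_a, dilated))                  # agrees with V*(I (x) a)V: True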


Trace (Linear Algebra)
In linear algebra, the trace of a square matrix A, denoted tr(A), is defined to be the sum of the elements on the main diagonal (from the upper left to the lower right) of A. The trace is only defined for a square matrix (n × n). It can be proved that the trace of a matrix is the sum of its (complex) eigenvalues (counted with multiplicities). It can also be proved that tr(AB) = tr(BA) for any two matrices A and B. This implies that similar matrices have the same trace. As a consequence, one can define the trace of a linear operator mapping a finite-dimensional vector space into itself, since all matrices describing such an operator with respect to a basis are similar. The trace is related to the derivative of the determinant (see Jacobi's formula). Definition: The trace of an n × n square matrix A is defined as \operatorname{tr}(\mathbf{A}) = \sum_{i=1}^n a_{ii} = a_{11} + a_{22} + \dots + a_{nn}, where a_{ii} denotes the entry on the i-th row and i-th column of A. The entries of A can be real numbers or (more generally) complex numbers. The trace is not de ...
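Both facts are easy to verify numerically; the following sketch (not from the article, plain numpy) checks that the trace equals the sum of the eigenvalues and that tr(AB) = tr(BA) for random matrices.

import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))
B = rng.normal(size=(3, 3))

print(np.isclose(np.trace(A), np.sum(np.linalg.eigvals(A))))   # trace = sum of (complex) eigenvalues
print(np.isclose(np.trace(A @ B), np.trace(B @ A)))            # tr(AB) = tr(BA)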


Completely Positive Map
In mathematics, a positive map is a map between C*-algebras that sends positive elements to positive elements. A completely positive map is one which satisfies a stronger, more robust condition. Definition: Let A and B be C*-algebras. A linear map \phi: A\to B is called a positive map if \phi maps positive elements to positive elements: a\geq 0 \implies \phi(a)\geq 0. Any linear map \phi:A\to B induces another map \textrm{id} \otimes \phi : \mathbb{C}^{k \times k} \otimes A \to \mathbb{C}^{k \times k} \otimes B in a natural way. If \mathbb{C}^{k \times k}\otimes A is identified with the C*-algebra A^{k \times k} of k\times k matrices with entries in A, then \textrm{id}\otimes\phi acts as \begin{pmatrix} a_{11} & \cdots & a_{1k} \\ \vdots & \ddots & \vdots \\ a_{k1} & \cdots & a_{kk} \end{pmatrix} \mapsto \begin{pmatrix} \phi(a_{11}) & \cdots & \phi(a_{1k}) \\ \vdots & \ddots & \vdots \\ \phi(a_{k1}) & \cdots & \phi(a_{kk}) \end{pmatrix}. We say that \phi is k-positive if \textrm{id}_{k} \otimes \phi is a positive map, and \phi is called completely positive if \phi is k-positive for all k. Properties: * Positi ...
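In finite dimensions, complete positivity can be tested with the Choi matrix criterion (Choi's theorem, which this excerpt does not state but which is the standard tool): \phi is completely positive iff \sum_{ij} E_{ij} \otimes \phi(E_{ij}) is positive semidefinite, where E_{ij} are the matrix units. The sketch below (an illustration, not from the article) applies it to the transpose map, the classic example of a map that is positive but not completely positive.

import numpy as np

def choi_matrix(phi, d):
    # C = sum_{ij} E_ij (x) phi(E_ij), where E_ij are the d x d matrix units.
    C = np.zeros((d * d, d * d), dtype=complex)
    for i in range(d):
        for j in range(d):
            E = np.zeros((d, d), dtype=complex)
            E[i, j] = 1.0
            C += np.kron(E, phi(E))
    return C

transpose = lambda X: X.T
eigs = np.linalg.eigvalsh(choi_matrix(transpose, 2))
print(np.round(eigs.real, 6))   # a negative eigenvalue appears => not completely positive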


Quantum Mutual Information
In quantum information theory, quantum mutual information, or von Neumann mutual information, after John von Neumann, is a measure of correlation between subsystems of a quantum state. It is the quantum mechanical analog of Shannon mutual information. Motivation: For simplicity, it will be assumed that all objects in the article are finite-dimensional. The definition of quantum mutual entropy is motivated by the classical case. For a probability distribution of two variables ''p''(''x'', ''y''), the two marginal distributions are p(x) = \sum_{y} p(x,y), \qquad p(y) = \sum_{x} p(x,y). The classical mutual information ''I''(''X'':''Y'') is defined by I(X:Y) = S(p(x)) + S(p(y)) - S(p(x,y)), where ''S''(''q'') denotes the Shannon entropy of the probability distribution ''q''. One can calculate directly \begin{align} S(p(x)) + S(p(y)) &= -\left(\sum_x p(x) \log p(x) + \sum_y p(y) \log p(y) \right) \\ &= -\left(\sum_x \left( \sum_{y'} p(x,y') \log \sum_{y'} p(x,y') \right) + \sum_y \left( \s ...
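A small numerical illustration (not from the article; natural-log entropies, plain numpy): for a maximally entangled two-qubit Bell state, I(A:B) = S(\rho_A) + S(\rho_B) - S(\rho_{AB}) = 2 \ln 2, since both marginals are maximally mixed while the joint state is pure.

import numpy as np

def entropy(rho):
    # von Neumann entropy from the eigenvalues of rho (natural logarithm).
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

# Bell state |Phi+> = (|00> + |11>)/sqrt(2).
phi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_ab = np.outer(phi, phi.conj()).reshape(2, 2, 2, 2)   # indices (a, b, a', b')
rho_a = np.trace(rho_ab, axis1=1, axis2=3)               # trace out B
rho_b = np.trace(rho_ab, axis1=0, axis2=2)               # trace out A
I = entropy(rho_a) + entropy(rho_b) - entropy(rho_ab.reshape(4, 4))
print(I, 2 * np.log(2))                                  # I(A:B) = 2 ln 2 for a Bell state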