A generalized probabilistic theory (GPT) is a general framework to describe the
operational features of arbitrary
physical theories. A GPT must specify what kind of physical systems one can find in the lab, as well as rules to compute the outcome statistics of any experiment involving labeled preparations, transformations and measurements. The framework of GPTs has been used to define hypothetical non-quantum physical theories which nonetheless possess
quantum theory's most remarkable features, such as
entanglement or
teleportation. Notably, a small set of physically motivated axioms is enough to single out the GPT representation of quantum theory.
The mathematical formalism of GPTs has been developed since the 1950s and 1960s by many authors, and rediscovered independently several times. The earliest ideas are due to Segal and Mackey, although the first comprehensive and mathematically rigorous treatment can be traced back to the work of Ludwig, Dähn, and Stolz, all three based at the University of Marburg.
While the formalism in these earlier works bears less resemblance to the modern one, by the early 1970s the ideas of the Marburg school had matured and the notation had developed towards modern usage, thanks also to the independent contributions of Davies and Lewis.
The books by Ludwig and the proceedings of a conference held in Marburg in 1973 offer a comprehensive account of these early developments.
The term "generalized probabilistic theory" itself was coined by Jonathan Barrett in 2007,
based on the version of the framework introduced by Lucien Hardy.
Note that some authors also use the term "operational probabilistic theory" to denote a particular variant of GPTs.
Definition
A GPT is specified by a number of mathematical structures, namely:
* a family of state spaces, each of which represents a physical system;
* a composition rule (usually corresponding to a tensor product), which specifies how joint state spaces are formed;
* a set of measurements, which map states to probabilities and are usually described by an
effect algebra;
* a set of possible physical operations, i.e., transformations that map state spaces to state spaces.
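To make these ingredients concrete, the following minimal Python sketch spells them out for the simplest case, a classical bit; the function names and numerical values here are illustrative choices, not part of the framework itself.

```python
import numpy as np

# 1. State space: probability vectors (p0, p1) with p0 + p1 = 1, p_i >= 0.
def is_state(w):
    return bool(np.all(w >= 0) and np.isclose(w.sum(), 1.0))

# 2. Composition rule: joint states of two bits live in the tensor product.
def compose(w1, w2):
    return np.kron(w1, w2)

# 3. Measurements: effects are functionals mapping states to probabilities.
e0 = np.array([1.0, 0.0])  # "observe outcome 0"
def probability(effect, state):
    return float(effect @ state)

# 4. Physical operations: stochastic maps (each column a probability vector).
flip = np.array([[0.1, 0.9],
                 [0.9, 0.1]])

w = np.array([0.7, 0.3])
assert is_state(w) and is_state(flip @ w) and is_state(compose(w, w))
print(probability(e0, flip @ w))  # 0.34
```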
It can be argued that if one can prepare a state \omega_1 and a different state \omega_2, then one can also toss a (possibly biased) coin which lands on one side with probability p and on the other with probability 1-p, and prepare either \omega_1 or \omega_2, depending on the side the coin lands on. The resulting state is a statistical mixture of the states \omega_1 and \omega_2, and in GPTs such statistical mixtures are described by convex combinations, in this case p\omega_1 + (1-p)\omega_2. For this reason all state spaces are assumed to be convex sets. Following a similar reasoning, one can argue that the set of measurement outcomes and the set of physical operations must also be convex.
Additionally, it is always assumed that measurement outcomes and physical operations are affine maps, i.e. that if T is a physical transformation, then we must have T(p\omega_1 + (1-p)\omega_2) = p T(\omega_1) + (1-p) T(\omega_2), and similarly for measurement outcomes. This follows from the argument that we should obtain the same result whether we first prepare a statistical mixture and then apply the physical operation, or we prepare a statistical mixture of the outputs of the physical operation.
Note that physical operations are a subset of all affine maps which transform states into states, as we must require that a physical operation yields a valid state even when it is applied to part of a system (the notion of "part" is subtle: it is specified by explaining how different system types compose and how the global parameters of the composite system are affected by local operations).
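The affinity condition can be checked numerically; the sketch below uses qubit density matrices as a concrete instance of a GPT, with arbitrarily chosen states and an example bit-flip channel standing in for the transformation T.

```python
import numpy as np

# Two example qubit states (density matrices) and a coin bias.
omega_1 = np.array([[1, 0], [0, 0]], dtype=complex)          # |0><0|
omega_2 = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # |+><+|
p = 0.3

# An example physical transformation T: bit-flip channel with probability q.
q = 0.2
X = np.array([[0, 1], [1, 0]], dtype=complex)
def T(rho):
    return (1 - q) * rho + q * X @ rho @ X

# Affinity: transforming the mixture equals mixing the transformed states.
lhs = T(p * omega_1 + (1 - p) * omega_2)
rhs = p * T(omega_1) + (1 - p) * T(omega_2)
assert np.allclose(lhs, rhs)
```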
For practical reasons it is often assumed that a general GPT is embedded in a finite-dimensional vector space, although infinite-dimensional formulations exist.
Classical, quantum, and beyond
Classical theory is a GPT where states correspond to probability distributions and both measurements and physical operations are stochastic maps. One can see that in this case all state spaces are
simplexes.
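A short NumPy sketch (with an arbitrary example state and stochastic map) illustrates why these state spaces are simplexes: every state decomposes uniquely into the deterministic vertex states, and stochastic maps send the simplex into itself.

```python
import numpy as np

# A classical 3-level system: states are probability vectors, i.e. points
# of the 2-simplex whose vertices are the deterministic states.
vertices = np.eye(3)           # deterministic states e_0, e_1, e_2
w = np.array([0.2, 0.5, 0.3])  # an arbitrary mixed state

# On a simplex the convex decomposition into vertices is unique: the
# weights are just the entries of w itself.
assert np.allclose(w @ vertices, w)

# Physical operations are stochastic maps (each column a probability
# vector), so they map the simplex into itself.
S = np.array([[0.8, 0.1, 0.0],
              [0.2, 0.6, 0.3],
              [0.0, 0.3, 0.7]])
assert np.allclose(S.sum(axis=0), 1.0)
assert np.isclose((S @ w).sum(), 1.0) and np.all(S @ w >= 0)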
Quantum theory is a GPT where system types are described by a natural number d, which corresponds to the Hilbert space dimension. States of a system of Hilbert space dimension d are described by the normalized positive semidefinite d \times d matrices, i.e. by the density matrices.
Measurements are identified with positive operator-valued measures (POVMs), and the physical operations are completely positive maps. Systems compose via the tensor product of the underlying Hilbert spaces.
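The following NumPy sketch shows these ingredients for a single qubit, d = 2; the particular state and POVM are arbitrary illustrative choices.

```python
import numpy as np

# An example density matrix and the computational-basis projector POVM.
rho = 0.5 * np.array([[1, 1], [1, 1]], dtype=complex)  # pure state |+><+|

E0 = np.diag([1.0, 0.0]).astype(complex)  # projector onto |0>
E1 = np.diag([0.0, 1.0]).astype(complex)  # projector onto |1>
assert np.allclose(E0 + E1, np.eye(2))    # POVM elements sum to identity

# Born rule: outcome probabilities are p(k) = Tr(E_k rho).
probs = [np.trace(E @ rho).real for E in (E0, E1)]
print(probs)  # [0.5, 0.5]

# Composition rule: systems combine via the tensor (Kronecker) product.
rho_joint = np.kron(rho, E0)  # |+><+| tensored with |0><0|
assert np.isclose(np.trace(rho_joint).real, 1.0)
```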
The framework of GPTs has provided examples of consistent physical theories which cannot be embedded in quantum theory and indeed exhibit very non-quantum features. One of the first examples was Box-world, the theory with maximal non-local correlations.
Other examples are theories with third-order interference
and the family of GPTs known as generalized bits.
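The non-locality of Box-world can be made concrete through the Popescu-Rohrlich (PR) box it contains; the minimal Python check below evaluates the CHSH expression on the PR box correlations.

```python
import itertools

# PR box: binary outcomes a, b for binary inputs x, y, with
# P(a, b | x, y) = 1/2 if a XOR b == x AND y, and 0 otherwise.
def pr_box(a, b, x, y):
    return 0.5 if (a ^ b) == (x & y) else 0.0

# The CHSH expression is bounded by 2 classically and by 2*sqrt(2) in
# quantum theory (Tsirelson's bound); the PR box attains the algebraic
# maximum of 4, i.e. maximal non-local correlations.
chsh = 0.0
for x, y in itertools.product((0, 1), repeat=2):
    E_xy = sum((-1) ** (a ^ b) * pr_box(a, b, x, y)
               for a, b in itertools.product((0, 1), repeat=2))
    chsh += -E_xy if (x, y) == (1, 1) else E_xy
print(chsh)  # 4.0
```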
Many features that were considered purely quantum are actually present in all non-classical GPTs. These include the impossibility of universal broadcasting, i.e., the
no-cloning theorem
; the existence of incompatible measurements;
and the existence of entangled states or entangled measurements.
See also
* Quantum foundations