Neural Network Quantum States (NQS or NNQS) is a general class of variational quantum states parameterized in terms of an artificial neural network. It was first introduced in 2017 by the physicists Giuseppe Carleo and Matthias Troyer to approximate wave functions of many-body quantum systems. Given a many-body quantum state |\Psi\rangle comprising N degrees of freedom and a choice of associated quantum numbers s_1 \ldots s_N , an NQS parameterizes the wave-function amplitudes \langle s_1 \ldots s_N | \Psi; W \rangle = F(s_1 \ldots s_N; W), where F(s_1 \ldots s_N; W) is an artificial neural network with parameters (weights) W , N input variables ( s_1 \ldots s_N ), and one complex-valued output corresponding to the wave-function amplitude. This variational form is used in conjunction with specific stochastic learning approaches to approximate quantum states of interest.
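
As a concrete illustration, the sketch below (Python with NumPy) evaluates one popular choice of F : the restricted Boltzmann machine with complex weights used in the original work of Carleo and Troyer. The network sizes, initial parameters, and spin encoding are illustrative assumptions rather than a prescription.

```python
import numpy as np

def rbm_amplitude(s, a, b, W):
    """Complex restricted-Boltzmann-machine amplitude F(s; W) for spins s in {-1, +1}^N:
    F(s) = exp(sum_i a_i s_i) * prod_j 2 cosh(b_j + sum_i W_ji s_i)."""
    theta = b + W @ s                     # arguments of the hidden units
    return np.exp(a @ s) * np.prod(2.0 * np.cosh(theta))

# Illustrative example: a small random network and one spin configuration
rng = np.random.default_rng(0)
N, n_hid = 6, 12                          # number of spins and of hidden units (arbitrary)
a = 0.01 * (rng.normal(size=N) + 1j * rng.normal(size=N))            # visible biases
b = 0.01 * (rng.normal(size=n_hid) + 1j * rng.normal(size=n_hid))    # hidden biases
W = 0.01 * (rng.normal(size=(n_hid, N)) + 1j * rng.normal(size=(n_hid, N)))  # couplings
s = rng.choice([-1.0, 1.0], size=N)
print(rbm_amplitude(s, a, b, W))          # one complex amplitude <s_1 ... s_N | Psi; W>
```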


Learning the Ground-State Wave Function

One common application of NQS is to find an approximate representation of the ground-state wave function of a given Hamiltonian \hat{H} . The learning procedure in this case consists in finding the best neural-network weights that minimize the variational energy E(W) = \langle \Psi; W | \hat{H} | \Psi; W \rangle . Since, for a general artificial neural network, computing this expectation value exactly is an exponentially costly operation in N , stochastic techniques based, for example, on the Monte Carlo method are used to estimate E(W) , analogously to what is done in Variational Monte Carlo (see, for example, Becca and Sorella for a review). More specifically, a set of M samples S^{(1)}, S^{(2)}, \ldots, S^{(M)} , with S^{(i)} = s^{(i)}_1 \ldots s^{(i)}_N , is generated such that they are distributed according to the Born probability density P(S) \propto |F(s_1 \ldots s_N; W)|^2 . Then it can be shown that the sample mean of the so-called "local energy" E_{\mathrm{loc}}(S) = \langle S | \hat{H} | \Psi \rangle / \langle S | \Psi \rangle is a statistical estimate of the quantum expectation value E(W) , i.e. E(W) \simeq \frac{1}{M} \sum_{i=1}^{M} E_{\mathrm{loc}}(S^{(i)}) .
Similarly, it can be shown that the gradient of the energy with respect to the network weights W is also approximated by a sample mean, \frac{\partial E(W)}{\partial W_k} \simeq \frac{1}{M} \sum_{i=1}^{M} \left( E_{\mathrm{loc}}(S^{(i)}) - E(W) \right) O^{\star}_k(S^{(i)}) , where O_k(S) = \frac{\partial \ln F(S; W)}{\partial W_k} is the logarithmic derivative of the network output and can be efficiently computed, in deep networks, through backpropagation. The stochastic approximation of the gradients is then used to minimize the energy E(W) , typically using a stochastic gradient descent approach. Each time the neural-network parameters are updated, a new set of samples S^{(i)} is generated, in an iterative procedure similar to what is done in unsupervised learning.
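
Putting the pieces together, a minimal, unoptimised training loop might look as follows. It reuses the functions and parameters from the previous sketches; the analytic log-derivatives of the RBM play the role that backpropagation would play for a deeper network, and the learning rate, sample size, and number of iterations are arbitrary illustrative choices.

```python
import numpy as np

def log_derivatives(s, a, b, W):
    """O(S) = d ln F(S; W) / d(weight) for all RBM weights, concatenated into one vector:
    d ln F / da_i = s_i,  d ln F / db_j = tanh(theta_j),  d ln F / dW_ji = tanh(theta_j) s_i."""
    t = np.tanh(b + W @ s)
    return np.concatenate([s.astype(complex), t, np.outer(t, s).ravel()])

def vmc_step(a, b, W, n_spins, lr=0.02, n_samples=200, rng=None):
    """One learning iteration: sample from |F|^2, estimate E(W) and its gradient, update weights."""
    amp = lambda spins: rbm_amplitude(spins, a, b, W)
    samples = metropolis_samples(amp, n_spins, n_samples, rng=rng)
    e_loc = np.array([local_energy_tfi(s, amp) for s in samples])
    O = np.array([log_derivatives(s, a, b, W) for s in samples])
    E = e_loc.mean()
    # sample-mean gradient estimator: (1/M) sum_i (E_loc(S^(i)) - E(W)) conj(O(S^(i)))
    grad = np.mean((e_loc - E)[:, None] * np.conj(O), axis=0)
    ga = grad[:len(a)]
    gb = grad[len(a):len(a) + len(b)]
    gW = grad[len(a) + len(b):].reshape(W.shape)
    # plain stochastic-gradient-descent update of all weights
    return a - lr * ga, b - lr * gb, W - lr * gW, E.real

# Iterative minimisation: a fresh set of samples is drawn after every parameter update
rng = np.random.default_rng(1)
for step in range(20):
    a, b, W, E = vmc_step(a, b, W, N, rng=rng)
    print(f"step {step:2d}   E = {E:.4f}")
```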


Connection with Tensor Networks

Neural-network representations of quantum wave functions share some similarities with variational quantum states based on tensor networks. For example, connections with matrix product states have been established. These studies have shown that NQS support volume-law scaling of the entanglement entropy. In general, an NQS with fully-connected weights corresponds, in the worst case, to a matrix product state whose bond dimension grows exponentially in N .
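
For comparison with the network-based amplitude above, the sketch below evaluates a matrix product state amplitude as the trace of a product of site matrices; the bond dimension chi is the quantity that, in the worst-case correspondence mentioned above, grows exponentially with N . The site tensors here are random and purely illustrative.

```python
import numpy as np

def mps_amplitude(s, tensors):
    """Amplitude <s_1 ... s_N | Psi> of a matrix product state,
    Tr[A_1^{s_1} A_2^{s_2} ... A_N^{s_N}], with s_i in {0, 1} indexing the physical leg
    and each tensors[i] of shape (2, chi, chi)."""
    chi = tensors[0].shape[1]
    prod = np.eye(chi, dtype=complex)
    for site, si in enumerate(s):
        prod = prod @ tensors[site][si]   # multiply in the matrix selected by s_i
    return np.trace(prod)

# Illustrative example: a random MPS with 6 sites and bond dimension chi = 4
rng = np.random.default_rng(2)
n_sites, chi = 6, 4
tensors = [rng.normal(size=(2, chi, chi)) + 1j * rng.normal(size=(2, chi, chi))
           for _ in range(n_sites)]
print(mps_amplitude([0, 1, 1, 0, 1, 0], tensors))
```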


See also

* Differentiable programming


References

* Carleo, Giuseppe; Troyer, Matthias (2017). "Solving the quantum many-body problem with artificial neural networks". Science 355 (6325): 602–606. arXiv:1606.02318. doi:10.1126/science.aag2302. PMID 28183973.
* Becca, Federico; Sorella, Sandro (2017). Quantum Monte Carlo Approaches for Correlated Systems. Cambridge University Press. doi:10.1017/9781316417041. ISBN 9781316417041.
* Chen, Jing; Cheng, Song; Xie, Haidong; Wang, Lei; Xiang, Tao (2018). "Equivalence of restricted Boltzmann machines and tensor network states". Physical Review B 97 (8): 085104. arXiv:1701.04831. doi:10.1103/PhysRevB.97.085104.