Neural Network Quantum States (NQS or NNQS) is a general class of variational quantum states parameterized in terms of an artificial neural network. It was first introduced in 2017 by the physicists Giuseppe Carleo and Matthias Troyer to approximate wave functions of many-body quantum systems.
Given a many-body quantum state <math>|\Psi\rangle</math> comprising <math>N</math> degrees of freedom and a choice of associated quantum numbers <math>s_1 \ldots s_N</math>, then an NQS parameterizes the wave-function amplitudes

<math>\langle s_1 \ldots s_N | \Psi; W \rangle = F(s_1 \ldots s_N; W),</math>

where <math>F(s_1 \ldots s_N; W)</math> is an artificial neural network of parameters (weights) <math>W</math>, <math>N</math> input variables (<math>s_1 \ldots s_N</math>) and one complex-valued output corresponding to the wave-function amplitude.
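As a concrete illustration, the following minimal sketch in Python/NumPy evaluates the log-amplitude of a restricted Boltzmann machine (RBM) ansatz, the architecture used in the original work of Carleo and Troyer. The function name <code>log_psi</code> and the choice of real-valued parameters are conventions of this sketch, not a standard API; with complex parameters the same expression yields the general complex-valued amplitude described above.

<syntaxhighlight lang="python">
import numpy as np

def log_psi(s, a, b, W):
    """Log-amplitude log F(s; W) of an RBM-style NQS, so that
    <s_1 ... s_N | Psi; W> = exp(log_psi(s, a, b, W)).

    s : array of N spins in {-1, +1} (the quantum numbers s_1 ... s_N)
    a : visible biases, shape (N,)
    b : hidden biases, shape (M,)
    W : weight matrix, shape (M, N)

    Real parameters give real, positive amplitudes; complex parameters
    would make the output a general complex log-amplitude.
    """
    theta = b + W @ s  # hidden-unit activations
    return a @ s + np.sum(np.log(2.0 * np.cosh(theta)))
</syntaxhighlight>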
This variational form is used in conjunction with specific
stochastic learning approaches to approximate quantum states of interest.
Learning the Ground-State Wave Function
One common application of NQS is to find an approximate representation of the ground-state wave function of a given Hamiltonian <math>\hat{H}</math>. The learning procedure in this case consists in finding the best neural-network weights that minimize the variational energy

<math>E(W) = \frac{\langle \Psi; W | \hat{H} | \Psi; W \rangle}{\langle \Psi; W | \Psi; W \rangle}.</math>
Since, for a general artificial neural network, computing the expectation value <math>E(W)</math> is an exponentially costly operation in <math>N</math>, stochastic techniques based, for example, on the Monte Carlo method are used to estimate it, analogously to what is done in Variational Monte Carlo; see, for example, Becca and Sorella (2017) for a review. More specifically, a set of <math>M</math> samples <math>s^{(1)}, s^{(2)}, \ldots, s^{(M)}</math>, with <math>s^{(k)} = s^{(k)}_1 \ldots s^{(k)}_N</math>, is generated such that they are distributed according to the Born probability density

<math>\Pi(s) = \frac{|\langle s | \Psi; W \rangle|^2}{\langle \Psi; W | \Psi; W \rangle}.</math>

Then it can be shown that the sample mean of the so-called "local energy"

<math>E_\mathrm{loc}(s) = \frac{\langle s | \hat{H} | \Psi; W \rangle}{\langle s | \Psi; W \rangle}</math>

is a statistical estimate of the quantum expectation value <math>E(W)</math>, i.e.

<math>E(W) \simeq \frac{1}{M} \sum_{k=1}^{M} E_\mathrm{loc}\big(s^{(k)}\big).</math>
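As an illustration, the local energy takes an explicit form once a Hamiltonian is fixed. The sketch below assumes a transverse-field Ising chain, an arbitrary example; any Hamiltonian whose matrix elements <math>\langle s | \hat{H} | s' \rangle</math> are sparse and known row by row works the same way. It also assumes the samples have already been drawn from the Born density, for instance with the Metropolis sampler sketched further below.

<syntaxhighlight lang="python">
import numpy as np

def local_energy(s, log_psi_fn, J=1.0, h=1.0):
    """Local energy E_loc(s) = <s|H|Psi> / <s|Psi> for the (illustrative)
    transverse-field Ising chain H = -J sum_i sz_i sz_{i+1} - h sum_i sx_i
    with periodic boundary conditions."""
    N = len(s)
    # Diagonal contribution: -J sum_i s_i s_{i+1}
    e = -J * np.sum(s * np.roll(s, -1))
    lp = log_psi_fn(s)
    # Off-diagonal contribution: each sigma^x_i connects s to s with spin i flipped
    for i in range(N):
        s_flip = s.copy()
        s_flip[i] *= -1
        e += -h * np.exp(log_psi_fn(s_flip) - lp)
    return e

def energy_estimate(samples, log_psi_fn):
    """Sample mean of the local energy over configurations drawn from |Psi|^2."""
    return np.mean([local_energy(s, log_psi_fn) for s in samples])
</syntaxhighlight>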
Similarly, it can be shown that the gradient of the energy with respect to the network weights <math>W</math> is also approximated by a sample mean,

<math>\frac{\partial E(W)}{\partial W_k} \simeq \frac{2}{M} \sum_{i=1}^{M} \operatorname{Re}\!\left[ \left( E_\mathrm{loc}\big(s^{(i)}\big) - \bar{E} \right) O_k^{*}\big(s^{(i)}\big) \right],</math>

where <math>\bar{E}</math> is the sample mean of the local energy and

<math>O_k(s) = \frac{\partial \ln \langle s | \Psi; W \rangle}{\partial W_k}</math>

is the logarithmic derivative of the wave-function amplitude, which can be efficiently computed, in deep networks, through backpropagation.
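For the RBM ansatz of the earlier sketch, the logarithmic derivatives <math>O_k(s)</math> have a closed form, so the gradient estimator can be written directly, as in the sketch below; for a deep network one would obtain <math>O_k(s)</math> from backpropagation instead. The flat parameter ordering <code>(a, b, W)</code> is a convention of this example only.

<syntaxhighlight lang="python">
import numpy as np

def log_derivatives(s, a, b, W):
    """Closed-form O_k(s) = d log<s|Psi;W> / dW_k for the RBM ansatz:
    d/da_i = s_i,  d/db_j = tanh(theta_j),  d/dW_ji = tanh(theta_j) * s_i."""
    t = np.tanh(b + W @ s)
    return np.concatenate([s, t, np.outer(t, s).ravel()])

def energy_gradient(samples, local_energies, a, b, W):
    """Stochastic estimate of dE/dW_k for real parameters:
    (2/M) * sum_i Re[(E_loc(s_i) - E_mean) * conj(O_k(s_i))].
    (For complex parameters one conventionally drops the factor 2 and the
    real part, treating W and its conjugate as independent variables.)"""
    O = np.array([log_derivatives(s, a, b, W) for s in samples])  # shape (M, K)
    E = np.asarray(local_energies)
    dE = E - E.mean()
    return 2.0 * np.real(np.conj(O).T @ dE) / len(samples)
</syntaxhighlight>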
The stochastic approximation of the gradients is then used to minimize the energy <math>E(W)</math>, typically using a stochastic gradient descent approach. When the neural-network parameters are updated at each step of the learning procedure, a new set of samples <math>s^{(k)}</math> is generated, in an iterative procedure similar to what is done in unsupervised learning.
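Combining the sketches above, a deliberately simplified end-to-end loop might look as follows. The single-spin-flip Metropolis sampler, the hyperparameters, and the plain SGD update are illustrative choices; practical implementations typically add refinements such as stochastic reconfiguration (natural-gradient) updates.

<syntaxhighlight lang="python">
import numpy as np

def metropolis_samples(log_psi_fn, N, n_samples, n_sweeps=10, rng=None):
    """Draw n_samples spin configurations from the Born density |Psi(s)|^2
    using single-spin-flip Metropolis updates."""
    rng = rng if rng is not None else np.random.default_rng()
    s = rng.choice([-1.0, 1.0], size=N)
    lp = log_psi_fn(s)
    samples = []
    for step in range(n_samples * n_sweeps):
        i = rng.integers(N)
        s_new = s.copy()
        s_new[i] *= -1
        lp_new = log_psi_fn(s_new)
        # Accept with probability min(1, |Psi(s')/Psi(s)|^2)
        if rng.random() < np.exp(2.0 * np.real(lp_new - lp)):
            s, lp = s_new, lp_new
        if step % n_sweeps == n_sweeps - 1:
            samples.append(s.copy())
    return samples

# Hypothetical hyperparameters for a 10-spin chain
N, M_hidden, n_samples, lr = 10, 20, 500, 0.05
rng = np.random.default_rng(0)
a = 0.01 * rng.standard_normal(N)
b = 0.01 * rng.standard_normal(M_hidden)
W = 0.01 * rng.standard_normal((M_hidden, N))

for it in range(200):
    logpsi = lambda s: log_psi(s, a, b, W)  # current ansatz
    # Fresh samples are drawn after every parameter update
    samples = metropolis_samples(logpsi, N, n_samples, rng=rng)
    e_loc = [local_energy(s, logpsi) for s in samples]
    grad = energy_gradient(samples, e_loc, a, b, W)
    # Unpack the flat gradient into the (a, b, W) blocks and take an SGD step
    a -= lr * grad[:N]
    b -= lr * grad[N:N + M_hidden]
    W -= lr * grad[N + M_hidden:].reshape(M_hidden, N)
    print(it, np.mean(e_loc))  # running estimate of the variational energy
</syntaxhighlight>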
Connection with Tensor Networks
Neural-network representations of quantum wave functions share some similarities with variational quantum states based on tensor networks. For example, connections with matrix product states have been established. These studies have shown that NQS support volume-law scaling for the entropy of entanglement. In general, given an NQS with fully connected weights, it corresponds, in the worst case, to a matrix product state whose bond dimension is exponentially large in <math>N</math>.
See also
* Differentiable programming
References
* {{cite journal |last1=Carleo |first1=Giuseppe |last2=Troyer |first2=Matthias |year=2017 |title=Solving the quantum many-body problem with artificial neural networks |journal=Science |volume=355 |issue=6325 |pages=602–606 |arxiv=1606.02318 |doi=10.1126/science.aag2302 |pmid=28183973 |bibcode=2017Sci...355..602C |s2cid=206651104}}
* {{cite book |last1=Becca |first1=Federico |last2=Sorella |first2=Sandro |date=2017 |title=Quantum Monte Carlo Approaches for Correlated Systems |publisher=Cambridge University Press |isbn=9781316417041 |doi=10.1017/9781316417041}}
* {{cite journal |last1=Chen |first1=Jing |last2=Cheng |first2=Song |last3=Xie |first3=Haidong |last4=Wang |first4=Lei |last5=Xiang |first5=Tao |year=2018 |title=Equivalence of restricted Boltzmann machines and tensor network states |journal=Physical Review B |volume=97 |issue=8 |pages=085104 |arxiv=1701.04831 |doi=10.1103/PhysRevB.97.085104 |bibcode=2018PhRvB..97h5104C |s2cid=73659611}}
[[Category:Quantum mechanics]]
[[Category:Quantum Monte Carlo]]
[[Category:Machine learning]]