Mean-field Particle Methods
Mean-field particle methods are a broad class of ''interacting type'' Monte Carlo algorithms for simulating from a sequence of probability distributions satisfying a nonlinear evolution equation. These flows of probability measures can always be interpreted as the distributions of the random states of a Markov process whose transition probabilities depend on the distributions of the current random states. A natural way to simulate these sophisticated nonlinear Markov processes is to sample a large number of copies of the process, replacing in the evolution equation the unknown distributions of the random states by the sampled empirical measures. In contrast with traditional Monte Carlo and Markov chain Monte Carlo methods, these mean-field particle techniques rely on sequential interacting samples. The terminology mean-field reflects the fact that each of the ''samples (a.k.a. particles, individuals, walkers, agents, creatures, or phenotypes)'' interacts with the empirical measures of the process. When the size of the system tends to infinity, these random empirical measures converge to the deterministic distribution of the random states of the nonlinear Markov chain, so that the statistical interaction between particles vanishes. In other words, starting with a chaotic configuration based on independent copies of the initial state of the nonlinear Markov chain model, the chaos propagates at any time horizon as the size of the system tends to infinity; that is, finite blocks of particles reduce to independent copies of the nonlinear Markov process. This result is called the propagation of chaos property. The terminology "propagation of chaos" originated with the work of Mark Kac in 1976 on a colliding mean-field kinetic gas model.


History

The theory of mean-field interacting particle models had certainly started by the mid-1960s, with the work of Henry P. McKean Jr. on Markov interpretations of a class of nonlinear parabolic partial differential equations arising in fluid mechanics. The mathematical foundations of these classes of models were developed from the mid-1980s to the mid-1990s by several mathematicians, including Werner Braun, Klaus Hepp, Karl Oelschläger, Gérard Ben Arous and Marc Brunaud, Donald Dawson, Jean Vaillancourt and Jürgen Gärtner, Christian Léonard, Sylvie Méléard, Sylvie Roelly, Alain-Sol Sznitman and Hiroshi Tanaka for diffusion type models; and F. Alberto Grünbaum, Tokuzo Shiga, Hiroshi Tanaka, Sylvie Méléard and Carl Graham for general classes of interacting jump-diffusion processes. We also quote an earlier pioneering article by Theodore E. Harris and Herman Kahn, published in 1951, using mean-field but heuristic-like genetic methods for estimating particle transmission energies. Mean-field genetic type particle methods are also used as heuristic natural search algorithms (a.k.a. metaheuristics) in evolutionary computing. The origins of these mean-field computational techniques can be traced to 1950 and 1954 with the work of Alan Turing on genetic type mutation-selection learning machines and the articles by Nils Aall Barricelli at the Institute for Advanced Study in Princeton, New Jersey. The Australian geneticist Alex Fraser also published in 1957 a series of papers on the genetic type simulation of artificial selection of organisms.
Quantum Monte Carlo, and more specifically Diffusion Monte Carlo methods, can also be interpreted as a mean-field particle approximation of Feynman-Kac path integrals. The origins of Quantum Monte Carlo methods are often attributed to Enrico Fermi and Robert Richtmyer, who developed in 1948 a mean-field particle interpretation of neutron chain reactions, but the first heuristic-like and genetic type particle algorithm (a.k.a. Resampled or Reconfiguration Monte Carlo methods) for estimating ground state energies of quantum systems (in reduced matrix models) is due to Jack H. Hetherington in 1984. In molecular chemistry, the use of genetic heuristic-like particle methods (a.k.a. pruning and enrichment strategies) can be traced back to 1955 with the seminal work of Marshall N. Rosenbluth and Arianna W. Rosenbluth. The first pioneering articles on the applications of these heuristic-like particle methods in nonlinear filtering problems were the independent studies of Neil Gordon, David Salmond and Adrian Smith (bootstrap filter), Genshiro Kitagawa (Monte Carlo filter), and the one by Himilcon Carvalho, Pierre Del Moral, André Monin and Gérard Salut published in the 1990s. The term interacting "particle filters" was first coined in 1996 by Del Moral. Particle filters were also developed in signal processing in 1989-1992 by P. Del Moral, J.C. Noyer, G. Rigal, and G. Salut in a series of restricted and classified research reports with STCAN (Service Technique des Constructions et Armes Navales), the IT company DIGILOG, and the LAAS-CNRS (the Laboratory for Analysis and Architecture of Systems) on RADAR/SONAR and GPS signal processing problems. The foundations and the first rigorous analysis on the convergence of genetic type models and mean-field Feynman-Kac particle methods are due to Pierre Del Moral in 1996. Branching type particle methods with varying population sizes were also developed at the end of the 1990s by Dan Crisan, Jessica Gaines and Terry Lyons, and by Dan Crisan, Pierre Del Moral and Terry Lyons. The first uniform convergence results with respect to the time parameter for mean-field particle models were developed at the end of the 1990s by Pierre Del Moral and Alice Guionnet for interacting jump type processes, and by Florent Malrieu for nonlinear diffusion type processes. New classes of mean-field particle simulation techniques for Feynman-Kac path-integration problems include genealogical tree based models, backward particle models, adaptive mean-field particle models, island type particle models, and particle Markov chain Monte Carlo methods.


Applications

In physics, and more particularly in statistical mechanics, these nonlinear evolution equations are often used to describe the statistical behavior of microscopic interacting particles in a fluid or in some condensed matter. In this context, the random evolution of a virtual fluid or gas particle is represented by McKean-Vlasov diffusion processes, reaction–diffusion systems, or Boltzmann type collision processes. As its name indicates, the mean field particle model represents the collective behavior of microscopic particles weakly interacting with their occupation measures. The macroscopic behavior of these many-body particle systems is encapsulated in the limiting model obtained when the size of the population tends to infinity. Boltzmann equations represent the macroscopic evolution of colliding particles in rarefied gases, while McKean-Vlasov diffusions represent the macroscopic behavior of fluid particles and granular gases.

In computational physics, and more specifically in quantum mechanics, the ground state energies of quantum systems are associated with the top of the spectrum of Schrödinger operators. The Schrödinger equation is the quantum mechanics version of Newton's second law of motion of classical mechanics (the mass times the acceleration is the sum of the forces). This equation represents the evolution of the wave function (a.k.a. the quantum state) of some physical system, including molecular, atomic or subatomic systems, as well as macroscopic systems like the universe. The solution of the imaginary time Schrödinger equation (a.k.a. the heat equation) is given by a Feynman-Kac distribution associated with a free evolution Markov process (often represented by Brownian motions) in the set of electronic or macromolecular configurations, and some potential energy function. The long time behavior of these nonlinear semigroups is related to the top eigenvalues and ground state energies of Schrödinger operators. The genetic type mean field interpretations of these Feynman-Kac models are termed Resampled Monte Carlo, or Diffusion Monte Carlo methods. These branching type evolutionary algorithms are based on mutation and selection transitions. During the mutation transition, the walkers evolve randomly and independently in a potential energy landscape on particle configurations. The mean field selection process (a.k.a. quantum teleportation, population reconfiguration, resampled transition) is associated with a fitness function that reflects the particle absorption in an energy well. Configurations with low relative energy are more likely to duplicate. In molecular chemistry and statistical physics, mean field particle methods are also used to sample Boltzmann-Gibbs measures associated with some cooling schedule, and to compute their normalizing constants (a.k.a. free energies, or partition functions).

In computational biology, and more specifically in population genetics, spatial branching processes with competitive selection and migration mechanisms can also be represented by mean field genetic type population dynamics models. The first moments of the occupation measures of a spatial branching process are given by Feynman-Kac distribution flows. The mean field genetic type approximation of these flows offers a fixed population size interpretation of these branching processes. Extinction probabilities can be interpreted as absorption probabilities of some Markov process evolving in some absorbing environment. These absorption models are represented by Feynman-Kac models. The long time behavior of these processes conditioned on non-extinction can be expressed in an equivalent way by quasi-invariant measures, Yaglom limits, or invariant measures of nonlinear normalized Feynman-Kac flows.

In computer science, and more particularly in artificial intelligence, these mean field type genetic algorithms are used as random search heuristics that mimic the process of evolution to generate useful solutions to complex optimization problems. These stochastic search algorithms belong to the class of evolutionary models. The idea is to propagate a population of feasible candidate solutions using mutation and selection mechanisms. The mean field interaction between the individuals is encapsulated in the selection and the cross-over mechanisms.

In mean field games and multi-agent interacting systems theories, mean field particle processes are used to represent the collective behavior of complex systems with interacting individuals. In this context, the mean field interaction is encapsulated in the decision process of interacting agents. The limiting model as the number of agents tends to infinity is sometimes called the continuum model of agents.

In information theory, and more specifically in statistical machine learning and signal processing, mean field particle methods are used to sample sequentially from the conditional distributions of some random process with respect to a sequence of observations or a cascade of rare events. In discrete time nonlinear filtering problems, the conditional distributions of the random states of a signal given partial and noisy observations satisfy a nonlinear updating-prediction evolution equation. The updating step is given by Bayes' rule, and the prediction step is a Chapman-Kolmogorov transport equation. The mean field particle interpretation of these nonlinear filtering equations is a genetic type selection-mutation particle algorithm. During the mutation step, the particles evolve independently of one another according to the Markov transitions of the signal. During the selection stage, particles with small relative likelihood values are killed, while the ones with high relative values are multiplied. These mean field particle techniques are also used to solve multiple-object tracking problems, and more specifically to estimate association measures. The continuous time versions of these particle models are mean field Moran type particle interpretations of the robust optimal filter evolution equations or the Kushner-Stratonovich stochastic partial differential equation. These genetic type mean field particle algorithms, also termed particle filters and sequential Monte Carlo methods, are extensively and routinely used in operations research and statistical inference. The term "particle filters" was first coined in 1996 by Del Moral, and the term "sequential Monte Carlo" by Liu and Chen in 1998. Subset simulation and Monte Carlo splitting techniques are particular instances of genetic particle schemes and Feynman-Kac particle models equipped with Markov chain Monte Carlo mutation transitions.
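As a concrete illustration of the selection-mutation particle filter described above, the following sketch runs a bootstrap filter on a hypothetical scalar linear-Gaussian model (the signal and observation coefficients are assumptions chosen only for illustration) and compares its filtered means with the exact Kalman filter for the same model:

```python
import numpy as np

# Assumed illustrative model (not from the source):
#   signal       x_{n+1} = 0.9 x_n + q*eps_n
#   observation  y_n     = x_n + r*nu_n
# Mutation moves particles with the signal transition; selection resamples
# them in proportion to their observation likelihoods.

def bootstrap_filter(ys, N=20000, q=0.5, r=0.5, seed=4):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(N)                      # particles from the prior
    means = []
    for y in ys:
        x = 0.9 * x + q * rng.standard_normal(N)    # mutation (prediction) step
        w = np.exp(-0.5 * ((y - x) / r) ** 2)       # likelihood weights
        w /= w.sum()
        means.append(np.sum(w * x))                 # filtered mean estimate
        x = x[rng.choice(N, size=N, p=w)]           # selection (resampling) step
    return np.array(means)

def kalman_means(ys, q=0.5, r=0.5):
    """Exact filtered means for the same linear-Gaussian model, for comparison."""
    m, P, out = 0.0, 1.0, []
    for y in ys:
        m, P = 0.9 * m, 0.81 * P + q ** 2           # prediction step
        K = P / (P + r ** 2)                        # Kalman gain
        m, P = m + K * (y - m), (1 - K) * P         # Bayes update step
        out.append(m)
    return np.array(out)

# Generate synthetic observations from the assumed model, then filter them.
rng = np.random.default_rng(5)
x_true, ys = 0.0, []
for _ in range(30):
    x_true = 0.9 * x_true + 0.5 * rng.standard_normal()
    ys.append(x_true + 0.5 * rng.standard_normal())
pf = bootstrap_filter(ys)
kf = kalman_means(ys)
```

On linear-Gaussian models the Kalman filter computes the exact conditional means, so the comparison gives a direct check of the particle approximation; the discrepancy decays at the usual Monte Carlo rate as the number of particles grows.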


Illustrations of the mean field simulation method


Countable state space models

To motivate the mean field simulation algorithm we start with ''S'' a finite or countable state space and let ''P''(''S'') denote the set of all probability measures on ''S''. Consider a sequence of probability distributions (\eta_0, \eta_1, \cdots) on ''S'' satisfying the evolution equation
:\eta_{n+1}=\Phi(\eta_n)
for some, possibly nonlinear, mapping \Phi: P(S) \to P(S). These distributions are given by vectors
:\eta_n=(\eta_n(x))_{x\in S},
that satisfy
:0 \leqslant \eta_n(x) \leqslant 1, \qquad \sum_{x\in S}\eta_n(x)=1.
Therefore, \Phi is a mapping from the (s-1)-unit simplex into itself, where ''s'' stands for the cardinality of the set ''S''. When ''s'' is too large, solving this evolution equation is intractable or computationally very costly. One natural way to approximate these evolution equations is to reduce sequentially the state space using a mean field particle model. One of the simplest mean field simulation schemes is defined by the Markov chain
:\xi^{(N)}_n=\left(\xi^{(N,1)}_n, \cdots, \xi^{(N,N)}_n \right)
on the product space S^N, starting with ''N'' independent random variables with probability distribution \eta_0 and elementary transitions
:\mathbf{P} \left( \left. \xi^{(N,1)}_{n+1}=y^1,\cdots,\xi^{(N,N)}_{n+1}=y^N \,\right|\, \xi^{(N)}_n\right)=\prod_{i=1}^N \Phi\left(\eta_n^N\right)\left(y^i\right),
with the empirical measure
:\eta^N_n=\frac{1}{N}\sum_{i=1}^N 1_{\xi^{(N,i)}_n}
where 1_x is the indicator function of the state ''x''. In other words, given \xi^{(N)}_n the samples \xi^{(N,i)}_{n+1} are independent random variables with probability distribution \Phi\left(\eta_n^N\right). The rationale behind this mean field simulation technique is the following: We expect that when \eta_n^N is a good approximation of \eta_n, then \Phi\left(\eta_n^N\right) is an approximation of \Phi\left(\eta_n\right)=\eta_{n+1}. Thus, since \eta_{n+1}^N is the empirical measure of ''N'' conditionally independent random variables with common probability distribution \Phi\left(\eta_n^N\right), we expect \eta_{n+1}^N to be a good approximation of \eta_{n+1}. Another strategy is to find a collection
:K_{\eta_n}=\left(K_{\eta_n}(x,y)\right)_{x,y\in S}
of stochastic matrices indexed by \eta_n\in P(S) such that
:\eta_{n+1}=\eta_n K_{\eta_n}.
This formula allows us to interpret the sequence (\eta_0, \eta_1, \cdots) as the probability distributions of the random states \left(\overline{X}_0, \overline{X}_1, \cdots \right) of the nonlinear Markov chain model with elementary transitions
:\mathbf{P} \left( \left.\overline{X}_{n+1}=y \,\right|\, \overline{X}_n=x \right)=K_{\eta_n}(x,y), \qquad \text{Law}(\overline{X}_n)=\eta_n.
A collection of Markov transitions K_{\eta_n} satisfying the compatibility equation \eta_{n+1}=\eta_n K_{\eta_n} is called a McKean interpretation of the sequence of measures \eta_n. Its mean field particle interpretation is now defined by the Markov chain
:\xi^{(N)}_n=\left(\xi^{(N,1)}_n, \cdots, \xi^{(N,N)}_n \right)
on the product space S^N, starting with ''N'' independent random copies of X_0 and elementary transitions
:\mathbf{P}\left( \left. \xi^{(N,1)}_{n+1}=y^1,\cdots,\xi^{(N,N)}_{n+1}=y^N \,\right|\, \xi^{(N)}_n\right)=\prod_{i=1}^N K_{\eta^N_n}\left(\xi^{(N,i)}_n,y^i\right),
with the empirical measure
:\eta^N_n=\frac{1}{N}\sum_{i=1}^N 1_{\xi^{(N,i)}_n}.
Under some weak regularity conditions on the mapping \Phi, for any function f: S\to \mathbf{R}, we have the almost sure convergence
:\frac{1}{N}\sum_{i=1}^N f\left(\xi^{(N,i)}_n\right)\to_{N\to\infty} E\left(f(\overline{X}_n)\right)=\sum_{x\in S}\eta_n(x)f(x).
These nonlinear Markov processes and their mean field particle interpretation can be extended to time non homogeneous models on general measurable state spaces.
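The simplest scheme above (sampling all N particles independently from \Phi applied to the current empirical measure) can be sketched in a few lines. A minimal Python illustration, assuming a hypothetical two-point state space and a nonlinear map \Phi built from a fitness reweighting followed by a stochastic matrix (both chosen only for illustration):

```python
import numpy as np

def mean_field_particle(phi, eta0, n_steps, N, s, rng=None):
    """Simulate the simplest mean field particle model on S = {0, ..., s-1}.

    phi  : mapping P(S) -> P(S), takes/returns a length-s probability vector
    eta0 : initial distribution (length-s probability vector)
    Returns the sequence of empirical measures eta^N_0, ..., eta^N_{n_steps}.
    """
    rng = rng or np.random.default_rng(0)
    xi = rng.choice(s, size=N, p=eta0)            # N independent samples from eta0
    history = []
    for _ in range(n_steps + 1):
        emp = np.bincount(xi, minlength=s) / N    # empirical measure eta^N_n
        history.append(emp)
        # all N particles sample independently from Phi(eta^N_n)
        xi = rng.choice(s, size=N, p=phi(emp))
    return history

# Hypothetical nonlinear map on a 2-point state space (illustrative values):
# Phi(eta) reweights by a fitness vector, then applies a stochastic matrix.
G = np.array([0.9, 0.4])
M = np.array([[0.7, 0.3], [0.2, 0.8]])
def phi(eta):
    psi = G * eta / (G * eta).sum()               # Boltzmann-Gibbs reweighting
    return psi @ M

eta0 = np.array([0.5, 0.5])
eta = eta0
for _ in range(10):                               # exact flow eta_{n+1} = Phi(eta_n)
    eta = phi(eta)
emp = mean_field_particle(phi, eta0, 10, 100000, 2)[-1]
print(eta, emp)   # the empirical measure approximates the exact flow
```

For a state space this small the exact recursion is of course available directly; the particle scheme only pays off when ''s'' is too large to iterate \Phi exactly, but the side-by-side run makes the O(1/\sqrt{N}) agreement visible.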


Feynman-Kac models

To illustrate the abstract models presented above, we consider a stochastic matrix M=(M(x,y))_{x,y\in S} and some function G : S \to (0,1). We associate with these two objects the mapping
:\begin{cases} \Phi : P(S) \to P(S) \\ (\eta_n(x))_{x\in S} \mapsto \left(\Phi(\eta_n)(y)\right)_{y\in S} \end{cases} \qquad \Phi(\eta_n)(y)=\sum_{x\in S} \Psi_G(\eta_n)(x)M(x,y)
and the Boltzmann-Gibbs measures \Psi_G(\eta_n)(x) defined by
:\Psi_G(\eta_n)(x)=\frac{G(x)\eta_n(x)}{\sum_{z\in S} G(z)\eta_n(z)}.
We denote by K_{\eta_n}=\left(K_{\eta_n}(x,y)\right)_{x,y\in S} the collection of stochastic matrices indexed by \eta_n\in P(S) given by
:K_{\eta_n}(x,y)=\epsilon G(x) M(x,y)+(1-\epsilon G(x)) \Phi(\eta_n)(y)
for some parameter \epsilon \in [0,1]. It is readily checked that the compatibility equation \eta_{n+1}=\eta_n K_{\eta_n} is satisfied. In addition, we can also show (cf. for instance) that the solution of this evolution equation is given by the Feynman-Kac formula
:\eta_n(x) =\frac{E\left(1_x(X_n)\prod_{k=0}^{n-1}G(X_k)\right)}{E\left(\prod_{k=0}^{n-1}G(X_k)\right)},
with a Markov chain X_n with initial distribution \eta_0 and Markov transition ''M''. For any function f : S\to \mathbf{R} we have
:\eta_n(f):=\sum_{x\in S}\eta_n(x)f(x) =\frac{E\left(f(X_n)\prod_{k=0}^{n-1}G(X_k)\right)}{E\left(\prod_{k=0}^{n-1}G(X_k)\right)}.
If G(x)=1 is the unit function and \epsilon=1, then we have
:K_{\eta_n}(x,y)=M(x,y)=\mathbf{P} \left( \left. X_{n+1}=y \,\right|\, X_n=x\right), \qquad \eta_n(x) =E\left(1_x(X_n)\right)=\mathbf{P}(X_n=x),
and the evolution equation reduces to the Chapman-Kolmogorov equation
:\eta_{n+1}(y)=\sum_{x\in S}\eta_n(x)M(x,y) \qquad \Leftrightarrow \qquad \mathbf{P}\left(X_{n+1}=y\right) =\sum_{x\in S} \mathbf{P}\left(X_{n+1}=y \mid X_n=x\right) \mathbf{P}\left(X_n=x\right).
The mean field particle interpretation of this Feynman-Kac model is defined by sampling sequentially ''N'' conditionally independent random variables \xi^{(N,i)}_{n+1} with probability distribution
:K_{\eta^N_n}\left(\xi^{(N,i)}_n,y\right)=\epsilon G\left(\xi^{(N,i)}_n\right) M\left(\xi^{(N,i)}_n,y\right)+\left(1-\epsilon G\left(\xi^{(N,i)}_n\right)\right) \sum_{j=1}^N \frac{G\left(\xi^{(N,j)}_n\right)}{\sum_{k=1}^N G\left(\xi^{(N,k)}_n\right)} M\left(\xi^{(N,j)}_n,y\right).
In other words, with a probability \epsilon G\left(\xi^{(N,i)}_n\right) the particle \xi^{(N,i)}_n evolves to a new state \xi^{(N,i)}_{n+1}=y randomly chosen with the probability distribution M\left(\xi^{(N,i)}_n,y\right); otherwise, \xi^{(N,i)}_n jumps to a new location \xi^{(N,j)}_n randomly chosen with a probability proportional to G\left(\xi^{(N,j)}_n\right) and evolves to a new state \xi^{(N,i)}_{n+1}=y randomly chosen with the probability distribution M\left(\xi^{(N,j)}_n, y\right). If G(x)=1 is the unit function and \epsilon=1, the interaction between the particles vanishes and the particle model reduces to a sequence of independent copies of the Markov chain X_n. When \epsilon=0 the mean field particle model described above reduces to a simple mutation-selection genetic algorithm with fitness function ''G'' and mutation transition ''M''. These nonlinear Markov chain models and their mean field particle interpretation can be extended to time non homogeneous models on general measurable state spaces (including transition states, path spaces and random excursion spaces) and continuous time models.
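A sketch of this \epsilon-interpolated selection-mutation transition, with an assumed three-state fitness vector and mutation matrix (illustrative values, not from the source). Since every choice of \epsilon yields a McKean interpretation of the same flow, the empirical measure can be checked against the exact Boltzmann-Gibbs recursion on this small state space:

```python
import numpy as np

def feynman_kac_particles(fitness, mut, eta0, eps, n_steps, N, rng=None):
    """Genetic-type mean field particle approximation of the Feynman-Kac flow.

    fitness : vector G with values in (0, 1]
    mut     : row-stochastic mutation matrix M
    eps     : interaction parameter epsilon in [0, 1]
    Returns the empirical measure eta^N_n after n_steps transitions.
    """
    rng = rng or np.random.default_rng(1)
    s = len(fitness)
    cum = np.cumsum(mut, axis=1)                  # row-wise CDFs of M
    xi = rng.choice(s, size=N, p=eta0)
    for _ in range(n_steps):
        # selection: with probability eps*G(xi_i) particle i keeps its state,
        # otherwise it jumps to particle j chosen with probability ~ G(xi_j)
        keep = rng.random(N) < eps * fitness[xi]
        jumps = rng.choice(N, size=N, p=fitness[xi] / fitness[xi].sum())
        parents = np.where(keep, np.arange(N), jumps)
        # mutation: every selected state evolves with the Markov transition M
        u = rng.random(N)
        xi = np.minimum((u[:, None] > cum[xi[parents]]).sum(axis=1), s - 1)
    return np.bincount(xi, minlength=s) / N

def exact_flow(fitness, mut, eta0, n_steps):
    """Exact Boltzmann-Gibbs / mutation recursion eta_{n+1} = Psi_G(eta_n) M."""
    eta = eta0
    for _ in range(n_steps):
        eta = (fitness * eta / (fitness * eta).sum()) @ mut
    return eta

# Illustrative (assumed) three-state model:
fitness = np.array([0.3, 0.9, 0.6])
mut = np.array([[0.5, 0.3, 0.2], [0.1, 0.8, 0.1], [0.3, 0.3, 0.4]])
eta0 = np.ones(3) / 3
emp = feynman_kac_particles(fitness, mut, eta0, eps=0.5, n_steps=8, N=20000)
ref = exact_flow(fitness, mut, eta0, 8)
```

The parameter eps only changes the amount of interaction per step, not the limiting flow, so running the sketch with eps=0 (pure mutation-selection genetic algorithm) or eps=1 targets the same distributions.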


Gaussian nonlinear state space models

We consider a sequence of real valued random variables \left(\overline{X}_0, \overline{X}_1, \cdots \right) defined sequentially by the equation
:\overline{X}_{n+1}=E\left(a\left(\overline{X}_n\right)\right) b\left(\overline{X}_n\right)+c\left(\overline{X}_n\right)+\sigma W_n,
with a collection W_n of independent standard Gaussian random variables, a positive parameter ''σ'', some functions a,b,c: \mathbf{R} \to \mathbf{R}, and some standard Gaussian initial random state \overline{X}_0. We let \eta_n be the probability distribution of the random state \overline{X}_n; that is, for any bounded measurable function ''f'', we have
:E\left(f(\overline{X}_n)\right)=\int_{\mathbf{R}} f(x)\, \eta_n(dx), \qquad \text{with} \qquad \mathbf{P} \left(\overline{X}_n\in dx \right)=\eta_n(dx).
The integral is the Lebesgue integral, and ''dx'' stands for an infinitesimal neighborhood of the state ''x''. The Markov transition of the chain is given for any bounded measurable functions ''f'' by the formula
:E\left( \left. f\left(\overline{X}_{n+1} \right) \,\right|\, \overline{X}_n=x\right)=\int_{\mathbf{R}} K_{\eta_n}(x,dy)\, f(y),
with
:K_{\eta_n}(x,dy)=\mathbf{P} \left( \left.\overline{X}_{n+1}\in dy\,\right|\, \overline{X}_n=x \right)=\frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{1}{2\sigma^2}\left(y-\left[\eta_n(a)\,b(x)+c(x)\right]\right)^2\right) dy, \qquad \eta_n(a)=\int_{\mathbf{R}} a(z)\,\eta_n(dz).
Using the tower property of conditional expectations we prove that the probability distributions \eta_n satisfy the nonlinear equation
:\int_{\mathbf{R}} \eta_{n+1}(dy)\, f(y)=\int_{\mathbf{R}}\left[\int_{\mathbf{R}} \eta_n(dx)K_{\eta_n}(x,dy)\right] f(y)
for any bounded measurable functions ''f''. This equation is sometimes written in the more synthetic form
:\eta_{n+1} =\Phi\left(\eta_n\right)= \eta_n K_{\eta_n}\quad\Leftrightarrow\quad\eta_{n+1}(dy)= \left(\eta_n K_{\eta_n}\right)(dy) =\int_{\mathbf{R}}\eta_n(dx)K_{\eta_n}(x,dy).
The mean field particle interpretation of this model is defined by the Markov chain
:\xi^{(N)}_n=\left(\xi^{(N,1)}_n, \cdots, \xi^{(N,N)}_n \right)
on the product space \mathbf{R}^N by
:\xi^{(N,i)}_{n+1}=\left(\frac{1}{N}\sum_{j=1}^N a\left(\xi^{(N,j)}_n\right)\right) b\left(\xi^{(N,i)}_n\right)+c\left(\xi^{(N,i)}_n\right)+\sigma W^i_n,\qquad 1\leqslant i\leqslant N,
where
:\xi^{(N)}_0= \left(\xi^{(N,1)}_0, \cdots, \xi^{(N,N)}_0\right), \qquad \left( W^1_n, \cdots, W^N_n\right)
stand for ''N'' independent copies of \overline{X}_0 and W_n; n \geqslant 1, respectively. For regular models (for instance for bounded Lipschitz functions ''a'', ''b'', ''c'') we have the almost sure convergence
:\frac{1}{N}\sum_{i=1}^N f\left(\xi^{(N,i)}_n\right)=\int_{\mathbf{R}} f(y)\, \eta^N_n(dy) \to_{N\to\infty} E\left(f(\overline{X}_n)\right) = \int_{\mathbf{R}}f(y)\,\eta_n(dy),
with the empirical measure
:\eta^N_n=\frac{1}{N}\sum_{i=1}^N \delta_{\xi^{(N,i)}_n}
for any bounded measurable functions ''f'' (cf. for instance). In the above display, \delta_x stands for the Dirac measure at the state ''x''.
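The particle recursion above is straightforward to simulate: the only coupling between particles is the empirical mean of a. A sketch, with assumed coefficients a = tanh, b = 1, c(x) = x/2 chosen purely for illustration:

```python
import numpy as np

def simulate_mean_field(a, b, c, N, n_steps, sigma, seed=2):
    """Interacting particle approximation of
    Xbar_{n+1} = E(a(Xbar_n)) b(Xbar_n) + c(Xbar_n) + sigma*W_n,
    replacing E(a(Xbar_n)) by the empirical mean over N particles."""
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal(N)     # N copies of the standard Gaussian Xbar_0
    for _ in range(n_steps):
        mean_a = a(xi).mean()       # empirical surrogate for E(a(Xbar_n))
        xi = mean_a * b(xi) + c(xi) + sigma * rng.standard_normal(N)
    return xi

# Illustrative (assumed) coefficients: a = tanh, b = 1, c(x) = x/2.
particles = simulate_mean_field(np.tanh, lambda x: np.ones_like(x),
                                lambda x: 0.5 * x, N=10000, n_steps=50, sigma=0.5)
```

A useful sanity check: taking a = 0 removes the interaction entirely, so each particle becomes an independent AR(1) chain x_{n+1} = x_n/2 + sigma*W_n, whose stationary variance sigma^2/(1 - 1/4) the empirical distribution should reproduce.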


Continuous time mean field models

We consider a standard Brownian motion \overline{W}_{t_n} (a.k.a. Wiener process) evaluated on a time mesh sequence t_0=0 \leqslant t_1 \leqslant \cdots \leqslant t_n \leqslant \cdots with a given time step t_n-t_{n-1}=h. We choose c(x)=x in the recursion \overline{X}_{n+1}=E\left(a\left(\overline{X}_n\right)\right) b\left(\overline{X}_n\right)+c\left(\overline{X}_n\right)+\sigma W_n of the preceding section, we replace b(x) and ''σ'' by b(x) \times h and \sigma \times \sqrt{h}, and we write \overline{X}_{t_n} instead of \overline{X}_n for the values of the random states evaluated at the time step t_n. Recalling that \left(\overline{W}_{t_n}-\overline{W}_{t_{n-1}}\right) are independent centered Gaussian random variables with variance t_n-t_{n-1}=h, the resulting equation can be rewritten in the following form
:\overline{X}_{t_n}=\overline{X}_{t_{n-1}}+E\left(a\left(\overline{X}_{t_{n-1}}\right)\right) b\left(\overline{X}_{t_{n-1}}\right) h+\sigma \left(\overline{W}_{t_n}-\overline{W}_{t_{n-1}}\right).
When ''h'' → 0, the above equation converges to the nonlinear diffusion process
:d\overline{X}_t=E\left(a\left(\overline{X}_t\right)\right)b\left(\overline{X}_t\right)\,dt+\sigma\, d\overline{W}_t.
The mean field continuous time model associated with these nonlinear diffusions is the (interacting) diffusion process \xi^{(N)}_t=\left(\xi^{(N,i)}_t\right)_{1\leqslant i\leqslant N} on the product space \mathbf{R}^N defined by
:d\xi^{(N,i)}_t=\left(\frac{1}{N}\sum_{j=1}^N a\left(\xi^{(N,j)}_t\right)\right)b\left(\xi^{(N,i)}_t\right)\,dt+\sigma\, d\overline{W}^i_t,\qquad 1\leqslant i\leqslant N,
where
:\xi^{(N)}_0= \left(\xi^{(N,1)}_0, \cdots, \xi^{(N,N)}_0\right), \qquad \left( \overline{W}^1_t, \cdots, \overline{W}^N_t\right)
are ''N'' independent copies of \overline{X}_0 and \overline{W}_t. For regular models (for instance for bounded Lipschitz functions ''a'', ''b'') we have the almost sure convergence
:\frac{1}{N}\sum_{i=1}^N f\left(\xi^{(N,i)}_t\right)=\int_{\mathbf{R}} f(y)\, \eta^N_t(dy)\to_{N\to\infty} E\left(f(\overline{X}_t)\right)=\int_{\mathbf{R}} f(y)\, \eta_t(dy),
with \eta_t=\text{Law}\left(\overline{X}_t\right), and the empirical measure
:\eta^N_t=\frac{1}{N}\sum_{i=1}^N \delta_{\xi^{(N,i)}_t}
for any bounded measurable functions ''f'' (cf. for instance). These nonlinear Markov processes and their mean field particle interpretation can be extended to interacting jump-diffusion processes.
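The time-mesh discretization above is exactly an Euler-Maruyama scheme for the interacting diffusion. A minimal sketch, under assumed coefficients a = sin and b = 1 (illustrative only):

```python
import numpy as np

def mckean_vlasov_em(a, b, N, T, h, sigma, seed=3):
    """Euler-Maruyama scheme for the interacting diffusion
    d xi^i_t = (1/N sum_j a(xi^j_t)) b(xi^i_t) dt + sigma dW^i_t,
    started from N independent standard Gaussian states."""
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal(N)
    for _ in range(int(round(T / h))):
        drift = a(xi).mean() * b(xi)      # mean field drift term
        # Brownian increments over a step of size h have variance h
        xi = xi + drift * h + sigma * np.sqrt(h) * rng.standard_normal(N)
    return xi

# Illustrative (assumed) coefficients a = sin, b = 1 over the horizon T = 1:
particles = mckean_vlasov_em(np.sin, lambda x: np.ones_like(x),
                             N=10000, T=1.0, h=0.01, sigma=1.0)
```

As in the discrete time case, setting a = 0 decouples the particles, here into N independent Brownian motions, whose variance at time T should be the initial variance plus sigma^2 T; this gives a quick check of the discretization.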


References


External links


Feynman-Kac models and interacting particle systems
theoretical aspects and a list of application domains of Feynman-Kac particle methods




QMC in Cambridge and around the world
general information about Quantum Monte Carlo
EVOLVER
Software package for stochastic optimisation using genetic algorithms
CASINO
Quantum Monte Carlo program developed by the Theory of Condensed Matter group at the Cavendish Laboratory in Cambridge
Biips is a probabilistic programming software for Bayesian inference with interacting particle systems.