Markov chains on a measurable state space

A Markov chain on a measurable state space is a discrete-time, time-homogeneous Markov chain whose state space is a general measurable space.


History

The definition of Markov chains has evolved during the 20th century. In 1953 the term ''Markov chain'' was used for stochastic processes with a discrete or continuous index set, living on a countable or finite state space; see Doob or Chung. Since the late 20th century it has become more popular to consider a Markov chain as a stochastic process with a discrete index set, living on a measurable state space (see Revuz; Durrett).


Definition

Denote by (E, \Sigma) a measurable space and by p a Markov kernel with source and target (E, \Sigma). A stochastic process (X_n)_{n \in \mathbb{N}_0} on (\Omega, \mathcal{F}, \mathbb{P}) is called a time-homogeneous Markov chain with Markov kernel p and start distribution \mu if
: \mathbb{P}[X_0 \in A_0, X_1 \in A_1, \dots, X_n \in A_n] = \int_{A_0} \dots \int_{A_{n-1}} p(y_{n-1}, A_n) \, p(y_{n-2}, dy_{n-1}) \dots p(y_0, dy_1) \, \mu(dy_0)
is satisfied for every n \in \mathbb{N} and all A_0, \dots, A_n \in \Sigma. For any Markov kernel and any probability measure one can construct an associated Markov chain.
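The construction can be sketched for a concrete, purely illustrative case: represent the Markov kernel by a sampler y ~ p(x, ·) and the start distribution \mu by a sampler, here an AR(1)-type Gaussian kernel on the real line with its Borel σ-algebra. The coefficient 0.5, the kernel, and all names are assumptions for the sketch, not part of the article.

```python
import random

# Minimal sketch (hypothetical example): a time-homogeneous Markov chain
# on (R, Borel sets), built from a Markov kernel given as a sampler
# y ~ p(x, .) and a start distribution mu given as a sampler.

def sample_chain(kernel_sampler, mu_sampler, n, rng):
    """Draw a path (X_0, ..., X_n) of the chain with kernel p and start law mu."""
    x = mu_sampler(rng)
    path = [x]
    for _ in range(n):
        x = kernel_sampler(x, rng)   # X_{k+1} ~ p(X_k, .)
        path.append(x)
    return path

# Assumed AR(1)-type kernel: p(x, .) = Normal(0.5 * x, 1)
ar1_kernel = lambda x, rng: 0.5 * x + rng.gauss(0.0, 1.0)
# Assumed start distribution: mu = standard normal
standard_normal_mu = lambda rng: rng.gauss(0.0, 1.0)

rng = random.Random(0)
path = sample_chain(ar1_kernel, standard_normal_mu, n=10, rng=rng)
print(len(path))  # 11 values: X_0 through X_10
```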


Remark about Markov kernel integration

For any measure \mu \colon \Sigma \to [0, \infty] and any \mu-integrable function f \colon E \to \mathbb{R} \cup \{-\infty, +\infty\}, we write the Lebesgue integral as \int_E f(x) \, \mu(dx). For the measure \nu_x \colon \Sigma \to [0, \infty] defined by \nu_x(A) := p(x, A) we use the notation
: \int_E f(y) \, p(x, dy) := \int_E f(y) \, \nu_x(dy).
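On a finite state space (a special case, assumed here purely for illustration) the measure p(x, ·) is discrete, so the kernel integral reduces to a finite sum. The 3-state matrix below is hypothetical.

```python
# Sketch on an assumed finite state space E = {0, 1, 2}: the kernel integral
# ∫_E f(y) p(x, dy) is the sum over y of f(y) * p(x, {y}).

E = [0, 1, 2]

# Hypothetical Markov kernel as a row-stochastic matrix: p[x][y] = p(x, {y})
p = [
    [0.5, 0.5, 0.0],
    [0.1, 0.8, 0.1],
    [0.0, 0.5, 0.5],
]

def kernel_integral(f, x):
    """Compute ∫_E f(y) p(x, dy) = sum over y of f(y) * p(x, {y})."""
    return sum(f(y) * p[x][y] for y in E)

f = lambda y: y * y                     # any function f: E -> R
print(round(kernel_integral(f, 1), 10))  # 0.1*0 + 0.8*1 + 0.1*4 = 1.2
```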


Basic properties


Starting in a single point

If \mu is a Dirac measure in x, we denote the Markov chain associated to a Markov kernel p with start distribution \mu by (X_n)_{n \in \mathbb{N}_0} on (\Omega, \mathcal{F}, \mathbb{P}_x), and write the expectation value as
: \mathbb{E}_x[X] = \int_\Omega X(\omega) \, \mathbb{P}_x(d\omega)
for a \mathbb{P}_x-integrable function X. By definition, we then have \mathbb{P}_x[X_0 = x] = 1. For any measurable function f \colon E \to [0, \infty] the following relation holds:
: \int_E f(y) \, p(x, dy) = \mathbb{E}_x[f(X_1)].
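The relation between the kernel integral and \mathbb{E}_x[f(X_1)] can be checked numerically on an assumed finite chain: start in the single point x, sample X_1 ~ p(x, ·) repeatedly, and compare the Monte Carlo average of f(X_1) with the exact sum. The 3-state kernel is hypothetical.

```python
import random

# Sketch (assumed finite chain): with a Dirac start distribution in x,
# E_x[f(X_1)] equals the kernel integral ∫_E f(y) p(x, dy).

p = [
    [0.5, 0.5, 0.0],
    [0.1, 0.8, 0.1],
    [0.0, 0.5, 0.5],
]
f = lambda y: y * y

x = 1
exact = sum(f(y) * p[x][y] for y in range(3))  # ∫_E f(y) p(x, dy) = 1.2

rng = random.Random(0)
n = 100_000
# Draw X_1 ~ p(x, .) repeatedly and average f(X_1): Monte Carlo estimate of E_x[f(X_1)]
mc = sum(f(rng.choices(range(3), weights=p[x])[0]) for _ in range(n)) / n

print(abs(mc - exact) < 0.05)  # True: the two agree up to sampling error
```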


Family of Markov kernels

For a Markov kernel p with start distribution \mu one can introduce a family of Markov kernels (p_n)_{n \in \mathbb{N}} by
: p_{n+1}(x, A) := \int_E p_n(y, A) \, p(x, dy)
for n \in \mathbb{N}, n \geq 1, and p_1 := p. For the associated Markov chain (X_n)_{n \in \mathbb{N}_0} according to p and \mu one obtains
: \mathbb{P}[X_0 \in A, \, X_n \in B] = \int_A p_n(x, B) \, \mu(dx).
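On an assumed finite state space the recursion has a familiar matrix form: p_{n+1}(x, A) = \int_E p_n(y, A) \, p(x, dy) becomes P_{n+1} = P P_n, so p_n is the n-step transition matrix P^n. The two-state kernel below is hypothetical.

```python
# Sketch (assumed finite kernel): the recursion for (p_n) as matrix products.

def mat_mul(a, b):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

P = [[0.5, 0.5], [0.2, 0.8]]  # hypothetical kernel on E = {0, 1}

p_n = P                       # p_1 = p
for _ in range(2):            # apply the recursion twice, giving p_3
    p_n = mat_mul(P, p_n)     # p_{n+1}(x, .) = ∫ p_n(y, .) p(x, dy)

# Each p_n(x, .) is again a probability measure: rows still sum to 1.
print([round(sum(row), 10) for row in p_n])  # [1.0, 1.0]
```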


Stationary measure

A probability measure \mu is called a stationary measure of a Markov kernel p if
: \int_A \mu(dx) = \int_E p(x, A) \, \mu(dx)
holds for every A \in \Sigma. If (X_n)_{n \in \mathbb{N}_0} on (\Omega, \mathcal{F}, \mathbb{P}) denotes the Markov chain according to a Markov kernel p with stationary measure \mu, and the distribution of X_0 is \mu, then all X_n have the same probability distribution, namely
: \mathbb{P}[X_n \in A] = \mu(A)
for every A \in \Sigma.
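For an assumed two-state kernel the stationarity condition reads \mu = \mu P as a row-vector equation: the distribution of X_1 under start law \mu is again \mu. The matrix and the candidate measure below are hypothetical.

```python
# Sketch (assumed two-state kernel): check the stationarity condition
# mu(A) = ∫_E p(x, A) mu(dx), i.e. mu = mu P for a transition matrix.

P = [[0.5, 0.5], [0.2, 0.8]]  # hypothetical kernel on E = {0, 1}
mu = [2 / 7, 5 / 7]           # candidate stationary measure

# mu P: the distribution of X_1 when X_0 has distribution mu
mu_P = [sum(mu[x] * P[x][y] for x in range(2)) for y in range(2)]

print(all(abs(a - b) < 1e-12 for a, b in zip(mu, mu_P)))  # True: mu is stationary
```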


Reversibility

A Markov kernel p is called reversible with respect to a probability measure \mu if
: \int_A p(x, B) \, \mu(dx) = \int_B p(x, A) \, \mu(dx)
holds for all A, B \in \Sigma. Setting A = E shows that if p is reversible with respect to \mu, then \mu must be a stationary measure of p.


See also

* Harris chain
* Subshift of finite type


References

* Joseph L. Doob: ''Stochastic Processes''. New York: John Wiley & Sons, 1953.
* Kai Lai Chung: ''Markov Chains with Stationary Transition Probabilities''. Berlin: Springer.
* Daniel Revuz: ''Markov Chains''. 2nd edition, 1984.
* Rick Durrett: ''Probability: Theory and Examples''. Fourth edition, 2005.