A Markov chain on a measurable state space is a discrete-time, time homogeneous Markov chain with a measurable space as state space.
History
The definition of Markov chains has evolved during the 20th century. In 1953 the term Markov chain was used for stochastic processes with a discrete or continuous index set, living on a countable or finite state space; see Doob or Chung. Since the late 20th century it has become more popular to consider a Markov chain as a stochastic process with a discrete index set, living on a measurable state space.[Daniel Revuz: ''Markov Chains''. 2nd edition, 1984.][Rick Durrett: ''Probability: Theory and Examples''. 4th edition, 2005.]
Definition
Denote with $(E, \Sigma)$ a measurable space and with $p$ a Markov kernel with source and target $(E, \Sigma)$. A stochastic process $(X_n)_{n \in \mathbb{N}}$ on $(\Omega, \mathcal{F}, \mathbb{P})$ is called a time homogeneous Markov chain with Markov kernel $p$ and start distribution $\mu$ if

: $\mathbb{P}[X_0 \in A_0, X_1 \in A_1, \dots, X_n \in A_n] = \int_{A_0} \cdots \int_{A_{n-1}} p(y_{n-1}, A_n) \, p(y_{n-2}, dy_{n-1}) \cdots p(y_0, dy_1) \, \mu(dy_0)$

is satisfied for any $n \in \mathbb{N}$ and $A_0, \dots, A_n \in \Sigma$. For any Markov kernel $p$ and any probability measure $\mu$ one can construct an associated Markov chain.
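As an illustration of this construction (a minimal sketch, not part of the original definition), the following Python code simulates such a chain path by path for an assumed Gaussian random-walk kernel on $E = \mathbb{R}$, namely $p(x, \cdot) = \mathcal{N}(x, 1)$, with assumed start distribution $\mu = \mathcal{N}(0, 1)$. The helper names `sample_start`, `sample_kernel` and `simulate_chain` are hypothetical; the kernel is represented by a sampler $x \mapsto X \sim p(x, \cdot)$, which suffices to realize the process.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def sample_start():
    """Draw X_0 from the start distribution mu = N(0, 1) (an assumed example)."""
    return rng.normal(loc=0.0, scale=1.0)

def sample_kernel(x):
    """Draw the next state from the Markov kernel p(x, .) = N(x, 1) (assumed)."""
    return rng.normal(loc=x, scale=1.0)

def simulate_chain(n_steps):
    """Construct one path X_0, ..., X_n of the time homogeneous Markov chain:
    X_0 ~ mu, then X_{k+1} ~ p(X_k, .) for each step."""
    path = [sample_start()]
    for _ in range(n_steps):
        path.append(sample_kernel(path[-1]))
    return np.array(path)

print(simulate_chain(5))  # one sample path X_0, ..., X_5
```

Because each new state depends only on the current one, the sampler needs no memory of earlier states, which is exactly the Markov property expressed by the defining formula above.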
Remark about Markov kernel integration
For any measure $\mu \colon \Sigma \to [0, \infty]$ we denote the Lebesgue integral of a $\mu$-integrable function $f \colon E \to \mathbb{R} \cup \{-\infty, +\infty\}$ as $\int_E f(x) \, \mu(dx)$. For the measure $\mu_x \colon \Sigma \to [0, \infty]$ defined by $\mu_x(A) := p(x, A)$ we use the shorter notation

: $\int_E f(y) \, p(x, dy) := \int_E f(y) \, \mu_x(dy).$
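To make this notation concrete (a hedged sketch, reusing the assumed Gaussian kernel $p(x, \cdot) = \mathcal{N}(x, 1)$ from the example above), note that $\int_E f(y) \, p(x, dy)$ is simply the expectation of $f$ under the measure $\mu_x = p(x, \cdot)$, so it can be approximated by Monte Carlo sampling. The helper name `kernel_integral` is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def kernel_integral(f, x, n_samples=100_000):
    """Monte Carlo estimate of int_E f(y) p(x, dy) for the assumed kernel
    p(x, .) = N(x, 1), i.e. the mean of f under the measure mu_x = p(x, .)."""
    samples = rng.normal(loc=x, scale=1.0, size=n_samples)
    return np.mean(f(samples))

# Example: for f(y) = y the integral is the mean of N(x, 1), which is x itself.
print(kernel_integral(lambda y: y, x=2.0))  # approximately 2.0
```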