In probability, a discrete-time Markov chain (DTMC) is a sequence of random variables, known as a stochastic process, in which the value of the next variable depends only on the value of the current variable, and not on any variables in the past. For instance, a machine may have two states, ''A'' and ''E''. When it is in state ''A'', there is a 40% chance of it moving to state ''E'' and a 60% chance of it remaining in state ''A''. When it is in state ''E'', there is a 70% chance of it moving to ''A'' and a 30% chance of it staying in ''E''. The sequence of states of the machine is a Markov chain. If we denote the chain by
''X''<sub>0</sub>, ''X''<sub>1</sub>, ''X''<sub>2</sub>, ..., then ''X''<sub>0</sub> is the state in which the machine starts and ''X''<sub>10</sub> is the random variable describing its state after 10 transitions. The process continues forever, indexed by the natural numbers.
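The two-state machine above is straightforward to simulate. The following Python sketch (the function names are illustrative, not from any particular library) samples one trajectory ''X''<sub>0</sub>, ..., ''X''<sub>10</sub>, choosing each next state from the current state alone:

```python
import random

# Transition probabilities of the two-state machine described above:
# from A there is a 60% chance of staying in A and 40% of moving to E;
# from E there is a 70% chance of moving to A and 30% of staying in E.
TRANSITIONS = {
    "A": {"A": 0.6, "E": 0.4},
    "E": {"A": 0.7, "E": 0.3},
}

def step(state, rng):
    """Sample the next state; only the current state matters (Markov property)."""
    states, probs = zip(*TRANSITIONS[state].items())
    return rng.choices(states, weights=probs)[0]

def run_chain(start, n_steps, seed=0):
    """Return the trajectory X_0, X_1, ..., X_{n_steps}."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1], rng))
    return chain

print(run_chain("A", 10))  # one realization of X_0 through X_10
```

Because each call to `step` looks only at `chain[-1]`, the simulation itself enforces the defining property of the chain.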
An example of a stochastic process which is not a Markov chain is the model of a machine which has states ''A'' and ''E'' and moves to ''A'' from either state with 50% chance if it has ever visited ''A'' before, and 20% chance if it has never visited ''A'' before (leaving a 50% or 80% chance that the machine moves to ''E''). This is because the behavior of the machine depends on the whole history—if the machine is in ''E'', it may have a 50% or 20% chance of moving to ''A'', depending on its past values. Hence, it does not have the
Markov property.
A Markov chain can be described by a
stochastic matrix, which lists the probabilities of moving to each state from any individual state. From this matrix, the probability of being in a particular state ''n'' steps in the future can be calculated. A Markov chain's state space can be partitioned into communicating classes that describe which states are reachable from each other (in one transition or in many). Each state can be described as transient or recurrent, depending on the probability of the chain ever returning to that state. Markov chains can have properties including periodicity, reversibility and stationarity. A
continuous-time Markov chain is like a discrete-time Markov chain, but it moves through states continuously rather than in discrete time steps. Other stochastic processes can satisfy the Markov property, the property that the future behavior of the process depends only on the present state, not on its past.
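The stochastic-matrix view can be sketched for the two-state example above with NumPy (assuming the row order ''A'', ''E''): the ''n''-step probabilities come from matrix powers, and a stationary distribution from the left eigenvector of the matrix with eigenvalue 1.

```python
import numpy as np

# Row-stochastic transition matrix of the A/E machine (each row sums to 1).
# Row order [A, E]; entry P[i, j] = probability of moving from state i to j.
P = np.array([[0.6, 0.4],
              [0.7, 0.3]])

# Distribution after n steps: row vector of initial probabilities times P^n.
start = np.array([1.0, 0.0])                    # start in A with certainty
after_3 = start @ np.linalg.matrix_power(P, 3)  # distribution of X_3

# A stationary distribution pi satisfies pi P = pi, i.e. pi is a left
# eigenvector of P (eigenvector of P.T) with eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()                              # normalize to a distribution

print(after_3, pi)
```

For this chain the stationary distribution works out to (7/11, 4/11), and the ''n''-step distributions converge to it quickly regardless of the starting state.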
Definition
A discrete-time Markov chain is a sequence of random variables ''X''<sub>0</sub>, ''X''<sub>1</sub>, ''X''<sub>2</sub>, ... with the Markov property, namely that the probability of moving to the next state depends only on the present state and not on the previous states:
:<math>\Pr(X_{n+1}=x \mid X_0=x_0, X_1=x_1, \ldots, X_n=x_n) = \Pr(X_{n+1}=x \mid X_n=x_n),</math>
if both conditional probabilities are well defined, that is, if
:<math>\Pr(X_0=x_0, X_1=x_1, \ldots, X_n=x_n) > 0.</math>
The possible values of ''X''<sub>''i''</sub> form a countable set ''S'' called the state space of the chain.
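The defining identity can be checked empirically on the two-state example: conditioned on the present state, the next-step frequencies should not change when we additionally condition on the state two steps back. A small Monte Carlo sketch (sample size and tolerance are arbitrary choices):

```python
import random
from collections import Counter

# Probability of landing in A from each state of the A/E machine above.
TRANS = {"A": 0.6, "E": 0.7}

def simulate(n, seed=0):
    """Simulate n transitions of the A/E chain, starting in A."""
    rng = random.Random(seed)
    states = ["A"]
    for _ in range(n):
        states.append("A" if rng.random() < TRANS[states[-1]] else "E")
    return states

chain = simulate(200_000, seed=42)

# Among steps where the present state is A, count moves to A,
# split by the state two steps back.
counts, totals = Counter(), Counter()
for prev2, prev1, cur in zip(chain, chain[1:], chain[2:]):
    if prev1 == "A":
        totals[prev2] += 1
        if cur == "A":
            counts[prev2] += 1

for prev2 in ("A", "E"):
    print(prev2, counts[prev2] / totals[prev2])  # both near 0.6
```

Both conditional frequencies estimate the same quantity, Pr(''X''<sub>''n''+1</sub> = ''A'' | ''X''<sub>''n''</sub> = ''A'') = 0.6, because the extra conditioning on ''X''<sub>''n''−1</sub> is irrelevant under the Markov property.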
Markov chains are often described by a sequence of directed graphs, where the edges of graph ''n'' are labeled by the probabilities of going from one state at time ''n'' to the other states at time ''n'' + 1, <math>\Pr(X_{n+1}=x \mid X_n=x_n)</math>. The same information is represented by the transition matrix from time ''n'' to time ''n'' + 1. However, Markov chains are frequently assumed to be time-homogeneous (see variations below), in which case the graph and matrix are independent of ''n'' and are thus not presented as sequences.
These descriptions highlight the structure of the Markov chain that is independent of the initial distribution <math>\Pr(X_0=x_0)</math>.
When time-homogeneous, the chain can be interpreted as a state machine assigning a probability of hopping from each vertex or state to an adjacent one. The probability <math>\Pr(X_n=x)</math> of the machine's state can be analyzed as the statistical behavior of the machine with an element ''x'' of the state space as input, or as the behavior of the machine with the initial distribution <math>\Pr(X_0=y)=[x=y]</math> of states as input, where <math>[P]</math> is the Iverson bracket.