In probability theory, the mixing time of a Markov chain is the time until the Markov chain is "close" to its steady state distribution.
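As a concrete illustration (our own toy example, not part of the article), a two-state chain converges to its steady state regardless of the starting distribution; the transition matrix below is an assumption chosen for simplicity:

```python
# Minimal sketch (illustrative example): a two-state Markov chain converging
# to its steady state.  P[i][j] is the probability of moving from state i to j.
P = [[0.9, 0.1],
     [0.5, 0.5]]

dist = [1.0, 0.0]                 # start deterministically in state 0
for _ in range(100):
    # One step of the chain: new distribution is dist * P.
    dist = [sum(dist[i] * P[i][j] for i in range(2)) for j in range(2)]

# The stationary distribution solves pi = pi P; for this P it is (5/6, 1/6),
# and after 100 steps dist has converged to it to high precision.
print(dist)
```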
More precisely, a fundamental result about Markov chains is that a finite state irreducible aperiodic chain has a unique stationary distribution <math>\pi</math> and, regardless of the initial state, the time-''t'' distribution of the chain converges to <math>\pi</math> as ''t'' tends to infinity. Mixing time refers to any of several variant formalizations of the idea: how large must ''t'' be until the time-''t'' distribution is approximately <math>\pi</math>? One variant, ''total variation distance mixing time'', is defined as the smallest ''t'' such that the total variation distance of probability measures is small:
:<math>t_{\mathrm{mix}}(\varepsilon) = \min \left\{ t \ge 0 : \max_{x} \left\| P^t(x, \cdot) - \pi \right\|_{\mathrm{TV}} \le \varepsilon \right\}.</math>
Choosing a different <math>\varepsilon</math>, as long as <math>\varepsilon < 1/2</math>, can only change the mixing time up to a constant factor (depending on <math>\varepsilon</math>) and so one often fixes <math>\varepsilon = 1/4</math> and simply writes <math>t_{\mathrm{mix}}</math>.
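For a small chain, this definition can be evaluated directly by brute force. The sketch below is our own illustration (the function names and the example chain, a lazy random walk on a 6-cycle, are assumptions): it iterates the transition matrix and returns the smallest ''t'' at which the worst-case total variation distance from the stationary distribution drops to 1/4:

```python
# Illustrative sketch (not from the article): computing the total variation
# mixing time t_mix(1/4) of a small chain by brute force.
import numpy as np

def tv_distance(mu, nu):
    """Total variation distance between two probability vectors."""
    return 0.5 * np.abs(mu - nu).sum()

def mixing_time(P, eps=0.25, t_max=10_000):
    """Smallest t with max_x ||P^t(x, .) - pi||_TV <= eps."""
    n = P.shape[0]
    # Stationary distribution: left eigenvector of P for eigenvalue 1.
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
    pi /= pi.sum()
    Pt = np.eye(n)
    for t in range(1, t_max + 1):
        Pt = Pt @ P          # Pt now holds the t-step transition matrix P^t
        if max(tv_distance(Pt[x], pi) for x in range(n)) <= eps:
            return t
    raise RuntimeError("chain did not mix within t_max steps")

# Example chain: lazy random walk on a 6-cycle
# (stay with probability 1/2, step left or right with probability 1/4 each).
n = 6
P = np.zeros((n, n))
for x in range(n):
    P[x, x] = 0.5
    P[x, (x - 1) % n] = 0.25
    P[x, (x + 1) % n] = 0.25

print(mixing_time(P))
```

The laziness (the 1/2 holding probability) makes the chain aperiodic, which is what guarantees convergence to <math>\pi</math> in the first place.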
This is the sense in which Bayer and Diaconis proved that the number of riffle shuffles needed to mix an ordinary 52 card deck is 7. Mathematical theory focuses on how mixing times change as a function of the size of the structure underlying the chain. For an ''n''-card deck, the number of riffle shuffles needed grows as <math>\tfrac{3}{2} \log_2 n</math>. The most developed theory concerns
randomized algorithms for
#P-complete algorithmic counting problems such as the number of graph colorings of a given ''n''-vertex graph. Such problems can, for a sufficiently large number of colors, be answered using the Markov chain Monte Carlo method and showing that the mixing time grows only as <math>n \log(n)</math>. This example and the shuffling example possess the rapid mixing property, that the mixing time grows at most polynomially fast in <math>\log(n)</math> (number of states of the chain). Tools for proving rapid mixing include arguments based on
conductance and the method of
coupling. In broader uses of the Markov chain Monte Carlo method, rigorous justification of simulation results would require a theoretical bound on mixing time, and many interesting practical cases have resisted such theoretical analysis.
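The coupling method can be sketched concretely (our own illustration; the chain, a lazy random walk on a cycle, and all names are assumptions). Two copies of the walk are run jointly so that each copy, viewed on its own, evolves exactly as the lazy walk; by the coupling inequality, the probability that the copies have not yet met by time ''t'' upper-bounds the total variation distance from stationarity at time ''t'':

```python
# Sketch (illustrative, not from the article): the coupling method for a lazy
# random walk on an n-cycle.  At each step exactly one of the two walks moves
# by +-1 (chosen by a fair coin), so each walk is marginally lazy: it stays
# put with probability 1/2.  The walks are tracked until they first meet; the
# coupling inequality gives  ||P^t(x, .) - pi||_TV <= P(meeting time > t).
import random

def coupled_walks(n, x0, y0, rng):
    """Return the first step at which the two coupled walks meet."""
    x, y, t = x0, y0, 0
    while x != y:
        step = rng.choice([-1, 1])
        if rng.random() < 0.5:   # X moves this step, Y stays (is lazy)
            x = (x + step) % n
        else:                    # Y moves this step, X stays
            y = (y + step) % n
        t += 1
    return t

rng = random.Random(0)
n = 6
times = [coupled_walks(n, 0, 3, rng) for _ in range(10_000)]
# Monte Carlo estimate of P(meeting time > t), an upper bound on the
# worst-case total variation distance at time t.
for t in (5, 20, 50):
    frac = sum(tau > t for tau in times) / len(times)
    print(t, frac)
```

Because only one walk moves per step, the clockwise distance between them changes by exactly one each step and so must eventually hit zero, which is what makes this coupling succeed.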
See also
* Mixing (mathematics) for a formal definition of mixing
References
*{{citation
, last = Sinclair , first = Alistair
, doi = 10.1007/978-1-4612-0323-0
, isbn = 0-8176-3658-7
, mr = 1201590
, publisher = Birkhäuser Boston, Inc., Boston, MA
, series = Progress in Theoretical Computer Science
, title = Algorithms for random generation and counting: A Markov chain approach
, year = 1993}}.