Discrete-Time Markov Chain
In probability, a discrete-time Markov chain (DTMC) is a sequence of random variables, known as a stochastic process, in which the value of the next variable depends only on the value of the current variable, and not on any variables in the past. For instance, a machine may have two states, ''A'' and ''E''. When it is in state ''A'', there is a 40% chance of it moving to state ''E'' and a 60% chance of it remaining in state ''A''. When it is in state ''E'', there is a 70% chance of it moving to ''A'' and a 30% chance of it staying in ''E''. The sequence of states of the machine is a Markov chain. If we denote the chain by X_0, X_1, X_2, \ldots then X_0 is the state which the machine starts in and X_{10} is the random variable describing its state after 10 transitions. The process continues forever, indexed by the natural numbers. An example of a stochastic process which is not a Markov chain is the model of a machine which has states ''A'' and ''E'' and moves to ''A'' from either state with 50% probability if it has ever visited ''A'' before, and 20% probability if it has never visited ''A'' before. Such a machine's behaviour depends on its whole history rather than on its current state alone, so the Markov property fails.
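To make this concrete, here is a minimal Python sketch simulating the two-state machine above (the function name and the 100,000-step horizon are illustrative choices, not part of the article):

```python
import random

# Transition probabilities for the two-state machine described above:
# from A: 60% stay in A, 40% move to E; from E: 70% move to A, 30% stay in E.
P = {"A": {"A": 0.6, "E": 0.4},
     "E": {"A": 0.7, "E": 0.3}}

def step(state):
    """Sample the next state given the current one (the Markov property:
    nothing but the current state is consulted)."""
    return "A" if random.random() < P[state]["A"] else "E"

state, visits = "A", {"A": 0, "E": 0}
for _ in range(100_000):
    visits[state] += 1
    state = step(state)

# Long-run fractions approach the stationary distribution of the chain.
print({s: v / 100_000 for s, v in visits.items()})
```

The long-run fractions settle near 7/11 \approx 0.636 for ''A'' and 4/11 \approx 0.364 for ''E'', the stationary distribution of this chain.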
Equivalence Relation
In mathematics, an equivalence relation is a binary relation that is reflexive, symmetric and transitive. The equipollence relation between line segments in geometry is a common example of an equivalence relation. Each equivalence relation provides a partition of the underlying set into disjoint equivalence classes. Two elements of the given set are equivalent to each other if and only if they belong to the same equivalence class.
Notation
Various notations are used in the literature to denote that two elements a and b of a set are equivalent with respect to an equivalence relation R; the most common are "a \sim b" and "a \equiv b", which are used when R is implicit, and variations of "a \sim_R b", "a \equiv_R b", or "aRb" to specify R explicitly. Non-equivalence may be written "a \nsim b" or "a \not\equiv b".
Definition
A binary relation \,\sim\, on a set X is said to be an equivalence relation if and only if it is reflexive, symmetric and transitive. That is, for all a, b, and c in X:
* a \sim a (''reflexivity''),
* a \sim b implies b \sim a (''symmetry''), and
* if a \sim b and b \sim c then a \sim c (''transitivity'').
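A short sketch of the definition in code, assuming congruence modulo 3 on a small set as the example relation (my choice; any equivalence relation would do); it checks the three properties exhaustively and recovers the partition into classes:

```python
from itertools import product

# Congruence modulo 3 on X = {0, ..., 8}, checked directly against the
# three defining properties of an equivalence relation.
X = range(9)
equiv = lambda a, b: a % 3 == b % 3  # the relation a ~ b

reflexive  = all(equiv(a, a) for a in X)
symmetric  = all(equiv(b, a) for a, b in product(X, X) if equiv(a, b))
transitive = all(equiv(a, c)
                 for a, b, c in product(X, X, X)
                 if equiv(a, b) and equiv(b, c))
print(reflexive, symmetric, transitive)  # True True True

# Each equivalence relation partitions X into disjoint equivalence classes:
classes = {a % 3: [b for b in X if equiv(a, b)] for a in X}
print(classes)  # {0: [0, 3, 6], 1: [1, 4, 7], 2: [2, 5, 8]}
```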
Matrix Norm
In mathematics, a matrix norm is a vector norm in a vector space whose elements (vectors) are matrices (of given dimensions).
Preliminaries
Given a field K of either real or complex numbers, let K^{m \times n} be the K-vector space of matrices with m rows and n columns and entries in the field K. A matrix norm is a norm on K^{m \times n}. This article will always write such norms with double vertical bars (like so: \|A\|). Thus, the matrix norm is a function \|\cdot\| : K^{m \times n} \to \R that must satisfy the following properties: For all scalars \alpha \in K and matrices A, B \in K^{m \times n},
* \|A\| \ge 0 (''positive-valued'')
* \|A\| = 0 \iff A = 0_{m,n} (''definite'')
* \|\alpha A\| = |\alpha| \, \|A\| (''absolutely homogeneous'')
* \|A + B\| \le \|A\| + \|B\| (''sub-additive'' or satisfying the ''triangle inequality'')
The only feature distinguishing matrices from rearranged vectors is multiplication. Matrix norms are particularly useful if they are also sub-multiplicative:
* \|AB\| \le \|A\| \, \|B\|
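A brief numerical sketch of these axioms, using NumPy's built-in Frobenius norm on random matrices (the choice of norm and the matrix shapes are illustrative, not from the article):

```python
import numpy as np

# Check the triangle inequality, sub-multiplicativity, and absolute
# homogeneity for the Frobenius norm on random matrices.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((3, 4))
C = rng.standard_normal((4, 5))

fro = lambda M: np.linalg.norm(M, "fro")

print(fro(A + B) <= fro(A) + fro(B))            # triangle inequality: True
print(fro(A @ C) <= fro(A) * fro(C))            # sub-multiplicative: True
print(np.isclose(fro(2.5 * A), 2.5 * fro(A)))   # absolute homogeneity: True
```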
Markov (Surname)
Markov (Bulgarian, Russian: Марков), Markova, and Markoff are common surnames used in Russia and Bulgaria. Notable people with the name include:
Academics
*Ivana Markova (born 1938), Czechoslovak-British emeritus professor of psychology at the University of Stirling
*John Markoff (sociologist) (born 1942), American professor of sociology and history at the University of Pittsburgh
*Konstantin Markov (1905–1980), Soviet geomorphologist and quaternary geologist
Mathematics, science, and technology
*Alexander V. Markov (born 1965), Russian biologist
*Andrey Markov (1856–1922), Russian mathematician
*Vladimir Andreevich Markov (1871–1897), Russian mathematician, brother of Andrey Markov (Sr.)
*Andrey Markov Jr. (1903–1979), Russian mathematician and son of Andrey Markov
*John Markoff (born 1949), American journalist of the computer industry and technology
*Moisey Markov (1908–1994), Russian physicist
Performing arts
*Albert Markov, Russian American violinist, composer
*Alexan ...
Inner Product Space
In mathematics, an inner product space (or, rarely, a Hausdorff pre-Hilbert space) is a real vector space or a complex vector space with an operation called an inner product. The inner product of two vectors in the space is a scalar, often denoted with angle brackets such as in \langle a, b \rangle. Inner products allow formal definitions of intuitive geometric notions, such as lengths, angles, and orthogonality (zero inner product) of vectors. Inner product spaces generalize Euclidean vector spaces, in which the inner product is the dot product or ''scalar product'' of Cartesian coordinates. Inner product spaces of infinite dimension are widely used in functional analysis. Inner product spaces over the field of complex numbers are sometimes referred to as unitary spaces. The first usage of the concept of a vector space with an inner product is due to Giuseppe Peano, in 1898. An inner product naturally induces an associated norm \|x\| = \sqrt{\langle x, x \rangle}; so, every inner product space is also a normed vector space.
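A minimal sketch, assuming the standard dot product on \R^3 as the inner product; it shows the induced norm, an angle, and an orthogonality check:

```python
import numpy as np

# The dot product on R^3 is one example of an inner product; from it we
# recover lengths, angles, and orthogonality, as described above.
x = np.array([1.0, 2.0, 2.0])
y = np.array([2.0, -1.0, 0.0])

inner = np.dot(x, y)                 # <x, y>
norm_x = np.sqrt(np.dot(x, x))       # induced norm ||x|| = sqrt(<x, x>)
norm_y = np.sqrt(np.dot(y, y))

print(inner)   # 0.0 -> x and y are orthogonal (zero inner product)
print(norm_x)  # 3.0

# Angle recovered from the inner product: cos(theta) = <x,y> / (||x|| ||y||)
theta = np.arccos(inner / (norm_x * norm_y))
print(np.degrees(theta))  # 90.0
```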
Kolmogorov's Criterion
In probability theory, Kolmogorov's criterion, named after Andrey Kolmogorov, is a theorem giving a necessary and sufficient condition for a Markov chain or continuous-time Markov chain to be stochastically identical to its time-reversed version.
Discrete-time Markov chains
The theorem states that an irreducible, positive recurrent, aperiodic Markov chain with transition matrix ''P'' is reversible if and only if its transition probabilities satisfy
: p_{j_1 j_2} p_{j_2 j_3} \cdots p_{j_{n-1} j_n} p_{j_n j_1} = p_{j_1 j_n} p_{j_n j_{n-1}} \cdots p_{j_3 j_2} p_{j_2 j_1}
for all finite sequences of states
: j_1, j_2, \ldots, j_n \in S .
Here p_{ij} are components of the transition matrix ''P'', and ''S'' is the state space of the chain.
Example
Consider a section of a Markov chain with states ''i'', ''j'', ''k'' and ''l'' and the corresponding transition probabilities. Here Kolmogorov's criterion implies that the product of probabilities when traversing through any closed loop must be equal, so the product around the loop ''i'' to ''j'' to ''l'' to ''k'' returning to ''i'' must equal the product around the same loop in the reverse direction:
: p_{ij} p_{jl} p_{lk} p_{ki} = p_{ik} p_{kl} p_{lj} p_{ji} .
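A hedged brute-force sketch of the criterion (the 3-state matrix is my own example, not from the article): it compares the probability product around every short loop with the product around the reversed loop:

```python
import numpy as np
from itertools import permutations

# This symmetric P satisfies detailed balance (uniform stationary
# distribution, p_ij = p_ji), so every loop check should pass.
P = np.array([[0.5,  0.25, 0.25],
              [0.25, 0.5,  0.25],
              [0.25, 0.25, 0.5]])

def loop_product(P, states):
    """Product of transition probabilities around the closed loop
    states[0] -> states[1] -> ... -> states[-1] -> states[0]."""
    cycle = list(states) + [states[0]]
    return np.prod([P[a, b] for a, b in zip(cycle, cycle[1:])])

ok = all(np.isclose(loop_product(P, loop), loop_product(P, loop[::-1]))
         for n in (2, 3)
         for loop in permutations(range(3), n))
print(ok)  # True -> the chain is reversible by Kolmogorov's criterion
```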
Detailed Balance
The principle of detailed balance can be used in kinetic systems which are decomposed into elementary processes (collisions, or steps, or elementary reactions). It states that at equilibrium, each elementary process is in equilibrium with its reverse process.
History
The principle of detailed balance was explicitly introduced for collisions by Ludwig Boltzmann. In 1872, he proved his H-theorem using this principle. (Boltzmann, L. (1964). ''Lectures on Gas Theory''. Berkeley, CA, USA: University of California Press.) The arguments in favor of this property are founded upon microscopic reversibility. (Tolman, R. C. (1938). ''The Principles of Statistical Mechanics''. Oxford University Press, London, UK.) Five years before Boltzmann, James Clerk Maxwell used the principle of detailed balance for gas kinetics with reference to the principle of sufficient reason. He compared the idea of detailed balance with other types of balancing (like cyclic balance) and found that "Now it is impossible to assign a reason why the successive velocities of a molecule should be arranged in this cycle, rather than in the reverse order."
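One standard instance of the principle is detailed balance for Markov chains: at equilibrium, the probability flow of each transition equals the flow of its reverse. As a small numerical sketch (reusing the two-state machine from the first entry; the check itself is generic):

```python
import numpy as np

# Detailed balance for the two-state chain with states A and E: at
# equilibrium, the flow A -> E must equal the reverse flow E -> A,
# i.e. pi_i * p_ij == pi_j * p_ji for the stationary distribution pi.
P = np.array([[0.6, 0.4],    # from A
              [0.7, 0.3]])   # from E
pi = np.array([7/11, 4/11])  # stationary distribution (solves pi P = pi)

print(np.allclose(pi @ P, pi))                       # stationarity: True
print(np.isclose(pi[0] * P[0, 1], pi[1] * P[1, 0]))  # detailed balance: True
```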
Absorbing Markov Chain
In the mathematical theory of probability, an absorbing Markov chain is a Markov chain in which every state can reach an absorbing state. An absorbing state is a state that, once entered, cannot be left. Like general Markov chains, there can be continuous-time absorbing Markov chains with an infinite state space. However, this article concentrates on the discrete-time discrete-state-space case.
Formal definition
A Markov chain is an absorbing chain if
# there is at least one absorbing state and
# it is possible to go from any state to at least one absorbing state in a finite number of steps.
In an absorbing Markov chain, a state that is not absorbing is called transient.
Canonical form
Let an absorbing Markov chain with transition matrix ''P'' have ''t'' transient states and ''r'' absorbing states. Unlike a typical transition matrix, the rows of ''P'' represent sources, while columns represent destinations. Then
: P = \begin{pmatrix} Q & R \\ \mathbf{0} & I_r \end{pmatrix},
where ''Q'' is a ''t''-by-''t'' matrix of transition probabilities between transient states, ''R'' is a nonzero ''t''-by-''r'' matrix of probabilities from transient to absorbing states, \mathbf{0} is an ''r''-by-''t'' zero matrix, and ''I_r'' is the ''r''-by-''r'' identity matrix.
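A minimal sketch of the canonical form in use (the symmetric random walk on 0..4 is my own example, not from the article): the fundamental matrix N = (I - Q)^{-1} yields expected steps to absorption N\mathbf{1} and absorption probabilities NR:

```python
import numpy as np

# A walk on states 0..4 where 0 and 4 absorb and the interior states
# 1, 2, 3 move left or right with probability 1/2 each.
Q = np.array([[0.0, 0.5, 0.0],   # transient states 1, 2, 3
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.0]])
R = np.array([[0.5, 0.0],        # to absorbing states 0 and 4
              [0.0, 0.0],
              [0.0, 0.5]])

N = np.linalg.inv(np.eye(3) - Q)  # fundamental matrix N = (I - Q)^{-1}
steps = N @ np.ones(3)            # expected steps before absorption
B = N @ R                         # absorption probabilities

print(steps)  # [3. 4. 3.]
print(B)      # starting at state 2: 50/50 to be absorbed at 0 or 4
```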
Expected Value
In probability theory, the expected value (also called expectation, expectancy, mathematical expectation, mean, average, or first moment) is a generalization of the weighted average. Informally, the expected value is the arithmetic mean of a large number of independently selected outcomes of a random variable. The expected value of a random variable with a finite number of outcomes is a weighted average of all possible outcomes. In the case of a continuum of possible outcomes, the expectation is defined by integration. In the axiomatic foundation for probability provided by measure theory, the expectation is given by Lebesgue integration. The expected value of a random variable X is often denoted by E(X), E[X], or EX, with E also often stylized as \mathrm{E} or \mathbb{E}.
History
The idea of the expected value originated in the middle of the 17th century from the study of the so-called problem of points, which seeks to divide the stakes ''in a fair way'' between two players, who have to end their game before it is properly finished.
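A tiny sketch of the finite-outcome case, using a fair six-sided die (my example, not from the article): the exact weighted average, and the mean of many independent draws approaching it:

```python
import random

# Expected value of a fair die: exact weighted average of outcomes, then
# the empirical mean of many independent draws, which converges to it.
outcomes = [1, 2, 3, 4, 5, 6]
exact = sum(x * (1/6) for x in outcomes)  # weighted average = 3.5

n = 100_000
empirical = sum(random.choice(outcomes) for _ in range(n)) / n
print(exact, round(empirical, 2))         # 3.5, approximately 3.5
```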
If And Only If
In logic and related fields such as mathematics and philosophy, "if and only if" (shortened as "iff") is a biconditional logical connective between statements, where either both statements are true or both are false. The connective is biconditional (a statement of material equivalence), and can be likened to the standard material conditional ("only if", equal to "if ... then") combined with its reverse ("if"); hence the name. The result is that the truth of either one of the connected statements requires the truth of the other (i.e. either both statements are true, or both are false), though it is controversial whether the connective thus defined is properly rendered by the English "if and only if"—with its pre-existing meaning. For example, ''P if and only if Q'' means that ''P'' is true whenever ''Q'' is true, and the only case in which ''P'' is true is if ''Q'' is also true, whereas in the case of ''P if Q'', there could be other scenarios where ''P'' is true and ''Q'' is false.
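A small truth-table sketch in Python (variable names are illustrative): ''P iff Q'' holds exactly when the two statements agree, and coincides with the conjunction of ''P only if Q'' and ''P if Q'':

```python
from itertools import product

# Truth table of the biconditional P <-> Q, which is true exactly when
# P and Q have the same truth value.
for p, q in product([True, False], repeat=2):
    iff = (p == q)               # P if and only if Q
    only_if = (not p) or q       # P only if Q  (P -> Q)
    if_ = (not q) or p           # P if Q       (Q -> P)
    print(p, q, iff, iff == (only_if and if_))  # last column: always True
```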
Greatest Common Divisor
In mathematics, the greatest common divisor (GCD) of two or more integers, which are not all zero, is the largest positive integer that divides each of the integers. For two integers ''x'', ''y'', the greatest common divisor of ''x'' and ''y'' is denoted \gcd(x, y). For example, the GCD of 8 and 12 is 4, that is, \gcd(8, 12) = 4. In the name "greatest common divisor", the adjective "greatest" may be replaced by "highest", and the word "divisor" may be replaced by "factor", so that other names include highest common factor (hcf), etc. Historically, other names for the same concept have included greatest common measure. This notion can be extended to polynomials (see Polynomial greatest common divisor) and other commutative rings (see below).
Overview
Definition
The ''greatest common divisor'' (GCD) of two nonzero integers ''a'' and ''b'' is the greatest positive integer ''d'' such that ''d'' is a divisor of both ''a'' and ''b''; that is, there are integers ''e'' and ''f'' such that a = de and b = df, and ''d'' is the largest such integer.
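A standard sketch of computing the GCD via the Euclidean algorithm (the overview above does not prescribe an algorithm; this is the classical method):

```python
def gcd(x: int, y: int) -> int:
    """Greatest common divisor via the Euclidean algorithm:
    gcd(x, y) equals gcd(y, x mod y), and gcd(x, 0) = |x|."""
    while y:
        x, y = y, x % y
    return abs(x)

print(gcd(8, 12))   # 4, matching the example above
print(gcd(48, 18))  # 6
```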