Nearly Completely Decomposable Markov Chain
In probability theory, a nearly completely decomposable (NCD) Markov chain is a Markov chain where the state space can be partitioned in such a way that movement within a partition occurs much more frequently than movement between partitions. Particularly efficient algorithms exist to compute the stationary distribution of Markov chains with this property.

Definition

Ando and Fisher define a completely decomposable matrix as one where "an identical rearrangement of rows and columns leaves a set of square submatrices on the principal diagonal and zeros everywhere else." A nearly completely decomposable matrix is one where an identical rearrangement of rows and columns leaves a set of square submatrices on the principal diagonal and ''small nonzeros'' everywhere else.

Example

A Markov chain ...
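The excerpt is cut off before the example and does not show the algorithms it mentions. As a minimal sketch of the kind of method involved, the following hypothetical Python example implements the classical aggregation/disaggregation approximation (in the spirit of Simon and Ando): solve each strongly coupled block for its own stationary vector, solve the small "coupling" chain between blocks, and combine the two. The matrix P, the coupling strength eps, the block partition, and all function names below are illustrative assumptions, not taken from the article.

import numpy as np

def stationary(P):
    """Stationary row vector of a small row-stochastic matrix:
    solve pi P = pi together with sum(pi) = 1 by least squares."""
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

def ncd_approx(P, blocks):
    """Approximate the stationary distribution of an NCD chain by
    aggregation/disaggregation. blocks is a list of index lists
    partitioning the states."""
    # 1. Stationary vector of each diagonal block, renormalised to be row-stochastic.
    local = []
    for idx in blocks:
        B = P[np.ix_(idx, idx)]
        local.append(stationary(B / B.sum(axis=1, keepdims=True)))
    # 2. Aggregated chain between blocks (the "coupling" matrix).
    k = len(blocks)
    C = np.zeros((k, k))
    for I, idx_I in enumerate(blocks):
        for J, idx_J in enumerate(blocks):
            C[I, J] = local[I] @ P[np.ix_(idx_I, idx_J)].sum(axis=1)
    xi = stationary(C)
    # 3. Disaggregate: weight each block's local vector by its aggregate probability.
    pi = np.zeros(P.shape[0])
    for I, idx in enumerate(blocks):
        pi[idx] = xi[I] * local[I]
    return pi

# Hypothetical 4-state NCD chain: two 2-state blocks coupled with strength eps.
eps = 0.001
P = np.array([[0.7 - eps, 0.3,       eps,       0.0],
              [0.4,       0.6 - eps, 0.0,       eps],
              [eps,       0.0,       0.2 - eps, 0.8],
              [0.0,       eps,       0.5,       0.5 - eps]])

print(ncd_approx(P, [[0, 1], [2, 3]]))   # approximation built from small subproblems
print(stationary(P))                     # exact answer, for comparison

Because the off-block entries are of order eps, the approximation typically agrees with the exact stationary vector to roughly that order while only ever solving the small per-block and per-partition systems, which is what makes such methods attractive for NCD chains.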



Probability Theory
Probability theory is the branch of mathematics concerned with probability. Although there are several different probability interpretations, probability theory treats the concept in a rigorous mathematical manner by expressing it through a set of axioms. Typically these axioms formalise probability in terms of a probability space, which assigns a measure taking values between 0 and 1, termed the probability measure, to a set of outcomes called the sample space. Any specified subset of the sample space is called an event. Central subjects in probability theory include discrete and continuous random variables, probability distributions, and stochastic processes (which provide mathematical abstractions of non-deterministic or uncertain processes or measured quantities that may either be single occurrences or evolve over time in a random fashion). Although it is not possible to perfectly predict random events, much can be said about their behavior. Two major results in probability ...



Markov Chain
A Markov chain or Markov process is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. Informally, this may be thought of as, "What happens next depends only on the state of affairs ''now''." A countably infinite sequence, in which the chain moves state at discrete time steps, gives a discrete-time Markov chain (DTMC). A continuous-time process is called a continuous-time Markov chain (CTMC). It is named after the Russian mathematician Andrey Markov. Markov chains have many applications as statistical models of real-world processes, such as studying cruise control systems in motor vehicles, queues or lines of customers arriving at an airport, currency exchange rates and animal population dynamics. Markov processes are the basis for general stochastic simulation methods known as Markov chain Monte Carlo, which are used for simulating sampling from complex probability dist ...
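As a concrete illustration of the Markov property described in the excerpt, here is a minimal, hypothetical Python sketch (the two-state transition matrix, the random seed, and the function name are illustrative assumptions): the next state is sampled using only the row of the transition matrix belonging to the current state, never the earlier history.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state discrete-time Markov chain.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

def simulate(P, start, steps, rng):
    """Simulate a DTMC: each step depends only on the current state."""
    state = start
    path = [state]
    for _ in range(steps):
        state = int(rng.choice(len(P), p=P[state]))   # uses the current row only
        path.append(state)
    return path

print(simulate(P, start=0, steps=10, rng=rng))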


Stationary Distribution
Stationary distribution may refer to: * A special distribution for a Markov chain such that if the chain starts with its stationary distribution, the marginal distribution of all states at any time will always be the stationary distribution. Assuming irreducibility, the stationary distribution is always unique if it exists, and its existence can be implied by positive recurrence of all states. The stationary distribution has the interpretation of the limiting distribution when the chain is irreducible and aperiodic. * The marginal distribution of a stationary process or stationary time series * The set of joint probability distributions of a stationary process or stationary time series In some fields of application, the term stable distribution is used for the equivalent of a stationary (marginal) distribution, although in probability and statistics the term has a rather different meaning: see stable distribution. Crudely stated, all of the above are specific cases of a common ge ...
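To make the first sense above concrete, the following minimal sketch (with a hypothetical three-state transition matrix, not taken from the excerpt) computes the stationary distribution as the left eigenvector of P for eigenvalue 1 and checks that starting from it leaves the marginal distribution unchanged.

import numpy as np

# Hypothetical irreducible, aperiodic three-state chain.
P = np.array([[0.5, 0.25, 0.25],
              [0.2, 0.6,  0.2 ],
              [0.3, 0.3,  0.4 ]])

# Left eigenvector of P for eigenvalue 1, normalised to sum to 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()

print(np.allclose(pi @ P, pi))                              # True: one step preserves the marginal
print(np.allclose(pi @ np.linalg.matrix_power(P, 50), pi))  # True: so does any number of steps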


SIAM Journal On Algebraic And Discrete Methods
The ''SIAM Journal on Matrix Analysis and Applications'' (until 1989: ''SIAM Journal on Algebraic and Discrete Methods'') is a peer-reviewed scientific journal covering matrix analysis and its applications. The relevant applications include signal processing, systems and control theory, statistics, Markov chains, mathematical biology, graph theory, and data science. The journal is published by the Society for Industrial and Applied Mathematics. The founding editor-in-chief was Gene H. Golub, who established the journal in 1980. The current editor is Michele Benzi (Scuola Normale Superiore). See also *Michele Benzi Michele Benzi (born 1962 in Bologna) is an Italian mathematician who works as a full professor in the Scuola Normale Superiore in Pisa. He is known for his contributions to numerical linear algebra and its applications, especially to the solu ... External links * Mathematics journals Publications established in 1980 English-language journals Quarterly jou ...


Albert Ando
Albert Ando was a Japanese-born economist. Biography He was born in Tokyo, as a member of the family running Ando Corporation, a major construction company. He did not join the family business and came to the United States after World War II. He received his B.S. in economics from Seattle University in 1951, his M.A. in economics from St. Louis University in 1953, and an M.S. in economics in 1956 and a Ph.D. in mathematical economics in 1959 from Carnegie Institute of Technology (now Carnegie Mellon University). At Carnegie Mellon he collaborated, among others, with Herbert A. Simon on questions regarding aggregation and causation in economic systems, and with Franco Modigliani on the life-cycle analysis of saving, spending, and income. Albert Ando was a tenured professor of economics and finance at the University of Pennsylvania from 1967 until his death from leukemia in 2002. Awards and fellowships * Ford Foundation Faculty Research Fellow, 1970 * Japan Foundation Fellow * Alexander ...




Franklin M
Franklin may refer to: People * Franklin (given name) * Franklin (surname) * Franklin (class), a member of a historical English social class Places Australia * Franklin, Tasmania, a township * Division of Franklin, federal electoral division in Tasmania * Division of Franklin (state), state electoral division in Tasmania * Franklin, Australian Capital Territory, a suburb in the Canberra district of Gungahlin * Franklin River, river of Tasmania * Franklin Sound, waterway of Tasmania Canada * District of Franklin, a former district of the Northwest Territories * Franklin, Quebec, a municipality in the Montérégie region * Rural Municipality of Franklin, Manitoba * Franklin, Manitoba, an unincorporated community in the Rural Municipality of Rosedale, Manitoba * Franklin Glacier Complex, a volcano in southwestern British Columbia * Franklin Range, a mountain range on Vancouver Island, British Columbia * Franklin River (Vancouver Island), British Columbia * Franklin Strait ...



Submatrices
In mathematics, a matrix (plural matrices) is a rectangular array or table of numbers, symbols, or expressions, arranged in rows and columns, which is used to represent a mathematical object or a property of such an object. For example, \begin{bmatrix}1 & 9 & -13 \\ 20 & 5 & -6\end{bmatrix} is a matrix with two rows and three columns. This is often referred to as a "two by three matrix", a "2 \times 3 matrix", or a matrix of dimension 2 \times 3. Without further specifications, matrices represent linear maps, and allow explicit computations in linear algebra. Therefore, the study of matrices is a large part of linear algebra, and most properties and operations of abstract linear algebra can be expressed in terms of matrices. For example, matrix multiplication represents composition of linear maps. Not all matrices are related to linear algebra. This is, in particular, the case in graph theory, of incidence matrices, and adjacency matrices. ''This article focuses on matrices related to linear algebra, and, unles ...


Principal Diagonal
In linear algebra, the main diagonal (sometimes principal diagonal, primary diagonal, leading diagonal, major diagonal, or good diagonal) of a matrix A is the list of entries a_{ij} where i = j. All off-diagonal elements are zero in a diagonal matrix. The following four matrices have their main diagonals indicated by red ones: :\begin{bmatrix} \color{red}{1} & 0 & 0\\ 0 & \color{red}{1} & 0\\ 0 & 0 & \color{red}{1}\end{bmatrix} \qquad \begin{bmatrix} \color{red}{1} & 0 & 0 & 0 \\ 0 & \color{red}{1} & 0 & 0 \\ 0 & 0 & \color{red}{1} & 0 \end{bmatrix} \qquad \begin{bmatrix} \color{red}{1} & 0 & 0 \\ 0 & \color{red}{1} & 0 \\ 0 & 0 & \color{red}{1} \\ 0 & 0 & 0 \end{bmatrix} \qquad \begin{bmatrix} \color{red}{1} & 0 & 0 & 0 \\ 0 & \color{red}{1} & 0 & 0 \\ 0 & 0 & \color{red}{1} & 0 \\ 0 & 0 & 0 & \color{red}{1} \end{bmatrix} Antidiagonal The antidiagonal (sometimes counter diagonal, secondary diagonal, trailing diagonal, minor diagonal, off diagonal, or bad diagonal) of an order N square matrix B is the collection of entries b_{ij} such that i + j = N + 1 for all 1 \leq i, j \leq N. That is, it runs from the top right corner to the bottom left corner. ...



Stochastic Matrix
In mathematics, a stochastic matrix is a square matrix used to describe the transitions of a Markov chain. Each of its entries is a nonnegative real number representing a probability. It is also called a probability matrix, transition matrix, substitution matrix, or Markov matrix. The stochastic matrix was first developed by Andrey Markov at the beginning of the 20th century, and has found use throughout a wide variety of scientific fields, including probability theory, statistics, mathematical finance and linear algebra, as well as computer science and population genetics. There are several different definitions and types of stochastic matrices: :A right stochastic matrix is a real square matrix, with each row summing to 1. :A left stochastic matrix is a real square matrix, with each column summing to 1. :A doubly stochastic matrix is a square matrix of nonnegative real numbers with each row and column summing to 1. In the same vein, one may define a stochastic vector (also ...
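The three definitions quoted in the excerpt translate directly into row-sum and column-sum checks. A minimal sketch, using a hypothetical 2 x 2 matrix and helper names chosen here for illustration:

import numpy as np

def is_right_stochastic(M):
    """Nonnegative entries, every row sums to 1."""
    return bool(np.all(M >= 0) and np.allclose(M.sum(axis=1), 1.0))

def is_left_stochastic(M):
    """Nonnegative entries, every column sums to 1."""
    return bool(np.all(M >= 0) and np.allclose(M.sum(axis=0), 1.0))

def is_doubly_stochastic(M):
    """Nonnegative entries, every row and every column sums to 1."""
    return is_right_stochastic(M) and is_left_stochastic(M)

P = np.array([[0.9, 0.1],
              [0.3, 0.7]])
print(is_right_stochastic(P), is_left_stochastic(P), is_doubly_stochastic(P))
# True False False: the rows sum to 1 but the columns do not.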


Lumpability
In probability theory, lumpability is a method for reducing the size of the state space of some continuous-time Markov chains, first published by Kemeny and Snell. Definition Suppose that the complete state-space of a Markov chain is divided into disjoint subsets of states, where these subsets are denoted by ''ti''. This forms a partition T = \{t_1, t_2, \ldots\} of the states. Both the state-space and the collection of subsets may be either finite or countably infinite. A continuous-time Markov chain \{X_t\} is lumpable with respect to the partition ''T'' if and only if, for any subsets ''ti'' and ''tj'' in the partition, and for any states ''n,n''' in subset ''ti'', : \sum_{m \in t_j} q(n,m) = \sum_{m \in t_j} q(n',m) , where ''q''(''i,j'') is the transition rate from state ''i'' to state ''j''. Similarly, for a stochastic matrix ''P'', ''P'' is a lumpable matrix on a partition ''T'' if and only if, for any subsets ''ti'' and ''tj'' in the partition, and for any states ''n,n''' in subset ''ti'', : \sum_{m \in t_j} p(n,m) ...
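For the stochastic-matrix form of the condition, the check is mechanical: a partition is lumpable exactly when, for every ordered pair of blocks, all states in the first block have the same total probability of moving into the second. A minimal sketch (the 4-state matrix, the function name, and both partitions below are hypothetical, not from the excerpt):

import numpy as np

def is_lumpable(P, partition, tol=1e-12):
    """Check lumpability of a stochastic matrix P with respect to a
    partition given as a list of lists of state indices."""
    for t_i in partition:
        for t_j in partition:
            # Total probability of moving from each state of t_i into t_j.
            totals = P[np.ix_(t_i, t_j)].sum(axis=1)
            if not np.allclose(totals, totals[0], atol=tol):
                return False
    return True

P = np.array([[0.6, 0.2, 0.1, 0.1],
              [0.3, 0.5, 0.2, 0.0],
              [0.1, 0.2, 0.4, 0.3],
              [0.2, 0.1, 0.3, 0.4]])

print(is_lumpable(P, [[0, 1], [2, 3]]))   # True: every state in a block has the same exit probabilities
print(is_lumpable(P, [[0, 2], [1, 3]]))   # False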