Lumpable Markov Chain
In probability theory, lumpability is a method for reducing the size of the state space of some continuous-time Markov chains, first published by Kemeny and Snell.

Definition

Suppose that the complete state-space of a Markov chain is divided into disjoint subsets of states, where these subsets are denoted by ''ti''. This forms a partition \scriptstyle T = \{t_1, t_2, \ldots\} of the states. Both the state-space and the collection of subsets may be either finite or countably infinite. A continuous-time Markov chain \{X(t)\} is lumpable with respect to the partition ''T'' if and only if, for any subsets ''ti'' and ''tj'' in the partition, and for any states ''n, n′'' in subset ''ti'',
: \sum_{m \in t_j} q(n,m) = \sum_{m \in t_j} q(n',m) ,
where ''q''(''i'',''j'') is the transition rate from state ''i'' to state ''j''.
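The condition above can be checked directly on a finite generator (rate) matrix. Below is a minimal sketch, assuming a finite chain with integer states 0..N-1, a NumPy rate matrix and a partition given as a list of state lists; the names `Q`, `partition` and `is_lumpable` are illustrative, not from the source.

```python
# Minimal sketch: check lumpability of a finite CTMC with generator matrix Q.
# Assumptions: states are 0..N-1 and `partition` is a list of disjoint state lists.
import numpy as np

def is_lumpable(Q, partition, tol=1e-12):
    """True if the chain with generator Q is lumpable w.r.t. `partition`."""
    for t_i in partition:
        for t_j in partition:
            if t_i is t_j:
                # The within-subset case follows automatically because each row of Q sums to 0.
                continue
            # Aggregate rate from each state n in t_i into the subset t_j ...
            rates = [sum(Q[n, m] for m in t_j) for n in t_i]
            # ... must be the same for every n in t_i.
            if max(rates) - min(rates) > tol:
                return False
    return True

# Illustrative example (assumed values): states 0 and 1 both move into {2} at total rate 2.
Q = np.array([[-3.0,  1.0,  2.0],
              [ 1.5, -3.5,  2.0],
              [ 1.0,  1.0, -2.0]])
print(is_lumpable(Q, [[0, 1], [2]]))  # True
```

Only pairs of distinct subsets need to be compared: because every row of a generator matrix sums to zero, equality of the cross-subset sums forces equality of the within-subset sums as well.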

Probability Theory
Probability theory is the branch of mathematics concerned with probability. Although there are several different probability interpretations, probability theory treats the concept in a rigorous mathematical manner by expressing it through a set of axioms. Typically these axioms formalise probability in terms of a probability space, which assigns a measure taking values between 0 and 1, termed the probability measure, to a set of outcomes called the sample space. Any specified subset of the sample space is called an event. Central subjects in probability theory include discrete and continuous random variables, probability distributions, and stochastic processes (which provide mathematical abstractions of non-deterministic or uncertain processes or measured quantities that may either be single occurrences or evolve over time in a random fashion). Although it is not possible to perfectly predict random events, much can be said about their behavior. Two major results in probability ...
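As a toy illustration of the framing above, the sketch below builds a finite sample space, assigns a probability measure whose total mass is 1, and evaluates the probability of an event (a subset of the sample space). The two-dice example itself is illustrative, not from the source.

```python
# Sketch: a finite probability space (sample space, probability measure, events).
from fractions import Fraction
from itertools import product

# Sample space: all ordered outcomes of rolling two fair six-sided dice.
sample_space = list(product(range(1, 7), repeat=2))
# Probability measure: uniform, so every outcome gets mass 1/36 and the total is 1.
P = {outcome: Fraction(1, 36) for outcome in sample_space}
assert sum(P.values()) == 1

# An event is a subset of the sample space; its probability is the measure of that subset.
event = {outcome for outcome in sample_space if sum(outcome) == 7}
print(sum(P[o] for o in event))  # 1/6
```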

Continuous-time Markov Chain
A continuous-time Markov chain (CTMC) is a continuous stochastic process in which, for each state, the process will change state according to an exponential random variable and then move to a different state as specified by the probabilities of a stochastic matrix. An equivalent formulation describes the process as changing state according to the least value of a set of exponential random variables, one for each possible state it can move to, with the parameters determined by the current state.

An example of a CTMC with three states \{0, 1, 2\} is as follows: the process makes a transition after the amount of time specified by the holding time—an exponential random variable E_i, where ''i'' is its current state. Each random variable is independent and such that E_0 \sim \text{Exp}(6), E_1 \sim \text{Exp}(12) and E_2 \sim \text{Exp}(18). When a transition is to be made, the process moves according to the jump chain, a discrete-time Markov chain with stochastic matrix:
: \begin{pmatrix} 0 & \tfrac{1}{2} & \tfrac{1}{2} \\ \tfrac{1}{3} & 0 & ...
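A minimal simulation sketch of a CTMC like the one above: sample the exponential holding time for the current state, then pick the next state from the jump chain. The holding-time rates 6, 12 and 18 come from the text; the jump-chain probabilities, variable names and function name are illustrative assumptions (the matrix is truncated in the preview).

```python
# Sketch: simulate a three-state CTMC from holding-time rates and a jump chain.
import random

rates = {0: 6.0, 1: 12.0, 2: 18.0}           # E_i ~ Exp(rates[i]) holding times (from the text)
jump_chain = {                                # illustrative jump-chain probabilities (assumption)
    0: [(1, 0.5), (2, 0.5)],
    1: [(0, 1/3), (2, 2/3)],
    2: [(0, 5/6), (1, 1/6)],
}

def simulate(state, t_end):
    """Yield (time, state) pairs until the clock passes t_end."""
    t = 0.0
    while t < t_end:
        yield t, state
        t += random.expovariate(rates[state])              # exponential holding time
        targets, probs = zip(*jump_chain[state])
        state = random.choices(targets, weights=probs)[0]  # move according to the jump chain

for time, state in simulate(state=0, t_end=1.0):
    print(f"t = {time:.3f}, state = {state}")
```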

John George Kemeny
John George Kemeny (born Kemény János György; May 31, 1926 – December 26, 1992) was a Hungarian-born American mathematician, computer scientist, and educator best known for co-developing the BASIC programming language in 1964 with Thomas E. Kurtz and Mary Kenneth Keller. Kemeny served as the 13th President of Dartmouth College from 1970 to 1981 and pioneered the use of computers in college education. Kemeny chaired the presidential commission that investigated the Three Mile Island accident in 1979. According to György Marx he was one of The Martians.

Early life

Born in Budapest, Hungary, into a Jewish family, Kemeny attended the Rácz private primary school in Budapest and was a classmate of Nándor Balázs. In 1938 his father left for the United States alone; in 1940 he brought the whole Kemeny family to the United States when the adoption of the second anti-Jewish law in Hungary became imminent. Kemeny's grandfather, however, refused to leave and was murdered in the Holocaust ...

Markov Chain
A Markov chain or Markov process is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. Informally, this may be thought of as, "What happens next depends only on the state of affairs ''now''." A countably infinite sequence, in which the chain moves state at discrete time steps, gives a discrete-time Markov chain (DTMC). A continuous-time process is called a continuous-time Markov chain (CTMC). It is named after the Russian mathematician Andrey Markov. Markov chains have many applications as statistical models of real-world processes, such as studying cruise control systems in motor vehicles, queues or lines of customers arriving at an airport, currency exchange rates and animal population dynamics. Markov processes are the basis for general stochastic simulation methods known as Markov chain Monte Carlo, which are used for simulating sampling from complex probability dist ...
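To make the "what happens next depends only on the state of affairs now" property concrete, here is a minimal discrete-time sketch in which the next state is drawn from a distribution that depends only on the current state. The two-state weather chain and its transition probabilities are illustrative assumptions, not from the source.

```python
# Sketch: a discrete-time Markov chain; the next state depends only on the current one.
import random

# Illustrative transition probabilities (each row sums to 1).
transitions = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state):
    nxt = list(transitions[state])
    weights = [transitions[state][s] for s in nxt]
    return random.choices(nxt, weights=weights)[0]

state, path = "sunny", []
for _ in range(10):
    state = step(state)   # no memory of anything before the current state
    path.append(state)
print(path)
```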

Partition Of A Set
In mathematics, a partition of a set is a grouping of its elements into non-empty subsets, in such a way that every element is included in exactly one subset. Every equivalence relation on a set defines a partition of this set, and every partition defines an equivalence relation. A set equipped with an equivalence relation or a partition is sometimes called a setoid, typically in type theory and proof theory.

Definition and Notation

A partition of a set ''X'' is a set of non-empty subsets of ''X'' such that every element ''x'' in ''X'' is in exactly one of these subsets (i.e., ''X'' is a disjoint union of the subsets). Equivalently, a family of sets ''P'' is a partition of ''X'' if and only if all of the following conditions hold:
*The family ''P'' does not contain the empty set (that is \emptyset \notin P).
*The union of the sets in ''P'' is equal to ''X'' (that is \textstyle\bigcup_{A \in P} A = X). The sets in ''P'' are said to exhaust or cover ''X''. See also collectively exhaus ...
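A direct sketch of the conditions just listed: given a ground set and a candidate family of subsets, check that no subset is empty, that the union equals the set, and that every element lies in exactly one subset. The function name and example sets are illustrative.

```python
# Sketch: verify that a family of subsets is a partition of X.
def is_partition(X, family):
    family = [set(A) for A in family]
    if any(len(A) == 0 for A in family):           # no empty set allowed
        return False
    union = set().union(*family) if family else set()
    if union != set(X):                            # the subsets must exhaust / cover X
        return False
    # Every element must lie in exactly one subset (i.e. the subsets are pairwise disjoint).
    return sum(len(A) for A in family) == len(set(X))

print(is_partition({1, 2, 3, 4}, [{1, 2}, {3}, {4}]))   # True
print(is_partition({1, 2, 3, 4}, [{1, 2}, {2, 3, 4}]))  # False: 2 appears twice
```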

Jane Hillston
Jane Elizabeth Hillston (born 1963) is a British professor of Quantitative Modelling and Head of School in the School of Informatics, University of Edinburgh, Scotland.

Early life and education

Hillston received a BA in Mathematics from the University of York in 1985, an MSc in Mathematics from Lehigh University in the United States in 1987 and a PhD in Computer Science from the University of Edinburgh in 1994, where she has spent her subsequent academic career. Her PhD thesis was awarded the BCS/CPHC Distinguished Dissertation Award in 1995 and has been published by Cambridge University Press.

Research and career

She has been an EPSRC Research Fellow (1994–1995), Lecturer (1995–2001), Reader (2001–2006) and Professor of Quantitative Modelling since 2006. Hillston is a member of the Laboratory for Foundations of Computer Science at Edinburgh. In 2018 she was appointed Head of the School of Informatics at Edinburgh, taking over from Johanna Moore. Jane Hillston is ...

Stochastic Matrix
In mathematics, a stochastic matrix is a square matrix used to describe the transitions of a Markov chain. Each of its entries is a nonnegative real number representing a probability. It is also called a probability matrix, transition matrix, substitution matrix, or Markov matrix. The stochastic matrix was first developed by Andrey Markov at the beginning of the 20th century, and has found use throughout a wide variety of scientific fields, including probability theory, statistics, mathematical finance and linear algebra, as well as computer science and population genetics. There are several different definitions and types of stochastic matrices:
*A right stochastic matrix is a real square matrix, with each row summing to 1.
*A left stochastic matrix is a real square matrix, with each column summing to 1.
*A doubly stochastic matrix is a square matrix of nonnegative real numbers with each row and column summing to 1.
In the same vein, one may define a stochastic vector (also ...
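A small sketch of these definitions: classify a matrix as right, left, or doubly stochastic by checking nonnegativity and whether its rows and/or columns sum to 1. The function name and the numerical tolerance are illustrative choices.

```python
# Sketch: classify a matrix against the stochastic-matrix definitions above.
import numpy as np

def stochastic_type(P, tol=1e-9):
    P = np.asarray(P, dtype=float)
    if P.shape[0] != P.shape[1] or (P < -tol).any():
        return "not stochastic"                        # must be square with nonnegative entries
    rows = np.allclose(P.sum(axis=1), 1.0, atol=tol)   # right: every row sums to 1
    cols = np.allclose(P.sum(axis=0), 1.0, atol=tol)   # left: every column sums to 1
    if rows and cols:
        return "doubly stochastic"
    if rows:
        return "right stochastic"
    if cols:
        return "left stochastic"
    return "not stochastic"

print(stochastic_type([[0.9, 0.1], [0.5, 0.5]]))   # right stochastic
print(stochastic_type([[0.5, 0.5], [0.5, 0.5]]))   # doubly stochastic
```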

Peter Harrison (computer Scientist)
Peter George Harrison (born 1951) is an Emeritus Professor of Computing Science at Imperial College London, known for the reversed compound agent theorem (RCAT), which gives conditions for a stochastic network to have a product-form solution. Harrison attended Christ's College, Cambridge, where he was a Wrangler in Mathematics (1972) and gained a Distinction in Part III of the Mathematical Tripos (1973), winning the Mayhew Prize for Applied Mathematics. After spending two years in industry, Harrison moved to Imperial College London, where he has worked ever since, obtaining his Ph.D. in Computing Science in 1979 with a thesis titled "Representative queueing network models of computer systems in terms of time delay probability distributions" and lecturing since 1983. Current research interests include parallel algorithms, performance engineering, queueing theory, stochastic models and stochastic process algebra, particularly the application of RCAT to find product-form solutions. Harriso ...

Michael N
Michael may refer to: People * Michael (given name), a given name * Michael (surname), including a list of people with the surname Michael Given name "Michael" * Michael (archangel), ''first'' of God's archangels in the Jewish, Christian and Islamic religions * Michael (bishop elect), English 13th-century Bishop of Hereford elect * Michael (Khoroshy) (1885–1977), cleric of the Ukrainian Orthodox Church of Canada * Michael Donnellan (1915–1985), Irish-born London fashion designer, often referred to simply as "Michael" * Michael (footballer, born 1982), Brazilian footballer * Michael (footballer, born 1983), Brazilian footballer * Michael (footballer, born 1993), Brazilian footballer * Michael (footballer, born February 1996), Brazilian footballer * Michael (footballer, born March 1996), Brazilian footballer * Michael (footballer, born 1999), Brazilian footballer Rulers: Byzantine emperors * Michael I Rangabe (d. 844), married the d ...

Performance Evaluation
A performance appraisal, also referred to as a performance review, performance evaluation (Muchinsky, P. M. (2012). ''Psychology Applied to Work'', 10th ed. Summerfield, NC: Hypergraphic Press), (career) development discussion, or employee appraisal, sometimes shortened to "PA", is a periodic and systematic process whereby the job performance of an employee is documented and evaluated. This is done after employees are trained about work and settle into their jobs. Performance appraisals are a part of career development and consist of regular reviews of employee performance within organizations. Performance appraisals are most often conducted by an employee's immediate manager or line manager. While extensively practiced, annual performance reviews have also been criticized as providing feedback too infrequently to be useful, and some critics argue that performance reviews in general do more harm than good. It is an element of the principal-agent framework that describes the re ...

Nearly Completely Decomposable Markov Chain
In probability theory, a nearly completely decomposable (NCD) Markov chain is a Markov chain where the state space can be partitioned in such a way that movement within a partition occurs much more frequently than movement between partitions. Particularly efficient algorithms exist to compute the stationary distribution of Markov chains with this property.

Definition

Ando and Fisher define a completely decomposable matrix as one where "an identical rearrangement of rows and columns leaves a set of square submatrices on the principal diagonal and zeros everywhere else." A nearly completely decomposable matrix is one where an identical rearrangement of rows and columns leaves a set of square submatrices on the principal diagonal and ''small nonzeros'' everywhere else.

Example

A Markov chain ...
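A small sketch of the idea: build a transition matrix with two diagonal blocks that carry almost all of the probability mass and only small entries between blocks, then measure how little mass crosses between the blocks in one step. The matrix values, block layout and coupling parameter are illustrative assumptions, not from the source.

```python
# Sketch: a nearly completely decomposable transition matrix with blocks {0, 1} and {2, 3}.
import numpy as np

eps = 0.01  # small coupling between the blocks (assumption)
P = np.array([
    [0.69,    0.30,    eps / 2, eps / 2],
    [0.40,    0.59,    eps / 2, eps / 2],
    [eps / 2, eps / 2, 0.79,    0.20],
    [eps / 2, eps / 2, 0.25,    0.74],
])
assert np.allclose(P.sum(axis=1), 1.0)   # rows of a stochastic matrix sum to 1

blocks = [[0, 1], [2, 3]]
# Probability mass leaving each block in one step: small for an NCD chain.
for b in blocks:
    outside = [s for s in range(len(P)) if s not in b]
    print(b, "->", P[np.ix_(b, outside)].sum(axis=1))
```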