Conway–Maxwell–binomial Distribution
In probability theory and statistics, the Conway–Maxwell–binomial (CMB) distribution is a three-parameter discrete probability distribution that generalises the binomial distribution in an analogous manner to the way that the Conway–Maxwell–Poisson distribution generalises the Poisson distribution. The CMB distribution can be used to model both positive and negative association among the Bernoulli summands. The distribution was introduced by Shmueli et al. (2005) (Shmueli G., Minka T., Kadane J.B., Borle S., and Boatwright P.B., "A useful distribution for fitting discrete data: revival of the Conway–Maxwell–Poisson distribution," Journal of the Royal Statistical Society: Series C (Applied Statistics) 54.1 (2005): 127–142), and the name Conway–Maxwell–binomial distribution was introduced independently by Kadane (2016) (Kadane, J.B., "Sums of Possibly Associated Bernoulli Variables: The Conway–Maxwell–Binomial Distribution," Bayesian Analysis 11 (2016): ...)
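The excerpt above does not reproduce the probability mass function. In the parameterisation usually associated with Shmueli et al. (2005) and Kadane (2016), the mass function is taken proportional to C(n, k)^ν p^k (1 − p)^(n − k) for k = 0, …, n, with ν = 1 recovering the ordinary binomial distribution. The following Python sketch evaluates that pmf under this assumption; the function name is illustrative, not from any library.

    from math import comb

    def cmb_pmf(k, n, p, nu):
        """Mass at k for a CMB(n, p, nu) variable, assuming the usual form
        P(X = k) proportional to C(n, k)**nu * p**k * (1 - p)**(n - k)."""
        weights = [comb(n, j) ** nu * p ** j * (1 - p) ** (n - j) for j in range(n + 1)]
        return weights[k] / sum(weights)

    # nu = 1 recovers the ordinary binomial probability
    print(cmb_pmf(2, 5, 0.3, 1.0))   # equals comb(5, 2) * 0.3**2 * 0.7**3
    print(cmb_pmf(2, 5, 0.3, 2.0))   # nu > 1: mass is more concentrated than binomial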

Probability Theory
Probability theory is the branch of mathematics concerned with probability. Although there are several different probability interpretations, probability theory treats the concept in a rigorous mathematical manner by expressing it through a set of axioms. Typically these axioms formalise probability in terms of a probability space, which assigns a measure taking values between 0 and 1, termed the probability measure, to a set of outcomes called the sample space. Any specified subset of the sample space is called an event. Central subjects in probability theory include discrete and continuous random variables, probability distributions, and stochastic processes (which provide mathematical abstractions of non-deterministic or uncertain processes or measured quantities that may either be single occurrences or evolve over time in a random fashion). Although it is not possible to perfectly predict random events, much can be said about their behavior. Two major results in probability ...
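As a concrete illustration of these definitions (the die example is ours, not from the article), a finite probability space assigns a measure summing to 1 over a sample space, and an event is any subset:

    from fractions import Fraction

    # Sample space and a probability measure assigning each outcome a value in [0, 1].
    sample_space = {1, 2, 3, 4, 5, 6}
    P = {outcome: Fraction(1, 6) for outcome in sample_space}

    # An event is a subset of the sample space; its probability is the measure of that subset.
    event_even = {2, 4, 6}
    print(sum(P[o] for o in event_even))   # 1/2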

Probability Generating Function
In probability theory, the probability generating function of a discrete random variable is a power series representation (the generating function) of the probability mass function of the random variable. Probability generating functions are often employed for their succinct description of the sequence of probabilities Pr(X = i) in the probability mass function for a random variable X, and to make available the well-developed theory of power series with non-negative coefficients. Definition (univariate case): if X is a discrete random variable taking values in the non-negative integers {0, 1, ...}, then the probability generating function of X is defined as (http://www.am.qub.ac.uk/users/g.gribakin/sor/Chap3.pdf) G(z) = \operatorname{E}(z^X) = \sum_{x=0}^{\infty} p(x) z^x, where p is the probability mass function of X. Note that the subscripted notations G_X and p_X are often used to emphasize that these pertain to a particular random variable X, and to its distr ...
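A short numerical check of the definition (example values are ours): evaluating G(z) = Σ p(x) z^x for a Binomial(4, 0.5) variable and comparing with its known closed-form PGF (1 − p + pz)^n.

    from math import comb

    def pgf(pmf, z):
        """Evaluate G(z) = sum over x of p(x) * z**x for a pmf given as {x: p(x)}."""
        return sum(px * z ** x for x, px in pmf.items())

    n, p = 4, 0.5
    pmf = {x: comb(n, x) * p ** x * (1 - p) ** (n - x) for x in range(n + 1)}

    print(pgf(pmf, 1.0))           # always 1: the probabilities sum to one
    print(pgf(pmf, 0.5))           # matches the closed form below at z = 0.5
    print((1 - p + p * 0.5) ** n)  # closed-form PGF of the binomial, for comparison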

Joint Distribution
Given two random variables that are defined on the same probability space, the joint probability distribution is the corresponding probability distribution on all possible pairs of outputs. The joint distribution can just as well be considered for any given number of random variables. The joint distribution encodes the marginal distributions, i.e. the distributions of each of the individual random variables. It also encodes the conditional probability distributions, which deal with how the outputs of one random variable are distributed when given information on the outputs of the other random variable(s). In the formal mathematical setup of measure theory, the joint distribution is given by the pushforward measure, by the map obtained by pairing together the given random variables, of the sample space's probability measure. In the case of real-valued random variables, the joint distribution, as a particular multivariate distribution, may be expressed by a multivariate cumulativ ...
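A minimal worked example of these notions (the joint table is invented for illustration): a joint pmf for two binary variables, from which the marginal and conditional distributions are recovered.

    # Joint pmf of two binary random variables X and Y, given as {(x, y): probability}.
    joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

    # Marginal distribution of X: sum the joint probabilities over y.
    marginal_x = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}

    # Conditional distribution of Y given X = 1.
    cond_y_given_x1 = {y: joint[(1, y)] / marginal_x[1] for y in (0, 1)}

    print(marginal_x)        # {0: 0.5, 1: 0.5}
    print(cond_y_given_x1)   # {0: 0.2, 1: 0.8}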

Convergence Of Random Variables
In probability theory, there exist several different notions of convergence of random variables. The convergence of sequences of random variables to some limit random variable is an important concept in probability theory, and its applications to statistics and stochastic processes. The same concepts are known in more general mathematics as stochastic convergence and they formalize the idea that a sequence of essentially random or unpredictable events can sometimes be expected to settle down into a behavior that is essentially unchanging when items far enough into the sequence are studied. The different possible notions of convergence relate to how such a behavior can be characterized: two readily understood behaviors are that the sequence eventually takes a constant value, and that values in the sequence continue to change but can be described by an unchanging probability distribution. Background "Stochastic convergence" formalizes the idea that a sequence of essentially rando ...
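The "settling down" behaviour described above can be seen in a quick simulation (a sketch of ours, illustrating the law of large numbers rather than any one formal mode of convergence): running means of fair coin flips approach 0.5.

    import random

    random.seed(0)
    flips = [random.randint(0, 1) for _ in range(100_000)]
    for n in (10, 100, 1_000, 10_000, 100_000):
        print(n, sum(flips[:n]) / n)   # the running means settle near 0.5 as n grows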

Integer
An integer is the number zero (0), a positive natural number (1, 2, 3, etc.) or a negative integer with a minus sign (−1, −2, −3, etc.). The negative numbers are the additive inverses of the corresponding positive numbers. In the language of mathematics, the set of integers is often denoted by the boldface Z or blackboard bold \mathbb{Z}. The set of natural numbers \mathbb{N} is a subset of \mathbb{Z}, which in turn is a subset of the set of all rational numbers \mathbb{Q}, itself a subset of the real numbers \mathbb{R}. Like the natural numbers, \mathbb{Z} is countably infinite. An integer may be regarded as a real number that can be written without a fractional component. For example, 21, 4, 0, and −2048 are integers, while numbers such as 9.75 are not. The integers form the smallest group and the smallest ring containing the natural numbers. In algebraic number theory, the integers are sometimes qualified as rational integers to distinguish them from the more general algebraic integers ...
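A small, incidental illustration of the "real number without a fractional component" characterisation (the check below is ours):

    # An integer, viewed as a real number, has no fractional component.
    for value in (21.0, 4.0, 0.0, -2048.0, 9.75):
        print(value, value.is_integer())   # True for the first four, False for 9.75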

Mode (statistics)
The mode is the value that appears most often in a set of data values. If X is a discrete random variable, the mode is the value x at which the probability mass function takes its maximum value. In other words, it is the value that is most likely to be sampled. Like the statistical mean and median, the mode is a way of expressing, in a (usually) single number, important information about a random variable or a population. The numerical value of the mode is the same as that of the mean and median in a normal distribution, and it may be very different in highly skewed distributions. The mode is not necessarily unique to a given discrete distribution, since the probability mass function may take the same maximum value at several points x1, x2, etc. The most extreme case occurs in uniform distributions, where all values occur equally frequently. When the probability density function of a continuous distribution has multiple local maxima it is common to refer to all of the local ...
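A short sketch of both senses of the mode described above (the data and pmf are made up for illustration): the most frequent value in a data set, and the maximiser(s) of a probability mass function.

    from collections import Counter

    # Mode of a data set: the value that appears most often.
    data = [2, 3, 3, 5, 3, 2, 7]
    print(Counter(data).most_common(1)[0][0])   # 3

    # Mode of a discrete pmf: the point(s) where the mass function is maximal.
    pmf = {0: 0.1, 1: 0.4, 2: 0.4, 3: 0.1}
    peak = max(pmf.values())
    print([x for x, p in pmf.items() if p == peak])   # [1, 2] -- the mode need not be unique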

Falling Factorial
In mathematics, the falling factorial (sometimes called the descending factorial, falling sequential product, or lower factorial) is defined as the polynomial (x)_n = x^{\underline{n}} = x(x-1)(x-2)\cdots(x-n+1) = \prod_{k=1}^{n}(x-k+1) = \prod_{k=0}^{n-1}(x-k). The rising factorial (sometimes called the Pochhammer function, Pochhammer polynomial, ascending factorial, rising sequential product, or upper factorial) is defined as x^{(n)} = x^{\overline{n}} = x(x+1)(x+2)\cdots(x+n-1) = \prod_{k=1}^{n}(x+k-1) = \prod_{k=0}^{n-1}(x+k). The value of each is taken to be 1 (an empty product) when n = 0. These symbols are collectively called factorial powers. The Pochhammer symbol, introduced by Leo August Pochhammer, is the notation (x)_n, where n is a non-negative integer. It may represent either the rising or the falling factorial, with different articles and authors using different conventions. Pochhammer himself actually used the notation with yet another meaning, namely to d ...
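The two products above translate directly into code (a small sketch of ours):

    def falling_factorial(x, n):
        """(x)_n = x * (x - 1) * ... * (x - n + 1); the empty product 1 when n == 0."""
        result = 1
        for k in range(n):
            result *= x - k
        return result

    def rising_factorial(x, n):
        """x^(n) = x * (x + 1) * ... * (x + n - 1); the empty product 1 when n == 0."""
        result = 1
        for k in range(n):
            result *= x + k
        return result

    print(falling_factorial(6, 3))  # 6 * 5 * 4 = 120
    print(rising_factorial(6, 3))   # 6 * 7 * 8 = 336
    print(falling_factorial(6, 0))  # 1, the empty product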

Formula
In science, a formula is a concise way of expressing information symbolically, as in a mathematical formula or a ''chemical formula''. The informal use of the term ''formula'' in science refers to the general construct of a relationship between given quantities. The plural of ''formula'' can be either ''formulas'' (from the most common English plural noun form) or, under the influence of scientific Latin, ''formulae'' (from the original Latin). In mathematics In mathematics, a formula generally refers to an identity which equates one mathematical expression to another, with the most important ones being mathematical theorems. Syntactically, a formula (often referred to as a ''well-formed formula'') is an entity which is constructed using the symbols and formation rules of a given logical language. For example, determining the volume of a sphere requires a significant amount of integral calculus or its geometrical analogue, the method of exhaustion. However, having done t ...

Moment (mathematics)
In mathematics, the moments of a function are certain quantitative measures related to the shape of the function's graph. If the function represents mass density, then the zeroth moment is the total mass, the first moment (normalized by total mass) is the center of mass, and the second moment is the moment of inertia. If the function is a probability distribution, then the first moment is the expected value, the second central moment is the variance, the third standardized moment is the skewness, and the fourth standardized moment is the kurtosis. The mathematical concept is closely related to the concept of moment in physics. For a distribution of mass or probability on a bounded interval, the collection of all the moments (of all orders, from 0 to ∞) uniquely determines the distribution (Hausdorff moment problem). The same is not true on unbounded intervals (Hamburger moment problem). In the mid-nineteenth century, Pafnuty Chebyshev became the first person to think systematic ...
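The quantities named above can be computed directly for a discrete distribution (the pmf below is invented for illustration): mean as the first raw moment, variance as the second central moment, and skewness and kurtosis as standardized moments.

    pmf = {0: 0.1, 1: 0.2, 2: 0.4, 3: 0.2, 4: 0.1}

    mean = sum(x * p for x, p in pmf.items())                                # first raw moment
    var = sum((x - mean) ** 2 * p for x, p in pmf.items())                   # second central moment
    skew = sum(((x - mean) / var ** 0.5) ** 3 * p for x, p in pmf.items())   # third standardized moment
    kurt = sum(((x - mean) / var ** 0.5) ** 4 * p for x, p in pmf.items())   # fourth standardized moment

    print(mean, var, skew, kurt)   # the symmetric pmf gives skewness 0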

Characteristic Function
In mathematics, the term "characteristic function" can refer to any of several distinct concepts:
* The indicator function of a subset, that is the function \mathbf{1}_A\colon X \to \{0, 1\}, which for a given subset A of X, has value 1 at points of A and 0 at points of X − A.
* There is an indicator function for affine varieties over a finite field: given a finite set of functions f_\alpha \in \mathbb{F}_q[x_1,\ldots,x_n], let V = \{x : f_\alpha(x) = 0 \text{ for all } \alpha\} be their vanishing locus. Then the function P(x) = \prod_\alpha\left(1 - f_\alpha(x)^{q-1}\right) acts as an indicator function for V. If x \in V then P(x) = 1; otherwise, for some f_\alpha, we have f_\alpha(x) \neq 0, which implies that f_\alpha(x)^{q-1} = 1, hence P(x) = 0.
* The characteristic function in convex analysis, closely related to the indicator function of a set: \chi_A(x) := \begin{cases} 0, & x \in A; \\ +\infty, & x \not\in A. \end{cases}
* In probability theory, the characteristic function of any probability distribution on the ...
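A minimal sketch of the first meaning in the list above, the indicator function of a subset (the sets used are ours):

    # Indicator function of a subset A of X: 1 on A, 0 on X - A.
    def indicator(A):
        return lambda x: 1 if x in A else 0

    X = range(10)
    A = {2, 3, 5, 7}
    one_A = indicator(A)
    print([one_A(x) for x in X])   # [0, 0, 1, 1, 0, 1, 0, 1, 0, 0]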

Moment Generating Function
In probability theory and statistics, the moment-generating function of a real-valued random variable is an alternative specification of its probability distribution. Thus, it provides the basis of an alternative route to analytical results compared with working directly with probability density functions or cumulative distribution functions. There are particularly simple results for the moment-generating functions of distributions defined by the weighted sums of random variables. However, not all random variables have moment-generating functions. As its name implies, the moment-generating function can be used to compute a distribution’s moments: the ''n''th moment about 0 is the ''n''th derivative of the moment-generating function, evaluated at 0. In addition to real-valued distributions (univariate distributions), moment-generating functions can be defined for vector- or matrix-valued random variables, and can even be extended to more general cases. The moment-generating func ...
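A small symbolic check of the statement that the nth moment about 0 is the nth derivative of the moment-generating function at 0 (an illustrative sketch of ours, using the Bernoulli(p) MGF M(t) = 1 − p + p e^t):

    import sympy as sp

    t, p = sp.symbols('t p')
    M = 1 - p + p * sp.exp(t)          # MGF of a Bernoulli(p) random variable

    # The n-th moment about 0 is the n-th derivative of M evaluated at t = 0.
    for n in range(1, 4):
        moment = sp.diff(M, t, n).subs(t, 0)
        print(n, sp.simplify(moment))  # every raw moment of a Bernoulli(p) equals p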

Exchangeable Random Variables
In statistics, an exchangeable sequence of random variables (also sometimes interchangeable) is a sequence ''X''1, ''X''2, ''X''3, ... (which may be finitely or infinitely long) whose joint probability distribution does not change when the positions in the sequence in which finitely many of them appear are altered. Thus, for example the sequences : X_1, X_2, X_3, X_4, X_5, X_6 \quad \text \quad X_3, X_6, X_1, X_5, X_2, X_4 both have the same joint probability distribution. It is closely related to the use of independent and identically distributed random variables in statistical models. Exchangeable sequences of random variables arise in cases of simple random sampling. Definition Formally, an exchangeable sequence of random variables is a finite or infinite sequence ''X''1, ''X''2, ''X''3, ... of random variables such that for any finite permutation σ of the indices 1, 2, 3, ..., (the permutation acts on only finitely many indices, with the res ...
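A small illustration of the definition (the urn example is ours): draws without replacement from an urn are exchangeable, since every ordering of the same outcomes has the same probability, yet the draws are not independent.

    from itertools import permutations

    # Urn with two red balls (1) and one blue ball (0), drawn without replacement.
    urn = [1, 1, 0]
    orderings = list(permutations(urn))   # each of the 3! orderings is equally likely

    def prob(pattern):
        return sum(1 for o in orderings if o[:len(pattern)] == pattern) / len(orderings)

    print(prob((1, 0)), prob((0, 1)))     # equal (1/3 each): the joint law is permutation-invariant
    print(prob((1,)) * prob((0,)))        # 2/9, not 1/3, so the draws are not independent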