Conway–Maxwell–Poisson Distribution
In probability theory and statistics, the Conway–Maxwell–Poisson (CMP or COM–Poisson) distribution is a discrete probability distribution named after Richard W. Conway, William L. Maxwell, and Siméon Denis Poisson that generalizes the Poisson distribution by adding a parameter to model overdispersion and underdispersion. It is a member of the exponential family, has the Poisson distribution and geometric distribution as special cases, and has the Bernoulli distribution as a limiting case.

Background
The CMP distribution was originally proposed by Conway and Maxwell in 1962 to handle queueing systems with state-dependent service rates. It was introduced into the statistics literature by Boatwright et al. (Boatwright, P., Borle, S., and Kadane, J.B. "A model of the joint distribution of purchase quantity and timing." Journal of the American Statistical Association 98 (2003): 564–572) and Shmueli et al. (2005). ...
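The excerpt stops before the PMF itself. For reference, the CMP probability mass function is Pr(X = x) = \lambda^x / \left((x!)^\nu Z(\lambda, \nu)\right) with normalizing constant Z(\lambda, \nu) = \sum_{j \ge 0} \lambda^j / (j!)^\nu, which has no closed form in general. A minimal Python sketch (the truncation length `terms` is an illustrative choice, not part of the definition):

    import math

    def cmp_pmf(x, lam, nu, terms=200):
        # Work in log space: (j!)**nu overflows floating point quickly for large j.
        logw = [j * math.log(lam) - nu * math.lgamma(j + 1) for j in range(terms)]
        m = max(logw)
        logz = m + math.log(sum(math.exp(w - m) for w in logw))  # log Z(lam, nu)
        return math.exp(x * math.log(lam) - nu * math.lgamma(x + 1) - logz)

    # nu = 1 recovers the Poisson distribution:
    print(cmp_pmf(2, 3.0, 1.0))                         # 0.22404...
    print(math.exp(-3.0) * 3.0**2 / math.factorial(2))  # same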




CMP PMF
CMP may refer to:

Arts and entertainment
* Cancionero Musical de Palacio, a Spanish manuscript of Renaissance music
* ''Chipmunk Punk'', a 1980 album by The Chipmunks
* CM Punk (born 1978), American wrestler

Computing
* cmp (Unix), a file-comparison utility
* Certificate Management Protocol, an Internet protocol
* Chip multiprocessor, a CPU die type
* Cloud management platform
* Consent management provider, for HTTP cookies

Government, law and politics
* Câmara Municipal do Porto (Porto Municipal Chamber), Portugal
* Center for Medical Progress, United States
* Citizens' Municipal Party, Australia (1935–1976)
* Closed material procedures, in UK law courts
* Comparative Manifestos Project, in political science
* Congestion management program, California, US
* Chattogram Metropolitan Police, Bangladesh
* Republican Nation Party (Cumhuriyetçi Millet Partisi), Turkey (1954–1958)

Medicine
* Cardiomyopathy, a heart muscle disease
* Chondromalacia patellae, a degenerative condi ...


Journal Of The American Statistical Association
The ''Journal of the American Statistical Association'' is a quarterly peer-reviewed scientific journal published by Taylor & Francis on behalf of the American Statistical Association. It covers work primarily focused on the application of statistics, statistical theory, and methods in the economic, social, physical, engineering, and health sciences. The journal also includes reviews of books relevant to the field. It was established in 1888 as the ''Publications of the American Statistical Association'', renamed ''Quarterly Publications of the American Statistical Association'' in 1912, and obtained its current title in 1922.

Reception
According to the ''Journal Citation Reports'', the journal has a 2023 impac ...


Asymptotic Expansion
In mathematics, an asymptotic expansion, asymptotic series or Poincaré expansion (after Henri Poincaré) is a formal series of functions with the property that truncating the series after a finite number of terms provides an approximation to a given function as the argument of the function tends towards a particular, often infinite, point. Investigations by Dingle (1973) revealed that the divergent part of an asymptotic expansion is latently meaningful, i.e. contains information about the exact value of the expanded function. The theory of asymptotic series was created by Poincaré (and independently by Stieltjes) in 1886. The most common type of asymptotic expansion is a power series in either positive or negative powers. Methods of generating such expansions include the Euler–Maclaurin summation formula and integral transforms such as the Laplace and Mellin transforms. Repeated integration by parts will often lead to an asymptotic expansion. Since a ''convergent'' Taylor s ...
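As a concrete instance of the integration-by-parts route mentioned above, the exponential integral is the textbook example (a sketch of the standard derivation, not part of the excerpt). Integrating by parts N times gives

    \int_x^\infty \frac{e^{-t}}{t}\,dt
      = e^{-x} \sum_{n=0}^{N-1} \frac{(-1)^n\, n!}{x^{n+1}} + R_N(x),
    \qquad
    R_N(x) = (-1)^N N! \int_x^\infty \frac{e^{-t}}{t^{N+1}}\,dt,

so |R_N(x)| \le N!\, e^{-x} / x^{N+1}. For fixed N the remainder is negligible relative to the last retained term as x \to \infty, even though the series diverges for every fixed x as N \to \infty: exactly the defining behavior of an asymptotic expansion.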


Modified Bessel Function
Bessel functions, named after Friedrich Bessel, who was the first to study them systematically in 1824, are canonical solutions of Bessel's differential equation x^2 \frac{d^2 y}{dx^2} + x \frac{dy}{dx} + \left(x^2 - \alpha^2\right) y = 0 for an arbitrary complex number \alpha, which represents the ''order'' of the Bessel function. Although \alpha and -\alpha produce the same differential equation, it is conventional to define different Bessel functions for these two values in such a way that the Bessel functions are mostly smooth functions of \alpha. The most important cases are when \alpha is an integer or half-integer. Bessel functions for integer \alpha are also known as cylinder functions or the cylindrical harmonics because they appear in the solution to Laplace's equation in cylindrical coordinates. Spherical Bessel functions with half-integer \alpha are obtained when solving the Helmholtz equation in spherical coordinates.

Applications
Bessel's equation arises ...
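Modified Bessel functions connect directly to the CMP distribution: for \nu = 2 the normalizing constant reduces to Z(\lambda, 2) = \sum_{j \ge 0} \lambda^j / (j!)^2 = I_0(2\sqrt{\lambda}), where I_0 is the modified Bessel function of the first kind. A quick numerical check in Python (scipy.special.iv evaluates I_\alpha; the value of lam is arbitrary):

    import math
    from scipy.special import iv  # modified Bessel function of the first kind

    lam = 1.7
    z_series = sum(lam**j / math.factorial(j)**2 for j in range(50))
    z_bessel = iv(0, 2 * math.sqrt(lam))
    print(z_series, z_bessel)  # agree to machine precision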


Cumulants
In probability theory and statistics, the cumulants of a probability distribution are a set of quantities that provide an alternative to the ''moments'' of the distribution. Any two probability distributions whose moments are identical will have identical cumulants as well, and vice versa. The first cumulant is the mean, the second cumulant is the variance, and the third cumulant is the same as the third central moment. But fourth and higher-order cumulants are not equal to central moments. In some cases theoretical treatments of problems in terms of cumulants are simpler than those using moments. In particular, when two or more random variables are statistically independent, the ''n''th-order cumulant of their sum is equal to the sum of their ''n''th-order cumulants. As well, the third and higher-order cumulants of a normal distribution are zero, and it is the only distribution with this property. Just as for moments, where ''joint moments'' are used for collections of random variables ...
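The first three cumulants follow from raw moments via \kappa_1 = m_1, \kappa_2 = m_2 - m_1^2, \kappa_3 = m_3 - 3 m_1 m_2 + 2 m_1^3. A small sketch that estimates them from a Poisson sample, for which every cumulant equals \lambda (sample size and seed are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.poisson(lam=4.0, size=1_000_000).astype(float)

    m1, m2, m3 = x.mean(), (x**2).mean(), (x**3).mean()
    k1 = m1                            # mean
    k2 = m2 - m1**2                    # variance
    k3 = m3 - 3 * m1 * m2 + 2 * m1**3  # third central moment
    print(k1, k2, k3)  # all approx 4.0: every cumulant of Poisson(lam) is lam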


Variance
In probability theory and statistics, variance is the expected value of the squared deviation from the mean of a random variable. The standard deviation (SD) is obtained as the square root of the variance. Variance is a measure of dispersion, meaning it is a measure of how far a set of numbers is spread out from their average value. It is the second central moment of a distribution, and the covariance of the random variable with itself, and it is often represented by \sigma^2, s^2, \operatorname{Var}(X), V(X), or \mathbb{V}(X). An advantage of variance as a measure of dispersion is that it is more amenable to algebraic manipulation than other measures of dispersion such as the expected absolute deviation; for example, the variance of a sum of uncorrelated random variables is equal to the sum of their variances. A disadvantage of the variance for practical applications is that, unlike the standard deviation, its units differ from the random variable, which is why the standard devi ...
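The additivity property for uncorrelated variables is easy to check numerically; a minimal sketch (the two distributions and the sample size are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(1)
    a = rng.exponential(scale=2.0, size=500_000)  # Var(a) = 4
    b = rng.normal(0.0, 3.0, size=500_000)        # Var(b) = 9
    print(np.var(a + b))                          # approx 13
    print(np.var(a) + np.var(b))                  # approx 13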


Mean
A mean is a quantity representing the "center" of a collection of numbers and is intermediate to the extreme values of the set of numbers. There are several kinds of means (or "measures of central tendency") in mathematics, especially in statistics. Each attempts to summarize or typify a given group of data, illustrating the magnitude and sign of the data set. Which of these measures is most illuminating depends on what is being measured, and on context and purpose. The ''arithmetic mean'', also known as the "arithmetic average", is the sum of the values divided by the number of values. The arithmetic mean of a set of numbers x_1, x_2, \dots, x_n is typically denoted using an overhead bar, \bar{x}. If the numbers are from observing a sample of a larger group, the arithmetic mean is termed the ''sample mean'' (\bar{x}) to distinguish it from the group mean (or expected value) of the underlying distribution, denoted \mu or \mu_x. Outside probability and statistics, a wide rang ...
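A quick illustration of three common kinds of means, using Python's standard library (the data values are arbitrary):

    import statistics

    data = [2.0, 8.0]
    print(statistics.mean(data))            # arithmetic: (2 + 8) / 2 = 5.0
    print(statistics.geometric_mean(data))  # sqrt(2 * 8) = 4.0
    print(statistics.harmonic_mean(data))   # 2 / (1/2 + 1/8) = 3.2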


Probability Generating Function
In probability theory, the probability generating function of a discrete random variable is a power series representation (the generating function) of the probability mass function of the random variable. Probability generating functions are often employed for their succinct description of the sequence of probabilities Pr(''X'' = ''i'') in the probability mass function for a random variable ''X'', and to make available the well-developed theory of power series with non-negative coefficients.

Definition
Univariate case: If ''X'' is a discrete random variable taking values ''x'' in the non-negative integers \{0, 1, 2, \dots\}, then the ''probability generating function'' of ''X'' is defined as G(z) = \operatorname{E}\left(z^X\right) = \sum_{x=0}^{\infty} p(x) z^x, where p is the probability mass function of X. Note that the subscripted notations G_X and p_X are often used to emphasize that these pertain to a particular random variable X, and to its distribution. The power series converges absolutely at least for all ...
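For example, the Poisson(\lambda) distribution has the closed-form PGF G(z) = e^{\lambda(z-1)}, which a direct summation of the defining series confirms (a sketch; lam and z are arbitrary test values):

    import math

    lam, z = 3.0, 0.6
    g_series = sum(math.exp(-lam) * lam**x / math.factorial(x) * z**x
                   for x in range(100))
    g_closed = math.exp(lam * (z - 1.0))
    print(g_series, g_closed)  # both 0.30119...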




Generalized Hypergeometric Function
In mathematics, a generalized hypergeometric series is a power series in which the ratio of successive coefficients indexed by ''n'' is a rational function of ''n''. The series, if convergent, defines a generalized hypergeometric function, which may then be defined over a wider domain of the argument by analytic continuation. The generalized hypergeometric series is sometimes just called the hypergeometric series, though this term also sometimes just refers to the Gaussian hypergeometric series. Generalized hypergeometric functions include the (Gaussian) hypergeometric function and the confluent hypergeometric function as special cases, which in turn have many particular special functions as special cases, such as elementary functions, Bessel functions, and the classical orthogonal polynomials.

Notation
A hypergeometric series is formally defined as a power series
:\beta_0 + \beta_1 z + \beta_2 z^2 + \dots = \sum_{n \ge 0} \beta_n z^n
in which the ratio of succe ...
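The coefficient-ratio characterization translates directly into code: each term is the previous one times a rational function of n. A sketch for the Gaussian case \,_2F_1(a, b; c; z), checked against the known identity \,_2F_1(1, 1; 2; z) = -\ln(1-z)/z (the truncation length is an illustrative choice):

    import math

    def hyp2f1(a, b, c, z, terms=200):
        # term ratio: t_{n+1} / t_n = (a + n)(b + n) / ((c + n)(1 + n)) * z
        total, term = 0.0, 1.0
        for n in range(terms):
            total += term
            term *= (a + n) * (b + n) / ((c + n) * (1 + n)) * z
        return total

    z = 0.5
    print(hyp2f1(1, 1, 2, z))    # 1.38629...
    print(-math.log(1 - z) / z)  # same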


Cumulative Distribution Function
In probability theory and statistics, the cumulative distribution function (CDF) of a real-valued random variable X, or just the distribution function of X, evaluated at x, is the probability that X will take a value less than or equal to x. Every probability distribution supported on the real numbers, whether discrete, continuous, or mixed, is uniquely identified by a right-continuous, monotone increasing function (a càdlàg function) F \colon \mathbb{R} \rightarrow [0,1] satisfying \lim_{x \to -\infty} F(x) = 0 and \lim_{x \to \infty} F(x) = 1. In the case of a scalar continuous distribution, it gives the area under the probability density function from negative infinity to x. Cumulative distribution functions are also used to specify the distribution of multivariate random variables.

Definition
The cumulative distribution function of a real-valued random variable X is the function given by F_X(x) = \operatorname{P}(X \le x), where the right-hand side represents the probability ...
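An empirical check of the definition: the fraction of a sample at or below t estimates F(t). For a standard normal variable, F is the Gaussian \Phi, expressible via the error function (sample size and t are arbitrary choices):

    import math
    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.normal(size=200_000)

    t = 0.5
    print(np.mean(x <= t))                         # empirical CDF at t
    print(0.5 * (1 + math.erf(t / math.sqrt(2))))  # exact Phi(0.5) = 0.6915...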


Normalization Constant
In probability theory, a normalizing constant or normalizing factor is used to reduce any probability function to a probability density function with total probability of one. For example, a Gaussian function can be normalized into a probability density function, which gives the standard normal distribution. In Bayes' theorem, a normalizing constant is used to ensure that the sum of the probabilities of all possible hypotheses equals 1. Other uses of normalizing constants include making the value of a Legendre polynomial at 1 equal to 1, and in the orthogonality of orthonormal functions. A similar concept has been used in areas other than probability, such as for polynomials.

Definition
In probability theory, a normalizing constant is a constant by which an everywhere non-negative function must be multiplied so the area under its graph is 1, e.g., to make it a probability density function or a probability mass function.

Examples
If we start from the simple Gaussian function p(x) = e^{-x^2/2}, \quad x \in (-\infty, \infty), ...
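Continuing that Gaussian example: the area under e^{-x^2/2} is \sqrt{2\pi}, so the normalizing constant is 1/\sqrt{2\pi}; this is the same role that Z(\lambda, \nu) plays for the CMP distribution above. A numerical sketch using quadrature:

    import math
    from scipy.integrate import quad

    area, _ = quad(lambda x: math.exp(-x**2 / 2), -math.inf, math.inf)
    print(area)                    # 2.50662...
    print(math.sqrt(2 * math.pi))  # same: normalizing constant is 1/sqrt(2*pi)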