In probability theory and statistics, the chi-squared distribution (also chi-square or \chi^2-distribution) with k degrees of freedom is the distribution of a sum of the squares of k independent standard normal random variables. The chi-squared distribution is a special case of the gamma distribution and is one of the most widely used probability distributions in inferential statistics, notably in hypothesis testing and in construction of confidence intervals. This distribution is sometimes called the central chi-squared distribution, a special case of the more general noncentral chi-squared distribution.

The chi-squared distribution is used in the common chi-squared tests for goodness of fit of an observed distribution to a theoretical one, for the independence of two criteria of classification of qualitative data, and in confidence interval estimation for a population standard deviation of a normal distribution from a sample standard deviation. Many other statistical tests also use this distribution, such as Friedman's analysis of variance by ranks.


Definitions

If Z_1, \ldots, Z_k are independent, standard normal random variables, then the sum of their squares,
: Q\ = \sum_{i=1}^k Z_i^2,
is distributed according to the chi-squared distribution with k degrees of freedom. This is usually denoted as
: Q\ \sim\ \chi^2(k)\ \ \text{or}\ \ Q\ \sim\ \chi^2_k.
The chi-squared distribution has one parameter: a positive integer k that specifies the number of degrees of freedom (the number of random variables being summed, ''Z''''i'' s).
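The definition can be checked directly by simulation. The following is a minimal sketch, assuming NumPy and SciPy are available; the chosen k and sample size are illustrative values, not taken from the text.

```python
# Sum the squares of k independent standard normals and compare the empirical
# distribution with the chi-squared CDF with k degrees of freedom.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
k, n_samples = 5, 100_000

# Q = Z_1^2 + ... + Z_k^2 for each Monte Carlo replication
z = rng.standard_normal((n_samples, k))
q = (z ** 2).sum(axis=1)

# Compare empirical and theoretical CDF values at a few points
for x in (1.0, 4.0, 9.0, 15.0):
    empirical = np.mean(q <= x)
    theoretical = stats.chi2.cdf(x, df=k)
    print(f"x={x:5.1f}  empirical={empirical:.4f}  chi2.cdf={theoretical:.4f}")
```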


Introduction

The chi-squared distribution is used primarily in hypothesis testing, and to a lesser extent for confidence intervals for population variance when the underlying distribution is normal. Unlike more widely known distributions such as the normal distribution and the exponential distribution, the chi-squared distribution is not as often applied in the direct modeling of natural phenomena. It arises in the following hypothesis tests, among others:
* Chi-squared test of independence in contingency tables
* Chi-squared test of goodness of fit of observed data to hypothetical distributions
* Likelihood-ratio test for nested models
* Log-rank test in survival analysis
* Cochran–Mantel–Haenszel test for stratified contingency tables
* Wald test
* Score test

It is also a component of the definition of the ''t''-distribution and the ''F''-distribution used in ''t''-tests, analysis of variance, and regression analysis.

The primary reason the chi-squared distribution is extensively used in hypothesis testing is its relationship to the normal distribution. Many hypothesis tests use a test statistic, such as the ''t''-statistic in a ''t''-test. For these hypothesis tests, as the sample size, n, increases, the sampling distribution of the test statistic approaches the normal distribution (central limit theorem). Because the test statistic (such as ''t'') is asymptotically normally distributed, provided the sample size is sufficiently large, the distribution used for hypothesis testing may be approximated by a normal distribution. Testing hypotheses using a normal distribution is well understood and relatively easy. The simplest chi-squared distribution is the square of a standard normal distribution. So wherever a normal distribution could be used for a hypothesis test, a chi-squared distribution could be used.

Suppose that Z is a random variable sampled from the standard normal distribution, where the mean is 0 and the variance is 1: Z \sim N(0,1). Now, consider the random variable Q = Z^2. The distribution of the random variable Q is an example of a chi-squared distribution: \ Q\ \sim\ \chi^2_1. The subscript 1 indicates that this particular chi-squared distribution is constructed from only 1 standard normal distribution. A chi-squared distribution constructed by squaring a single standard normal distribution is said to have 1 degree of freedom. Thus, as the sample size for a hypothesis test increases, the distribution of the test statistic approaches a normal distribution. Just as extreme values of the normal distribution have low probability (and give small ''p''-values), extreme values of the chi-squared distribution have low probability.

An additional reason that the chi-squared distribution is widely used is that it turns up as the large-sample distribution of generalized likelihood-ratio tests (LRT). LRTs have several desirable properties; in particular, simple LRTs commonly provide the highest power to reject the null hypothesis (Neyman–Pearson lemma), and this leads also to optimality properties of generalised LRTs. However, the normal and chi-squared approximations are only valid asymptotically. For this reason, it is preferable to use the ''t'' distribution rather than the normal approximation or the chi-squared approximation for a small sample size. Similarly, in analyses of contingency tables, the chi-squared approximation will be poor for a small sample size, and it is preferable to use Fisher's exact test. Ramsey shows that the exact binomial test is always more powerful than the normal approximation.

Lancaster shows the connections among the binomial, normal, and chi-squared distributions, as follows. De Moivre and Laplace established that a binomial distribution could be approximated by a normal distribution. Specifically they showed the asymptotic normality of the random variable
: \chi = \frac{m - Np}{\sqrt{Npq}}
where m is the observed number of successes in N trials, where the probability of success is p, and q = 1 - p. Squaring both sides of the equation gives
: \chi^2 = \frac{(m - Np)^2}{Npq}.
Using N = Np + N(1 - p), N = m + (N - m), and q = 1 - p, this equation can be rewritten as
: \chi^2 = \frac{(m - Np)^2}{Np} + \frac{(N - m - Nq)^2}{Nq}.
The expression on the right is of the form that Karl Pearson would generalize to the form
: \chi^2 = \sum_{i=1}^n \frac{(O_i - E_i)^2}{E_i}
where \chi^2 = Pearson's cumulative test statistic, which asymptotically approaches a \chi^2 distribution; O_i = the number of observations of type i; E_i = N p_i = the expected (theoretical) frequency of type i, asserted by the null hypothesis that the fraction of type i in the population is p_i; and n = the number of cells in the table.

In the case of a binomial outcome (flipping a coin), the binomial distribution may be approximated by a normal distribution (for sufficiently large n). Because the square of a standard normal distribution is the chi-squared distribution with one degree of freedom, the probability of a result such as 1 heads in 10 trials can be approximated either by using the normal distribution directly, or the chi-squared distribution for the normalised, squared difference between observed and expected value. However, many problems involve more than the two possible outcomes of a binomial, and instead require 3 or more categories, which leads to the multinomial distribution. Just as de Moivre and Laplace sought for and found the normal approximation to the binomial, Pearson sought for and found a degenerate multivariate normal approximation to the multinomial distribution (the numbers in each category add up to the total sample size, which is considered fixed). Pearson showed that the chi-squared distribution arose from such a multivariate normal approximation to the multinomial distribution, taking careful account of the statistical dependence (negative correlations) between numbers of observations in different categories.
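Pearson's statistic above translates directly into code. The following is a minimal sketch, assuming SciPy is available; the observed counts and null probabilities are made-up illustrative numbers, not data from the text.

```python
# Compute Pearson's chi-squared goodness-of-fit statistic and its p-value.
import numpy as np
from scipy import stats

observed = np.array([44, 56, 50, 50])          # O_i, hypothetical counts
p_null = np.array([0.25, 0.25, 0.25, 0.25])    # fractions asserted by H0
expected = p_null * observed.sum()             # E_i = N * p_i

chi2_stat = np.sum((observed - expected) ** 2 / expected)
df = len(observed) - 1                         # n cells minus 1 for the fixed total
p_value = stats.chi2.sf(chi2_stat, df)
print(chi2_stat, p_value)

# scipy.stats.chisquare computes the same statistic and p-value directly
print(stats.chisquare(observed, expected))
```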


Probability density function

The probability density function (pdf) of the chi-squared distribution is
: f(x;\,k) = \begin{cases} \dfrac{x^{k/2-1}\, e^{-x/2}}{2^{k/2}\, \Gamma\!\left(\frac{k}{2}\right)}, & x > 0; \\ 0, & \text{otherwise}, \end{cases}
where \Gamma(k/2) denotes the gamma function, which has closed-form values for integer k. For derivations of the pdf in the cases of one, two and k degrees of freedom, see Proofs related to chi-squared distribution.


Cumulative distribution function

Its cumulative distribution function is:
: F(x;\,k) = \frac{\gamma\!\left(\frac{k}{2},\,\frac{x}{2}\right)}{\Gamma\!\left(\frac{k}{2}\right)} = P\left(\frac{k}{2},\,\frac{x}{2}\right),
where \gamma(s,t) is the lower incomplete gamma function and P(s,t) is the regularized gamma function. In the special case of k = 2 this function has the simple form:
: F(x;\,2) = 1 - e^{-x/2},
which can be easily derived by integrating f(x;\,2)=\frac{1}{2}e^{-x/2} directly. The integer recurrence of the gamma function makes it easy to compute F(x;\,k) for other small, even k. Tables of the chi-squared cumulative distribution function are widely available and the function is included in many spreadsheets and all statistical packages.

Letting z \equiv x/k, Chernoff bounds on the lower and upper tails of the CDF may be obtained. For the cases when 0 < z < 1 (which include all of the cases when this CDF is less than half):
: F(z k;\,k) \leq (z e^{1-z})^{k/2}.
The tail bound for the cases when z > 1, similarly, is
: 1-F(z k;\,k) \leq (z e^{1-z})^{k/2}.
For another approximation for the CDF modeled after the cube of a Gaussian, see under Noncentral chi-squared distribution.
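The upper-tail Chernoff bound can be compared against the exact tail. A minimal sketch, assuming SciPy is available; the values of k and z are illustrative.

```python
# Compare the exact chi-squared upper tail with the bound (z * e^(1-z))^(k/2), z = x/k > 1.
import numpy as np
from scipy import stats

k = 10
for z in (1.5, 2.0, 3.0):
    x = z * k
    exact_tail = stats.chi2.sf(x, df=k)          # 1 - F(zk; k)
    bound = (z * np.exp(1.0 - z)) ** (k / 2.0)   # Chernoff-type bound
    print(f"z={z}  exact={exact_tail:.3e}  bound={bound:.3e}")
```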


Properties


Cochran's theorem

If Z_1, \ldots, Z_k are independent, identically distributed (i.i.d.), standard normal random variables, then
: \sum_{i=1}^k (Z_i - \overline{Z})^2 \sim \chi^2_{k-1}
where \overline{Z} = \frac{1}{k} \sum_{i=1}^k Z_i.
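This loss of one degree of freedom can be observed numerically. A minimal sketch, assuming NumPy and SciPy; k and the replication count are illustrative.

```python
# Sum of squared deviations from the sample mean of k i.i.d. standard normals
# should match chi-squared with k-1 degrees of freedom in mean and variance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
k, n_rep = 8, 200_000
z = rng.standard_normal((n_rep, k))
s = ((z - z.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)

print("empirical mean/var:", s.mean(), s.var())
print("chi2(k-1) mean/var:", stats.chi2.mean(df=k - 1), stats.chi2.var(df=k - 1))
```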


Additivity

It follows from the definition of the chi-squared distribution that the sum of independent chi-squared variables is also chi-squared distributed. Specifically, if X_i, i=1,\ldots,n, are independent chi-squared variables with k_i, i=1,\ldots,n, degrees of freedom, respectively, then Y = X_1 + \cdots + X_n is chi-squared distributed with k_1 + \cdots + k_n degrees of freedom.


Sample mean

The sample mean of n i.i.d. chi-squared variables of degree k is distributed according to a gamma distribution with shape \alpha and scale \theta parameters:
: \overline X = \frac{1}{n} \sum_{i=1}^n X_i \sim \operatorname{Gamma}\left(\alpha=n\, k /2,\ \theta= 2/n \right) \qquad \text{where } X_i \sim \chi^2(k).
Asymptotically, given that for a shape parameter \alpha going to infinity a gamma distribution converges towards a normal distribution with expectation \mu = \alpha\cdot \theta and variance \sigma^2 = \alpha\, \theta^2, the sample mean converges towards
: \overline X \xrightarrow{n \to \infty} N(\mu = k, \sigma^2 = 2\, k /n ).
Note that we would have obtained the same result by invoking instead the central limit theorem, noting that for each chi-squared variable of degree k the expectation is k and its variance is 2\,k (and hence the variance of the sample mean \overline X is \sigma^2 = \frac{2k}{n}).
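The gamma form of the sample mean can be verified empirically. A minimal sketch, assuming NumPy and SciPy; k, n, and the number of replications are illustrative.

```python
# The mean of n i.i.d. chi-squared(k) variables should follow Gamma(shape=n*k/2, scale=2/n).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
k, n, n_rep = 4, 10, 100_000
x_bar = rng.chisquare(df=k, size=(n_rep, n)).mean(axis=1)

gamma_ref = stats.gamma(a=n * k / 2.0, scale=2.0 / n)
for q in (0.1, 0.5, 0.9):
    print(q, np.quantile(x_bar, q), gamma_ref.ppf(q))
```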


Entropy

The differential entropy is given by
: h = -\int_0^\infty f(x;\,k)\ln f(x;\,k) \, dx = \frac k 2 + \ln \left[2\,\Gamma \left(\frac k 2 \right)\right] + \left(1-\frac k 2 \right)\, \psi\!\left(\frac k 2 \right),
where \psi(x) is the digamma function. The chi-squared distribution is the maximum entropy probability distribution for a random variate X for which \operatorname{E}(X)=k and \operatorname{E}(\ln(X))=\psi(k/2)+\ln(2) are fixed. Since the chi-squared distribution is in the family of gamma distributions, this can be derived by substituting appropriate values in the expectation of the log moment of the gamma distribution. For a derivation from more basic principles, see the derivation in moment-generating function of the sufficient statistic.
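The closed-form entropy can be cross-checked against a library value. A minimal sketch, assuming SciPy is available.

```python
# Evaluate k/2 + ln(2*Gamma(k/2)) + (1 - k/2)*digamma(k/2) and compare with
# SciPy's built-in differential entropy for the chi-squared distribution.
import numpy as np
from scipy import stats
from scipy.special import gammaln, digamma

for k in (1, 2, 5, 10):
    h_formula = k / 2.0 + np.log(2.0) + gammaln(k / 2.0) + (1.0 - k / 2.0) * digamma(k / 2.0)
    print(k, h_formula, stats.chi2.entropy(df=k))
```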


Noncentral moments

The moments about zero of a chi-squared distribution with k degrees of freedom are given by
: \operatorname{E}(X^m) = k (k+2) (k+4) \cdots (k+2m-2) = 2^m \frac{\Gamma\!\left(m+\frac{k}{2}\right)}{\Gamma\!\left(\frac{k}{2}\right)}.


Cumulants

The cumulants are readily obtained by a (formal) power series expansion of the logarithm of the characteristic function:
: \kappa_n = 2^{n-1}(n-1)!\,k.


Concentration

The chi-squared distribution exhibits strong concentration around its mean. The standard Laurent–Massart bounds are:
: \operatorname{P}(X - k \ge 2 \sqrt{kx} + 2x) \le \exp(-x)
: \operatorname{P}(k - X \ge 2 \sqrt{kx}) \le \exp(-x)
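Both Laurent–Massart bounds can be checked numerically. A minimal sketch, assuming SciPy; the values of k and x are illustrative.

```python
# Check P(X - k >= 2*sqrt(k*x) + 2*x) <= exp(-x) and P(k - X >= 2*sqrt(k*x)) <= exp(-x).
import numpy as np
from scipy import stats

k = 20
for x in (1.0, 3.0, 5.0):
    upper = stats.chi2.sf(k + 2.0 * np.sqrt(k * x) + 2.0 * x, df=k)  # upper-tail probability
    lower = stats.chi2.cdf(k - 2.0 * np.sqrt(k * x), df=k)           # lower-tail probability
    print(f"x={x}: upper={upper:.3e}  lower={lower:.3e}  bound={np.exp(-x):.3e}")
```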


Asymptotic properties

By the central limit theorem, because the chi-squared distribution is the sum of k independent random variables with finite mean and variance, it converges to a normal distribution for large k. For many practical purposes, for k>50 the distribution is sufficiently close to a normal distribution that the difference may be ignored. Specifically, if X \sim \chi^2(k), then as k tends to infinity, the distribution of (X-k)/\sqrt{2k} tends to a standard normal distribution. However, convergence is slow, as the skewness is \sqrt{8/k} and the excess kurtosis is 12/k.

The sampling distribution of \ln(\chi^2) converges to normality much faster than the sampling distribution of \chi^2, as the logarithmic transform removes much of the asymmetry. Other functions of the chi-squared distribution converge more rapidly to a normal distribution. Some examples are:
* If X \sim \chi^2(k) then \sqrt{2X} is approximately normally distributed with mean \sqrt{2k-1} and unit variance (1922, by R. A. Fisher; see (18.23), p. 426 of Johnson).
* If X \sim \chi^2(k) then \sqrt[3]{X/k} is approximately normally distributed with mean 1-\frac{2}{9k} and variance \frac{2}{9k}. This is known as the Wilson–Hilferty transformation; see (18.24), p. 426 of Johnson.
** This normalizing transformation leads directly to the commonly used median approximation k\left(1-\frac{2}{9k}\right)^3 by back-transforming from the mean, which is also the median, of the normal distribution (see the sketch after this list).
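The Wilson–Hilferty median approximation is easy to compare with exact quantiles. A minimal sketch, assuming SciPy is available.

```python
# Compare the exact chi-squared median with the approximation k*(1 - 2/(9k))**3.
from scipy import stats

for k in (1, 2, 5, 10, 50):
    exact_median = stats.chi2.ppf(0.5, df=k)
    approx_median = k * (1.0 - 2.0 / (9.0 * k)) ** 3
    print(k, exact_median, approx_median)
```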


Related distributions

* As k\to\infty, (\chi^2_k-k)/\sqrt{2k} \ \xrightarrow{d}\ N(0,1) (normal distribution)
* \chi_k^2 \sim {\chi'}^2_k(0) (noncentral chi-squared distribution with non-centrality parameter \lambda = 0)
* If Y \sim \mathrm{F}(\nu_1, \nu_2) then X = \lim_{\nu_2 \to \infty} \nu_1 Y has the chi-squared distribution \chi^2_{\nu_1}
:* As a special case, if Y \sim \mathrm{F}(1, \nu_2)\, then X = \lim_{\nu_2 \to \infty} Y\, has the chi-squared distribution \chi^2_1
* \left\| \boldsymbol{N}_{i=1,\ldots,k}(0,1) \right\|^2 \sim \chi^2_k (the squared norm of ''k'' standard normally distributed variables is a chi-squared distribution with ''k'' degrees of freedom)
* If X \sim \chi^2_\nu\, and c>0 \,, then cX \sim \Gamma(k=\nu/2, \theta=2c)\, (gamma distribution)
* If X \sim \chi^2_k then \sqrt{X} \sim \chi_k (chi distribution)
* If X \sim \chi^2_2, then X \sim \operatorname{Exp}(1/2) is an exponential distribution. (See gamma distribution for more; see also the sketch at the end of this section.)
* If X \sim \chi^2_{2k}, then X \sim \operatorname{Erlang}(k, 1/2) is an Erlang distribution.
* If X \sim \operatorname{Erlang}(k,\lambda), then 2\lambda X\sim \chi^2_{2k}
* If X \sim \operatorname{Rayleigh}(1)\, (Rayleigh distribution) then X^2 \sim \chi^2_2\,
* If X \sim \operatorname{Maxwell}(1)\, (Maxwell distribution) then X^2 \sim \chi^2_3\,
* If X \sim \chi^2_\nu then \tfrac{1}{X} \sim \operatorname{Inv-}\chi^2_\nu\, (inverse-chi-squared distribution)
* The chi-squared distribution is a special case of the type III Pearson distribution
* If X \sim \chi^2_{\nu_1}\, and Y \sim \chi^2_{\nu_2}\, are independent then \tfrac{X}{X+Y} \sim \operatorname{Beta}(\tfrac{\nu_1}{2}, \tfrac{\nu_2}{2})\, (beta distribution)
* If X \sim \operatorname{U}(0,1)\, (uniform distribution) then -2\log(X) \sim \chi^2_2\,
* If X_i \sim \operatorname{Laplace}(\mu,\beta)\, then \sum_{i=1}^n \frac{2 |X_i-\mu|}{\beta} \sim \chi^2_{2n}\,
* If X_i follows the generalized normal distribution (version 1) with parameters \mu,\alpha,\beta then \sum_{i=1}^n \frac{2 |X_i-\mu|^\beta}{\alpha^\beta} \sim \chi^2_{2n/\beta}\,
* The chi-squared distribution is a transformation of the Pareto distribution
* Student's t-distribution is a transformation of the chi-squared distribution
* Student's t-distribution can be obtained from the chi-squared distribution and the normal distribution
* The noncentral beta distribution can be obtained as a transformation of the chi-squared distribution and the noncentral chi-squared distribution
* The noncentral t-distribution can be obtained from the normal distribution and the chi-squared distribution

A chi-squared variable with k degrees of freedom is defined as the sum of the squares of k independent standard normal random variables. If Y is a k-dimensional Gaussian random vector with mean vector \mu and rank-k covariance matrix C, then X = (Y-\mu )^\mathsf{T} C^{-1}(Y-\mu) is chi-squared distributed with k degrees of freedom.

The sum of squares of statistically independent unit-variance Gaussian variables which do ''not'' have mean zero yields a generalization of the chi-squared distribution called the noncentral chi-squared distribution.

If Y is a vector of k i.i.d. standard normal random variables and A is a k\times k symmetric, idempotent matrix with rank k-n, then the quadratic form Y^\mathsf{T} A Y is chi-squared distributed with k-n degrees of freedom.

If \Sigma is a p\times p positive-semidefinite covariance matrix with strictly positive diagonal entries, then for X\sim N(0,\Sigma) and w a random p-vector independent of X such that w_1+\cdots+w_p=1 and w_i\geq 0, i=1,\ldots,p, it holds that
: \frac{1}{\left(\frac{w_1}{X_1},\ldots,\frac{w_p}{X_p}\right)\Sigma\left(\frac{w_1}{X_1},\ldots,\frac{w_p}{X_p}\right)^\top}\sim\chi_1^2.

The chi-squared distribution is also naturally related to other distributions arising from the Gaussian. In particular,
* Y is F-distributed, Y \sim F(k_1, k_2), if Y = \frac{X_1/k_1}{X_2/k_2}, where X_1 \sim \chi^2_{k_1} and X_2 \sim \chi^2_{k_2} are statistically independent.
* If X_1 \sim \chi^2_{k_1} and X_2 \sim \chi^2_{k_2} are statistically independent, then X_1 + X_2\sim \chi^2_{k_1+k_2}. If X_1 and X_2 are not independent, then X_1+X_2 is not chi-square distributed.
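Two of the relations above can be spot-checked by simulation. A minimal sketch, assuming NumPy and SciPy; the degrees of freedom chosen are illustrative.

```python
# Check that chi-squared(2) matches Exp(rate 1/2), and that X/(X+Y) for
# independent chi-squared variables matches a Beta distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 200_000

# chi2_2 vs exponential with scale 2 (rate 1/2)
x2 = rng.chisquare(df=2, size=n)
print(np.quantile(x2, [0.25, 0.5, 0.75]))
print(stats.expon(scale=2.0).ppf([0.25, 0.5, 0.75]))

# X / (X + Y) vs Beta(nu1/2, nu2/2)
nu1, nu2 = 3, 7
x, y = rng.chisquare(nu1, n), rng.chisquare(nu2, n)
print(np.quantile(x / (x + y), [0.25, 0.5, 0.75]))
print(stats.beta(nu1 / 2.0, nu2 / 2.0).ppf([0.25, 0.5, 0.75]))
```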


Generalizations

The chi-squared distribution is obtained as the sum of the squares of independent, zero-mean, unit-variance Gaussian random variables. Generalizations of this distribution can be obtained by summing the squares of other types of Gaussian random variables. Several such distributions are described below.


Linear combination

If X_1,\ldots,X_n are chi-squared random variables and a_1,\ldots,a_n\in\mathbb{R}_{>0}, then a closed expression for the distribution of X=\sum_{i=1}^n a_i X_i is not known. It may, however, be approximated efficiently using the property of characteristic functions of chi-squared random variables.
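The text refers to characteristic-function based approximations; as a simpler illustration, the following minimal sketch (assuming NumPy and SciPy, with illustrative weights and degrees of freedom) uses Satterthwaite-style moment matching of a scaled chi-squared distribution and checks it against Monte Carlo quantiles.

```python
# Fit c * chi2_nu to X = sum a_i X_i by matching the first two moments.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
a = np.array([0.5, 1.0, 2.0])        # positive weights a_i (illustrative values)
k = np.array([1, 3, 6])              # degrees of freedom k_i (illustrative values)

# X has mean sum(a_i k_i) and variance 2 * sum(a_i^2 k_i)
mean, var = np.sum(a * k), 2.0 * np.sum(a ** 2 * k)
nu = 2.0 * mean ** 2 / var           # matched degrees of freedom
c = var / (2.0 * mean)               # matched scale factor

samples = sum(ai * rng.chisquare(ki, 200_000) for ai, ki in zip(a, k))
for q in (0.5, 0.9, 0.99):
    print(q, np.quantile(samples, q), c * stats.chi2.ppf(q, df=nu))
```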


Chi-squared distributions


Noncentral chi-squared distribution

The noncentral chi-squared distribution is obtained from the sum of the squares of independent Gaussian random variables having unit variance and ''nonzero'' means.


Generalized chi-squared distribution

The generalized chi-squared distribution is obtained from the quadratic form z^\mathsf{T} A z, where z is a zero-mean Gaussian vector having an arbitrary covariance matrix, and A is an arbitrary matrix.


Gamma, exponential, and related distributions

The chi-squared distribution X \sim \chi_k^2 is a special case of the gamma distribution, in that X \sim \Gamma\left(\frac{k}{2}, \frac{1}{2}\right) using the rate parameterization of the gamma distribution (or X \sim \Gamma\left(\frac{k}{2}, 2\right) using the scale parameterization of the gamma distribution), where k is an integer.

Because the exponential distribution is also a special case of the gamma distribution, we also have that if X \sim \chi_2^2, then X\sim \operatorname{Exp}\left(\frac 1 2\right) is an exponential distribution.

The Erlang distribution is also a special case of the gamma distribution and thus we also have that if X \sim\chi_k^2 with even k, then X is Erlang distributed with shape parameter k/2 and rate parameter 1/2.
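The gamma special case is a one-line check in software. A minimal sketch, assuming SciPy, which uses the scale parameterization for its gamma distribution.

```python
# chi-squared(k) densities coincide with Gamma(shape=k/2, scale=2) densities.
import numpy as np
from scipy import stats

k = 7
x = np.linspace(0.5, 20.0, 5)
print(stats.chi2.pdf(x, df=k))
print(stats.gamma.pdf(x, a=k / 2.0, scale=2.0))   # identical values
```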


Occurrence and applications

The chi-squared distribution has numerous applications in inferential statistics, for instance in chi-squared tests and in estimating variances. It enters the problem of estimating the mean of a normally distributed population and the problem of estimating the slope of a regression line via its role in Student's t-distribution. It enters all analysis of variance problems via its role in the F-distribution, which is the distribution of the ratio of two independent chi-squared random variables, each divided by their respective degrees of freedom.

Following are some of the most common situations in which the chi-squared distribution arises from a Gaussian-distributed sample.
* If X_1, \ldots, X_n are i.i.d. N(\mu, \sigma^2) random variables, then \sum_{i=1}^n(X_i - \overline X)^2 \sim \sigma^2 \chi^2_{n-1} where \overline X = \frac{1}{n} \sum_{i=1}^n X_i (see the sketch below).
* Several further statistics based on independent random variables X_i \sim N(\mu_i, \sigma^2_i), i= 1, \ldots, k, have probability distributions related to the chi-squared distribution.

The chi-squared distribution is also often encountered in magnetic resonance imaging.
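The first relation above, which underlies confidence intervals for a normal variance, can be checked by simulation. A minimal sketch, assuming NumPy and SciPy; \mu, \sigma, and the sample sizes are illustrative.

```python
# For an i.i.d. normal sample, sum((X_i - X_bar)^2) / sigma^2 ~ chi-squared(n-1).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
mu, sigma, n, n_rep = 3.0, 2.0, 12, 100_000
x = rng.normal(mu, sigma, size=(n_rep, n))
ss = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1) / sigma ** 2

print("empirical mean/var:", ss.mean(), ss.var())
print("chi2(n-1) mean/var:", stats.chi2.mean(df=n - 1), stats.chi2.var(df=n - 1))
```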


Computational methods


Table of χ2 values vs ''p''-values

The ''p''-value is the probability of observing a test statistic ''at least'' as extreme in a chi-squared distribution. Accordingly, since the cumulative distribution function (CDF) for the appropriate degrees of freedom ''(df)'' gives the probability of having obtained a value ''less extreme'' than this point, subtracting the CDF value from 1 gives the ''p''-value. A low ''p''-value, below the chosen significance level, indicates statistical significance, i.e., sufficient evidence to reject the null hypothesis. A significance level of 0.05 is often used as the cutoff between significant and non-significant results. Tables commonly list ''p''-values matching \chi^2 values for the first 10 degrees of freedom. These values can be calculated by evaluating the quantile function (also known as the "inverse CDF" or "ICDF") of the chi-squared distribution for the chosen significance level and degrees of freedom.
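These table lookups are routinely replaced by library calls. A minimal sketch, assuming SciPy; the statistic value and degrees of freedom are illustrative.

```python
# sf gives the upper-tail p-value for a chi-squared statistic, and ppf (the ICDF)
# recovers the critical value for a chosen significance level.
from scipy import stats

df = 3
chi2_stat = 7.81
p_value = stats.chi2.sf(chi2_stat, df)       # P(X >= chi2_stat), i.e. 1 - CDF
critical = stats.chi2.ppf(1.0 - 0.05, df)    # critical value at the 0.05 level
print(p_value, critical)                     # p_value close to 0.05, critical close to 7.81
```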


History

This distribution was first described by the German geodesist and statistician Friedrich Robert Helmert in papers of 1875–6, where he computed the sampling distribution of the sample variance of a normal population. Thus in German this was traditionally known as the ''Helmert'sche'' ("Helmertian") or "Helmert distribution".

The distribution was independently rediscovered by the English mathematician Karl Pearson in the context of goodness of fit, for which he developed his Pearson's chi-squared test, published in 1900, with a computed table of values published separately. The name "chi-square" ultimately derives from Pearson's shorthand for the exponent in a multivariate normal distribution with the Greek letter chi, writing -\tfrac{1}{2}\chi^2 for what would appear in modern notation as -\tfrac{1}{2}\mathbf{x}^\mathsf{T}\Sigma^{-1}\mathbf{x} (\Sigma being the covariance matrix). (R. L. Plackett, ''Karl Pearson and the Chi-Squared Test'', International Statistical Review, 1983, 61f. See also Jeff Miller, ''Earliest Uses of Some of the Words of Mathematics''.)

The idea of a family of "chi-squared distributions", however, is not due to Pearson but arose as a further development due to Fisher in the 1920s.


See also

* Chi distribution
* Scaled inverse chi-squared distribution
* Gamma distribution
* Generalized chi-squared distribution
* Noncentral chi-squared distribution
* Pearson's chi-squared test
* Reduced chi-squared statistic
* Wilks's lambda distribution


References




External links


* Earliest Uses of Some of the Words of Mathematics: entry on Chi squared has a brief history
* Course notes on chi-squared goodness of fit testing from the Yale University Stats 101 class
* ''Mathematica'' demonstration showing the chi-squared sampling distribution of various statistics, e.g. Σ''x''², for a normal population
* Simple algorithm for approximating cdf and inverse cdf for the chi-squared distribution with a pocket calculator
* Values of the Chi-squared distribution