In probability theory and statistics, the ''F''-distribution or ''F''-ratio, also known as Snedecor's ''F'' distribution or the Fisher–Snedecor distribution (after Ronald Fisher and George W. Snedecor), is a continuous probability distribution that arises frequently as the null distribution of a test statistic, most notably in the analysis of variance (ANOVA) and other ''F''-tests.


Definitions

The ''F''-distribution with ''d''1 and ''d''2 degrees of freedom is the distribution of

X = \frac{U_1/d_1}{U_2/d_2},

where U_1 and U_2 are independent random variables with chi-square distributions with respective degrees of freedom d_1 and d_2. It can be shown to follow that the probability density function (pdf) for ''X'' is given by

\begin{aligned} f(x; d_1, d_2) &= \frac{\sqrt{\dfrac{(d_1 x)^{d_1}\, d_2^{d_2}}{(d_1 x + d_2)^{d_1 + d_2}}}}{x\, \mathrm{B}\!\left(\frac{d_1}{2}, \frac{d_2}{2}\right)} \\ &= \frac{1}{\mathrm{B}\!\left(\frac{d_1}{2}, \frac{d_2}{2}\right)} \left(\frac{d_1}{d_2}\right)^{d_1/2} x^{d_1/2 - 1} \left(1 + \frac{d_1}{d_2}\, x\right)^{-(d_1 + d_2)/2} \end{aligned}

for real ''x'' > 0. Here \mathrm{B} is the beta function. In many applications, the parameters ''d''1 and ''d''2 are positive integers, but the distribution is well-defined for positive real values of these parameters. The cumulative distribution function is

F(x; d_1, d_2) = I_{d_1 x/(d_1 x + d_2)}\!\left(\tfrac{d_1}{2}, \tfrac{d_2}{2}\right),

where ''I'' is the regularized incomplete beta function.
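For readers who want to check these formulas numerically, the following is a minimal Python sketch (assuming NumPy and SciPy are available) that evaluates the pdf and cdf directly from the expressions above and compares them against SciPy's built-in implementation.

```python
import numpy as np
from scipy.special import beta, betainc
from scipy.stats import f as f_dist

def f_pdf(x, d1, d2):
    """pdf of F(d1, d2), written directly from the closed form above."""
    return ((d1 / d2) ** (d1 / 2) * x ** (d1 / 2 - 1)
            * (1 + d1 / d2 * x) ** (-(d1 + d2) / 2) / beta(d1 / 2, d2 / 2))

def f_cdf(x, d1, d2):
    """cdf via the regularized incomplete beta function I_{d1 x/(d1 x + d2)}."""
    return betainc(d1 / 2, d2 / 2, d1 * x / (d1 * x + d2))

d1, d2 = 5.0, 12.0
x = np.linspace(0.1, 4.0, 9)
assert np.allclose(f_pdf(x, d1, d2), f_dist.pdf(x, d1, d2))
assert np.allclose(f_cdf(x, d1, d2), f_dist.cdf(x, d1, d2))
```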


Properties

The expectation of an F(''d''1, ''d''2) random variable is \frac{d_2}{d_2 - 2} for ''d''2 > 2, and its variance is \frac{2 d_2^2 (d_1 + d_2 - 2)}{d_1 (d_2 - 2)^2 (d_2 - 4)} for ''d''2 > 4. For ''d''2 > 8, the excess kurtosis is

\gamma_2 = 12 \frac{d_1 (5 d_2 - 22)(d_1 + d_2 - 2) + (d_2 - 4)(d_2 - 2)^2}{d_1 (d_2 - 6)(d_2 - 8)(d_1 + d_2 - 2)}.

The ''k''-th moment of an F(''d''1, ''d''2) distribution exists and is finite only when 2''k'' < ''d''2, and it is equal to

\mu_X(k) = \left(\frac{d_2}{d_1}\right)^k \frac{\Gamma\!\left(\tfrac{d_1}{2} + k\right)}{\Gamma\!\left(\tfrac{d_1}{2}\right)} \frac{\Gamma\!\left(\tfrac{d_2}{2} - k\right)}{\Gamma\!\left(\tfrac{d_2}{2}\right)}.

The ''F''-distribution is a particular parametrization of the beta prime distribution, which is also called the beta distribution of the second kind. The characteristic function is listed incorrectly in many standard references; the correct expression is

\varphi^F_{d_1, d_2}(s) = \frac{\Gamma\!\left(\frac{d_1 + d_2}{2}\right)}{\Gamma\!\left(\tfrac{d_2}{2}\right)} U\!\left(\frac{d_1}{2}, 1 - \frac{d_2}{2}, -\frac{d_2}{d_1}\, i s\right),

where ''U''(''a'', ''b'', ''z'') is the confluent hypergeometric function of the second kind.
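As a quick sanity check, here is a short sketch (again assuming SciPy) that evaluates the moment formula via log-gamma functions for numerical stability and compares it against SciPy's generic moment computation.

```python
import numpy as np
from scipy.special import gammaln
from scipy.stats import f as f_dist

def f_moment(k, d1, d2):
    """k-th raw moment of F(d1, d2); finite only when 2k < d2."""
    if 2 * k >= d2:
        raise ValueError("moment does not exist: need 2k < d2")
    log_m = (k * np.log(d2 / d1)
             + gammaln(d1 / 2 + k) - gammaln(d1 / 2)
             + gammaln(d2 / 2 - k) - gammaln(d2 / 2))
    return np.exp(log_m)

d1, d2 = 6.0, 20.0
for k in (1, 2, 3):
    assert np.isclose(f_moment(k, d1, d2), f_dist.moment(k, d1, d2), rtol=1e-5)
# The first moment should match the closed-form mean d2 / (d2 - 2).
assert np.isclose(f_moment(1, d1, d2), d2 / (d2 - 2))
```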


Related distributions


Relation to the chi-squared distribution

In instances where the ''F''-distribution is used, for example in the analysis of variance, independence of U_1 and U_2 (defined above) might be demonstrated by applying Cochran's theorem. Equivalently, since the chi-squared distribution is the sum of squares of independent standard normal random variables, the random variable of the ''F''-distribution may also be written

X = \frac{s_1^2}{\sigma_1^2} \div \frac{s_2^2}{\sigma_2^2},

where s_1^2 = \frac{S_1^2}{d_1} and s_2^2 = \frac{S_2^2}{d_2}, S_1^2 is the sum of squares of d_1 random variables from normal distribution N(0, \sigma_1^2) and S_2^2 is the sum of squares of d_2 random variables from normal distribution N(0, \sigma_2^2).

In a frequentist context, a scaled ''F''-distribution therefore gives the probability p(s_1^2/s_2^2 \mid \sigma_1^2, \sigma_2^2), with the ''F''-distribution itself, without any scaling, applying where \sigma_1^2 is being taken equal to \sigma_2^2. This is the context in which the ''F''-distribution most generally appears in ''F''-tests: where the null hypothesis is that two independent normal variances are equal, and the observed sums of some appropriately selected squares are then examined to see whether their ratio is significantly incompatible with this null hypothesis.

The quantity X has the same distribution in Bayesian statistics, if an uninformative rescaling-invariant Jeffreys prior is taken for the prior probabilities of \sigma_1^2 and \sigma_2^2. In this context, a scaled ''F''-distribution thus gives the posterior probability p(\sigma^2_2/\sigma_1^2 \mid s^2_1, s^2_2), where the observed sums s^2_1 and s^2_2 are now taken as known.
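To illustrate the construction above, the following Monte Carlo sketch (NumPy/SciPy assumed) builds the variance ratio from normal samples and confirms, via a Kolmogorov–Smirnov test, that it follows F(d_1, d_2) regardless of the underlying variances.

```python
import numpy as np
from scipy.stats import f as f_dist, kstest

rng = np.random.default_rng(0)
d1, d2, sigma1, sigma2 = 4, 9, 1.5, 0.7
n_trials = 100_000

# S1^2 and S2^2: sums of squares of d1 (resp. d2) draws from N(0, sigma^2).
S1 = np.sum(rng.normal(0.0, sigma1, (n_trials, d1)) ** 2, axis=1)
S2 = np.sum(rng.normal(0.0, sigma2, (n_trials, d2)) ** 2, axis=1)

# X = (s1^2 / sigma1^2) / (s2^2 / sigma2^2) with s_i^2 = S_i^2 / d_i.
X = (S1 / d1 / sigma1**2) / (S2 / d2 / sigma2**2)

# The ratio should follow F(d1, d2) whatever sigma1 and sigma2 are.
stat, p_value = kstest(X, f_dist(d1, d2).cdf)
print(f"KS statistic: {stat:.4f}, p-value: {p_value:.3f}")
```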


In general

*If X \sim \chi^2_{d_1} and Y \sim \chi^2_{d_2} (chi-squared distribution) are independent, then \frac{X/d_1}{Y/d_2} \sim \mathrm{F}(d_1, d_2) (several identities in this list are checked numerically in the sketch after the list).
*If X_k \sim \Gamma(\alpha_k, \beta_k) (gamma distribution) are independent, then \frac{\alpha_2 \beta_1 X_1}{\alpha_1 \beta_2 X_2} \sim \mathrm{F}(2\alpha_1, 2\alpha_2)
*If X \sim \operatorname{Beta}(d_1/2, d_2/2) (beta distribution) then \frac{d_2 X}{d_1 (1 - X)} \sim \operatorname{F}(d_1, d_2)
*Equivalently, if X \sim F(d_1, d_2), then \frac{d_1 X / d_2}{1 + d_1 X / d_2} \sim \operatorname{Beta}(d_1/2, d_2/2).
*If X \sim F(d_1, d_2), then \frac{d_1}{d_2} X has a beta prime distribution: \frac{d_1}{d_2} X \sim \operatorname{\beta'}\!\left(\tfrac{d_1}{2}, \tfrac{d_2}{2}\right).
*If X \sim F(d_1, d_2) then Y = \lim_{d_2 \to \infty} d_1 X has the chi-squared distribution \chi^2_{d_1}
*F(d_1, d_2) is equivalent to the scaled Hotelling's T-squared distribution \frac{d_2}{d_1 (d_1 + d_2 - 1)} \operatorname{T}^2(d_1, d_1 + d_2 - 1).
*If X \sim F(d_1, d_2) then X^{-1} \sim F(d_2, d_1).
*If X \sim t_{(n)} (Student's t-distribution) then: X^2 \sim \operatorname{F}(1, n) and X^{-2} \sim \operatorname{F}(n, 1)
*The ''F''-distribution is a special case of the type 6 Pearson distribution.
*If X and Y are independent, with X, Y \sim \operatorname{Laplace}(\mu, b), then \frac{|X - \mu|}{|Y - \mu|} \sim \operatorname{F}(2, 2)
*If X \sim F(n, m) then \tfrac{\log X}{2} \sim \operatorname{FisherZ}(n, m) (Fisher's z-distribution)
*The noncentral ''F''-distribution simplifies to the ''F''-distribution if \lambda = 0.
*The doubly noncentral ''F''-distribution simplifies to the ''F''-distribution if \lambda_1 = \lambda_2 = 0.
*If \operatorname{Q}_X(p) is the quantile ''p'' for X \sim F(d_1, d_2) and \operatorname{Q}_Y(1-p) is the quantile 1-p for Y \sim F(d_2, d_1), then \operatorname{Q}_X(p) = \frac{1}{\operatorname{Q}_Y(1-p)}.
*The ''F''-distribution is an instance of ratio distributions.
*The W-distribution is a unique parametrization of the ''F''-distribution.


See also

* Beta prime distribution
* Chi-square distribution
* Chow test
* Gamma distribution
* Hotelling's T-squared distribution
* Wilks' lambda distribution
* Wishart distribution
* Modified half-normal distribution, with pdf on (0, \infty) given as f(x) = \frac{2 \beta^{\alpha/2} x^{\alpha - 1} \exp(-\beta x^2 + \gamma x)}{\Psi\!\left(\frac{\alpha}{2}, \frac{\gamma}{\sqrt{\beta}}\right)}, where \Psi(\alpha, z) = {}_1\Psi_1\!\left(\begin{matrix}\left(\alpha, \frac{1}{2}\right)\\(1, 0)\end{matrix}; z\right) denotes the Fox–Wright Psi function.



External links


Table of critical values of the ''F''-distribution
Earliest Uses of Some of the Words of Mathematics: entry on ''F''-distribution contains a brief history