In statistics, the Wald test (named after Abraham Wald) assesses constraints on statistical parameters based on the weighted distance between the unrestricted estimate and its hypothesized value under the null hypothesis, where the weight is the precision of the estimate. Intuitively, the larger this weighted distance, the less likely it is that the constraint is true. While the finite-sample distributions of Wald tests are generally unknown, the statistic has an asymptotic χ²-distribution under the null hypothesis, a fact that can be used to determine statistical significance. Together with the Lagrange multiplier test and the likelihood-ratio test, the Wald test is one of three classical approaches to hypothesis testing.

An advantage of the Wald test over the other two is that it only requires estimation of the unrestricted model, which lowers the computational burden compared to the likelihood-ratio test. However, a major disadvantage is that (in finite samples) it is not invariant to changes in the representation of the null hypothesis; in other words, algebraically equivalent expressions of a non-linear parameter restriction can lead to different values of the test statistic. That is because the Wald statistic is derived from a Taylor expansion, and different ways of writing equivalent nonlinear expressions lead to nontrivial differences in the corresponding Taylor coefficients. Another aberration, known as the Hauck–Donner effect, can occur in binomial models when the estimated (unconstrained) parameter is close to the boundary of the parameter space (for instance, a fitted probability being extremely close to zero or one), which results in the Wald statistic no longer increasing monotonically in the distance between the unconstrained and constrained parameter.


Mathematical details

Under the Wald test, the estimate \hat\theta that was found as the maximizing argument of the unconstrained likelihood function is compared with a hypothesized value \theta_0. In particular, the squared difference (\hat\theta - \theta_0)^2 is weighted by the curvature of the log-likelihood function.


Test on a single parameter

If the hypothesis involves only a single parameter restriction, then the Wald statistic takes the following form:

: W = \frac{(\hat\theta - \theta_0)^2}{\operatorname{var}(\hat\theta)}

which under the null hypothesis follows an asymptotic χ²-distribution with one degree of freedom. The square root of the single-restriction Wald statistic can be understood as a (pseudo) ''t''-ratio that is, however, not actually ''t''-distributed except for the special case of linear regression with normally distributed errors. In general, it follows an asymptotic ''z''-distribution:

: \sqrt{W} = \frac{\hat\theta - \theta_0}{\operatorname{se}(\hat\theta)}

where \operatorname{se}(\hat\theta) is the standard error of the maximum likelihood estimate (MLE), the square root of the variance. There are several ways to consistently estimate the variance matrix, which in finite samples leads to alternative estimates of standard errors and associated test statistics and ''p''-values.
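The single-restriction case can be sketched numerically. The following is a minimal illustration (not from the source) using a Bernoulli proportion, where the MLE and its variance have closed forms; the data, seed, and null value p0 = 0.5 are hypothetical, and numpy/scipy are assumed available.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical data: 500 coin flips with true success probability 0.6.
x = rng.binomial(1, 0.6, size=500)
p_hat = x.mean()                         # MLE of the Bernoulli parameter
var_hat = p_hat * (1 - p_hat) / len(x)   # estimated variance of the MLE

p0 = 0.5                                 # hypothesized value under H0
W = (p_hat - p0) ** 2 / var_hat          # Wald statistic, asymptotically chi2(1)
p_value = stats.chi2.sf(W, df=1)         # upper-tail p-value

# Signed square root: the asymptotic z-ratio described above.
z = (p_hat - p0) / np.sqrt(var_hat)
```

Note that `z ** 2` recovers `W` exactly, which is the sense in which the z-ratio is the square root of the single-restriction Wald statistic.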


Test(s) on multiple parameters

The Wald test can be used to test a single hypothesis on multiple parameters, as well as to test jointly multiple hypotheses on single/multiple parameters. Let \hat\theta_n be our sample estimator of ''P'' parameters (i.e., \hat\theta_n is a P \times 1 vector), which is supposed to follow asymptotically a normal distribution with covariance matrix ''V'', \sqrt{n}(\hat\theta_n - \theta)\,\xrightarrow{D}\, N(0, V). The test of ''Q'' hypotheses on the ''P'' parameters is expressed with a Q \times P matrix ''R'':

: H_0: R\theta = r
: H_1: R\theta \neq r

The distribution of the test statistic under the null hypothesis is:

: (R\hat\theta_n - r)' [R(\hat{V}_n/n)R']^{-1} (R\hat\theta_n - r)/Q \quad \xrightarrow{D} \quad F(Q, n-P) \quad \xrightarrow[n \to \infty]{} \quad \chi^2_Q/Q

where \hat{V}_n is an estimator of the covariance matrix.

Suppose \sqrt{n}(\hat\theta_n - \theta)\,\xrightarrow{D}\, N(0, V). Then, by Slutsky's theorem and by the properties of the normal distribution, multiplying by ''R'' has distribution:

: R\sqrt{n}(\hat\theta_n - \theta) = \sqrt{n}(R\hat\theta_n - r)\,\xrightarrow{D}\, N(0, RVR')

Recalling that a quadratic form of a normal distribution has a chi-squared distribution:

: \sqrt{n}(R\hat\theta_n - r)' [RVR']^{-1} \sqrt{n}(R\hat\theta_n - r)\,\xrightarrow{D}\, \chi^2_Q

Rearranging ''n'' finally gives:

: (R\hat\theta_n - r)' [R(V/n)R']^{-1} (R\hat\theta_n - r) \quad \xrightarrow{D} \quad \chi^2_Q

What if the covariance matrix is not known a priori and needs to be estimated from the data? If we have a consistent estimator \hat{V}_n of ''V'' such that V^{-1}\hat{V}_n has a determinant that is distributed \chi^2_{n-P}, then by the independence of the covariance estimator and the equation above, we have:

: (R\hat\theta_n - r)' [R(\hat{V}_n/n)R']^{-1} (R\hat\theta_n - r)/Q \quad \xrightarrow{D} \quad F(Q, n-P)
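The joint statistic can be sketched in a few lines. In this hypothetical example (the coefficient values and covariance matrix are invented for illustration, and Cov(\hat\theta) plays the role of \hat{V}_n/n), we jointly test that the second and third of three estimated coefficients are zero, i.e. Q = 2 restrictions:

```python
import numpy as np
from scipy import stats

# Hypothetical estimates of P = 3 parameters and their estimated
# covariance matrix (this is Vhat/n in the notation of the text).
theta_hat = np.array([1.2, -0.5, 0.3])
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])

# Joint null H0: theta_2 = 0 and theta_3 = 0, written as R theta = r.
R = np.array([[0., 1., 0.],
              [0., 0., 1.]])
r = np.zeros(2)

diff = R @ theta_hat - r
# Wald statistic: diff' [R Cov R']^{-1} diff, asymptotically chi2(Q).
W = diff @ np.linalg.solve(R @ cov @ R.T, diff)
p_value = stats.chi2.sf(W, df=R.shape[0])
```

Using `np.linalg.solve` instead of explicitly inverting R Cov R' is the standard numerically stable way to apply the inverse in the quadratic form.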


Nonlinear hypothesis

In the standard form, the Wald test is used to test linear hypotheses that can be represented by a single matrix ''R''. If one wishes to test a non-linear hypothesis of the form:

: H_0: c(\theta) = 0
: H_1: c(\theta) \neq 0

the test statistic becomes:

: c(\hat\theta_n)' \left[ c'(\hat\theta_n) (\hat{V}_n/n)\, c'(\hat\theta_n)' \right]^{-1} c(\hat\theta_n) \quad \xrightarrow{D} \quad \chi^2_Q

where c'(\hat\theta_n) is the derivative of ''c'' evaluated at the sample estimator. This result is obtained using the delta method, which uses a first-order approximation of the variance.
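As a concrete sketch of the delta-method construction (the restriction c(\theta) = \theta_1\theta_2 - 1 and all numbers below are hypothetical, chosen only to make the single-restriction case explicit):

```python
import numpy as np
from scipy import stats

# Hypothetical estimates of two parameters and their covariance (Vhat/n).
theta_hat = np.array([2.0, 0.6])
cov = np.array([[0.020, 0.005],
                [0.005, 0.010]])

# Non-linear restriction c(theta) = theta_1 * theta_2 - 1 = 0.
c = theta_hat[0] * theta_hat[1] - 1.0

# Gradient c'(theta) = (theta_2, theta_1), evaluated at the estimate.
grad = np.array([theta_hat[1], theta_hat[0]])

# Delta-method variance of c(theta_hat): grad' Cov grad.
var_c = grad @ cov @ grad

W = c ** 2 / var_c                 # Wald statistic, asymptotically chi2(1)
p_value = stats.chi2.sf(W, df=1)
```

With a single restriction the matrix inverse in the general formula collapses to division by the scalar delta-method variance.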


Non-invariance to re-parameterisations

The fact that one uses an approximation of the variance has the drawback that the Wald statistic is not invariant under a non-linear transformation/reparametrisation of the hypothesis: it can give different answers to the same question, depending on how the question is phrased. For example, asking whether ''R'' = 1 is the same as asking whether log ''R'' = 0; but the Wald statistic for ''R'' = 1 is not the same as the Wald statistic for log ''R'' = 0 (because there is in general no neat relationship between the standard errors of ''R'' and log ''R'', so it needs to be approximated).
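The non-invariance is easy to exhibit numerically. In this hypothetical sketch (the estimate and standard error are invented, and the standard error of the log is obtained by the delta method), the same null hypothesis phrased two equivalent ways yields two different Wald statistics:

```python
import numpy as np

# Hypothetical estimate and standard error; null hypothesis: theta = 1.
theta_hat, se = 1.5, 0.4

# Restriction written directly as theta - 1 = 0.
W_direct = (theta_hat - 1.0) ** 2 / se ** 2

# Same restriction written as log(theta) = 0; the standard error of
# log(theta_hat) is approximated by the delta method as se / theta_hat.
se_log = se / theta_hat
W_log = np.log(theta_hat) ** 2 / se_log ** 2
```

Here `W_direct` and `W_log` differ even though the two null hypotheses are algebraically equivalent, which is exactly the non-invariance described above.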


Alternatives to the Wald test

There exist several alternatives to the Wald test, namely the likelihood-ratio test and the Lagrange multiplier test (also known as the score test). Robert F. Engle showed that these three tests, the Wald test, the likelihood-ratio test and the Lagrange multiplier test, are asymptotically equivalent. Although they are asymptotically equivalent, in finite samples they could disagree enough to lead to different conclusions.

There are several reasons to prefer the likelihood-ratio test or the Lagrange multiplier test to the Wald test:

* Non-invariance: As argued above, the Wald test is not invariant under reparametrization, while the likelihood-ratio test will give exactly the same answer whether we work with ''R'', log ''R'' or any other monotonic transformation of ''R''.
* The Wald test uses two approximations (that we know the standard error or Fisher information and the maximum likelihood estimate), whereas the likelihood-ratio test depends only on the ratio of likelihood functions under the null and alternative hypotheses.
* The Wald test requires an estimate using the maximizing argument, corresponding to the "full" model. In some cases the model is simpler under the null hypothesis, so that one might prefer to use the score test (also called the Lagrange multiplier test), which has the advantage that it can be formulated in situations where the variability of the maximizing element is difficult to estimate, or where computing the estimate according to the maximum likelihood estimator is difficult; e.g. the Cochran–Mantel–Haenszel test is a score test.


See also

* Chow test
* Sequential probability ratio test
* Sup-Wald test
* Student's ''t''-test
* Welch's ''t''-test




