In econometrics, the information matrix test is used to determine whether a regression model is misspecified. The test was developed by Halbert White, who observed that in a correctly specified model and under standard regularity assumptions, the Fisher information matrix can be expressed in either of two ways: as the outer product of the gradient, or as a function of the Hessian matrix of the log-likelihood function.

Consider a linear model \mathbf{y} = \mathbf{X} \boldsymbol{\beta} + \mathbf{u}, where the errors \mathbf{u} are assumed to be distributed \mathrm{N}(0, \sigma^2 \mathbf{I}). If the parameters \beta and \sigma^2 are stacked in the vector \boldsymbol{\theta}^{\mathsf{T}} = \begin{bmatrix} \boldsymbol{\beta} & \sigma^2 \end{bmatrix}, the resulting log-likelihood function is

:\ell(\boldsymbol{\theta}) = -\frac{n}{2} \log \sigma^2 - \frac{1}{2\sigma^2} \left( \mathbf{y} - \mathbf{X} \boldsymbol{\beta} \right)^{\mathsf{T}} \left( \mathbf{y} - \mathbf{X} \boldsymbol{\beta} \right)

The information matrix can then be expressed as

:\mathbf{I}(\boldsymbol{\theta}) = \operatorname{E} \left[ \left( \frac{\partial \ell(\boldsymbol{\theta})}{\partial \boldsymbol{\theta}} \right) \left( \frac{\partial \ell(\boldsymbol{\theta})}{\partial \boldsymbol{\theta}} \right)^{\mathsf{T}} \right]

that is, the expected value of the outer product of the gradient, or score. Second, it can be written as the negative of the Hessian matrix of the log-likelihood function:

:\mathbf{I}(\boldsymbol{\theta}) = - \operatorname{E} \left[ \frac{\partial^2 \ell(\boldsymbol{\theta})}{\partial \boldsymbol{\theta} \, \partial \boldsymbol{\theta}^{\mathsf{T}}} \right]

If the model is correctly specified, both expressions should be equal. Combining the equivalent forms yields

:\boldsymbol{\Delta}(\boldsymbol{\theta}) = \sum_{i=1}^{n} \left[ \frac{\partial^2 \ell_i(\boldsymbol{\theta})}{\partial \boldsymbol{\theta} \, \partial \boldsymbol{\theta}^{\mathsf{T}}} + \frac{\partial \ell_i(\boldsymbol{\theta})}{\partial \boldsymbol{\theta}} \frac{\partial \ell_i(\boldsymbol{\theta})}{\partial \boldsymbol{\theta}^{\mathsf{T}}} \right]

where \boldsymbol{\Delta}(\boldsymbol{\theta}) is an (r \times r) random matrix and r is the number of parameters. White showed that the elements of n^{-1/2} \boldsymbol{\Delta}(\hat{\boldsymbol{\theta}}), where \hat{\boldsymbol{\theta}} is the maximum likelihood estimator, are asymptotically normally distributed with zero means when the model is correctly specified. In small samples, however, the test generally performs poorly.
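The equivalence of the two forms can be illustrated numerically. The sketch below (a minimal illustration, not White's full test procedure; the simulated design, parameter values, and sample size are arbitrary choices for this example) fits a correctly specified normal linear model by maximum likelihood and forms the matrix Δ from the analytic per-observation scores and Hessians. Under correct specification the elements of Δ/n should shrink toward zero as the sample grows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a correctly specified linear model y = X @ beta + u, u ~ N(0, sigma2).
# (Illustrative setup; design and parameter values are arbitrary.)
n = 5000
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
sigma2_true = 1.5
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2_true), size=n)

# Maximum likelihood estimates: OLS for beta, RSS/n for sigma^2.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / n

# Per-observation scores s_i = d l_i / d theta, with theta = (beta, sigma^2):
#   d l_i / d beta    = x_i e_i / sigma^2
#   d l_i / d sigma^2 = -1/(2 sigma^2) + e_i^2 / (2 sigma^4)
s_beta = X * (resid / sigma2_hat)[:, None]                    # (n, k)
s_sig = -0.5 / sigma2_hat + resid**2 / (2 * sigma2_hat**2)    # (n,)
scores = np.column_stack([s_beta, s_sig])                     # (n, k+1)

# Outer-product-of-gradient form: sum_i s_i s_i^T.
opg = scores.T @ scores

# Sum of the analytic per-observation Hessians of l_i:
#   d2 l_i / d beta d beta^T   = -x_i x_i^T / sigma^2
#   d2 l_i / d beta d sigma^2  = -x_i e_i / sigma^4
#   d2 l_i / d (sigma^2)^2     = 1/(2 sigma^4) - e_i^2 / sigma^6
k = X.shape[1]
hess = np.zeros((k + 1, k + 1))
hess[:k, :k] = -(X.T @ X) / sigma2_hat
hess[:k, k] = hess[k, :k] = -(X.T @ resid) / sigma2_hat**2
hess[k, k] = n / (2 * sigma2_hat**2) - (resid @ resid) / sigma2_hat**3

# White's combination of the two forms: Delta = sum_i [H_i + s_i s_i^T].
delta = hess + opg

# Under correct specification, elements of Delta/n shrink like n^{-1/2}.
print(np.abs(delta / n).max())
```

In a misspecified model (for example, heteroskedastic errors fitted with this homoskedastic likelihood), the two forms estimate different matrices and the elements of Δ/n no longer vanish, which is the discrepancy the test exploits.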


Further reading

* White, Halbert (1994). "Information Matrix Testing". ''Estimation, Inference and Specification Analysis''. New York: Cambridge University Press. pp. 300–344. ISBN 0-521-25280-6. https://books.google.com/books?id=hnNpQSf7ZlAC&pg=PA300