In econometrics, the Frisch–Waugh–Lovell (FWL) theorem is named after the econometricians Ragnar Frisch, Frederick V. Waugh, and Michael C. Lovell.
The Frisch–Waugh–Lovell theorem states that if the regression we are concerned with is:

:<math>Y = X_1 \beta_1 + X_2 \beta_2 + u,</math>

where <math>X_1</math> and <math>X_2</math> are <math>n \times k_1</math> and <math>n \times k_2</math> matrices respectively and where <math>\beta_1</math> and <math>\beta_2</math> are conformable, then the estimate of <math>\beta_2</math> will be the same as the estimate of it from a modified regression of the form:
:<math>M_{X_1} Y = M_{X_1} X_2 \beta_2 + M_{X_1} u,</math>

where <math>M_{X_1}</math> projects onto the orthogonal complement of the image of the projection matrix <math>X_1 (X_1^{\mathsf T} X_1)^{-1} X_1^{\mathsf T}</math>. Equivalently, <math>M_{X_1}</math> projects onto the orthogonal complement of the column space of <math>X_1</math>. Specifically,

:<math>M_{X_1} = I - X_1 (X_1^{\mathsf T} X_1)^{-1} X_1^{\mathsf T},</math>

and this particular orthogonal projection matrix is known as the residual maker matrix or annihilator matrix.
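The theorem can be checked numerically. The following is a minimal NumPy sketch (not part of the original article; the data and coefficient values are invented for illustration): it estimates <math>\beta_2</math> once from the full regression on <math>[X_1\ X_2]</math> and once from the partialled-out regression of <math>M_{X_1} Y</math> on <math>M_{X_1} X_2</math>, and confirms the two estimates coincide.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k1, k2 = 100, 3, 2
X1 = rng.normal(size=(n, k1))
X2 = rng.normal(size=(n, k2))
y = X1 @ np.array([1.0, -2.0, 0.5]) + X2 @ np.array([3.0, 4.0]) + rng.normal(size=n)

# Full regression: estimate (beta1, beta2) jointly by OLS on [X1 X2].
X = np.hstack([X1, X2])
beta_full = np.linalg.lstsq(X, y, rcond=None)[0]
beta2_full = beta_full[k1:]

# Residual maker (annihilator) matrix for X1: M = I - X1 (X1'X1)^{-1} X1'.
M1 = np.eye(n) - X1 @ np.linalg.solve(X1.T @ X1, X1.T)

# Partialled-out regression: regress M1 y on M1 X2.
beta2_fwl = np.linalg.lstsq(M1 @ X2, M1 @ y, rcond=None)[0]

print(np.allclose(beta2_full, beta2_fwl))  # True: the two estimates coincide
```

The agreement is exact up to floating-point error, as the theorem guarantees.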
The vector <math>M_{X_1} Y</math> is the vector of residuals from the regression of <math>Y</math> on the columns of <math>X_1</math>.
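This residual interpretation can also be sketched numerically (an illustration with invented data, not from the original article): applying <math>M_{X_1}</math> to any vector reproduces the residuals of an explicit OLS fit on <math>X_1</math>.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k1 = 50, 3
X1 = rng.normal(size=(n, k1))
y = rng.normal(size=n)

# Residual maker matrix for X1.
M1 = np.eye(n) - X1 @ np.linalg.solve(X1.T @ X1, X1.T)

# M1 @ y equals the residual vector from regressing y on X1.
coef = np.linalg.lstsq(X1, y, rcond=None)[0]
residuals = y - X1 @ coef
print(np.allclose(M1 @ y, residuals))  # True
```

As a projection, <math>M_{X_1}</math> is symmetric and idempotent, and it annihilates the columns of <math>X_1</math> (<math>M_{X_1} X_1 = 0</math>), which is why multiplying by it is equivalent to taking residuals.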
The most relevant consequence of the theorem is that the parameters in <math>\beta_2</math> do not apply to <math>X_2</math> but to <math>M_{X_1} X_2</math>, that is: the part of <math>X_2</math> uncorrelated with <math>X_1</math>. This is the basis for understanding the contribution of each single variable to a multivariate regression (see, for instance, Ch. 13 in ).
The theorem also implies that the secondary regression used for obtaining <math>M_{X_1}</math> is unnecessary when the predictor variables are uncorrelated (which rarely happens in practice): using projection matrices to make the explanatory variables orthogonal to each other will lead to the same results as running the regression with all non-orthogonal explanators included.
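The orthogonal case can be illustrated as follows (a sketch with constructed data, not from the original article): when the two regressor blocks are exactly orthogonal, the short regression of <math>Y</math> on <math>X_2</math> alone already recovers the same <math>\beta_2</math> as the full regression.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
# Build exactly orthogonal regressor blocks via a QR decomposition.
Q, _ = np.linalg.qr(rng.normal(size=(n, 5)))
X1, X2 = Q[:, :3], Q[:, 3:]  # X1' X2 = 0 by construction
y = X1 @ np.array([1.0, 2.0, 3.0]) + X2 @ np.array([-1.0, 4.0]) + rng.normal(size=n)

# With orthogonal blocks, the short regression of y on X2 alone
# yields the same beta2 as the full regression on [X1 X2].
beta_full = np.linalg.lstsq(np.hstack([X1, X2]), y, rcond=None)[0]
beta2_short = np.linalg.lstsq(X2, y, rcond=None)[0]
print(np.allclose(beta_full[3:], beta2_short))  # True
```

With observational data the blocks are essentially never exactly orthogonal, which is why partialling out via <math>M_{X_1}</math> is needed in general.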
History
It is not clear who first proved this theorem. However, in the context of linear regression, the result was known well before Frisch and Waugh's paper. In fact, it can be found in section 9, p. 184, of the detailed analysis of partial regressions by George Udny Yule, published in 1907. In this paper, Yule stresses the central role of the result in understanding the meaning of multiple and partial regression and correlation coefficients; see the first paragraph of section 10 on p. 184 of Yule's 1907 paper.
Yule's results were generally known by 1933, as Yule had included a detailed discussion of partial correlation, the novel partial correlation notation he introduced in 1907, and the "Frisch, Waugh and Lovell" theorem as chapter 10 of his quite successful statistics textbook, first issued in 1911, which by 1932 had reached its tenth edition.
Frisch quoted Yule's results on p. 389 of a 1931 paper with Mudgett. In that paper, Yule's formulas for partial regressions are quoted, and explicitly attributed to Yule, in order to correct misquotes of the same formulas by another author. In fact, while Yule is not explicitly mentioned in their 1933 paper, Frisch and Waugh use, for the partial regression coefficients, the notation first introduced by Yule in his 1907 paper and in general use by 1933.