In statistics, the residual sum of squares (RSS), also known as the sum of squared estimate of errors (SSE), is the sum of the squares of residuals (deviations of predicted from actual empirical values of data). It is a measure of the discrepancy between the data and an estimation model, such as a linear regression. A small RSS indicates a tight fit of the model to the data. It is used as an optimality criterion in parameter selection and model selection.
In general, total sum of squares = explained sum of squares + residual sum of squares. For a proof of this in the multivariate ordinary least squares (OLS) case, see partitioning in the general OLS model.
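As a quick numerical check of this decomposition (a minimal sketch, not from the original article: the data and the use of NumPy are illustrative assumptions; the identity holds for an OLS fit that includes an intercept):

```python
import numpy as np

# Illustrative data (arbitrary values chosen for the example)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# OLS fit with an intercept: design matrix [1, x]
X = np.column_stack([np.ones_like(x), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta_hat

tss = np.sum((y - y.mean()) ** 2)       # total sum of squares
ess = np.sum((y_hat - y.mean()) ** 2)   # explained sum of squares
rss = np.sum((y - y_hat) ** 2)          # residual sum of squares

print(tss, ess + rss)  # the two values agree up to floating-point rounding
```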
One explanatory variable
In a model with a single explanatory variable, RSS is given by:

: \operatorname{RSS} = \sum_{i=1}^n (y_i - f(x_i))^2

where y_i is the i-th value of the variable to be predicted, x_i is the i-th value of the explanatory variable, and f(x_i) is the predicted value of y_i (also termed \hat{y}_i).
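In code this is simply the sum of squared differences between observed and predicted values. A minimal sketch (the data and the quadratic model f below are purely illustrative choices, not from the article):

```python
import numpy as np

def rss(y, y_pred):
    """Residual sum of squares: sum over i of (y_i - f(x_i))**2."""
    y = np.asarray(y, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sum((y - y_pred) ** 2))

# Illustrative data and an illustrative model f
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.1, 1.2, 3.9, 9.2])
f = lambda t: t ** 2
print(rss(y, f(x)))
```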
In a standard linear simple regression model, y_i = \alpha + \beta x_i + \varepsilon_i, where \alpha and \beta are coefficients, y and x are the regressand and the regressor, respectively, and \varepsilon is the error term. The sum of squares of residuals is the sum of squares of \hat{\varepsilon}_i; that is

: \operatorname{RSS} = \sum_{i=1}^n \hat{\varepsilon}_i^2 = \sum_{i=1}^n \left(y_i - (\hat{\alpha} + \hat{\beta} x_i)\right)^2

where \hat{\alpha} is the estimated value of the constant term \alpha and \hat{\beta} is the estimated value of the slope coefficient \beta.
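A short sketch of this simple-regression case, estimating \hat{\alpha} and \hat{\beta} by least squares (np.polyfit is used here as one convenient way to obtain the estimates; the data are illustrative):

```python
import numpy as np

# Illustrative data
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.9, 4.1, 6.0, 8.2, 9.9])

# Least-squares estimates: polyfit returns [slope, intercept] for degree 1
beta_hat, alpha_hat = np.polyfit(x, y, deg=1)

residuals = y - (alpha_hat + beta_hat * x)   # estimated errors epsilon_hat_i
rss = np.sum(residuals ** 2)
print(alpha_hat, beta_hat, rss)
```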
Matrix expression for the OLS residual sum of squares
The general regression model with n observations and k explanators, the first of which is a constant unit vector whose coefficient is the regression intercept, is

: y = X \beta + e

where y is an n × 1 vector of dependent variable observations, each column of the n × k matrix X is a vector of observations on one of the k explanators, \beta is a k × 1 vector of true coefficients, and e is an n × 1 vector of the true underlying errors. The ordinary least squares estimator for \beta solves the normal equations

: X^{\operatorname{T}} X \hat{\beta} = X^{\operatorname{T}} y,

giving

: \hat{\beta} = (X^{\operatorname{T}} X)^{-1} X^{\operatorname{T}} y.

The residual vector is \hat{e} = y - X \hat{\beta} = y - X (X^{\operatorname{T}} X)^{-1} X^{\operatorname{T}} y; so the residual sum of squares is:

: \operatorname{RSS} = \hat{e}^{\operatorname{T}} \hat{e} = \| \hat{e} \|^2

(equivalent to the square of the Euclidean norm of the residual vector). In full:

: \operatorname{RSS} = y^{\operatorname{T}} y - y^{\operatorname{T}} X (X^{\operatorname{T}} X)^{-1} X^{\operatorname{T}} y = y^{\operatorname{T}} [I - X (X^{\operatorname{T}} X)^{-1} X^{\operatorname{T}}] y = y^{\operatorname{T}} [I - H] y,

where H is the hat matrix, or the projection matrix in linear regression.
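The matrix form translates directly into code. The following sketch (with randomly generated illustrative data) computes \hat{\beta} from the normal equations and checks that \hat{e}^{\operatorname{T}} \hat{e} agrees with y^{\operatorname{T}}(I - H)y:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 20, 3

# Design matrix whose first column is the constant unit vector
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.3, size=n)

# OLS estimator from the normal equations: (X^T X) beta_hat = X^T y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

e_hat = y - X @ beta_hat        # residual vector
rss = e_hat @ e_hat             # squared Euclidean norm of the residuals

# Same quantity via the hat (projection) matrix H = X (X^T X)^{-1} X^T
H = X @ np.linalg.solve(X.T @ X, X.T)
rss_via_hat = y @ (np.eye(n) - H) @ y

print(rss, rss_via_hat)  # equal up to floating-point rounding
```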
Relation with Pearson's product-moment correlation
The least-squares regression line is given by

: y = ax + b,

where b = \bar{y} - a\bar{x} and a = \frac{S_{xy}}{S_{xx}}, where S_{xy} = \sum_i (\bar{x} - x_i)(\bar{y} - y_i) and S_{xx} = \sum_i (\bar{x} - x_i)^2.

Therefore,

: \operatorname{RSS} = \sum_i (y_i - f(x_i))^2 = \sum_i (y_i - (a x_i + b))^2 = \sum_i (y_i - a x_i - \bar{y} + a\bar{x})^2 = \sum_i (a(\bar{x} - x_i) - (\bar{y} - y_i))^2 = a^2 S_{xx} - 2 a S_{xy} + S_{yy} = S_{yy} - a S_{xy} = S_{yy} \left(1 - \frac{S_{xy}^2}{S_{xx} S_{yy}}\right),

where S_{yy} = \sum_i (\bar{y} - y_i)^2.

The Pearson product-moment correlation is given by r = \frac{S_{xy}}{\sqrt{S_{xx} S_{yy}}}; therefore, \operatorname{RSS} = S_{yy}(1 - r^2).
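A brief numerical check of this relation (illustrative data; the correlation r is computed directly from the sums of squares rather than via a library call):

```python
import numpy as np

# Illustrative data
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([1.2, 1.9, 3.2, 4.1, 4.8, 6.3])

xbar, ybar = x.mean(), y.mean()
S_xx = np.sum((xbar - x) ** 2)
S_yy = np.sum((ybar - y) ** 2)
S_xy = np.sum((xbar - x) * (ybar - y))

a = S_xy / S_xx              # slope of the least-squares line
b = ybar - a * xbar          # intercept
rss = np.sum((y - (a * x + b)) ** 2)

r = S_xy / np.sqrt(S_xx * S_yy)      # Pearson product-moment correlation
print(rss, S_yy * (1 - r ** 2))      # the two values agree up to rounding
```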
See also
* Akaike information criterion#Comparison with least squares
* Chi-squared distribution#Applications
* Degrees of freedom (statistics)#Sum of squares and degrees of freedom
* Errors and residuals in statistics
* Lack-of-fit sum of squares
* Mean squared error
* Reduced chi-squared statistic, RSS per degree of freedom
* Squared deviations from the mean
* Sum of squares (statistics)