In regression, '''mean response''' (or '''expected response''') and '''predicted response''', also known as '''mean outcome''' (or '''expected outcome''') and '''predicted outcome''', are values of the dependent variable calculated from the regression parameters and a given value of the independent variable. The values of these two responses are the same, but their calculated variances are different.
The concept is a generalization of the distinction between the standard error of the mean and the sample standard deviation.
Background
In straight line fitting, the model is
:<math>y_i = \alpha + \beta x_i + \varepsilon_i\,</math>
where ''y''<sub>''i''</sub> is the response variable, ''x''<sub>''i''</sub> is the explanatory variable, ''ε''<sub>''i''</sub> is the random error, and ''α'' and ''β'' are parameters. The mean, and predicted, response value for a given explanatory value, ''x''<sub>''d''</sub>, is given by
:<math>\hat{y}_d = \hat\alpha + \hat\beta x_d ,</math>
while the actual response would be
:<math>y_d = \alpha + \beta x_d + \varepsilon_d . \,</math>
Expressions for the values and variances of <math>\hat\alpha</math> and <math>\hat\beta</math> are given in linear regression.
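The straight-line fit and the response value at a new point can be sketched in a few lines of Python. This is a minimal illustration with made-up data; the closed-form slope and intercept below are the standard least-squares estimates.

```python
# Least-squares fit of y = a + b*x, then the mean/predicted response at a
# new explanatory value x_d.  Data and x_d are illustrative assumptions.
def fit_line(xs, ys):
    m = len(xs)
    xbar = sum(xs) / m
    ybar = sum(ys) / m
    sxx = sum((x - xbar) ** 2 for x in xs)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    b_hat = sxy / sxx            # slope estimate (beta hat)
    a_hat = ybar - b_hat * xbar  # intercept estimate (alpha hat)
    return a_hat, b_hat

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
a_hat, b_hat = fit_line(xs, ys)

x_d = 3.5
y_d_hat = a_hat + b_hat * x_d  # same value for mean and predicted response
```

Note that the mean response and the predicted response at `x_d` are the same number; only their variances, derived below, differ.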
Variance of the mean response
Since the data in this context is defined to be (''x'', ''y'') pairs for every observation, the ''mean response'' at a given value of ''x'', say ''x''<sub>''d''</sub>, is an estimate of the mean of the ''y'' values in the population at the ''x'' value of ''x''<sub>''d''</sub>, that is <math>\hat{E}(y \mid x_d) \equiv \hat{y}_d</math>. The variance of the mean response is given by
:<math>\operatorname{Var}\left(\hat{\alpha} + \hat{\beta}x_d\right) = \operatorname{Var}\left(\hat{\alpha}\right) + \left(\operatorname{Var}\hat{\beta}\right)x_d^2 + 2 x_d \operatorname{Cov}\left(\hat{\alpha}, \hat{\beta}\right).</math>
This expression can be simplified to
:<math>\operatorname{Var}\left(\hat{\alpha} + \hat{\beta}x_d\right) = \sigma^2\left(\frac{1}{m} + \frac{\left(x_d - \bar{x}\right)^2}{\sum_{i=1}^m \left(x_i - \bar{x}\right)^2}\right),</math>
where ''m'' is the number of data points.
To demonstrate this simplification, one can make use of the identity
:<math>\sum_{i=1}^m \left(x_i - \bar{x}\right)^2 = \sum_{i=1}^m x_i^2 - \frac{1}{m}\left(\sum_{i=1}^m x_i\right)^2 .</math>
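The simplification can be checked numerically. The sketch below uses the standard OLS variance and covariance formulas for the intercept and slope (Var(α̂) = σ²(1/m + x̄²/S<sub>xx</sub>), Var(β̂) = σ²/S<sub>xx</sub>, Cov(α̂, β̂) = −σ²x̄/S<sub>xx</sub>); the data, σ², and x<sub>d</sub> are made-up illustration values.

```python
# Check that the expanded variance of the mean response equals the
# simplified form sigma^2 * (1/m + (x_d - xbar)^2 / Sxx).
xs = [1.0, 2.0, 4.0, 7.0, 11.0]
sigma2 = 2.5   # assumed error variance (illustrative)
x_d = 5.0      # point at which the mean response is evaluated
m = len(xs)
xbar = sum(xs) / m
sxx = sum((x - xbar) ** 2 for x in xs)

var_b = sigma2 / sxx                         # Var(beta hat)
var_a = sigma2 * (1 / m + xbar ** 2 / sxx)   # Var(alpha hat)
cov_ab = -sigma2 * xbar / sxx                # Cov(alpha hat, beta hat)

expanded = var_a + var_b * x_d ** 2 + 2 * x_d * cov_ab
simplified = sigma2 * (1 / m + (x_d - xbar) ** 2 / sxx)
assert abs(expanded - simplified) < 1e-12
```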
Variance of the predicted response
The ''predicted response'' distribution is the predicted distribution of the residuals at the given point ''x''<sub>''d''</sub>. So the variance is given by
:<math>\begin{align}
\operatorname{Var}\left(y_d - \left[\hat{\alpha} + \hat{\beta}x_d\right]\right) &= \operatorname{Var}(y_d) + \operatorname{Var}\left(\hat{\alpha} + \hat{\beta}x_d\right) - 2\operatorname{Cov}\left(y_d, \left[\hat{\alpha} + \hat{\beta}x_d\right]\right) \\
&= \operatorname{Var}(y_d) + \operatorname{Var}\left(\hat{\alpha} + \hat{\beta}x_d\right).
\end{align}</math>
The second line follows from the fact that <math>\operatorname{Cov}\left(y_d, \left[\hat{\alpha} + \hat{\beta}x_d\right]\right)</math> is zero because the new prediction point is independent of the data used to fit the model. Additionally, the term <math>\operatorname{Var}\left(\hat{\alpha} + \hat{\beta}x_d\right)</math> was calculated earlier for the mean response.
Since <math>\operatorname{Var}(y_d) = \sigma^2</math> (a fixed but unknown parameter that can be estimated), the variance of the predicted response is given by
:<math>\begin{align}
\operatorname{Var}\left(y_d - \left[\hat{\alpha} + \hat{\beta}x_d\right]\right) &= \sigma^2 + \sigma^2\left(\frac{1}{m} + \frac{\left(x_d - \bar{x}\right)^2}{\sum_{i=1}^m \left(x_i - \bar{x}\right)^2}\right) \\
&= \sigma^2\left(1 + \frac{1}{m} + \frac{\left(x_d - \bar{x}\right)^2}{\sum_{i=1}^m \left(x_i - \bar{x}\right)^2}\right).
\end{align}</math>
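A quick numerical illustration of the relationship between the two variances, using the same made-up data, σ², and x<sub>d</sub> as assumptions: the predicted-response variance exceeds the mean-response variance by exactly one extra σ², the variance of the new observation's own error.

```python
# Mean-response vs predicted-response variance for a straight-line fit.
# Data, sigma^2, and x_d are illustrative assumptions.
xs = [1.0, 2.0, 4.0, 7.0, 11.0]
sigma2 = 2.5
x_d = 5.0
m = len(xs)
xbar = sum(xs) / m
sxx = sum((x - xbar) ** 2 for x in xs)

var_mean = sigma2 * (1 / m + (x_d - xbar) ** 2 / sxx)
var_pred = sigma2 * (1 + 1 / m + (x_d - xbar) ** 2 / sxx)
# The predicted response carries one full extra sigma^2 of variance.
assert abs(var_pred - (var_mean + sigma2)) < 1e-12
```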
Confidence intervals
The <math>100(1 - \alpha)\%</math> confidence intervals are computed as <math>y_d \pm t_{\frac{\alpha}{2}, m - 2}\sqrt{\operatorname{Var}}</math>. Thus, the confidence interval for the predicted response is wider than the interval for the mean response. This is expected intuitively – the variance of the population of ''y'' values does not shrink when one samples from it, because the random variable ''ε''<sub>''i''</sub> does not decrease, but the variance of the mean of the ''y'' values does shrink with increased sampling, because the variances in <math>\hat\alpha</math> and <math>\hat\beta</math> decrease, so the mean response becomes closer to <math>\alpha + \beta x_d</math>.
This is analogous to the difference between the variance of a population and the variance of the sample mean of a population: the variance of a population is a parameter and does not change, but the variance of the sample mean decreases with increased samples.
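The two interval widths can be compared directly. In the sketch below the data, σ², and x<sub>d</sub> are made-up assumptions, and the ''t'' quantile is a fixed value taken from a standard ''t'' table (t<sub>0.025, 3</sub> for m = 5 points, i.e. m − 2 = 3 degrees of freedom) rather than computed.

```python
import math

# Half-widths of 95% confidence intervals for the mean response and the
# predicted response.  Data, sigma^2, x_d, and the t quantile are assumptions.
xs = [1.0, 2.0, 4.0, 7.0, 11.0]
sigma2 = 2.5
x_d = 5.0
m = len(xs)
xbar = sum(xs) / m
sxx = sum((x - xbar) ** 2 for x in xs)

var_mean = sigma2 * (1 / m + (x_d - xbar) ** 2 / sxx)
var_pred = sigma2 * (1 + 1 / m + (x_d - xbar) ** 2 / sxx)

t_crit = 3.182  # t_{0.025, 3} from a t table (fixed, not computed here)
half_mean = t_crit * math.sqrt(var_mean)  # mean-response half-width
half_pred = t_crit * math.sqrt(var_pred)  # predicted-response half-width
assert half_pred > half_mean  # the prediction interval is always wider
```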
General linear regression
The general linear model can be written as
:<math>y_i = \sum_{j=1}^n X_{ij}\beta_j + \varepsilon_i . \,</math>
Therefore, since <math>y_d = \sum_{j=1}^n X_{dj}\hat\beta_j ,</math> the general expression for the variance of the mean response is
:<math>\operatorname{Var}\left(\sum_{j=1}^n X_{dj}\hat\beta_j\right) = \sum_{i=1}^n \sum_{j=1}^n X_{di} S_{ij} X_{dj},</math>
where ''S'' is the covariance matrix of the parameters, given by
:<math>\mathbf{S} = \sigma^2\left(\mathbf{X}^\mathsf{T}\mathbf{X}\right)^{-1}.</math>
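For the straight-line model viewed as a general linear model with design columns [1, ''x''], the quadratic form above reduces to the scalar formula from the mean-response section. The sketch below verifies this with pure-Python 2×2 matrix arithmetic; the data, σ², and x<sub>d</sub> are made-up assumptions.

```python
# Matrix form of the mean-response variance: x_d^T S x_d with
# S = sigma^2 (X^T X)^{-1}, checked against the scalar formula.
xs = [1.0, 2.0, 4.0, 7.0, 11.0]
sigma2 = 2.5
x_d = 5.0
m = len(xs)

# X^T X for the design matrix X = [[1, x_1], ..., [1, x_m]]
sx = sum(xs)
sxx_raw = sum(x * x for x in xs)
det = m * sxx_raw - sx * sx
# Inverse of the 2x2 matrix X^T X, then S = sigma^2 (X^T X)^{-1}
inv = [[sxx_raw / det, -sx / det], [-sx / det, m / det]]
S = [[sigma2 * v for v in row] for row in inv]

xd_vec = [1.0, x_d]  # design row for the new point
var_matrix = sum(xd_vec[i] * S[i][j] * xd_vec[j]
                 for i in range(2) for j in range(2))

# Scalar formula from the mean-response section, for comparison
xbar = sx / m
sxx_c = sum((x - xbar) ** 2 for x in xs)
var_scalar = sigma2 * (1 / m + (x_d - xbar) ** 2 / sxx_c)
assert abs(var_matrix - var_scalar) < 1e-9
```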
See also
* Expected value
* Prediction error
* Regression prediction
{{DEFAULTSORT:Mean And Predicted Response}}
[[Category:Regression analysis]]