Polynomial Regression

In statistics, polynomial regression is a form of regression analysis in which the relationship between the independent variable ''x'' and the dependent variable ''y'' is modelled as an ''n''th degree polynomial in ''x''. Polynomial regression fits a nonlinear relationship between the value of ''x'' and the corresponding conditional mean of ''y'', denoted E(''y'' | ''x''). Although ''polynomial regression'' fits a nonlinear model to the data, as a statistical estimation problem it is linear, in the sense that the regression function E(''y'' | ''x'') is linear in the unknown parameters that are estimated from the data. For this reason, polynomial regression is considered to be a special case of multiple linear regression. The explanatory (independent) variables resulting from the polynomial expansion of the "baseline" variables are known as higher-degree terms. Such variables are also used in classification settings.


History

Polynomial regression models are usually fit using the method of least squares. The least-squares method minimizes the variance of the unbiased estimators of the coefficients, under the conditions of the Gauss–Markov theorem. The least-squares method was published in 1805 by Legendre and in 1809 by Gauss. The first design of an experiment for polynomial regression appeared in an 1815 paper of Gergonne. In the twentieth century, polynomial regression played an important role in the development of regression analysis, with a greater emphasis on issues of design and inference. More recently, the use of polynomial models has been complemented by other methods, with non-polynomial models having advantages for some classes of problems.


Definition and example

The goal of regression analysis is to model the expected value of a dependent variable ''y'' in terms of the value of an independent variable (or vector of independent variables) ''x''. In simple linear regression, the model

: y = \beta_0 + \beta_1 x + \varepsilon

is used, where ''ε'' is an unobserved random error with mean zero conditioned on a scalar variable ''x''. In this model, for each unit increase in the value of ''x'', the conditional expectation of ''y'' increases by ''β''1 units.

In many settings, such a linear relationship may not hold. For example, if we are modeling the yield of a chemical synthesis in terms of the temperature at which the synthesis takes place, we may find that the yield improves by increasing amounts for each unit increase in temperature. In this case, we might propose a quadratic model of the form

: y = \beta_0 + \beta_1 x + \beta_2 x^2 + \varepsilon.

In this model, when the temperature is increased from ''x'' to ''x'' + 1 units, the expected yield changes by \beta_1 + \beta_2(2x + 1). (This can be seen by replacing ''x'' in this equation with ''x'' + 1 and subtracting the equation in ''x'' from the equation in ''x'' + 1.) For infinitesimal changes in ''x'', the effect on ''y'' is given by the total derivative with respect to ''x'': \beta_1 + 2\beta_2 x. The fact that the change in yield depends on ''x'' is what makes the relationship between ''x'' and ''y'' nonlinear even though the model is linear in the parameters to be estimated.

In general, we can model the expected value of ''y'' as an ''n''th degree polynomial, yielding the general polynomial regression model

: y = \beta_0 + \beta_1 x + \beta_2 x^2 + \beta_3 x^3 + \cdots + \beta_n x^n + \varepsilon.

Conveniently, these models are all linear from the point of view of estimation, since the regression function is linear in terms of the unknown parameters ''β''0, ''β''1, .... Therefore, for least squares analysis, the computational and inferential problems of polynomial regression can be completely addressed using the techniques of multiple regression. This is done by treating ''x'', ''x''2, ... as being distinct independent variables in a multiple regression model.
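As a concrete illustration of this last point, the following sketch (assuming Python with NumPy; the synthetic data and coefficient values are invented for illustration, not taken from the text) fits the quadratic model by treating ''x'' and ''x''2 as two distinct regressors in an ordinary multiple regression:

```python
import numpy as np

# Minimal sketch: fit y = b0 + b1*x + b2*x^2 + e by ordinary least
# squares, treating x and x^2 as two separate independent variables.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 1.0 + 0.5 * x + 0.3 * x**2 + rng.normal(scale=2.0, size=x.size)

X = np.column_stack([np.ones_like(x), x, x**2])  # columns: 1, x, x^2
beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # OLS estimates b0, b1, b2

# Expected change in y when x increases by one unit, evaluated at x0:
# b1 + b2*(2*x0 + 1), matching the derivation above.
x0 = 3.0
print(beta, beta[1] + beta[2] * (2 * x0 + 1))
```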


Matrix form and calculation of estimates

The polynomial regression model

: y_i = \beta_0 + \beta_1 x_i + \beta_2 x_i^2 + \cdots + \beta_m x_i^m + \varepsilon_i \quad (i = 1, 2, \dots, n)

can be expressed in matrix form in terms of a design matrix \mathbf{X}, a response vector \vec{y}, a parameter vector \vec{\beta}, and a vector \vec{\varepsilon} of random errors. The ''i''-th row of \mathbf{X} and \vec{y} will contain the ''x'' and ''y'' value for the ''i''-th data sample. Then the model can be written as a system of linear equations:

: \begin{pmatrix} y_1 \\ y_2 \\ y_3 \\ \vdots \\ y_n \end{pmatrix} = \begin{pmatrix} 1 & x_1 & x_1^2 & \dots & x_1^m \\ 1 & x_2 & x_2^2 & \dots & x_2^m \\ 1 & x_3 & x_3^2 & \dots & x_3^m \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & x_n & x_n^2 & \dots & x_n^m \end{pmatrix} \begin{pmatrix} \beta_0 \\ \beta_1 \\ \beta_2 \\ \vdots \\ \beta_m \end{pmatrix} + \begin{pmatrix} \varepsilon_1 \\ \varepsilon_2 \\ \varepsilon_3 \\ \vdots \\ \varepsilon_n \end{pmatrix},

which in pure matrix notation is written as

: \vec{y} = \mathbf{X} \vec{\beta} + \vec{\varepsilon}.

The vector of estimated polynomial regression coefficients (using ordinary least squares estimation) is

: \widehat{\vec{\beta}} = (\mathbf{X}^\mathsf{T} \mathbf{X})^{-1} \mathbf{X}^\mathsf{T} \vec{y},

assuming ''m'' < ''n'', which is required for the matrix to be invertible; then, since \mathbf{X} is a Vandermonde matrix, the invertibility condition is guaranteed to hold if all the x_i values are distinct. This is the unique least-squares solution.
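A minimal sketch of this calculation, assuming Python with NumPy (the sample data are invented): it builds the Vandermonde design matrix and solves the normal equations directly. In practice a QR-based solver such as numpy.linalg.lstsq is preferred over explicit inversion, since Vandermonde matrices are often ill-conditioned.

```python
import numpy as np

def polyfit_ols(x, y, m):
    """Degree-m polynomial regression coefficients via ordinary least squares.

    Builds the n x (m+1) Vandermonde design matrix X with rows
    (1, x_i, x_i^2, ..., x_i^m) and solves the normal equations
    (X^T X) beta = X^T y. Requires m < n and distinct x_i values.
    """
    X = np.vander(x, N=m + 1, increasing=True)  # Vandermonde design matrix
    return np.linalg.solve(X.T @ X, X.T @ y)    # normal-equations solution

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.0, 4.8, 10.1, 17.0])
print(polyfit_ols(x, y, m=2))  # estimates (beta_0, beta_1, beta_2)
```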


Interpretation

Although polynomial regression is technically a special case of multiple linear regression, the interpretation of a fitted polynomial regression model requires a somewhat different perspective. It is often difficult to interpret the individual coefficients in a polynomial regression fit, since the underlying monomials can be highly correlated. For example, ''x'' and ''x''2 have correlation around 0.97 when ''x'' is uniformly distributed on the interval (0, 1). Although the correlation can be reduced by using orthogonal polynomials, it is generally more informative to consider the fitted regression function as a whole. Point-wise or simultaneous confidence bands can then be used to provide a sense of the uncertainty in the estimate of the regression function.
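The correlation figure quoted above is easy to check numerically; a quick Monte Carlo sketch (assuming Python with NumPy; the sample size is an arbitrary choice):

```python
import numpy as np

# Estimate corr(x, x^2) for x ~ Uniform(0, 1) by simulation.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=1_000_000)
print(np.corrcoef(x, x**2)[0, 1])  # approximately 0.97
```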


Alternative approaches

Polynomial regression is one example of regression analysis using basis functions to model a functional relationship between two quantities. More specifically, it replaces x \in \mathbb{R}^{d_x} in linear regression with a polynomial basis \varphi(x) \in \mathbb{R}^{d_\varphi}, e.g. [1, x] \xrightarrow{\varphi} [1, x, x^2, \ldots, x^d]. A drawback of polynomial bases is that the basis functions are "non-local", meaning that the fitted value of ''y'' at a given value ''x'' = ''x''0 depends strongly on data values with ''x'' far from ''x''0. In modern statistics, polynomial basis functions are used along with new families of basis functions, such as splines, radial basis functions, and wavelets. These families of basis functions offer a more parsimonious fit for many types of data.

The goal of polynomial regression is to model a non-linear relationship between the independent and dependent variables (technically, between the independent variable and the conditional mean of the dependent variable). This is similar to the goal of nonparametric regression, which aims to capture non-linear regression relationships. Therefore, non-parametric regression approaches such as smoothing can be useful alternatives to polynomial regression. Some of these methods make use of a localized form of classical polynomial regression. An advantage of traditional polynomial regression is that the inferential framework of multiple regression can be used (this also holds when using other families of basis functions such as splines).

A final alternative is to use kernelized models such as support vector regression with a polynomial kernel. If residuals have unequal variance, a weighted least squares estimator may be used to account for that.
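To illustrate the kernelized approach, the following sketch (assuming Python with NumPy) uses kernel ridge regression with a polynomial kernel k(x, z) = (xz + c)^d. The text mentions support vector regression; plain kernel ridge regression is shown here instead for brevity, and the constants c, d, the penalty lam, and the sample data are illustrative choices, not values from the text.

```python
import numpy as np

def poly_kernel(a, b, c=1.0, d=3):
    """Polynomial kernel k(x, z) = (x*z + c)^d for 1-D inputs."""
    return (np.outer(a, b) + c) ** d

def krr_fit(x, y, lam=1e-2, c=1.0, d=3):
    """Kernel ridge regression: solve (K + lam*I) alpha = y."""
    K = poly_kernel(x, x, c, d)
    return np.linalg.solve(K + lam * np.eye(len(x)), y)

def krr_predict(x_new, x_train, alpha, c=1.0, d=3):
    return poly_kernel(x_new, x_train, c, d) @ alpha

rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 30)
y = x**3 - x + rng.normal(scale=0.05, size=x.size)  # invented data
alpha = krr_fit(x, y)
print(krr_predict(np.array([0.0, 0.5]), x, alpha))
```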


See also

* Curve fitting
* Linear regression
* Local polynomial regression
* Polynomial and rational function modeling
* Polynomial interpolation
* Response surface methodology
* Smoothing spline


Notes

* Microsoft Excel makes use of polynomial regression when fitting a trendline to data points on an XY scatter plot.




External links


Curve Fitting, PhET Interactive Simulations, University of Colorado Boulder