In mathematics, a moment matrix is a special symmetric square matrix whose rows and columns are indexed by monomials. The entries of the matrix depend only on the product of the indexing monomials (cf. Hankel matrix). Moment matrices play an important role in polynomial fitting, polynomial optimization (since positive semidefinite moment matrices correspond to polynomials that are sums of squares) and econometrics.
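As a concrete illustration, the entries of a univariate moment matrix are the moments m_{i+j} of a measure, so the matrix has Hankel structure, and it is positive semidefinite whenever the measure is nonnegative. The following sketch builds such a matrix for a discrete measure; the function name and the choice of points and weights are illustrative, not from any particular library.

```python
import numpy as np

def moment_matrix(points, weights, d):
    """Moment matrix of order d for the discrete measure sum_j w_j * delta_{x_j}.

    Rows and columns are indexed by the monomials 1, x, ..., x^d, and the
    (i, j) entry is m_{i+j}, the (i+j)-th moment of the measure -- so the
    entry depends only on the product x^i * x^j, giving a Hankel matrix.
    """
    # Moments m_0, ..., m_{2d} of the measure
    m = [np.sum(weights * points**k) for k in range(2 * d + 1)]
    return np.array([[m[i + j] for j in range(d + 1)] for i in range(d + 1)])

# Example: unit weights on the points {-1, 0, 1}
pts = np.array([-1.0, 0.0, 1.0])
w = np.array([1.0, 1.0, 1.0])
M = moment_matrix(pts, w, d=2)
# M is symmetric and Hankel; it is also positive semidefinite because
# the underlying measure is nonnegative.
```

Here the odd moments vanish by symmetry of the points, so the matrix alternates zeros off the main anti-diagonals.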


Application in regression

A multiple linear regression model can be written as

:y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_k x_k + u

where y is the explained variable, x_1, x_2, \dots, x_k are the explanatory variables, u is the error, and \beta_0, \beta_1, \dots, \beta_k are unknown coefficients to be estimated. Given observations \left\{ y_i, x_{i1}, x_{i2}, \dots, x_{ik} \right\}_{i=1}^{n}, we have a system of n linear equations that can be expressed in matrix notation:

:\begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix} = \begin{bmatrix} 1 & x_{11} & x_{12} & \dots & x_{1k} \\ 1 & x_{21} & x_{22} & \dots & x_{2k} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & x_{n1} & x_{n2} & \dots & x_{nk} \end{bmatrix} \begin{bmatrix} \beta_0 \\ \beta_1 \\ \vdots \\ \beta_k \end{bmatrix} + \begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{bmatrix}

or

:\mathbf{y} = \mathbf{X} \boldsymbol{\beta} + \mathbf{u}

where \mathbf{y} and \mathbf{u} are each a vector of dimension n \times 1, \mathbf{X} is the design matrix of order n \times (k+1), and \boldsymbol{\beta} is a vector of dimension (k+1) \times 1. Under the Gauss–Markov assumptions, the best linear unbiased estimator of \boldsymbol{\beta} is the linear least squares estimator

:\mathbf{b} = \left( \mathbf{X}^\mathsf{T} \mathbf{X} \right)^{-1} \mathbf{X}^\mathsf{T} \mathbf{y},

involving the two moment matrices \mathbf{X}^\mathsf{T} \mathbf{X} and \mathbf{X}^\mathsf{T} \mathbf{y} defined as

:\mathbf{X}^\mathsf{T} \mathbf{X} = \begin{bmatrix} n & \sum x_{i1} & \sum x_{i2} & \dots & \sum x_{ik} \\ \sum x_{i1} & \sum x_{i1}^2 & \sum x_{i1} x_{i2} & \dots & \sum x_{i1} x_{ik} \\ \sum x_{i2} & \sum x_{i1} x_{i2} & \sum x_{i2}^2 & \dots & \sum x_{i2} x_{ik} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \sum x_{ik} & \sum x_{i1} x_{ik} & \sum x_{i2} x_{ik} & \dots & \sum x_{ik}^2 \end{bmatrix}

and

:\mathbf{X}^\mathsf{T} \mathbf{y} = \begin{bmatrix} \sum y_i \\ \sum x_{i1} y_i \\ \vdots \\ \sum x_{ik} y_i \end{bmatrix}

where \mathbf{X}^\mathsf{T} \mathbf{X} is a square normal matrix of dimension (k+1) \times (k+1), and \mathbf{X}^\mathsf{T} \mathbf{y} is a vector of dimension (k+1) \times 1.
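The estimator above can be computed directly from the two moment matrices. The sketch below simulates data from an assumed model with illustrative coefficients (2, 3, -1), forms the moment matrices, and solves the normal equations; solving the linear system is preferred in practice over explicitly inverting the matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data from y = 2 + 3*x1 - x2 + u (coefficients are illustrative)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
u = 0.1 * rng.normal(size=n)
y = 2.0 + 3.0 * x1 - 1.0 * x2 + u

# Design matrix X with an intercept column, of order n x (k+1)
X = np.column_stack([np.ones(n), x1, x2])

# The two moment matrices from the text
XtX = X.T @ X  # (k+1) x (k+1): sums of squares and cross products
Xty = X.T @ y  # (k+1) x 1:     cross products with y

# Least squares estimator: solve (X'X) b = X'y for b
beta_hat = np.linalg.solve(XtX, Xty)
```

With a small error variance, beta_hat recovers the true coefficients closely; the first entry of XtX is n and the first entry of Xty is the sum of the y_i, matching the matrices displayed above.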


See also

* Design matrix
* Gramian matrix
* Projection matrix

