Generalized Linear Array Model
In statistics, the generalized linear array model (GLAM) is used for analyzing data sets with array structures. It is based on the generalized linear model with the design matrix written as a Kronecker product.


Overview

The generalized linear array model or GLAM was introduced in 2006. Such models provide a structure and a computational procedure for fitting generalized linear models or GLMs whose model matrix can be written as a Kronecker product and whose data can be written as an array. In a large GLM, the GLAM approach gives very substantial savings in both storage and computational time over the usual GLM algorithm.

Suppose that the data \mathbf Y is arranged in a d-dimensional array with size n_1\times n_2\times\dots\times n_d; thus, the corresponding data vector \mathbf y = \operatorname{vec}(\mathbf Y) has size n_1n_2n_3\cdots n_d. Suppose also that the design matrix is of the form
:\mathbf X = \mathbf X_d\otimes\mathbf X_{d-1}\otimes\dots\otimes\mathbf X_1.
The standard analysis of a GLM with data vector \mathbf y and design matrix \mathbf X proceeds by repeated evaluation of the scoring algorithm
: \mathbf X'\tilde{\mathbf W}_\delta\mathbf X\hat{\boldsymbol\theta} = \mathbf X'\tilde{\mathbf W}_\delta\tilde{\mathbf z},
where \tilde{\boldsymbol\theta} represents the approximate solution of \boldsymbol\theta, and \hat{\boldsymbol\theta} is the improved value of it; \mathbf W_\delta is the diagonal weight matrix with elements
: w_{ii}^{-1} = \left(\frac{\partial\eta_i}{\partial\mu_i}\right)^2\operatorname{var}(y_i),
and
:\mathbf z = \boldsymbol\eta + \mathbf W_\delta^{-1}(\mathbf y - \boldsymbol\mu)
is the working variable.

Computationally, GLAM provides array algorithms to calculate the linear predictor,
: \boldsymbol\eta = \mathbf X \boldsymbol\theta ,
and the weighted inner product
: \mathbf X'\tilde{\mathbf W}_\delta\mathbf X ,
without evaluation of the model matrix \mathbf X.
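As an illustration of these array algorithms, the following NumPy sketch (the function names rho and linear_predictor are illustrative, not taken from the original paper) computes the linear predictor by multiplying the coefficient array by each marginal matrix \mathbf X_k in turn and rotating the array dimensions after each multiplication, then checks the result against the explicit Kronecker-product model matrix on a small example.

import numpy as np

def rho(X, A):
    """Multiply the first dimension of array A by X, then rotate that
    dimension to the end (one step of the array computation)."""
    B = np.tensordot(X, A, axes=([1], [0]))  # contract over the first axis of A
    return np.moveaxis(B, 0, -1)             # rotate the new axis to the back

def linear_predictor(Xs, Theta):
    """Compute eta = (X_d kron ... kron X_1) vec(Theta), returned as a
    d-dimensional array, without forming the Kronecker product."""
    A = Theta
    for X in Xs:                             # Xs = [X_1, X_2, ..., X_d]
        A = rho(X, A)
    return A                                 # shape (n_1, n_2, ..., n_d)

# Small check against the explicit model matrix (column-major vec).
rng = np.random.default_rng(0)
X1, X2, X3 = rng.normal(size=(4, 2)), rng.normal(size=(5, 3)), rng.normal(size=(6, 2))
Theta = rng.normal(size=(2, 3, 2))
eta = linear_predictor([X1, X2, X3], Theta)
X = np.kron(X3, np.kron(X2, X1))
assert np.allclose(eta.ravel(order="F"), X @ Theta.ravel(order="F"))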


Example

In 2 dimensions, let \mathbf X = \mathbf X_2\otimes\mathbf X_1; then the linear predictor is written \mathbf X_1 \boldsymbol\Theta \mathbf X_2' , where \boldsymbol\Theta is the matrix of coefficients, and the weighted inner product is obtained from G(\mathbf X_1)' \mathbf W G(\mathbf X_2) , where \mathbf W is the matrix of weights. Here G(\mathbf M) is the row tensor function of the r \times c matrix \mathbf M , given by
:G(\mathbf M) = (\mathbf M \otimes \mathbf 1') \circ (\mathbf 1' \otimes \mathbf M) ,
where \circ means element by element multiplication and \mathbf 1 is a vector of 1's of length c.

The row tensor function G(\mathbf M) of the r \times c matrix \mathbf M is an example of the face-splitting product of matrices, which was proposed by Vadym Slyusar in 1996:
: \mathbf M \bull \mathbf M = \left(\mathbf M \otimes \mathbf 1^\textsf{T}\right) \circ \left(\mathbf 1^\textsf{T} \otimes \mathbf M\right) ,
where \bull means the face-splitting product.

These low-storage, high-speed formulae extend to d dimensions.
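The two-dimensional formulae above can be sketched in the same way (again in NumPy, with illustrative names row_tensor and weighted_inner_product; the particular reshaping used to rearrange G(\mathbf X_1)' \mathbf W G(\mathbf X_2) into \mathbf X'\mathbf W_\delta\mathbf X is one possible convention assumed here, not necessarily the one used in the original paper).

import numpy as np

def row_tensor(M):
    """Row tensor function G(M) = (M kron 1') o (1' kron M)."""
    one = np.ones((1, M.shape[1]))
    return np.kron(M, one) * np.kron(one, M)

def weighted_inner_product(X1, X2, W):
    """Return X' W_delta X for X = X2 kron X1 and W_delta = diag(vec(W)),
    computed from G(X1)' W G(X2) by reshaping and permuting axes."""
    c1, c2 = X1.shape[1], X2.shape[1]
    H = row_tensor(X1).T @ W @ row_tensor(X2)  # (c1*c1) x (c2*c2)
    H = H.reshape(c1, c1, c2, c2)              # axes (j1, k1, j2, k2)
    H = H.transpose(2, 0, 3, 1)                # axes (j2, j1, k2, k1)
    return H.reshape(c1 * c2, c1 * c2)

# Small check against the explicit model matrix and diagonal weight matrix.
rng = np.random.default_rng(1)
X1, X2 = rng.normal(size=(5, 2)), rng.normal(size=(6, 3))
W = rng.random(size=(5, 6))                    # weights arranged as the data array
X = np.kron(X2, X1)
direct = X.T @ np.diag(W.ravel(order="F")) @ X
assert np.allclose(weighted_inner_product(X1, X2, W), direct)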


Applications

GLAM is designed to be used in d-dimensional smoothing problems where the data are arranged in an array and the smoothing matrix is constructed as a Kronecker product of d one-dimensional smoothing matrices.
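In such problems only the one-dimensional basis matrices need to be stored rather than their Kronecker product; a rough back-of-the-envelope comparison, with purely hypothetical grid and basis sizes:

# Hypothetical 2-d smoothing problem: a 500 x 400 data grid smoothed with
# 1-d basis matrices having 20 and 15 columns respectively.
n1, n2, c1, c2 = 500, 400, 20, 15

full_entries = (n1 * n2) * (c1 * c2)  # X = X_2 kron X_1: 60,000,000 entries
glam_entries = n1 * c1 + n2 * c2      # X_1 and X_2 stored separately: 16,000 entries
print(full_entries, glam_entries)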

