Tucker decomposition

In mathematics, Tucker decomposition decomposes a tensor into a set of matrices and one small core tensor. It is named after Ledyard R. Tucker, although it goes back to Hitchcock in 1927. Initially described as a three-mode extension of factor analysis and principal component analysis, it may be generalized to higher-mode analysis, which is also called higher-order singular value decomposition (HOSVD).

It may be regarded as a more flexible PARAFAC (parallel factor analysis) model: in PARAFAC the core tensor is restricted to be "diagonal".

In practice, Tucker decomposition is used as a modelling tool. For instance, it is used to model three-way (or higher-way) data by means of relatively small numbers of components for each of the three or more modes, and the components are linked to each other by a three- (or higher-) way core array. The model parameters are estimated in such a way that, given fixed numbers of components, the modelled data optimally resemble the actual data in the least-squares sense. The model gives a summary of the information in the data, in the same way as principal components analysis does for two-way data.

For a 3rd-order tensor T \in F^{n_1 \times n_2 \times n_3}, where F is either \mathbb{R} or \mathbb{C}, Tucker decomposition can be denoted as

: T = \mathcal{T} \times_1 U^{(1)} \times_2 U^{(2)} \times_3 U^{(3)},

where \mathcal{T} \in F^{d_1 \times d_2 \times d_3} is the ''core tensor'', a 3rd-order tensor that contains the 1-mode, 2-mode and 3-mode singular values of T, which are defined as the ''Frobenius norms'' of the 1-mode, 2-mode and 3-mode slices of the tensor \mathcal{T} respectively. U^{(1)}, U^{(2)}, U^{(3)} are unitary matrices in F^{d_1 \times n_1}, F^{d_2 \times n_2}, F^{d_3 \times n_3} respectively.

The ''j''-mode product (''j'' = 1, 2, 3) of \mathcal{T} by U^{(j)} is denoted \mathcal{T} \times_j U^{(j)}, with entries

:\begin{align}
(\mathcal{T} \times_1 U^{(1)})(n_1, d_2, d_3) &= \sum_{i_1=1}^{d_1} \mathcal{T}(i_1, d_2, d_3)\, U^{(1)}(i_1, n_1) \\
(\mathcal{T} \times_2 U^{(2)})(d_1, n_2, d_3) &= \sum_{i_2=1}^{d_2} \mathcal{T}(d_1, i_2, d_3)\, U^{(2)}(i_2, n_2) \\
(\mathcal{T} \times_3 U^{(3)})(d_1, d_2, n_3) &= \sum_{i_3=1}^{d_3} \mathcal{T}(d_1, d_2, i_3)\, U^{(3)}(i_3, n_3)
\end{align}

Taking d_i = n_i for all i is always sufficient to represent T exactly, but often T can be compressed, or efficiently approximated, by choosing d_i < n_i. A common choice is d_1 = d_2 = d_3 = \min(n_1, n_2, n_3), which can be effective when the difference in dimension sizes is large.

There are two special cases of Tucker decomposition:

* Tucker1: if U^{(2)} and U^{(3)} are identity, then T = \mathcal{T} \times_1 U^{(1)}.
* Tucker2: if U^{(3)} is identity, then T = \mathcal{T} \times_1 U^{(1)} \times_2 U^{(2)}.

RESCAL decomposition can be seen as a special case of Tucker where U^{(3)} is identity and U^{(1)} is equal to U^{(2)}.
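The definitions above can be sketched in code. The following is a minimal illustration, not a reference implementation: it computes the ''j''-mode product via tensor contraction and a truncated higher-order SVD (factor matrices from the SVD of each mode unfolding, core obtained by projecting T onto the factor subspaces). Note that, following the common numpy convention, the factor matrices here are stored with shape (n_j, d_j), i.e. transposed relative to the F^{d_j \times n_j} convention in the text; all function and variable names are illustrative.

```python
import numpy as np

def mode_unfold(T, mode):
    # Mode-j unfolding: move axis `mode` to the front, flatten the rest
    # into columns, giving an (n_j, prod of other dims) matrix.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_product(T, U, mode):
    # j-mode product: contract the columns of U (shape (n_out, n_in))
    # against axis `mode` of T (whose size is n_in), then restore axis order.
    return np.moveaxis(np.tensordot(U, T, axes=(1, mode)), 0, mode)

def tucker_hosvd(T, ranks):
    # Truncated HOSVD: U^(j) = leading ranks[j] left singular vectors
    # of the mode-j unfolding; the core is T projected onto these bases.
    factors = []
    for mode, d in enumerate(ranks):
        Umat, _, _ = np.linalg.svd(mode_unfold(T, mode), full_matrices=False)
        factors.append(Umat[:, :d])                    # shape (n_j, d_j)
    core = T
    for mode, U in enumerate(factors):
        core = mode_product(core, U.conj().T, mode)    # shape (d_1, d_2, d_3)
    return core, factors

def tucker_reconstruct(core, factors):
    # T \approx core x_1 U^(1) x_2 U^(2) x_3 U^(3)
    T = core
    for mode, U in enumerate(factors):
        T = mode_product(T, U, mode)
    return T
```

If the tensor has exact multilinear rank (d_1, d_2, d_3), the truncated HOSVD reconstruction is exact; for larger tensors it gives a quasi-optimal low-rank approximation in the least-squares sense discussed above.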


See also

* Higher-order singular value decomposition
* Multilinear principal component analysis

