In mathematics, Tucker decomposition decomposes a tensor into a set of matrices and one small core tensor. It is named after Ledyard R. Tucker, although it goes back to Hitchcock in 1927. Initially described as a three-mode extension of factor analysis and principal component analysis, it may be generalized to higher-mode analysis, which is also called higher-order singular value decomposition (HOSVD).

It may be regarded as a more flexible PARAFAC (parallel factor analysis) model; in PARAFAC the core tensor is restricted to be "diagonal".

In practice, Tucker decomposition is used as a modelling tool. For instance, it is used to model three-way (or higher-way) data by means of relatively small numbers of components for each of the three or more modes, with the components linked to each other by a three- (or higher-) way core array. The model parameters are estimated in such a way that, given fixed numbers of components, the modelled data optimally resemble the actual data in the least-squares sense. The model gives a summary of the information in the data, in the same way as principal component analysis does for two-way data.

For a 3rd-order tensor T \in F^{n_1 \times n_2 \times n_3}, where F is either \mathbb{R} or \mathbb{C}, Tucker decomposition can be denoted as

:T = \mathcal{T} \times_1 U^{(1)} \times_2 U^{(2)} \times_3 U^{(3)},

where \mathcal{T} \in F^{d_1 \times d_2 \times d_3} is the ''core tensor'', a 3rd-order tensor that contains the 1-mode, 2-mode and 3-mode singular values of T, which are defined as the ''Frobenius norms'' of the 1-mode, 2-mode and 3-mode slices of \mathcal{T} respectively, and U^{(1)} \in F^{n_1 \times d_1}, U^{(2)} \in F^{n_2 \times d_2}, U^{(3)} \in F^{n_3 \times d_3} are matrices with orthonormal columns (unitary when d_j = n_j).

The ''j''-mode product (''j'' = 1, 2, 3) of \mathcal{T} by U^{(j)} is denoted as \mathcal{T} \times_j U^{(j)}, with entries

:\begin{align}
(\mathcal{T} \times_1 U^{(1)})(n_1, d_2, d_3) &= \sum_{i_1=1}^{d_1} \mathcal{T}(i_1, d_2, d_3)\, U^{(1)}(n_1, i_1) \\
(\mathcal{T} \times_2 U^{(2)})(d_1, n_2, d_3) &= \sum_{i_2=1}^{d_2} \mathcal{T}(d_1, i_2, d_3)\, U^{(2)}(n_2, i_2) \\
(\mathcal{T} \times_3 U^{(3)})(d_1, d_2, n_3) &= \sum_{i_3=1}^{d_3} \mathcal{T}(d_1, d_2, i_3)\, U^{(3)}(n_3, i_3)
\end{align}

Taking d_i = n_i for all i is always sufficient to represent T exactly, but often T can be compressed or efficiently approximated by choosing d_i < n_i. A common choice is d_1 = d_2 = d_3 = \min(n_1, n_2, n_3), which can be effective when the difference in dimension sizes is large.

There are two special cases of Tucker decomposition:

Tucker1: if U^{(2)} and U^{(3)} are identity, then T = \mathcal{T} \times_1 U^{(1)}.

Tucker2: if U^{(3)} is identity, then T = \mathcal{T} \times_1 U^{(1)} \times_2 U^{(2)}.

RESCAL decomposition can be seen as a special case of Tucker where U^{(3)} is identity and U^{(1)} is equal to U^{(2)}.
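To make the definitions concrete, the following is a minimal NumPy sketch of the ''j''-mode product and a truncated Tucker decomposition computed via HOSVD. The helper names `mode_product` and `hosvd` are illustrative choices for this sketch, not functions from any particular library.

```python
import numpy as np

def mode_product(T, U, mode):
    # j-mode product T x_j U: contracts axis `mode` of T with the
    # second axis of U, i.e. (T x_j U)(..., n_j, ...) = sum_i T(..., i, ...) U(n_j, i).
    return np.moveaxis(np.tensordot(U, T, axes=(1, mode)), 0, mode)

def hosvd(T, ranks):
    """Truncated Tucker decomposition of a 3rd-order tensor via HOSVD."""
    factors = []
    for mode, r in enumerate(ranks):
        # Mode-j unfolding: an n_j x (product of the other dimensions) matrix.
        unfolding = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(U[:, :r])  # n_j x d_j, orthonormal columns
    # Core tensor: project T onto the factor subspaces (conjugate transpose).
    core = T
    for mode, U in enumerate(factors):
        core = mode_product(core, U.conj().T, mode)
    return core, factors

rng = np.random.default_rng(0)
T = rng.standard_normal((4, 5, 6))

# Full ranks (d_i = n_i): the decomposition represents T exactly.
core, factors = hosvd(T, ranks=(4, 5, 6))
recon = core
for mode, U in enumerate(factors):
    recon = mode_product(recon, U, mode)
print(np.allclose(recon, T))  # True

# Truncated ranks (d_i < n_i) give a compressed approximation:
core_c, _ = hosvd(T, ranks=(2, 3, 3))  # core_c has shape (2, 3, 3)
```

With full ranks the factor matrices are square and unitary, so the projection is invertible and the reconstruction is exact; truncating the ranks keeps only the leading singular vectors of each mode unfolding, trading accuracy for a smaller core.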


See also

* Higher-order singular value decomposition
* Multilinear principal component analysis

