Tensor Decomposition
In multilinear algebra, a tensor decomposition is any scheme for expressing a "data tensor" (M-way array) as a sequence of elementary operations acting on other, often simpler tensors. Many tensor decompositions generalize matrix decompositions. Tensors are generalizations of matrices to higher dimensions and can consequently be treated as multidimensional fields.

The main tensor decompositions are:
* tensor rank (CP) decomposition;
* higher-order singular value decomposition (HOSVD), a specific orthogonal Tucker decomposition that may be regarded as one generalization of the matrix singular value decomposition;
* Tucker decomposition;
* matrix product states and operators (tensor trains);
* online tensor decompositions;
* hierarchical Tucker decomposition;
* block term decomposition.
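As a minimal illustration of the first item in the list, the following sketch (not from the original article; all sizes and names are hypothetical) builds a three-mode tensor from a rank-R CP model, i.e. as a sum of R rank-one outer products, using NumPy:

```python
import numpy as np

# Hypothetical sketch of a rank-R CP (tensor rank) model: a three-mode
# tensor X with entries X[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r].
rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 3

# One factor matrix per mode; column r holds the r-th rank-one component.
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# Reconstruct the full tensor from the factors via einsum.
X = np.einsum('ir,jr,kr->ijk', A, B, C)
assert X.shape == (I, J, K)

# By construction, the mode-1 unfolding of X has rank at most R.
unfolding = X.reshape(I, J * K)
assert np.linalg.matrix_rank(unfolding) <= R
```

Fitting such factors to a given data tensor (rather than constructing the tensor from known factors, as above) is what CP decomposition algorithms such as alternating least squares do.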


Preliminary Definitions and Notation

This section introduces basic notation and operations that are widely used in the field. A summary of the symbols used throughout this article can be found in the table.


Introduction

A multi-way graph with K perspectives is a collection of K matrices, each of dimensions I × J (where I and J are the numbers of nodes). This collection of matrices is naturally represented as a tensor X of size I × J × K. To avoid overloading the term “dimension”, we call an I × J × K tensor a three-“mode” tensor, where the number of modes is the number of indices needed to index the tensor.
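The construction above can be sketched in a few lines of NumPy; the specific matrices and sizes here are hypothetical, chosen only to show the stacking step:

```python
import numpy as np

# Hypothetical sketch: stack K views of a multi-way graph (K adjacency
# matrices of size I x J) into a single three-mode tensor X of size I x J x K.
I, J, K = 3, 3, 2

# Two I x J adjacency matrices, one per perspective (e.g. two edge types).
view_1 = np.array([[0, 1, 0],
                   [1, 0, 1],
                   [0, 1, 0]])
view_2 = np.array([[0, 0, 1],
                   [0, 0, 0],
                   [1, 0, 0]])

# Stacking along a new last axis yields the I x J x K tensor.
X = np.stack([view_1, view_2], axis=-1)
assert X.shape == (I, J, K)  # three modes: row node, column node, perspective
assert X.ndim == 3           # number of modes = number of indices
```

Each slice `X[:, :, k]` recovers the adjacency matrix of perspective k.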

