Higher-order Singular Value Decomposition
In multilinear algebra, the higher-order singular value decomposition (HOSVD) of a tensor is a specific orthogonal Tucker decomposition. It may be regarded as one generalization of the matrix singular value decomposition. It has applications in computer vision, computer graphics, machine learning, scientific computing, and signal processing. Some aspects can be traced as far back as F. L. Hitchcock in 1928, but it was L. R. Tucker who developed the general Tucker decomposition for third-order tensors in the 1960s. The idea was further advocated by L. De Lathauwer ''et al.'' in their multilinear SVD work, which employs the power method, and by Vasilescu and Terzopoulos, who developed the M-mode SVD. The term HOSVD was coined by Lieven De Lathauwer, but the algorithm commonly referred to in the literature as the HOSVD, and attributed to either Tucker or De Lathauwer, was developed by M. A. O. Vasilescu and D. Terzopoulos (2002) under the name M-mode SVD. It is a particul ...
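The decomposition can be sketched in a few lines of Python; a minimal illustration assuming NumPy, where each factor matrix is taken from the SVD of the corresponding mode unfolding (the helper names `unfold` and `hosvd` are illustrative, not from any particular library):

```python
import numpy as np

def unfold(T, mode):
    """Mode-k unfolding: mode-k fibers become the columns of a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T):
    """HOSVD: one orthogonal factor per mode from the unfolding's SVD, plus a core."""
    U = [np.linalg.svd(unfold(T, k), full_matrices=False)[0] for k in range(T.ndim)]
    S = T
    for k, Uk in enumerate(U):
        # Mode-k product with Uk^T expresses the tensor in the factor basis.
        S = np.moveaxis(np.tensordot(Uk.T, S, axes=(1, k)), 0, k)
    return S, U

rng = np.random.default_rng(0)
T = rng.standard_normal((3, 4, 5))
S, U = hosvd(T)

# Reconstruct by multiplying the core by each factor matrix along its mode.
R = S
for k, Uk in enumerate(U):
    R = np.moveaxis(np.tensordot(Uk, R, axes=(1, k)), 0, k)
assert np.allclose(R, T)
```

Truncating columns of the factor matrices in this sketch yields the truncated HOSVD used for low multilinear-rank approximation.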


Multilinear Algebra
Multilinear algebra is a subfield of mathematics that extends the methods of linear algebra. Just as linear algebra is built on the concept of a vector and develops the theory of vector spaces, multilinear algebra builds on the concepts of ''p''-vectors and multivectors with Grassmann algebras. Origin In a vector space of dimension ''n'', normally only vectors are used. However, according to Hermann Grassmann and others, this presumption misses the complexity of considering the structures of pairs, triplets, and general multi-vectors. With several combinatorial possibilities, the space of multi-vectors has 2^''n'' dimensions. The abstract formulation of the determinant is the most immediate application. Multilinear algebra also has applications in the mechanical study of material response to stress and strain with various moduli of elasticity. This practical reference led to the use of the word tensor to describe the elements of the multilinear space. The extra structure in a ...
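The 2^''n''-dimensional count mentioned above can be made concrete by listing basis blades, one for each subset of {1, ..., n}; a small sketch in Python (the helper name is illustrative):

```python
from itertools import combinations

def multivector_basis(n):
    """One basis blade e_S per subset S of {1, ..., n}; the grade of e_S is |S|."""
    return [S for k in range(n + 1) for S in combinations(range(1, n + 1), k)]

# For n = 3: 1 scalar, 3 vectors, 3 bivectors, 1 trivector = 8 = 2**3.
assert len(multivector_basis(3)) == 2 ** 3
```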


Tensor Reshaping
In multilinear algebra, a reshaping of tensors is any bijection between the set of indices of an order-d tensor and the set of indices of an order-\ell tensor, where \{1, \ldots, \ell\} denotes the set of the first \ell positive integers. For each integer k where 1 \le k \le d for a positive integer d, let ''V''''k'' denote an ''n''''k''-dimensional vector space over a field F. Then there are vector space isomorphisms (linear maps)
\begin{align}
V_1 \otimes \cdots \otimes V_d & \simeq F^{n_1} \otimes \cdots \otimes F^{n_d} \\
& \simeq F^{n_{\pi(1)}} \otimes \cdots \otimes F^{n_{\pi(d)}} \\
& \simeq F^{n_{\pi(1)} n_{\pi(2)}} \otimes F^{n_{\pi(3)}} \otimes \cdots \otimes F^{n_{\pi(d)}} \\
& \simeq F^{n_{\pi(1)} n_{\pi(2)} n_{\pi(3)}} \otimes F^{n_{\pi(4)}} \otimes \cdots \otimes F^{n_{\pi(d)}} \\
& \,\,\,\vdots \\
& \simeq F^{n_1 n_2 \cdots n_d},
\end{align}
where \pi \in \mathfrak{S}_d is any permutation and \mathfrak{S}_d is the symmetric group on d elements. Via these (and other) vector space isomorphisms, a tensor can be interpreted in several ways as an order-\ell tensor where \ell \le d. Coordinate representation The first vector space isomo ...
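These index bijections are what a row-major `reshape` implements; a small sketch assuming NumPy, checking that vectorization and matricization are just re-indexings of the same entries:

```python
import numpy as np

T = np.arange(24).reshape(2, 3, 4)   # order-3 tensor, (n1, n2, n3) = (2, 3, 4)
i, j, k = 1, 2, 3

# Vectorization (order 3 -> order 1): index bijection (i,j,k) -> (i*n2 + j)*n3 + k
v = T.reshape(24)
assert v[(i * 3 + j) * 4 + k] == T[i, j, k]

# Matricization (order 3 -> order 2): modes 2 and 3 are grouped together
M = T.reshape(2, 12)
assert M[i, j * 4 + k] == T[i, j, k]
```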


Robust Statistics
Robust statistics are statistics with good performance for data drawn from a wide range of probability distributions, especially for distributions that are not normal. Robust statistical methods have been developed for many common problems, such as estimating location, scale, and regression parameters. One motivation is to produce statistical methods that are not unduly affected by outliers. Another motivation is to provide methods with good performance when there are small departures from a parametric distribution. For example, robust methods work well for mixtures of two normal distributions with different standard deviations; under this model, non-robust methods like a t-test work poorly. Introduction Robust statistics seek to provide methods that emulate popular statistical methods, but which are not unduly affected by outliers or other small departures from model assumptions. In statistics, classical estimation methods rely heavily on assumpti ...
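The contrast between a non-robust and a robust location estimate is easy to see on a toy sample (values illustrative): one gross outlier ruins the mean but barely moves the median.

```python
from statistics import mean, median

data = [9.8, 10.1, 10.0, 9.9, 10.2]
contaminated = data + [100.0]                     # a single gross outlier

assert abs(mean(data) - 10.0) < 1e-9              # mean of the clean sample
assert abs(mean(contaminated) - 25.0) < 1e-9      # the mean breaks down
assert abs(median(contaminated) - 10.05) < 1e-9   # the median barely moves
```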




TP Model Transformation In Control Theories
Baranyi and Yam proposed the TP model transformation as a new concept in quasi-LPV (qLPV) based control, which plays a central role in the highly desirable bridging between identification and polytopic systems theories. It is also used as a TS (Takagi-Sugeno) fuzzy model transformation. It is uniquely effective in manipulating the convex hull of polytopic forms (or TS fuzzy models), and hence has shown that convex hull manipulation is a necessary and crucial step in achieving optimal solutions and decreasing conservativeness in modern linear matrix inequality based control theory. Thus, although it is a transformation in a mathematical sense, it has established a conceptually new direction in control theory and has laid the ground for further new approaches towards optimality. For details please visit: TP model transformation. ;TP-tool MATLAB toolbox: A free MATLAB implementation of the TP model transformation can be downloaded, or an old version of the ...


TP Model Transformation
In mathematics, the tensor product (TP) model transformation was proposed by Baranyi and Yam as a key concept for the higher-order singular value decomposition of functions. It transforms a function (which can be given via closed formulas or neural networks, fuzzy logic, etc.) into TP function form if such a transformation is possible. If an exact transformation is not possible, then the method determines a TP function that is an approximation of the given function. Hence, the TP model transformation can provide a trade-off between approximation accuracy and complexity. A free MATLAB implementation of the TP model transformation can be downloaded, or an old version of the toolbox is available at MATLAB Central. A key underpinning of the transformation is the higher-order singular value decomposition. Besides being a transformation of functions, the TP model transformation is also a new concept in qLPV based control, which plays a central role in providing a valuable means of bridgin ...
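For a function of two variables, the discretization step behind the transformation reduces to an ordinary SVD of the function sampled on a grid; the sketch below is a simplified two-variable illustration assuming NumPy (the grid sizes, test function, and rank tolerance are arbitrary choices, not the full algorithm):

```python
import numpy as np

# Sample f(x, y) = sin(x)cos(y) + xy on a rectangular grid.
x = np.linspace(0.0, 1.0, 50)
y = np.linspace(0.0, 1.0, 60)
F = np.sin(x)[:, None] * np.cos(y)[None, :] + x[:, None] * y[None, :]

# SVD of the sampled matrix: the singular values reveal how many
# tensor-product (rank-1) terms the sampled function needs.
U, s, Vt = np.linalg.svd(F, full_matrices=False)
rank = int(np.sum(s > 1e-10 * s[0]))
assert rank == 2      # f is exactly a sum of two product terms

# TP form on the grid: f(x_i, y_j) = sum_r u_r(x_i) * sigma_r * v_r(y_j)
F_tp = (U[:, :rank] * s[:rank]) @ Vt[:rank]
assert np.allclose(F_tp, F)
```

In more variables the same role is played by the HOSVD of the sampled tensor, and discarding small singular values trades approximation accuracy for complexity, as described above.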




Disease Surveillance
Disease surveillance is an epidemiological practice by which the spread of disease is monitored in order to establish patterns of progression. The main role of disease surveillance is to predict, observe, and minimize the harm caused by outbreak, epidemic, and pandemic situations, as well as to increase knowledge about which factors contribute to such circumstances. A key part of modern disease surveillance is the practice of disease case reporting. In modern times, reporting incidences of disease outbreaks has been transformed from manual record keeping to instant worldwide internet communication. Previously, the number of cases could be gathered from hospitals – which would be expected to see most of the occurrences – collated, and eventually made public. With the advent of modern communication technology, this has changed dramatically. Organizations like the World Health Organization (WHO) and the Centers for Disease Control and Prevention (CDC) now can report cases and deaths ...


Frobenius Norm
In mathematics, a matrix norm is a vector norm in a vector space whose elements (vectors) are matrices (of given dimensions). Preliminaries Given a field K of either real or complex numbers, let K^{m \times n} be the K-vector space of matrices with m rows and n columns and entries in the field K. A matrix norm is a norm on K^{m \times n}. This article will always write such norms with double vertical bars (like so: \|A\|). Thus, the matrix norm is a function \|\cdot\| : K^{m \times n} \to \R that must satisfy the following properties: For all scalars \alpha \in K and matrices A, B \in K^{m \times n},
*\|A\| \ge 0 (''positive-valued'')
*\|A\| = 0 \iff A = 0_{m,n} (''definite'')
*\|\alpha A\| = |\alpha| \, \|A\| (''absolutely homogeneous'')
*\|A + B\| \le \|A\| + \|B\| (''sub-additive'' or satisfying the ''triangle inequality'')
The only feature distinguishing matrices from rearranged vectors is multiplication. Matrix norms are particularly useful if they are also sub-multiplicative:
*\| ...
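The axioms above are easy to verify numerically for the Frobenius norm, the square root of the sum of squared entry magnitudes; a sketch assuming NumPy (the random test matrices are illustrative):

```python
import numpy as np

def fro(M):
    """Frobenius norm: sqrt of the sum of squared entry magnitudes."""
    return float(np.sqrt((np.abs(M) ** 2).sum()))

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((3, 4))
C = rng.standard_normal((4, 2))
alpha = -2.5

assert fro(A) >= 0                                        # positive-valued
assert fro(np.zeros((3, 4))) == 0.0                       # definite
assert np.isclose(fro(alpha * A), abs(alpha) * fro(A))    # absolutely homogeneous
assert fro(A + B) <= fro(A) + fro(B) + 1e-12              # triangle inequality
assert fro(A @ C) <= fro(A) * fro(C) + 1e-12              # sub-multiplicative
```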


Projection (linear Algebra)
In linear algebra and functional analysis, a projection is a linear transformation P from a vector space to itself (an endomorphism) such that P\circ P=P. That is, whenever P is applied twice to any vector, it gives the same result as if it were applied once (i.e. P is idempotent). It leaves its image unchanged. This definition of "projection" formalizes and generalizes the idea of graphical projection. One can also consider the effect of a projection on a geometrical object by examining the effect of the projection on points in the object. Definitions A projection on a vector space V is a linear operator P : V \to V such that P^2 = P. When V has an inner product and is complete (i.e. when V is a Hilbert space) the concept of orthogonality can be used. A projection P on a Hilbert space V is called an orthogonal projection if it satisfies \langle P \mathbf x, \mathbf y \rangle = \langle \mathbf x, P \mathbf y \rangle for all \mathbf x, \mathbf y \in V. A projection on a Hilbert ...
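A concrete orthogonal projection is the projection onto the column space of a full-column-rank matrix X, given by P = X (X^T X)^{-1} X^T; a sketch assuming NumPy that checks idempotence and self-adjointness:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((5, 2))              # full column rank (generically)
P = X @ np.linalg.inv(X.T @ X) @ X.T         # projector onto col(X)

assert np.allclose(P @ P, P)                 # idempotent: P composed with P is P
assert np.allclose(P, P.T)                   # self-adjoint, so an orthogonal projection
v = rng.standard_normal(5)
assert np.allclose(P @ (P @ v), P @ v)       # applying twice = applying once
```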


Conjugate Transpose
In mathematics, the conjugate transpose, also known as the Hermitian transpose, of an m \times n complex matrix \boldsymbol{A} is an n \times m matrix obtained by transposing \boldsymbol{A} and applying complex conjugation to each entry (the complex conjugate of a+ib being a-ib, for real numbers a and b). It is often denoted as \boldsymbol{A}^\mathrm{H} or \boldsymbol{A}^* or \boldsymbol{A}'. H. W. Turnbull, A. C. Aitken, "An Introduction to the Theory of Canonical Matrices," 1932. For real matrices, the conjugate transpose is just the transpose: \boldsymbol{A}^\mathrm{H} = \boldsymbol{A}^\mathsf{T}. Definition The conjugate transpose of an m \times n matrix \boldsymbol{A} is formally defined by \left(\boldsymbol{A}^\mathrm{H}\right)_{ij} = \overline{A_{ji}}, where the subscript ij denotes the (i,j)-th entry, for 1 \le i \le n and 1 \le j \le m, and the overbar denotes a scalar complex conjugate. This definition can also be written as :\boldsymbol{A}^\mathrm{H} = \left(\overline{\boldsymbol{A}}\right)^\mathsf{T} = \overline{\boldsymbol{A}^\mathsf{T}}, where \boldsymbol{A}^\mathsf{T} denotes the transpose and \overline{\boldsymbol{A}} denotes the ...
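In NumPy the conjugate transpose is `A.conj().T`; a small check of the entrywise definition above:

```python
import numpy as np

A = np.array([[1 + 2j, 3 - 1j, 0 + 0j],
              [0 + 4j, 5 + 0j, -2 - 2j]])    # a 2 x 3 complex matrix

AH = A.conj().T                              # conjugate transpose, 3 x 2

assert AH.shape == (3, 2)
assert AH[0, 1] == np.conj(A[1, 0])          # (A^H)_{ij} is the conjugate of A_{ji}

# For a real matrix the conjugate transpose is just the transpose.
R = np.array([[1.0, 2.0], [3.0, 4.0]])
assert np.array_equal(R.conj().T, R.T)
```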


Multilinear Multiplication
In multilinear algebra, applying a map that is the tensor product of linear maps to a tensor is called a multilinear multiplication. Abstract definition Let F be a field of characteristic zero, such as \mathbb{R} or \mathbb{C}. Let V_k be a finite-dimensional vector space over F, and let \mathcal{A} \in V_1 \otimes V_2 \otimes \cdots \otimes V_d be an order-d simple tensor, i.e., there exist some vectors \mathbf{v}_k \in V_k such that \mathcal{A} = \mathbf{v}_1 \otimes \mathbf{v}_2 \otimes \cdots \otimes \mathbf{v}_d. If we are given a collection of linear maps A_k : V_k \to W_k, then the multilinear multiplication of \mathcal{A} with (A_1, A_2, \ldots, A_d) is defined as the action on \mathcal{A} of the tensor product of these linear maps, namely
\begin{align}
A_1 \otimes A_2 \otimes \cdots \otimes A_d : V_1 \otimes V_2 \otimes \cdots \otimes V_d & \to W_1 \otimes W_2 \otimes \cdots \otimes W_d, \\
\mathbf{v}_1 \otimes \mathbf{v}_2 \otimes \cdots \otimes \mathbf{v}_d & \mapsto A_1(\mathbf{v}_1) \otimes A_2(\mathbf ...
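The defining property on simple tensors, namely that (A_1 \otimes \cdots \otimes A_d)(\mathbf{v}_1 \otimes \cdots \otimes \mathbf{v}_d) equals A_1(\mathbf{v}_1) \otimes \cdots \otimes A_d(\mathbf{v}_d), can be checked numerically; a sketch assuming NumPy, where `mode_mult` is an illustrative helper applying one linear map along one mode via `tensordot`:

```python
import numpy as np

def mode_mult(T, A, k):
    """Apply the linear map A along mode k of the tensor T."""
    return np.moveaxis(np.tensordot(A, T, axes=(1, k)), 0, k)

rng = np.random.default_rng(3)
v1, v2, v3 = rng.standard_normal(2), rng.standard_normal(3), rng.standard_normal(4)
T = np.einsum('i,j,k->ijk', v1, v2, v3)       # simple tensor v1 (x) v2 (x) v3

A1 = rng.standard_normal((5, 2))              # A1 : R^2 -> R^5
A2 = rng.standard_normal((6, 3))              # A2 : R^3 -> R^6
A3 = rng.standard_normal((7, 4))              # A3 : R^4 -> R^7

lhs = mode_mult(mode_mult(mode_mult(T, A1, 0), A2, 1), A3, 2)
rhs = np.einsum('i,j,k->ijk', A1 @ v1, A2 @ v2, A3 @ v3)
assert lhs.shape == (5, 6, 7)
assert np.allclose(lhs, rhs)
```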