Polynomial SOS
In mathematics, a form (i.e. a homogeneous polynomial) ''h''(''x'') of degree 2''m'' in the real ''n''-dimensional vector ''x'' is a sum of squares of forms (SOS) if and only if there exist forms g_1(x),\ldots,g_k(x) of degree ''m'' such that
:h(x) = \sum_{i=1}^k g_i(x)^2.
Every form that is SOS is also a non-negative polynomial, and although the converse is not always true, Hilbert proved that for ''n'' = 2, 2''m'' = 2, or ''n'' = 3 and 2''m'' = 4 a form is SOS if and only if it is non-negative. The same is also valid for the analogous problem on non-negative ''symmetric'' forms. Although not every form can be represented as SOS, explicit sufficient conditions for a form to be SOS have been found. Moreover, every real nonnegative form can be approximated as closely as desired (in the l_1-norm of its coefficient vector) by a sequence of forms \{f_\epsilon\} that are SOS.
Square matricial representation (SMR)
To establish whether a form is SOS amounts to solving a convex optimization problem. Indeed, any ...
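The SMR idea can be made concrete with a small semidefinite program: a form h of degree 2m is SOS exactly when h(x) = z(x)^T Q z(x) for some positive semidefinite Gram matrix Q, where z(x) collects the monomials of degree m. The following minimal sketch (assuming the cvxpy library and an installed SDP solver; the polynomial 2x^4 + 2x^3y - x^2y^2 + 5y^4 is just a standard illustrative choice) searches for such a Q with the monomial vector z = (x^2, xy, y^2).

 import cvxpy as cp
 
 # Gram matrix for p(x, y) = 2x^4 + 2x^3*y - x^2*y^2 + 5y^4 in the basis z = (x^2, x*y, y^2):
 # p is SOS iff there is a symmetric positive semidefinite Q with p = z^T Q z.
 Q = cp.Variable((3, 3), symmetric=True)
 constraints = [
     Q >> 0,                       # Q positive semidefinite
     Q[0, 0] == 2,                 # coefficient of x^4
     2 * Q[0, 1] == 2,             # coefficient of x^3 y
     2 * Q[0, 2] + Q[1, 1] == -1,  # coefficient of x^2 y^2
     2 * Q[1, 2] == 0,             # coefficient of x y^3
     Q[2, 2] == 5,                 # coefficient of y^4
 ]
 prob = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility problem
 prob.solve()
 print(prob.status)  # "optimal" (feasible) certifies that p is a sum of squares

If a feasible Q is returned, any factorization Q = L^T L yields explicit squares, since then p = (Lz)^T (Lz) is the sum of the squares of the entries of Lz.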
Mathematics
Mathematics is a field of study that discovers and organizes methods, theories and theorems that are developed and proved for the needs of empirical sciences and mathematics itself. There are many areas of mathematics, which include number theory (the study of numbers), algebra (the study of formulas and related structures), geometry (the study of shapes and spaces that contain them), analysis (the study of continuous changes), and set theory (presently used as a foundation for all mathematics). Mathematics involves the description and manipulation of abstract objects that consist of either abstractions from nature or, in modern mathematics, purely abstract entities that are stipulated to have certain properties, called axioms. Mathematics uses pure reason to prove properties of objects, a ''proof'' consisting of a succession of applications of ...
Positive-semidefinite Matrix
In mathematics, a symmetric matrix M with real entries is positive-definite if the real number \mathbf{x}^\mathsf{T} M \mathbf{x} is positive for every nonzero real column vector \mathbf{x}, where \mathbf{x}^\mathsf{T} is the row vector transpose of \mathbf{x}. More generally, a Hermitian matrix (that is, a complex matrix equal to its conjugate transpose) is positive-definite if the real number \mathbf{z}^* M \mathbf{z} is positive for every nonzero complex column vector \mathbf{z}, where \mathbf{z}^* denotes the conjugate transpose of \mathbf{z}. Positive semi-definite matrices are defined similarly, except that the scalars \mathbf{x}^\mathsf{T} M \mathbf{x} and \mathbf{z}^* M \mathbf{z} are required to be positive ''or zero'' (that is, nonnegative). Negative-definite and negative semi-definite matrices are defined analogously. A matrix that is not positive semi-definite and not negative semi-definite is sometimes called ''indefinite''. Some authors use more general definitions of definiteness, permitting the matrices to be ...
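For a symmetric matrix, definiteness can be checked numerically from its eigenvalues: M is positive definite when all eigenvalues are positive and positive semi-definite when they are all nonnegative. A minimal sketch using NumPy (the matrix below is an arbitrary illustrative choice):

 import numpy as np
 
 M = np.array([[2.0, -1.0],
               [-1.0, 2.0]])           # symmetric example matrix
 eigenvalues = np.linalg.eigvalsh(M)   # eigenvalues of a symmetric matrix, ascending order
 print(eigenvalues)                    # [1. 3.]
 print(np.all(eigenvalues > 0))        # True: M is positive definite
 print(np.all(eigenvalues >= 0))       # True: M is (in particular) positive semi-definite

In floating-point practice a small tolerance is usually used in place of the exact comparisons above.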
SOS-convexity
A multivariate polynomial is SOS-convex (or sum of squares convex) if its Hessian matrix H can be factored as H(''x'') = ''S''^T(''x'')''S''(''x''), where ''S'' is a matrix (possibly rectangular) whose entries are polynomials in ''x''. In other words, the Hessian matrix is an SOS matrix polynomial. An equivalent definition is that the form defined as ''g''(''x'',''y'') = ''y''^T H(''x'') ''y'' is a sum of squares of forms.
Connection with convexity
If a polynomial is SOS-convex, then it is also convex. Since establishing whether a polynomial is SOS-convex amounts to solving a semidefinite programming problem, SOS-convexity can be used as a proxy for establishing whether a polynomial is convex. In contrast, deciding whether a generic polynomial of degree four (or higher even degree) is convex is an NP-hard problem. The first counterexample of a polynomial which is convex but not SOS-convex was constructed by Amir Ali Ahmadi and Pablo Parrilo in 2009. The polynomial is a homogeneous ...
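As a simple illustration (an elementary example added here, not taken from the article), the polynomial p(x_1,x_2) = x_1^4 + x_2^4 is SOS-convex:
:H(x) = \begin{bmatrix} 12x_1^2 & 0 \\ 0 & 12x_2^2 \end{bmatrix} = S(x)^T S(x), \qquad S(x) = \begin{bmatrix} 2\sqrt{3}\,x_1 & 0 \\ 0 & 2\sqrt{3}\,x_2 \end{bmatrix},
and correspondingly
:g(x,y) = y^T H(x)\, y = \left(2\sqrt{3}\,x_1 y_1\right)^2 + \left(2\sqrt{3}\,x_2 y_2\right)^2,
which is visibly a sum of squares of forms.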
Hilbert's Seventeenth Problem
Hilbert's seventeenth problem is one of the 23 Hilbert problems set out in a celebrated list compiled in 1900 by David Hilbert. It concerns the expression of positive definite rational functions as sums of quotients of squares. The original question may be reformulated as:
* Given a multivariate polynomial that takes only non-negative values over the reals, can it be represented as a sum of squares of rational functions?
Hilbert's question can be restricted to homogeneous polynomials of even degree, since a polynomial of odd degree changes sign, and the homogenization of a polynomial takes only nonnegative values if and only if the same is true for the polynomial.
Motivation
The formulation of the question takes into account that there are non-negative polynomials, for example
:f(x,y,z)=z^6+x^4y^2+x^2y^4-3x^2y^2z^2,
which cannot be represented as a sum of squares of other polynomials. In 1888, Hilbert showed that every non-negative homogeneous polynomial in ''n'' ...
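The Motzkin form f above also illustrates the affirmative answer to the problem (Artin's theorem): although f is not a sum of squares of polynomials, it is a sum of squares of rational functions. One such representation (stated here as a worked example; it can be checked by expanding both sides) is
:(x^2+y^2)^2\, f(x,y,z) = \left[x^2 y\,(x^2+y^2-2z^2)\right]^2 + \left[x y^2 (x^2+y^2-2z^2)\right]^2 + \left[x y z\,(x^2+y^2-2z^2)\right]^2 + \left[z^3 (x^2-y^2)\right]^2,
so dividing by (x^2+y^2)^2 expresses f as a sum of four squares of rational functions.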
Positive Polynomial
In mathematics, a positive polynomial (respectively non-negative polynomial) on a particular set is a polynomial whose values are positive (respectively non-negative) on that set. Precisely, let p be a polynomial in n variables with real coefficients and let S be a subset of the n-dimensional Euclidean space \mathbb{R}^n. We say that:
* p is positive on S if p(x)>0 for every x in S.
* p is non-negative on S if p(x)\ge 0 for every x in S.
Positivstellensatz (and Nichtnegativstellensatz)
For certain sets S, there exist algebraic descriptions of all polynomials that are positive (resp. non-negative) on S. Such a description is a Positivstellensatz (resp. Nichtnegativstellensatz). The importance of Positivstellensatz theorems in computation arises from their ability to transform problems of polynomial optimization into semidefinite programming problems, which can be efficiently solved using convex optimization techniques.
Examples of Positivstellensatz (and Nichtnegativstellensatz) ...
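A classical one-variable illustration (added here as an example): a univariate polynomial p is non-negative on all of \mathbb{R} if and only if it is a sum of (at most two) squares of polynomials, and p is non-negative on the half-line [0,\infty) if and only if it can be written as p(x) = \sigma_0(x) + x\,\sigma_1(x) with \sigma_0, \sigma_1 sums of squares (a classical result). For instance,
:p(x) = x^2 - 2x + 2 = (x-1)^2 + 1^2 > 0 \quad \text{for all } x \in \mathbb{R},
and the decomposition itself is a certificate of positivity.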
Sum-of-squares Optimization
A sum-of-squares optimization program is an optimization problem with a linear cost function and a particular type of constraint on the decision variables. These constraints are of the form that when the decision variables are used as coefficients in certain polynomials, those polynomials should have the polynomial SOS property. When fixing the maximum degree of the polynomials involved, sum-of-squares optimization is also known as the Lasserre hierarchy of relaxations in semidefinite programming. Sum-of-squares optimization techniques have been applied across a variety of areas, including control theory (in particular, for searching for polynomial Lyapunov functions for dynamical systems described by polynomial vector fields), statistics, finance and machine learning.
Optimization problem
Given a vector c\in \R^n and polynomials a_{k,j} for k=1, \dots, N_s, j = 0, 1, \dots, n, a sum-of-squares optimization problem is written as
:\begin{align} \underset{u\in\R^n}{\text{maximize}} \quad & c^T u \\ \text{subject to} \quad & a_{k,0}(x) + u_1 a_{k,1}(x) + \cdots + u_n a_{k,n}(x) \text{ is SOS}, \quad k=1,\ldots,N_s.\end{align}
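A standard instance of such a program (given here as an illustrative example) is computing a lower bound on a polynomial p over \mathbb{R}^n: the scalar decision variable \gamma enters the coefficients of p(x)-\gamma linearly, so
:\begin{align} \underset{\gamma\in\R}{\text{maximize}} \quad & \gamma \\ \text{subject to} \quad & p(x) - \gamma \text{ is SOS} \end{align}
is a sum-of-squares program whose optimal value is a (possibly conservative) lower bound on \min_x p(x).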
Symmetric Polynomial
In mathematics, a symmetric polynomial is a polynomial P(X_1, X_2, \ldots, X_n) in n variables, such that if any of the variables are interchanged, one obtains the same polynomial. Formally, P is a ''symmetric polynomial'' if for any permutation \sigma of the subscripts 1, 2, \ldots, n one has P(X_{\sigma(1)}, X_{\sigma(2)}, \ldots, X_{\sigma(n)}) = P(X_1, X_2, \ldots, X_n). Symmetric polynomials arise naturally in the study of the relation between the roots of a polynomial in one variable and its coefficients, since the coefficients can be given by polynomial expressions in the roots, and all roots play a similar role in this setting. From this point of view the elementary symmetric polynomials are the most fundamental symmetric polynomials. Indeed, a theorem called the fundamental theorem of symmetric polynomials states that any symmetric polynomial can be expressed in terms of elementary symmetric polynomials. This implies that every ''symmetric'' polynomial expression in the roots of a monic polynomial can alternatively be given as a polynomial expression in the coefficients of the polynomial. ...
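For example (a standard illustration in three variables, added here), the elementary symmetric polynomials are e_1 = X_1+X_2+X_3, e_2 = X_1X_2+X_1X_3+X_2X_3 and e_3 = X_1X_2X_3, and the symmetric polynomial X_1^2+X_2^2+X_3^2 is expressed through them as
:X_1^2 + X_2^2 + X_3^2 = e_1^2 - 2e_2.
Hence, if X_1, X_2, X_3 are the roots of a monic cubic x^3 + a x^2 + b x + c (whose coefficients satisfy a = -e_1, b = e_2, c = -e_3), the sum of the squares of the roots equals a^2 - 2b.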
Free Algebra
In mathematics, especially in the area of abstract algebra known as ring theory, a free algebra is the noncommutative analogue of a polynomial ring since its elements may be described as "polynomials" with non-commuting variables. Likewise, the polynomial ring may be regarded as a free commutative algebra.
Definition
For ''R'' a commutative ring, the free (associative, unital) algebra on ''n'' indeterminates X_1,\ldots,X_n is the free ''R''-module with a basis consisting of all words over the alphabet \{X_1,\ldots,X_n\} (including the empty word, which is the unit of the free algebra). This ''R''-module becomes an ''R''-algebra by defining a multiplication as follows: the product of two basis elements is the concatenation of the corresponding words:
:\left(X_{i_1}X_{i_2} \cdots X_{i_l}\right) \cdot \left(X_{j_1}X_{j_2} \cdots X_{j_m}\right) = X_{i_1}X_{i_2} \cdots X_{i_l}X_{j_1}X_{j_2} \cdots X_{j_m},
and the product of two arbitrary ''R''-module elements is thus uniquely determined (because the multiplication in an ''R''-algebra must be ''R''-bilinear). ...
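To illustrate the noncommutative multiplication (an elementary example added here), in the free algebra on two indeterminates the words X_1X_2 and X_2X_1 are distinct basis elements, so
:(X_1 + X_2)^2 = X_1^2 + X_1X_2 + X_2X_1 + X_2^2,
and the middle terms do not combine into 2X_1X_2 as they would in an ordinary (commutative) polynomial ring.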
Kronecker Product
In mathematics, the Kronecker product, sometimes denoted by ⊗, is an operation on two matrices of arbitrary size resulting in a block matrix. It is a specialization of the tensor product (which is denoted by the same symbol) from vectors to matrices and gives the matrix of the tensor product linear map with respect to a standard choice of basis. The Kronecker product is to be distinguished from the usual matrix multiplication, which is an entirely different operation. The Kronecker product is also sometimes called the matrix direct product. The Kronecker product is named after the German mathematician Leopold Kronecker (1823–1891), even though there is little evidence that he was the first to define and use it. The Kronecker product has also been called the ''Zehfuss matrix'', and the ''Zehfuss product'', after Johann Georg Zehfuss, who in 1858 described this matrix operation, but Kronecker product is currently the most widely used term. The misattribution to Kronecker rather than Zehfuss was ...
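A small numerical sketch of the block structure (using NumPy; the matrices are arbitrary illustrative choices):

 import numpy as np
 
 A = np.array([[1, 2],
               [3, 4]])
 B = np.array([[0, 5],
               [6, 7]])
 
 # The Kronecker product replaces each entry a_ij of A by the block a_ij * B,
 # so A (2x2) and B (2x2) give a 4x4 block matrix. Note that in general
 # np.kron(A, B) differs from np.kron(B, A).
 K = np.kron(A, B)
 print(K)
 # [[ 0  5  0 10]
 #  [ 6  7 12 14]
 #  [ 0 15  0 20]
 #  [18 21 24 28]]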
Linear Matrix Inequality
In convex optimization, a linear matrix inequality (LMI) is an expression of the form
:\operatorname{LMI}(y):=A_0+y_1A_1+y_2A_2+\cdots+y_m A_m\succeq 0,
where
* y=(y_i),~i=1,\dots,m is a real vector,
* A_0, A_1, A_2,\dots,A_m are n\times n symmetric matrices in \mathbb{S}^n,
* B\succeq0 is a generalized inequality meaning B is a positive semidefinite matrix belonging to the positive semidefinite cone \mathbb{S}_+ in the subspace of symmetric matrices \mathbb{S}.
This linear matrix inequality specifies a convex constraint on y.
Applications
There are efficient numerical methods to determine whether an LMI is feasible (''e.g.'', whether there exists a vector ''y'' such that \operatorname{LMI}(y) \succeq 0), or to solve a convex optimization problem with LMI constraints. Many optimization problems in control theory, system identification and signal processing can be formulated using LMIs. LMIs also find application in polynomial sum-of-squares problems. The prototypical primal and dual semidefinite programs ...
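A minimal LMI feasibility sketch in Python (assuming the cvxpy library and an installed SDP solver; the 2 × 2 data matrices are arbitrary illustrative choices). A symmetric slack variable X is set equal to LMI(y), so the positive semidefinite constraint is placed on a variable that cvxpy knows to be symmetric:

 import cvxpy as cp
 import numpy as np
 
 # Data for LMI(y) = A0 + y1*A1 + y2*A2 >= 0 (in the semidefinite sense).
 A0 = np.array([[1.0, 0.0], [0.0, 1.0]])
 A1 = np.array([[1.0, 0.0], [0.0, -1.0]])
 A2 = np.array([[0.0, 1.0], [1.0, 0.0]])
 
 y = cp.Variable(2)
 X = cp.Variable((2, 2), symmetric=True)          # slack representing LMI(y)
 constraints = [X == A0 + y[0] * A1 + y[1] * A2,  # X equals the affine matrix expression
                X >> 0]                           # X positive semidefinite
 prob = cp.Problem(cp.Minimize(0), constraints)   # pure feasibility problem
 prob.solve()
 print(prob.status, y.value)                      # "optimal" means the LMI is feasible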
Matrix (mathematics)
In mathematics, a matrix (plural: matrices) is a rectangular array or table of numbers, symbols, or expressions, with elements or entries arranged in rows and columns, which is used to represent a mathematical object or a property of such an object. For example,
:\begin{bmatrix}1 & 9 & -13 \\ 20 & 5 & -6 \end{bmatrix}
is a matrix with two rows and three columns. This is often referred to as a "two-by-three matrix", a "2 × 3 matrix", or a matrix of dimension 2 × 3. Matrices are commonly used in linear algebra, where they represent linear maps. In geometry, matrices are widely used for specifying and representing geometric transformations (for example rotations) and coordinate changes. In numerical analysis, many computational problems are solved by reducing them to a matrix computation, and this often involves computing with matrices of huge dimensions. Matrices are used in most areas of mathematics and scientific fields, either directly ...