Polynomial SOS
In mathematics, a form (i.e. a homogeneous polynomial) ''h''(''x'') of degree 2''m'' in the real ''n''-dimensional vector ''x'' is a sum of squares of forms (SOS) if and only if there exist forms g_1(x),\ldots,g_k(x) of degree ''m'' such that h(x) = \sum_{i=1}^k g_i(x)^2. Every form that is SOS is also a positive polynomial, and although the converse is not always true, Hilbert proved that for ''n'' = 2, for 2''m'' = 2, or for ''n'' = 3 and 2''m'' = 4, a form is SOS if and only if it is positive. The same is also valid for the analogous problem on positive ''symmetric'' forms. Although not every form can be represented as SOS, explicit sufficient conditions for a form to be SOS have been found. Moreover, every real nonnegative form can be approximated as closely as desired (in the l_1-norm of its coefficient vector) by a sequence of forms that are SOS.

Square matricial representation (SMR)

To establish whether a form is SOS amounts to solving a convex optimization problem. Indeed, any form can ...
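
A minimal sketch (not from the article) of this convex-optimization test, using the cvxpy library: we search for a positive semidefinite Gram matrix Q with z^T Q z = h for the illustrative binary quartic h(x,y) = x^4 + 4x^3y + 6x^2y^2 + 4xy^3 + y^4 = ((x+y)^2)^2, in the monomial basis z = [x^2, xy, y^2]. The form h and the basis are assumptions made for the example.

 import cvxpy as cp
 
 # Gram-matrix (SMR) test: h is SOS iff h(x,y) = z^T Q z for some PSD Q,
 # with z = [x^2, xy, y^2]; match coefficients of h monomial by monomial.
 Q = cp.Variable((3, 3), symmetric=True)
 constraints = [
     Q >> 0,                       # Q positive semidefinite
     Q[0, 0] == 1,                 # coefficient of x^4
     2 * Q[0, 1] == 4,             # coefficient of x^3 y
     2 * Q[0, 2] + Q[1, 1] == 6,   # coefficient of x^2 y^2
     2 * Q[1, 2] == 4,             # coefficient of x y^3
     Q[2, 2] == 1,                 # coefficient of y^4
 ]
 prob = cp.Problem(cp.Minimize(0), constraints)
 prob.solve()
 print(prob.status)  # 'optimal' means a PSD Gram matrix exists, so h is SOS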


Positive Polynomial
In mathematics, a positive polynomial on a particular set is a polynomial whose values are positive on that set. Let ''p'' be a polynomial in ''n'' variables with real coefficients and let ''S'' be a subset of the ''n''-dimensional Euclidean space ℝ^''n''. We say that:
* ''p'' is positive on ''S'' if ''p''(''x'') > 0 for every ''x'' in ''S''.
* ''p'' is non-negative on ''S'' if ''p''(''x'') ≥ 0 for every ''x'' in ''S''.
* ''p'' is zero on ''S'' if ''p''(''x'') = 0 for every ''x'' in ''S''.
For certain sets ''S'', there exist algebraic descriptions of all polynomials that are positive, non-negative, or zero on ''S''. Such a description is a positivstellensatz, nichtnegativstellensatz, or nullstellensatz, respectively. This article will focus on the former two descriptions; for the latter, see Hilbert's Nullstellensatz, the best-known nullstellensatz.

Examples of positivstellensatz (and nichtnegativstellensatz)

* Globally positive polynomials and su ...


Positive-semidefinite Matrix
In mathematics, a symmetric matrix M with real entries is positive-definite if the real number z^\textsf{T} M z is positive for every nonzero real column vector z, where z^\textsf{T} is the transpose of z. More generally, a Hermitian matrix (that is, a complex matrix equal to its conjugate transpose) is positive-definite if the real number z^* M z is positive for every nonzero complex column vector z, where z^* denotes the conjugate transpose of z. Positive semi-definite matrices are defined similarly, except that the scalars z^\textsf{T} M z and z^* M z are required to be positive ''or zero'' (that is, nonnegative). Negative-definite and negative semi-definite matrices are defined analogously. A matrix that is not positive semi-definite and not negative semi-definite is sometimes called indefinite. A matrix is thus positive-definite if and only if it is the matrix of a positive-definite quadratic form or Hermitian form. In other words, a matrix is positive-definite if and only if it defines ...
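
A small numerical illustration (an added sketch, with an example matrix chosen here): for a real symmetric matrix, positive definiteness can be checked via its eigenvalues, or via a Cholesky factorization, which exists exactly when the matrix is positive-definite.

 import numpy as np
 
 M = np.array([[2.0, -1.0], [-1.0, 2.0]])   # symmetric example matrix
 print(np.linalg.eigvalsh(M))               # [1., 3.]: all > 0, so positive-definite
 L = np.linalg.cholesky(M)                  # succeeds iff M is positive-definite
 z = np.array([1.0, 5.0])
 print(z @ M @ z)                           # z^T M z = 42 > 0, as the definition requires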


SOS-convexity
A multivariate polynomial is SOS-convex (or sum of squares convex) if its Hessian matrix H can be factored as H(''x'') = ''S''^T(''x'')''S''(''x''), where ''S'' is a matrix (possibly rectangular) whose entries are polynomials in ''x''. In other words, the Hessian matrix is an SOS matrix polynomial. An equivalent definition is that the form defined as ''g''(''x'',''y'') = ''y''^T H(''x'') ''y'' is a sum of squares of forms.

Connection with convexity

If a polynomial is SOS-convex, then it is also convex. Since establishing whether a polynomial is SOS-convex amounts to solving a semidefinite programming problem, SOS-convexity can be used as a proxy for establishing whether a polynomial is convex. In contrast, deciding whether a generic polynomial of degree four or higher is convex is an NP-hard problem. The first counterexample of a polynomial which is convex but not SOS-convex was constructed by Amir Ali Ahmadi and Pablo Parrilo in 2009. The polynomial is a homogeneous polynomial that is ...
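
As an illustrative sketch (the polynomial here is an example chosen for this note, not taken from the article), the factorization H(x) = S^T(x)S(x) can be verified symbolically for p(x,y) = x^4 + y^4:

 import sympy as sp
 
 x, y = sp.symbols('x y')
 p = x**4 + y**4                          # a simple SOS-convex polynomial
 H = sp.hessian(p, (x, y))                # diag(12 x^2, 12 y^2)
 S = sp.Matrix([[2*sp.sqrt(3)*x, 0],
                [0, 2*sp.sqrt(3)*y]])     # polynomial factor S(x, y)
 print(sp.simplify(S.T * S - H))          # zero matrix confirms H = S^T S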




Hilbert's Seventeenth Problem
Hilbert's seventeenth problem is one of the 23 Hilbert problems set out in a celebrated list compiled in 1900 by David Hilbert. It concerns the expression of positive definite rational functions as sums of quotients of squares. The original question may be reformulated as:
* Given a multivariate polynomial that takes only non-negative values over the reals, can it be represented as a sum of squares of rational functions?
Hilbert's question can be restricted to homogeneous polynomials of even degree, since a polynomial of odd degree changes sign, and the homogenization of a polynomial takes only nonnegative values if and only if the same is true for the polynomial.

Motivation

The formulation of the question takes into account that there are non-negative polynomials, for example
:f(x,y,z)=z^6+x^4y^2+x^2y^4-3x^2y^2z^2,
which cannot be represented as a sum of squares of other polynomials. In 1888, Hilbert showed that every non-negative homogeneous polynomial in ''n'' v ...
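
The non-negativity of this example (the Motzkin polynomial) can be seen with a one-line argument, sketched here for completeness: applying the AM–GM inequality to the three non-negative terms z^6, x^4y^2 and x^2y^4 gives
:\frac{z^6 + x^4y^2 + x^2y^4}{3} \ge \sqrt[3]{z^6 \cdot x^4y^2 \cdot x^2y^4} = x^2y^2z^2,
so f(x,y,z) \ge 0 for all real x, y, z.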




Sum-of-squares Optimization
A sum-of-squares optimization program is an optimization problem with a linear cost function and a particular type of constraint on the decision variables. The constraints require that, when the decision variables are used as coefficients in certain polynomials, those polynomials have the polynomial SOS property. When fixing the maximum degree of the polynomials involved, sum-of-squares optimization is also known as the Lasserre hierarchy of relaxations in semidefinite programming. Sum-of-squares optimization techniques have been applied across a variety of areas, including control theory (in particular, for searching for polynomial Lyapunov functions for dynamical systems described by polynomial vector fields), statistics, finance and machine learning.

Optimization problem

The problem can be expressed as
:\max_{u} c^T u
subject to
:a_{k,0}(x) + a_{k,1}(x)u_1 + \cdots + a_{k,n}(x)u_n \in \text{SOS} \quad (k=1,\ldots, N_s).
Here "SOS" represents the class of sum-of-squares (SOS) poly ...
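
A minimal sketch of such a program (the polynomial p is an example chosen here, and cvxpy is one of several libraries that could express it): maximize γ subject to p(x) − γ being SOS, which lower-bounds the global minimum of p(x) = x^4 − 3x^2 + 1. The decision variables are γ and the Gram matrix entries, and the SOS constraint is matched coefficient by coefficient in the basis z = [1, x, x^2].

 import cvxpy as cp
 
 gamma = cp.Variable()
 Q = cp.Variable((3, 3), symmetric=True)   # Gram matrix in basis z = [1, x, x^2]
 constraints = [
     Q >> 0,                        # p(x) - gamma = z^T Q z with Q PSD
     Q[0, 0] == 1 - gamma,          # constant term
     2 * Q[0, 1] == 0,              # coefficient of x
     2 * Q[0, 2] + Q[1, 1] == -3,   # coefficient of x^2
     2 * Q[1, 2] == 0,              # coefficient of x^3
     Q[2, 2] == 1,                  # coefficient of x^4
 ]
 cp.Problem(cp.Maximize(gamma), constraints).solve()
 print(gamma.value)  # about -1.25; exact here, since univariate nonnegative polynomials are SOS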






Symmetric Polynomial
In mathematics, a symmetric polynomial is a polynomial P(X_1, X_2, \ldots, X_n) in ''n'' variables, such that if any of the variables are interchanged, one obtains the same polynomial. Formally, P is a ''symmetric polynomial'' if for any permutation \sigma of the subscripts 1, 2, ..., ''n'' one has P(X_{\sigma(1)}, X_{\sigma(2)}, \ldots, X_{\sigma(n)}) = P(X_1, X_2, \ldots, X_n). Symmetric polynomials arise naturally in the study of the relation between the roots of a polynomial in one variable and its coefficients, since the coefficients can be given by polynomial expressions in the roots, and all roots play a similar role in this setting. From this point of view the elementary symmetric polynomials are the most fundamental symmetric polynomials. A theorem states that any symmetric polynomial can be expressed in terms of elementary symmetric polynomials, which implies that every ''symmetric'' polynomial expression in the roots of a monic polynomial can alternatively be given as a polynomial expression in the coefficients of the polynomial. Symmetric polynomials also form an interesting structure by themselves ...
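
A small sketch (the example is chosen here) verifying this theorem for n = 2: the symmetric polynomial x^2 + y^2 rewritten in the elementary symmetric polynomials e_1 = x + y and e_2 = xy.

 import sympy as sp
 
 x, y = sp.symbols('x y')
 e1, e2 = x + y, x * y                 # elementary symmetric polynomials in x, y
 p = x**2 + y**2                       # symmetric: swapping x and y leaves it fixed
 print(sp.expand(e1**2 - 2*e2 - p))    # 0, so x^2 + y^2 = e1^2 - 2 e2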


Free Algebra
In mathematics, especially in the area of abstract algebra known as ring theory, a free algebra is the noncommutative analogue of a polynomial ring since its elements may be described as "polynomials" with non-commuting variables. Likewise, the polynomial ring may be regarded as a free commutative algebra.

Definition

For ''R'' a commutative ring, the free (associative, unital) algebra on ''n'' indeterminates X_1,\ldots,X_n is the free ''R''-module with a basis consisting of all words over the alphabet \{X_1,\ldots,X_n\} (including the empty word, which is the unit of the free algebra). This ''R''-module becomes an ''R''-algebra by defining a multiplication as follows: the product of two basis elements is the concatenation of the corresponding words:
:\left(X_{i_1}X_{i_2} \cdots X_{i_l}\right) \cdot \left(X_{j_1}X_{j_2} \cdots X_{j_m}\right) = X_{i_1}X_{i_2} \cdots X_{i_l}X_{j_1}X_{j_2} \cdots X_{j_m},
and the product of two arbitrary ''R''-module elements is thus uniquely determined (because the multiplication in an ''R''-algebra must be ''R''-bilinear). This ''R''- ...
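
A tiny illustrative implementation (a sketch, not from the article) of this multiplication: elements are stored as dictionaries from words (tuples of generator indices) to coefficients, and basis words multiply by concatenation, extended bilinearly.

 from collections import defaultdict
 
 def mul(a, b):
     """Product in the free algebra: concatenate basis words, extend bilinearly."""
     out = defaultdict(int)
     for w1, c1 in a.items():
         for w2, c2 in b.items():
             out[w1 + w2] += c1 * c2   # tuple concatenation = word concatenation
     return dict(out)
 
 # (X1 + X2)(X1 - X2) = X1X1 - X1X2 + X2X1 - X2X2; note X1X2 and X2X1 differ.
 a = {(1,): 1, (2,): 1}
 b = {(1,): 1, (2,): -1}
 print(mul(a, b))   # {(1, 1): 1, (1, 2): -1, (2, 1): 1, (2, 2): -1}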


Kronecker Product
In mathematics, the Kronecker product, sometimes denoted by ⊗, is an operation on two matrices of arbitrary size resulting in a block matrix. It is a generalization of the outer product (which is denoted by the same symbol) from vectors to matrices, and gives the matrix of the tensor product linear map with respect to a standard choice of basis. The Kronecker product is to be distinguished from the usual matrix multiplication, which is an entirely different operation. The Kronecker product is also sometimes called the matrix direct product. The Kronecker product is named after the German mathematician Leopold Kronecker (1823–1891), even though there is little evidence that he was the first to define and use it. The Kronecker product has also been called the ''Zehfuss matrix'', and the ''Zehfuss product'', after Johann Georg Zehfuss, who in 1858 described this matrix operation, but Kronecker product is currently the most widely used.

Definition

If A is an m \times n matrix and B is a p \times q matrix, then the ...
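
A quick numerical illustration (an added sketch with example matrices) using NumPy's kron, checking the block structure and the (mp) × (nq) shape:

 import numpy as np
 
 A = np.array([[1, 2], [3, 4]])         # 2 x 2
 B = np.array([[0, 5], [6, 7]])         # 2 x 2
 K = np.kron(A, B)                      # block matrix [[1*B, 2*B], [3*B, 4*B]]
 print(K.shape)                         # (4, 4): an (mp) x (nq) matrix
 assert np.allclose(K[:2, :2], 1 * B)   # top-left block is a11 * B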


Linear Matrix Inequality
In convex optimization, a linear matrix inequality (LMI) is an expression of the form
:\operatorname{LMI}(y):=A_0+y_1A_1+y_2A_2+\cdots+y_m A_m\succeq 0
where
* y=[y_i,\ i=1,\dots,m] is a real vector,
* A_0, A_1, A_2,\dots,A_m are n\times n symmetric matrices in \mathbb{S}^n,
* B\succeq 0 is a generalized inequality meaning B is a positive semidefinite matrix belonging to the positive semidefinite cone \mathbb{S}_+ in the subspace of symmetric matrices \mathbb{S}.
This linear matrix inequality specifies a convex constraint on ''y''.

Applications

There are efficient numerical methods to determine whether an LMI is feasible (''e.g.'', whether there exists a vector ''y'' such that LMI(''y'') ≥ 0), or to solve a convex optimization problem with LMI constraints. Many optimization problems in control theory, system identification and signal processing can be formulated using LMIs. Also LMIs find application in Polynomial Sum-Of-Squares. The prototypical primal and dua ...
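
A minimal feasibility sketch (the 2 × 2 data are toy values chosen here) with cvxpy: find y_1 such that A_0 + y_1 A_1 is positive semidefinite.

 import cvxpy as cp
 import numpy as np
 
 A0 = np.array([[1.0, 0.0], [0.0, -1.0]])
 A1 = np.array([[0.0, 0.0], [0.0, 1.0]])
 y1 = cp.Variable()
 # LMI(y) = A0 + y1*A1 >= 0 holds iff y1 >= 1 (the (2,2) entry is y1 - 1)
 prob = cp.Problem(cp.Minimize(0), [A0 + y1 * A1 >> 0])
 prob.solve()
 print(prob.status, y1.value)   # 'optimal' together with a feasible y1 >= 1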


Matrix (mathematics)
In mathematics, a matrix (plural matrices) is a rectangular array or table of numbers, symbols, or expressions, arranged in rows and columns, which is used to represent a mathematical object or a property of such an object. For example,
:\begin{bmatrix}1 & 9 & -13 \\ 20 & 5 & -6 \end{bmatrix}
is a matrix with two rows and three columns. This is often referred to as a "two by three matrix", a "2 \times 3-matrix", or a matrix of dimension 2 \times 3. Without further specifications, matrices represent linear maps, and allow explicit computations in linear algebra. Therefore, the study of matrices is a large part of linear algebra, and most properties and operations of abstract linear algebra can be expressed in terms of matrices. For example, matrix multiplication represents composition of linear maps. Not all matrices are related to linear algebra. This is, in particular, the case in graph theory, of incidence matrices, and adjacency matrices. ''This article focuses on matrices related to linear algebra, and, un ...
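
As a brief added illustration (random test data chosen here), matrix–vector products realize linear maps, and matrix multiplication realizes their composition:

 import numpy as np
 
 A = np.array([[1, 9, -13], [20, 5, -6]])   # the 2 x 3 example matrix above
 B = np.random.rand(3, 4)                   # a linear map R^4 -> R^3
 v = np.random.rand(4)
 # Applying B then A equals applying the product A @ B: composition of maps.
 assert np.allclose(A @ (B @ v), (A @ B) @ v)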