In mathematics, a form (i.e. a homogeneous polynomial) ''h''(''x'') of degree 2''m'' in the real ''n''-dimensional vector ''x'' is a sum of squares of forms (SOS) if and only if there exist forms g_1(x),\ldots,g_k(x) of degree ''m'' such that h(x) = \sum_{i=1}^k g_i(x)^2.

Every form that is SOS is also a positive polynomial, and although the converse is not always true, Hilbert proved that for ''n'' = 2, for 2''m'' = 2, and for ''n'' = 3 with 2''m'' = 4, a form is SOS if and only if it is positive. The same holds for the analogous problem on positive ''symmetric'' forms. Although not every form can be represented as SOS, explicit sufficient conditions for a form to be SOS have been found. Moreover, every real nonnegative form can be approximated as closely as desired (in the l_1-norm of its coefficient vector) by a sequence of forms \{f_\epsilon\} that are SOS.
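As a concrete illustration of the definition, the degree-4 form h(x)=x_1^4-x_1^2x_2^2+x_2^4 (used again in the examples below) admits the explicit decomposition h(x) = (x_1^2-x_2^2)^2 + (x_1x_2)^2, with both g_i of degree ''m'' = 2. A minimal numerical sketch of this identity:

```python
import numpy as np

# SOS decomposition of h(x) = x1^4 - x1^2*x2^2 + x2^4:
#   h(x) = (x1^2 - x2^2)^2 + (x1*x2)^2,
# where each g_i has degree m = 2 while h has degree 2m = 4.

def h(x1, x2):
    return x1**4 - x1**2 * x2**2 + x2**4

def sos(x1, x2):
    g1 = x1**2 - x2**2   # form of degree m = 2
    g2 = x1 * x2         # form of degree m = 2
    return g1**2 + g2**2

rng = np.random.default_rng(0)
pts = rng.standard_normal((1000, 2))
assert np.allclose(h(pts[:, 0], pts[:, 1]), sos(pts[:, 0], pts[:, 1]))
print("h equals g1^2 + g2^2 at 1000 random points")
```

The agreement at random points is of course no substitute for the algebraic identity, but it is a quick sanity check of an SOS certificate.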


Square matricial representation (SMR)

To establish whether a form h(x) is SOS amounts to solving a convex optimization problem. Indeed, any h(x) can be written as h(x) = x^{\{m\}\prime}\left(H+L(\alpha)\right)x^{\{m\}} where x^{\{m\}} is a vector containing a base for the forms of degree ''m'' in ''x'' (such as all monomials of degree ''m'' in ''x''), the prime ′ denotes the transpose, ''H'' is any symmetric matrix satisfying h(x) = x^{\{m\}\prime}Hx^{\{m\}}, and L(\alpha) is a linear parameterization of the linear space \mathcal{L} = \left\{L=L' : x^{\{m\}\prime}Lx^{\{m\}}=0\right\}. The dimension of the vector x^{\{m\}} is given by \sigma(n,m) = \binom{n+m-1}{m}, whereas the dimension of the vector \alpha is given by \omega(n,2m) = \frac{1}{2}\sigma(n,m)\left(1+\sigma(n,m)\right)-\sigma(n,2m). Then, h(x) is SOS if and only if there exists a vector \alpha such that H + L(\alpha) \ge 0, meaning that the matrix H + L(\alpha) is positive-semidefinite. This is a linear matrix inequality (LMI) feasibility test, which is a convex optimization problem. The expression h(x)=x^{\{m\}\prime}\left(H+L(\alpha)\right)x^{\{m\}} was introduced under the name square matricial representation (SMR) in order to establish whether a form is SOS via an LMI. This representation is also known as the Gram matrix.
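The role of L(\alpha) can be checked numerically. For h(x)=x_1^4-x_1^2x_2^2+x_2^4 with base x^{\{m\}} = (x_1^2, x_1x_2, x_2^2)', we have \sigma(2,2)=3 and \omega(2,4)=1, so \alpha is a single scalar. The matrices H and L(\alpha) below are one possible choice, assumed for illustration; the single degree of freedom exploits the identity x_1^2\,x_2^2 = (x_1x_2)^2:

```python
import numpy as np

# One choice of H with h(x) = x'{m} H x{m} for h = x1^4 - x1^2*x2^2 + x2^4,
# using the base x{m} = (x1^2, x1*x2, x2^2)'.
H = np.array([[1.0,  0.0, 0.0],
              [0.0, -1.0, 0.0],
              [0.0,  0.0, 1.0]])

def L(alpha):
    # x'{m} L x{m} = -2*a*x1^2*x2^2 + 2*a*(x1*x2)^2 = 0 for every a,
    # so L(alpha) parameterizes (part of) the space of Gram ambiguities.
    a = float(alpha)
    return np.array([[0.0,     0.0, -a],
                     [0.0, 2.0 * a, 0.0],
                     [-a,      0.0, 0.0]])

def h(x1, x2):
    return x1**4 - x1**2 * x2**2 + x2**4

rng = np.random.default_rng(1)
for alpha in (-0.7, 0.0, 1.0):
    for _ in range(100):
        x1, x2 = rng.standard_normal(2)
        v = np.array([x1**2, x1 * x2, x2**2])
        assert np.isclose(v @ (H + L(alpha)) @ v, h(x1, x2))
print("x'{m} (H + L(alpha)) x{m} reproduces h for every alpha tested")
```

The SOS question then reduces to asking whether some \alpha makes H + L(\alpha) positive-semidefinite; in practice this LMI feasibility test is handed to a semidefinite-programming solver.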


Examples

*Consider the form of degree 4 in two variables h(x)=x_1^4-x_1^2x_2^2+x_2^4. We have m = 2,~x^{\{m\}} = \begin{pmatrix} x_1^2\\x_1x_2\\x_2^2\end{pmatrix}\!, ~H+L(\alpha) = \begin{pmatrix} 1&0&-\alpha_1\\0&-1+2\alpha_1&0\\-\alpha_1&0&1 \end{pmatrix}\!. Since there exists \alpha such that H+L(\alpha)\ge 0, namely \alpha_1=1, it follows that ''h''(''x'') is SOS.
*Consider the form of degree 4 in three variables h(x)=2x_1^4-2.5x_1^3x_2+x_1^2x_2x_3-2x_1x_3^3+5x_2^4+x_3^4. We have m=2,~x^{\{m\}}=\begin{pmatrix}x_1^2\\x_1x_2\\x_1x_3\\x_2^2\\x_2x_3\\x_3^2\end{pmatrix}\!, ~H+L(\alpha) = \begin{pmatrix} 2&-1.25&0&-\alpha_1&-\alpha_2&-\alpha_3\\ -1.25&2\alpha_1&0.5+\alpha_2&0&-\alpha_4&-\alpha_5\\ 0&0.5+\alpha_2&2\alpha_3&\alpha_4&\alpha_5&-1\\ -\alpha_1&0&\alpha_4&5&0&-\alpha_6\\ -\alpha_2&-\alpha_4&\alpha_5&0&2\alpha_6&0\\ -\alpha_3&-\alpha_5&-1&-\alpha_6&0&1 \end{pmatrix}\!. Since H+L(\alpha)\ge 0 for \alpha=(1.18,-0.43,0.73,1.13,-0.37,0.57), it follows that ''h''(''x'') is SOS.
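For the first example, \alpha_1=1 gives the Gram matrix \begin{pmatrix}1&0&-1\\0&1&0\\-1&0&1\end{pmatrix}, and an eigendecomposition of a positive-semidefinite Gram matrix yields an explicit SOS decomposition h(x)=\sum_i g_i(x)^2 with g_i(x)=\sqrt{\lambda_i}\,(w_i'x^{\{m\}}). A numpy sketch of this extraction:

```python
import numpy as np

# Gram matrix H + L(alpha) of the first example with alpha_1 = 1.
M = np.array([[ 1.0, 0.0, -1.0],
              [ 0.0, 1.0,  0.0],
              [-1.0, 0.0,  1.0]])

eigvals, eigvecs = np.linalg.eigh(M)
assert eigvals.min() > -1e-12          # PSD check

def g_forms(x1, x2):
    v = np.array([x1**2, x1 * x2, x2**2])          # base of degree m = 2
    # g_i(x) = sqrt(lambda_i) * (w_i' x{m}) for each eigenpair (lambda_i, w_i)
    return np.sqrt(np.clip(eigvals, 0, None)) * (eigvecs.T @ v)

rng = np.random.default_rng(2)
for _ in range(200):
    x1, x2 = rng.standard_normal(2)
    h = x1**4 - x1**2 * x2**2 + x2**4
    assert np.isclose(np.sum(g_forms(x1, x2)**2), h)
print("eigendecomposition of the Gram matrix yields an SOS decomposition")
```

A Cholesky factorization would serve the same purpose when the Gram matrix is positive-definite; the eigendecomposition is used here because this particular matrix is singular.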


Generalizations


Matrix SOS

A matrix form ''F''(''x'') (i.e., a matrix whose entries are forms) of dimension ''r'' and degree 2''m'' in the real ''n''-dimensional vector ''x'' is SOS if and only if there exist matrix forms G_1(x),\ldots,G_k(x) of degree ''m'' such that F(x)=\sum_{i=1}^k G_i(x)'G_i(x).


Matrix SMR

To establish whether a matrix form ''F''(''x'') is SOS amounts to solving a convex optimization problem. Indeed, similarly to the scalar case any ''F''(''x'') can be written according to the SMR as F(x) = \left(x^{\{m\}}\otimes I_r\right)'\left(H+L(\alpha)\right)\left(x^{\{m\}}\otimes I_r\right) where \otimes is the Kronecker product of matrices, ''H'' is any symmetric matrix satisfying F(x) = \left(x^{\{m\}}\otimes I_r\right)'H\left(x^{\{m\}}\otimes I_r\right), and L(\alpha) is a linear parameterization of the linear space \mathcal{L}=\left\{L=L' : \left(x^{\{m\}}\otimes I_r\right)'L\left(x^{\{m\}}\otimes I_r\right)=0\right\}. The dimension of the vector \alpha is given by \omega(n,2m,r)=\frac{1}{2}r\left(\sigma(n,m)\left(r\sigma(n,m)+1\right)-(r+1)\sigma(n,2m)\right). Then, F(x) is SOS if and only if there exists a vector \alpha such that the following LMI holds: H+L(\alpha) \ge 0. The expression F(x) = \left(x^{\{m\}}\otimes I_r\right)'\left(H+L(\alpha)\right)\left(x^{\{m\}}\otimes I_r\right) was introduced in order to establish whether a matrix form is SOS via an LMI.
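The Kronecker structure can be verified directly. For a 2×2 matrix form of degree 2 (''m'' = 1, base x^{\{m\}} = (x_1, x_2)', ''r'' = 2), the matrix H below is one symmetric matrix, assumed for illustration, satisfying the SMR identity:

```python
import numpy as np

# Matrix-SMR sketch for the 2x2 matrix form
#   F(x) = [[x1^2, x1*x2], [x1*x2, x1^2 - 2*x1*x2 + 2*x2^2]],
# with m = 1, base x{m} = (x1, x2)' and r = 2. H is one symmetric
# matrix (an assumed choice) with F(x) = (x{m} kron I_2)' H (x{m} kron I_2).
H = np.array([[1.0, 0.0, 0.0,  0.5],
              [0.0, 1.0, 0.5, -1.0],
              [0.0, 0.5, 0.0,  0.0],
              [0.5, -1.0, 0.0, 2.0]])

def F(x1, x2):
    return np.array([[x1**2,   x1 * x2],
                     [x1 * x2, x1**2 - 2 * x1 * x2 + 2 * x2**2]])

rng = np.random.default_rng(4)
for _ in range(200):
    x1, x2 = rng.standard_normal(2)
    xm = np.array([[x1], [x2]])        # base vector x{m}
    K = np.kron(xm, np.eye(2))         # (x{m} kron I_2), shape 4x2
    assert np.allclose(K.T @ H @ K, F(x1, x2))
print("F(x) equals (x{m} kron I_2)' H (x{m} kron I_2) at all sampled points")
```

As in the scalar case, F(x) is SOS exactly when some \alpha shifts H by L(\alpha) into the positive-semidefinite cone, which is an LMI feasibility problem.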


Noncommutative polynomial SOS

Consider the free algebra ''R''⟨''X''⟩ generated by the ''n'' noncommuting letters ''X'' = (''X''1, ..., ''X''''n'') and equipped with the involution T, such that T fixes ''R'' and ''X''1, ..., ''X''''n'' and reverses words formed by ''X''1, ..., ''X''''n''. By analogy with the commutative case, the noncommutative symmetric polynomials ''f'' are the noncommutative polynomials of the form f = f^T. When evaluating a symmetric noncommutative polynomial ''f'' at any real matrices of any dimension ''r'' × ''r'' results in a positive semi-definite matrix, ''f'' is said to be matrix-positive. A noncommutative polynomial is SOS if there exist noncommutative polynomials h_1,\ldots,h_k such that f(X) = \sum_{i=1}^{k} h_i(X)^T h_i(X). Surprisingly, in the noncommutative scenario a noncommutative polynomial is SOS if and only if it is matrix-positive. Moreover, there exist algorithms to decompose matrix-positive polynomials into sums of squares of noncommutative polynomials.
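A minimal sketch of matrix-positivity, using the SOS noncommutative polynomial f(X_1,X_2) = (X_1+X_2)^T(X_1+X_2) (so k = 1 and h_1 = X_1+X_2, a choice made only for illustration): its evaluations at real matrices of every size are positive-semidefinite.

```python
import numpy as np

# f(X1, X2) = (X1 + X2)^T (X1 + X2) is SOS (k = 1, h_1 = X1 + X2),
# so its evaluation at real r x r matrices is always PSD.
def f(X1, X2):
    S = X1 + X2
    return S.T @ S

rng = np.random.default_rng(5)
for r in (2, 3, 5):
    for _ in range(50):
        X1 = rng.standard_normal((r, r))
        X2 = rng.standard_normal((r, r))
        M = f(X1, X2)
        w = np.linalg.eigvalsh((M + M.T) / 2)   # symmetrize against rounding
        assert w.min() > -1e-10
print("f(X1, X2) is matrix-positive at all sampled matrix tuples")
```

The converse direction, that every matrix-positive noncommutative polynomial is SOS, is the nontrivial content of the theorem stated above.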



See also

* Sum-of-squares optimization
* Positive polynomial
* Hilbert's seventeenth problem
* SOS-convexity