SOS-convexity
A multivariate polynomial is SOS-convex (or sum of squares convex) if its Hessian matrix H can be factored as H(x) = S^T(x) S(x), where S is a matrix (possibly rectangular) whose entries are polynomials in x. In other words, the Hessian matrix is an SOS matrix polynomial. An equivalent definition is that the form defined as g(x,y) = y^T H(x) y is a sum of squares of forms.

Connection with convexity

If a polynomial is SOS-convex, then it is also convex. Since establishing whether a polynomial is SOS-convex amounts to solving a semidefinite programming problem, SOS-convexity can be used as a proxy for establishing whether a polynomial is convex. In contrast, deciding whether a generic polynomial of even degree at least four is convex is an NP-hard problem. The first counterexample of a polynomial that is convex but not SOS-convex was constructed by Amir Ali Ahmadi and Pablo Parrilo in 2009. The polynomial is a homogeneous polynomial that is su ...
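
A minimal worked illustration (not from the article), using the homogeneous quartic f(x_1, x_2) = x_1^4 + x_2^4:

H(x) = \begin{pmatrix} 12x_1^2 & 0 \\ 0 & 12x_2^2 \end{pmatrix}, \qquad y^T H(x) y = 12x_1^2y_1^2 + 12x_2^2y_2^2 = (2\sqrt{3}\,x_1y_1)^2 + (2\sqrt{3}\,x_2y_2)^2,

so H(x) = S^T(x)S(x) with S(x) = \operatorname{diag}(2\sqrt{3}\,x_1,\; 2\sqrt{3}\,x_2), and f is SOS-convex.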


Polynomial SOS
In mathematics, a form (i.e. a homogeneous polynomial) h(x) of degree 2m in the real n-dimensional vector x is a sum of squares of forms (SOS) if and only if there exist forms g_1(x),\ldots,g_k(x) of degree m such that h(x) = \sum_{i=1}^k g_i(x)^2. Every form that is SOS is also a positive polynomial, and although the converse is not always true, Hilbert proved that for n = 2, for 2m = 2, or for n = 3 and 2m = 4, a form is SOS if and only if it is positive. The same is also valid for the analogous problem on positive symmetric forms. Although not every form can be represented as SOS, explicit sufficient conditions for a form to be SOS have been found. Moreover, every real nonnegative form can be approximated as closely as desired (in the l_1-norm of its coefficient vector) by a sequence of forms that are SOS.

Square matricial representation (SMR)

Establishing whether a form is SOS amounts to solving a convex optimization problem. Indeed, any form can be ...
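
A small worked illustration of the SMR idea (not part of the excerpt): the bivariate quartic form h(x_1, x_2) = x_1^4 + 2x_1^2x_2^2 + x_2^4 admits two different SOS decompositions,

h = (x_1^2 + x_2^2)^2 = (x_1^2 - x_2^2)^2 + (2x_1x_2)^2.

With the monomial vector z = (x_1^2, x_1x_2, x_2^2)^T, each decomposition corresponds to writing h = z^T Q z for a positive semidefinite Gram matrix Q, here

Q_1 = \begin{pmatrix} 1 & 0 & 1 \\ 0 & 0 & 0 \\ 1 & 0 & 1 \end{pmatrix} \quad \text{and} \quad Q_2 = \begin{pmatrix} 1 & 0 & -1 \\ 0 & 4 & 0 \\ -1 & 0 & 1 \end{pmatrix},

respectively; searching over all positive semidefinite Q satisfying the affine identity z^T Q z = h is precisely the semidefinite program referred to above.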


Hilbert's Seventeenth Problem
Hilbert's seventeenth problem is one of the 23 Hilbert problems set out in a celebrated list compiled in 1900 by David Hilbert. It concerns the expression of positive definite rational functions as sums of quotients of squares. The original question may be reformulated as: given a multivariate polynomial that takes only non-negative values over the reals, can it be represented as a sum of squares of rational functions? Hilbert's question can be restricted to homogeneous polynomials of even degree, since a polynomial of odd degree changes sign, and the homogenization of a polynomial takes only nonnegative values if and only if the same is true for the polynomial.

Motivation

The formulation of the question takes into account that there are non-negative polynomials, for example

f(x,y,z) = z^6 + x^4y^2 + x^2y^4 - 3x^2y^2z^2,

which cannot be represented as a sum of squares of other polynomials. In 1888, Hilbert showed that every non-negative homogeneous polynomial in n varia ...
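
A standard verification (not part of the excerpt) that this form is indeed non-negative uses the arithmetic–geometric mean inequality on the three non-negative terms x^4y^2, x^2y^4 and z^6:

\frac{x^4y^2 + x^2y^4 + z^6}{3} \;\ge\; \sqrt[3]{x^4y^2 \cdot x^2y^4 \cdot z^6} \;=\; x^2y^2z^2,

so f(x,y,z) \ge 0 for all real x, y, z, while the harder fact is that f is nevertheless not a sum of squares of polynomials.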



Multivariate Polynomial
In mathematics, a polynomial is an expression consisting of indeterminates (also called variables) and coefficients, that involves only the operations of addition, subtraction, multiplication, and positive-integer powers of variables. An example of a polynomial of a single indeterminate x is x^2 - 4x + 7. An example with three indeterminates is x^3 + 2xyz^2 - yz + 1. Polynomials appear in many areas of mathematics and science. For example, they are used to form polynomial equations, which encode a wide range of problems, from elementary word problems to complicated scientific problems; they are used to define polynomial functions, which appear in settings ranging from basic chemistry and physics to economics and social science; and they are used in calculus and numerical analysis to approximate other functions. In advanced mathematics, polynomials are used to construct polynomial rings and algebraic varieties, which are central concepts in algebra and algebraic geometry.

Etymology

The word polynomial joins tw ...


Hessian Matrix
In mathematics, the Hessian matrix or Hessian is a square matrix of second-order partial derivatives of a scalar-valued function, or scalar field. It describes the local curvature of a function of many variables. The Hessian matrix was developed in the 19th century by the German mathematician Ludwig Otto Hesse and later named after him. Hesse originally used the term "functional determinants".

Definitions and properties

Suppose f : \R^n \to \R is a function taking as input a vector \mathbf{x} \in \R^n and outputting a scalar f(\mathbf{x}) \in \R. If all second-order partial derivatives of f exist, then the Hessian matrix \mathbf{H} of f is a square n \times n matrix, usually defined and arranged as follows:

\mathbf{H}_f = \begin{pmatrix}
\dfrac{\partial^2 f}{\partial x_1^2} & \dfrac{\partial^2 f}{\partial x_1\,\partial x_2} & \cdots & \dfrac{\partial^2 f}{\partial x_1\,\partial x_n} \\
\dfrac{\partial^2 f}{\partial x_2\,\partial x_1} & \dfrac{\partial^2 f}{\partial x_2^2} & \cdots & \dfrac{\partial^2 f}{\partial x_2\,\partial x_n} \\
\vdots & \vdots & \ddots & \vdots \\
\dfrac{\partial^2 f}{\partial x_n\,\partial x_1} & \dfrac{\partial^2 f}{\partial x_n\,\partial x_2} & \cdots & \dfrac{\partial^2 f}{\partial x_n^2}
\end{pmatrix},

or, by stating an equation for the coefficients using indices i and j,

(\mathbf{H}_f)_{i,j} = \frac{\partial^2 f}{\partial x_i\,\partial x_j} ...
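
As a brief illustrative sketch (not part of the article), the Hessian of a polynomial scalar field can be computed symbolically in Python with SymPy; the function f(x, y) = x^3 + x*y^2 below is an arbitrary example:

    import sympy as sp

    x, y = sp.symbols('x y')
    f = x**3 + x*y**2              # example scalar field f : R^2 -> R

    # hessian() assembles the matrix of second-order partial derivatives
    H = sp.hessian(f, (x, y))
    print(H)                       # Matrix([[6*x, 2*y], [2*y, 2*x]])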




Semidefinite Programming
Semidefinite programming (SDP) is a subfield of convex optimization concerned with the optimization of a linear objective function (a user-specified function that the user wants to minimize or maximize) over the intersection of the cone of positive semidefinite matrices with an affine space, i.e., a spectrahedron. Semidefinite programming is a relatively new field of optimization which is of growing interest for several reasons. Many practical problems in operations research and combinatorial optimization can be modeled or approximated as semidefinite programming problems. In automatic control theory, SDPs are used in the context of linear matrix inequalities. SDPs are in fact a special case of cone programming and can be efficiently solved by interior point methods. All linear programs and (convex) quadratic programs can be expressed as SDPs, and via hierarchies of SDPs the solutions of polynomial optimization problems can be approximated. Semidefinite programming has been use ...
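
As an illustrative sketch (not from the article), a tiny SDP can be stated and solved in Python with CVXPY; the 2x2 data C, A and the right-hand side b are made up for the example:

    import cvxpy as cp
    import numpy as np

    # made-up problem data
    C = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    A = np.eye(2)
    b = 1.0

    # decision variable: a symmetric matrix constrained to be PSD
    X = cp.Variable((2, 2), symmetric=True)
    constraints = [X >> 0,                    # X lies in the PSD cone
                   cp.trace(A @ X) == b]      # affine (spectrahedral) constraint

    prob = cp.Problem(cp.Minimize(cp.trace(C @ X)), constraints)
    prob.solve()
    print(prob.value, X.value)

The linear objective trace(C X) is minimized over the intersection of the PSD cone with the affine set trace(A X) = b, i.e. over a spectrahedron, matching the description above.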


Amir Ali Ahmadi
Amir Ali Ahmadi is a professor in the Department of Operations Research and Financial Engineering at Princeton University. He is primarily known for his work on mathematical optimization.

Biography

Ahmadi obtained a B.S. in both mathematics and electrical engineering at the University of Maryland in 2006. He then received his M.S. and PhD from the Massachusetts Institute of Technology in 2008 and 2011 respectively, where he was supervised by Pablo Parrilo. After this, he spent a year in the Robot Locomotion Group at MIT as a postdoctoral fellow before joining the IBM Watson Research Center in 2012 as a Herman Goldstine Fellow. He is now a professor in the Department of Operations Research and Financial Engineering at Princeton University. Princeton University is a private research university in Princeton, New Jersey. Founded in 1746 in Elizabeth as the College of New Jersey, Princeton is the fourth-oldest ins ...


Pablo Parrilo
Pablo A. Parrilo, from MIT (Massachusetts Institute of Technology), was named Fellow of the Institute of Electrical and Electronics Engineers (IEEE) in 2016 for contributions to semidefinite and sum-of-squares optimization. He was named a SIAM Fellow in 2018. The SIAM Fellowship is an award and fellowship that recognizes outstanding members of the Society for Industrial and Applied Mathematics (SIAM); the goal of the program is to honor SIAM members who are recognized by their peers as distinguished ...


Non-constructive
In mathematics, a constructive proof is a method of proof that demonstrates the existence of a mathematical object by creating or providing a method for creating the object. This is in contrast to a non-constructive proof (also known as an existence proof, existence theorem, or pure existence theorem), which proves the existence of a particular kind of object without providing an example. For avoiding confusion with the stronger concept that follows, such a constructive proof is sometimes called an effective proof. A constructive proof may also refer to the stronger concept of a proof that is valid in constructive mathematics. Constructivism is a mathematical philosophy that rejects all proof methods that involve the existence of objects that are not explicitly built. This excludes, in particular, the use of the law of the excluded middle, the axiom of infinity, and the axiom of choice, and induces a different meaning for some ter ...




Sum-of-squares Optimization
A sum-of-squares optimization program is an optimization problem with a linear cost function and a particular type of constraint on the decision variables. The constraints require that, when the decision variables are used as coefficients in certain polynomials, those polynomials have the polynomial SOS property. When fixing the maximum degree of the polynomials involved, sum-of-squares optimization is also known as the Lasserre hierarchy of relaxations in semidefinite programming. Sum-of-squares optimization techniques have been applied across a variety of areas, including control theory (in particular, for searching for polynomial Lyapunov functions for dynamical systems described by polynomial vector fields), statistics, finance and machine learning.

Optimization problem

The problem can be expressed as

\max_{u \in \R^n} c^T u

subject to

a_{k,0}(x) + a_{k,1}(x)\,u_1 + \cdots + a_{k,n}(x)\,u_n \in \text{SOS} \quad (k = 1,\ldots, N_s).

Here "SOS" represents the class of sum-of-squares (SOS) pol ...
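
An illustrative sketch (not from the article) of the simplest such program, in Python with CVXPY: compute a lower bound gamma on p(x) = x^4 - 2x^2 + 1 = (x^2 - 1)^2 by maximizing gamma subject to p(x) - gamma being SOS. With the monomial vector z(x) = (1, x, x^2), the SOS constraint is equivalent to the existence of a positive semidefinite Gram matrix Q with p(x) - gamma = z(x)^T Q z(x), enforced below by matching coefficients:

    import cvxpy as cp

    # decision variables: the bound gamma and the Gram matrix Q
    gamma = cp.Variable()
    Q = cp.Variable((3, 3), symmetric=True)

    constraints = [
        Q >> 0,                        # Q PSD  <=>  z^T Q z is a sum of squares
        Q[0, 0] == 1 - gamma,          # constant term of p(x) - gamma
        2 * Q[0, 1] == 0,              # coefficient of x
        Q[1, 1] + 2 * Q[0, 2] == -2,   # coefficient of x^2
        2 * Q[1, 2] == 0,              # coefficient of x^3
        Q[2, 2] == 1,                  # coefficient of x^4
    ]

    prob = cp.Problem(cp.Maximize(gamma), constraints)
    prob.solve()
    print(prob.value)                  # approximately 0, the true minimum of p

Here gamma plays the role of the decision vector u, and the coefficient-matching equalities make the SOS constraint affine in gamma, exactly as in the displayed problem.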


Homogeneous Polynomials
In mathematics, a homogeneous polynomial, sometimes called quantic in older texts, is a polynomial whose nonzero terms all have the same degree. For example, x^5 + 2 x^3 y^2 + 9 x y^4 is a homogeneous polynomial of degree 5, in two variables; the sum of the exponents in each term is always 5. The polynomial x^3 + 3 x^2 y + z^7 is not homogeneous, because the sum of exponents does not match from term to term. The function defined by a homogeneous polynomial is always a homogeneous function. An algebraic form, or simply form, is a function defined by a homogeneous polynomial. A binary form is a form in two variables. A form is also a function defined on a vector space, which may be expressed as a homogeneous function of the coordinates over any basis. A polynomial of degree 0 is always homogeneous; it is simply an element of the field or ring of the ...
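
For instance, homogeneity of the first example above can be checked directly:

p(\lambda x, \lambda y) = (\lambda x)^5 + 2(\lambda x)^3(\lambda y)^2 + 9(\lambda x)(\lambda y)^4 = \lambda^5\left(x^5 + 2x^3y^2 + 9xy^4\right) = \lambda^5\, p(x, y),

which is the defining property of a homogeneous function of degree 5.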


Real Algebraic Geometry
In mathematics, real algebraic geometry is the sub-branch of algebraic geometry studying real algebraic sets, i.e. real-number solutions to algebraic equations with real-number coefficients, and mappings between them (in particular real polynomial mappings). Semialgebraic geometry is the study of semialgebraic sets, i.e. real-number solutions to algebraic inequalities with real-number coefficients, and mappings between them. The most natural mappings between semialgebraic sets are semialgebraic mappings, i.e., mappings whose graphs are semialgebraic sets.

Terminology

Nowadays the words 'semialgebraic geometry' and 'real algebraic geometry' are used as synonyms, because real algebraic sets cannot be studied seriously without the use of semialgebraic sets. For example, a projection of a real algebraic set along a coordinate axis need not be a real algebraic set, but it is always a semialgebraic set: this is the Tarski–Seidenberg theorem. Related fields are o-minimal theory and r ...
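
A standard illustration (not part of the excerpt): the unit circle \{(x, y) \in \R^2 : x^2 + y^2 = 1\} is a real algebraic set, but its projection onto the x-axis is the interval [-1, 1] = \{x \in \R : 1 - x^2 \ge 0\}, which is semialgebraic yet not algebraic, since the only algebraic subsets of \R are the finite sets and \R itself.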