
8. Sum‐of‐squares Polynomials

Fabian Geyer edited this page Sep 30, 2024 · 3 revisions

Definition

The problem of determining whether a general function is nonnegative, i.e., whether

$$ F(x_1, \dots, x_n) \geq 0 \quad \text{for all} \quad x_1, \dots, x_n \in \mathbb{R}, $$

is a fundamental question in mathematics with wide-ranging applications across various fields. As Parrilo discusses in his influential work [1], this problem is crucial for many theoretical and practical purposes.

To make the problem tractable, we restrict attention to polynomial functions, that is,

$$ p(x)=\sum\limits_{\alpha} c_{\alpha}x^{\alpha}=\sum\limits_{\alpha} c_{\alpha}x_{1}^{\alpha_{1}}x_{2}^{\alpha_{2}}\dots x_{n}^{\alpha_{n}} $$

as described in detail in 2. Polynomials. We write

$$p(x) \in \mathbb{R}[x].$$
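As a concrete sketch of this representation, a polynomial can be stored as a map from exponent tuples $\alpha$ to coefficients $c_\alpha$ and evaluated term by term (the dictionary encoding is just one convenient choice; the polynomial used here is the example revisited later on this page):

```python
# Evaluate p(x) = sum_alpha c_alpha * x^alpha, with p stored as a dict
# mapping exponent tuples alpha = (alpha_1, ..., alpha_n) to coefficients.
def eval_poly(coeffs, x):
    total = 0.0
    for alpha, c in coeffs.items():
        term = float(c)
        for xi, ai in zip(x, alpha):
            term *= xi ** ai
        total += term
    return total

# 2*x^4 + 2*x^3*y - x^2*y^2 + 5*y^4 (the worked example further below)
p = {(4, 0): 2, (3, 1): 2, (2, 2): -1, (0, 4): 5}
print(eval_poly(p, [1.0, 1.0]))  # 2 + 2 - 1 + 5 = 8.0
```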

One potential method to establish the nonnegativity of such a polynomial is to demonstrate that it can be decomposed into a sum of squares. In other words,

$$p(x) = \sum\limits_{i}f_{i}(x)^{2}$$

where each $f_i(x)$ is a polynomial.
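For instance, $p(x, y) = x^2 + 2xy + 2y^2 = (x + y)^2 + y^2$ is a sum of squares (a hypothetical toy example, not from the text); the identity can be sanity-checked numerically:

```python
import random

# Check that p(x, y) = x^2 + 2xy + 2y^2 agrees with the sum of
# squares (x + y)^2 + y^2 at random sample points.
def p(x, y):
    return x**2 + 2*x*y + 2*y**2

def sos(x, y):
    return (x + y)**2 + y**2

random.seed(0)
for _ in range(1000):
    x, y = random.uniform(-10, 10), random.uniform(-10, 10)
    assert abs(p(x, y) - sos(x, y)) < 1e-9
print("p agrees with its SOS decomposition on all samples")
```

Random sampling cannot prove the identity, of course; it only guards against an algebra slip, while the decomposition itself is the actual certificate of nonnegativity.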
If such a decomposition exists, we refer to $p(x)$ as a sum-of-squares polynomial, and we denote this by writing

$$ p(x) \in \Sigma[x]. $$

Note

Existence of SOS Decompositions: Hilbert showed that not every nonnegative polynomial admits an SOS decomposition. This, however, does not stop us from trying to find one.
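A classical illustration of Hilbert's result (not discussed further on this page) is the Motzkin polynomial

$$M(x, y) = x^{4}y^{2} + x^{2}y^{4} - 3x^{2}y^{2} + 1,$$

which is nonnegative for all real $x, y$ (by the AM–GM inequality applied to $x^4y^2$, $x^2y^4$, and $1$) yet is provably not a sum of squares of polynomials.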

Gram Matrix Method

Consider a polynomial $p(x) \in \Sigma[x]$ of degree $2d$. Using the Gram matrix method, we express it as a quadratic form

$$ p(x) = z^{\top}Qz, \quad z = [1, x_1, x_2, \dots, x_n, x_1x_2, \dots, x_n^d], $$

in all the monomials of degree less than or equal to $d$ as described in [1]. For example, for a polynomial of degree $2d=4$ we introduce the monomial vector

$$z = \begin{bmatrix}1 & x_{1} & x_{2} & x_{1}^{2} & x_{1}x_{2} & x_{2}^{2}\end{bmatrix}^{\top}.$$
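The monomial vector can be generated mechanically; a minimal sketch, representing each monomial by its exponent tuple (the function name is ours, not from any library):

```python
from itertools import combinations_with_replacement

# Build the vector z of all monomials in x_1, ..., x_n of degree <= d,
# each represented as an exponent tuple. For n = 2, d = 2 this yields
# the six monomials 1, x1, x2, x1^2, x1*x2, x2^2 listed above.
def monomials(n, d):
    z = []
    for deg in range(d + 1):
        for combo in combinations_with_replacement(range(n), deg):
            alpha = [0] * n
            for i in combo:
                alpha[i] += 1
            z.append(tuple(alpha))
    return z

print(monomials(2, 2))  # [(0, 0), (1, 0), (0, 1), (2, 0), (1, 1), (0, 2)]
```

The length of $z$ grows as $\binom{n+d}{d}$, which is why the size of the resulting semidefinite program grows quickly with the number of variables and the degree.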

Furthermore, $Q \in \mathcal{S}^{N}$ is a symmetric matrix, where $N = \binom{n+d}{d}$ is the length of $z$. If $Q$ is positive semidefinite ($Q \succcurlyeq 0$), then the polynomial $p(x)$ is nonnegative.

Consider for example

$$F(x, y) = 2x^{4} + 2x^{3}y - x^{2}y^{2} + 5y^{4}$$

for which we want to prove nonnegativity. We introduce

$$ \begin{align*} F(x, y) &= z^\top Q z \\ &= \begin{bmatrix} x^2 \\ y^2 \\ xy \end{bmatrix}^\top \begin{bmatrix} q_{11} & q_{12} & q_{13} \\ q_{12} & q_{22} & q_{23} \\ q_{13} & q_{23} & q_{33} \end{bmatrix} \begin{bmatrix} x^2 \\ y^2 \\ xy \end{bmatrix} \\ &= q_{11}x^4 + q_{22}y^4 + (q_{33} + 2q_{12})x^2y^2 + 2q_{13}x^3y + 2q_{23}xy^3 \end{align*} $$

and compare coefficients to obtain

$$q_{11}=2, \quad q_{22}=5, \quad q_{33}+2q_{12}=-1,\quad 2q_{13}=2, \quad 2q_{23}=0.$$

As we can see, the solution set of these linear equations is an affine subspace, or hyperplane. Additionally, we require $Q$ to be positive semidefinite, meaning $z^{\top}Qz \geq 0$ for all $z$. As demonstrated in cones, $Q$ must therefore reside within the cone of positive semidefinite matrices.
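One admissible completion of the free entries is $q_{12} = -3$, hence $q_{33} = 5$ (so $q_{33} + 2q_{12} = -1$). The resulting $Q$ is positive semidefinite, and factoring it yields the explicit certificate $F = \tfrac{1}{2}(2x^2 - 3y^2 + xy)^2 + \tfrac{1}{2}(y^2 + 3xy)^2$. A numerical sanity check of that identity (this completion is one valid choice, not the only one):

```python
import random

# F(x, y) = 2x^4 + 2x^3*y - x^2*y^2 + 5y^4 from above.
def F(x, y):
    return 2*x**4 + 2*x**3*y - x**2*y**2 + 5*y**4

# SOS certificate obtained from the completion q12 = -3, q33 = 5:
#   F = (1/2)(2x^2 - 3y^2 + x*y)^2 + (1/2)(y^2 + 3*x*y)^2
def sos(x, y):
    return 0.5 * (2*x**2 - 3*y**2 + x*y)**2 + 0.5 * (y**2 + 3*x*y)**2

random.seed(1)
for _ in range(1000):
    x, y = random.uniform(-5, 5), random.uniform(-5, 5)
    assert abs(F(x, y) - sos(x, y)) < 1e-6 * max(1.0, abs(F(x, y)))
print("SOS certificate matches F; hence F >= 0 everywhere")
```

Note that this $Q$ has rank two (its determinant is zero), which is why the certificate consists of exactly two squares.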

Relationship to SDP

The solutions we seek are found at the intersection of a hyperplane with the cone of positive semidefinite matrices. This is precisely the type of problem that semidefinite programming is designed to solve.

A semidefinite program is formulated as follows:

$$ \begin{align*} \text{minimize} \quad & c^\top x \\ \text{subject to} \quad & x_1 F_1 + \cdots + x_n F_n + G \preceq 0, \\ & Ax = b. \end{align*} $$

The second constraint represents a linear matrix inequality, which allows us to enforce the condition $Q \succeq 0$, while the equation $Ax = b$ defines a hyperplane.
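For the example above, this intersection can be made visible without an SDP solver: the coefficient equations leave a single free parameter $q_{12}$ (with $q_{33} = -1 - 2q_{12}$ then forced), so the feasible set is the portion of this line of Gram matrices that lies inside the PSD cone. A brute-force sketch under those assumptions (a real solver searches the feasible set directly instead of sweeping):

```python
# Sweep the free parameter q12 (q33 = -1 - 2*q12 is forced by the
# coefficient equations) and record where Q(q12) is PSD, i.e. where
# the affine line of Gram matrices meets the PSD cone.
def Q_of(q12):
    q33 = -1.0 - 2.0 * q12
    return [[2.0, q12, 1.0],
            [q12, 5.0, 0.0],
            [1.0, 0.0, q33]]

def is_psd_3x3(m, tol=1e-9):
    # A symmetric matrix is PSD iff ALL its principal minors are >= 0
    # (leading minors alone only characterize positive definiteness).
    for i in range(3):
        if m[i][i] < -tol:
            return False
    for i in range(3):
        for j in range(i + 1, 3):
            if m[i][i]*m[j][j] - m[i][j]*m[j][i] < -tol:
                return False
    det = (m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
           - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
           + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]))
    return det >= -tol

feasible = [q / 100.0 for q in range(-500, 501)
            if is_psd_3x3(Q_of(q / 100.0))]
print(f"Q(q12) is PSD for q12 in [{min(feasible)}, {max(feasible)}]")
```

The sweep recovers a short feasible interval with $q_{12} = -3$ at one endpoint, which is exactly the completion used for the certificate above.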

Dual Cone

TODO

Positivstellensatz

TODO


[1] P. A. Parrilo, “Semidefinite programming relaxations for semialgebraic problems,” Mathematical Programming, vol. 96, no. 2, pp. 293–320, May 2003, doi: 10.1007/s10107-003-0387-5.