Sum-of-squares optimization

This article deals with sum-of-squares constraints. For problems with sum-of-squares cost functions, see Least squares.

A sum-of-squares optimization program is an optimization problem with a linear cost function and a particular type of constraint on the decision variables. The constraints require that, when the decision variables are used as coefficients in certain polynomials, those polynomials have the polynomial SOS property. Sum-of-squares optimization techniques have been successfully applied by researchers in the control engineering field.[1][2][3]

Optimization problem

The problem can be expressed as

 \min_{u\in\R^n} c^T u

subject to

 a_{k,0}(x) + a_{k,1}(x)u_1 + \cdots + a_{k,n}(x)u_n \in \text{SOS}
\quad (k=1,\ldots, N_s).

Here "SOS" represents the class of SOS polynomials. The vector c\in \R^n and polynomials \{ a_{k,j} \} are given as part of the data for the optimization problem. The quantities u\in \R^n are the decision variables. SOS programs can be converted to semidefinite programs (SDPs) using the connection between SOS polynomials and positive-semidefinite matrices.
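As a minimal numeric sketch of this conversion (an illustrative toy problem, not part of the original article): consider minimizing u subject to x^2 + u\,xy + y^2 being SOS. With monomial vector z = (x, y), the only candidate Gram matrix is Q(u) = [[1, u/2], [u/2, 1]], so the SOS constraint becomes the semidefinite constraint Q(u) \succeq 0, which holds exactly when |u| \le 2.

```python
# Toy SOS program (illustrative, not a library API):
#   min_u u   subject to   x^2 + u*x*y + y^2 in SOS.
# With z = (x, y), any Gram matrix must be Q(u) = [[1, u/2], [u/2, 1]],
# so the SOS constraint is equivalent to the SDP constraint Q(u) >= 0.
import numpy as np

def is_sos(u, tol=1e-9):
    """Check positive semidefiniteness of the Gram matrix Q(u)."""
    Q = np.array([[1.0, u / 2.0], [u / 2.0, 1.0]])
    return np.linalg.eigvalsh(Q).min() >= -tol

# Q(u) is PSD exactly when det Q = 1 - u^2/4 >= 0, i.e. |u| <= 2,
# so the optimal value of this SOS program is u = -2.
feasible = [u for u in np.linspace(-3, 3, 601) if is_sos(u)]
print(round(min(feasible), 6))  # -> -2.0
```

A real solver would treat Q \succeq 0 as a conic constraint rather than scanning u, but the feasible set is the same.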

Sum-of-squares background

A polynomial  p is a sum of squares (SOS) if there exist polynomials  \{f_i\}_{i=1}^m such that  p = \sum_{i=1}^m f_i^2 . For example,

p=x^2 - 4xy + 7y^2

is a sum of squares since

 p = f_1^2 + f_2^2, \quad\text{where } f_1 = x-2y \text{ and } f_2 = \sqrt{3}\,y.

Note that if  p is a sum of squares then p(x) \ge 0 for all  x \in \R^n. Detailed descriptions of polynomial SOS are available.[4][5][6]
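The decomposition above can be spot-checked numerically (a quick sketch using only the article's example):

```python
# Numeric spot-check: p(x, y) = x^2 - 4xy + 7y^2 equals f1^2 + f2^2 with
# f1 = x - 2y and f2 = sqrt(3)*y, and is therefore nonnegative everywhere.
import math
import random

def p(x, y):
    return x**2 - 4*x*y + 7*y**2

def sos(x, y):
    f1 = x - 2*y
    f2 = math.sqrt(3) * y
    return f1**2 + f2**2

random.seed(0)
for _ in range(1000):
    x, y = random.uniform(-10, 10), random.uniform(-10, 10)
    assert math.isclose(p(x, y), sos(x, y), rel_tol=1e-9, abs_tol=1e-9)
    assert p(x, y) >= 0
print("decomposition verified on 1000 random points")
```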

Quadratic forms can be expressed as  p(x)=x^T Q x where  Q is a symmetric matrix. Similarly, polynomials of degree ≤ 2d can be expressed as

 p(x)=z(x)^T Q z(x) ,

where the vector z contains all monomials of degree  \le d . This is known as the Gram matrix form. An important fact is that  p is SOS if and only if there exists a symmetric and positive-semidefinite matrix  Q such that p(x)=z(x)^T Q z(x) . This provides a connection between SOS polynomials and positive-semidefinite matrices.
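For the article's example this test is concrete (a sketch; the Gram matrix below is the unique one for z = (x, y)): p(x, y) = x^2 - 4xy + 7y^2 has Gram matrix Q = [[1, -2], [-2, 7]], with the off-diagonal entries splitting the xy coefficient. Since Q is positive semidefinite, a Cholesky factorization Q = LL^T recovers the squares via z^T Q z = \|L^T z\|^2.

```python
# Gram-matrix test for p(x, y) = x^2 - 4xy + 7y^2 with z = (x, y).
import numpy as np

Q = np.array([[1.0, -2.0], [-2.0, 7.0]])
assert np.linalg.eigvalsh(Q).min() >= 0   # Q is PSD, so p is SOS

L = np.linalg.cholesky(Q)                 # Q = L @ L.T
# Each row of L.T gives the (x, y)-coefficients of one f_i:
# row 0 -> f1 = x - 2y, row 1 -> f2 = sqrt(3)*y,
# matching the decomposition given earlier in the article.
print(L.T)
```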

References
  1. ^ Tan, W., Packard, A., 2004. "Searching for control Lyapunov functions using sums of squares programming". In: Allerton Conf. on Comm., Control and Computing. pp. 210–219.
  2. ^ Tan, W., Topcu, U., Seiler, P., Balas, G., Packard, A., 2008. Simulation-aided reachability and local gain analysis for nonlinear dynamical systems. In: Proc. of the IEEE Conference on Decision and Control. pp. 4097–4102.
  3. ^ A. Chakraborty, P. Seiler, and G. Balas, “Susceptibility of F/A-18 Flight Controllers to the Falling-Leaf Mode: Nonlinear Analysis,” AIAA Journal of Guidance, Control, and Dynamics, Vol.34 No.1, 2011, 73–85.
  4. ^ Parrilo, P., (2000) Structured semidefinite programs and semialgebraic geometry methods in robustness and optimization. Ph.D. thesis, California Institute of Technology.
  5. ^ Parrilo, P. (2003) "Semidefinite programming relaxations for semialgebraic problems". Mathematical Programming Ser. B 96 (2), 293–320.
  6. ^ Lasserre, J. (2001) "Global optimization with polynomials and the problem of moments". SIAM Journal on Optimization, 11 (3), 796–817.