User:Saung Tadashi/Polynomial Optimization

Polynomial optimization concerns optimization problems in which the objective is a multivariate polynomial and the constraints are defined by a semialgebraic set, that is, a set described by polynomial equalities and inequalities. Polynomial optimization problems arise in several areas of engineering and mathematics, such as control theory.[1]
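
In a common formulation (the symbols below are illustrative, since the draft does not fix a notation), a polynomial optimization problem reads

$$p^\star \;=\; \min_{x \in K} p(x), \qquad K = \{\, x \in \mathbb{R}^n : g_1(x) \ge 0, \ldots, g_m(x) \ge 0 \,\},$$

where $p, g_1, \ldots, g_m$ are multivariate polynomials and $K$ is the semialgebraic feasible set (polynomial equality constraints can be encoded as pairs of inequalities).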

Although local minimizers can be found with standard tools from non-linear optimization, finding a global optimum is more complex (precisely, the problem is NP-hard[2]). Common approaches therefore relax the polynomial optimization problem to a sequence of easier problems whose optimal values converge to the global optimum of the original problem.[3]
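
With the illustrative notation above, such a hierarchy indexed by a relaxation order $d$ produces a nondecreasing sequence of lower bounds $\rho_1 \le \rho_2 \le \cdots \le p^\star$; under suitable assumptions on $K$ (for example, an Archimedean condition), $\rho_d \to p^\star$ as $d \to \infty$.[4]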

Moments of measures

To solve a polynomial optimization problem, one can use a hierarchy of easier problems whose solutions approximate the solution of the original problem. A common approach is Lasserre's hierarchy of semidefinite programs.[4]

Given a polynomial $p$ and a semialgebraic feasible set $K$, suppose that the problem $p^\star = \min_{x \in K} p(x)$ has a minimizer. It can be shown that this problem is equivalent to finding a probability measure $\mu$ supported on $K$ which minimizes $\int_K p \, d\mu$. In fact,

$$p^\star \;=\; \min_{\mu \in \mathcal{P}(K)} \int_K p \, d\mu,$$

where $\mathcal{P}(K)$ denotes the set of probability measures supported on $K$.
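
As a minimal illustration of the moment approach (a sketch, not taken from the references above: it assumes the CVXPY Python library and hand-codes the lowest-order moment relaxation for the univariate polynomial $p(x) = x^4 - 3x^2 + 1$ over $\mathbb{R}$), the resulting semidefinite program can be written as follows.

```python
import cvxpy as cp

# Lowest-order (order-2) moment relaxation for minimizing
# p(x) = x^4 - 3*x^2 + 1 over the real line.
# M is the moment matrix indexed by the monomials (1, x, x^2);
# entry (i, j) plays the role of the moment y_{i+j} = integral of x^(i+j) dmu.
M = cp.Variable((3, 3), symmetric=True)

constraints = [
    M >> 0,              # a valid (truncated) moment matrix is positive semidefinite
    M[0, 0] == 1,        # y_0 = 1, i.e. mu is a probability measure
    M[0, 2] == M[1, 1],  # Hankel structure: both entries represent y_2
]

# integral of p dmu = y_4 - 3*y_2 + 1 is linear in the moments.
objective = cp.Minimize(M[2, 2] - 3 * M[1, 1] + 1)

problem = cp.Problem(objective, constraints)
problem.solve()
print("Lower bound on min p(x):", problem.value)  # about -1.25, attained at x = +/-(3/2)**0.5
```

For this univariate example the lowest-order relaxation is already exact; in general, raising the relaxation order yields the Lasserre hierarchy mentioned above, and the dual of each moment relaxation is a sum-of-squares program.[4]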

Sum of squares

Nonnegative circuit polynomials and geometric programming

See the approach of Dressler, Iliman and de Wolff to constrained polynomial optimization via sums of nonnegative circuit polynomials (SONC) and geometric programming.[5]

Hyperbolic programming

See the hyperbolic programming approach of Saunderson for certifying polynomial nonnegativity.[6]

Comparison of hierarchies

See Figure 1 of https://arxiv.org/abs/1903.04996.

References

  1. ^ Henrion, D.; Lasserre, J. (June 2004). "Solving nonconvex optimization problems: how GloptiPoly is applied to problems in robust and nonlinear control". IEEE Control Systems Magazine. 24 (3): 72–83. doi:10.1109/mcs.2004.1299534. ISSN 0272-1708.
  2. ^ Nesterov, Yurii (2000), "Squared Functional Systems and Optimization Problems", Applied Optimization, Springer US, pp. 405–440, doi:10.1007/978-1-4757-3216-0_17, ISBN 9781441948199, retrieved 2018-06-30
  3. ^ "Optimization over Polynomials: Selected Topics" (PDF). Proceedings of the International Congress of Mathematicians 2014 (ICM 2014). Seoul. 2014. pp. 843–869. {{cite conference}}: Unknown parameter |booktitle= ignored (|book-title= suggested) (help)
  4. ^ Lasserre, Jean B. (January 2001). "Global Optimization with Polynomials and the Problem of Moments". SIAM Journal on Optimization. 11 (3): 796–817. doi:10.1137/s1052623400366802. ISSN 1052-6234.
  5. ^ Dressler, Mareike; Iliman, Sadik; de Wolff, Timo (2018). "An approach to constrained polynomial optimization via nonnegative circuit polynomials and geometric programming". Journal of Symbolic Computation. doi:10.1016/j.jsc.2018.06.018. ISSN 0747-7171.
  6. ^ Saunderson, James (2019-03-31). "Certifying polynomial nonnegativity via hyperbolic optimization". arXiv:1904.00491 [math].

To add:

See also

Software tools

  • GloptiPoly
  • SparsePOP
  • Ncpol2sdpa – A tool for converting a polynomial optimization problem in either commutative or noncommutative variables into a sparse semidefinite program
  • NCTSSOS – A sparse noncommutative polynomial optimization tool based on block moment-SOHS hierarchies
  • Sparse-BSOS – An implementation of the bounded-degree SOS hierarchy for large-scale polynomial optimization with sparsity
  • TSSOS – A sparse polynomial optimization tool based on (quasi) block moment-SOS hierarchies
  • repop-toolbox – A toolbox for tackling polynomial optimization using relative entropy relaxations (AM/GM polynomials)
  • POEM (Effective Methods in Polynomial Optimization) – Polynomial optimization via sums of nonnegative circuit polynomials (SONC)
  • polMin – A MATLAB toolbox for constrained and unconstrained polynomial optimization based on a nonuniform random coordinate minimization algorithm[1]



  1. ^ Calafiore, Giuseppe C.; Novara, Carlo; Possieri, Corrado (2020). "Control analysis and design via randomised coordinate polynomial minimisation". International Journal of Control: 1–15. doi:10.1080/00207179.2020.1782476. ISSN 0020-7179.