# Spiral optimization algorithm

The spiral exhibits both the global (blue) and the intensive (red) search behavior.

The spiral optimization (SPO) algorithm is a simple search concept inspired by spiral phenomena in nature. The first SPO algorithm was proposed for two-dimensional unconstrained optimization[1] based on two-dimensional spiral models. It was later extended to n-dimensional problems by generalizing the two-dimensional spiral model to an n-dimensional spiral model.[2] Two effective settings for the SPO algorithm have been proposed: the periodic descent direction setting[3] and the convergence setting.[4]

## Metaphor

The motivation for focusing on spiral phenomena was the insight that the dynamics generating logarithmic spirals combine diversification and intensification behavior. The diversification behavior supports a global search (exploration), while the intensification behavior enables an intensive search around the current best solution (exploitation).

## Algorithm


The SPO algorithm is a multipoint search algorithm that uses no objective-function gradient; it employs multiple spiral models that can be described as deterministic dynamical systems. As search points follow logarithmic spiral trajectories toward the common center, defined as the current best point, better solutions can be found and the common center is updated. The general SPO algorithm for a minimization problem under the maximum iteration ${\displaystyle k_{\max }}$ (termination criterion) is as follows:

0) Set the number of search points ${\displaystyle m\geq 2}$ and the maximum iteration number ${\displaystyle k_{\max }}$.
1) Place the initial search points ${\displaystyle x_{i}(0)\in \mathbb {R} ^{n}~(i=1,\ldots ,m)}$ and determine the center ${\displaystyle x^{\star }(0)=x_{i_{\text{b}}}(0)}$, ${\displaystyle \displaystyle i_{\text{b}}=\mathop {\text{argmin}} _{i=1,\ldots ,m}\{f(x_{i}(0))\}}$, and then set ${\displaystyle k=0}$.
2) Decide the step rate ${\displaystyle r(k)}$ by a rule.
3) Update the search points: ${\displaystyle x_{i}(k+1)=x^{\star }(k)+r(k)R(\theta )(x_{i}(k)-x^{\star }(k))\quad (i=1,\ldots ,m).}$
4) Update the center: ${\displaystyle x^{\star }(k+1)={\begin{cases}x_{i_{\text{b}}}(k+1)&{\text{if }}f(x_{i_{\text{b}}}(k+1))<f(x^{\star }(k)),\\x^{\star }(k)&{\text{otherwise}},\end{cases}}}$ where ${\displaystyle \displaystyle i_{\text{b}}=\mathop {\text{argmin}} _{i=1,\ldots ,m}\{f(x_{i}(k+1))\}}$.
5) Set ${\displaystyle k:=k+1}$. If ${\displaystyle k=k_{\max }}$ is satisfied then terminate and output ${\displaystyle x^{\star }(k)}$. Otherwise, return to Step 2).
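The steps above can be sketched in Python. This is a minimal illustration, not the authors' reference implementation: the function name `spo_minimize`, the search domain, and the particular composite rotation matrix (a product of plane rotations by a common angle, one simple choice consistent with the n-dimensional spiral model) are assumptions made for the example.

```python
import numpy as np

def spo_minimize(f, n, m=10, k_max=200, r=0.95, theta=np.pi / 4, seed=0):
    """Sketch of the general SPO loop for minimizing f over R^n.

    The composite rotation matrix R is built as a product of plane
    rotations by angle theta; other compositions are possible.
    """
    rng = np.random.default_rng(seed)
    R = np.eye(n)
    for a in range(n - 1):
        for b in range(a + 1, n):
            G = np.eye(n)  # plane rotation in the (a, b) coordinate plane
            G[a, a] = G[b, b] = np.cos(theta)
            G[a, b] = -np.sin(theta)
            G[b, a] = np.sin(theta)
            R = R @ G
    # Steps 0-1: random initial points and the initial center (current best).
    x = rng.uniform(-5.0, 5.0, size=(m, n))
    center = x[np.argmin([f(p) for p in x])].copy()
    for k in range(k_max):
        # Step 3: spiral update of every search point toward the center.
        x = center + r * (x - center) @ R.T
        # Step 4: move the center if a strictly better point was found.
        best = x[np.argmin([f(p) for p in x])]
        if f(best) < f(center):
            center = best.copy()
    return center
```

For instance, applying this sketch to the sphere function `f(p) = sum(p**2)` drives the center toward the minimizer at the origin.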


## Setting

The search performance depends on the setting of the composite rotation matrix ${\displaystyle R(\theta )}$, the step rate ${\displaystyle r(k)}$, and the initial points ${\displaystyle x_{i}(0)~(i=1,\ldots ,m)}$. The following settings have been shown to be effective.

Setting 1 (Periodic Descent Direction Setting) [3]
This setting is effective for high-dimensional problems under the maximum iteration ${\displaystyle k_{\max }}$. The conditions on ${\displaystyle R(\theta )}$ and ${\displaystyle x_{i}(0)~(i=1,\ldots ,m)}$ together ensure that the spiral models generate descent directions periodically. The condition on ${\displaystyle r(k)}$ makes use of these periodic descent directions under the termination criterion ${\displaystyle k_{\max }}$.
• Set ${\displaystyle R(\theta )}$ as follows:${\displaystyle R(\theta )={\begin{bmatrix}0_{n-1}^{\top }&-1\\I_{n-1}&0_{n-1}\\\end{bmatrix}}}$ where ${\displaystyle I_{n-1}}$ is the ${\displaystyle (n-1)\times (n-1)}$ identity matrix and ${\displaystyle 0_{n-1}}$ is the ${\displaystyle (n-1)\times 1}$ zero vector.
• Place the initial points ${\displaystyle x_{i}(0)\in \mathbb {R} ^{n}}$ ${\displaystyle (i=1,\ldots ,m)}$ at random to satisfy the following condition: ${\displaystyle \min _{i=1,\ldots ,m}\{\max _{j=1,\ldots ,m}{\bigl \{}{\text{rank}}{\bigl [}d_{j,i}(0)~R(\theta )d_{j,i}(0)~~\cdots ~~R(\theta )^{2n-1}d_{j,i}(0){\bigr ]}{\bigr \}}{\bigr \}}=n}$ where ${\displaystyle d_{j,i}(0)=x_{j}(0)-x_{i}(0)}$. Note that this condition is satisfied almost surely by random placement, so no explicit check is needed in practice.
• Set ${\displaystyle r(k)}$ at Step 2) as follows: ${\displaystyle r(k)=r={\sqrt[{k_{\max }}]{\delta }}~~~~{\text{(constant value)}}}$ where ${\displaystyle \delta >0}$ is sufficiently small, e.g. ${\displaystyle \delta =1/k_{\max }}$ or ${\displaystyle \delta =10^{-3}}$.
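A small sketch of Setting 1's ingredients, with hypothetical helper names (`setting1_R`, `setting1_rate`) chosen for the example: the matrix below realizes ${\displaystyle R(\theta )}$ from the bullet above (a cyclic coordinate shift with one sign flip), and the rate is the constant ${\displaystyle r=\delta ^{1/k_{\max }}}$.

```python
import numpy as np

def setting1_R(n):
    """R from Setting 1: top row [0, ..., 0, -1], and the (n-1)-identity
    block placed below it in the first n-1 columns."""
    R = np.zeros((n, n))
    R[0, n - 1] = -1.0          # -1 in the top-right corner
    R[1:, : n - 1] = np.eye(n - 1)  # shifted identity block
    return R

def setting1_rate(k_max, delta=None):
    """Constant step rate r = delta**(1/k_max); delta small, e.g. 1/k_max."""
    if delta is None:
        delta = 1.0 / k_max
    return delta ** (1.0 / k_max)
```

Note that this `R` is orthogonal and satisfies ${\displaystyle R^{2n}=I}$ (it cycles the coordinates with a single sign flip), which underlies the periodicity of the generated descent directions.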
Setting 2 (Convergence Setting) [4]
This setting ensures that the SPO algorithm converges to a stationary point under the maximum iteration ${\displaystyle k_{\max }=\infty }$. The settings of ${\displaystyle R(\theta )}$ and the initial points ${\displaystyle x_{i}(0)~(i=1,\ldots ,m)}$ are the same as in Setting 1 above. The setting of ${\displaystyle r(k)}$ is as follows.
• Set ${\displaystyle r(k)}$ at Step 2) as follows: ${\displaystyle r(k)={\begin{cases}1&(k^{\star }\leqq k\leqq k^{\star }+2n-1),\\h&(k\geqq k^{\star }+2n),\end{cases}}}$ where ${\displaystyle k^{\star }}$ is the iteration at which the center was most recently updated at Step 4) and ${\displaystyle h={\sqrt[{2n}]{\delta }},\delta \in (0,1)}$, e.g. ${\displaystyle \delta =0.5}$. Accordingly, the following rules about ${\displaystyle k^{\star }}$ must be added to the Algorithm:
•(Step 1) ${\displaystyle k^{\star }=0}$.
•(Step 4) If ${\displaystyle x^{\star }(k+1)\neq x^{\star }(k)}$ then ${\displaystyle k^{\star }=k+1}$.
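The piecewise rate of Setting 2 can be written as a small function; the name `setting2_rate` is hypothetical, and the arguments mirror the symbols above (`k_star` for ${\displaystyle k^{\star }}$, `n` for the dimension).

```python
def setting2_rate(k, k_star, n, delta=0.5):
    """Step rate from Setting 2: r = 1 for the 2n iterations following the
    last center update at iteration k_star, then h = delta**(1/(2n))."""
    if k_star <= k <= k_star + 2 * n - 1:
        return 1.0
    return delta ** (1.0 / (2 * n))
```

Intuitively, the unit rate lets the points sweep a full period of descent directions after each improvement, after which the contraction factor ${\displaystyle h<1}$ pulls the points toward the center.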

## Future works

• The algorithms with the above settings are deterministic; incorporating random operations could make the algorithm more powerful for global optimization.
• Finding an appropriate balance between diversification and intensification spirals for the target problem class (including ${\displaystyle k_{\max }}$) is important for enhancing performance.

## Extended works

Many extended studies have been conducted on the SPO owing to its simple structure and concept; these studies have improved its global search performance and proposed novel applications.[5][6][7][8][9]

## References

1. ^ Tamura, K.; Yasuda, K. (2011). "Primary Study of Spiral Dynamics Inspired Optimization". IEEJ Transactions on Electrical and Electronic Engineering. 6 (S1): 98–100.
2. ^ Tamura, K.; Yasuda, K. (2011). "Spiral Dynamics Inspired Optimization". Journal of Advanced Computational Intelligence and Intelligent Informatics. 132 (5): 1116–1121.
3. ^ a b Tamura, K.; Yasuda, K. (2016). "Spiral Optimization Algorithm Using Periodic Descent Directions". SICE Journal of Control, Measurement, and System Integration. 6 (3): 133–143. doi:10.9746/jcmsi.9.134.
4. ^ a b Tamura, K.; Yasuda, K. (2017). "The Spiral Optimization Algorithm: Convergence Conditions and Settings". IEEE Transactions on Systems, Man, and Cybernetics: Systems. PP (99): 1–16. doi:10.1109/TSMC.2017.2695577.
5. ^ Nasir, A. N. K.; Tokhi, M. O. (2015). "An improved spiral dynamic optimization algorithm with engineering application". IEEE Trans. Syst., Man, Cybern., Syst. 45 (6): 943–954.
6. ^ Nasir, A. N. K.; Ismail, R.M.T.R.; Tokhi, M. O. (2016). "Adaptive spiral dynamics metaheuristic algorithm for global optimisation with application to modelling of a flexible system". Appl. Math. Modell. 40 (9–10): 5442–5461.
7. ^ Ouadi, A.; Bentarzi, H.; Recioui, A. (2013). "Multiobjective design of digital filters using spiral optimization technique". SpringerPlus. 2 (461): 697–707.
8. ^ Benasla, L.; Belmadani, A.; Rahli, M. (2014). "Spiral optimization algorithm for solving combined economic and emission dispatch". Int. J. Elect. Power & Energy Syst. 62: 163–174.
9. ^ Sidarto, K. A.; Kania, A. "Finding all solutions of systems of nonlinear equations using spiral dynamics inspired optimization with clustering". JACIII. 19 (5): 697–707.