Metaheuristic

From Wikipedia, the free encyclopedia

In computer science, a metaheuristic designates a computational method that optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality. Metaheuristics make few or no assumptions about the problem being optimized and can search very large spaces of candidate solutions. However, metaheuristics do not guarantee that an optimal solution is ever found. Many metaheuristics implement some form of stochastic optimization.

Other terms with a meaning similar to metaheuristic are: derivative-free, direct search, black-box, or simply heuristic optimizer. Several books and survey papers have been published on the subject, see e.g.[1][2][3][4][5][6][7]

Applications

Metaheuristics are used for combinatorial optimization, in which an optimal solution is sought over a discrete search-space. An example problem is the travelling salesman problem, where the search-space of candidate solutions grows faster than exponentially as the size of the problem increases, making an exhaustive search for the optimal solution infeasible. Additionally, multidimensional combinatorial problems, including most design problems in engineering such as form-finding and behavior-finding, suffer from the curse of dimensionality, which likewise rules out exhaustive search and analytical methods. Popular metaheuristics for combinatorial problems include simulated annealing by Kirkpatrick et al.,[8] genetic algorithms by Holland et al.,[9] ant colony optimization by Dorigo,[10] scatter search[11] and tabu search[12] by Glover.
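As a concrete illustration, the sketch below applies simulated annealing to a travelling salesman instance given as a symmetric distance matrix. The 2-opt move, the geometric cooling schedule and all parameter values are illustrative assumptions, not part of Kirkpatrick et al.'s original formulation.[8]

    import math
    import random

    def simulated_annealing_tsp(dist, n_iter=100_000, t_start=10.0, t_end=1e-3):
        """Approximate a shortest tour; dist is a symmetric distance matrix."""
        n = len(dist)
        tour = list(range(n))
        random.shuffle(tour)

        def length(t):
            return sum(dist[t[i]][t[(i + 1) % n]] for i in range(n))

        cur_len = length(tour)
        best, best_len = tour[:], cur_len
        for k in range(n_iter):
            # Geometric cooling from t_start down to t_end.
            temp = t_start * (t_end / t_start) ** (k / n_iter)
            # Neighbour: reverse a random segment of the tour (2-opt move).
            i, j = sorted(random.sample(range(n), 2))
            cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
            cand_len = length(cand)
            # Always accept improvements; accept uphill moves with
            # Boltzmann probability exp(-delta / temp).
            if cand_len < cur_len or random.random() < math.exp((cur_len - cand_len) / temp):
                tour, cur_len = cand, cand_len
                if cur_len < best_len:
                    best, best_len = tour[:], cur_len
        return best, best_len

The acceptance rule is the defining ingredient: worsening moves are taken with a probability that shrinks as the temperature falls, which lets the search escape local optima early while converging later.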

Metaheuristics are also used for problems over real-valued search-spaces, where the classic way of optimization is to derive the gradient of the function to be optimized and then employ gradient descent or a quasi-Newton method. Metaheuristics do not use the gradient or Hessian matrix, so their advantage is that the function to be optimized need not be continuous or differentiable, and it can also have constraints. Popular metaheuristic optimizers for real-valued search-spaces include particle swarm optimization by Eberhart and Kennedy,[13] differential evolution by Storn and Price[14] and evolution strategies by Rechenberg[15] and Schwefel.[16]
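Below is a minimal sketch of the classic DE/rand/1/bin variant of differential evolution; note that only function values are used, never derivatives, so f may be discontinuous or non-differentiable. The population size and the F and CR settings are common defaults, not values mandated by Storn and Price.[14]

    import random

    def differential_evolution(f, bounds, pop_size=30, F=0.8, CR=0.9, n_gen=200):
        """Minimise f over a box; bounds is a list of (low, high) pairs."""
        dim = len(bounds)
        pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
        fit = [f(x) for x in pop]
        for _ in range(n_gen):
            for i in range(pop_size):
                # Three distinct individuals, all different from i.
                a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
                j_rand = random.randrange(dim)  # at least one mutated coordinate
                trial = []
                for j in range(dim):
                    if random.random() < CR or j == j_rand:
                        v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                        lo, hi = bounds[j]
                        v = min(max(v, lo), hi)  # clip to the search box
                    else:
                        v = pop[i][j]
                    trial.append(v)
                f_trial = f(trial)
                if f_trial <= fit[i]:  # greedy one-to-one replacement
                    pop[i], fit[i] = trial, f_trial
        best = min(range(pop_size), key=lambda i: fit[i])
        return pop[best], fit[best]

    # Example: minimise the 5-dimensional sphere function.
    x, fx = differential_evolution(lambda v: sum(t * t for t in v), [(-5, 5)] * 5)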

Metaheuristics based on decomposition techniques have also been proposed for tackling hard combinatorial problems of large size.[17]
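The decomposition idea can be sketched schematically, loosely in the spirit of POPMUSIC: repeatedly carve a small subproblem around a seed part of the current solution, optimize it, and stop once no subproblem yields an improvement. The helper callables here are hypothetical placeholders; Taillard and Voss[17] give the precise scheme.

    def decomposition_search(solution, parts, build_subproblem, optimise, r=5):
        """Schematic large-problem decomposition; helpers are placeholders."""
        unoptimised = set(parts)
        while unoptimised:
            seed = unoptimised.pop()
            # Subproblem: the seed plus its r most closely related parts.
            sub = build_subproblem(solution, seed, r)
            improved, solution = optimise(solution, sub)
            if improved:
                # An improvement may enable further improvements nearby.
                unoptimised.update(sub)
        return solution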

Main contributions

(Figure: timeline of main contributions.)

Many different metaheuristics are in existence and new variants are continually being proposed. Some of the most significant contributions to the field are summarized in the timeline figure above.

Criticism

Mathematical analyses of metaheuristics have been presented in the literature; see e.g. Holland's schema theorem[9] for the genetic algorithm, the work by Trelea,[55] amongst others, for analysis of particle swarm optimization, and Zaharie[56] for analysis of differential evolution. These analyses make a number of assumptions about the optimizer variants and the simplicity of the optimization problems, which limits their validity in real-world optimization scenarios. Performance and convergence aspects of metaheuristic optimizers are therefore usually demonstrated empirically in the research literature. This practice has been criticized through the no-free-lunch theorems of Wolpert and Macready,[41] which, among other things, prove that all optimizers perform equally well when averaged over all possible problems. The practical relevance of the no-free-lunch theorems is minor, however, because they will generally not hold on the collection of problems a practitioner is facing.
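For reference, the central no-free-lunch statement can be phrased as follows: writing d_m^y for the sequence of cost values observed after m distinct function evaluations, Wolpert and Macready[41] show that for any two algorithms a_1 and a_2

    \sum_{f} P(d_m^y \mid f, m, a_1) = \sum_{f} P(d_m^y \mid f, m, a_2),

where the sum runs over all possible objective functions f: averaged over every problem, no algorithm outperforms any other.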

For the practitioner, the most relevant issue is that metaheuristics are not guaranteed to find the optimum, or even a satisfactory near-optimal solution. All metaheuristics will eventually encounter problems on which they perform poorly, so the practitioner must gain experience with which optimizers work well on which classes of problems.

See also

Metaheuristic methods are, generally speaking, sub-fields of:

Sub-fields of metaheuristics include:

Other fields of interest:

References

  1. ^ a b Goldberg, D.E. (1989). Genetic Algorithms in Search, Optimization and Machine Learning. Addison-Wesley. ISBN 0201157675.
  2. ^ Luke, S. (2009). Essentials of metaheuristics.
  3. ^ Talbi, E-G. (2009). Metaheuristics: from design to implementation. Wiley. ISBN 0470278587.
  4. ^ Glover, F.; Kochenberger, G.A. (2003). Handbook of metaheuristics. Vol. 57. Springer, International Series in Operations Research & Management Science. ISBN 978-1-4020-7263-5.
  5. ^ Mucherino, A.; Seref, O. (2009). "Modeling and Solving Real Life Global Optimization Problems with Meta-Heuristic Methods". Advances in Modeling Agricultural Systems. 25: 1–17. doi:10.1007/978-0-387-75181-8_19.
  6. ^ Blum, C.; Roli, A. (2003). "Metaheuristics in combinatorial optimization: Overview and conceptual comparison". ACM Computing Surveys. 35 (3): 268–308.
  7. ^ Battiti, Roberto (2008). Reactive Search and Intelligent Optimization. Springer Verlag. ISBN 978-0-387-09623-0.
  8. ^ a b Kirkpatrick, S.; Gelatt Jr., C.D.; Vecchi, M.P. (1983). "Optimization by Simulated Annealing". Science. 220 (4598): 671–680. doi:10.1126/science.220.4598.671. PMID 17813860.
  9. ^ a b c Holland, J.H. (1975). Adaptation in Natural and Artificial Systems. University of Michigan Press. ISBN 0262082136.
  10. ^ a b Dorigo, M. (1992). Optimization, Learning and Natural Algorithms (PhD Thesis). Politecnico di Milano, Italy.
  11. ^ a b Glover, Fred (1977). "Heuristics for Integer programming Using Surrogate Constraints". Decision Sciences. 8 (1): 156–166.
  12. ^ a b Glover, F. (1986). "Future Paths for Integer Programming and Links to Artificial Intelligence". Computers and Operations Research. 13 (5): 533–549. doi:10.1016/0305-0548(86)90048-1.
  13. ^ a b Kennedy, J.; Eberhart, R. (1995). "Particle Swarm Optimization". Proceedings of IEEE International Conference on Neural Networks. Vol. IV. pp. 1942–1948.
  14. ^ a b Storn, R.; Price, K. (1997). "Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces". Journal of Global Optimization. 11 (4): 341–359. doi:10.1023/A:1008202821328.
  15. ^ Rechenberg, I. (1971). Evolutionsstrategie – Optimierung technischer Systeme nach Prinzipien der biologischen Evolution (PhD Thesis). ISBN 3772803733.
  16. ^ Schwefel, H-P. (1974). Numerische Optimierung von Computer-Modellen (PhD Thesis).
  17. ^ a b Taillard, Eric; Voss, Stephan (1999). "POPMUSIC: Partial Optimization Metaheuristic Under Special Intensification Conditions" (PDF). Technical Report. Institute for Computer Sciences, heig-vd, Yverdon.
  18. ^ Robbins, H.; Monro, S. (1951). "A Stochastic Approximation Method". Annals of Mathematical Statistics. 22 (3): 400–407. doi:10.1214/aoms/1177729586.
  19. ^ Davidon, W.C. (1991). "Variable metric method for minimization". SIAM Journal on Optimization. 1 (1): 1–17. doi:10.1137/0801001.
  20. ^ Barricelli, N.A. (1954). "Esempi numerici di processi di evoluzione". Methodos: 45–68.
  21. ^ Rastrigin, L.A. (1963). "The convergence of the random search method in the extremal control of a many parameter system". Automation and Remote Control. 24 (10): 1337–1342.
  22. ^ Matyas, J. (1965). "Random optimization". Automation and Remote Control. 26 (2): 246–253.
  23. ^ Rechenberg, I. (1965). Cybernetic Solution Path of an Experimental Problem. Royal Aircraft Establishment Library Translation.
  24. ^ Nelder, J.A.; Mead, R. (1965). "A simplex method for function minimization". Computer Journal. 7: 308–313.
  25. ^ Fogel, L.; Owens, A.J.; Walsh, M.J. (1966). Artificial Intelligence through Simulated Evolution. Wiley. ISBN 0471265160.
  26. ^ Hastings, W.K. (1970). "Monte Carlo Sampling Methods Using Markov Chains and Their Applications". Biometrika. 57 (1): 97–109. doi:10.1093/biomet/57.1.97.
  27. ^ Cavicchio, D.J. (1970). "Adaptive search using simulated evolution". Technical Report. University of Michigan, Computer and Communication Sciences Department.
  28. ^ Kernighan, B.W.; Lin, S. (1970). "An efficient heuristic procedure for partitioning graphs". Bell System Technical Journal. 49 (2): 291–307.
  29. ^ Mercer, R.E. (1978). "Adaptive search using a reproductive metaplan". Kybernetes (The International Journal of Systems and Cybernetics). 7 (3): 215–228. doi:10.1108/eb005486.
  30. ^ Smith, S.F. (1980). A Learning System Based on Genetic Adaptive Algorithms (PhD Thesis). University of Pittsburgh.
  31. ^ Farmer, J.D.; Packard, N.; Perelson, A. (1986). "The immune system, adaptation and machine learning". Physica D. 22 (1–3): 187–204. doi:10.1016/0167-2789(86)90240-X.
  32. ^ Grefenstette, J.J. (1986). "Optimization of control parameters for genetic algorithms". IEEE Transactions on Systems, Man, and Cybernetics. 16 (1): 122–128. doi:10.1109/TSMC.1986.289288.
  33. ^ US 4935877 
  34. ^ Koza, J.R. (1992). Genetic Programming : on the programming of computers by means of natural selection. MIT Press. ISBN 0262111705.
  35. ^ Moscato, P. (1989). "On Evolution, Search, Optimization, Genetic Algorithms and Martial Arts : Towards Memetic Algorithms". Technical Report C3P 826. Caltech Concurrent Computation Program.
  36. ^ Fonseca, C.M.; Fleming, P.J. (1993). "Genetic Algorithms for Multiobjective Optimization: formulation, discussion and generalization". Proceedings of the 5th International Conference on Genetic Algorithms. Urbana-Champaign, IL, USA: 416–423.
  37. ^ Battiti, Roberto (1994). "The reactive tabu search" (PDF). ORSA Journal on Computing. 6 (2): 126–140.
  38. ^ Battiti, Roberto (1995). "Training neural nets with the reactive tabu search" (PDF). IEEE Transactions on Neural Networks. 6 (5): 1185–1200.
  39. ^ Srinivas, N.; Deb, K. (1994). "Multiobjective Optimization Using Nondominated Sorting in Genetic Algorithms". Evolutionary Computation. 2 (3): 221–248. doi:10.1162/evco.1994.2.3.221.
  40. ^ Wolpert, D.H.; Macready, W.G. (1995). "No free lunch theorems for search". Technical Report SFI-TR-95-02-010. Santa Fe Institute.
  41. ^ a b Wolpert, D.H.; Macready, W.G. (1997). "No free lunch theorems for optimization". IEEE Transactions on Evolutionary Computation. 1 (1): 67–82. doi:10.1109/4235.585893.
  42. ^ Mühlenbein, H.; Paaß, G. (1996). "From recombination of genes to the estimation of distribution I. Binary parameters". Lecture Notes in Computer Science: Parallel Problem Solving from Nature (PPSN IV). 1411: 178–187.
  43. ^ Hansen, N.; Ostermeier, A. (1996). "Adapting arbitrary normal mutation distributions in evolution strategies: The covariance matrix adaptation". Proceedings of the IEEE International Conference on Evolutionary Computation: 312–317.
  44. ^ Rubinstein, R.Y. (1997). "Optimization of Computer simulation Models with Rare Events". European Journal of Operational Research. 99 (1): 89–112. doi:10.1016/S0377-2217(96)00385-2.
  45. ^ Geem, Z.W.; Kim, J.H.; Loganathan, G.V. (2001). "A new heuristic optimization algorithm: harmony search". Simulation. 76 (2): 60–68. doi:10.1177/003754970107600201.
  46. ^ Deb, K.; Pratap, A.; Agarwal, S.; Meyarivan, T. (2002). "A Fast and Elitist Multiobjective Genetic Algorithm: NSGA-II". IEEE Transactions on Evolutionary Computation. 6 (2): 182–197. doi:10.1109/4235.996017.
  47. ^ Nakrani, S.; Tovey, C. (2004). "On honey bees and dynamic server allocation in Internet hosting centers". Adaptive Behaviour. 12.
  48. ^ Krishnanand, K.; Ghose, D. (2009). "Glowworm swarm optimization for simultaneous capture of multiple local optima of multimodal functions". Swarm Intelligence. 3 (2): 87–124. doi:10.1007/s11721-008-0021-5.
  49. ^ Krishnanand, K.; Ghose, D. (2006). "Glowworm swarm based optimization algorithm for multimodal functions with collective robotics applications". Multiagent and Grid Systems. 2: 209–222.
  50. ^ Krishnanand, K.; Ghose, D. (2005). "Detection of multiple source locations using a glowworm metaphor with applications to collective robotics". Proceedings of IEEE Swarm Intelligence Symposium. pp. 84–91.
  51. ^ Karaboga, D. (2005). "An Idea Based On Honey Bee Swarm For Numerical Optimization". Technical Report-TR06. Erciyes University, Engineering Faculty, Computer Engineering Department.
  52. ^ Haddad, O.B.; Afshar, Abbas; Mariño, Miguel A. (2006). "Honey-bees mating optimization (HBMO) algorithm: a new heuristic approach for water resources optimization". Water Resources Management. 20 (5): 661–680. doi:10.1007/s11269-005-9001-3.
  53. ^ Yang, X.-S. (2008). Nature-Inspired Metaheuristic Algorithms. Luniver Press. ISBN 1905986289.
  54. ^ Yang, X.-S.; Deb, S. (2009). "Cuckoo search via Lévy flights". World Congress on Nature & Biologically Inspired Computing (NaBIC 2009). IEEE. pp. 210–214.
  55. ^ Trelea, I.C. (2003). "The Particle Swarm Optimization Algorithm: convergence analysis and parameter selection". Information Processing Letters. 85 (6): 317–325. doi:10.1016/S0020-0190(02)00447-7.
  56. ^ Zaharie, D. (2002). "Critical values for the control parameters of differential evolution algorithms". Proceedings of the 8th International Conference on Soft Computing (MENDEL). Brno, Czech Republic. pp. 62–67.
