HyperNEAT

From Wikipedia, the free encyclopedia

Hypercube-based NEAT, or HyperNEAT,[1] is a generative encoding that evolves artificial neural networks (ANNs) using the principles of the widely used NeuroEvolution of Augmenting Topologies (NEAT) algorithm.[2] It is a technique for evolving large-scale neural networks that exploits the geometric regularities of the task domain. It uses Compositional Pattern Producing Networks[3] (CPPNs), the same abstraction used to generate the images for Picbreeder.org and the shapes for EndlessForms.com. HyperNEAT has since been extended to evolve plastic ANNs[4] and to evolve the location of every neuron in the network.[5]
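The core mechanism can be sketched in a few lines: a CPPN is queried with the coordinates of a source and a target neuron on a geometric "substrate" and returns the weight of the connection between them. The sketch below is purely illustrative; in real HyperNEAT the CPPN's topology and weights are evolved by NEAT, whereas here a fixed, hand-written CPPN stands in for an evolved one, and all function names are hypothetical.

```python
import math

def cppn(x1, y1, x2, y2):
    """Toy stand-in for an evolved CPPN: maps the coordinates of a
    source neuron (x1, y1) and a target neuron (x2, y2) to a weight."""
    return math.sin(2.0 * (x1 - x2)) * math.exp(-(y1 - y2) ** 2)

def build_weights(substrate, threshold=0.2):
    """Query the CPPN for every ordered neuron pair; connections whose
    magnitude falls below the expression threshold are not created."""
    weights = {}
    for (x1, y1) in substrate:
        for (x2, y2) in substrate:
            w = cppn(x1, y1, x2, y2)
            if abs(w) > threshold:
                weights[((x1, y1), (x2, y2))] = w
    return weights

# A 3x3 grid of neurons laid out in [-1, 1]^2; because weights are a
# function of geometry, the same CPPN can be re-queried on a denser
# grid to scale the network up without re-evolving it.
substrate = [(x - 1.0, y - 1.0) for x in range(3) for y in range(3)]
weights = build_weights(substrate)
```

Because the CPPN composes regular functions of the coordinates, the resulting connectivity pattern inherits symmetries and repetition from the geometry, which is how HyperNEAT exploits the regularities of the task domain.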

Applications to Date

  • Multi-agent learning [6]
  • Checkers board evaluation [7]
  • Controlling Legged Robots [8][9][10][11][12][13]
  • Comparing Generative vs. Direct Encodings [14][15][16]
  • Investigating the Evolution of Modular Neural Networks [17][18][19]
  • Evolving Objects that can be 3D Printed [20]
  • Evolving the Neural Geometry and Plasticity of an ANN [21]

References

  1. ^ K. O. Stanley, D. B. D'Ambrosio and J. Gauci, "A Hypercube-Based Indirect Encoding for Evolving Large-Scale Neural Networks". Artificial Life 15 (2): 185–212, 2009.
  2. ^ Kenneth O. Stanley and Risto Miikkulainen (2002). "Evolving Neural Networks Through Augmenting Topologies". Evolutionary Computation 10 (2): 99–127.
  3. ^ K. O. Stanley, "Compositional pattern producing networks: A novel abstraction of development," Genetic Programming and Evolvable Machines, vol. 8, pp. 131–162, June 2007.
  4. ^ Sebastian Risi and Kenneth O. Stanley, "Indirectly Encoding Neural Plasticity as a Pattern of Local Rules". In: Proceedings of the 11th International Conference on Simulation of Adaptive Behavior (SAB 2010). New York, NY: Springer, 2010.
  5. ^ Sebastian Risi and Kenneth O. Stanley, "An Enhanced Hypercube-Based Encoding for Evolving the Placement, Density and Connectivity of Neurons". In: Artificial Life journal. Cambridge, MA: MIT Press, 2012.
  6. ^ D. B. D'Ambrosio and K. O. Stanley, "Generative encoding for multiagent learning," in GECCO '08: Proceedings of the 10th Annual Conference on Genetic and Evolutionary Computation, New York, NY, USA, pp. 819–826, ACM, 2008.
  7. ^ J. Gauci and K. O. Stanley, "A case study on the critical role of geometric regularity in machine learning," in AAAI (D. Fox and C. P. Gomes, eds.), pp. 628–633, AAAI Press, 2008.
  8. ^ Sebastian Risi and Kenneth O. Stanley, "Confronting the Challenge of Learning a Flexible Neural Controller for a Diversity of Morphologies". In: Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2013). New York, NY: ACM, 2013.
  9. ^ Jeff Clune, Benjamin Beckmann, Charles Ofria, and Robert Pennock. "Evolving Coordinated Quadruped Gaits with the HyperNEAT Generative Encoding". Proceedings of the IEEE Congress on Evolutionary Computing, Special Section on Evolutionary Robotics. Trondheim, Norway, 2009.
  10. ^ J. Clune, R. T. Pennock, and C. Ofria. "The sensitivity of HyperNEAT to different geometric representations of a problem". Proceedings of the Genetic and Evolutionary Computation Conference (GECCO). Montreal, Canada, 2009.
  11. ^ Yosinski J, Clune J, Hidalgo D, Nguyen S, Cristobal Zagal J, Lipson H (2011). "Evolving Robot Gaits in Hardware: the HyperNEAT Generative Encoding vs. Parameter Optimization". Proceedings of the European Conference on Artificial Life.
  12. ^ Lee S, Yosinski J, Glette K, Lipson H, Clune J (2013). "Evolving gaits for physical robots with the HyperNEAT generative encoding: the benefits of simulation". Applications of Evolutionary Computing. Springer.
  13. ^ Lee S, Yosinski J, Glette K, Lipson H, Clune J (2013). "Evolving gaits for physical robots with the HyperNEAT generative encoding: the benefits of simulation". Applications of Evolutionary Computing. 540–549.
  14. ^ Clune J, Stanley KO, Pennock RT, and Ofria C. "On the performance of indirect encoding across the continuum of regularity". IEEE Transactions on Evolutionary Computation, 2011.
  15. ^ J. Clune, C. Ofria, and R. T. Pennock, "How a generative encoding fares as problem-regularity decreases," in PPSN (G. Rudolph, T. Jansen, S. M. Lucas, C. Poloni, and N. Beume, eds.), vol. 5199 of Lecture Notes in Computer Science, pp. 358–367, Springer, 2008.
  16. ^ Clune J, Beckmann BE, Pennock RT, and Ofria C. "HybrID: A Hybridization of Indirect and Direct Encodings for Evolutionary Computation". Proceedings of the European Conference on Artificial Life (ECAL). Budapest, Hungary, 2009.
  17. ^ Clune J, Beckmann BE, McKinley PK, and Ofria C (2010). "Investigating whether HyperNEAT produces modular neural networks". Proceedings of the Genetic and Evolutionary Computation Conference. 635–642.
  18. ^ Suchorzewski M, Clune J (2011). "A Novel Generative Encoding for Evolving Modular, Regular and Scalable Networks". Proceedings of the Genetic and Evolutionary Computation Conference. 1523–1530.
  19. ^ Verbancsics P and Stanley KO (2011). "Constraining Connectivity to Encourage Modularity in HyperNEAT". Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2011). New York, NY: ACM.
  20. ^ Clune J, Lipson H (2011). "Evolving three-dimensional objects with a generative encoding inspired by developmental biology". Proceedings of the European Conference on Artificial Life. 144–148. See EndlessForms.com.
  21. ^ Sebastian Risi and Kenneth O. Stanley, "A Unified Approach to Evolving Plasticity and Neural Geometry". In: Proceedings of the International Joint Conference on Neural Networks (IJCNN 2012). Piscataway, NJ: IEEE, 2012.
