HyperNEAT: The First Five Years

David B. D’Ambrosio, Jason Gauci and Kenneth O. Stanley
Abstract HyperNEAT, which stands for Hypercube-based NeuroEvolution of Augmenting Topologies, is a method for evolving indirectly encoded artificial neural networks (ANNs) that was first introduced in 2007. By exploiting a unique indirect encoding called Compositional Pattern Producing Networks (CPPNs), which does not require a typical developmental stage, HyperNEAT introduced several novel capabilities to the field of neuroevolution (i.e. evolving artificial neural networks): (1) large ANNs can be compactly encoded by small genomes, (2) the size and resolution of evolved ANNs can scale up or down even after training is completed, and (3) neural structure can be evolved to exploit problem geometry. In the five years since its introduction, researchers have leveraged these capabilities to produce a broad range of successful experiments and extensions that highlight the potential for future research to build further on the ideas introduced by HyperNEAT. This chapter reviews those first five years of research and culminates with thoughts on promising future directions.
1 Introduction

HyperNEAT lies at the intersection of two research areas. One of these, neuroevolution, aims to harness the power of evolutionary computation to evolve artificial neural networks (ANNs) [33, 39, 82, 96]. The other area, called generative and developmental systems (GDS), studies how compact encodings can describe large,
complex structures for the purpose of evolution [11, 13, 50, 83]. Such encodings are sometimes called indirect encodings because each gene in the encoding does not map to a single corresponding unit of structure in the phenotype. The hope in both areas is that evolved artifacts will someday approach the complexity and power of the products of evolution in nature. At their intersection is the idea that indirect encoding might aid the evolution of ANNs by leveraging the properties of development. Before 2006, most indirect encodings worked by triggering a process of development that begins with a small embryonic structure and ultimately grows into the final phenotypic form [13, 32, 50, 60, 64, 74, 83]. The connection between development on the one hand and indirect encoding on the other is intuitive because natural DNA itself maps to the human or animal phenotype through a process of development that begins with the egg. However, in 2006 Stanley introduced a new kind of indirect encoding called a Compositional Pattern Producing Network (CPPN) that does not require a typical developmental stage.
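To make the idea of indirect encoding concrete, the following minimal Python sketch illustrates how a CPPN-style encoding can compactly describe the connectivity of an arbitrarily large ANN. In HyperNEAT the CPPN is queried with the coordinates of pairs of neurons laid out on a geometric substrate, and its output is interpreted as the connection weight between them; the particular hand-written composition of functions below is an illustrative assumption, since in practice the CPPN's topology and weights are what NEAT evolves.

import math

def cppn(x1, y1, x2, y2):
    # Toy CPPN: a fixed composition of simple functions. In real HyperNEAT
    # this network's structure is evolved by NEAT; the specific functions
    # here (sine for repetition, Gaussian for locality) are assumptions
    # chosen only to show how geometry maps to weights.
    dx, dy = x2 - x1, y2 - y1
    h = math.sin(3.0 * dx) + math.exp(-(dx * dx + dy * dy))
    return math.tanh(h)  # squash to a connection weight in [-1, 1]

def build_weights(resolution):
    # Query the CPPN for every pair of grid points on a 2-D substrate
    # spanning [-1, 1] x [-1, 1]. Because the encoding is a function of
    # geometry rather than a list of parameters, the same CPPN can be
    # re-queried at any resolution.
    coords = [(2 * i / (resolution - 1) - 1, 2 * j / (resolution - 1) - 1)
              for i in range(resolution) for j in range(resolution)]
    return {(a, b): cppn(a[0], a[1], b[0], b[1])
            for a in coords for b in coords}

# One compact genome (the cppn function) yields 625 weights at 5x5
# resolution and 14,641 weights at 11x11, without changing the genome.
print(len(build_weights(5)), len(build_weights(11)))

Because the weight pattern is generated by re-querying the same compact function at whatever resolution is desired, this sketch also hints at capability (2) from the abstract: an evolved network can be scaled up or down after training is complete.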