A Hypercube-Based Encoding for Evolving Large-Scale Neural Networks

Cited by: 520
Authors
Stanley, Kenneth O. [1 ]
D'Ambrosio, David B. [1 ]
Gauci, Jason [1 ]
Affiliation
[1] Univ Cent Florida, Sch Elect Engn & Comp Sci, Orlando, FL 32816 USA
Keywords
Compositional pattern-producing networks; CPPNs; HyperNEAT; indirect encoding; hypercube-based; NeuroEvolution of Augmenting Topologies; artificial embryogeny; BRAIN;
DOI
10.1162/artl.2009.15.2.15202
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Subject Classification Code
140502 [Artificial Intelligence]
Abstract
Research in neuroevolution, that is, evolving artificial neural networks (ANNs) through evolutionary algorithms, is inspired by the evolution of biological brains, which can contain trillions of connections. Yet while neuroevolution has produced successful results, the scale of natural brains remains far beyond reach. This article presents a method called hypercube-based NeuroEvolution of Augmenting Topologies (HyperNEAT) that aims to narrow this gap. HyperNEAT employs an indirect encoding called connective compositional pattern-producing networks (CPPNs) that can produce connectivity patterns with symmetries and repeating motifs by interpreting spatial patterns generated within a hypercube as connectivity patterns in a lower-dimensional space. This approach can exploit the geometry of the task by mapping its regularities onto the topology of the network, thereby shifting problem difficulty away from dimensionality to the underlying problem structure. Furthermore, connective CPPNs can represent the same connectivity pattern at any resolution, allowing ANNs to scale to new numbers of inputs and outputs without further evolution. HyperNEAT is demonstrated through visual discrimination and food-gathering tasks, including successful visual discrimination networks containing over eight million connections. The main conclusion is that the ability to explore the space of regular connectivity patterns opens up a new class of complex high-dimensional tasks to neuroevolution.
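The mechanism the abstract describes, interpreting a spatial pattern over a hypercube as a connectivity pattern, amounts to querying a four-dimensional function CPPN(x1, y1, x2, y2) for every pair of node coordinates laid out on a 2D substrate. The sketch below illustrates this under one loud assumption: the `cppn` function here is a hypothetical, hand-written stand-in built from symmetric and periodic primitives, whereas in HyperNEAT the CPPN's topology and weights are evolved by NEAT.

```python
import math

# HYPOTHETICAL stand-in for an evolved CPPN: in HyperNEAT this network's
# structure is evolved, not fixed. Gaussian-of-difference yields symmetry;
# sine yields repetition, the kinds of motifs the encoding can express.
def cppn(x1, y1, x2, y2):
    dx, dy = x2 - x1, y2 - y1
    return math.exp(-(dx * dx + dy * dy)) * math.sin(math.pi * (x1 + y1))

def substrate_weights(resolution, threshold=0.2):
    """Query the CPPN for every (source, target) pair of nodes on a
    resolution x resolution grid spanning [-1, 1]^2. Outputs with
    magnitude below the threshold are treated as no connection."""
    coords = [-1 + 2 * i / (resolution - 1) for i in range(resolution)]
    weights = {}
    for x1 in coords:
        for y1 in coords:
            for x2 in coords:
                for y2 in coords:
                    w = cppn(x1, y1, x2, y2)
                    if abs(w) > threshold:
                        weights[(x1, y1, x2, y2)] = w
    return weights

# The same CPPN generates the pattern at any substrate resolution, which
# is how an ANN can scale to more inputs/outputs without further evolution.
low_res = substrate_weights(5)
high_res = substrate_weights(11)
```

Note how connection count grows with resolution while the generating description (the CPPN) stays fixed; this is the sense in which the encoding is "indirect" and why million-connection networks remain evolvable.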
Pages: 185-212
Page count: 28