On the convergence of a growing topology neural algorithm

Cited by: 3
Authors
Drago, GP [1 ]
Ridella, S [1 ]
Affiliations
[1] University of Genoa, DIBE, I-16145 Genoa, Italy
Keywords
neural network computing; learning; adaptive architecture; growing neural networks; Cascade Correlation; function approximation;
DOI
10.1016/0925-2312(95)00120-4
CLC classification
TP18 [Artificial intelligence theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A convergence theorem for Cascade Correlation is presented and, based on it, results on the convergence speed of the algorithm are obtained. The cost of adding hidden units can be forecast, which makes it possible to verify whether Cascade Correlation is working properly. A critical test is presented in which the growing process must be modified to avoid an unsatisfactory outcome; a parameter for detecting such occurrences is proposed and remedies are suggested. The results obtained in this paper may be used as a kernel for building new types of growing networks whose topology may be better matched with the training data.
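As background for the abstract, the candidate-selection step of the standard Cascade Correlation architecture (Fahlman and Lebiere, 1990) can be sketched as follows. This is a minimal illustrative sketch, not the method analyzed in this paper: the function names, the tanh activation, and the toy data are assumptions made for the example.

```python
import math
import random

# Illustrative sketch of Cascade Correlation's candidate step:
# among a pool of candidate hidden units, keep the one whose
# output correlates most strongly with the residual error.

def candidate_score(values, errors):
    """Covariance-magnitude score |sum_p (V_p - Vbar)(E_p - Ebar)|."""
    v_mean = sum(values) / len(values)
    e_mean = sum(errors) / len(errors)
    return abs(sum((v - v_mean) * (e - e_mean)
                   for v, e in zip(values, errors)))

def best_candidate(inputs, errors, weight_pool):
    """Score each candidate weight vector; return the
    (score, weights) pair with the largest correlation score."""
    scored = []
    for w in weight_pool:
        # Candidate unit output: tanh of the weighted input sum.
        outs = [math.tanh(sum(wi * xi for wi, xi in zip(w, x)))
                for x in inputs]
        scored.append((candidate_score(outs, errors), w))
    return max(scored, key=lambda t: t[0])

random.seed(0)
inputs = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(20)]
errors = [x[0] - 0.5 * x[1] for x in inputs]  # toy residual error
pool = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(8)]
score, w = best_candidate(inputs, errors, pool)
print(round(score, 3))
```

In the full algorithm the winning candidate's input weights are then frozen and the unit is installed as a new hidden layer; the paper's convergence results concern how the training cost evolves as these units are added.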
Pages: 171-185
Page count: 15