ENHANCED TRAINING ALGORITHMS, AND INTEGRATED TRAINING/ARCHITECTURE SELECTION FOR MULTILAYER PERCEPTRON NETWORKS

Cited by: 58
Author
BELLO, MG
Affiliation
[1] Charles Stark Draper Laboratory, Inc., Cambridge, MA
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1992, Vol. 3, No. 06
Keywords
DOI
10.1109/72.165589
CLC Number
TP18 [Artificial intelligence theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The standard backpropagation based multilayer perceptron training algorithm suffers from a slow asymptotic convergence rate. In the work reported here, sophisticated nonlinear least squares and quasi-Newton optimization techniques are employed to construct enhanced multilayer perceptron training algorithms, which are then compared to the backpropagation algorithm in the context of several example problems. In addition, an integrated approach to training and architecture selection that employs the described enhanced algorithms is presented, and its effectiveness illustrated in the context of synthetic and actual pattern recognition problems.
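As an illustrative sketch only (not the paper's exact formulation), the following Python snippet trains a small one-hidden-layer perceptron by minimizing the sum-of-squared-error cost with a quasi-Newton (BFGS) optimizer from scipy.optimize, the kind of second-order alternative to plain gradient-descent backpropagation that the abstract describes. The network sizes, the XOR-style data, and the helper names are assumptions made for demonstration.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 2, 4, 1

# XOR-style toy pattern-recognition data (assumed for demonstration).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def unpack(w):
    """Split the flat parameter vector into the two weight/bias layers."""
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = w[i:i + n_out]
    return W1, b1, W2, b2

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    H = np.tanh(X @ W1 + b1)                   # hidden-layer activations
    Y = 1.0 / (1.0 + np.exp(-(H @ W2 + b2)))   # sigmoid output layer
    return H, Y

def cost_and_grad(w):
    """Sum-of-squared-error cost and its gradient (standard backprop formulas)."""
    W1, b1, W2, b2 = unpack(w)
    H, Y = forward(w, X)
    E = Y - T
    cost = 0.5 * np.sum(E ** 2)
    dY = E * Y * (1.0 - Y)              # delta at the output layer
    dH = (dY @ W2.T) * (1.0 - H ** 2)   # delta at the hidden layer
    grad = np.concatenate([
        (X.T @ dH).ravel(), dH.sum(axis=0),
        (H.T @ dY).ravel(), dY.sum(axis=0),
    ])
    return cost, grad

w0 = 0.5 * rng.standard_normal(n_in * n_hid + n_hid + n_hid * n_out + n_out)
res = minimize(cost_and_grad, w0, jac=True, method="BFGS",
               options={"gtol": 1e-8, "maxiter": 500})
print("final SSE cost:", res.fun)
print("outputs:", forward(res.x, X)[1].ravel())

A nonlinear least-squares variant in the spirit of NL2SOL (reference [5] below) could be sketched analogously by passing the per-pattern residuals to scipy.optimize.least_squares instead of minimizing the scalar cost; this is a design note, not the paper's implementation.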
Pages: 864-875
Page count: 12
Related References
20 items in total
[1] AKAIKE H. STATISTICAL PREDICTOR IDENTIFICATION [J]. ANNALS OF THE INSTITUTE OF STATISTICAL MATHEMATICS, 1970, 22 (02): 203-217.
[2] [Anonymous], 1988, INT C NEUR INF PROC
[3] BARRON AR, 1984, SELF ORG METHODS MOD, pCH4
[4] DENNIS JE, 1981, ACM T MATH SOFTWARE, V7, P348, DOI 10.1145/355958.355965
[5] DENNIS JE, GAY DM, WELSCH RE. ALGORITHM 573 - NL2SOL - AN ADAPTIVE NON-LINEAR LEAST-SQUARES ALGORITHM [E4] [J]. ACM TRANSACTIONS ON MATHEMATICAL SOFTWARE, 1981, 7 (03): 369-383.
[6] DENNIS JE, 1982, NUMERICAL METHODS UN
[7] Golub G.H., 1983, MATRIX COMPUTATIONS
[8] HORNIK K, 1989, NEURAL NETWORKS, V2, P359
[9] KRAMER AH, 1989, ADV NEURAL INFORMATI, P40
[10] Linhart H., 1986, MODEL SELECTION