The dependence identification neural network construction algorithm

Cited by: 36
Authors
Moody, JO
Antsaklis, PJ
Affiliation
[1] Department of Electrical Engineering, University of Notre Dame, Notre Dame
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1996 | Vol. 7 | No. 1
Keywords
DOI
10.1109/72.478388
CLC number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
An algorithm for constructing and training multilayer neural networks, called dependence identification, is presented in this paper. Its distinctive features are that i) it transforms the training problem into a set of quadratic optimization problems that are solved via systems of linear equations, ii) it constructs an appropriate network to meet the training specifications, and iii) the resulting network architecture and weights can be further refined with standard training algorithms such as backpropagation. Together, these features significantly shorten network development time and reduce the trial and error usually associated with choosing a network architecture.
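As a rough illustration of feature i), the sketch below (Python with NumPy, not taken from the paper) shows how, once one layer of a network is held fixed, fitting the remaining weights against a quadratic error becomes a least-squares problem whose minimizer is given by a set of linear equations. The toy data, the random hidden layer, and all variable names are assumptions for illustration only, not the authors' dependence identification procedure.

import numpy as np

# Minimal sketch (not the paper's algorithm): with the hidden-layer
# weights fixed, fitting the output-layer weights is a quadratic
# (least-squares) problem whose minimizer satisfies a linear system.
rng = np.random.default_rng(0)

# Hypothetical toy training data: N samples, n inputs, m targets.
N, n, m, hidden = 200, 4, 2, 16
X = rng.standard_normal((N, n))
Y = rng.standard_normal((N, m))

# Fixed (here: random) hidden layer with a sigmoid nonlinearity.
W_hidden = rng.standard_normal((n, hidden))
H = 1.0 / (1.0 + np.exp(-X @ W_hidden))   # N x hidden activations

# Quadratic objective ||H W_out - Y||^2: its normal equations
# (H^T H) W_out = H^T Y are a set of linear equations in W_out.
W_out, *_ = np.linalg.lstsq(H, Y, rcond=None)

print("residual norm:", np.linalg.norm(H @ W_out - Y))

The paper's algorithm determines the hidden layer constructively rather than at random; the sketch captures only the reduction of a quadratic weight-fitting objective to linear equations described in the abstract.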
Pages: 3-15 (13 pages)