BACK-PROPAGATION ALGORITHM WHICH VARIES THE NUMBER OF HIDDEN UNITS

Cited: 330
Authors
HIROSE, Y
YAMASHITA, K
HIJIYA, S
Keywords
NEURAL NETWORKS; BACK-PROPAGATION; HIDDEN UNIT; LOCAL MINIMUM; WEIGHT
DOI
10.1016/0893-6080(91)90032-Z
CLC number
TP18 [Theory of Artificial Intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
This report presents a back-propagation algorithm that varies the number of hidden units during training. The algorithm is expected to escape local minima and eliminates the need to fix the number of hidden units in advance. We tested it on two examples: exclusive-OR learning and 8 × 8 dot alphanumeric font learning. In both examples, the probability of becoming trapped in local minima was reduced. Furthermore, in alphanumeric font learning, the network converged two to three times faster than conventional back-propagation.
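The idea the abstract describes, running plain back-propagation and growing the hidden layer when training stalls, can be sketched as follows. This is an illustrative reconstruction under assumptions, not the authors' exact procedure: the stall test, thresholds, learning rate, and initialisation here are all invented for the demonstration (the XOR task itself is from the paper).

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR task from the abstract: 4 patterns, 2 inputs, 1 output.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def init(h):
    """Small random 2-h-1 network (assumed initialisation scheme)."""
    return (rng.normal(0, 0.5, (2, h)), np.zeros(h),
            rng.normal(0, 0.5, (h, 1)), np.zeros(1))

def add_unit(W1, b1, W2, b2):
    """Grow the hidden layer by one randomly initialised unit."""
    W1 = np.hstack([W1, rng.normal(0, 0.5, (2, 1))])
    b1 = np.append(b1, 0.0)
    W2 = np.vstack([W2, rng.normal(0, 0.5, (1, 1))])
    return W1, b1, W2, b2

W1, b1, W2, b2 = init(2)
lr, prev_err, stall = 0.5, np.inf, 0
for epoch in range(20000):
    H = sigmoid(X @ W1 + b1)            # hidden activations
    Y = sigmoid(H @ W2 + b2)            # network output
    err = float(np.mean((Y - T) ** 2))  # mean squared error
    if err < 1e-3:
        break
    # Standard back-propagation deltas for sigmoid units.
    dY = (Y - T) * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * (H.T @ dY); b2 -= lr * dY.sum(axis=0)
    W1 -= lr * (X.T @ dH); b1 -= lr * dH.sum(axis=0)
    # Assumed stall test: count epochs with negligible improvement,
    # and add a hidden unit when progress has stopped for a while.
    stall = stall + 1 if prev_err - err < 1e-6 else 0
    prev_err = err
    if stall > 200:
        W1, b1, W2, b2 = add_unit(W1, b1, W2, b2)
        stall = 0

print(f"final MSE {err:.4f} with {W1.shape[1]} hidden units")
```

The added unit's random weights perturb the error surface, which is one plausible reading of why such growth helps the network leave a local minimum; the paper's own schedule for when and how units are added (and removed) may differ.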
Pages: 61-66
Page count: 6
Related papers (3)
[1] Rumelhart, D. E. (1987). LEARNING INTERNAL RE, p. 318.
[2] Rumelhart, D. E., Hinton, G. E., & Williams, R. J. (1986). Learning representations by back-propagating errors. Nature, 323(6088), 533-536.
[3] Sejnowski, T. J. (1987). Complex Systems, 1, 145.