LEARNING IN MULTILAYERED NETWORKS USED AS AUTOASSOCIATORS

Cited by: 27
Authors:
BIANCHINI, M
FRASCONI, P
GORI, M
Affiliation:
[1] Dipartimento di Sistemi e Informatica, Università di Firenze
Source:
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1995 / Vol. 6 / No. 2
DOI:
10.1109/72.363492
Chinese Library Classification:
TP18 [Artificial Intelligence Theory]
Subject Classification Codes:
081104; 0812; 0835; 1405
Abstract:
Gradient descent learning algorithms may get stuck in local minima, making learning suboptimal. In this paper, we focus on multilayered networks used as autoassociators and show some relationships with classical linear autoassociators. In addition, using the theoretical framework of our previous research [3], we derive a condition that is met at the end of the learning process and show that this condition has a very intriguing geometrical meaning in the pattern space.
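The setting the abstract describes (a multilayered network trained by gradient descent to reproduce its input, compared against linear autoassociators) can be illustrated with a minimal sketch. This is not the paper's method, just a generic linear autoassociator with a bottleneck hidden layer trained on squared reconstruction error; all names (W1, W2, lr) and sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 4))        # 50 patterns, 4 inputs
h, lr = 2, 0.01                          # bottleneck size, learning rate
W1 = rng.standard_normal((h, 4)) * 0.1   # encoder weights (input -> hidden)
W2 = rng.standard_normal((4, h)) * 0.1   # decoder weights (hidden -> output)

for _ in range(2000):
    H = X @ W1.T                         # hidden activations
    Y = H @ W2.T                         # reconstructions of the input
    E = Y - X                            # reconstruction error
    gW2 = E.T @ H / len(X)               # gradient of 0.5*MSE w.r.t. W2
    gW1 = (E @ W2).T @ X / len(X)        # gradient of 0.5*MSE w.r.t. W1
    W2 -= lr * gW2                       # plain gradient descent step
    W1 -= lr * gW1

mse = np.mean((X @ W1.T @ W2.T - X) ** 2)
```

For the linear case, results such as [1] and [2] show the optimum of this cost projects the data onto its top principal subspace, so the trained network should reconstruct the patterns markedly better than the zero map even though the bottleneck prevents perfect reconstruction.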
Pages: 512-515 (4 pages)
References:
7 items
[1] Baldi, P., Hornik, K. Neural networks and principal component analysis: learning from examples without local minima. Neural Networks, 1989, 2(1): 53-58.
[2] Bourlard, H., Kamp, Y. Auto-association by multilayer perceptrons and singular value decomposition. Biological Cybernetics, 1988, 59(4-5): 291-294.
[3] Gori, M., Tesi, A. On the problem of local minima in backpropagation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1992, 14(1): 76-86.
[4] Lastrucci, L. APR P INT WORKSH AUT, 1994, p. 189.
[5] McClelland, J.L. EXPLORATIONS PARALLE, 1988, p. 132.
[6] Poston, T. JUL P IEEE IJCNN91 S, 1991, vol. 2, p. 173.
[7] Yu, X.H. Can backpropagation error surface not have local minima. IEEE Transactions on Neural Networks, 1992, 3(6): 1019-1021.