Empirical evaluation of the improved Rprop learning algorithms

Cited by: 269
Authors
Igel, C [1 ]
Hüsken, M [1 ]
Affiliation
[1] Ruhr Univ Bochum, Inst Neuroinformat, D-44780 Bochum, Germany
Keywords
supervised learning; resilient backpropagation (Rprop); gradient-based optimization;
DOI
10.1016/S0925-2312(01)00700-7
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The Rprop algorithm proposed by Riedmiller and Braun is one of the best performing first-order learning methods for neural networks. We discuss modifications of this algorithm that improve its learning speed. The new optimization methods are empirically compared to the existing Rprop variants, the conjugate gradient method, Quickprop, and the BFGS algorithm on a set of neural network benchmark problems. The improved Rprop outperforms the other methods; only the BFGS performs better in the later stages of learning on some of the test problems. For the analysis of the local search behavior, we compare the Rprop algorithms on general hyperparabolic error landscapes, where the new variants confirm their improvement. (C) 2002 Elsevier Science B.V. All rights reserved.
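The abstract refers to modified Rprop variants without spelling out the update rule. As background, the sign-based, per-weight step-size adaptation at the core of Rprop (here in the iRprop⁻ flavour, one of the improved variants discussed in this line of work) can be sketched as follows. This is an illustrative sketch, not the paper's exact procedure; the function names are made up, and the constants η⁺ = 1.2, η⁻ = 0.5 are the values commonly quoted in the Rprop literature:

```python
import numpy as np

def irprop_minus(grad_fn, w, n_iters=100,
                 eta_plus=1.2, eta_minus=0.5,
                 delta_init=0.1, delta_min=1e-6, delta_max=50.0):
    """Sketch of the iRprop- update rule (illustrative, not the
    paper's exact code). Each weight keeps its own step size, which
    grows when successive partial derivatives agree in sign and
    shrinks when they disagree; after a sign change the stored
    gradient is zeroed (the 'minus' modification) so no step is
    reverted."""
    delta = np.full_like(w, delta_init)   # per-weight step sizes
    grad_prev = np.zeros_like(w)
    for _ in range(n_iters):
        grad = grad_fn(w)
        sign = grad * grad_prev
        # same sign: grow the step, capped at delta_max
        delta = np.where(sign > 0,
                         np.minimum(delta * eta_plus, delta_max), delta)
        # sign flip: shrink the step and drop the gradient for this round
        delta = np.where(sign < 0,
                         np.maximum(delta * eta_minus, delta_min), delta)
        grad = np.where(sign < 0, 0.0, grad)
        # move each weight opposite the sign of its partial derivative
        w = w - np.sign(grad) * delta
        grad_prev = grad
    return w

# Toy "error landscape": minimize the separable quadratic E(w) = sum(w_i^2),
# whose gradient is 2w; the minimum is at the origin.
w_opt = irprop_minus(lambda w: 2.0 * w, np.array([4.0, -3.0]))
```

Because only the sign of the gradient enters the update, the method is insensitive to the magnitude of the error derivatives, which is one reason Rprop-family methods perform well across the benchmark problems compared in the paper.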
Pages: 105-123 (19 pages)
References
28 records total
[1] Braun H, 1997, NEURONALE NETZE OPTI
[2] Fahlman SE, 1988, Proceedings of the 1988 Connectionist Models Summer School, P38
[3] Hansen N, 1997, EUFIT, V97, P650
[4] Hansen N, 1995, P 6 INT C GEN ALG, P57
[5] Igel C, 1999, FROM ANIM ANIMAT, P191
[6] Igel C, 2000, P 2 INT ICSC S NEUR, V2000, P115
[7] Jacobs RA, 1988, Increased rates of convergence through learning rate adaptation, Neural Networks, V1(4), P295-307
[8] Joost M, Schiffmann W, 1998, Speeding up backpropagation algorithms by using Cross-Entropy combined with Pattern Normalization, International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, V6(2), P117-126
[9] LeCun Y, Bottou L, Orr GB, Müller KR, 1998, Efficient backprop, Neural Networks: Tricks of the Trade, V1524, P9-50
[10] Lorenz EN, 1963, J Atmos Sci, V20, P130, DOI 10.1175/1520-0469(1963)020<0130:DNF>2.0.CO