ADVANCED SUPERVISED LEARNING IN MULTILAYER PERCEPTRONS - FROM BACKPROPAGATION TO ADAPTIVE LEARNING ALGORITHMS

Citations: 307
Author
RIEDMILLER, M [1 ]
Affiliation
[1] UNIV KARLSRUHE,INST LOG KOMPLEXITAT & DEDUKT SYST,D-76128 KARLSRUHE,GERMANY
Keywords
SUPERVISED LEARNING; MULTILAYER PERCEPTRONS; FEEDFORWARD NETWORKS; ADAPTIVE LEARNING ALGORITHMS; BENCHMARK PROBLEMS; ROBUSTNESS;
DOI
10.1016/0920-5489(94)90017-5
CLC classification
TP3 [Computing technology, computer technology];
Discipline classification code
0812 ;
Abstract
Since the presentation of the backpropagation algorithm [1], a vast variety of improvements of the technique for training the weights in a feed-forward neural network have been proposed. The following article introduces the concept of supervised learning in multi-layer perceptrons based on the technique of gradient descent. Some problems and drawbacks of the original backpropagation learning procedure are discussed, eventually leading to the development of more sophisticated techniques. This article concentrates on adaptive learning strategies. Some of the most popular learning algorithms are described and discussed according to their classification in terms of global and local adaptation strategies. The behavior of several learning procedures on some popular benchmark problems is reported, thereby illuminating convergence, robustness, and scaling properties of the respective algorithms.
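The local adaptation strategies surveyed here include Riedmiller's RPROP algorithm [8], which maintains an individual step size per weight and adapts it from the sign of the partial derivative alone. A minimal sketch of such a sign-based update rule on a toy objective (the step-size constants, the skip-on-sign-flip choice, and the quadratic objective are illustrative assumptions, not details taken from the article):

```python
import numpy as np

def sign_adaptive_step(grad, prev_grad, step, eta_plus=1.2, eta_minus=0.5,
                       step_min=1e-6, step_max=50.0):
    """One sign-based adaptive update in the spirit of RPROP.

    Each per-weight step size grows by eta_plus while its gradient keeps
    the same sign and shrinks by eta_minus when the sign flips; the
    weight change uses only the sign of the gradient, not its magnitude.
    """
    same_sign = grad * prev_grad
    step = np.where(same_sign > 0, np.minimum(step * eta_plus, step_max), step)
    step = np.where(same_sign < 0, np.maximum(step * eta_minus, step_min), step)
    # After a sign flip, suppress the update once so the weight is not
    # immediately pushed back across the minimum (an illustrative choice).
    grad = np.where(same_sign < 0, 0.0, grad)
    return -np.sign(grad) * step, step, grad

# Toy objective f(w) = 0.5 * ||w||^2, whose gradient is simply w.
w = np.array([4.0, -3.0])
step = np.full_like(w, 0.1)       # initial per-weight step sizes
prev_grad = np.zeros_like(w)
for _ in range(100):
    delta_w, step, prev_grad = sign_adaptive_step(w, prev_grad, step)
    w += delta_w
print(w)  # both components settle close to the minimum at 0
```

Because only gradient signs enter the weight change, the update is insensitive to the scale of the error surface, which is one reason such local strategies are reported to be robust across the benchmark problems discussed in the article.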
Pages: 265 - 278
Number of pages: 14
References
12 items in total
  • [1] [Anonymous], 1987, LEARNING INTERNAL RE
  • [2] FAHLMAN S, 1988, CMUCS88162 CARN U TE
  • [3] Herz J., 1991, INTRO THEORY NEURAL
  • [4] JACOBS R, 1988, NEURAL NETWORKS, V1
  • [5] KRAMER AH, 1989, ADV NEURAL INFORMATI, V1
  • [6] LANG KJ, 1988, 1988 P CONN MOD SUMM
  • [7] A SCALED CONJUGATE-GRADIENT ALGORITHM FOR FAST SUPERVISED LEARNING
    MOLLER, MF
    [J]. NEURAL NETWORKS, 1993, 6 (04) : 525 - 533
  • [8] RIEDMILLER M, 1993, P IEEE INT C NEUR NE, P586
  • [9] SALOMON R, 1990, LECTURE NOTES COMPUT, V1, P269
  • [10] SCHIFFMANN W, 1993, OPTIMIZATION BACKPRO