PRUNING ALGORITHMS - A SURVEY

Cited by: 1057
Authors
REED, R
Affiliation
[1] Department of Electrical Engineering, University of Washington, FT-10, Seattle
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1993 / Vol. 4 / Issue 05
DOI
10.1109/72.248452
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A rule of thumb for obtaining good generalization in systems trained by examples is that one should use the smallest system that will fit the data. Unfortunately, it usually is not obvious what size is best; a system that is too small will not be able to learn the data while one that is just big enough may learn very slowly and be very sensitive to initial conditions and learning parameters. This paper is a survey of neural network pruning algorithms. The approach taken by the methods described here is to train a network that is larger than necessary and then remove the parts that are not needed.
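The "train a larger network, then remove the parts that are not needed" premise can be illustrated with magnitude pruning, one of the simplest sensitivity measures of the kind the survey covers. This is a minimal sketch, not Reed's notation or any specific algorithm from the paper; the function name and the `fraction` parameter are illustrative assumptions.

```python
def magnitude_prune(weights, fraction=0.5):
    """Zero out roughly the smallest-magnitude `fraction` of weights.

    A hypothetical sketch of the train-large-then-prune idea: after
    training, weights with small |w| are treated as 'not needed' and
    removed (set to zero). Ties at the threshold may prune slightly
    more than the requested fraction.
    """
    k = int(len(weights) * fraction)
    if k == 0:
        return list(weights)
    # k-th smallest absolute value becomes the pruning threshold
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]

print(magnitude_prune([0.9, -0.05, 0.02, -1.2], fraction=0.5))
# -> [0.9, 0.0, 0.0, -1.2]
```

In practice the surveyed methods differ mainly in how they estimate which parameters are "not needed" (e.g. weight magnitude versus an estimate of the change in training error), and in whether the network is retrained after each removal.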
Pages: 740 - 747
Page count: 8