A penalty-function approach for pruning feedforward neural networks

Cited by: 137
Author
Setiono, R
Affiliation
[1] Dept. of Info. Syst. and Comp. Sci., National University of Singapore, Kent Ridge
Keywords
DOI
10.1162/neco.1997.9.1.185
CLC classification
TP18 [Artificial intelligence theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This article proposes the use of a penalty function for pruning feedforward neural networks by weight elimination. The proposed penalty function consists of two terms: the first discourages the use of unnecessary connections, and the second prevents the weights of those connections from taking excessively large values. Simple criteria for eliminating weights from the network are also given. The effectiveness of the penalty function is tested on three well-known benchmarks: the contiguity problem, the parity problems, and the MONK's problems. For many of these problems, the resulting pruned networks have fewer connections than previously reported in the literature.
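As a rough illustration only (an assumed form, not quoted from the paper), a two-term penalty of the kind the abstract describes can be written as

\[ P(w) = \epsilon_1 \sum_{i,j} \frac{\beta w_{ij}^{2}}{1 + \beta w_{ij}^{2}} + \epsilon_2 \sum_{i,j} w_{ij}^{2}, \qquad \epsilon_1, \epsilon_2, \beta > 0, \]

where the first, saturating term drives unneeded weights w_{ij} toward zero so their connections can be eliminated (its contribution per connection approaches \epsilon_1 for large weights, so it roughly counts active connections), while the second, quadratic term acts as standard weight decay and keeps the surviving weights from taking excessively large values.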
Pages: 185-204
Number of pages: 20