Toward optimally distributed computation

Cited by: 11
Authors
Edwards, PJ [1]
Murray, AF [1]
Affiliation
[1] Univ Edinburgh, Dept Elect Engn, Edinburgh EH9 3JL, Midlothian, Scotland
DOI: 10.1162/089976698300017593
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
This article introduces the concept of optimally distributed computation in feedforward neural networks via regularization of weight saliency. By constraining the relative importance of the parameters, computation can be distributed thinly and evenly throughout the network. We propose that this will have beneficial effects on fault-tolerance performance and generalization ability in large network architectures. These theoretical predictions are verified by simulation experiments on two problems: one artificial and the other a real-world task. In summary, this article presents regularization terms for distributing neural computation optimally.
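The core idea above is to regularize weight saliency so that no single parameter dominates the network's computation. As a minimal sketch (assuming a simple proxy in which a weight's saliency is approximated by its squared magnitude, rather than the noise-derived saliency measure used in the article), a penalty on the variance of per-weight saliencies pushes computation toward an even distribution:

```python
import numpy as np

def saliency_penalty(w, lam=0.5):
    """Penalty on the variance of per-weight saliencies.

    Saliency is approximated here as s_i = w_i**2 (an illustrative
    assumption, not the article's exact saliency measure).
    Returns (penalty, gradient w.r.t. w).
    """
    s = w ** 2
    m = s.mean()
    penalty = lam * ((s - m) ** 2).mean()
    # d/dw_i of mean((s - m)^2): the mean-shift terms cancel because
    # sum_j (s_j - m) = 0, leaving lam * (4/n) * (s_i - m) * w_i.
    grad = lam * (4.0 / w.size) * (s - m) * w
    return penalty, grad

# Gradient descent on the penalty alone: saliencies become more uniform.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 1.0, size=50)
var_before = np.var(w ** 2)
for _ in range(200):
    _, g = saliency_penalty(w)
    w -= 0.1 * g
var_after = np.var(w ** 2)
```

In practice this term would be added to the task loss, so training trades accuracy against evenness of saliency; minimizing it alone simply drives all weight magnitudes toward a common value.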
Pages: 987-1005
Page count: 19
References (28 total)
[1] Barron, AR; Cover, TM. Minimum complexity density estimation. IEEE Transactions on Information Theory, 1991, 37(4): 1034-1054.
[2] Bishop, C. M., 1995, Neural Networks for Pattern Recognition.
[3] Bishop, CM. Training with noise is equivalent to Tikhonov regularization. Neural Computation, 1995, 7(1): 108-116.
[4] Bishop, CM, 1990, P INT NEURAL NETWORK, V2, P749.
[5] Bolt, G, 1992, Thesis, Univ. York.
[6] Buntine, W. L., 1991, Complex Systems, V5, P603.
[7] Carter, M, 1988, P TECHN SESS FAULT T.
[8] Edwards, P, 1996, P INT C NEUR NETW WA, V1, P78.
[9] Edwards, P, in press, IEEE P CIRC, V2.
[10] Edwards, PJ; Murray, AF. Can deterministic penalty terms model the effects of synaptic weight noise on network fault-tolerance? International Journal of Neural Systems, 1995, 6(4): 401-416.