Worst case analysis of weight inaccuracy effects in multilayer perceptrons

Cited by: 14
Authors
Anguita, D [1 ]
Ridella, S [1 ]
Rovetta, S [1 ]
Affiliation
[1] Univ Genoa, Dept Biophys & Elect Engn, I-16126 Genoa, Italy
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1999, Vol. 10, No. 2
Keywords
interval arithmetic; multilayer perceptron; quantization; robustness;
DOI
10.1109/72.750571
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We derive a new method for analyzing weight quantization effects in multilayer perceptrons, based on the application of interval arithmetic. Unlike previous results, we find worst-case bounds on the errors due to weight quantization that are valid for every distribution of the input or weight values. Given a trained network, our method makes it easy to compute the minimum number of bits needed to encode its weights.
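The interval-arithmetic idea described in the abstract can be sketched as follows. This is a hypothetical illustration, not the paper's actual algorithm: the one-layer tanh network, the quantization step `q = w_range / 2**n_bits`, and the tolerance `tol` are all assumptions made for the example. Each quantized weight is replaced by an interval covering its possible true value, the intervals are propagated through the network, and the smallest bit width whose worst-case output interval is narrow enough is reported.

```python
# Illustrative sketch (not the paper's exact method): bound the worst-case
# output error of a tanh layer whose weights are quantized to n_bits.
import math

def interval_dot(x, w_lo, w_hi):
    """Interval of sum_i x_i * w_i when each w_i lies in [w_lo_i, w_hi_i]."""
    lo = hi = 0.0
    for xi, wl, wh in zip(x, w_lo, w_hi):
        a, b = xi * wl, xi * wh   # endpoint products; order flips if xi < 0
        lo += min(a, b)
        hi += max(a, b)
    return lo, hi

def forward_interval(x, weights, n_bits, w_range=1.0):
    """Worst-case output interval per neuron under n_bits quantization.

    Assumes a quantization step q = w_range / 2**n_bits, so each stored
    weight is within q/2 of the true value (an assumption of this sketch).
    """
    q = w_range / (2 ** n_bits)
    out_lo, out_hi = [], []
    for w in weights:             # one weight vector per neuron
        lo, hi = interval_dot(x,
                              [wi - q / 2 for wi in w],
                              [wi + q / 2 for wi in w])
        # tanh is monotonic, so interval endpoints map to endpoints
        out_lo.append(math.tanh(lo))
        out_hi.append(math.tanh(hi))
    return out_lo, out_hi

def min_bits(x, weights, tol=1e-2, max_bits=32):
    """Smallest bit width whose worst-case output width stays below tol."""
    for b in range(1, max_bits + 1):
        lo, hi = forward_interval(x, weights, b)
        if max(h - l for l, h in zip(lo, hi)) < tol:
            return b
    return None
```

Because the bound holds for the entire interval, it is valid regardless of how the true weights are distributed inside it, which is the distribution-free property the abstract claims for the paper's method.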
Pages: 415-418
Page count: 4
Related articles
14 records in total
[11] Lippmann, R. P. Computer Architecture News, 1988, 16: 7. DOI: 10.1109/MASSP.1987.1165576, 10.1145/44571.44572.
[12] Ruck, D. W. IEEE Transactions on Neural Networks, 1990, 1.
[13] Sakaue, S.; Kohda, T.; Yamamoto, H.; Maruno, S.; Shimeki, Y. Reduction of required precision bits for backpropagation applied to pattern recognition. IEEE Transactions on Neural Networks, 1993, 4(2): 270-275.
[14] Xie, Y.; Jabri, M. A. Analysis of the effects of quantization in multilayer neural networks using a statistical model. IEEE Transactions on Neural Networks, 1992, 3(2): 334-338.