EFFECT OF NONLINEAR TRANSFORMATIONS ON CORRELATION BETWEEN WEIGHTED SUMS IN MULTILAYER PERCEPTRONS

Cited by: 13
Authors:
OH, SH
LEE, YJ
Affiliation:
[1] Research Department, Electronics and Telecommunications Research Institute, Daeduk Science Town, Daejeon
Source:
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1994, Vol. 5, No. 3
DOI: 10.1109/72.286927
CLC Classification:
TP18 [Artificial Intelligence Theory];
Discipline Codes:
081104; 0812; 0835; 1405;
Abstract:
Nonlinear transformation is one of the major obstacles to analyzing the properties of multilayer perceptrons. In this letter, we prove that the correlation coefficient between two jointly Gaussian random variables decreases when each of them is transformed by a continuous nonlinear transformation that can be approximated by a piecewise linear function. When the inputs or the weights of a multilayer perceptron are perturbed randomly, the weighted sums arriving at the hidden neurons are asymptotically jointly Gaussian random variables. Since a sigmoidal transformation can be approximated piecewise linearly, the correlations among the weighted sums decrease under sigmoidal transformations. Based on this result, we can say that the sigmoidal transformation used as the transfer function of the multilayer perceptron reduces redundancy in the information content of the hidden neurons.
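The abstract's central claim — that sigmoidal transformation of jointly Gaussian weighted sums reduces their correlation — can be checked numerically. Below is a minimal Monte Carlo sketch (not from the paper; the input correlation `rho`, the sample size, and the choice of `tanh` as the sigmoid are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8        # assumed correlation of the jointly Gaussian weighted sums
n = 200_000      # Monte Carlo sample size (arbitrary)

# Sample jointly Gaussian (u, v) with zero mean, unit variance, correlation rho,
# standing in for the weighted sums arriving at two hidden neurons.
cov = np.array([[1.0, rho], [rho, 1.0]])
u, v = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

# Apply the sigmoidal (tanh) transfer function to both weighted sums.
x, y = np.tanh(u), np.tanh(v)

# Correlation after the nonlinear transformation.
rho_out = np.corrcoef(x, y)[0, 1]
print(f"input correlation:  {rho:.3f}")
print(f"output correlation: {rho_out:.3f}")

# The letter's result: the magnitude of the correlation decreases.
assert abs(rho_out) < abs(rho)
```

With these parameters the output correlation comes out strictly below the input value of 0.8, consistent with the letter's result that sigmoidal transformations decorrelate the hidden-neuron activations.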
Pages: 508-510 (3 pages)