The nonlinear PCA learning rule in independent component analysis

Cited: 184
Authors
Oja, E
Affiliation
[1] Helsinki University of Technology, Lab. of Comp. and Info. Science, FIN-02150 Espoo
Keywords
principal component analysis; independent component analysis; unsupervised learning; signal separation;
DOI
10.1016/S0925-2312(97)00045-3
CLC classification
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
It has been verified experimentally that when nonlinear Principal Component Analysis (PCA) learning rules are used for the weights of a neural layer, the neurons have signal separation capabilities: the network performs Independent Component Analysis. The learning rule, earlier proposed by the author, is studied here mathematically to analyze why and how the algorithm works in this application. It is shown that the weight matrix obtained as the asymptotic solution of the nonlinear PCA learning rule is in some cases a rotation of the input vector to statistically independent directions. This explains why it can be used for image and speech signal separation. Sufficient conditions are formulated, depending on the nonlinear neuron activation function and on the probability densities of the original signal components. It is shown that a sigmoidal activation function is feasible for flat sub-Gaussian densities of the original signals, while polynomial activation functions are feasible for sharp super-Gaussian densities.
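The separation behavior described in the abstract can be sketched numerically. The following is a minimal demonstration, not the paper's own experiment: the mixing matrix, step size, sample count, and epoch schedule are illustrative assumptions. It mixes two flat sub-Gaussian (uniform) sources, whitens the mixtures, and trains a weight matrix with one common form of the nonlinear PCA subspace rule, using a sigmoidal nonlinearity (tanh) as the abstract suggests for sub-Gaussian densities.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent sub-Gaussian (uniform) sources, scaled to unit variance.
n = 20000
S = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(2, n))

# Mix the sources with an arbitrary (assumed) mixing matrix.
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = A @ S

# Whiten the mixtures: zero mean, identity covariance.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
V = E @ np.diag(d ** -0.5) @ E.T   # whitening matrix
Z = V @ X

# Nonlinear PCA learning rule, W <- W + gamma * g(y) (z^T - g(y)^T W)
# with y = W z; tanh is the sigmoidal choice for sub-Gaussian sources.
W = np.linalg.qr(rng.standard_normal((2, 2)))[0]   # random orthogonal start
g = np.tanh
for epoch in range(5):
    gamma = 0.01 / (1 + epoch)                     # slowly decaying step size
    for t in range(n):
        z = Z[:, t:t + 1]
        gy = g(W @ z)
        W += gamma * gy @ (z.T - gy.T @ W)

# Asymptotically W rotates the whitened data toward statistically
# independent directions: each output matches one source up to sign
# and permutation, so the absolute correlation matrix is near-diagonal
# or near-antidiagonal.
Y = W @ Z
C = np.abs(Y @ S.T / n)
```

On whitened data the rule's fixed point is (approximately) orthogonal, so each row of `C` should contain one entry close to 1 and one close to 0, reflecting the rotation-to-independence result the paper proves.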
Pages: 25-45
Page count: 21