The MCA EXIN neuron for the minor component analysis

Cited by: 95
Authors
Cirrincione, G. [1]
Cirrincione, M. [2]
Hérault, J. [3]
Van Huffel, S. [4]
Affiliations
[1] Univ Picardie, CREA, F-80039 Amiens, France
[2] CNR, CERISEP, I-90128 Palermo, Italy
[3] LIS INPG, F-38000 Grenoble, France
[4] Katholieke Univ Leuven, SISTA, ESAT, B-3001 Heverlee, Belgium
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2002, Vol. 13, No. 1
Keywords
gradient flow; induction motor; linear neurons; minor component analysis (MCA); Rayleigh quotient (RQ); regression; total least squares
DOI
10.1109/72.977295
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Minor component analysis (MCA) deals with the recovery of the eigenvector associated with the smallest eigenvalue of the autocorrelation matrix of the input data, and is an important tool for signal processing and data analysis. It is almost exclusively solved by linear neurons. This paper presents a linear neuron endowed with a novel learning law, called MCA EXIN, and analyzes its features. The neural literature on MCA is sparse: little theoretical groundwork is given (almost always restricted to the asymptotic ODE approximation), and only experiments on toy problems (at most four-dimensional) are presented, without any numerical analysis. This work addresses these shortcomings and lays sound theoretical foundations for neural MCA theory. In particular, it classifies the MCA neurons according to the Riemannian metric and, from an analysis of the degeneracy of the error cost, justifies their different behavior in approaching convergence. The cost landscape is studied and used as a basis for the analysis of the asymptotic behavior. All phases of the dynamics of the MCA algorithms are investigated in detail and, together with the numerical analysis, lead to the identification of three possible kinds of divergence, here called sudden, dynamic, and numerical. The importance of choosing low initial conditions is also explained. Considerable weight is given to the experimental part, where simulations on high-dimensional problems are presented and analyzed. The orthogonal-regression or total least squares (TLS) technique is also presented, together with a real-world application to the identification of the parameters of an electrical machine. It can be concluded that MCA EXIN is the best MCA neuron in terms of stability (no finite-time divergence), speed, and accuracy.
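For illustration only (not part of the published record): the MCA setting above can be sketched as stochastic gradient descent on the Rayleigh quotient r(w) = (w^T C w) / (w^T w), whose minimizer over nonzero w is the minor component. The Python/NumPy sketch below uses an update of this Rayleigh-quotient-gradient type; the synthetic data, step-size schedule, and initial weight norm are illustrative assumptions, and the exact MCA EXIN law and its convergence analysis are given in the paper itself.

# Hedged sketch, not the authors' implementation: extract the minor
# component with a stochastic Rayleigh-quotient-gradient rule, then
# check the result against a direct eigendecomposition.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic autocorrelation matrix with a well-separated smallest
# eigenvalue (0.1 vs. 0.6), so the minor component is well defined.
n = 5
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # random orthogonal basis
C = Q @ np.diag([0.1, 0.6, 1.0, 1.5, 2.0]) @ Q.T

# Zero-mean samples x(k) whose autocorrelation matrix is C.
X = rng.multivariate_normal(np.zeros(n), C, size=50_000)

# Low initial conditions, as the abstract recommends (the norm 0.5 is
# an illustrative choice).
w = rng.standard_normal(n)
w *= 0.5 / np.linalg.norm(w)

for k, x in enumerate(X):
    alpha = 0.02 / (1.0 + k / 5_000.0)   # decaying step size (assumption)
    y = w @ x                            # neuron output y(k) = w^T x(k)
    n2 = w @ w                           # squared weight norm
    # Stochastic gradient step on the Rayleigh quotient
    # r(w) = (w^T C w) / (w^T w), with C estimated by x x^T:
    w = w - alpha * (y / n2) * (x - (y / n2) * w)

# The learned direction should match the eigenvector of the smallest
# eigenvalue of C (up to sign).
eigvals, eigvecs = np.linalg.eigh(C)     # eigenvalues in ascending order
cos = abs((w / np.linalg.norm(w)) @ eigvecs[:, 0])
print(f"|cos| with true minor component: {cos:.4f}")

Because this update is exactly orthogonal to w, the squared weight norm can only grow, and only at second order in the step size; such slow norm growth is consistent with the gradual (dynamic) kind of divergence the abstract distinguishes from sudden and numerical divergence.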
Pages: 160-187 (28 pages)
References
65 records in total
[1] [Anonymous]. Lecture Notes in Mathematics.
[2] [Anonymous]. Complex Systems, 1989.
[3] Baldi P, Hornik K. Neural networks and principal component analysis: learning from examples without local minima. Neural Networks, 1989, 2(1): 53-58.
[4] Barbarossa S, D'Addio E, Galati G. Comparison of optimum and linear prediction techniques for clutter cancellation. IEE Proceedings F: Radar and Signal Processing, 1987, 134(3): 277-282.
[5] Bishop C M. Neural Networks for Pattern Recognition. 1995.
[6] Chatelin F. Eigenvalues of Matrices. 1993.
[7] Chauvin Y. In: Proceedings of the International Joint Conference on Neural Networks (IJCNN), 1989: 373. DOI: 10.1109/IJCNN.1989.118611.
[8] Chen T P, Amari S I, Lin Q. A unified algorithm for principal and minor components extraction. Neural Networks, 1998, 11(3): 385-390.
[9] Cichocki A. Neural Networks for Optimization and Signal Processing. 1993.
[10] Cioffi J M. Limited-precision effects in adaptive filtering. IEEE Transactions on Circuits and Systems, 1987, 34(7): 821-833.