ADAPTIVE PRINCIPAL COMPONENT EXTRACTION (APEX) AND APPLICATIONS

Cited by: 105
Authors:
KUNG, SY [1]
DIAMANTARAS, KI [1]
TAUR, JS [1]
Affiliations:
[1] Siemens Corporate Research, Princeton, NJ 08540
DOI: 10.1109/78.295198
Chinese Library Classification: TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Subject Classification Codes: 0808; 0809
Abstract
In this paper we describe a neural network model (APEX) for multiple principal component extraction. All the synaptic weights of the model are trained with the normalized Hebbian learning rule. The network structure features a hierarchical set of lateral connections among the output units, which serve the purpose of weight orthogonalization. This structure also allows the size of the model to grow or shrink without the need to retrain the old units. The exponential convergence of the network is formally proved, and a significant performance improvement over previous methods is demonstrated. By establishing an important connection with the recursive least squares algorithm, we are able to provide the optimal value of the learning step-size parameter, which leads to a significant improvement in convergence speed. This is in contrast with previous neural PCA models, which lack such numerical advantages. The APEX algorithm is also parallelizable, allowing the concurrent extraction of multiple principal components. Furthermore, APEX is shown to be applicable to the constrained PCA problem, where the signal variance is maximized under external orthogonality constraints. We then study various principal component analysis (PCA) applications that might benefit from the adaptive solution offered by APEX. In particular, we discuss applications in spectral estimation, signal detection, and image compression and filtering, while other application domains are also briefly outlined.
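As a rough illustration of the learning scheme the abstract describes (Hebbian feedforward weights plus anti-Hebbian lateral connections that decorrelate each new output from the already-trained units), the following Python sketch extracts principal components sequentially. It is not the paper's exact algorithm: it assumes zero-mean data and uses a fixed small learning rate `beta` instead of the RLS-derived optimal step size discussed in the paper; the function name, default parameters, and test data are illustrative only.

```python
# Minimal sketch of APEX-style sequential principal component extraction.
# Assumptions: fixed learning rate `beta` (not the paper's RLS-optimal step),
# zero-mean data, and batch-style sweeps over a stored sample matrix.
import numpy as np

def apex_extract(X, n_components, beta=0.005, n_epochs=50, seed=0):
    """Return rows approximating the leading eigenvectors of cov(X)."""
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    W = np.zeros((n_components, n_features))        # feedforward weights of trained units
    for m in range(n_components):
        w = 0.1 * rng.standard_normal(n_features)   # new unit's feedforward weights
        c = np.zeros(m)                             # lateral weights to the m previous units
        for _ in range(n_epochs):
            for x in X:
                y_prev = W[:m] @ x                  # outputs of already-trained units
                y = w @ x - c @ y_prev              # new output, inhibited by lateral connections
                w += beta * (y * x - y * y * w)     # normalized Hebbian (Oja-style) update
                c += beta * (y * y_prev - y * y * c)  # anti-Hebbian update decorrelates outputs
        W[m] = w / np.linalg.norm(w)                # freeze the converged unit; grow the network
    return W

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((2000, 5)) @ rng.standard_normal((5, 5))
    X -= X.mean(axis=0)
    W = apex_extract(X, n_components=2)
    _, evecs = np.linalg.eigh(np.cov(X, rowvar=False))
    # Rows of W should align (up to sign) with the top covariance eigenvectors.
    print(np.abs(W @ evecs[:, ::-1][:, :2]))
```

Because previously trained units are frozen and each new unit only adds one set of feedforward and lateral weights, the sketch also reflects the abstract's point that the network can grow (or shrink) without retraining the old units.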
Pages: 1202-1217 (16 pages)