SVD algorithms: APEX-like versus subspace methods

Cited by: 10
Authors: Weingessel, A. [1]; Hornik, K. [1]
Affiliation: [1] Vienna University of Technology, Institute of Statistics and Probability Theory, A-1040 Vienna, Austria
Keywords: APEX algorithm; principal component analysis; singular value decomposition; subspace algorithm
DOI: 10.1023/A:1009642710601
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
We compare several new SVD learning algorithms, which are based on the subspace method of principal component analysis, with the APEX-like algorithm proposed by Diamantaras. Experiments show that these subspace-based algorithms converge as fast as the APEX-like algorithm.
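The record gives only the abstract, not the paper's update rules. As a rough illustration of the kind of online SVD learning rule being compared, the sketch below implements a generic Oja-style cross-coupled Hebbian update (in the spirit of references [8] and [9]) that estimates the leading singular vector pair of a cross-correlation matrix from streaming data; the data model, step size eta, and iteration count are assumptions for the example, and this is neither the paper's APEX-like algorithm nor its specific subspace algorithm.

    # Illustrative sketch only: Oja-style cross-coupled Hebbian rule for the
    # leading singular vectors of C = E[x y^T], compared against a direct SVD.
    import numpy as np

    rng = np.random.default_rng(0)

    # Paired data x (dim m) and y (dim n) with cross-correlation E[x y^T] = A.
    m, n = 8, 6
    A = rng.normal(size=(m, n))
    eta, n_steps = 0.005, 50000   # assumed step size and iteration count

    w = rng.normal(size=m); w /= np.linalg.norm(w)
    v = rng.normal(size=n); v /= np.linalg.norm(v)

    for _ in range(n_steps):
        y = rng.normal(size=n)                  # y ~ N(0, I)
        x = A @ y + 0.1 * rng.normal(size=m)    # x correlated with y through A
        a, b = w @ x, v @ y
        w += eta * b * (x - a * w)   # Oja-style normalized Hebbian updates;
        v += eta * a * (y - b * v)   # fixed points satisfy A v = sigma w, A^T w = sigma v

    # Compare with the leading singular vectors of A = E[x y^T];
    # both alignment values should be close to 1 after convergence.
    U, s, Vt = np.linalg.svd(A)
    print("alignment of w with u1:", abs(w @ U[:, 0]) / np.linalg.norm(w))
    print("alignment of v with v1:", abs(v @ Vt[0]) / np.linalg.norm(v))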
Pages: 177-184
Number of pages: 8
References (13 total; 10 listed below)
[1] Baldi, P. F.; Hornik, K. Learning in linear neural networks: a survey. IEEE Transactions on Neural Networks, 1995, 6(4): 837-858.
[2] Brockett, R. W. Dynamical systems that sort lists, diagonalize matrices, and solve linear programming problems. Linear Algebra and Its Applications, 1991, 146: 79-91.
[3] Diamantaras, K. I. Principal Component Neural Networks: Theory and Applications, 1996.
[4] Diamantaras, K. I. Thesis, Princeton University, 1992.
[5] Golub, G. H. Matrix Computations, 1989.
[6] Goodall, I. H., 1990, BIDDLE, V1990, P861.
[7] Kay, J. Proceedings of the International Joint Conference on Neural Networks, 1992, 4: 79.
[8] Oja, E. International Journal of Neural Systems, 1989, 1: 61. DOI: 10.1142/S0129065789000475.
[9] Oja, E. A simplified neuron model as a principal component analyzer. Journal of Mathematical Biology, 1982, 15(3): 267-273.
[10] Plumbley, M. D. Lyapunov functions for convergence of principal component algorithms. Neural Networks, 1995, 8(1): 11-23.