A constrained EM algorithm for principal component analysis

Cited by: 28
Authors
Ahn, JH [1]
Oh, JH [1]
Affiliations
[1] Pohang Univ Sci & Technol, Dept Phys, Pohang, South Korea
DOI
10.1162/089976603321043694
CLC classification
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
We propose a constrained EM algorithm for principal component analysis (PCA) using a coupled probability model derived from single-standard factor analysis models with isotropic noise structure. Single probabilistic PCA, especially in the noise-free case, can find only a set of vectors that are linear superpositions of the principal components and therefore requires postprocessing, such as the diagonalization of symmetric matrices. By contrast, the proposed algorithm finds the actual principal components, sorted in descending order of eigenvalue, with no additional calculation or postprocessing required. The method is easily applied to kernel PCA. It is also shown that the new EM algorithm can be derived from a generalized least-squares formulation.
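For context, the sketch below shows the standard, unconstrained EM iteration for PCA in the zero-noise limit that the abstract contrasts with: it converges only to a basis of the principal subspace, so the ordered principal components must be recovered afterwards by diagonalizing a small symmetric matrix. This is not the paper's constrained, coupled algorithm; the function name em_pca, the NumPy implementation, and the iteration settings are illustrative assumptions.

import numpy as np

def em_pca(X, k, n_iter=200, seed=0):
    # Unconstrained EM for PCA in the zero-noise limit (illustrative sketch).
    # X : (d, n) data matrix, assumed already centered column-wise.
    # k : number of components to extract.
    rng = np.random.default_rng(seed)
    d, n = X.shape
    W = rng.standard_normal((d, k))              # random initial loading matrix
    for _ in range(n_iter):
        # E-step: latent coordinates given the current loadings
        Z = np.linalg.solve(W.T @ W, W.T @ X)    # (k, n)
        # M-step: new loadings given the latent coordinates
        W = X @ Z.T @ np.linalg.inv(Z @ Z.T)     # (d, k)
    # Postprocessing that the unconstrained algorithm requires: orthonormalize
    # W and diagonalize the projected covariance to obtain principal components
    # sorted in descending order of eigenvalue.
    Q, _ = np.linalg.qr(W)
    C = Q.T @ (X @ X.T) @ Q / n
    vals, V = np.linalg.eigh(C)
    order = np.argsort(vals)[::-1]
    return Q @ V[:, order], vals[order]

# Example: extract two components from synthetic centered data.
X = np.random.default_rng(1).standard_normal((5, 1000))
X -= X.mean(axis=1, keepdims=True)
components, eigvals = em_pca(X, k=2)

The constrained algorithm described in the abstract is designed to make this final diagonalization step unnecessary.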
Pages: 57-65
Number of pages: 9