An expectation-maximization approach to nonlinear component analysis

Cited by: 39
Authors
Rosipal, R [1]
Girolami, M [1]
Affiliation
[1] Univ Paisley, Dept Comp & Informat Syst, Computat Intelligence Res Unit, Paisley PA1 2BE, Renfrew, Scotland
DOI: 10.1162/089976601300014439
CLC number: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
The proposal to treat nonlinear principal component analysis as a kernel eigenvalue problem has provided an extremely powerful method of extracting nonlinear features for a number of classification and regression applications. Although the use of Mercer kernels makes the computation of principal components in possibly infinite-dimensional feature spaces tractable, the attendant numerical problem of diagonalizing large matrices remains. In this contribution, we propose an expectation-maximization approach to performing kernel principal component analysis and show it to be a computationally efficient method, especially when the number of data points is large.
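To make the approach concrete, the following is a minimal NumPy sketch of EM-based kernel PCA, assuming a precomputed N x N kernel (Gram) matrix. It applies the zero-noise EM iteration for PCA of Roweis and Ghahramani (reference [3] below) in feature space, with the principal directions parameterized by dual coefficients so that K is never diagonalized in full. The function names (em_kpca, center_kernel), the fixed iteration count in place of a convergence test, and the closing Cholesky-plus-small-eigenproblem step that turns the converged subspace into ordered eigenvectors are illustrative choices, not the authors' published code.

    import numpy as np

    def center_kernel(K):
        # Center the implicitly mapped data in feature space:
        # Kc = K - 1N K - K 1N + 1N K 1N, where (1N)_ij = 1/N.
        N = K.shape[0]
        one = np.full((N, N), 1.0 / N)
        return K - one @ K - K @ one + one @ K @ one

    def em_kpca(K, q, n_iter=100, seed=0):
        # Every principal direction in feature space lies in the span of
        # the mapped points, so the basis is written W = Phi A with dual
        # coefficients A (N x q).  One sweep costs O(N^2 q), dominated by
        # the product K @ A, versus O(N^3) for diagonalizing K directly.
        N = K.shape[0]
        A = np.random.default_rng(seed).standard_normal((N, q))
        for _ in range(n_iter):
            KA = K @ A
            # E-step: latent coordinates Z = (A^T K A)^{-1} A^T K.
            Z = np.linalg.solve(A.T @ KA, KA.T)   # q x N
            # M-step (zero-noise limit): A <- Z^T (Z Z^T)^{-1}.
            A = np.linalg.solve(Z @ Z.T, Z).T     # N x q
        # A spans the leading q-dimensional subspace but its columns are
        # not eigenvectors; recover them via a small q x q eigenproblem.
        L = np.linalg.cholesky(A.T @ K @ A)
        B = np.linalg.solve(L, A.T).T             # (Phi B)^T (Phi B) = I
        KB = K @ B
        S = (KB.T @ KB) / N                       # covariance in the subspace
        vals, V = np.linalg.eigh(S)
        order = np.argsort(vals)[::-1]
        return vals[order], B @ V[:, order]       # eigenvalues, dual vectors

    # Usage with a Gaussian kernel; scores are the nonlinear components.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((500, 2))
    sq = (X ** 2).sum(axis=1)
    K = center_kernel(np.exp(-0.5 * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T)))
    vals, alphas = em_kpca(K, q=4)
    scores = K @ alphas

Because the loop only ever forms products of K with an N x q matrix, the per-iteration cost grows quadratically rather than cubically in the number of data points, which is the efficiency gain the abstract refers to.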
Pages: 505-510 (6 pages)
References (6)
  • [1] Jolliffe, I. T. (2002). Principal Component Analysis (2nd ed.). Springer.
  • [2] Rosipal, R. (2000). Technical Report 4, Department of Computing and Information Systems, University of Paisley.
  • [3] Roweis, S., & Ghahramani, Z. (1999). A unifying review of linear Gaussian models. Neural Computation, 11(2), 305-345.
  • [4] Schölkopf, B., Smola, A., & Müller, K.-R. (1998). Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 10(5), 1299-1319.
  • [5] Schölkopf, B. (1999). In Advances in Kernel Methods. MIT Press, p. 327.
  • [6] Tipping, M. E., & Bishop, C. M. (1999). Probabilistic principal component analysis. Journal of the Royal Statistical Society, Series B (Statistical Methodology), 61, 611-622.