An improved algorithm for kernel principal component analysis

Cited by: 73
Authors
Zheng, WM [1]
Zou, CR
Zhao, L
Affiliations
[1] SE Univ, Res Ctr Learning Sci, Nanjing 210096, Jiangsu, Peoples R China
[2] SE Univ, Engn Res Ctr Informat Proc & Applicat, Nanjing 210096, Jiangsu, Peoples R China
Keywords
eigenvalue decomposition; feature extraction; kernel principal component analysis;
DOI
10.1007/s11063-004-0036-x
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Kernel principal component analysis (KPCA), introduced by Scholkopf et al., is a nonlinear generalization of the popular principal component analysis (PCA) via the kernel trick. KPCA has been shown to be a very powerful approach to extracting nonlinear features for classification and regression applications. However, the standard KPCA algorithm (Scholkopf et al., 1998, Neural Computation 10, 1299-1319) may suffer from computational problems on large-scale data sets. To overcome these drawbacks, we propose an efficient training algorithm in this paper, and show that this approach is much more computationally efficient than previous KPCA algorithms.
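For context, the standard KPCA procedure from Scholkopf et al. (1998) that the abstract refers to can be sketched as follows. This is a minimal illustrative implementation, not the improved algorithm the paper proposes; the RBF kernel, the `gamma` parameter, and the function name `kpca` are assumptions for the example. The cost it highlights is the eigendecomposition of the full n-by-n kernel matrix, which is exactly what becomes expensive for large data sets.

```python
import numpy as np

def kpca(X, n_components=2, gamma=1.0):
    """Standard kernel PCA with an RBF kernel (illustrative sketch)."""
    # Pairwise squared distances and RBF kernel matrix K (n x n)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    K = np.exp(-gamma * d2)

    # Center the kernel matrix in feature space:
    # Kc = K - 1n K - K 1n + 1n K 1n, where 1n is the n x n matrix of 1/n
    n = K.shape[0]
    one_n = np.ones((n, n)) / n
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n

    # Eigendecomposition of the centered kernel matrix -- the O(n^3) step
    # that dominates the cost for large n. eigh returns ascending order.
    vals, vecs = np.linalg.eigh(Kc)
    vals, vecs = vals[::-1], vecs[:, ::-1]

    # Normalize eigenvectors so the feature-space principal axes have
    # unit length (divide by sqrt of the eigenvalue).
    alphas = vecs[:, :n_components] / np.sqrt(np.maximum(vals[:n_components], 1e-12))

    # Projections of the training points onto the principal components
    return Kc @ alphas
```

A usage sketch: `Z = kpca(X, n_components=2, gamma=0.5)` returns an array of shape `(n, 2)` holding the nonlinear principal-component scores of the rows of `X`.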
Pages: 49-56
Page count: 8
References
6 items
[1] Ham F.M., 2001, Principles of Neurocomputing for Science and Engineering.
[2] Jolliffe I.T., 1986, Principal Component Analysis. DOI 10.1007/0-387-22440-8.
[3] Rosipal R, Girolami M, Trejo LJ, Cichocki A. Kernel PCA for feature extraction and de-noising in nonlinear regression. Neural Computing & Applications, 2001, 10(3): 231-243.
[4] Rosipal R, Girolami M. An expectation-maximization approach to nonlinear component analysis. Neural Computation, 2001, 13(3): 505-510.
[5] Scholkopf B, Smola A, Muller KR. Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 1998, 10(5): 1299-1319.
[6] Zheng WM, Zhao L, Zou CR. Locally nearest neighbor classifiers for pattern classification. Pattern Recognition, 2004, 37(6): 1307-1309.