Incremental kernel principal component analysis

Cited by: 141
Authors
Chin, Tat-Jun [1]
Suter, David [2]
Institutions
[1] Inst Infocomm Res, Singapore 119613, Singapore
[2] Monash Univ, Dept Elect & Comp Syst Engn, Clayton, Vic 3800, Australia
Keywords
enabling online processing; incremental kernel principal component analysis (KPCA); reduced-set expansions; reducing time complexity;
DOI
10.1109/TIP.2007.896668
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification
081104 [Pattern Recognition and Intelligent Systems]; 0812 [Computer Science and Technology]; 0835 [Software Engineering]; 1405 [Intelligent Science and Technology];
Abstract
The kernel principal component analysis (KPCA) has been applied in numerous image-related machine learning applications and it has exhibited superior performance over previous approaches, such as PCA. However, the standard implementation of KPCA scales badly with the problem size, making computations for large problems infeasible. Also, the "batch" nature of the standard KPCA computation method does not allow for applications that require online processing. This has somewhat restricted the domains in which KPCA can potentially be applied. This paper introduces an incremental computation algorithm for KPCA to address these two problems. The basis of the proposed solution lies in computing incremental linear PCA in the kernel induced feature space, and constructing reduced-set expansions to maintain constant update speed and memory usage. We also provide experimental results which demonstrate the effectiveness of the approach.
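The batch bottleneck the abstract refers to can be seen in a minimal standard-KPCA sketch. This is an illustrative NumPy implementation of the well-known batch algorithm (RBF kernel; all names and parameters here are assumptions for illustration, not taken from the paper): it materializes the full n-by-n kernel matrix, so memory grows as O(n^2) and the eigendecomposition as O(n^3), which is exactly what the proposed incremental algorithm avoids.

```python
import numpy as np

def batch_kpca(X, n_components=2, gamma=1.0):
    """Standard batch KPCA with an RBF kernel (illustrative sketch).

    Builds the full n x n kernel matrix, so memory is O(n^2) and the
    eigendecomposition costs O(n^3) -- the scaling problem the paper's
    incremental algorithm is designed to address.
    """
    n = X.shape[0]
    # RBF kernel matrix: K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    # Center the data in the kernel-induced feature space:
    # K' = (I - 1_n) K (I - 1_n), where 1_n is the all-(1/n) matrix
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecompose and keep the leading components
    # (np.linalg.eigh returns eigenvalues in ascending order)
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    # Normalize so projections have unit-norm feature-space axes
    alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
    # Projections of the training points onto the kernel principal axes
    return Kc @ alphas

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
Z = batch_kpca(X, n_components=2)
print(Z.shape)  # (50, 2)
```

Adding even one new sample to this batch scheme forces a full recomputation of K and its eigendecomposition, which is why an incremental update with reduced-set expansions (as the abstract describes) is needed for online processing at constant cost.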
Pages: 1662-1674
Page count: 13