Kernel PCA for feature extraction and de-noising in nonlinear regression

Cited: 178
Authors
Rosipal, R
Girolami, M
Trejo, LJ
Cichocki, A
Affiliations
[1] NASA, Ames Res Ctr, Computat Sci Div, Moffett Field, CA 94035 USA
[2] Univ Paisley, Appl Computat Intelligence Res Unit, Sch Informat & Commun Technol, Paisley PA1 2BE, Renfrew, Scotland
[3] RIKEN, Lab Adv Brain Signal Proc, Brain Sci Inst, Wako, Saitama 35101, Japan
[4] Warsaw Univ Technol, Warsaw, Poland
Keywords
de-noising; feature extraction; human performance monitoring; kernel functions; nonlinear regression; principal components;
DOI
10.1007/s521-001-8051-z
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we propose the application of the Kernel Principal Component Analysis (PCA) technique for feature selection in a high-dimensional feature space, where input variables are mapped by a Gaussian kernel. The extracted features are employed in two regression problems: prediction of the chaotic Mackey-Glass time series in a noisy environment, and estimation of human signal detection performance from brain event-related potentials elicited by task-relevant signals. We compare results obtained using either Kernel PCA or linear PCA as the data preprocessing step. On the human signal detection task, we report the superiority of Kernel PCA feature extraction over linear PCA. As with linear PCA, we demonstrate de-noising of the original data through appropriate selection of nonlinear principal components. A theoretical relation and an experimental comparison of Kernel Principal Components Regression, Kernel Ridge Regression and epsilon-insensitive Support Vector Regression are also provided.
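The pipeline the abstract describes, Gaussian-kernel Kernel PCA for feature extraction followed by regression on the extracted components (Kernel Principal Components Regression), can be sketched as below. This is a minimal illustration, not the paper's method as published: the toy noisy-sine data, the kernel width `sigma=0.5`, and the choice of 8 components are assumptions made here in place of the Mackey-Glass and event-related-potential experiments.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def kernel_pca_features(X, n_components, sigma):
    """Scores of the training points on the leading nonlinear principal components."""
    K = gaussian_kernel(X, X, sigma)
    n = K.shape[0]
    # Center the kernel matrix, i.e. center the mapped data in feature space.
    J = np.ones((n, n)) / n
    Kc = K - J @ K - K @ J + J @ K @ J
    # Eigendecomposition of the symmetric centered kernel matrix.
    vals, vecs = np.linalg.eigh(Kc)           # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Scale expansion coefficients so feature-space eigenvectors have unit norm.
    alphas = vecs / np.sqrt(vals)
    return Kc @ alphas                         # nonlinear PC scores

# Toy regression stand-in (assumed data, not the paper's): a noisy sine.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.normal(size=200)

Z = kernel_pca_features(X, n_components=8, sigma=0.5)
A = np.c_[np.ones(len(Z)), Z]                  # intercept + kernel PCs
w, *_ = np.linalg.lstsq(A, y, rcond=None)      # ordinary least squares on the features
pred = A @ w
mse = float(np.mean((y - pred) ** 2))
```

Truncating the expansion to a few leading components is also the de-noising mechanism the abstract refers to: the discarded low-variance nonlinear components carry mostly noise.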
Pages: 231-243
Page count: 13