Feature vector selection and projection using kernels

Cited by: 83
Authors
Baudat, G [1 ]
Anouar, F [1 ]
Affiliations
[1] MEI, West Chester, PA 19380 USA
Keywords
kernel methods; feature space; data selection; principal component analysis; discriminant analysis
DOI
10.1016/S0925-2312(03)00429-6
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
This paper provides new insight into kernel methods through data selection. The kernel trick is used to select from the data a relevant subset that forms a basis in a feature space F; the selected vectors thus define a subspace of F. The data is then projected onto this subspace, where classical algorithms are applied. We show that kernel methods such as generalized discriminant analysis (Neural Comput. 12 (2000) 2385) or kernel principal component analysis (Neural Comput. 10 (1998) 1299) can be expressed more easily in this setting. Moreover, the size of the basis turns out to be related to the complexity of the model, so data selection provides complexity control and hence better generalization. The approach covers a wide range of algorithms. We investigate function approximation on real classification problems and on a regression problem. (C) 2003 Elsevier B.V. All rights reserved.
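The selection-and-projection scheme outlined in the abstract can be sketched in a few lines of code. The sketch below shows one plausible reading of it: a greedy search that adds, at each step, the sample whose feature-space image best helps reconstruct the rest of the data, followed by a projection of every point onto the span of the selected images. The RBF kernel, the function names (rbf_kernel, select_feature_vectors, project), the fitness threshold tol, and the greedy stopping rule are assumptions for illustration, not the paper's exact procedure.

```python
# A minimal sketch of greedy feature vector selection with an RBF kernel.
# Illustrative assumptions throughout; not the authors' implementation.
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gram matrix of k(x, y) = exp(-gamma * ||x - y||^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def select_feature_vectors(X, gamma=1.0, max_basis=20, tol=1e-3):
    """Greedily pick samples whose feature-space images span the data.

    Candidate fitness: the mean, over all samples x_i, of
        k_Si^T K_SS^{-1} k_Si / k(x_i, x_i),
    i.e. the normalized squared length of the projection of phi(x_i)
    onto span{phi(x_s) : s in S}.  A fitness of 1 means the selected
    basis reproduces every sample exactly in feature space.
    """
    K = rbf_kernel(X, X, gamma)
    diag = np.diag(K)
    n = len(X)
    selected, fitness = [], 0.0
    for _ in range(max_basis):
        best_j, best_fit = None, fitness
        for j in range(n):
            if j in selected:
                continue
            S = selected + [j]
            K_SS = K[np.ix_(S, S)]
            K_SX = K[S, :]  # kernel rows of the candidate basis
            # Least squares guards against a near-singular K_SS.
            coeff, *_ = np.linalg.lstsq(K_SS, K_SX, rcond=None)
            fit = np.mean((K_SX * coeff).sum(axis=0) / diag)
            if fit > best_fit:
                best_j, best_fit = j, fit
        if best_j is None or best_fit - fitness < tol:
            break  # basis already (nearly) spans the data in F
        selected.append(best_j)
        fitness = best_fit
    return selected

def project(X, X_basis, gamma=1.0):
    """Coordinates of each phi(x) projected onto the selected subspace."""
    K_SS = rbf_kernel(X_basis, X_basis, gamma)
    K_SX = rbf_kernel(X_basis, X, gamma)
    return np.linalg.lstsq(K_SS, K_SX, rcond=None)[0].T
```

Classical algorithms (PCA, discriminant analysis, regression) can then be run on project(X, X[selected]); the basis size plays the role of the complexity control mentioned in the abstract.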
Pages: 21-38
Page count: 18
References
24 records in total
[1]  
Aizerman M., 1964, AUTOMAT REM CONTR, V25, P821
[2]  
Albert A., 1972, Regression and the Moore-Penrose Pseudoinverse, DOI 10.1016/S0076-5392(08)X6167-3
[3]  
[Anonymous], Proceedings of the International Conference on Machine Learning
[4]  
[Anonymous], 1992, A Training Algorithm for Optimal Margin Classifiers
[5]
Anouar F, Badran F, Thiria S, 1998, Probabilistic self-organizing map and radial basis function networks, NEUROCOMPUTING, V20(1-3), P83-96
[6]
Baudat G, Anouar FE, 2000, Generalized discriminant analysis using a kernel approach, NEURAL COMPUTATION, V12(10), P2385-2404
[7]  
Baudat G, 2001, IEEE IJCNN, P1244, DOI 10.1109/IJCNN.2001.939539
[8]  
Burges C. J. C., 1998, A Tutorial on Support Vector Machines for Pattern Recognition
[9]  
Burges C. J. C., 1997, Improving the Accuracy and Speed of Support Vector Machines, V9
[10]  
Fukunaga K., 1990, Introduction to Statistical Pattern Recognition