On a kernel-based method for pattern recognition, regression, approximation, and operator inversion

Cited by: 167
Authors
Smola, AJ
Scholkopf, B
Affiliations
[1] GMD FIRST, D-12489 Berlin, Germany
[2] Max Planck Inst Biol Cybernet, D-72076 Tubingen, Germany
Keywords
kernels; support vector machines; regularization; inverse problems; regression; pattern recognition;
DOI
10.1007/PL00013831
Chinese Library Classification
TP31 [Computer software]
Discipline codes
081202; 0835
Abstract
We present a kernel-based framework for pattern recognition, regression estimation, function approximation, and multiple operator inversion. Adopting a regularization-theoretic viewpoint, each of these problems is formulated as a constrained optimization problem. Previous approaches such as ridge regression, support vector methods, and regularization networks are included as special cases. We show connections between the cost function and properties that were previously believed to hold only for support vector machines. For appropriately chosen cost functions, the optimal solution of all the problems described above can be found by solving a simple quadratic programming problem.
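As a rough illustration of one of the special cases named in the abstract (a sketch for orientation, not code from the paper itself): with a squared-error cost and quadratic regularizer, the framework reduces to kernel ridge regression, whose optimum is found by solving a single regularized linear system rather than a general quadratic program. All function and variable names below are illustrative.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian RBF kernel matrix between row-sample arrays X and Z.
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def kernel_ridge_fit(X, y, lam=1e-2, gamma=1.0):
    # Squared-error cost + quadratic regularizer: the optimal expansion
    # coefficients alpha solve (K + lam * I) alpha = y.
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(X_train, alpha, X_new, gamma=1.0):
    # Prediction is a kernel expansion over the training points.
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Fit a noisy sine curve.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(40)
alpha = kernel_ridge_fit(X, y)
pred = kernel_ridge_predict(X, alpha, X)
print(np.max(np.abs(pred - y)))  # small training residual
```

Support vector cost functions (e.g. the epsilon-insensitive loss) replace this linear solve with a genuine quadratic program over the same expansion coefficients.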
Pages: 211-231
Number of pages: 21