Least squares support vector machine classifiers

Cited by: 9311
Authors
Suykens, JAK [1 ]
Vandewalle, J [1 ]
Affiliations
[1] Katholieke Univ Leuven, Dept Elect Engn, ESAT, SISTA, B-3001 Heverlee, Belgium
Keywords
classification; support vector machines; linear least squares; radial basis function kernel
DOI
10.1023/A:1018628609742
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
In this letter we discuss a least squares version of support vector machine (SVM) classifiers. Due to equality-type constraints in the formulation, the solution follows from solving a set of linear equations, instead of the quadratic programming required for classical SVMs. The approach is illustrated on a two-spiral benchmark classification problem.
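The abstract's central point, that equality constraints reduce training to a linear system rather than a quadratic program, can be sketched as follows. This is a minimal illustration of the least squares SVM idea with an RBF kernel, not the authors' code; the function names, the regularization parameter `gamma`, and the kernel width `sigma` are assumptions chosen for the sketch. Training solves the KKT system whose matrix combines the label vector and the regularized kernel matrix:

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian RBF kernel matrix: K[i, j] = exp(-||x1_i - x2_j||^2 / sigma^2)
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / sigma**2)

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    # LS-SVM training = solving one set of linear equations (the KKT system):
    #   [ 0      y^T            ] [ b     ]   [ 0 ]
    #   [ y   Omega + I / gamma ] [ alpha ] = [ 1 ]
    # with Omega[i, j] = y_i * y_j * K(x_i, x_j).  No quadratic programming.
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # alpha, b

def lssvm_predict(X_train, y_train, alpha, b, X_new, sigma=1.0):
    # Decision function: sign( sum_i alpha_i y_i K(x, x_i) + b )
    K = rbf_kernel(X_new, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)
```

Note that, unlike a classical SVM, every training point receives a (generally nonzero) coefficient `alpha_i`, so sparseness is lost in exchange for the simpler linear-system solve.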
Pages: 293-300
Page count: 8
Related papers
12 records in total
  • [1] Bishop, C. M., 1995, Neural Networks for Pattern Recognition
  • [2] Cherkassky, V. S., 1998, Learning from Data: Concepts, Theory, and Methods, 1st ed.
  • [3] Fletcher, R., 1981, Practical Methods of Optimization
  • [4] Golub, G. H., 1989, Matrix Computations
  • [5] Haykin, S., 1998, Neural Networks: A Comprehensive Foundation
  • [6] Ridella, S., Rovetta, S., Zunino, R., Circular backpropagation networks for classification, IEEE Transactions on Neural Networks, 1997, 8(1): 84-97
  • [7] Saunders, C., 1998, Ridge regression learning algorithm in dual variables
  • [8] Scholkopf, B., Sung, K. K., Burges, C. J. C., Girosi, F., Niyogi, P., Poggio, T., Vapnik, V., Comparing support vector machines with Gaussian kernels to radial basis function classifiers, IEEE Transactions on Signal Processing, 1997, 45(11): 2758-2765
  • [9] Vapnik, V., 1998, Nonlinear Modeling, p. 55
  • [10] Vapnik, V., 1999, The Nature of Statistical Learning Theory