SMO algorithm for least-squares SVM formulations

Cited: 164
Authors
Keerthi, SS [1]
Shevade, SK
Affiliations
[1] Natl Univ Singapore, Dept Mech Engn, Singapore 117576, Singapore
[2] Natl Univ Singapore, Genome Inst Singapore, Singapore 117528, Singapore
DOI
10.1162/089976603762553013
CLC classification
TP18 [Artificial intelligence theory];
Discipline codes
081104 [Pattern Recognition and Intelligent Systems]; 0812 [Computer Science and Technology]; 0835 [Software Engineering]; 1405 [Intelligent Science and Technology];
Abstract
This article extends the well-known SMO algorithm of support vector machines (SVMs) to least-squares SVM formulations that include LS-SVM classification, kernel ridge regression, and a particular form of regularized kernel Fisher discriminant. The algorithm is shown to be asymptotically convergent. It is also extremely easy to implement. Computational experiments show that the algorithm is fast and scales efficiently (quadratically) as a function of the number of examples.
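For the unconstrained least-squares formulations covered by the article (e.g., kernel ridge regression without a bias term), the SMO idea reduces to one-coordinate-at-a-time descent on a quadratic dual, always updating the coefficient with the largest gradient magnitude (the maximal KKT violator). The sketch below illustrates that scheme; it is a minimal illustration of the idea, not the authors' exact pseudocode, and the function name `smo_krr` and its parameters are ours.

```python
import numpy as np

def smo_krr(K, y, lam=1.0, tol=1e-6, max_iter=10000):
    """SMO-style coordinate descent for kernel ridge regression:
    minimize 0.5 * a^T (K + lam*I) a - y^T a over the coefficients a.
    Each step exactly minimizes the objective along the single
    coordinate with the largest gradient magnitude (the analogue of
    SMO's maximal-KKT-violator selection)."""
    n = len(y)
    a = np.zeros(n)
    g = -y.astype(float).copy()        # gradient (K + lam*I) a - y at a = 0
    diag = np.diag(K) + lam
    for _ in range(max_iter):
        i = int(np.argmax(np.abs(g)))  # most-violating coordinate
        if abs(g[i]) < tol:
            break                      # approximate stationarity reached
        delta = -g[i] / diag[i]        # exact 1-D minimizer along coordinate i
        a[i] += delta
        g += delta * K[:, i]           # incremental gradient update...
        g[i] += delta * lam            # ...including the ridge term
    return a
```

Because the dual is a strictly convex quadratic, each such update strictly decreases the objective, which is the core of the asymptotic convergence argument; the full algorithm in the article also handles the equality constraint that a bias term introduces by updating two coefficients jointly.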
Pages: 487-507
Page count: 21