Exact simplification of support vector solutions

Cited by: 86
Authors
Downs, T [1]
Gates, KE
Masters, A
Affiliations
[1] Univ Queensland, Sch Informat Technol & Elect Engn, Brisbane, Qld 4072, Australia
[2] Univ Queensland, Dept Math, Brisbane, Qld 4072, Australia
Keywords
support vector machines; kernel methods
DOI
10.1162/15324430260185637
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology]
Discipline Classification Code
0812
Abstract
This paper demonstrates that standard algorithms for training support vector machines generally produce solutions with a greater number of support vectors than are strictly necessary. An algorithm is presented that allows unnecessary support vectors to be recognized and eliminated while leaving the solution otherwise unchanged. The algorithm is applied to a variety of benchmark data sets (for both classification and regression) and in most cases the procedure leads to a reduction in the number of support vectors. In some cases the reduction is substantial.
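The simplification described in the abstract rests on the observation that if the kernel-space image of one support vector is a linear combination of the images of the others, that vector can be dropped and its coefficient redistributed over the remaining ones without changing the decision function. The sketch below is a minimal NumPy illustration of that idea, not the authors' published procedure (which works on the reduced row echelon form of the kernel matrix); the function name, the least-squares projection test, and the tolerance are assumptions made for the example.

```python
import numpy as np

def simplify_support_vectors(K, beta, tol=1e-10):
    """Illustrative sketch: remove support vectors whose kernel-space images
    are linearly dependent on the others, folding their coefficients into the
    remaining vectors so the decision function is unchanged.

    K    : (n, n) kernel matrix over the current support vectors
    beta : (n,)   signed coefficients alpha_i * y_i
    Returns (keep_idx, new_beta) with new_beta aligned to keep_idx.
    """
    n = K.shape[0]
    keep = []                              # indices judged linearly independent
    new_beta = beta.astype(float).copy()

    for k in range(n):
        if not keep:
            keep.append(k)
            continue
        # Best coefficients c expressing Phi(x_k) over the kept vectors,
        # obtained from the kernel matrix alone (least-squares projection).
        A = K[np.ix_(keep, keep)]
        b = K[keep, k]
        c, *_ = np.linalg.lstsq(A, b, rcond=None)
        # Squared feature-space residual ||Phi(x_k) - sum_i c_i Phi(x_i)||^2.
        resid = K[k, k] - 2 * (c @ b) + c @ A @ c
        if resid < tol:
            # Dependent: redistribute beta_k and drop support vector k.
            new_beta[keep] += c * new_beta[k]
            new_beta[k] = 0.0
        else:
            keep.append(k)

    return np.array(keep), new_beta[keep]

if __name__ == "__main__":
    # Toy check: a duplicated point gives a trivially dependent support vector.
    X = np.array([[0.0, 1.0], [1.0, 0.0], [0.0, 1.0]])  # third row repeats the first
    K = X @ X.T                                          # linear kernel
    beta = np.array([0.5, -0.3, 0.2])
    idx, b2 = simplify_support_vectors(K, beta)
    # Decision values on the training points are preserved exactly.
    assert np.allclose(K @ beta, K[:, idx] @ b2)
```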
Pages: 293-297
Page count: 5