The analysis of decomposition methods for support vector machines

Cited by: 110
Authors
Chang, CC [1]
Hsu, CW [1]
Lin, CJ [1]
Affiliation
[1] Natl Taiwan Univ, Dept Comp Sci & Informat Engn, Taipei 106, Taiwan
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2000, Vol. 11, No. 4
Keywords
decomposition methods; projected gradients; support vector machines;
DOI
10.1109/72.857780
Chinese Library Classification: TP18 [Artificial Intelligence Theory];
Discipline codes: 081104; 0812; 0835; 1405;
Abstract
The support vector machine (SVM) is a new and promising technique for pattern recognition. It requires the solution of a large, dense quadratic programming problem, and traditional optimization methods cannot be applied directly because of memory restrictions. To date, few methods can cope with this memory problem; an important one is the "decomposition method," for which, however, no convergence proof has been given. In this paper, we connect this method to projected gradient methods and provide theoretical convergence proofs for a version of the decomposition method. An extension to the bound-constrained formulation of SVM is also provided. We then show that this convergence proof remains valid for general decomposition methods whose working-set selection meets a simple requirement.
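The decomposition idea the abstract describes can be illustrated with a minimal sketch. The fragment below is a hypothetical simplified variant, not the paper's algorithm: it solves only the bound-constrained SVM dual min ½aᵀQa − Σaᵢ subject to 0 ≤ aᵢ ≤ C (dropping the equality constraint yᵀa = 0 that the general formulation carries), picks the working set from the largest projected-gradient violations, and solves each small subproblem by projected gradient steps. All function and parameter names are illustrative.

```python
import numpy as np

def decompose_qp(Q, C, q=2, outer_iters=200, tol=1e-6):
    """Decomposition-method sketch for the bound-constrained SVM dual:
        min 1/2 a^T Q a - sum(a),  subject to  0 <= a_i <= C.
    Each outer iteration fixes most variables and optimizes only a small
    working set B (size q), chosen by projected-gradient violation."""
    n = Q.shape[0]
    a = np.zeros(n)
    for _ in range(outer_iters):
        grad = Q @ a - np.ones(n)                      # gradient of the dual objective
        # Projected gradient: zero out components blocked by an active bound.
        pg = np.where(a <= 0.0, np.minimum(grad, 0.0),
             np.where(a >= C,   np.maximum(grad, 0.0), grad))
        if np.max(np.abs(pg)) < tol:                   # approximate KKT point
            break
        B = np.argsort(-np.abs(pg))[:q]                # most-violating working set
        # Solve the sub-QP over B with projected gradient descent;
        # 1/L is a safe step size (L bounds the sub-Hessian's largest eigenvalue).
        L = np.linalg.eigvalsh(Q[np.ix_(B, B)]).max() + 1e-12
        for _ in range(100):
            gB = Q[B] @ a - 1.0                        # gradient restricted to B
            a[B] = np.clip(a[B] - gB / L, 0.0, C)      # step, then project onto [0, C]
    return a
```

For a separable diagonal case such as Q = 2I the objective decouples into aᵢ² − aᵢ per coordinate, so the iterate converges to aᵢ = 0.5, a few working sets at a time, which makes the mechanism easy to check by hand.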
Pages: 1003-1008
Page count: 6