Recursive finite Newton algorithm for support vector regression in the primal

Cited by: 36
Authors
Bo, Liefeng [1 ]
Wang, Ling [1 ]
Jiao, Licheng [1 ]
Affiliation
[1] Xidian Univ, Inst Intelligent Informat Proc, Xian 710071, Peoples R China
DOI
10.1162/neco.2007.19.4.1082
CLC number: TP18 [Artificial intelligence theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
Several algorithms for training support vector machines in the primal have recently been proposed. Following those studies, this letter develops a recursive finite Newton algorithm (IHLF-SVR-RFN) for training nonlinear support vector regression. The insensitive Huber loss function and the computation of the Newton step are discussed in detail. Comparisons with LIBSVM 2.82 show that the proposed algorithm gives promising results.
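The abstract's key ingredient, an insensitive Huber loss, can be sketched as follows. This is a minimal illustration of one common parameterization (zero inside an eps-tube, quadratic in a middle band, linear beyond a threshold delta); the parameter names `eps` and `delta` are assumptions, and the paper's exact formulation may differ.

```python
import numpy as np

def insensitive_huber_loss(r, eps=0.1, delta=1.0):
    """Sketch of an epsilon-insensitive Huber loss (hypothetical
    parameterization, not necessarily the paper's exact form):
      0                      for |r| <= eps
      (|r| - eps)^2          for eps < |r| <= delta
      linear continuation    for |r| > delta
    The linear tail matches the quadratic piece's value and slope
    at |r| = delta, so the loss is continuously differentiable,
    which is what makes a (generalized) Newton step applicable.
    """
    a = np.abs(np.asarray(r, dtype=float))
    loss = np.zeros_like(a)
    mid = (a > eps) & (a <= delta)
    out = a > delta
    loss[mid] = (a[mid] - eps) ** 2
    # tail: slope 2*(delta - eps), value (delta - eps)^2 at |r| = delta
    loss[out] = 2.0 * (delta - eps) * (a[out] - delta) + (delta - eps) ** 2
    return loss
```

Because the loss is piecewise quadratic/linear, its generalized Hessian is piecewise constant, which is what a finite Newton method exploits.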
Pages: 1082-1096 (15 pages)