Incremental Support Vector Learning for Ordinal Regression

Cited by: 655
Authors
Gu, Bin [1 ,2 ,3 ]
Sheng, Victor S. [4 ]
Tay, Keng Yeow [5 ]
Romano, Walter [6 ]
Li, Shuo [3 ,7 ]
Affiliations
[1] Nanjing Univ Informat Sci & Technol, Jiangsu Engn Ctr Network Monitoring, Nanjing 210044, Jiangsu, Peoples R China
[2] Nanjing Univ Informat Sci & Technol, Sch Comp & Software, Nanjing 210044, Jiangsu, Peoples R China
[3] Univ Western Ontario, Dept Med Biophys, London, ON N6A 3K7, Canada
[4] Univ Cent Arkansas, Dept Comp Sci, Conway, AR 72035 USA
[5] Victoria Hosp, London Hlth Sci Ctr, London, ON N6A 5W9, Canada
[6] St Josephs Hlth Care, London, ON M6R 1B5, Canada
[7] GE HealthCare, London, ON, Canada
Funding
U.S. National Science Foundation; National Natural Science Foundation of China
Keywords
Incremental learning; online learning; ordinal regression (OR); support vector machine (SVM); MACHINE; CONVERGENCE; ALGORITHMS;
DOI
10.1109/TNNLS.2014.2342533
CLC Classification
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Support vector ordinal regression (SVOR) is a popular method for tackling ordinal regression problems. However, no effective algorithm has so far been proposed for incremental SVOR learning, owing to the complicated formulations of SVOR. Recently, an accurate on-line algorithm was proposed for training nu-support vector classification (nu-SVC), which can handle a quadratic formulation with a pair of equality constraints. In this paper, we first present a modified SVOR formulation based on a sum-of-margins strategy. The formulation has multiple constraints, each of which mixes an equality and an inequality. We then extend the accurate on-line nu-SVC algorithm to this modified formulation and obtain an effective incremental SVOR algorithm that can handle a quadratic formulation with multiple such constraints. More importantly, the algorithm resolves the conflicts between the equality and inequality constraints. We also provide a finite convergence analysis for the algorithm. Numerical experiments on several benchmark and real-world data sets show that the incremental algorithm converges to the optimal solution in a finite number of steps and is faster than existing batch and incremental SVOR algorithms, while the modified formulation is more accurate than the existing incremental SVOR algorithm and as accurate as the sum-of-margins formulation of Shashua and Levin.
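The abstract names the ingredients of the modified formulation but does not state the formulation itself. For orientation, the sum-of-margins SVOR program of Shashua and Levin that it builds on can be sketched as follows; this is a minimal reconstruction in assumed notation (w for the weight vector, a_j and b_j for the lower and upper thresholds of the j-th margin, C for the penalty parameter, xi for slacks), not the paper's modified formulation:

\[
\begin{aligned}
\min_{w,\,a,\,b,\,\xi,\,\xi^{*}} \quad & -\sum_{j=1}^{r-1} \bigl(b_{j} - a_{j}\bigr) \;+\; C \sum_{j=1}^{r-1} \sum_{i} \bigl(\xi_{i}^{j} + \xi_{i}^{*\,j+1}\bigr) \\
\text{s.t.} \quad & \lVert w \rVert^{2} \le 1, \qquad a_{j} \le b_{j}, \\
& w^{\top} x_{i}^{j} \le a_{j} + \xi_{i}^{j}, \qquad \xi_{i}^{j} \ge 0, \\
& w^{\top} x_{i}^{\,j+1} \ge b_{j} - \xi_{i}^{*\,j+1}, \qquad \xi_{i}^{*\,j+1} \ge 0, \qquad j = 1, \dots, r-1,
\end{aligned}
\]

where x_i^j denotes the i-th training example of rank j among r ordered ranks. Minimizing the first term maximizes the sum of the r-1 margins b_j - a_j. In the incremental setting described in the abstract, a newly arriving example enters the program as additional constraints of this kind, and the proposed algorithm adjusts the current solution rather than retraining from scratch.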
Pages: 1403-1416
Number of pages: 14
References (33 in total)
[1] Agarwal S, 2008, LECT NOTES ARTIF INT, V5254, P7, DOI 10.1007/978-3-540-87987-9_6
[2] Amit Y, 2008, J MACH LEARN RES, V9, P1399
[3] Scholkopf B, Smola AJ, 2001, Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
[4] [Anonymous], 2010, UCI Machine Learning Repository
[5] [Anonymous], ADV NEURAL INFORM PR
[6] [Anonymous], 1964, THEORY MATRICES NUME
[7] Boyd S, 2004, CONVEX OPTIMIZATION
[8] Cardoso JS, 2007, J MACH LEARN RES, V8, P1393
[9] Cauwenberghs G, 2001, ADV NEUR IN, V13, P409
[10] Chu W, Keerthi SS, 2007, Support vector ordinal regression, NEURAL COMPUTATION, V19, P792-815