A Coordinate Descent Algorithm for Non-smooth Losses

Cited by: 2
Authors
吴卫邦 [1]
朱烨雷 [2]
陶卿 [2]
Institutions
[1] 镇江船艇学院船艇指挥系
[2] 陆军军官学院五系
Keywords
Machine learning; Optimization; Coordinate descent; Non-smooth loss; Hinge
DOI
N/A
CLC number
TP301.6 [Algorithm theory];
Subject classification code
Abstract
A new coordinate descent algorithm is proposed for non-smooth loss problems, in which the analytical solution of each one-dimensional subproblem is obtained by a sorting-based search. The time complexity of the algorithm is analyzed, and three practical techniques for speeding up convergence are given. Experiments show that the algorithm performs well on regularized Hinge loss problems and achieves the expected results.
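The abstract describes solving each one-dimensional coordinate subproblem exactly by a sorting-based search. Below is a minimal Python sketch of that general idea for the L2-regularized Hinge loss; the function names (`cd_hinge`, `_min_1d`), the regularization parameter `lam`, and the cyclic update order are illustrative assumptions, not the paper's exact procedure, and none of its three acceleration techniques are included.

```python
import numpy as np

def _min_1d(b, c, lam, n):
    """Exactly minimize g(t) = (lam/2)*t**2 + (1/n)*sum_i max(0, c_i - b_i*t)
    by sorting the breakpoints t_i = c_i/b_i of the piecewise-linear part and
    scanning the intervals until the (nondecreasing) slope crosses zero."""
    keep = b != 0                       # terms with b_i = 0 are constant in t
    b, c = b[keep], c[keep]
    if b.size == 0:
        return 0.0
    bp = c / b                          # breakpoint where hinge term i switches on/off
    s = b[b > 0].sum()                  # sum of b_i over terms active as t -> -inf
    prev = -np.inf
    for k in np.argsort(bp):
        t_c = s / (n * lam)             # stationary point of the quadratic on this interval
        if t_c <= bp[k]:
            return max(t_c, prev)       # clamp to the left kink if the slope is already >= 0
        s -= abs(b[k])                  # crossing bp[k]: term k enters/leaves the active set
        prev = bp[k]
    return max(s / (n * lam), prev)     # minimizer lies in the last interval

def cd_hinge(X, y, lam=0.1, epochs=20):
    """Cyclic coordinate descent for (lam/2)*||w||^2 + (1/n)*sum_i max(0, 1 - y_i*x_i.w)."""
    n, d = X.shape
    w = np.zeros(d)
    z = X @ w                           # cached predictions x_i . w
    for _ in range(epochs):
        for j in range(d):
            b = y * X[:, j]
            c = 1.0 - y * z + b * w[j]  # hinge argument as a function of w_j = t: c - b*t
            t = _min_1d(b, c, lam, n)
            z += X[:, j] * (t - w[j])   # keep cached predictions consistent
            w[j] = t
    return w
```

In this sketch each coordinate update costs O(n log n) because of the sort over the n breakpoints of the piecewise-linear Hinge part, which is consistent with the sorting-based search mentioned in the abstract.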
Pages: 3688-3692, 3700
Number of pages: 6