PATHWISE COORDINATE OPTIMIZATION

Cited: 1265
Authors
Friedman, Jerome [1 ]
Hastie, Trevor [1 ,2 ]
Hoefling, Holger [1 ]
Tibshirani, Robert [1 ,2 ]
Affiliations
[1] Stanford Univ, Dept Stat, Stanford, CA 94305 USA
[2] Stanford Univ, Dept Hlth Res & Policy, Stanford, CA 94305 USA
Keywords
Coordinate descent; lasso; convex optimization;
DOI
10.1214/07-AOAS131
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Classification Codes
020208; 070103; 0714;
Abstract
We consider "one-at-a-time" coordinate-wise descent algorithms for a class of convex optimization problems. An algorithm of this kind has been proposed for the L1-penalized regression (lasso) in the literature, but it seems to have been largely ignored. Indeed, it seems that coordinate-wise algorithms are not often used in convex optimization. We show that this algorithm is very competitive with the well-known LARS (or homotopy) procedure in large lasso problems, and that it can be applied to related methods such as the garotte and elastic net. It turns out that coordinate-wise descent does not work in the "fused lasso," however, so we derive a generalized algorithm that yields the solution in much less time than a standard convex optimizer. Finally, we generalize the procedure to the two-dimensional fused lasso, and demonstrate its performance on some image smoothing problems.
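As a concrete illustration of the abstract's core idea (a sketch, not the authors' implementation), cyclic coordinate descent for the lasso updates one coefficient at a time, and each univariate subproblem has a closed-form solution given by soft-thresholding. A minimal NumPy version for a single penalty value, assuming the objective (1/(2n))·||y − Xβ||² + λ·||β||₁:

```python
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator: sign(z) * max(|z| - gamma, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_coordinate_descent(X, y, lam, n_iter=100):
    """Minimize (1/(2n))||y - X beta||^2 + lam * ||beta||_1 by cycling
    through coordinates; each one-dimensional problem is solved exactly
    by a soft-thresholding step."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature x_j'x_j / n
    for _ in range(n_iter):
        for j in range(p):
            # partial residual that excludes coordinate j's current contribution
            r_j = y - X @ beta + X[:, j] * beta[j]
            z = X[:, j] @ r_j / n
            beta[j] = soft_threshold(z, lam) / col_sq[j]
    return beta
```

In the pathwise strategy the abstract refers to, this inner solver would be rerun over a decreasing grid of λ values, warm-starting each fit from the previous solution; the single-λ loop above is the building block.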
Pages: 302 / 332
Page count: 31