A Homotopy Algorithm for the Quantile Regression Lasso and Related Piecewise Linear Problems

Cited by: 11
Authors
Osborne, M. R. [1 ]
Turlach, B. A. [2 ]
Affiliations
[1] Australian Natl Univ, Inst Math Sci, Canberra, ACT 0200, Australia
[2] Univ Western Australia, Sch Math & Stat, Crawley, WA 6009, Australia
Keywords
Complete solution path algorithm; Homotopy algorithm; LASSO; Piecewise linear loss; Quantile regression; Variable selection; Regularization
DOI
10.1198/jcgs.2011.09184
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
We show that the homotopy algorithm of Osborne, Presnell, and Turlach (2000), which has proved an effective optimal path-following method for implementing Tibshirani's "lasso" for variable selection in least squares estimation problems, can be extended to polyhedral objectives, for example the quantile regression lasso. The new algorithm has the novel feature that it requires two homotopy sequences, performed consecutively, with continuation steps taken with respect to both the constraint bound and the Lagrange multiplier. Performance is illustrated by application to several standard datasets, and these results are compared to calculations made with the original lasso homotopy program. This permits an assessment of the computational complexity of both the new method and the closely related linear programming post-optimality procedures, as these generate essentially identical solution trajectories. The comparison strongly favors the least squares selection method. However, the new method still provides an effective computational procedure, and it has distinct implementation advantages over the linear programming approaches to the polyhedral objective problem. The source of the computational difficulty is explained, and the problem that must be resolved to improve performance is identified. An online supplement to the article contains proofs and R code implementing the algorithm.
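To make the formulation concrete: the quantile regression lasso minimizes the piecewise linear "check" loss subject to an L1 bound on the coefficients. The R sketch below (R being the language of the article's supplement) states the problem and evaluates its polyhedral objective. The call to quantreg::rq() with method = "lasso" is an illustrative stand-in for a single point on the solution path, not the authors' homotopy algorithm, and the data are simulated.

```r
## A minimal sketch, not the authors' homotopy algorithm. The quantile
## regression lasso minimizes the piecewise linear check loss subject to
## an L1 bound:
##     min_beta  sum_i rho_tau(y_i - x_i' beta)   s.t.  sum_j |beta_j| <= kappa
## where rho_tau(u) = u * (tau - 1{u < 0}). The penalized (Lagrangian) form
## with multiplier lambda traces the same path as the bound kappa varies.

rho <- function(u, tau) u * (tau - (u < 0))   # check (pinball) loss

## Value of the polyhedral objective at a candidate coefficient vector
qr_lasso_objective <- function(beta, X, y, tau) {
  sum(rho(y - X %*% beta, tau))
}

## Stand-in for one point on the path: quantreg's rq() offers a
## lasso-penalized quantile fit via method = "lasso" (package assumed
## installed; this is not the path-following code from the supplement).
library(quantreg)
set.seed(1)
n <- 50; p <- 5
X <- matrix(rnorm(n * p), n, p)
y <- X[, 1] - 2 * X[, 2] + rnorm(n)
fit <- rq(y ~ X, tau = 0.5, method = "lasso", lambda = 1)
coef(fit)
## Evaluate the check-loss objective at the fit (intercept included)
qr_lasso_objective(coef(fit), cbind(1, X), y, tau = 0.5)
```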
Pages: 972 - 987
Number of pages: 16
Related Papers
21 items in total
  • [1] Simultaneous regression shrinkage, variable selection, and supervised clustering of predictors with OSCAR
    Bondell, Howard D.
    Reich, Brian J.
    [J]. BIOMETRICS, 2008, 64 (01) : 115 - 123
  • [2] The Dantzig selector: Statistical estimation when p is much larger than n
    Candès, Emmanuel
    Tao, Terence
    [J]. ANNALS OF STATISTICS, 2007, 35 (06) : 2313 - 2351, DOI 10.1214/009053606000001523
  • [3] Fast Solution of l1-Norm Minimization Problems When the Solution May Be Sparse
    Donoho, David L.
    Tsaig, Yaakov
    [J]. IEEE TRANSACTIONS ON INFORMATION THEORY, 2008, 54 (11) : 4789 - 4812
  • [4] Draper N. R., Smith H., 1998, Applied Regression Analysis, 3rd ed., DOI 10.1002/9781118625590.CH15
  • [5] The entire regularization path for the support vector machine
    Hastie, Trevor
    Rosset, Saharon
    Tibshirani, Robert
    Zhu, Ji
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2004, 5 : 1391 - 1415
  • [6] Hettmansperger T. P., McKean J. W., 1998, Robust Nonparametric Statistical Methods
  • [7] Solving l1 Regularization Problems With Piecewise Linear Losses
    Kato, Kengo
    [J]. JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2010, 19 (04) : 1024 - 1040
  • [8] Koenker R., 2005, Quantile Regression, Econometric Society Monographs, Cambridge University Press, DOI 10.1017/CBO9780511754098
  • [9] L1-norm quantile regression
    Li, Youjuan
    Zhu, Ji
    [J]. JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2008, 17 (01) : 163 - 185
  • [10] An Improved Algorithm for the Solution of the Regularization Path of Support Vector Machine
    Ong, Chong-Jin
    Shao, Shiyun
    Yang, Jianbo
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2010, 21 (03) : 451 - 462