Unified LASSO estimation by least squares approximation

Cited by: 244
Authors
Wang, Hansheng [1]
Leng, Chenlei [2]
Affiliations
[1] Peking Univ, Grad Sch Management, Beijing 100871, Peoples R China
[2] Natl Univ Singapore, Dept Stat & Appl Probabil, Singapore 117548, Singapore
Keywords
adaptive LASSO; Bayes information criterion; LASSO; least angle regression; least squares approximation; microarray data; oracle property; solution path;
DOI
10.1198/016214507000000509
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject classification codes
020208 ; 070103 ; 0714 ;
Abstract
We propose a method of least squares approximation (LSA) for unified yet simple LASSO estimation. Our general theoretical framework includes ordinary least squares, generalized linear models, quantile regression, and many others as special cases. Specifically, LSA can transform many different types of LASSO objective functions into asymptotically equivalent least squares problems. Thereafter, standard asymptotic theory can be established and the LARS algorithm can be applied. In particular, if the adaptive LASSO penalty and a Bayes information criterion-type tuning parameter selector are used, the resulting LSA estimator can be as efficient as the oracle. Extensive numerical studies confirm our theory.
Pages: 1039-1048 (10 pages)
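The core idea in the abstract can be sketched in a few lines: given an unpenalized estimator theta_hat (e.g., an MLE) and an estimate sigma_hat of its covariance, the original LASSO objective is replaced by the asymptotically equivalent quadratic form (theta - theta_hat)' sigma_hat^{-1} (theta - theta_hat) plus an adaptive L1 penalty, which is an ordinary penalized least squares problem. The function name, the coordinate-descent solver, and the toy inputs below are illustrative assumptions on my part — the paper itself solves this problem with the LARS algorithm.

```python
import numpy as np

def lsa_adaptive_lasso(theta_hat, sigma_hat, lam, n_iter=200):
    """Least squares approximation (LSA) with an adaptive LASSO penalty.

    Minimizes, by coordinate descent on the equivalent least squares problem,
        (theta - theta_hat)' sigma_hat^{-1} (theta - theta_hat)
        + lam * sum_j |theta_j| / |theta_hat_j|.
    Illustrative sketch only; assumes all entries of theta_hat are nonzero
    (otherwise the adaptive weights are undefined).
    """
    # Factor sigma_hat^{-1} = L L' so the quadratic form becomes a
    # least squares problem ||y - X theta||^2 with X = L', y = L' theta_hat.
    L = np.linalg.cholesky(np.linalg.inv(sigma_hat))
    X = L.T
    y = X @ theta_hat
    w = 1.0 / np.abs(theta_hat)      # adaptive weights from the unpenalized fit
    theta = theta_hat.copy()
    for _ in range(n_iter):
        for j in range(len(theta)):
            # Partial residual with coordinate j removed.
            r_j = y - X @ theta + X[:, j] * theta[j]
            rho = X[:, j] @ r_j
            z = X[:, j] @ X[:, j]
            # Soft-thresholding update for the weighted L1 penalty.
            theta[j] = np.sign(rho) * max(abs(rho) - lam * w[j] / 2, 0.0) / z
    return theta

# Toy example: with sigma_hat = I the coordinates decouple, so the small
# coefficient 0.05 is thresholded to zero while the large ones shrink slightly.
theta_hat = np.array([3.0, 0.05, -2.0])
est = lsa_adaptive_lasso(theta_hat, np.eye(3), lam=0.5)
```

Because the weights penalize small preliminary estimates heavily, `est[1]` is set exactly to zero while `est[0]` and `est[2]` remain close to their unpenalized values — the sparsity behavior the oracle property formalizes.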