SIMULTANEOUS ANALYSIS OF LASSO AND DANTZIG SELECTOR

Cited by: 1398
Authors
Bickel, Peter J. [1 ]
Ritov, Ya'acov [2 ]
Tsybakov, Alexandre B. [3 ,4 ]
Affiliations
[1] Univ Calif Berkeley, Dept Stat, Berkeley, CA 94720 USA
[2] Hebrew Univ Jerusalem, Fac Social Sci, Dept Stat, IL-91904 Jerusalem, Israel
[3] CREST, Stat Lab, F-92240 Malakoff, France
[4] Univ Paris 06, CNRS, UMR 1599, LPMA, F-75252 Paris 05, France
Keywords
Linear models; model selection; nonparametric statistics; STATISTICAL ESTIMATION; VARIABLE SELECTION; LEAST-SQUARES; REGRESSION; AGGREGATION; SPARSITY; LARGER
DOI
10.1214/08-AOS620
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject classification codes
020208; 070103; 0714
Abstract
We show that, under a sparsity scenario, the Lasso estimator and the Dantzig selector exhibit similar behavior. For both methods, we derive, in parallel, oracle inequalities for the prediction risk in the general nonparametric regression model, as well as bounds on the ℓ_p estimation loss for 1 <= p <= 2 in the linear model when the number of variables can be much larger than the sample size.
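For context, a minimal sketch of the two estimators in their standard form (assumed textbook notation with design matrix X, response y, sample size n, and tuning parameter r > 0; the paper's own normalization may differ):

% Lasso: l1-penalized least squares
\hat{\beta}^{L} = \arg\min_{\beta \in \mathbb{R}^{p}}
  \Big\{ \tfrac{1}{n}\,\| y - X\beta \|_{2}^{2} + 2r\,\|\beta\|_{1} \Big\}

% Dantzig selector: minimum l1 norm subject to a bound on the
% correlation of the residuals with the columns of X
\hat{\beta}^{D} = \arg\min_{\beta \in \mathbb{R}^{p}}
  \Big\{ \|\beta\|_{1} : \tfrac{1}{n}\,\| X^{\top}(y - X\beta) \|_{\infty} \le r \Big\}

Both programs are convex; the bounds referred to in the abstract control quantities such as ||β̂ − β*||_p for 1 <= p <= 2 under suitable conditions on the design matrix X.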
Pages: 1705-1732
Page count: 28
References (27 in total)
[1] [Anonymous] (2004). P STAT COMP SEC.
[2] Bickel, P.J. (2007). Annals of Statistics 35:2352. DOI 10.1214/009053607000000424.
[3] Bunea, F. (2004). Preprint 948, LPMA, Université Paris 6, Paris.
[4] Bunea, F., Tsybakov, A. and Wegkamp, M. (2007). Sparsity oracle inequalities for the Lasso. Electronic Journal of Statistics 1:169-194.
[5] Bunea, F., Tsybakov, A.B. and Wegkamp, M.H. (2007). Sparse density estimation with ℓ1 penalties. Learning Theory, Proceedings, 4539:530+.
[6] Bunea, F., Tsybakov, A.B. and Wegkamp, M.H. (2007). Aggregation for Gaussian regression. Annals of Statistics 35(4):1674-1697.
[7] Bunea, F., Tsybakov, A.B. and Wegkamp, M.H. (2006). Aggregation and sparsity via ℓ1 penalized least squares. Learning Theory, Proceedings, 4005:379-391.
[8] Candes, E. (2007). Annals of Statistics 35:2313. DOI 10.1214/009053606000001523.
[9] Donoho, D.L., Elad, M. and Temlyakov, V.N. (2006). Stable recovery of sparse overcomplete representations in the presence of noise. IEEE Transactions on Information Theory 52(1):6-18.
[10] Efron, B., Hastie, T., Johnstone, I. and Tibshirani, R. (2004). Least angle regression - Rejoinder. Annals of Statistics 32(2):494-499.