LOCAL LINEAR-REGRESSION SMOOTHERS AND THEIR MINIMAX EFFICIENCIES

Cited by: 586
Authors
FAN, JQ
Institution
Keywords
LOCAL LINEAR SMOOTHERS; HARDEST ONE-DIMENSIONAL SUBPROBLEM; MINIMAX RISK; MODULUS OF CONTINUITY; NONPARAMETRIC REGRESSION;
DOI
10.1214/aos/1176349022
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline codes
020208; 070103; 0714
Abstract
In this paper we introduce a smooth version of local linear regression estimators and address their advantages. The MSE and MISE of the estimators are computed explicitly. It turns out that the local linear regression smoothers have nice sampling properties and high minimax efficiency: they are not only efficient in rates but also nearly efficient in constant factors. In the nonparametric regression context, the asymptotic minimax lower bound is developed via the heuristic of the "hardest one-dimensional subproblem" of Donoho and Liu. Connections of the minimax risk with the modulus of continuity are made. The lower bound is also applicable to estimating the conditional mean (regression) and conditional quantiles for both fixed-design and random-design regression problems.
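For orientation, a local linear regression smoother estimates m(x0) = E[Y | X = x0] by fitting a kernel-weighted least-squares line centered at x0 and taking the fitted intercept. The sketch below is not from the paper: the Gaussian kernel, the bandwidth h, and the name local_linear_smoother are illustrative assumptions, and it shows only the generic estimator rather than the smooth version or the minimax analysis studied in the article.

```python
import numpy as np

def local_linear_smoother(x_eval, x, y, h):
    """Generic local linear regression estimate of E[Y | X = x0].

    At each evaluation point x0, fit a weighted least-squares line with
    kernel weights K((X_i - x0) / h) and return the fitted intercept.
    """
    x_eval = np.atleast_1d(np.asarray(x_eval, dtype=float))
    fitted = np.empty_like(x_eval)
    for j, x0 in enumerate(x_eval):
        u = (x - x0) / h
        w = np.exp(-0.5 * u**2)                  # Gaussian kernel weights (illustrative choice)
        X = np.column_stack([np.ones_like(x), x - x0])
        Xw = X * w[:, None]                      # apply the kernel weights row-wise
        # Weighted least squares: beta = (X'WX)^{-1} X'Wy; the intercept is m_hat(x0)
        beta = np.linalg.solve(Xw.T @ X, Xw.T @ y)
        fitted[j] = beta[0]
    return fitted

# Usage example on a noisy sine curve (hypothetical data)
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 2.0 * np.pi, 200))
y = np.sin(x) + 0.3 * rng.standard_normal(x.size)
grid = np.linspace(0.5, 2.0 * np.pi - 0.5, 50)
m_hat = local_linear_smoother(grid, x, y, h=0.4)
```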
Pages: 196-216
Number of pages: 21