Model selection for Gaussian regression with random design

Cited by: 14
Authors
Birgé, L. [1]
Affiliation
[1] Univ Paris 06, CNRS, Lab Probabil & Modeles Aleatoires, UMR 7599, F-75252 Paris 05, France
Keywords
Besov spaces; Hellinger distance; minimax risk; model selection; random design regression;
DOI
10.3150/bj/1106314849
Chinese Library Classification: O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline codes: 020208; 070103; 0714
Abstract
This paper is concerned with Gaussian regression with random design, where the observations are independent and identically distributed. It is known from work by Le Cam that the rate of convergence of optimal estimators is closely connected to the metric structure of the parameter space with respect to the Hellinger distance. In particular, this metric structure essentially determines the risk when the loss function is a power of the Hellinger distance. For random design regression, one typically uses as loss function the squared L2-distance between the estimator and the parameter. If the parameter space is bounded with respect to the L∞-norm, the two distances are equivalent. Without this assumption, there may be a large distortion between them, resulting in some unusual rates of convergence for the squared L2-risk, as noticed by Baraud. We explain this phenomenon and then show that using the Hellinger distance instead of the L2-distance allows us to recover the usual rates and to carry out model selection in great generality. An extension to the L2-risk is given under a boundedness assumption similar to that given by Wegkamp and by Yang.
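As a reading aid for the abstract's claim that the Hellinger and L2 distances are equivalent under an L∞ bound, here is a minimal sketch of the standard comparison for Gaussian regression with random design. The notation (Y = f(X) + ξ with ξ ~ N(0, σ²), design distribution μ, regression functions f and g) is generic and illustrative, not quoted from the paper itself.

```latex
% Sketch under the assumed model Y = f(X) + \xi, \xi \sim N(0,\sigma^2),
% X \sim \mu independent of \xi (generic notation, not taken from the paper).
% The Hellinger affinity between the joint laws P_f and P_g factorizes over
% the Gaussian noise, giving
\[
  h^2(P_f, P_g)
  = 1 - \mathbb{E}_{X \sim \mu}\!\left[ \exp\!\left( -\frac{(f(X)-g(X))^2}{8\sigma^2} \right) \right].
\]
% Since 1 - e^{-x} \le x for x \ge 0, the Hellinger distance is always
% dominated by the L2(\mu)-distance:
\[
  h^2(P_f, P_g) \;\le\; \frac{\|f-g\|_{L_2(\mu)}^2}{8\sigma^2}.
\]
% Conversely, if \|f-g\|_\infty \le B, then the exponent stays in
% [0, B^2/(8\sigma^2)], where 1 - e^{-x} \ge \bigl(1 - e^{-B^2/(8\sigma^2)}\bigr)
% \cdot 8\sigma^2 x / B^2 by concavity, so
\[
  h^2(P_f, P_g) \;\ge\; \frac{1 - e^{-B^2/(8\sigma^2)}}{B^2}\,\|f-g\|_{L_2(\mu)}^2,
\]
% i.e. the two squared distances are equivalent under an L-infinity bound.
% Without such a bound only the upper inequality holds, which is why the
% squared L2-risk can behave very differently from the Hellinger risk.
```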
Pages: 1039-1051
Number of pages: 13