Adapting to unknown smoothness via wavelet shrinkage

Cited by: 3049
Authors
Donoho, DL
Johnstone, IM
Affiliation
[1] Department of Statistics, Stanford University, CA
Funding
US National Science Foundation; US National Institutes of Health; US National Aeronautics and Space Administration;
Keywords
Besov; Hölder; Sobolev; Triebel spaces; compactly supported wavelets; denoising; James-Stein estimator; minimax decision theory; nonparametric regression; nonlinear estimation; orthonormal bases; Stein unbiased risk estimate; thresholding; white noise model;
DOI
10.1080/01621459.1995.10476626
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208 ; 070103 ; 0714 ;
Abstract
We attempt to recover a function of unknown smoothness from noisy sampled data. We introduce a procedure, SureShrink, that suppresses noise by thresholding the empirical wavelet coefficients. The thresholding is adaptive: A threshold level is assigned to each dyadic resolution level by the principle of minimizing the Stein unbiased estimate of risk (SURE) for threshold estimates. The computational effort of the overall procedure is of order N log(N) as a function of the sample size N. SureShrink is smoothness adaptive: If the unknown function contains jumps, then the reconstruction (essentially) does also; if the unknown function has a smooth piece, then the reconstruction is (essentially) as smooth as the mother wavelet will allow. The procedure is in a sense optimally smoothness adaptive: It is near minimax simultaneously over a whole interval of the Besov scale; the size of this interval depends on the choice of mother wavelet. We know from a previous paper by the authors that traditional smoothing methods (kernels, splines, and orthogonal series estimates), even with optimal choices of the smoothing parameter, would be unable to perform in a near-minimax way over many spaces in the Besov scale. Examples of SureShrink are given. The advantages of the method are particularly evident when the underlying function has jump discontinuities on a smooth background.
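
For readers who want the level-by-level thresholding rule in concrete form, the following is a minimal Python sketch (not the authors' code) of the SURE-based threshold choice at a single dyadic resolution level. It assumes the empirical wavelet coefficients have already been rescaled to unit noise variance, and it omits the paper's hybrid switch to a fixed threshold at very sparse levels; the names sure_threshold and soft_threshold are illustrative.

import numpy as np

def sure_threshold(d):
    # Pick a soft-threshold level for one dyadic resolution level by
    # minimizing Stein's unbiased risk estimate (SURE); coefficients d
    # are assumed rescaled to unit noise variance.
    n = d.size
    abs_d = np.abs(d)
    # The minimizer of SURE lies at 0 or at one of the |d_i|; candidates
    # are capped at the universal threshold sqrt(2 log n).
    candidates = np.concatenate(([0.0], np.sort(abs_d)))
    candidates = np.minimum(candidates, np.sqrt(2.0 * np.log(n)))
    risks = [n - 2.0 * np.sum(abs_d <= t) + np.sum(np.minimum(d**2, t**2))
             for t in candidates]
    return candidates[int(np.argmin(risks))]

def soft_threshold(d, t):
    # Soft thresholding: kill coefficients below t, shrink the rest toward zero.
    return np.sign(d) * np.maximum(np.abs(d) - t, 0.0)

# Toy usage on one level of coefficients: a sparse signal plus unit-variance noise.
rng = np.random.default_rng(0)
true_coeffs = np.zeros(256)
true_coeffs[:8] = 5.0
noisy = true_coeffs + rng.standard_normal(256)
t = sure_threshold(noisy)
denoised = soft_threshold(noisy, t)

Applied to each resolution level of an orthonormal wavelet transform, this per-level threshold choice is what gives the overall O(N log N) cost quoted in the abstract, since the transform itself is O(N) and the candidate search at each level is dominated by a sort.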
Pages: 1200-1224
Number of pages: 25