Uniformly improving the Cramer-Rao bound and maximum-likelihood estimation

Cited: 48
Author
Eldar, Yonina C. [1]
Affiliation
[1] Technion Israel Inst Technol, Dept Elect Engn, IL-32000 Haifa, Israel
Funding
Israel Science Foundation;
Keywords
biased estimation; Cramer-Rao bound; dominating estimators; maximum likelihood; mean-squared error (MSE) bounds; minimax bounds;
DOI
10.1109/TSP.2006.877648
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Classification Codes
0808; 0809;
Abstract
An important aspect of estimation theory is characterizing the best achievable performance in a given estimation problem, as well as determining estimators that achieve the optimal performance. The traditional Cramer-Rao-type bounds provide benchmarks on the variance of any estimator of a deterministic parameter vector under suitable regularity conditions, while requiring a priori specification of a desired bias gradient. In applications, it is often not clear how to choose the required bias. A direct measure of the estimation error that takes both the variance and the bias into account is the mean squared error (MSE), which is the sum of the variance and the squared norm of the bias. Here, we develop bounds on the MSE in estimating a deterministic parameter vector x(0) over all bias vectors that are linear in x(0), which includes traditional unbiased estimation as a special case. In some settings, it is possible to minimize the MSE over all linear bias vectors. More generally, direct minimization is not possible since the optimal solution depends on the unknown x(0). Nonetheless, we show that in many cases, we can find bias vectors that result in an MSE bound that is smaller than the Cramer-Rao lower bound (CRLB) for all values of x(0). Furthermore, we explicitly construct estimators that achieve these bounds in cases where an efficient estimator exists, by performing a simple linear transformation on the standard maximum likelihood (ML) estimator. This leads to estimators that result in a smaller MSE than the ML approach for all possible values of x(0).
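The central idea of the abstract, that MSE decomposes into variance plus squared bias and that a biased estimator can have uniformly smaller MSE than ML, can be illustrated with a small Monte Carlo sketch. The code below is a toy example, not the paper's construction: it estimates a Gaussian mean and compares the ML estimator against the classical James-Stein shrinkage estimator (a well-known estimator that dominates ML in MSE for dimension n >= 3); the true parameter `x0`, noise level `sigma`, and trial count are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma, trials = 10, 1.0, 20000
x0 = np.full(n, 0.5)  # true (unknown-in-practice) parameter; arbitrary choice

# Observations y ~ N(x0, sigma^2 I); one row per trial.
y = x0 + sigma * rng.standard_normal((trials, n))

# ML estimator of a Gaussian mean is the observation itself.
ml = y

# Positive-part James-Stein shrinkage: a biased estimator that dominates
# ML in MSE for n >= 3 (illustrative only; not the paper's linear-bias method).
shrink = np.maximum(0.0, 1.0 - (n - 2) * sigma**2
                    / np.sum(y**2, axis=1, keepdims=True))
js = shrink * y

def mse(est):
    """Empirical MSE = average squared error over trials."""
    return np.mean(np.sum((est - x0) ** 2, axis=1))

def bias_var(est):
    """Empirical squared bias norm and total variance of an estimator."""
    b = est.mean(axis=0) - x0
    v = est.var(axis=0).sum()
    return np.sum(b**2), v

for name, est in [("ML", ml), ("James-Stein", js)]:
    b2, v = bias_var(est)
    print(f"{name}: MSE = {mse(est):.3f}  (= bias^2 {b2:.3f} + var {v:.3f})")
```

The printed decomposition makes the trade-off concrete: the ML estimator is unbiased with MSE equal to n·sigma^2 (the CRLB here), while the shrinkage estimator accepts a small bias in exchange for a larger reduction in variance, yielding a strictly smaller MSE.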
Pages: 2943-2956
Page count: 14