Maximum Likelihood Estimation via the ECM Algorithm: A General Framework

Cited: 505
Authors
MENG, XL [1 ]
RUBIN, DB [1 ]
Affiliations
[1] HARVARD UNIV,DEPT STAT,CAMBRIDGE,MA 02138
Keywords
BAYESIAN INFERENCE; CONDITIONAL MAXIMIZATION; CONSTRAINED OPTIMIZATION; EM ALGORITHM; GIBBS SAMPLER; INCOMPLETE DATA; ITERATED CONDITIONAL MODES; ITERATIVE PROPORTIONAL FITTING; MISSING DATA;
DOI
10.1093/biomet/80.2.267
CLC Classification Number
Q [Biological Sciences];
Discipline Classification Codes
07; 0710; 09;
Abstract
Two major reasons for the popularity of the EM algorithm are that its maximization step involves only complete-data maximum likelihood estimation, which is often computationally simple, and that its convergence is stable, with each iteration increasing the likelihood. When complete-data maximum likelihood estimation is itself complicated, EM loses this appeal because the M-step becomes computationally burdensome. In many cases, however, complete-data maximum likelihood estimation is relatively simple conditional on some function of the parameters being estimated. We introduce a class of generalized EM algorithms, which we call the ECM algorithm, for Expectation/Conditional Maximization (CM), that takes advantage of the simplicity of complete-data conditional maximum likelihood estimation by replacing a complicated M-step of EM with several computationally simpler CM-steps. We show that the ECM algorithm shares all the appealing convergence properties of EM, such as always increasing the likelihood, and present several illustrative examples.
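The abstract's structure (one E-step followed by a sequence of CM-steps, each maximizing the expected complete-data log-likelihood over one parameter with the others held fixed) can be illustrated with a toy sketch. This example is not from the paper: it fits the mean and variance of a univariate normal with missing observations, splitting the M-step into two conditional maximizations (over mu with sigma2 fixed, then over sigma2 with the updated mu fixed). Here both CM-steps have closed forms, so the iteration coincides with plain EM, but the mechanics are those of ECM.

```python
# Toy ECM sketch (illustrative assumption, not the paper's examples):
# normal data with values missing completely at random (None = missing).
data = [4.1, 3.8, None, 5.2, 4.6, None, 3.9, 4.4]

def ecm_normal(data, n_iter=50):
    obs = [x for x in data if x is not None]
    n, n_mis = len(data), data.count(None)
    mu, sigma2 = sum(obs) / len(obs), 1.0  # crude starting values
    for _ in range(n_iter):
        # E-step: expected complete-data sufficient statistics
        # given the current (mu, sigma2).
        s1 = sum(obs) + n_mis * mu                                 # E[sum x_i]
        s2 = sum(x * x for x in obs) + n_mis * (mu * mu + sigma2)  # E[sum x_i^2]
        # CM-step 1: maximize Q over mu, holding sigma2 fixed.
        mu = s1 / n
        # CM-step 2: maximize the same Q over sigma2,
        # holding mu fixed at its just-updated value.
        sigma2 = (s2 - 2 * mu * s1 + n * mu * mu) / n
    return mu, sigma2

mu_hat, sigma2_hat = ecm_normal(data)
```

As the abstract notes for ECM in general, each iteration cannot decrease the observed-data likelihood; in this toy case the iterates converge to the complete-case maximum likelihood estimates.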
Pages: 267-278 (12 pages)