A Markov Chain Monte Carlo version of the genetic algorithm differential evolution: easy Bayesian computing for real parameter spaces

Cited by: 763
Authors
Ter Braak, Cajo J. F. [1]
Affiliation
[1] Univ Wageningen & Res Ctr, NL-6700 AC Wageningen, Netherlands
Keywords
block updating; evolutionary Monte Carlo; Metropolis algorithm; population Markov Chain Monte Carlo; simulated annealing; simulated tempering; theophylline kinetics
DOI
10.1007/s11222-006-8769-1
Chinese Library Classification (CLC)
TP301 [Theory and methods]
Subject classification code
081202 [Computer software and theory]
Abstract
Differential Evolution (DE) is a simple genetic algorithm for numerical optimization in real parameter spaces. In a statistical context one would not just want the optimum but also its uncertainty. The uncertainty distribution can be obtained by a Bayesian analysis (after specifying prior and likelihood) using Markov Chain Monte Carlo (MCMC) simulation. This paper integrates the essential ideas of DE and MCMC, resulting in Differential Evolution Markov Chain (DE-MC). DE-MC is a population MCMC algorithm, in which multiple chains are run in parallel. DE-MC solves an important problem in MCMC, namely that of choosing an appropriate scale and orientation for the jumping distribution. In DE-MC the jumps are simply a fixed multiple of the difference of two random parameter vectors that are currently in the population. The selection process of DE-MC works via the usual Metropolis ratio, which defines the probability with which a proposal is accepted. In tests with known uncertainty distributions, the efficiency of DE-MC with respect to random walk Metropolis with optimal multivariate Normal jumps ranged from 68% for small population sizes to 100% for large population sizes and even to 500% for the 97.5% point of a variable from a 50-dimensional Student distribution. Two Bayesian examples illustrate the potential of DE-MC in practice. DE-MC is shown to facilitate multidimensional updates in a multi-chain "Metropolis-within-Gibbs" sampling approach. The advantages of DE-MC over conventional MCMC are simplicity, speed of calculation and convergence, even for nearly collinear parameters and multimodal densities.
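The sampler described in the abstract is compact enough to sketch directly. The following is a minimal, hedged illustration of the DE-MC idea (not the paper's reference implementation): each of N parallel chains proposes a jump equal to a fixed multiple γ of the difference between two other randomly chosen chains, plus a small perturbation, and accepts with the usual Metropolis ratio. The default γ = 2.38/√(2d) and the small noise term b follow the paper's recommendation; the function names and signature here are illustrative assumptions.

```python
import numpy as np

def de_mc(log_post, init_pop, n_iter, gamma=None, b=1e-4, rng=None):
    """Sketch of Differential Evolution Markov Chain (DE-MC).

    log_post : function returning the log posterior density of a parameter vector
    init_pop : (N, d) array of N initial chain states in d dimensions
    Returns an (n_iter, N, d) array of sampled population states.
    """
    rng = np.random.default_rng() if rng is None else rng
    pop = np.array(init_pop, dtype=float)        # population of N chains
    N, d = pop.shape
    if gamma is None:
        gamma = 2.38 / np.sqrt(2 * d)            # recommended default scale
    logp = np.array([log_post(x) for x in pop])
    samples = np.empty((n_iter, N, d))
    for t in range(n_iter):
        for i in range(N):
            # pick two distinct chains, both different from chain i
            r1, r2 = rng.choice([j for j in range(N) if j != i],
                                size=2, replace=False)
            # proposal: fixed multiple of a difference vector, plus small noise
            prop = pop[i] + gamma * (pop[r1] - pop[r2]) + b * rng.standard_normal(d)
            logp_prop = log_post(prop)
            # Metropolis acceptance step
            if np.log(rng.uniform()) < logp_prop - logp[i]:
                pop[i], logp[i] = prop, logp_prop
        samples[t] = pop
    return samples
```

For example, sampling a standard bivariate Normal could use `de_mc(lambda x: -0.5 * x @ x, rng.standard_normal((10, 2)), 2000)`. Because the jump scale and orientation come from the current population itself, the proposal automatically adapts to correlated (nearly collinear) parameters, which is the key advantage the abstract highlights.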
Pages: 239-249
Number of pages: 11