Bayesian estimation via sequential Monte Carlo sampling: Unconstrained nonlinear dynamic systems

Cited by: 54
Authors
Chen, WS
Bakshi, BR
Goel, PK
Ungarala, S
Affiliations
[1] Ohio State Univ, Dept Chem Engn, Columbus, OH 43210 USA
[2] Ohio State Univ, Dept Stat, Columbus, OH 43210 USA
[3] Cleveland State Univ, Dept Chem Engn, Cleveland, OH 44115 USA
DOI
10.1021/ie034010v
Chinese Library Classification
TQ [Chemical Industry]
Discipline Classification Code
0817
Abstract
Precise estimation of state variables and model parameters is essential for efficient process operation. The Bayesian formulation of the estimation problem offers a general solution for all types of systems. Although the theory of Bayesian estimation of nonlinear dynamic systems has been available for four decades, practical implementation has been infeasible because of computational and methodological challenges. Consequently, most existing methods rely on simplifying assumptions to obtain a tractable but approximate solution. For example, extended Kalman filtering linearizes the process model and assumes Gaussian prior and noise distributions. Moving-horizon-based least-squares estimation also assumes a Gaussian or other fixed-shape prior and noise to obtain a least-squares optimization problem; this approach can impose constraints but is nonrecursive and requires computationally expensive nonlinear or quadratic programming. This paper introduces sequential Monte Carlo sampling for Bayesian estimation of chemical process systems. This recent approach approximates the computationally expensive integration by recursive Monte Carlo sampling and can obtain posterior distributions accurately and efficiently with minimal assumptions. It has not previously been compared with estimation methods popular for chemical processes, including moving-horizon estimation. In addition to comparing various methods, this paper develops a novel empirical Bayes approach to deal with practical challenges due to degeneracy and a poor initial guess. Several case studies demonstrate that the proposed approach is more computationally efficient than, and at least as accurate as, moving-horizon-based least-squares estimation.
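The recursive sampling idea summarized in the abstract can be illustrated with a minimal bootstrap (sequential importance resampling) particle filter. The sketch below is an assumption-laden illustration, not the paper's algorithm: it assumes a generic scalar nonlinear state-space model with additive Gaussian process and measurement noise, and the benchmark system in the usage section is a common illustrative example, not a case study from the paper.

```python
import numpy as np

def bootstrap_particle_filter(y, f, h, Q, R, x0_particles, rng=None):
    """Minimal bootstrap particle filter sketch (illustrative, not the paper's method).

    Model assumed: x_k = f(x_{k-1}) + w_k,  y_k = h(x_k) + v_k,
    with w_k ~ N(0, Q) and v_k ~ N(0, R). Returns posterior-mean estimates of x_k.
    """
    rng = np.random.default_rng() if rng is None else rng
    particles = np.asarray(x0_particles, dtype=float)  # samples from the prior p(x_0)
    n = particles.size
    estimates = []
    for yk in y:
        # Propagate each particle through the process model: sample from p(x_k | x_{k-1})
        particles = f(particles) + rng.normal(0.0, np.sqrt(Q), size=n)
        # Weight particles by the measurement likelihood p(y_k | x_k) (Gaussian)
        w = np.exp(-0.5 * (yk - h(particles)) ** 2 / R) + 1e-300  # guard against all-zero weights
        w /= w.sum()
        # Posterior mean estimate from the weighted particle cloud
        estimates.append(np.dot(w, particles))
        # Multinomial resampling to counter weight degeneracy
        particles = rng.choice(particles, size=n, p=w)
    return np.asarray(estimates)

if __name__ == "__main__":
    # Illustrative nonlinear benchmark system (hypothetical choice, not from the paper)
    rng = np.random.default_rng(0)
    f = lambda x: 0.5 * x + 25.0 * x / (1.0 + x ** 2)
    h = lambda x: x ** 2 / 20.0
    Q, R, T = 1.0, 1.0, 50
    x, true_states, measurements = 0.1, [], []
    for _ in range(T):
        x = f(x) + rng.normal(0.0, np.sqrt(Q))
        true_states.append(x)
        measurements.append(h(x) + rng.normal(0.0, np.sqrt(R)))
    est = bootstrap_particle_filter(measurements, f, h, Q, R,
                                    rng.normal(0.0, 2.0, 500), rng)
    print("RMSE:", np.sqrt(np.mean((np.asarray(true_states) - est) ** 2)))
```

Resampling at every step is the simplest remedy for the weight degeneracy mentioned in the abstract; the paper's empirical Bayes approach addresses degeneracy and poor initial guesses more systematically.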
Pages: 4012-4025
Number of pages: 14