Bayesian parameter estimation via variational methods

Cited by: 372
Authors
Jaakkola, TS
Jordan, MI
Affiliations
[1] MIT, Dept Elect Engn & Comp Sci, Cambridge, MA USA
[2] Univ Calif Berkeley, Div Comp Sci, Berkeley, CA 94720 USA
[3] Univ Calif Berkeley, Dept Stat, Berkeley, CA 94720 USA
Keywords
logistic regression; graphical models; belief networks; variational methods; Bayesian estimation; incomplete data;
DOI
10.1023/A:1008932416310
Chinese Library Classification
TP301 [Theory and Methods];
Subject Classification Code
081202;
Abstract
We consider a logistic regression model with a Gaussian prior distribution over the parameters. We show that an accurate variational transformation can be used to obtain a closed form approximation to the posterior distribution of the parameters, thereby yielding an approximate posterior predictive model. This approach is readily extended to binary graphical models with complete observations. For graphical models with incomplete observations we utilize an additional variational transformation and again obtain a closed form approximation to the posterior. Finally, we show that the dual of the regression problem gives a latent variable density model, the variational formulation of which leads to exactly solvable EM updates.
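The closed-form approximation described in the abstract rests on a quadratic lower bound on the log-logistic likelihood, under which a Gaussian prior yields a Gaussian approximate posterior with simple iterative updates. The following is a minimal NumPy sketch of that scheme, not the authors' implementation; all names (X, t, m0, S0, xi) and numerical details are illustrative assumptions.

# Variational Bayesian logistic regression: the logistic likelihood is
# lower-bounded by a Gaussian-form function, so the posterior over the
# parameters remains Gaussian in closed form at every iteration.
import numpy as np

def lam(xi):
    # lambda(xi) = tanh(xi/2) / (4*xi), the coefficient of the quadratic
    # variational bound on the log-logistic function
    xi = np.maximum(xi, 1e-8)            # avoid division by zero near xi = 0
    return np.tanh(xi / 2.0) / (4.0 * xi)

def variational_bayes_logreg(X, t, m0, S0, n_iter=50):
    """X: (n, d) inputs, t: (n,) labels in {0, 1}, Gaussian prior N(m0, S0)."""
    n, d = X.shape
    S0_inv = np.linalg.inv(S0)
    xi = np.ones(n)                      # one variational parameter per datum
    for _ in range(n_iter):
        # closed-form Gaussian posterior given the current bound
        S_inv = S0_inv + 2.0 * (X.T * lam(xi)) @ X
        S = np.linalg.inv(S_inv)
        m = S @ (S0_inv @ m0 + X.T @ (t - 0.5))
        # re-optimize the variational parameters, which tightens the bound
        xi = np.sqrt(np.einsum('ij,jk,ik->i', X, S + np.outer(m, m), X))
    return m, S                          # approximate posterior N(m, S)

Each sweep alternates a closed-form posterior update with a re-optimization of the per-datum variational parameters, so the procedure behaves like an EM iteration on the bound rather than requiring sampling or numerical integration.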
Pages: 25-37
Page count: 13