Accelerating Monte Carlo Markov chain convergence for cumulative-link generalized linear models

Cited by: 138
Authors
Cowles, M. K. [1]
Affiliation
[1] Harvard University, School of Public Health, Department of Biostatistics, Boston, MA 02215
Keywords
blocking; collapsing; data augmentation; Gibbs sampler; latent data
DOI
10.1007/BF00162520
Chinese Library Classification
TP301 [Theory, Methods]
Subject Classification Code
081202
Abstract
The ordinal probit, univariate or multivariate, is a generalized linear model (GLM) structure that arises frequently in such disparate areas of statistical applications as medicine and econometrics. Despite the straightforwardness of its implementation using the Gibbs sampler, the ordinal probit may present challenges in obtaining satisfactory convergence. We present a multivariate Hastings-within-Gibbs update step for generating latent data and bin boundary parameters jointly, instead of individually from their respective full conditionals. When the latent data are parameters of interest, this algorithm substantially improves Gibbs sampler convergence for large datasets. We also discuss Monte Carlo Markov chain (MCMC) implementation of cumulative logit (proportional odds) and cumulative complementary log-log (proportional hazards) models with latent data.
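The joint update of cutpoints and latent data described in the abstract can be illustrated with a small sketch. The Python code below is not the paper's algorithm: it assumes a symmetric random-walk proposal on the interior cutpoints, a flat cutpoint prior, the first cutpoint fixed at zero for identifiability, and illustrative names (log_marginal_lik, joint_update). It only shows the general Hastings-within-Gibbs pattern: propose all free cutpoints at once, accept or reject using the likelihood with the latent data integrated out, then redraw the latent variables from truncated normals given the accepted cutpoints.

# Minimal sketch of a joint cutpoint/latent-data update for an ordinal
# probit model (illustrative only; not the paper's exact algorithm).
import numpy as np
from scipy.stats import norm, truncnorm

def log_marginal_lik(y, eta, cuts):
    """Log-likelihood with latent data integrated out:
    sum_i log[ Phi(cut_{y_i+1} - eta_i) - Phi(cut_{y_i} - eta_i) ]."""
    lo = cuts[y] - eta        # lower cutpoint for each observation
    hi = cuts[y + 1] - eta    # upper cutpoint for each observation
    return np.sum(np.log(norm.cdf(hi) - norm.cdf(lo)))

def joint_update(y, eta, cuts, rng, step=0.05):
    """One Hastings-within-Gibbs step: propose interior cutpoints jointly,
    accept/reject on the marginal likelihood, then draw latent
    z_i ~ N(eta_i, 1) truncated to (cut_{y_i}, cut_{y_i+1})."""
    K = len(cuts) - 1                      # number of ordinal categories
    prop = cuts.copy()
    # Symmetric random-walk proposal on the free cutpoints;
    # cuts[0] = -inf, cuts[1] = 0 (fixed), cuts[K] = +inf.
    for k in range(2, K):
        prop[k] = rng.normal(prop[k], step)
    if np.any(np.diff(prop[1:K]) <= 0):    # reject unordered proposals
        accepted = False
    else:
        log_ratio = (log_marginal_lik(y, eta, prop)
                     - log_marginal_lik(y, eta, cuts))
        accepted = np.log(rng.uniform()) < log_ratio
    new_cuts = prop if accepted else cuts
    # Redraw latent data from their full conditional given the cutpoints.
    a = new_cuts[y] - eta                  # standardized lower bounds
    b = new_cuts[y + 1] - eta              # standardized upper bounds
    z = truncnorm.rvs(a, b, loc=eta, scale=1.0, random_state=rng)
    return new_cuts, z, accepted

# Example use inside a larger Gibbs sampler (regression-coefficient
# update not shown); y holds categories 0..K-1, eta = X @ beta:
#   rng = np.random.default_rng(0)
#   cuts, z, ok = joint_update(y, X @ beta, cuts, rng)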
Pages: 101-111
Number of pages: 11