Extending conventional priors for testing general hypotheses in linear models

Cited by: 49
Authors
Bayarri, M. J. [1 ]
Garcia-Donato, Gonzalo
Affiliations
[1] Univ Valencia, Dept Stat & Operat Res, E-46100 Burjassot, Spain
[2] Univ Castilla La Mancha, Dept Econ & Finance, Albacete 02071, Spain
Keywords
analysis-of-variance model; changepoint problem; model selection; objective Bayesian methods; partially informative distribution; regression model
DOI
10.1093/biomet/asm014
Chinese Library Classification
Q [Biological Sciences]
Subject Classification Codes
07; 0710; 09
Abstract
We consider observations coming from a general normal linear model for which it is desirable to test a simplifying null hypothesis about the parameters. We approach this problem from an objective Bayesian, model-selection perspective. Crucial ingredients for this approach are 'proper objective priors' to be used for deriving the Bayes factors. Jeffreys-Zellner-Siow priors have good properties for testing null hypotheses defined by specific values of the parameters in full-rank linear models. We extend these priors to deal with general hypotheses in general linear models, not necessarily of full rank. The resulting priors, which we call 'conventional priors', are expressed as a generalization of recently introduced 'partially informative distributions'. The corresponding Bayes factors are fully automatic, easily computed and very reasonable. The methodology is illustrated for the change-point problem and the equality of treatment effects problem. We compare the conventional priors derived for these problems with other objective Bayesian proposals such as the intrinsic priors. It is concluded that both priors behave similarly, although interesting subtle differences arise. We adapt the conventional priors to deal with nonnested model selection as well as multiple-model comparison. Finally, we briefly address a generalization of conventional priors to nonnormal scenarios.
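The Jeffreys-Zellner-Siow approach the abstract builds on admits a simple numerical sketch in the special case of a full-rank model tested against an intercept-only null: Zellner's g-prior gives a closed-form Bayes factor for each g, which is then mixed over the Zellner-Siow hyperprior g ~ Inverse-Gamma(1/2, n/2). The function name `zs_bayes_factor` is illustrative, and this covers only the nested full-rank case, not the paper's extension to general hypotheses and rank-deficient designs:

```python
import numpy as np
from scipy import integrate
from scipy.stats import invgamma


def zs_bayes_factor(y, X):
    """Bayes factor of the full model (intercept + columns of X) against
    the intercept-only null, under a Zellner-Siow mixture of g-priors.

    Illustrative sketch for the full-rank nested case only."""
    n = len(y)
    p = X.shape[1]
    # Centre y and X so the common intercept parameter drops out.
    yc = y - y.mean()
    Xc = X - X.mean(axis=0)
    # Coefficient of determination R^2 of the least-squares fit.
    beta, *_ = np.linalg.lstsq(Xc, yc, rcond=None)
    rss = np.sum((yc - Xc @ beta) ** 2)
    r2 = 1.0 - rss / np.sum(yc ** 2)

    def integrand(g):
        # log of the closed-form g-prior Bayes factor, to avoid overflow:
        # BF(g) = (1+g)^((n-p-1)/2) / (1 + g(1-R^2))^((n-1)/2)
        log_bf = 0.5 * (n - p - 1) * np.log1p(g) \
            - 0.5 * (n - 1) * np.log1p(g * (1.0 - r2))
        # Zellner-Siow hyperprior: g ~ Inverse-Gamma(1/2, n/2).
        return np.exp(log_bf) * invgamma.pdf(g, a=0.5, scale=n / 2)

    bf, _ = integrate.quad(integrand, 0.0, np.inf)
    return bf
```

With data carrying a clear signal the Bayes factor strongly favours the full model, while pure-noise responses yield a much smaller value, illustrating the automatic behaviour the abstract refers to.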
Pages: 135-152 (18 pages)