A large-sample model selection criterion based on Kullback's symmetric divergence

Cited: 150
Author
Cavanaugh, J.E. [1]
Affiliation
[1] University of Missouri, Department of Statistics, Columbia, MO 65211, USA
Keywords
AIC; Akaike information criterion; I-divergence; J-divergence; Kullback-Leibler information; relative entropy
DOI
10.1016/S0167-7152(98)00200-4
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Classification Codes
020208; 070103; 0714
Abstract
The Akaike information criterion, AIC, is a widely known and extensively used tool for statistical model selection. AIC serves as an asymptotically unbiased estimator of a variant of Kullback's directed divergence between the true model and a fitted approximating model. The directed divergence is an asymmetric measure of separation between two statistical models, meaning that an alternate directed divergence may be obtained by reversing the roles of the two models in the definition of the measure. The sum of the two directed divergences is Kullback's symmetric divergence. Since the symmetric divergence combines the information in two related though distinct measures, it functions as a gauge of model disparity which is arguably more sensitive than either of its individual components. With this motivation, we propose a model selection criterion which serves as an asymptotically unbiased estimator of a variant of the symmetric divergence between the true model and a fitted approximating model. We examine the performance of the criterion relative to other well-known criteria in a simulation study. (C) 1999 Elsevier Science B.V. All rights reserved.
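The abstract defines the relevant measures only in words; as a reading aid (not part of the original record), they can be sketched in standard notation. With f the density of the true generating model and g a fitted approximating density, Kullback's directed divergence and the symmetric J-divergence are

I(f, g) = \mathrm{E}_f\!\left[ \log \frac{f(X)}{g(X)} \right], \qquad J(f, g) = I(f, g) + I(g, f).

For a candidate model with k functionally independent parameters and maximized likelihood L(\hat{\theta}), the criterion proposed in the paper, commonly denoted KIC, is generally cited in the form

\mathrm{KIC} = -2 \log L(\hat{\theta}) + 3k, \qquad \text{versus} \qquad \mathrm{AIC} = -2 \log L(\hat{\theta}) + 2k,

the heavier penalty arising because KIC estimates the sum of two directed divergences rather than a single one.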
Pages: 333–343
Page count: 11