Interpreting Kullback-Leibler divergence with the Neyman-Pearson lemma

Cited by: 110
Authors
Eguchi, Shinto [1]
Copas, John
Affiliations
[1] Inst Stat Math, Minato Ku, Tokyo 1068569, Japan
[2] Grad Univ Adv Studies, Minato Ku, Tokyo 1068569, Japan
[3] Univ Warwick, Dept Stat, Coventry CV4 7AL, W Midlands, England
Keywords
exponential connection; mixture connection; information geometry; testing hypotheses; maximum likelihood; ROC curve
DOI
10.1016/j.jmva.2006.03.007
CLC classification
O21 [Probability theory and mathematical statistics]; C8 [Statistics]
Subject classification codes
020208; 070103; 0714
Abstract
The Kullback-Leibler divergence and the Neyman-Pearson lemma are two fundamental concepts in statistics. Both concern likelihood ratios: the Kullback-Leibler divergence is the expected log-likelihood ratio, and the Neyman-Pearson lemma concerns the error rates of likelihood ratio tests. Exploring this connection gives another statistical interpretation of the Kullback-Leibler divergence, in terms of the loss of power of the likelihood ratio test when the wrong distribution is used for one of the hypotheses. In this interpretation, the standard non-negativity property of the Kullback-Leibler divergence is essentially a restatement of the optimality of likelihood ratios established by the Neyman-Pearson lemma. The asymmetry of the Kullback-Leibler divergence is then reviewed from the viewpoint of information geometry. (c) 2006 Elsevier Inc. All rights reserved.
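The identities at the heart of the abstract can be checked numerically. The following is a minimal Monte Carlo sketch, not taken from the paper: the Gaussian models, the parameter names mu and nu, and all numerical values are illustrative assumptions. It verifies that KL(P1 || P0) is the expected log-likelihood ratio under P1, and that replacing the true alternative P1 by a wrong distribution Q lowers that expectation by exactly KL(P1 || Q), the deficit behind the paper's power-loss interpretation.

import numpy as np

# Hedged sketch: H0: P0 = N(0,1); true alternative P1 = N(mu,1);
# a "wrong" alternative Q = N(nu,1) used in place of P1.
# The models and the values of mu and nu are illustrative assumptions.
rng = np.random.default_rng(0)
mu, nu = 1.0, 0.4

def log_lr(x, m):
    # log-likelihood ratio log[ N(x; m, 1) / N(x; 0, 1) ]
    return m * x - 0.5 * m ** 2

x1 = rng.normal(mu, 1.0, size=500_000)  # draws from the true alternative P1

kl_p1_p0 = log_lr(x1, mu).mean()        # E_P1[log p1/p0] ~= KL(P1||P0) = mu^2/2
wrong = log_lr(x1, nu).mean()           # E_P1[log q/p0] under the wrong alternative Q
kl_p1_q = 0.5 * (mu - nu) ** 2          # KL(P1||Q), closed form for unit-variance Gaussians

print(f"KL(P1||P0): MC {kl_p1_p0:.4f} vs exact {0.5 * mu ** 2:.4f}")
print(f"E_P1[log q/p0]: {wrong:.4f} vs KL(P1||P0) - KL(P1||Q): {kl_p1_p0 - kl_p1_q:.4f}")

Since KL(P1 || Q) >= 0 with equality only when Q = P1, the true likelihood ratio maximizes the expected log-likelihood ratio; this is the Neyman-Pearson optimality that, per the abstract, the non-negativity of the Kullback-Leibler divergence restates.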
Pages: 2034-2040
Page count: 7