A new metric for probability distributions

Cited by: 694
Authors
Endres, DM [1]
Schindelin, JE
Affiliations
[1] Univ St Andrews, Sch Psychol, St Andrews KY16 9JU, Fife, Scotland
[2] Univ Wurzburg, Inst Genet, Biozentrum, D-97074 Wurzburg, Germany
Keywords
capacitory discrimination; χ² distance; Jensen-Shannon divergence; metric; triangle inequality
DOI
10.1109/TIT.2003.813506
Chinese Library Classification (CLC)
TP [Automation technology, computer technology];
Discipline classification code
0812;
Abstract
We introduce a metric for probability distributions, which is bounded, information-theoretically motivated, and has a natural Bayesian interpretation. The square root of the well-known χ² distance is an asymptotic approximation to it. Moreover, it is a close relative of the capacitory discrimination and Jensen-Shannon divergence.
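The metric introduced in the paper can be computed as the square root of twice the Jensen-Shannon divergence, D(P,Q)^2 = sum_i [p_i ln(2p_i/(p_i+q_i)) + q_i ln(2q_i/(p_i+q_i))]. The following is a minimal Python sketch of that computation, not taken from the paper; the function names js_metric and _kl, the NumPy dependency, and the example distributions are illustrative choices.

import numpy as np

def _kl(a, b):
    # Kullback-Leibler divergence sum_i a_i ln(a_i / b_i), with 0 ln 0 := 0.
    mask = a > 0
    return float(np.sum(a[mask] * np.log(a[mask] / b[mask])))

def js_metric(p, q):
    # Square root of twice the Jensen-Shannon divergence (natural log):
    # D(P, Q)^2 = KL(P || M) + KL(Q || M), with M = (P + Q) / 2.
    # Bounded above by sqrt(2 ln 2) for probability vectors p and q.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return np.sqrt(_kl(p, m) + _kl(q, m))

# Example: two distributions on three outcomes
print(js_metric([0.5, 0.3, 0.2], [0.2, 0.3, 0.5]))  # value in [0, sqrt(2 ln 2)]

Writing the computation as KL(P||M) + KL(Q||M), rather than expanding the summed formula directly, keeps the 0 ln 0 = 0 convention explicit and avoids log-of-zero warnings.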
Pages: 1858-1860
Number of pages: 3
Related papers
9 records in total
[1] Ali SM, 1966, J ROY STAT SOC B, V28, P131
[2] Brown RF, 1993, TOPOLOGICAL INTRO NO
[3] Cover TM, 2005, ELEM INF THEORY, DOI 10.1002/047174882X
[4] Kafka P, 1991, STUD SCI MATH HUNG, V26, P415
[5] Liese F, 1987, Convex Statistical Distances
[6] Minka TP, 2001, BAYESIAN INFERENCE E
[7] Osterreicher F, 1996, KYBERNETIKA, V32, P389
[8] Osterreicher F, 1993, IEEE T INFORM THEORY, V39, P1036, DOI 10.1109/18.256536
[9] Topsoe F, 2000, Some inequalities for information divergence and related measures of discrimination, IEEE T INFORM THEORY, V46, P1602