DIVERGENCE MEASURES BASED ON THE SHANNON ENTROPY

Cited by: 2822
Authors
LIN, JH
Affiliation
[1] Department of Computer Science, Brandeis University, Waltham
Keywords
DOI
10.1109/18.61115
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
A new class of information-theoretic divergence measures based on the Shannon entropy is introduced. Unlike the well-known Kullback divergences, the new measures do not require the condition of absolute continuity to be satisfied by the probability distributions involved. More importantly, their close relationship with the variational distance and the probability of misclassification error is established in terms of bounds. These bounds are crucial in many applications of divergence measures. The new measures are also well characterized by the properties of nonnegativity, finiteness, semiboundedness, and boundedness. © 1991 IEEE
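The best-known member of the class introduced here is the Jensen-Shannon divergence. A minimal NumPy sketch (function names are illustrative, not from the paper) demonstrates the finiteness property the abstract highlights: on distributions with disjoint supports, where the Kullback divergence would be infinite, the measure stays bounded.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) in bits; 0 * log(0) is treated as 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def jensen_shannon_divergence(p, q):
    """JSD(p, q) = H((p + q) / 2) - (H(p) + H(q)) / 2.

    Defined even when p and q are not absolutely continuous with
    respect to each other, and bounded: 0 <= JSD <= 1 (in bits).
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

# Disjoint supports: the Kullback divergence diverges, JSD does not.
print(jensen_shannon_divergence([1.0, 0.0], [0.0, 1.0]))  # → 1.0 (maximum, in bits)
```

The mixture distribution `m` is what sidesteps the absolute-continuity requirement: every outcome with positive probability under `p` or `q` also has positive probability under `m`, so no logarithm of zero ever arises.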
Pages: 145-151
Page count: 7