RELATIONS BETWEEN ENTROPY AND ERROR PROBABILITY

Cited by: 139
Authors
FEDER, M [1 ]
MERHAV, N [1 ]
Affiliation
[1] Technion - Israel Institute of Technology, Department of Electrical Engineering, IL-32000 Haifa, Israel
Keywords
ENTROPY; ERROR PROBABILITY; EQUIVOCATION; PREDICTABILITY; FANO'S INEQUALITY; CHANNEL CODING THEOREM
DOI
10.1109/18.272494
Chinese Library Classification
TP [automation technology; computer technology]
Discipline code
0812
Abstract
The relation between the entropy of a discrete random variable and the minimum attainable probability of error made in guessing its value is examined. While Fano's inequality provides a tight lower bound on the error probability in terms of the entropy, we derive a converse result: a tight upper bound on the minimal error probability in terms of the entropy. Both bounds are sharp, and together they relate the error probability of the maximum a posteriori (MAP) rule to the conditional entropy (equivocation), which is a useful uncertainty measure in several applications. Combining this relation with the classical channel coding theorem, we present a channel coding theorem for the equivocation which, unlike the channel coding theorem for error probability, is meaningful at all rates. This theorem is proved directly for discrete memoryless channels (DMCs), and from this proof it is further concluded that for R ≥ C the equivocation approaches its minimal value of R − C at a rate of n^{-1/2}, where n is the block length.
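As a numerical illustration of the two directions described in the abstract, the sketch below (a minimal Python example; the distribution `p` and the helper names are illustrative choices, not from the paper) computes the MAP guessing error π(X) = 1 − max_x P(x), verifies Fano's inequality H(X) ≤ h(π) + π·log₂(M−1), and checks a simple converse-direction upper bound of the Hellman-Raviv type, π ≤ H(X)/2 (entropies in bits). The paper's own tight upper bound is a sharper piecewise-linear function of the entropy.

```python
import math

def entropy_bits(p):
    """Shannon entropy (in bits) of a probability vector."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def binary_entropy(x):
    """Binary entropy function h(x), in bits."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

# Example distribution over M = 4 values (illustrative, not from the paper).
p = [0.5, 0.25, 0.125, 0.125]
M = len(p)

H = entropy_bits(p)   # H(X) = 1.75 bits
pi = 1 - max(p)       # minimal (MAP) guessing error probability = 0.5

# Fano's inequality: H(X) <= h(pi) + pi * log2(M - 1).
# Read as a lower bound on pi for a given entropy H(X).
fano_rhs = binary_entropy(pi) + pi * math.log2(M - 1)
assert H <= fano_rhs + 1e-12

# Converse direction: an upper bound on the minimal error in terms of
# the entropy (Hellman-Raviv form): pi <= H(X) / 2.
assert pi <= H / 2
print(f"H = {H} bits, pi = {pi}, Fano RHS = {fano_rhs:.4f}")
```

For this distribution H(X) = 1.75 bits and π = 0.5, so Fano gives 1.75 ≤ 1 + 0.5·log₂3 ≈ 1.79, and the converse bound gives 0.5 ≤ 0.875; both inequalities hold with slack, which is consistent with the bounds being tight only for extremal distributions.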
Pages: 259-266
Page count: 8
References
11 items
[1] Cover T.M., Thomas J.A., 2006, Elements of Information Theory (2nd ed.)
[2] Chu J.T., Chueh J.C., "Inequalities Between Information Measures and Error Probability," Journal of the Franklin Institute - Engineering and Applied Mathematics, 1966, 282(2):121
[3] Cover T.M., 1974, Technical Report 12, Stanford University, Department of Statistics
[4] Cover T.M., 1967, Transactions of the 4th Prague Conference on Information Theory, p. 263
[5] Feder M., Merhav N., Gutman M., "Universal Prediction of Individual Sequences," IEEE Transactions on Information Theory, 1992, 38(4):1258-1270
[6] Gallager R.G., "A Simple Derivation of the Coding Theorem and Some Applications," IEEE Transactions on Information Theory, 1965, 11(1):3-18
[7] Gallager R.G., 1968, Information Theory and Reliable Communication
[8] Hellman M.E., Raviv J., "Probability of Error, Equivocation, and the Chernoff Bound," IEEE Transactions on Information Theory, 1970, 16(4):368
[9] Jumarie G., 1990, Relative Information
[10] Kelly J.L., "A New Interpretation of Information Rate," Bell System Technical Journal, 1956, 35(4):917-926