Estimating entropy rates with Bayesian confidence intervals

Cited by: 61
Authors
Kennel, MB [1]
Shlens, J
Abarbanel, HDI
Chichilnisky, EJ
Institutions
[1] Univ Calif San Diego, Inst Nonlinear Sci, La Jolla, CA 92093 USA
[2] Salk Inst Biol Studies, Syst Neurobiol Lab, La Jolla, CA 92037 USA
[3] Univ Calif San Diego, Scripps Inst Oceanog, Dept Phys, La Jolla, CA 92093 USA
[4] Univ Calif San Diego, Scripps Inst Oceanog, Marine Phys Lab, La Jolla, CA 92093 USA
DOI
10.1162/0899766053723050
CLC classification number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The entropy rate quantifies the amount of uncertainty or disorder produced by any dynamical system. In a spiking neuron, this uncertainty translates into the amount of information potentially encoded, and it is thus the subject of intense theoretical and experimental investigation. Estimating this quantity from observed, experimental data is difficult and requires a judicious selection of probabilistic models, balancing between two opposing biases. We use a model weighting principle originally developed for lossless data compression, following the minimum description length principle. This weighting yields a direct estimator of the entropy rate, which, compared to existing methods, exhibits significantly less bias and converges faster in simulation. With Monte Carlo techniques, we estimate a Bayesian confidence interval for the entropy rate. In related work, we apply these ideas to estimate the information rates between sensory stimuli and neural responses in experimental data (Shlens, Kennel, Abarbanel, & Chichilnisky, in preparation).
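The abstract's two ingredients can be illustrated with a minimal sketch: a naive plug-in entropy-rate estimate from block entropies of a binary spike train, and a Monte Carlo credible interval for entropy. Note this is not the paper's MDL-weighted estimator; the Dirichlet(1, ..., 1) prior and all function names below are illustrative assumptions.

```python
import math
import random
from collections import Counter

def block_entropy(bits, k):
    """Plug-in (maximum-likelihood) entropy of length-k words, in bits."""
    words = [tuple(bits[i:i + k]) for i in range(len(bits) - k + 1)]
    n = len(words)
    return -sum((c / n) * math.log2(c / n) for c in Counter(words).values())

def entropy_rate_estimate(bits, k):
    """Naive entropy-rate estimate H(k) - H(k-1): the conditional entropy
    of the next symbol given a context of length k-1. This simple
    estimator carries the sample-size bias the paper's method reduces."""
    return block_entropy(bits, k) - block_entropy(bits, k - 1)

def entropy_credible_interval(counts, n_samples=2000, level=0.95, seed=1):
    """Monte Carlo credible interval for the entropy of a discrete
    distribution under a uniform Dirichlet(1, ..., 1) prior (an
    illustrative choice, not the model weighting used in the paper)."""
    rng = random.Random(seed)
    alphas = [c + 1 for c in counts]  # posterior Dirichlet parameters
    samples = []
    for _ in range(n_samples):
        gammas = [rng.gammavariate(a, 1.0) for a in alphas]
        total = sum(gammas)
        samples.append(-sum((g / total) * math.log2(g / total)
                            for g in gammas))
    samples.sort()
    lo = samples[int((1 - level) / 2 * n_samples)]
    hi = samples[int((1 + level) / 2 * n_samples)]
    return lo, hi

# Sanity check: an i.i.d. fair-coin "spike train" has entropy rate
# 1 bit/symbol, so the estimate should land near 1.0.
random.seed(0)
bits = [random.randint(0, 1) for _ in range(20000)]
print(entropy_rate_estimate(bits, 5))         # close to 1.0
print(entropy_credible_interval([100, 100]))  # interval just below 1.0
```

The plug-in estimator is biased downward in the number of distinct words, which is why word length k must be balanced against the amount of data, the model-selection trade-off the abstract describes.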
Pages: 1531-1576
Number of pages: 46