Estimating Entropy Rates with Bayesian Confidence Intervals
- 1 July 2005
- journal article
- Published by MIT Press in Neural Computation
- Vol. 17 (7), 1531-1576
- https://doi.org/10.1162/0899766053723050
Abstract
The entropy rate quantifies the amount of uncertainty or disorder produced by any dynamical system. In a spiking neuron, this uncertainty translates into the amount of information potentially encoded, and it is thus the subject of intense theoretical and experimental investigation. Estimating this quantity from observed, experimental data is difficult and requires a judicious selection of probabilistic models, balancing two opposing biases. We use a model weighting principle originally developed for lossless data compression, following the minimum description length principle. This weighting yields a direct estimator of the entropy rate which, compared to existing methods, exhibits significantly less bias and converges faster in simulation. With Monte Carlo techniques, we estimate a Bayesian confidence interval for the entropy rate. In related work, we apply these ideas to estimate the information rates between sensory stimuli and neural responses in experimental data (Shlens, Kennel, Abarbanel, & Chichilnisky, in preparation).
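The model-weighting principle from lossless compression that the abstract refers to is context-tree weighting (CTW): a Bayesian mixture over variable-depth Markov models whose code length upper-bounds, and with enough data estimates, the entropy rate. As a rough illustration only (a minimal sketch, not the authors' implementation, and omitting the Monte Carlo confidence-interval step), the following Python computes a CTW code length per symbol for a binary spike train; the names `CTWNode`, `ctw_update`, `ctw_entropy_rate`, and the `depth` parameter are hypothetical choices for this sketch.

```python
import math

class CTWNode:
    """Node of a binary context tree: Krichevsky-Trofimov (KT) counts
    plus log-probabilities for the KT model and the weighted mixture."""
    def __init__(self):
        self.counts = [0, 0]   # observed 0s and 1s in this context
        self.log_kt = 0.0      # log P_KT of the symbols seen here
        self.log_w = 0.0       # log P_w, mixing "stop here" vs. "split deeper"
        self.children = {}

def ctw_update(node, context, symbol):
    """Sequentially update the tree along one context path (most recent
    symbol first), from the deepest node back up to this one."""
    if context:
        child = node.children.setdefault(context[-1], CTWNode())
        ctw_update(child, context[:-1], symbol)
    # KT estimator: P(symbol) = (count + 1/2) / (total + 1)
    node.log_kt += math.log((node.counts[symbol] + 0.5) / (sum(node.counts) + 1.0))
    node.counts[symbol] += 1
    if not context:
        node.log_w = node.log_kt          # leaf: no deeper model to mix in
    else:
        # P_w = 1/2 * P_KT + 1/2 * product of children's P_w (log-sum-exp form)
        log_split = sum(c.log_w for c in node.children.values())
        m = max(node.log_kt, log_split)
        node.log_w = math.log(0.5) + m + math.log(
            math.exp(node.log_kt - m) + math.exp(log_split - m))

def ctw_entropy_rate(bits, depth=8):
    """CTW code length per symbol, in bits: an upper bound on (and, as the
    sample grows, an estimate of) the entropy rate of the sequence."""
    root = CTWNode()
    for i in range(depth, len(bits)):
        ctw_update(root, bits[i - depth:i], bits[i])
    return -root.log_w / math.log(2.0) / (len(bits) - depth)

if __name__ == "__main__":
    import random
    random.seed(0)
    # Bernoulli(0.2) "spike train": true entropy rate is about 0.722 bits/symbol
    bits = [1 if random.random() < 0.2 else 0 for _ in range(20000)]
    print(ctw_entropy_rate(bits, depth=8))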