Bayesian entropy estimation for binary spike train data using parametric prior knowledge
Evan Archer, Jonathan W. Pillow
Neural Information Processing Systems
Shannon's entropy is a basic quantity in information theory, and a fundamental building block for the analysis of neural codes. Estimating the entropy of a discrete distribution from samples is an important and difficult problem that has received considerable attention in statistics and theoretical neuroscience. However, neural responses have characteristic statistical structure that generic entropy estimators fail to exploit. For example, existing Bayesian entropy estimators make the naive assumption that all spike words are equally likely a priori, which makes for an inefficient allocation of prior probability mass in cases where spikes are sparse. Here we develop Bayesian estimators for the entropy of binary spike trains using priors designed to flexibly exploit the statistical structure of simultaneously recorded spike responses.
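For context, the baseline that Bayesian estimators like this one improve upon is the plug-in (maximum-likelihood) estimator, which simply computes entropy from the empirical frequencies of observed spike words. A minimal sketch is below; it is not the paper's estimator, and the data dimensions and firing probability are illustrative assumptions.

```python
import numpy as np

def plugin_entropy(words):
    """Plug-in (maximum-likelihood) entropy estimate, in bits, of a
    sample of binary spike words (rows of a 0/1 array)."""
    # Count occurrences of each distinct spike word.
    _, counts = np.unique(words, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

# Hypothetical data: 1000 samples of 10-neuron binary words with sparse
# firing (each neuron spikes with probability 0.05 per time bin).
rng = np.random.default_rng(0)
words = (rng.random((1000, 10)) < 0.05).astype(int)
print(plugin_entropy(words))
```

With sparse firing, most of the 2^10 possible words never appear in the sample, so the plug-in estimate is biased downward; this under-sampling regime is what motivates Bayesian estimators with structured priors.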