High-dimensional neural spike train analysis with generalized count linear dynamical systems
This paper describes a general exponential-family model (called the "generalized count" (GC) distribution) for multi-neuron spike count data. The model accounts for both under-dispersed and over-dispersed spike count data, and has Poisson, Negative Binomial, Bernoulli, and several other classic models as special cases. The authors give a clear account of the relationship to other models, and demonstrate the need for a model to capture under-dispersed counts in primate motor cortex. They then describe an efficient method for maximum-likelihood fitting (and demonstrate concavity of the log-likelihood). They derive an efficient variational Bayesian inference method and apply the model to data from primate motor cortex, showing that it accounts more accurately for the variance and cross-covariance of spike count data than a model with Poisson observations.
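The distinction between under- and over-dispersion that motivates the GC model can be illustrated with the Fano factor (variance-to-mean ratio of spike counts). The sketch below, using hypothetical simulated data rather than anything from the paper, shows the three regimes the GC distribution subsumes: a Poisson count (Fano factor near 1), a negative binomial count (over-dispersed, Fano factor above 1), and a binomial count (under-dispersed, Fano factor below 1).

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 20000

# Poisson counts: variance equals the mean, so the Fano factor is ~1.
poisson_counts = rng.poisson(lam=5.0, size=n_trials)

# Negative binomial counts: over-dispersed. With NumPy's (n, p)
# parameterization, mean = n(1-p)/p = 5 and var = n(1-p)/p^2 = 10.
nb_counts = rng.negative_binomial(n=5, p=0.5, size=n_trials)

# Binomial counts: under-dispersed (mean = 5, var = 2.5), as seen in
# regular-firing regimes such as the motor cortex data discussed above.
binom_counts = rng.binomial(n=10, p=0.5, size=n_trials)

def fano(counts):
    """Variance-to-mean ratio of a sample of spike counts."""
    return counts.var() / counts.mean()

print(f"Poisson  Fano: {fano(poisson_counts):.2f}")
print(f"NegBin   Fano: {fano(nb_counts):.2f}")
print(f"Binomial Fano: {fano(binom_counts):.2f}")
```

A Poisson observation model is forced to a Fano factor of 1, which is why it cannot match either of the other two regimes; the GC distribution covers all three as special cases.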
Modeling Short Over-Dispersed Spike Count Data: A Hierarchical Parametric Empirical Bayes Framework
She, Qi; Jelfs, Beth; Chan, Rosa H. M.
In this letter, a Hierarchical Parametric Empirical Bayes model is proposed for spike count data. We integrate Generalized Linear Models (GLMs) and empirical Bayes theory to provide three advantages simultaneously: (1) a model of over-dispersed spike count values; (2) lower mean squared error in estimation than maximum-likelihood fitting of GLMs; and (3) an efficient alternative to inference with fully Bayesian estimators. We apply the model to both simulated data and experimental neural data from the retina. The simulation results indicate that the new model estimates both the weights of connections among neural populations and the output firing rates (mean spike counts) efficiently and accurately. The results on the retinal datasets show that the proposed model outperforms both standard Poisson and Negative Binomial GLMs in terms of the prediction log-likelihood of held-out datasets.
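The held-out log-likelihood comparison described in the abstract can be sketched in miniature. The example below is not the authors' hierarchical empirical Bayes procedure: it uses hypothetical simulated counts and fits the negative binomial by simple moment matching, just to show why an over-dispersion-aware observation model scores higher than a Poisson model on over-dispersed held-out data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical over-dispersed spike counts (negative binomial: mean 5, var 10).
train = rng.negative_binomial(n=5, p=0.5, size=5000)
test = rng.negative_binomial(n=5, p=0.5, size=5000)

# Poisson fit: the MLE of the rate is just the sample mean.
lam = train.mean()

# Negative binomial fit by method of moments (a stand-in for the paper's
# empirical Bayes fit): mean = n(1-p)/p and var = n(1-p)/p^2 give
# p = mean/var and n = mean*p/(1-p).
m, v = train.mean(), train.var()
p_hat = m / v
n_hat = m * p_hat / (1.0 - p_hat)

# Compare prediction log-likelihoods on the held-out set.
ll_poisson = stats.poisson.logpmf(test, lam).sum()
ll_nb = stats.nbinom.logpmf(test, n_hat, p_hat).sum()

print(f"held-out log-likelihood, Poisson: {ll_poisson:.1f}")
print(f"held-out log-likelihood, NegBin:  {ll_nb:.1f}")
```

On over-dispersed counts the Poisson model must understate the variance, so the negative binomial fit achieves a higher held-out log-likelihood; the letter's hierarchical model pushes this further by shrinking the per-neuron estimates via empirical Bayes.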
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (0.88)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (0.68)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Optimization (0.68)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Directed Networks > Bayesian Learning (0.66)