
Collaborating Authors

Sharpee, Tatyana


Better than least squares: comparison of objective functions for estimating linear-nonlinear models

Neural Information Processing Systems

This paper compares a family of methods for characterizing neural feature selectivity with natural stimuli in the framework of the linear-nonlinear model. In this model, the neural firing rate is a nonlinear function of a small number of relevant stimulus components. The relevant stimulus dimensions can be found by maximizing one of a family of objective functions, Rényi divergences of different orders [1, 2]. We show that maximizing one of them, Rényi divergence of order 2, is equivalent to least-square fitting of the linear-nonlinear model to neural data. Next, we derive reconstruction errors in relevant dimensions found by maximizing Rényi divergences of arbitrary order in the asymptotic limit of large spike numbers. We find that the smallest errors are obtained with Rényi divergence of order 1, also known as Kullback-Leibler divergence.
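To make the least-squares view of the order-2 objective concrete, here is a minimal sketch that is not from the paper: it assumes Gaussian white-noise stimuli, a single relevant dimension, Poisson spiking, and an arbitrary sigmoidal nonlinearity, and recovers the relevant dimension of a simulated linear-nonlinear neuron by ordinary least-squares regression of spike counts on the stimulus. It only illustrates the least-squares objective the abstract relates to Rényi divergence of order 2; it is not the paper's estimator for natural (correlated) stimuli.

import numpy as np

rng = np.random.default_rng(0)
n_samples, dim = 50_000, 20

# True relevant dimension (filter), chosen arbitrarily for the simulation.
v_true = rng.normal(size=dim)
v_true /= np.linalg.norm(v_true)

def rate(projection):
    # Firing rate as a nonlinear (sigmoidal) function of the single relevant projection.
    return 5.0 / (1.0 + np.exp(-3.0 * projection))

stimuli = rng.normal(size=(n_samples, dim))      # Gaussian white-noise stimuli
spikes = rng.poisson(rate(stimuli @ v_true))     # Poisson spike counts

# Least-squares estimate: solve min_w ||stimuli @ w - spikes||^2.
w_ls, *_ = np.linalg.lstsq(stimuli, spikes, rcond=None)
v_hat = w_ls / np.linalg.norm(w_ls)

# Compare recovered and true dimensions (overall sign and scale are not identifiable).
print("cosine similarity:", abs(v_hat @ v_true))

With uncorrelated Gaussian stimuli the regression solution is proportional to the true filter regardless of the (monotone) nonlinearity, so the printed cosine similarity approaches 1 as the number of samples grows; for correlated, naturalistic stimuli this simple regression is no longer sufficient, which is the regime the compared objective functions address.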


Maximally Informative Dimensions: Analyzing Neural Responses to Natural Signals

Neural Information Processing Systems

From olfaction to vision and audition, there is an increasing need for, and a growing number of, experiments [1]-[8] that study responses of sensory neurons to natural stimuli. Natural stimuli have specific statistical properties [9, 10], and therefore sample only a subspace of all possible spatial and temporal frequencies explored during stimulation with white noise. Observing the full dynamic range of neural responses may require using stimulus ensembles which approximate those occurring in nature, and it is an attractive hypothesis that the neural representation of these natural signals may be optimized in some way. Moreover, some neural responses are strongly nonlinear and adaptive, and may not be predicted from a combination of responses to simple stimuli. It has also been shown that the variability in neural response decreases substantially when dynamical, rather than static, stimuli are used [11, 12]. For all these reasons, it would be attractive to have a rigorous method of analyzing neural responses to complex, naturalistic inputs.

