Better than least squares: comparison of objective functions for estimating linear-nonlinear models

Sharpee, Tatyana

Neural Information Processing Systems 

This paper compares a family of methods for characterizing neural feature selectivity with natural stimuli in the framework of the linear-nonlinear model. In this model, the neural firing rate is a nonlinear function of a small number of relevant stimulus components. The relevant stimulus dimensions can be found by maximizing one of a family of objective functions, Rényi divergences of different orders [1, 2]. We show that maximizing one of them, the Rényi divergence of order 2, is equivalent to least-squares fitting of the linear-nonlinear model to neural data. Next, we derive reconstruction errors in relevant dimensions found by maximizing Rényi divergences of arbitrary order in the asymptotic limit of large spike numbers. We find that the smallest errors are obtained with the Rényi divergence of order 1, also known as the Kullback-Leibler divergence.
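To make the objective family concrete, the following is a minimal sketch (not the paper's code) of estimating the relevant dimension of a linear-nonlinear model by maximizing a Rényi divergence between the spike-conditional and prior distributions of the stimulus projection. The simulated neuron, the sigmoid nonlinearity, the histogram binning, and the brute-force angle scan are all illustrative assumptions, not details taken from the paper.

```python
# Sketch: Rényi-divergence objectives for recovering the relevant dimension
# of a simulated linear-nonlinear (LN) neuron. All modeling choices below
# (2-D stimulus, sigmoid nonlinearity, bin count) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Simulate an LN neuron: spikes depend on one relevant stimulus dimension.
n_samples, dim = 200_000, 2
stimuli = rng.standard_normal((n_samples, dim))
true_angle = 0.3
true_filter = np.array([np.cos(true_angle), np.sin(true_angle)])
rate = 1.0 / (1.0 + np.exp(-3.0 * (stimuli @ true_filter - 0.5)))  # sigmoid nonlinearity
spikes = rng.random(n_samples) < rate

def renyi_objective(v, alpha, bins=40):
    """Rényi divergence of order `alpha` between P(x|spike) and P(x),
    where x = s . v is the projection onto the candidate dimension v.
    alpha = 1 is the Kullback-Leibler limit; alpha = 2 is the order whose
    maximization the paper shows to be equivalent to least-squares fitting."""
    x = stimuli @ (v / np.linalg.norm(v))
    edges = np.histogram_bin_edges(x, bins)
    p, _ = np.histogram(x[spikes], edges)  # spike-conditional histogram
    q, _ = np.histogram(x, edges)          # prior (all-stimulus) histogram
    p, q = p / p.sum(), q / q.sum()
    keep = (p > 0) & (q > 0)
    p, q = p[keep], q[keep]
    if alpha == 1:
        return np.sum(p * np.log(p / q))                   # KL divergence
    return np.log(np.sum(p**alpha * q**(1 - alpha))) / (alpha - 1)

# Brute-force scan over directions in the plane: each objective should
# peak near the true filter for this simple simulated neuron.
angles = np.linspace(0.0, np.pi, 181)
for alpha in (1, 2):
    scores = [renyi_objective(np.array([np.cos(a), np.sin(a)]), alpha)
              for a in angles]
    best = angles[np.argmax(scores)]
    print(f"alpha={alpha}: recovered angle {best:.3f} (true {true_angle})")
```

The histogram-based estimate of the two projection distributions is the simplest choice; the paper's asymptotic error analysis concerns how such estimates, under different divergence orders, translate into errors in the recovered dimension as the spike count grows.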
