Tishby, Naftali
Group Redundancy Measures Reveal Redundancy Reduction in the Auditory Pathway
Chechik, Gal, Globerson, Amir, Anderson, M. J., Young, E. D., Nelken, Israel, Tishby, Naftali
The way groups of auditory neurons interact to code acoustic information is investigated using an information theoretic approach. We develop measures of redundancy among groups of neurons, and apply them to the study of collaborative coding efficiency in two processing stations in the auditory pathway: the inferior colliculus (IC) and the primary auditory cortex (AI). Under two schemes for the coding of the acoustic content, acoustic segments coding and stimulus identity coding, we show differences both in information content and group redundancies between IC and AI neurons. These results provide, for the first time, direct evidence for redundancy reduction along the ascending auditory pathway, as has been hypothesized on theoretical grounds [Barlow 1959, 2001]. The redundancy effects under the single-spikes coding scheme are significant only for groups larger than ten cells, and cannot be revealed with redundancy measures that use only pairs of cells. The results suggest that the auditory system transforms low-level representations that contain redundancies due to the statistical structure of natural stimuli into a representation in which cortical neurons extract rare and independent components of complex acoustic signals that are useful for auditory scene analysis.
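The group redundancy idea in the abstract can be illustrated with a minimal sketch: the redundancy of a group of cells about a stimulus is the sum of the single-cell informations minus the information carried jointly by the whole group. This is an illustrative toy implementation with invented names and data, not the paper's actual code or measure.

```python
# Hedged sketch: group redundancy = sum of single-cell mutual informations
# about the stimulus, minus the joint information of the whole group.
# Positive values indicate redundant coding. Names and toy data are
# illustrative assumptions, not taken from the paper.
import numpy as np
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information (in bits) between two discrete sequences."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p_joint / (p_x * p_y) written with counts to avoid extra divisions
        mi += p_joint * np.log2(p_joint * n * n / (px[x] * py[y]))
    return mi

def group_redundancy(responses, stimuli):
    """Sum of single-cell informations minus the group's joint information."""
    joint = [tuple(r) for r in zip(*responses)]  # group response as one symbol
    single = sum(mutual_information(r, stimuli) for r in responses)
    return single - mutual_information(joint, stimuli)

# Two cells that copy a binary stimulus exactly are maximally redundant:
stim = [0, 1, 0, 1, 0, 1, 0, 1]
print(round(group_redundancy([stim, stim], stim), 3))  # → 1.0
```

With two identical copies of a one-bit stimulus, each cell carries 1 bit and the pair jointly carries 1 bit, so the redundancy is 1 bit.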
Universality and Individuality in a Neural Code
Schneidman, Elad, Brenner, Naama, Tishby, Naftali, Steveninck, Robert R. de Ruyter van, Bialek, William
This basic question in the theory of knowledge seems to be beyond the scope of experimental investigation. An accessible version of this question is whether different observers of the same sense data have the same neural representation of these data: how much of the neural code is universal, and how much is individual? Differences in the neural codes of different individuals may arise from various sources: First, different individuals may use different 'vocabularies' of coding symbols. Second, they may use the same symbols to encode different stimulus features. Third, they may have different latencies, so they 'say' the same things at slightly different times.
Data Clustering by Markovian Relaxation and the Information Bottleneck Method
Tishby, Naftali, Slonim, Noam
We introduce a new, nonparametric and principled, distance-based clustering method. This method combines a pairwise-based approach with a vector-quantization method, which provides a meaningful interpretation of the resulting clusters. The idea is based on turning the distance matrix into a Markov process and then examining the decay of mutual information during the relaxation of this process. The clusters emerge as quasi-stable structures during this relaxation, and are then extracted using the information bottleneck method.
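The relaxation idea in the abstract can be sketched in a few lines: convert pairwise distances into a row-stochastic transition matrix, iterate the chain, and track how the mutual information between the starting point and the state after t steps decays. This is a minimal illustrative sketch under assumed names and a tiny toy distance matrix, not the paper's implementation.

```python
# Hedged sketch of the Markovian-relaxation step: distances -> Markov chain,
# then track I(start state; state after t steps) as the chain relaxes.
# All names and the toy distance matrix are illustrative assumptions.
import numpy as np

def relaxation_information(dist, beta=1.0, steps=5):
    """Return I(X_0; X_t) in bits for t = 1..steps of the relaxation."""
    P = np.exp(-beta * dist)
    P /= P.sum(axis=1, keepdims=True)          # row-stochastic transition matrix
    n = len(dist)
    p0 = np.full(n, 1.0 / n)                   # uniform distribution over starts
    infos, Pt = [], np.eye(n)
    for _ in range(steps):
        Pt = Pt @ P                            # t-step transition probabilities
        joint = p0[:, None] * Pt               # p(start, state_t)
        pt = joint.sum(axis=0)                 # marginal over states at time t
        ratio = np.where(joint > 0, joint / (p0[:, None] * pt[None, :]), 1.0)
        infos.append(float((joint * np.log2(ratio)).sum()))
    return infos

# Two well-separated pairs of points: information decays quickly within each
# pair but persists across the pair boundary, marking the cluster structure.
d = np.array([[0., 1., 9., 9.],
              [1., 0., 9., 9.],
              [9., 9., 0., 1.],
              [9., 9., 1., 0.]])
curve = relaxation_information(d, steps=8)
```

By the data processing inequality the curve is non-increasing; quasi-stable plateaus in its decay correspond to cluster structure at different scales, which the paper then extracts with the information bottleneck method.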
Agglomerative Information Bottleneck
Slonim, Noam, Tishby, Naftali