Information Maximization in Single Neurons
Stemmler, Martin, Koch, Christof
Information from the senses must be compressed into the limited range of firing rates generated by spiking nerve cells. Optimal compression uses all firing rates equally often, implying that the nerve cell's response matches the statistics of naturally occurring stimuli. Since changing the voltage-dependent ionic conductances in the cell membrane alters the flow of information, an unsupervised, non-Hebbian, developmental learning rule is derived to adapt the conductances in Hodgkin-Huxley model neurons. By maximizing the rate of information transmission, each firing rate within the model neuron's limited dynamic range is used equally often. An efficient neuronal representation of incoming sensory information should take advantage of the regularity and scale invariance of stimulus features in the natural world. In the case of vision, this regularity is reflected in the typical probabilities of encountering particular visual contrasts, spatial orientations, or colors [1]. Given these probabilities, an optimized neural code would eliminate any redundancy, while devoting increased representation to commonly encountered features. At the level of a single spiking neuron, information about a potentially large range of stimuli is compressed into a finite range of firing rates, since the maximum firing rate of a neuron is limited. Optimizing the information transmission through a single neuron in the presence of uniform, additive noise has an intuitive interpretation: the most efficient representation of the input uses every firing rate with equal probability.
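The abstract's central claim, that optimal compression uses every firing rate equally often, is the principle of histogram equalization: if a neuron's tuning curve follows the cumulative distribution function of its inputs, its output is uniformly distributed. A minimal sketch of this idea follows; the exponential stimulus distribution and the 100 Hz rate ceiling are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

# Sketch of histogram equalization: mapping stimuli through their
# empirical cumulative distribution function (CDF) makes every output
# level (firing rate) equally probable across the dynamic range.

rng = np.random.default_rng(0)
# Skewed, "naturalistic" stimulus distribution (assumed for illustration).
stimuli = rng.exponential(scale=1.0, size=100_000)

sorted_s = np.sort(stimuli)

def cdf(x):
    # Fraction of stimulus samples at or below each value of x.
    return np.searchsorted(sorted_s, x, side="right") / len(sorted_s)

max_rate = 100.0  # Hz, assumed ceiling of the neuron's dynamic range
rates = max_rate * cdf(stimuli)  # tuning curve = scaled stimulus CDF

# The resulting firing rates are (near-)uniform on [0, max_rate]:
counts, _ = np.histogram(rates, bins=10, range=(0, max_rate))
```

Binning the resulting rates shows each firing-rate interval used with essentially equal frequency, which is the sense in which the neuron's response "matches the statistics" of its inputs.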
Synchronization, oscillations, and 1/f noise in networks of spiking neurons
Stemmler, Martin, Usher, Marius, Koch, Christof, Olami, Zeev
The model consists of a two-dimensional sheet of leaky integrate-and-fire neurons with feedback connectivity consisting of local excitation and surround inhibition. Each neuron is independently driven by homogeneous external noise. Spontaneous symmetry breaking occurs, resulting in the formation of "hotspots" of activity in the network. These localized patterns of excitation appear as clusters that coalesce, disintegrate, or fluctuate in size while simultaneously moving in a random walk constrained by the interaction with other clusters. The emergent cross-correlation functions have a dual structure, with a sharp peak around zero on top of a much broader hill.