Hemmen, J. Leo van
Spike-Based Compared to Rate-Based Hebbian Learning
Kempter, Richard, Gerstner, Wulfram, Hemmen, J. Leo van
For example, a 'Hebbian' (Hebb 1949) learning rule which is driven by the correlations between presynaptic and postsynaptic rates may be used to generate neuronal receptive fields (e.g., Linsker 1986, MacKay and Miller 1990, Wimbauer et al. 1997) with properties similar to those of real neurons. A rate-based description, however, neglects effects which are due to the pulse structure of neuronal signals.
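As a rough illustration of the rate-based picture contrasted with the spike-based one, the following sketch implements a generic correlation-driven Hebbian update for a single linear rate neuron. It is not the specific rule analyzed in the paper or in the cited works; the learning rate, input statistics, and weight bound are hypothetical placeholders.

```python
import numpy as np

# Minimal sketch of a rate-based Hebbian update (illustrative only):
# the weight change is driven by the correlation between presynaptic
# and postsynaptic firing rates.

rng = np.random.default_rng(0)
n_pre = 20                       # number of presynaptic inputs (hypothetical)
eta = 0.01                       # learning rate (hypothetical)
w = rng.normal(0.0, 0.1, n_pre)  # feed-forward weights onto one postsynaptic cell

for _ in range(1000):
    nu_pre = rng.poisson(5.0, n_pre).astype(float)  # presynaptic rates
    nu_post = float(w @ nu_pre)                     # linear rate model of the postsynaptic cell
    w += eta * nu_post * nu_pre                     # Hebbian: product of pre- and postsynaptic rates
    w = np.clip(w, 0.0, 1.0)                        # crude bound to keep weights from diverging
```

Because such a rule sees only rates, any structure carried by the precise timing of individual spikes is invisible to it; that is the limitation the spike-based comparison addresses.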
Temporal coding in the sub-millisecond range: Model of barn owl auditory pathway
Kempter, Richard, Gerstner, Wulfram, Hemmen, J. Leo van, Wagner, Hermann
Binaural coincidence detection is essential for the localization of external sounds and requires auditory signal processing with high temporal precision. We present an integrate-and-fire model of spike processing in the auditory pathway of the barn owl. It is shown that a temporal precision in the microsecond range can be achieved with neuronal time constants which are at least one order of magnitude longer. An important feature of our model is an unsupervised Hebbian learning rule which leads to a temporal fine tuning of the neuronal connections.
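To illustrate the coincidence-detection idea in its simplest form, the sketch below drives a leaky integrate-and-fire unit with a left-ear spike train and a delayed copy of it for the right ear. It is a toy model under simplified assumptions, not the paper's auditory-pathway model; the time constant, threshold, and input rate are hypothetical values chosen only to show that coincident inputs dominate the output.

```python
import numpy as np

# Illustrative leaky integrate-and-fire coincidence detector (a sketch):
# the unit fires mainly when spikes from the two ears arrive within a
# fraction of its membrane time constant.

dt = 1e-5      # time step: 10 microseconds
tau_m = 1e-4   # membrane time constant: 0.1 ms
theta = 1.5    # firing threshold (hypothetical units)
w = 1.0        # synaptic weight per input spike

def count_output_spikes(delay_s, t_max=0.05, rate=500.0, seed=0):
    """Count output spikes when the right-ear train is a delayed copy
    of the left-ear train (interaural time difference = delay_s)."""
    rng = np.random.default_rng(seed)
    left = rng.random(int(t_max / dt)) < rate * dt   # Poisson left-ear spikes
    right = np.roll(left, int(round(delay_s / dt)))  # delayed copy for the right ear
    u, n_out = 0.0, 0
    for l, r in zip(left, right):
        u *= np.exp(-dt / tau_m)   # leaky integration
        u += w * (l + r)           # add synaptic input from both ears
        if u >= theta:             # threshold crossing -> output spike
            n_out += 1
            u = 0.0                # reset after a spike
    return n_out

# Coincident inputs (zero interaural delay) drive many more output spikes
# than inputs offset by a few hundred microseconds.
print(count_output_spikes(0.0), count_output_spikes(300e-6))
```

In the paper, the corresponding tuning of delays and weights is not hand-set as above but emerges from an unsupervised Hebbian learning rule.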
How to Describe Neuronal Activity: Spikes, Rates, or Assemblies?
Gerstner, Wulfram, Hemmen, J. Leo van
What is the 'correct' theoretical description of neuronal activity? The analysis of the dynamics of a globally connected network of spiking neurons (the Spike Response Model) shows that a description by mean firing rates is possible only if active neurons fire incoherently. If firing occurs coherently or with spatiotemporal correlations, the spike structure of the neural code becomes relevant. Alternatively, neurons can be gathered into local or distributed ensembles or 'assemblies'. A description based on the mean ensemble activity is, in principle, possible but the interaction between different assemblies becomes highly nonlinear. A description with spikes should therefore be preferred.
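For readers unfamiliar with the Spike Response Model, the sketch below evaluates its membrane potential as a sum of a refractory kernel triggered by the neuron's own spikes and postsynaptic response kernels triggered by presynaptic spikes. The kernel shapes and parameters are hypothetical illustration choices, not those of the paper.

```python
import numpy as np

# Sketch of the Spike Response Model potential (simplified kernels):
# u(t) = sum_f eta(t - t_f) + sum_j w_j sum_f eps(t - t_j^f)

def eta_kernel(s, eta0=-5.0, tau_refr=2.0):
    """Refractory kernel: hyperpolarization after the neuron's own spike (ms)."""
    return 0.0 if s < 0 else eta0 * np.exp(-s / tau_refr)

def eps_kernel(s, tau_s=1.0):
    """Postsynaptic response kernel for an incoming spike (ms)."""
    return 0.0 if s < 0 else (s / tau_s) * np.exp(1.0 - s / tau_s)

def srm_potential(t, own_spikes, input_spikes, weights):
    """Membrane potential at time t given past output and input spike times."""
    u = sum(eta_kernel(t - tf) for tf in own_spikes)
    for w_j, spikes_j in zip(weights, input_spikes):
        u += w_j * sum(eps_kernel(t - tf) for tf in spikes_j)
    return u

# Example: one past output spike, two presynaptic neurons with a few spikes each.
print(srm_potential(10.0, own_spikes=[8.0],
                    input_spikes=[[7.5, 9.0], [9.5]], weights=[0.8, 1.2]))
```

A rate description replaces the explicit spike times in this sum by averages, which is why it only remains valid when the neurons fire incoherently.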
Statistical Mechanics of Temporal Association in Neural Networks
Herz, Andreas V. M., Li, Zhaoping, Hemmen, J. Leo van
Basic computational functions of associative neural structures may be analytically studied within the framework of attractor neural networks where static patterns are stored as stable fixed-points for the system's dynamics. If the interactions between single neurons are instantaneous and mediated by symmetric couplings, there is a Lyapunov function for the retrieval dynamics (Hopfield 1982). The global computation corresponds in that case to a downhill motion in an energy landscape created by the stored information. Methods of equilibrium statistical mechanics may be applied and permit a quantitative analysis of the asymptotic network behavior (Amit et al. 1985, 1987). The existence of a Lyapunov function is thus of great conceptual as well as technical importance. Nevertheless, one should be aware that environmental inputs to a neural net always provide information in both space and time. It is therefore desirable to extend the original Hopfield scheme and to explore possibilities for a joint representation of static patterns and temporal associations.
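To make the "downhill motion in an energy landscape" concrete, the following sketch stores a few random patterns in a Hopfield network with symmetric Hebbian couplings and retrieves one of them by asynchronous updates, under which the energy is non-increasing. Network size, number of patterns, and noise level are arbitrary illustration values; the paper's extension to temporal associations goes beyond this static case.

```python
import numpy as np

# Sketch of a Hopfield network (Hopfield 1982): with symmetric couplings and
# asynchronous updates, E = -1/2 * sum_ij w_ij s_i s_j is a Lyapunov function,
# so retrieval is a downhill motion in the energy landscape.

rng = np.random.default_rng(1)
N, P = 100, 5
patterns = rng.choice([-1, 1], size=(P, N))   # stored static patterns
w = (patterns.T @ patterns) / N               # symmetric Hebbian couplings
np.fill_diagonal(w, 0.0)                      # no self-couplings

def energy(s):
    return -0.5 * s @ w @ s

# Start from a noisy version of the first pattern and update asynchronously.
s = patterns[0].copy()
flip = rng.choice(N, size=20, replace=False)
s[flip] *= -1

for _ in range(5 * N):
    i = rng.integers(N)
    s[i] = 1 if w[i] @ s >= 0 else -1         # asynchronous threshold update

# Energy of the stored pattern, energy after retrieval, and overlap with the pattern.
print(energy(patterns[0]), energy(s), np.mean(s == patterns[0]))
```

Temporal associations require couplings that are no longer symmetric or instantaneous, which is precisely why the existence of a Lyapunov function, and with it the equilibrium statistical mechanics treatment, has to be re-examined.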