Collaborating Authors

Sejnowski, Terrence J.


Neural Network Analysis of Distributed Representations of Dynamical Sensory-Motor Transformations in the Leech

Neural Information Processing Systems

Neural Network Analysis of Distributed Representations of Dynamical Sensory-Motor Transformations in the Leech. Shawn R. Lockery, Yan Fang, and Terrence J. Sejnowski, Computational Neurobiology Laboratory, Salk Institute for Biological Studies, Box 85800, San Diego, CA 92138. ABSTRACT: Interneurons in leech ganglia receive multiple sensory inputs and make synaptic contacts with many motor neurons. These "hidden" units coordinate several different behaviors. We used physiological and anatomical constraints to construct a model of the local bending reflex. Dynamical networks were trained on experimentally derived input-output patterns using recurrent back-propagation. Units in the model were modified to include electrical synapses and multiple synaptic time constants.
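The abstract describes dynamical units whose synaptic inputs are filtered through multiple time constants. The sketch below illustrates only the forward dynamics of such units, not the authors' implementation; the array shapes, Euler integration step, and tanh nonlinearity are our assumptions.

```python
import numpy as np

# Minimal sketch (assumed, not the paper's code) of dynamical network units
# whose synaptic currents are low-pass filtered with multiple time constants.

def simulate(weights, tau, inputs, dt=1.0, steps=100):
    """Leaky-integrator units; each synapse class has its own time constant.

    weights: (n_tau, n_units, n_units) one weight matrix per time constant
    tau:     (n_tau,) synaptic time constants
    inputs:  (n_units,) constant external drive (e.g., sensory input)
    """
    n_tau, n_units, _ = weights.shape
    s = np.zeros((n_tau, n_units))   # filtered synaptic currents
    y = np.zeros(n_units)            # unit outputs
    for _ in range(steps):
        for k in range(n_tau):
            # first-order filter: ds/dt = (-s + W y) / tau_k
            s[k] += dt / tau[k] * (-s[k] + weights[k] @ y)
        y = np.tanh(s.sum(axis=0) + inputs)  # sigmoidal output unit
    return y
```

In a recurrent back-propagation setting, the weight matrices would be fit to the experimentally derived input-output patterns; only the forward pass is sketched here.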


Combining Visual and Acoustic Speech Signals with a Neural Network Improves Intelligibility

Neural Information Processing Systems

Previous attempts at using these visual speech signals to improve automatic speech recognition systems have combined the acoustic and visual speech information at a symbolic level using heuristic rules. In this paper, we demonstrate an alternative approach to fusing the visual and acoustic speech information by training feedforward neural networks to map the visual signal onto the corresponding short-term spectral amplitude envelope (STSAE) of the acoustic signal. This information can be directly combined with the degraded acoustic STSAE. Significant improvements are demonstrated in vowel recognition from noise-degraded acoustic signals. These results are compared to the performance of humans, as well as other pattern matching and estimation algorithms. 1 INTRODUCTION: Current automatic speech recognition systems rely almost exclusively on the acoustic speech signal, and as a consequence, these systems often perform poorly in noisy environments.
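The fusion scheme described above maps visual features to an estimate of the acoustic envelope and combines it with the noisy envelope. The following sketch assumes illustrative layer sizes and an equal-weight linear combination; none of these choices come from the paper.

```python
import numpy as np

# Hypothetical sketch of STSAE-level fusion: a feedforward net estimates the
# acoustic short-term spectral amplitude envelope from visual features, and
# the estimate is blended with the degraded acoustic envelope.

rng = np.random.default_rng(0)
n_visual, n_hidden, n_bands = 25, 20, 32   # illustrative dimensions

W1 = rng.normal(0, 0.1, (n_hidden, n_visual))  # visual -> hidden weights
W2 = rng.normal(0, 0.1, (n_bands, n_hidden))   # hidden -> envelope weights

def visual_to_stsae(v):
    """Map a visual feature vector to an estimated spectral envelope."""
    h = np.tanh(W1 @ v)
    return W2 @ h

def fuse(noisy_stsae, visual_features, alpha=0.5):
    """Blend the degraded acoustic envelope with the visual estimate."""
    return alpha * noisy_stsae + (1 - alpha) * visual_to_stsae(visual_features)
```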


Storing Covariance by the Associative Long-Term Potentiation and Depression of Synaptic Strengths in the Hippocampus

Neural Information Processing Systems

We have tested this assumption in the hippocampus, a cortical structure of the brain that is involved in long-term memory. A brief, high-frequency activation of excitatory synapses in the hippocampus produces an increase in synaptic strength known as long-term potentiation, or LTP (Bliss and Lomo, 1973), that can last for many days. LTP is known to be Hebbian since it requires the simultaneous release of neurotransmitter from presynaptic terminals coupled with postsynaptic depolarization (Kelso et al., 1986; Malinow and Miller, 1986; Gustafsson et al., 1987). However, a mechanism for the persistent reduction of synaptic strength that could balance LTP has not yet been demonstrated. We studied the associative interactions between separate inputs onto the same dendritic trees of hippocampal pyramidal cells of field CA1, and found that a low-frequency input which, by itself, does not persistently change synaptic strength, can either increase (associative LTP) or decrease (associative long-term depression, or LTD) in strength depending upon whether it is positively or negatively correlated in time with a second, high-frequency bursting input. LTP of synaptic strength is Hebbian, and LTD is anti-Hebbian since it is elicited by pairing presynaptic firing with postsynaptic hyperpolarization sufficient to block postsynaptic activity.
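The sign-dependence on temporal correlation is what a covariance learning rule predicts: weights grow when pre- and postsynaptic activity covary positively and shrink when they covary negatively. A minimal sketch of such a rule follows; the learning rate and mean-subtraction over a trial window are our illustrative assumptions, not a model from the paper.

```python
import numpy as np

# Sketch of a covariance rule consistent with associative LTP/LTD:
# positive pre/post correlation strengthens a synapse (Hebbian LTP),
# negative correlation weakens it (anti-Hebbian LTD).

def covariance_update(w, pre, post, lr=0.01):
    """pre: (T, n_pre) presynaptic activity; post: (T,) postsynaptic activity.

    Returns updated weights of shape (n_pre,).
    """
    dpre = pre - pre.mean(axis=0)    # deviation from mean presynaptic rate
    dpost = post - post.mean()       # deviation from mean postsynaptic rate
    return w + lr * (dpre * dpost[:, None]).mean(axis=0)
```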


A 'Neural' Network that Learns to Play Backgammon

Neural Information Processing Systems

QUALITATIVE RESULTS: Analysis of the weights produced by training a network is an exceedingly difficult problem, which we have only been able to approach qualitatively. In Figure 1 we present a diagram showing the connection strengths in a network with 651 input units and no hidden units.
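The architecture named above, 651 input units and no hidden layer, is sketched below; because there are no hidden units, each learned weight maps directly to one input feature, which is what makes the Figure 1 inspection possible. The board encoding, logistic output, and move-selection loop are assumptions for illustration.

```python
import numpy as np

# Sketch (assumed, not the authors' code) of a hidden-layer-free move scorer:
# 651 inputs encoding a board-plus-move, one logistic output unit.

N_INPUTS = 651

w = np.zeros(N_INPUTS)   # one inspectable weight per input feature
b = 0.0

def score_move(x):
    """x: (651,) encoded candidate move; returns a scalar score in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-(w @ x + b)))

def best_move(candidates):
    """Pick the highest-scoring candidate encoding."""
    return max(candidates, key=score_move)
```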

