
Collaborating Authors: Frasconi, Paolo


Credit Assignment through Time: Alternatives to Backpropagation

Neural Information Processing Systems

Learning to recognize or predict sequences using long-term context has many applications. However, practical and theoretical problems are found in training recurrent neural networks to perform tasks in which input/output dependencies span long intervals. Starting from a mathematical analysis of the problem, we consider and compare alternative algorithms and architectures on tasks for which the span of the input/output dependencies can be controlled. Results on the new algorithms show performance qualitatively superior to that obtained with backpropagation.

1 Introduction

Recurrent neural networks have been proposed for learning to map input sequences to output sequences. Machines that could efficiently learn such tasks would be useful for many applications involving sequence prediction, recognition or production.
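The difficulty the abstract alludes to is that, when backpropagating through many time steps, the gradient reaching an early hidden state is a product of per-step Jacobians and can shrink exponentially with the span of the dependency. The following minimal sketch (not the paper's own analysis; the network size, weight scale, and sequence length are arbitrary, illustrative choices) shows this decay numerically for a simple tanh recurrent network.

import numpy as np

# Illustrative sketch of vanishing gradients in a simple tanh RNN.
# The gradient d h_T / d h_0 is a product of per-step Jacobians
# diag(1 - h_t^2) @ W; its norm typically shrinks exponentially
# as the span T grows, which makes long-range dependencies hard
# to learn with plain backpropagation through time.

rng = np.random.default_rng(0)
hidden_size = 20
W = rng.normal(scale=0.8 / np.sqrt(hidden_size),
               size=(hidden_size, hidden_size))

h = rng.normal(size=hidden_size)
grad = np.eye(hidden_size)  # accumulated Jacobian d h_t / d h_0

for t in range(1, 101):
    h = np.tanh(W @ h)
    # Jacobian of one step: diag(1 - tanh^2) @ W
    step_jac = (1.0 - h ** 2)[:, None] * W
    grad = step_jac @ grad
    if t % 20 == 0:
        print(f"span {t:3d}: ||d h_t / d h_0|| = {np.linalg.norm(grad):.3e}")

Running the sketch prints Jacobian norms that fall by many orders of magnitude as the span increases, which is the practical obstacle motivating the alternatives to backpropagation studied in the paper.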