Generalization of Backpropagation to Recurrent and Higher Order Neural Networks
Neural Information Processing Systems
The propagation of activation in these networks is governed by dissipative differential equations, and the error signal is backpropagated by integrating an associated differential equation. The method is introduced by applying it to the recurrent generalization of the feedforward backpropagation network, and is then extended to higher order networks and to a constrained dynamical system for training a content-addressable memory. The essential feature of these adaptive algorithms is that the adaptation equation has a simple outer-product form.
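The scheme the abstract describes can be sketched for the recurrent case: relax the dissipative activation dynamics to a fixed point, relax an associated linear error ODE to its fixed point, then apply an outer-product weight update. The network size, logistic nonlinearity, input/output unit choices, and integration parameters below are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5                                   # total units (illustrative)
out = [3, 4]                            # indices treated as output units
W = rng.normal(scale=0.1, size=(n, n))  # recurrent weight matrix
I = np.zeros(n); I[0] = 1.0             # constant external input to unit 0
target = np.array([0.2, 0.8])           # desired fixed-point activity at `out`

g = lambda u: 1.0 / (1.0 + np.exp(-u))  # logistic nonlinearity (assumed)
gp = lambda u: g(u) * (1.0 - g(u))      # its derivative

def relax(f, x0, steps=500, dt=0.1):
    # Euler-integrate the dissipative ODE dx/dt = f(x) toward its fixed point.
    x = x0
    for _ in range(steps):
        x = x + dt * f(x)
    return x

eta, errors = 0.5, []
for _ in range(200):
    # Activation dynamics: dx/dt = -x + g(W x) + I, relaxed to a fixed point.
    x = relax(lambda v: -v + g(W @ v) + I, np.zeros(n))
    u = W @ x
    J = np.zeros(n); J[out] = target - x[out]   # error injected at outputs
    errors.append(float(np.sum(J ** 2)))
    # Associated error dynamics: dy/dt = -y + W^T (g'(u) * y) + J.
    y = relax(lambda v: -v + W.T @ (gp(u) * v) + J, np.zeros(n))
    # Outer-product update: dW[r, s] = eta * g'(u_r) * y_r * x_s.
    W += eta * np.outer(gp(u) * y, x)

print(f"squared error: {errors[0]:.4f} -> {errors[-1]:.4f}")
```

The adaptation step is a single outer product of the relaxed error signal with the relaxed activations, which is the "essential feature" the abstract highlights: no unrolling of the dynamics through time is needed, only two fixed-point relaxations per update.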