Scaling Equilibrium Propagation to Deeper Neural Network Architectures

Sankar Vinayak Elayedam, Gopalakrishnan Srinivasan

arXiv.org Artificial Intelligence 

Abstract--Equilibrium propagation has been proposed as a biologically plausible alternative to the backpropagation algorithm. The local nature of its gradient computations, combined with the use of convergent RNNs to reach equilibrium states, makes this approach well-suited for implementation on neuromorphic hardware. However, previous studies on equilibrium propagation have been restricted to networks containing only dense layers or to relatively small architectures with a few convolutional layers followed by a final dense layer. These networks exhibit a significant accuracy gap compared to similarly sized feedforward networks trained with backpropagation. In this work, we introduce the Hopfield-Resnet architecture, which incorporates residual (or skip) connections in Hopfield networks with clipped ReLU as the activation function. The proposed architectural enhancements enable the training of networks with nearly twice the number of layers reported in prior works.
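To make the training procedure concrete, the following is a minimal sketch of equilibrium propagation on a toy single-layer energy-based network, using the clipped ReLU activation mentioned in the abstract. This is an illustrative reconstruction of the generic two-phase scheme (a free relaxation phase, then a weakly nudged phase, with a local contrastive weight update), not the paper's actual Hopfield-Resnet model; the energy function, dimensions, learning rates, and the identity-mapping task are all assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def rho(s):
    # Clipped ReLU activation: rho(s) = min(max(s, 0), 1).
    return np.clip(s, 0.0, 1.0)

def drho(s):
    # Derivative of clipped ReLU (1 inside [0, 1], 0 outside).
    return ((s >= 0) & (s <= 1)).astype(float)

def relax(s, x, W, b, beta, target, n_steps=30, dt=0.5):
    # Gradient-descent relaxation of the state s toward a minimum of a
    # toy Hopfield-style energy E = 0.5*||s||^2 - rho(s)^T (W x + b),
    # optionally nudged by beta times the output cost 0.5*||rho(s)-target||^2.
    for _ in range(n_steps):
        grad = s - drho(s) * (W @ x + b)
        if beta:
            grad += beta * (rho(s) - target) * drho(s)
        s = s - dt * grad
    return s

# Toy task (an assumption for illustration): learn the identity map.
X = rng.uniform(0.2, 0.8, size=(20, 3))
W = rng.normal(0.0, 0.1, size=(3, 3))
b = np.zeros(3)
beta, lr = 0.5, 0.1

def mse():
    err = 0.0
    for x in X:
        s = relax(np.zeros(3), x, W, b, 0.0, None)
        err += np.mean((rho(s) - x) ** 2)
    return err / len(X)

loss_before = mse()
for _ in range(50):
    for x in X:
        # Phase 1: free relaxation to an equilibrium state.
        s_free = relax(np.zeros(3), x, W, b, 0.0, None)
        # Phase 2: nudged relaxation, weakly pulled toward the target.
        s_nudge = relax(s_free, x, W, b, beta, x)
        # Contrastive update: local in pre-/post-synaptic activity, no
        # backpropagated error signals are needed.
        W += (lr / beta) * np.outer(rho(s_nudge) - rho(s_free), x)
        b += (lr / beta) * (rho(s_nudge) - rho(s_free))
loss_after = mse()
```

The key property illustrated here is locality: each weight update depends only on the activities of the two units the weight connects, measured in the two phases, which is what makes the method attractive for neuromorphic hardware.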