Review for NeurIPS paper: Can the Brain Do Backpropagation? --- Exact Implementation of Backpropagation in Predictive Coding Networks

Neural Information Processing Systems

Weaknesses: I have some critical remarks: 1.) The weight-transport problem. This problem is not solved by the model; in fact, the model needs symmetric weights. Feedback alignment will probably not work here, as I assume that the existence of an equilibrium state necessitates symmetric weights. The authors claim that the update rules are local.
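To make the reviewer's point concrete, here is a toy numeric sketch (my own, not from the paper): in the model's inference dynamics, the error reaching a hidden layer is propagated through the transpose of the forward weights, so substituting an independent random feedback matrix B (as feedback alignment would) yields a different propagated error and hence a different weight update. The matrix B and all variable names here are hypothetical illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
W2 = rng.standard_normal((2, 4))   # forward weights of the top layer
B  = rng.standard_normal((4, 2))   # hypothetical random feedback weights
e2 = rng.standard_normal((2, 1))   # output-layer prediction error

# The inference dynamics propagate error to the hidden layer via W2.T,
# i.e., the feedback pathway must mirror the forward weights exactly
# (the weight-transport problem).
e1_symmetric = W2.T @ e2           # what the model requires
e1_random_fb = B @ e2              # feedback-alignment substitute

# The two propagated errors differ, so the resulting local weight
# updates (which multiply e1 by the presynaptic activity) differ too.
print(np.allclose(e1_symmetric, e1_random_fb))  # False
```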


Review for NeurIPS paper: Can the Brain Do Backpropagation? --- Exact Implementation of Backpropagation in Predictive Coding Networks


Following the author response, we had a long discussion. On the positive side, this is the first algorithm with local update rules that exactly simulates BP (at least asymptotically, given complete convergence at initialization). On the negative side, all reviewers agreed that this algorithm has somewhat reduced plausibility. Specifically, in IL (the original PCN), we present both input and output and wait sufficiently long for convergence. In contrast, in Z-IL and Fa-Z-IL, we must first present (only) the input, again wait sufficiently long for convergence, and only then present the output. In addition, the learning rule becomes more complicated (through the introduction of the Phi function), and we must detect when "the change in an error node is caused by feedback input" (which seems to require some global signal). This seems more complicated and less plausible than the original IL.


Can the Brain Do Backpropagation? --- Exact Implementation of Backpropagation in Predictive Coding Networks


Backpropagation (BP) has been the most successful algorithm used to train artificial neural networks. However, there are several gaps between BP and learning in biologically plausible neuronal networks of the brain (learning in the brain, or simply BL, for short): (1) it has been unclear to date whether BP can be implemented exactly via BL; (2) BP lacks local plasticity, i.e., weight updates require information that is not locally available, while BL utilizes only locally available information; and (3) BP lacks autonomy, i.e., some external control over the neural network is required (e.g., switching between prediction and learning stages requires changes to dynamics and synaptic plasticity rules), while BL works fully autonomously. Bridging such gaps, i.e., understanding how BP can be approximated by BL, has been of major interest in both neuroscience and machine learning. Despite tremendous efforts, however, no previous model has bridged the gaps to the degree of demonstrating an equivalence to BP; instead, only approximations to BP have been shown. We propose a BL model that (1) produces \emph{exactly the same} updates of the neural weights as BP, while (2) employing local plasticity, i.e., all neurons perform only local computations, done simultaneously.
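The claimed exact equivalence can be checked numerically in a minimal setting. The sketch below is my own illustration, not the authors' code: it assumes a two-weight linear network, feedforward initialization of the value nodes, output clamped to the target, and inference step size 1, under which updating each weight matrix at a layer-dependent timestep (the Z-IL schedule described in the reviews above) reproduces BP's gradients exactly. All variable names are mine.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear network: x0 -> W1 -> x1 -> W2 -> x2
W1 = rng.standard_normal((4, 3))
W2 = rng.standard_normal((2, 4))
x0 = rng.standard_normal((3, 1))   # input
y  = rng.standard_normal((2, 1))   # target

# --- Standard BP updates for the loss 0.5 * ||y - W2 W1 x0||^2 ---
x1_ff = W1 @ x0
x2_ff = W2 @ x1_ff
delta2 = y - x2_ff                 # output error
bp_dW2 = delta2 @ x1_ff.T
bp_dW1 = (W2.T @ delta2) @ x0.T

# --- Z-IL schedule: feedforward init, output clamped to y, step size 1 ---
x1 = x1_ff.copy()                  # hidden node starts at feedforward value
x2 = y.copy()                      # output node clamped to the target
zil_dW = {}
for t in range(2):
    e1 = x1 - W1 @ x0              # local prediction errors
    e2 = x2 - W2 @ x1
    if t == 0:                     # update W_l at timestep t = L - l
        zil_dW[2] = e2 @ x1.T
    if t == 1:
        zil_dW[1] = e1 @ x0.T
    x1 = x1 + (-e1 + W2.T @ e2)    # one inference (relaxation) step

print(np.allclose(zil_dW[2], bp_dW2))  # True
print(np.allclose(zil_dW[1], bp_dW1))  # True
```

At t = 0 the hidden node still holds its feedforward value, so the top-layer update equals BP's; one inference step then carries the error exactly one layer down, making the next update equal BP's as well. With a nonlinear activation the same schedule requires the derivative terms from the paper, which this linear sketch omits.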