Review: Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks

Neural Information Processing Systems 

This paper proposes a new memory layout for recurrent neural networks that is (1) theoretically grounded and (2) able to maintain memory over windows orders of magnitude longer than traditional approaches at comparable parameter cost. These claims are also confirmed experimentally. This work is definitely of interest to the NeurIPS community and would be a great contribution to the conference.
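For context, the memory mechanism under review is built on a continuous-time linear system whose state-space matrices derive from Legendre polynomials. The sketch below, a non-authoritative illustration assuming the standard (A, B) construction from the paper and a simple Euler discretization (function names `lmu_matrices` and `lmu_memory_step` are my own), shows how the memory state could be updated:

```python
import numpy as np

def lmu_matrices(d):
    """Continuous-time state-space matrices (A, B) of the Legendre
    delay system, with d the order of the memory (state dimension)."""
    A = np.zeros((d, d))
    for i in range(d):
        for j in range(d):
            # a_ij = (2i+1) * (-1 if i < j else (-1)^(i-j+1))
            A[i, j] = (2 * i + 1) * (-1.0 if i < j else (-1.0) ** (i - j + 1))
    i = np.arange(d)
    # b_i = (2i+1) * (-1)^i, as a column vector
    B = ((2 * i + 1) * (-1.0) ** i).reshape(d, 1)
    return A, B

def lmu_memory_step(m, u, A, B, dt, theta):
    """One Euler step of theta * m'(t) = A m(t) + B u(t):
    m is the (d, 1) memory state, u a scalar input, theta the
    length of the time window being represented."""
    return m + (dt / theta) * (A @ m + B * u)

# Tiny usage example: feed a constant input through a few steps.
A, B = lmu_matrices(4)
m = np.zeros((4, 1))
for _ in range(10):
    m = lmu_memory_step(m, 1.0, A, B, dt=0.01, theta=1.0)
```

The key point for the review's claim about long memory is that (A, B) are fixed by theory rather than learned, so the parameter cost of the memory does not grow with the window length theta.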