Reservoir memory machines

Paaßen, Benjamin; Schulz, Alexander

arXiv.org Machine Learning 

While neural networks have achieved impressive successes in domains like image classification or machine translation, standard models still struggle with tasks that require very long-term memory without interference and would thus benefit from a separation of memory and computation [Graves et al., 2016, Collier and Beel, 2018]. Neural Turing Machines (NTMs) attempt to address such tasks by augmenting recurrent neural networks with an explicit memory to which the network has read and write access [Graves et al., 2016, Collier and Beel, 2018]. Unfortunately, such models are notoriously hard to train, even compared to other deep learning models [Collier and Beel, 2018]. In our contribution, we propose to address this training problem by replacing the learned recurrent neural network controller of an NTM with an echo state network (ESN) [Jaeger and Haas, 2004]. In other words, we learn only the controllers for the read and write heads of our memory access, as well as the output mapping, all of which is possible via standard linear regression.
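To make the ESN idea concrete, here is a minimal sketch of a plain echo state network with a linear-regression readout. This illustrates only the general ESN principle referenced above (fixed random reservoir, trained linear readout), not the authors' reservoir memory machine; all sizes, scalings, and the delay-recall toy task are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100  # assumed toy dimensions

# Fixed random input and recurrent weights; the recurrent matrix is
# rescaled to spectral radius 0.9, a common heuristic for the echo
# state property.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(U):
    """Drive the reservoir with input sequence U (T x n_in); return all states."""
    h = np.zeros(n_res)
    H = np.empty((len(U), n_res))
    for t, u in enumerate(U):
        h = np.tanh(W_in @ u + W @ h)
        H[t] = h
    return H

# Toy memory task (illustrative): reproduce the input delayed by 3 steps.
T, delay = 500, 3
U = rng.uniform(-1.0, 1.0, (T, n_in))
Y = np.roll(U, delay, axis=0)
Y[:delay] = 0.0

H = run_reservoir(U)

# Only the readout is learned, via ridge regression (linear regression
# with a small Tikhonov regularizer for numerical stability).
lam = 1e-6
W_out = np.linalg.solve(H.T @ H + lam * np.eye(n_res), H.T @ Y).T

Y_pred = H @ W_out.T
mse = np.mean((Y_pred[delay:] - Y[delay:]) ** 2)
```

The key property exploited here is that the reservoir itself is never trained: only `W_out` is fit, in closed form, which is exactly the kind of cheap linear training the abstract contrasts with gradient-based NTM training.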
