Transfer between long-term and short-term memory using Conceptors

Strock, Anthony, Rougier, Nicolas, Hinaut, Xavier

arXiv.org Machine Learning 

The reservoir computing (RC) paradigm [9] is an unusual and economical way to train a recurrent neural network (RNN), because only the output layer is modified while the input and recurrent layers are kept fixed. Such RNNs are called reservoirs because they provide a pool of nonlinear computations driven by inputs. Many variants (such as Echo State Networks [8] and Liquid State Machines [15]), along with specific extensions of the RC paradigm, have been proposed since its initial introduction by [8] (for a review see [14]), including implementations in various hardware such as DNA- or laser-based ones (see [25] for a recent review on physical reservoirs). A recent and major enhancement of the RC paradigm, called Conceptors, has been proposed by Jaeger [10] (see Figure 1, which introduces the main concepts). Intuitively, a conceptor represents a subspace of internal states of an RNN, e.g. the trajectory of a reservoir when fed by some input.
