Complex Gated Recurrent Neural Networks

Moritz Wolter, Angela Yao

Neural Information Processing Systems 

Gating, as used in gated recurrent units (GRUs) [4] and long short-term memory (LSTM) networks [12], has become commonplace in recurrent architectures. Gates facilitate the learning of longer-term temporal relationships [12]. Furthermore, in the presence of noise in the input signal, gates can protect the cell state from undesired updates, thereby improving overall stability and convergence.
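As a concrete illustration of gating, the standard real-valued GRU update [4] can be sketched as below. This is a minimal NumPy sketch with hypothetical random toy parameters, not the paper's own (complex-valued) formulation: the update gate z interpolates between keeping the old state h and writing the candidate state, which is how a gate can shield the state from noisy inputs.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, params):
    """One GRU update step. The update gate z and reset gate r
    control how much of the previous state h is kept or overwritten."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)   # candidate state
    return (1.0 - z) * h + z * h_tilde              # gated interpolation

# Toy example with random (hypothetical) weights.
rng = np.random.default_rng(0)
d_in, d_h = 3, 4
params = (
    rng.standard_normal((d_h, d_in)), rng.standard_normal((d_h, d_h)), np.zeros(d_h),
    rng.standard_normal((d_h, d_in)), rng.standard_normal((d_h, d_h)), np.zeros(d_h),
    rng.standard_normal((d_h, d_in)), rng.standard_normal((d_h, d_h)), np.zeros(d_h),
)
h = np.zeros(d_h)
x = rng.standard_normal(d_in)
h = gru_step(x, h, params)
print(h.shape)
```

Because the new state is a convex combination of the old state and a tanh-bounded candidate, each entry of h stays in (-1, 1) here, reflecting the stabilizing effect of the gate.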
