LightRNN: Memory and Computation-Efficient Recurrent Neural Networks

Xiang Li, Tao Qin, Jian Yang, Tie-Yan Liu

Neural Information Processing Systems 

While RNNs are becoming increasingly popular, they have a known limitation: when applied to text corpora with large vocabularies, the model becomes very large. For instance, when RNNs are used for language modeling, a word is first mapped from a one-hot vector (whose dimension equals the vocabulary size) to an embedding vector by an input-embedding matrix.
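A minimal sketch of the problem the abstract describes: with a standard input-embedding matrix, the parameter count grows linearly with the vocabulary size, and the one-hot-to-embedding mapping reduces to a row lookup. The vocabulary and dimension figures below are illustrative assumptions, not numbers from the paper.

```python
import numpy as np

# Hypothetical sizes to illustrate the memory cost of a conventional
# input-embedding matrix (the bottleneck LightRNN targets).
vocab_size = 10_000_000   # assumed web-scale vocabulary
embed_dim = 1024          # assumed embedding dimension

# One row per vocabulary word: parameters scale linearly with vocab size.
num_params = vocab_size * embed_dim
gigabytes_fp32 = num_params * 4 / 1e9
print(f"{num_params:,} parameters, ~{gigabytes_fp32:.1f} GB at float32")

# Multiplying a one-hot vector by the matrix just selects one row,
# so the lookup is implemented as row indexing in practice.
rng = np.random.default_rng(0)
embedding = rng.standard_normal((1000, 8)).astype(np.float32)  # small demo matrix
word_id = 42
one_hot = np.zeros(1000, dtype=np.float32)
one_hot[word_id] = 1.0
assert np.allclose(one_hot @ embedding, embedding[word_id])
```

The linear growth in both memory and softmax computation is precisely what motivates LightRNN's more compact word representation.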
