Skip-Thought Vectors
Kiros, Ryan, Zhu, Yukun, Salakhutdinov, Ruslan R., Zemel, Richard, Urtasun, Raquel, Torralba, Antonio, Fidler, Sanja
We describe an approach for unsupervised learning of a generic, distributed sentence encoder. Using the continuity of text from books, we train an encoder-decoder model that tries to reconstruct the surrounding sentences of an encoded passage. Sentences that share semantic and syntactic properties are thus mapped to similar vector representations. We next introduce a simple vocabulary expansion method to encode words that were not seen as part of training, allowing us to expand our vocabulary to a million words. After training our model, we extract and evaluate our vectors with linear models on 8 tasks: semantic relatedness, paraphrase detection, image-sentence ranking, question-type classification and 4 benchmark sentiment and subjectivity datasets. The end result is an off-the-shelf encoder that can produce highly generic sentence representations that are robust and perform well in practice. We will make our encoder publicly available.
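The abstract describes the objective only at a high level: encode one sentence and train decoders to reconstruct its neighbours. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch (the authors' released code used Theano, and the exact architecture details here are assumptions apart from the paper's uni-skip sizes: a 20k-word vocabulary, 620-dimensional word embeddings, and a 2400-dimensional GRU encoder). Module names and the toy data are invented for illustration.

```python
# Minimal skip-thought training sketch (assumed PyTorch; not the authors' implementation).
import torch
import torch.nn as nn

VOCAB, EMB, HID = 20000, 620, 2400  # sizes roughly following the paper's uni-skip setup

class SkipThought(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMB)
        self.encoder = nn.GRU(EMB, HID, batch_first=True)
        # Separate decoders reconstruct the previous and next sentences.
        self.dec_prev = nn.GRU(EMB, HID, batch_first=True)
        self.dec_next = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def encode(self, sent):
        # Final hidden state of the encoder GRU is the sentence vector.
        _, h = self.encoder(self.embed(sent))   # h: (1, batch, HID)
        return h

    def decode(self, decoder, h, target):
        # Teacher forcing: condition on the encoded sentence, feed the
        # shifted target sentence, and predict the next token at each step.
        inp = self.embed(target[:, :-1])
        out, _ = decoder(inp, h)
        logits = self.out(out)
        return nn.functional.cross_entropy(
            logits.reshape(-1, VOCAB), target[:, 1:].reshape(-1))

    def forward(self, prev_sent, sent, next_sent):
        h = self.encode(sent)
        return self.decode(self.dec_prev, h, prev_sent) + \
               self.decode(self.dec_next, h, next_sent)

# Toy usage: random token ids standing in for a (previous, current, next) sentence triple.
model = SkipThought()
batch = torch.randint(0, VOCAB, (2, 12))
loss = model(batch, batch, batch)
loss.backward()
```

After training, only the encoder is kept: its hidden state serves as the sentence representation that the paper feeds to simple linear models on the eight downstream tasks. The vocabulary expansion step mentioned in the abstract would then map embeddings from a large pretrained word space into this encoder's input space, but that mapping is not shown in the sketch.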
Neural Information Processing Systems
Dec-31-2015