Skip-Thought Vectors
Ryan Kiros, Yukun Zhu, Russ R. Salakhutdinov, Richard Zemel, Raquel Urtasun, Antonio Torralba, Sanja Fidler
Neural Information Processing Systems
We describe an approach for unsupervised learning of a generic, distributed sentence encoder. Using the continuity of text from books, we train an encoder-decoder model that tries to reconstruct the surrounding sentences of an encoded passage. Sentences that share semantic and syntactic properties are thus mapped to similar vector representations. We next introduce a simple vocabulary expansion method to encode words that were not seen as part of training, allowing us to expand our vocabulary to a million words. After training our model, we extract and evaluate our vectors with linear models on 8 tasks: semantic relatedness, paraphrase detection, image-sentence ranking, question-type classification and 4 benchmark sentiment and subjectivity datasets. The end result is an off-the-shelf encoder that can produce highly generic sentence representations that are robust and perform well in practice.
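The vocabulary-expansion step the abstract mentions works by fitting a linear map from a pretrained word2vec embedding space into the encoder's own word-embedding space, so that any word with a word2vec vector can be fed to the encoder even if it was never seen during training. A minimal sketch on synthetic data (all names, dimensions, and the random embeddings are illustrative, not the paper's actual matrices):

```python
import numpy as np

# Toy setup: n_shared words appear in BOTH the pretrained word2vec vocabulary
# and the encoder's training vocabulary; we use them to learn the mapping.
rng = np.random.default_rng(0)
d_w2v, d_rnn, n_shared = 5, 4, 50
word2vec_emb = rng.normal(size=(n_shared, d_w2v))  # pretrained word2vec vectors
true_map = rng.normal(size=(d_w2v, d_rnn))
rnn_emb = word2vec_emb @ true_map                  # encoder's word embeddings (synthetic)

# Fit W minimizing ||word2vec_emb @ W - rnn_emb||^2 by ordinary least squares.
W, *_ = np.linalg.lstsq(word2vec_emb, rnn_emb, rcond=None)

# An out-of-vocabulary word with a word2vec vector can now be projected
# into the encoder's embedding space and encoded like any trained word.
oov_vec = rng.normal(size=(d_w2v,))
projected = oov_vec @ W
```

Because the shared vocabulary in practice is large (tens of thousands of words), a single linear regression like this is enough to extend coverage to any word in the word2vec vocabulary, which is how the model reaches roughly a million words.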