Uncovering Meanings of Embeddings via Partial Orthogonality

Neural Information Processing Systems 

Machine learning tools often rely on embedding text as vectors of real numbers. In this paper, we study how the semantic structure of language is encoded in the algebraic structure of such embeddings.
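The "partial orthogonality" in the title is the geometric analogue of partial correlation: two embedding vectors are partially orthogonal given a conditioning set when their residuals, after projecting out the span of that set, are orthogonal. A minimal sketch of that check, using a hypothetical helper name and toy 3-d vectors (not from the paper):

```python
import numpy as np

def partial_inner_product(x, y, Z):
    """Inner product of x and y after projecting out span(Z).

    Analogous to partial correlation: x and y are 'partially
    orthogonal' given Z when this value is (near) zero.
    """
    # Orthonormal basis for the span of the conditioning vectors Z
    Q, _ = np.linalg.qr(np.column_stack(Z))
    # Residuals of x and y with their span(Z) components removed
    rx = x - Q @ (Q.T @ x)
    ry = y - Q @ (Q.T @ y)
    return float(rx @ ry)

# Toy example: x and y overlap only through the direction z
z = np.array([1.0, 0.0, 0.0])
x = np.array([1.0, 1.0, 0.0])
y = np.array([1.0, 0.0, 1.0])

print(x @ y)                             # nonzero: not orthogonal
print(partial_inner_product(x, y, [z]))  # ~0: partially orthogonal given z
```

Here x and y are not orthogonal, but become orthogonal once the shared direction z is projected out, which is the kind of conditional independence structure the paper reads off from embeddings.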
