Reviews: What the Vec? Towards Probabilistically Grounded Embeddings
–Neural Information Processing Systems
This paper's view is novel and relatively solid. It provides a perspective for understanding semantic similarity in word embeddings by (1) showing, via the geometry of the space, that different forms of semantic compositionality can be captured by PMI vectors, and (2) showing that the linear projection between the PMI vectors and the word embeddings preserves the properties in (1). To me, the best part of the paper is that the authors make an effort to give a systematic and mathematically well-formed analysis of the frequently mentioned but not fully understood semantic issues in word embeddings. The paper also derives a new model with a least-squares (LSQ) loss in Section 5, which achieves better performance and thus justifies the preceding analysis to some extent. My biggest concern lies in the absence of any analysis of COSINE similarity, which is the measure most commonly used in practice.
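For readers unfamiliar with the objects discussed above, here is a minimal sketch of how PMI vectors are built from co-occurrence counts and how cosine similarity is computed between them; the counts and vocabulary size are illustrative, not taken from the paper under review:

```python
import numpy as np

# Toy symmetric co-occurrence counts C[i, j] for a 3-word vocabulary
# (illustrative numbers only, not from the paper under review).
C = np.array([[10.0, 2.0, 1.0],
              [2.0, 8.0, 4.0],
              [1.0, 4.0, 6.0]])

total = C.sum()
p_ij = C / total              # joint probabilities p(i, j)
p_i = C.sum(axis=1) / total   # marginal probability of target word i
p_j = C.sum(axis=0) / total   # marginal probability of context word j

# PMI(i, j) = log[p(i, j) / (p(i) p(j))]; row i is word i's PMI vector.
pmi = np.log(p_ij / np.outer(p_i, p_j))

def cosine(u, v):
    """Cosine similarity -- the measure the review notes is not analysed."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

sim = cosine(pmi[0], pmi[1])
```

The review's point is that the paper's geometric results concern PMI vectors and their linear images, while downstream similarity is typically measured with `cosine` as above, so the gap between the two is left open.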
Jan-22-2025, 08:14:37 GMT