On the Downstream Performance of Compressed Word Embeddings
Avner May, Jian Zhang, Tri Dao, Christopher Ré
Neural Information Processing Systems
Compressing word embeddings is important for deploying NLP models in memory-constrained settings. However, understanding what makes compressed embeddings perform well on downstream tasks is challenging: existing measures of compression quality often fail to distinguish between embeddings that perform well and those that do not. We thus propose the eigenspace overlap score as a new measure. We relate the eigenspace overlap score to downstream performance by developing generalization bounds for the compressed embeddings in terms of this score, in the context of linear and logistic regression. We then show that we can lower bound the eigenspace overlap score for a simple uniform quantization compression method, helping to explain the strong empirical performance of this method.
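As a rough illustration of the kind of measure the abstract describes, the sketch below computes an eigenspace overlap between an embedding matrix and a compressed version of it: the squared Frobenius norm of the inner products between the two matrices' left singular vectors, normalized by the larger embedding dimension. The exact definition and normalization used in the paper may differ; the function name, the toy uniform quantizer, and all parameters here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def eigenspace_overlap_score(X, X_tilde):
    # Sketch (assumption, not the paper's exact formula): overlap between the
    # spans of the left singular vectors of the original and compressed
    # embedding matrices, normalized to lie in [0, 1].
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    U_t, _, _ = np.linalg.svd(X_tilde, full_matrices=False)
    d = max(X.shape[1], X_tilde.shape[1])
    return np.linalg.norm(U.T @ U_t, ord="fro") ** 2 / d

# Toy setup: a random "embedding matrix" for 1000 words in 50 dimensions,
# compressed by simple b-bit uniform quantization of each entry.
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 50))
b = 4
lo, hi = X.min(), X.max()
levels = 2 ** b - 1
X_q = np.round((X - lo) / (hi - lo) * levels) / levels * (hi - lo) + lo

score = eigenspace_overlap_score(X, X_q)
# The score is bounded in [0, 1], and identical matrices score exactly 1.
assert 0.0 <= score <= 1.0
assert np.isclose(eigenspace_overlap_score(X, X), 1.0)
```

With only 4 bits per entry, the quantized matrix typically retains a high overlap with the original's eigenspace, which is the intuition behind the paper's result that uniform quantization performs well downstream.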