Approximating Word Ranking and Negative Sampling for Word Embedding

Guibing Guo, Shichang Ouyang, Fajie Yuan, Xingwei Wang

Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Main track. Pages 4092-4098. https://doi.org/10.24963/ijcai.2018/569

CBOW (Continuous Bag-Of-Words) is one of the most commonly used techniques for generating word embeddings in various NLP tasks. However, it fails to reach optimal performance because it treats all positive words uniformly and draws negative words from a simple, static sampling distribution. To resolve these issues, we propose OptRank, which optimizes word ranking and approximates negative sampling to produce better word embeddings. Specifically, we first formalize word embedding as a ranking problem. Then, we weigh positive words by their ranks so that highly ranked words carry more importance, and adopt a dynamic sampling strategy to select informative negative words. In addition, we design an approximation method to compute word ranks efficiently. Empirical experiments show that OptRank consistently outperforms its counterparts on a benchmark dataset across different sampling scales, especially when the sampled subset is small. The code and datasets can be obtained from https://github.com/ouououououou/OptRank.
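
To make the ranking idea concrete, below is a minimal Python/NumPy sketch of a WARP-style update, which approximates the positive word's rank by counting how many negatives must be drawn before a margin violator appears. This is an illustration under stated assumptions, not the paper's actual OptRank implementation (for that, see the linked repository): the names (optrank_step, rank_weight, W_in, W_out), the hinge loss, the harmonic weighting, and the uniform candidate distribution are all hypothetical choices for the sketch.

```python
import numpy as np

def rank_weight(rank):
    """Harmonic rank weight: a poorly ranked positive (large rank number)
    gets a larger update, as in WARP-style ranking losses (an assumption;
    the paper's exact weighting may differ)."""
    return np.sum(1.0 / np.arange(1, rank + 1))

def optrank_step(W_in, W_out, context_idx, pos_idx, vocab_size,
                 lr=0.025, margin=1.0, max_trials=100, rng=None):
    """One CBOW-style training step with a sampled rank approximation.

    W_in, W_out : (vocab_size, dim) input/output embedding matrices
    context_idx : array of context word indices
    pos_idx     : index of the target (positive) word
    """
    rng = rng or np.random.default_rng()
    h = W_in[context_idx].mean(axis=0)        # CBOW context representation
    pos_score = h @ W_out[pos_idx]

    # Dynamic negative sampling: keep drawing candidates until one
    # violates the margin; the number of draws estimates the rank.
    trials, neg_idx = 0, None
    while trials < max_trials:
        trials += 1
        cand = int(rng.integers(vocab_size))
        if cand == pos_idx:
            continue
        if h @ W_out[cand] > pos_score - margin:
            neg_idx = cand                    # informative (violating) negative
            break
    if neg_idx is None:
        return                                # positive already well ranked

    est_rank = max(1, (vocab_size - 1) // trials)  # WARP-style rank estimate
    step = lr * rank_weight(est_rank)

    # Gradient of the weighted hinge loss max(0, margin - s_pos + s_neg).
    grad_h = W_out[neg_idx] - W_out[pos_idx]
    W_out[pos_idx] += step * h
    W_out[neg_idx] -= step * h
    W_in[context_idx] -= step * grad_h / len(context_idx)
```

The fewer draws it takes to find a violator, the more words outscore the positive, so the estimated rank and the update weight both grow; positives that are already well ranked trigger no update at all, which keeps the procedure cheap.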
Keywords:
Natural Language Processing: Natural Language Semantics
Natural Language Processing: Embeddings