A Correlated Topic Model Using Word Embeddings

Guangxu Xun, Yaliang Li, Wayne Xin Zhao, Jing Gao, Aidong Zhang

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 4207-4213. https://doi.org/10.24963/ijcai.2017/588

Conventional correlated topic models capture correlation structure among latent topics by replacing the Dirichlet prior with the logistic normal distribution. Word embeddings have been shown to capture semantic regularities in language, so the semantic relatedness and correlations between words can be calculated directly in the word embedding space, for example via cosine similarity. In this paper, we propose a novel correlated topic model using word embeddings. The proposed model enables us to exploit the additional word-level correlation information in word embeddings and to model topic correlation directly in the continuous word embedding space. In the model, words in documents are replaced with meaningful word embeddings, topics are modeled as multivariate Gaussian distributions over the word embeddings, and topic correlations are learned among the continuous Gaussian topics. A Gibbs sampling solution with data augmentation is given to perform inference. We evaluate our model qualitatively and quantitatively on the 20 Newsgroups dataset and the Reuters-21578 dataset. The experimental results show the effectiveness of our proposed model.
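The abstract names two ingredients: word-level correlation measured by cosine similarity in embedding space, and topics represented as multivariate Gaussians over word embeddings. The sketch below illustrates both on toy data; the embeddings and the topic parameters (`mu_k`, `sigma_k`) are hypothetical placeholders, not the authors' implementation or learned values.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Toy setup: a vocabulary of 5 words embedded in a 3-dimensional space.
# In the paper's setting these would be pre-trained word embeddings.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(5, 3))  # row w = embedding vector v_w

def cosine(u, v):
    """Word-level semantic relatedness, computed directly in embedding space."""
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

sim_01 = cosine(embeddings[0], embeddings[1])

# A "Gaussian topic": a multivariate Gaussian over the embedding space with
# mean mu_k and covariance Sigma_k (here crudely estimated from the toy data
# and regularized for invertibility).
mu_k = embeddings.mean(axis=0)
sigma_k = np.cov(embeddings, rowvar=False) + 0.1 * np.eye(3)

# Likelihood of a word's embedding under topic k; in this family of models
# such a density replaces the discrete topic-word multinomial of standard LDA.
word_lik = multivariate_normal.pdf(embeddings[2], mean=mu_k, cov=sigma_k)

print(f"cos(v_w0, v_w1) = {sim_01:.3f}")
print(f"p(v_w2 | topic k) = {word_lik:.4f}")
```

Because topics live in the same continuous space as the words, correlation between two topics can then be assessed through their Gaussian parameters rather than only through a logistic normal prior over discrete topic proportions.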
Keywords:
Natural Language Processing: Natural Language Generation
Natural Language Processing: Natural Language Semantics