Exploration of Tree-based Hierarchical Softmax for Recurrent Language Models

Nan Jiang, Wenge Rong, Min Gao, Yikang Shen, Zhang Xiong

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 1951-1957. https://doi.org/10.24963/ijcai.2017/271

Recently, variants of neural networks for computational linguistics have been proposed and successfully applied to neural language modeling and neural machine translation. These neural models can leverage knowledge from massive corpora, but they are extremely slow because they must predict candidate words from a large vocabulary during both training and inference. As an alternative to gradient approximation and softmax with class decomposition, we explore the tree-based hierarchical softmax method and reform its architecture, making it compatible with modern GPUs and introducing a compact tree-based loss function. When combined with several word hierarchical clustering algorithms, it achieves improved performance on the language modeling task under intrinsic evaluation criteria on the PTB, WikiText-2, and WikiText-103 datasets.
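
For intuition, below is a minimal sketch (not the paper's implementation) of the per-word loss that a binary-tree hierarchical softmax computes: the probability of a word factorizes into a product of sigmoid branch decisions along its root-to-leaf path, so the negative log-likelihood becomes a sum of softplus terms over that path. The function and argument names (hierarchical_softmax_loss, node_vectors, path_nodes, path_signs) are illustrative assumptions, not identifiers from the paper.

import torch
import torch.nn.functional as F

def hierarchical_softmax_loss(hidden, node_vectors, path_nodes, path_signs):
    """Negative log-likelihood of one target word under a binary-tree
    hierarchical softmax (illustrative sketch, not the paper's code).

    hidden:       (d,) RNN hidden state at the current position.
    node_vectors: (num_inner_nodes, d) one parameter vector per inner node.
    path_nodes:   (depth,) indices of the inner nodes on the root-to-leaf path.
    path_signs:   (depth,) +1.0 / -1.0 encoding the branch taken at each node.
    """
    # Score at each inner node on the path: v_n . h
    scores = node_vectors[path_nodes] @ hidden            # (depth,)
    # p(word | h) = prod_n sigmoid(sign_n * score_n), so
    # -log p(word | h) = sum_n softplus(-sign_n * score_n)
    return F.softplus(-path_signs * scores).sum()

# Toy usage: a vocabulary of 4 words gives a full binary tree with 3 inner nodes.
d = 8
node_vectors = torch.randn(3, d, requires_grad=True)
hidden = torch.randn(d)
# Hypothetical path for one word: root (node 0, right branch), then node 2 (left).
path_nodes = torch.tensor([0, 2])
path_signs = torch.tensor([-1.0, 1.0])
loss = hierarchical_softmax_loss(hidden, node_vectors, path_nodes, path_signs)
loss.backward()  # only the inner nodes on the path receive gradient

Because each word touches only the O(log |V|) inner nodes on its path rather than all |V| output weights, both the loss and its gradient are cheap per word; batching these path lookups is what makes the method amenable to GPUs.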
Keywords:
Machine Learning: Machine Learning
Machine Learning: Deep Learning
Natural Language Processing: Natural Language Processing
Natural Language Processing: Natural Language Generation