Recurrent Neural Network for Text Classification with Hierarchical Multiscale Dense Connections

Yi Zhao, Yanyan Shen, Junjie Yao

Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
Main track. Pages 5450-5456. https://doi.org/10.24963/ijcai.2019/757

Text classification is a fundamental task in many Natural Language Processing applications. While recurrent neural networks have achieved great success in text classification, they fail to capture the hierarchical structure and long-term semantic dependencies that are common features of text data. Inspired by the advent of the dense connection pattern in advanced convolutional neural networks, we propose a simple yet effective recurrent architecture, named Hierarchical Multiscale Densely Connected RNNs (HM-DenseRNNs), which: 1) enables direct access to the hidden states of all preceding recurrent units via dense connections, and 2) organizes multiple densely connected recurrent units into a hierarchical multi-scale structure, where the layers are updated at different time scales. HM-DenseRNNs can effectively capture long-term dependencies among words in long text data, and a dense recurrent block is further introduced to reduce the number of parameters and enhance training efficiency. We evaluate the performance of the proposed architecture on three text datasets, and the results verify the advantages of HM-DenseRNNs over the baseline methods in terms of classification accuracy.
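The two ideas in the abstract can be sketched in a few lines of code. The following is a minimal, hypothetical illustration only (it is not the authors' implementation): each layer's recurrent cell receives the concatenation of the input and the hidden states of all lower layers (the dense connections), and layer l is updated only every 2**l time steps as a simple stand-in for the paper's hierarchical multiscale schedule. All function names, dimensions, and the update schedule are assumptions for illustration.

```python
import numpy as np

def tanh_rnn_cell(x, h, Wx, Wh, b):
    # One vanilla RNN step: h' = tanh(Wx x + Wh h + b).
    return np.tanh(Wx @ x + Wh @ h + b)

def hm_dense_rnn(inputs, num_layers=3, hidden=8, seed=0):
    """Toy sketch of a hierarchical multiscale densely connected RNN.

    Dense connections: layer l sees the concatenation of the input
    and the hidden states of all preceding layers.
    Multiscale: layer l updates only every 2**l steps (an assumed
    schedule, standing in for the paper's hierarchy).
    """
    rng = np.random.default_rng(seed)
    d_in = inputs.shape[1]
    params = []
    for l in range(num_layers):
        in_dim = d_in + l * hidden  # dense: input + all earlier layers
        params.append((rng.standard_normal((hidden, in_dim)) * 0.1,
                       rng.standard_normal((hidden, hidden)) * 0.1,
                       np.zeros(hidden)))
    h = [np.zeros(hidden) for _ in range(num_layers)]
    for t, x in enumerate(inputs):
        feed = x
        for l, (Wx, Wh, b) in enumerate(params):
            if t % (2 ** l) == 0:       # slower clock for higher layers
                h[l] = tanh_rnn_cell(feed, h[l], Wx, Wh, b)
            feed = np.concatenate([feed, h[l]])  # dense connection
    return h[-1]  # top-layer state summarizes the sequence
```

For classification, the final top-layer state would typically be passed through a softmax layer; the higher layers' slower update clocks are what let them track longer-range, more abstract structure in the text.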
Keywords:
Natural Language Processing: Natural Language Processing
Natural Language Processing: Text Classification