Hierarchical Inter-Attention Network for Document Classification with Multi-Task Learning

Bing Tian, Yong Zhang, Jin Wang, Chunxiao Xing

Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
Main track. Pages 3569-3575. https://doi.org/10.24963/ijcai.2019/495

Document classification is an essential task in many real-world applications. Existing approaches exploit both text semantics and document structure to obtain the document representation. However, these models usually require a large collection of annotated training instances, which is not always available, especially in low-resource settings. In this paper, we propose a multi-task learning framework to jointly train multiple related document classification tasks. We devise a hierarchical architecture that uses knowledge shared across all tasks to enhance the document representation of each task. We further propose an inter-attention approach that improves the task-specific modeling of documents with global information. Experimental results on 15 public datasets demonstrate the benefits of our proposed model.
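The abstract describes an inter-attention step that enriches task-specific document representations with shared, cross-task information. The paper itself gives the exact formulation; the following is only a minimal NumPy sketch of the general idea, with hypothetical shapes and names (`task_states`, `shared_states` are assumptions, not the authors' notation): each task-specific state attends over the shared-layer states and is concatenated with the resulting global context.

```python
import numpy as np

rng = np.random.default_rng(0)

def inter_attention(task_repr, shared_repr):
    # task_repr: (n, d) task-specific token states (hypothetical shapes)
    # shared_repr: (m, d) states from the shared, cross-task layer
    scores = task_repr @ shared_repr.T                 # (n, m) alignment scores
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)      # row-wise softmax
    context = weights @ shared_repr                    # (n, d) global context
    # enrich each task-specific state with its attended shared context
    return np.concatenate([task_repr, context], axis=1)  # (n, 2d)

n, m, d = 4, 6, 8
task_states = rng.standard_normal((n, d))
shared_states = rng.standard_normal((m, d))
enriched = inter_attention(task_states, shared_states)
print(enriched.shape)  # (4, 16)
```

In a multi-task setup, the shared states would come from an encoder trained on all tasks jointly, so the attended context injects global knowledge into each task's private representation before classification.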
Keywords:
Machine Learning: Transfer, Adaptation, Multi-task Learning
Natural Language Processing: Sentiment Analysis and Text Mining
Natural Language Processing: Text Classification