Knowledge-enhanced Hierarchical Attention for Community Question Answering with Multi-task and Adaptive Learning

Min Yang, Lei Chen, Xiaojun Chen, Qingyao Wu, Wei Zhou, Ying Shen

Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
Main track. Pages 5349-5355. https://doi.org/10.24963/ijcai.2019/743

In this paper, we propose Knowledge-enhanced Hierarchical Attention for community question answering with Multi-task learning and Adaptive learning (KHAMA). First, we propose a hierarchical attention network that fully fuses knowledge from the input documents and a knowledge base (KB) by exploiting the semantic compositionality of the input sequences. The external factual knowledge helps recognize background knowledge (entity mentions and their relationships) and filter out noisy information from long documents with sophisticated syntactic and semantic structures. In addition, we build multiple CQA models with adaptive boosting and combine them into a more effective and robust CQA system. Furthermore, KHAMA is a multi-task learning model: it regards CQA as the primary task and question categorization as the auxiliary task, aiming to learn a category-aware document encoder and enhance the quality of identifying essential information in long questions. Extensive experiments on two benchmarks demonstrate that KHAMA achieves substantial improvements over the compared methods.
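As an illustrative sketch (not the paper's actual implementation), the multi-task setup described above is commonly trained by summing a primary-task loss with a weighted auxiliary-task loss. The cross-entropy form and the mixing weight `lam` below are assumptions for illustration:

```python
import numpy as np

def cross_entropy(probs, label):
    # Negative log-likelihood of the gold label under a predicted distribution.
    return -np.log(probs[label])

def multitask_loss(cqa_probs, cqa_label, cat_probs, cat_label, lam=0.5):
    # Primary CQA loss plus a weighted auxiliary question-categorization loss.
    # `lam` (hypothetical here) trades off the auxiliary task against the primary one.
    return cross_entropy(cqa_probs, cqa_label) + lam * cross_entropy(cat_probs, cat_label)
```

With `lam = 0`, training reduces to the single-task CQA objective; increasing `lam` pushes the shared document encoder to also carry category information.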
Keywords:
Natural Language Processing: Natural Language Processing
Natural Language Processing: Question Answering
Machine Learning: Deep Learning