Attention as Relation: Learning Supervised Multi-head Self-Attention for Relation Extraction

Jie Liu, Shaowei Chen, Bingquan Wang, Jiaxin Zhang, Na Li, Tong Xu

Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 3787-3793. https://doi.org/10.24963/ijcai.2020/524

Joint entity and relation extraction is critical for many natural language processing (NLP) tasks and has attracted increasing research interest. However, it still faces the challenges of identifying overlapping relation triplets together with complete entity boundaries and of detecting multi-type relations. In this paper, we propose an attention-based joint model, consisting mainly of an entity extraction module and a relation detection module, to address these challenges. The key to our model is a supervised multi-head self-attention mechanism that serves as the relation detection module and learns token-level correlations for each relation type separately. With this attention mechanism, our model can effectively identify overlapping relations and flexibly predict each relation type along with its corresponding intensity. To verify the effectiveness of our model, we conduct comprehensive experiments on two benchmark datasets. The experimental results demonstrate that our model achieves state-of-the-art performance.
Keywords:
Natural Language Processing: Information Extraction
Natural Language Processing: Natural Language Processing
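To make the core idea concrete, below is a minimal, hypothetical PyTorch sketch of what "attention as relation" could look like: one attention head per relation type, where the attention score between token i and token j is read as the strength of relation r between them, trained with supervision from gold relation labels. All module names, dimensions, and the use of a sigmoid scoring function are assumptions for illustration, not the authors' released implementation.

```python
import torch
import torch.nn as nn

class RelationAttention(nn.Module):
    """Sketch of supervised multi-head self-attention for relation detection.

    Each relation type gets its own attention head; the (i, j) attention score
    of head r is interpreted as the intensity of relation r from token i to
    token j. Names and dimensions are illustrative assumptions.
    """

    def __init__(self, hidden_dim: int, num_relations: int, head_dim: int = 64):
        super().__init__()
        self.num_relations = num_relations
        self.head_dim = head_dim
        # One query/key projection per relation type (num_relations "heads").
        self.query = nn.Linear(hidden_dim, num_relations * head_dim)
        self.key = nn.Linear(hidden_dim, num_relations * head_dim)

    def forward(self, token_repr: torch.Tensor) -> torch.Tensor:
        # token_repr: (batch, seq_len, hidden_dim), e.g. encoder outputs.
        batch, seq_len, _ = token_repr.shape
        q = self.query(token_repr).view(batch, seq_len, self.num_relations, self.head_dim)
        k = self.key(token_repr).view(batch, seq_len, self.num_relations, self.head_dim)
        # Scaled dot-product score for every (token_i, token_j, relation) triple.
        scores = torch.einsum("bird,bjrd->brij", q, k) / self.head_dim ** 0.5
        # Sigmoid rather than softmax, so several relations can fire for the
        # same token pair, which is what permits overlapping triplets.
        return torch.sigmoid(scores)  # (batch, num_relations, seq_len, seq_len)

# The score tensor could then be supervised with binary cross-entropy against
# a gold relation matrix of the same shape, e.g.:
#   loss = nn.functional.binary_cross_entropy(scores, gold_matrix)
```

In this reading, relation detection becomes a per-pair, per-type scoring problem, which is why overlapping relations and multi-type relations are handled naturally: each head scores all token pairs independently of the other heads.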