Consistent Inference for Dialogue Relation Extraction

Xinwei Long, Shuzi Niu, Yucheng Li

Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence
Main Track. Pages 3885-3891. https://doi.org/10.24963/ijcai.2021/535

Relation extraction is key to many downstream tasks. Dialogue relation extraction aims to discover entity relations in multi-turn dialogue scenarios. Utterance, topic, and relation discrepancies arise mainly from multiple speakers, utterances, and relations. In this paper, we propose a consistent learning and inference method to minimize possible contradictions arising from these distinctions. First, we design mask mechanisms to refine utterance-aware and speaker-aware representations respectively from the global dialogue representation, addressing the utterance distinction. Then a gate mechanism is proposed to aggregate these bi-grained representations. Next, a mutual attention mechanism is introduced to obtain entity representations for various relation-specific topic structures. Finally, relational inference is performed through first-order logic constraints over the labeled data to reduce logically contradictory relation predictions. Experimental results on two benchmark datasets show that the proposed method improves F1 by at least 3.3% over the state of the art.
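The abstract mentions a gate mechanism that aggregates utterance-aware and speaker-aware representations. The snippet below is a minimal, hypothetical sketch of such gated fusion in PyTorch; the module name, dimensions, and exact gating formula are assumptions for illustration and do not reproduce the authors' implementation.

```python
import torch
import torch.nn as nn


class GatedFusion(nn.Module):
    """Illustrative gate that fuses an utterance-aware view and a
    speaker-aware view into one bi-grained representation.
    (Hypothetical sketch; not the paper's released code.)"""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.gate = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, h_utt: torch.Tensor, h_spk: torch.Tensor) -> torch.Tensor:
        # g in (0, 1) decides, per dimension, how much of each view to keep.
        g = torch.sigmoid(self.gate(torch.cat([h_utt, h_spk], dim=-1)))
        return g * h_utt + (1.0 - g) * h_spk


if __name__ == "__main__":
    batch, seq_len, hidden = 2, 16, 768          # assumed toy dimensions
    fusion = GatedFusion(hidden)
    h_utt = torch.randn(batch, seq_len, hidden)  # utterance-aware token states
    h_spk = torch.randn(batch, seq_len, hidden)  # speaker-aware token states
    fused = fusion(h_utt, h_spk)
    print(fused.shape)                           # torch.Size([2, 16, 768])
```

A sigmoid gate of this kind lets the model weight the two views differently for each token and dimension, which is one common way to aggregate complementary representations before relation classification.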
Keywords:
Natural Language Processing: Information Extraction
Natural Language Processing: Dialogue