Function-words Adaptively Enhanced Attention Networks for Few-Shot Inverse Relation Classification

Chunliu Dou, Shaojuan Wu, Xiaowang Zhang, Zhiyong Feng, Kewen Wang

Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence
Main Track. Pages 2937-2943. https://doi.org/10.24963/ijcai.2022/407

Relation classification aims to identify the semantic relation between two entities in a given text. While existing models classify inverse relations well when large datasets are available, their performance drops significantly in few-shot settings. In this paper, we propose a Function words Adaptively Enhanced Attention framework (FAEA) for few-shot inverse relation classification, in which a hybrid attention model is designed to attend to class-related function words based on meta-learning. Since the involvement of function words introduces significant intra-class redundancy, an adaptive message passing mechanism is introduced to capture and transfer inter-class differences. We mathematically analyze the negative impact of function words under dot-product measurement, which explains why the message passing mechanism effectively reduces that impact. Our experimental results show that FAEA outperforms strong baselines; in particular, inverse relation accuracy improves by 14.33% under the 1-shot setting on FewRel 1.0.
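The dot-product effect the abstract refers to can be illustrated with a small numeric sketch. This is a hypothetical toy example, not the paper's implementation: it only shows how a component shared by both classes (standing in for function words) dominates dot-product and cosine similarity, so class representations become nearly indistinguishable unless inter-class differences are re-emphasized.

```python
import numpy as np

# Toy class-specific directions for two relations, plus a shared
# "function-word" direction present in both (all names hypothetical).
proto_a = np.array([1.0, 0.0, 0.0])   # class-specific direction of relation A
proto_b = np.array([0.0, 1.0, 0.0])   # class-specific direction of relation B
shared  = np.array([0.0, 0.0, 5.0])   # shared function-word component

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Without the shared component the prototypes are orthogonal ...
print(cosine(proto_a, proto_b))                               # 0.0
# ... with it, they are almost collinear under cosine measurement.
print(round(cosine(proto_a + shared, proto_b + shared), 3))   # 0.962

# Dot-product scores for a query drawn from class A: the absolute margin
# between classes is unchanged, but it is now a tiny fraction of the
# score magnitude, so small noise can flip the prediction.
query = proto_a + shared
print(query @ (proto_a + shared), query @ (proto_b + shared))  # 26.0 25.0
```

This is why, in the abstract's terms, intra-class redundancy from function words hurts dot-product measurement, and why a mechanism that transfers inter-class differences (suppressing the shared component) can restore the margin.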
Keywords:
Machine Learning: Few-shot learning
Natural Language Processing: Text Classification