Hermitian Co-Attention Networks for Text Matching in Asymmetrical Domains

Yi Tay, Anh Tuan Luu, Siu Cheung Hui

Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Main track. Pages 4425-4431. https://doi.org/10.24963/ijcai.2018/615

Co-Attention is a highly effective attention mechanism for text matching applications. Co-Attention enables the learning of pairwise attentions, i.e., learning to attend based on computing word-level affinity scores between two documents. However, text matching problems can exist in either symmetrical or asymmetrical domains. For example, paraphrase identification is a symmetrical task, while question-answer matching and entailment classification are considered asymmetrical tasks. In this paper, we argue that Co-Attention models in asymmetrical domains require different treatment than in symmetrical domains, i.e., a concept of word-level directionality should be incorporated while learning word-level similarity scores. Hence, the standard inner product in real space commonly adopted in co-attention is not suitable. This paper leverages attractive properties of the complex vector space and proposes a co-attention mechanism based on the complex-valued inner product (Hermitian product). Unlike the real dot product, the dot product in complex space is asymmetric because the first item is conjugated. Aside from modeling and encoding directionality, our proposed approach also enhances the representation learning process. Extensive experiments on five text matching benchmark datasets demonstrate the effectiveness of our approach.
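The key property the abstract relies on is that the Hermitian inner product conjugates its first argument, so swapping the two arguments conjugates the score rather than leaving it unchanged. A minimal sketch of this asymmetry, using illustrative (hypothetical) complex word vectors rather than the paper's learned embeddings:

```python
import numpy as np

# Hypothetical complex-valued word vectors (illustrative values only).
a = np.array([1 + 2j, 0.5 - 1j])
b = np.array([2 - 1j, -1 + 0.5j])

def hermitian_score(x, y):
    """Word-level affinity via the Hermitian (complex) inner product.

    The first argument is conjugated, so in general
    hermitian_score(x, y) != hermitian_score(y, x):
    swapping the arguments conjugates the result, flipping
    the sign of its imaginary part while keeping the real part.
    """
    return np.sum(np.conj(x) * y)

s_ab = hermitian_score(a, b)
s_ba = hermitian_score(b, a)

# The two directions yield complex conjugates of each other:
# equal real parts, opposite imaginary parts.
assert np.isclose(s_ab, np.conj(s_ba))
assert not np.isclose(s_ab, s_ba)  # asymmetric whenever Im != 0
```

Contrast this with the real dot product, where `np.dot(a, b) == np.dot(b, a)` always holds; it is this conjugation that lets a complex-valued co-attention distinguish the direction of matching (e.g., question-to-answer versus answer-to-question).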
Keywords:
Natural Language Processing: Natural Language Processing
Natural Language Processing: Question Answering
Multidisciplinary Topics and Applications: Information Retrieval