Learning Sentence Representation with Guidance of Human Attention

Shaonan Wang, Jiajun Zhang, Chengqing Zong

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 4137-4143. https://doi.org/10.24963/ijcai.2017/578

Recently, much progress has been made in learning general-purpose sentence representations that can be used across domains. However, most existing models treat each word in a sentence equally. In contrast, extensive studies have shown that humans read sentences efficiently by making a sequence of fixations and saccades. This motivates us to improve sentence representations by assigning different weights to the vectors of the component words, which can be treated as an attention mechanism over single sentences. To that end, we propose two novel attention models in which the attention weights are derived from significant predictors of human reading time, i.e., Surprisal, POS tags and CCG supertags. Extensive experiments demonstrate that the proposed methods significantly improve upon the state-of-the-art sentence representation models.
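As a rough illustration of the core idea, the Python sketch below composes a sentence vector as a weighted sum of word vectors, with per-word weights derived from surprisal scores (higher surprisal drawing more attention). The function name, the softmax weighting, and the toy surprisal values are all assumptions for illustration; the paper's actual attention models and weighting functions are not reproduced on this page.

```python
import numpy as np

def attention_sentence_embedding(word_vectors, surprisals):
    """Compose a sentence vector as an attention-weighted sum of word vectors.

    word_vectors: array of shape (n_words, dim)
    surprisals:   array of shape (n_words,), e.g. per-word negative log-probability

    The softmax over surprisal is an assumed weighting scheme, used here only
    to illustrate attention driven by a reading-time predictor.
    """
    weights = np.exp(surprisals) / np.sum(np.exp(surprisals))  # softmax weights
    return np.sum(weights[:, None] * word_vectors, axis=0)

# Toy usage: 4 words with 5-dimensional embeddings and hypothetical surprisal scores.
rng = np.random.default_rng(0)
vectors = rng.normal(size=(4, 5))
surprisal = np.array([2.1, 0.5, 3.4, 1.0])  # hypothetical values
sentence_vec = attention_sentence_embedding(vectors, surprisal)
print(sentence_vec.shape)  # (5,)
```

The same weighted-sum composition would apply with weights derived from POS tags or CCG supertags instead of surprisal, e.g. by mapping each tag to a learned or hand-set scalar weight.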
Keywords:
Natural Language Processing: Natural Language Semantics
Natural Language Processing: Natural Language Processing