Leveraging Document-Level Label Consistency for Named Entity Recognition

Tao Gui, Jiacheng Ye, Qi Zhang, Yaqian Zhou, Yeyun Gong, Xuanjing Huang

Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 3976-3982. https://doi.org/10.24963/ijcai.2020/550

Document-level label consistency, i.e., the observation that different occurrences of a particular token sequence are very likely to have the same entity type, is an effective signal for named entity recognition. Previous work focused on better context representations and used a CRF for label decoding. However, CRF-based methods are inadequate for modeling document-level label consistency. This work introduces a novel two-stage label refinement approach to handle document-level label consistency, in which a key-value memory network first records draft labels predicted by the base model, and a multi-channel Transformer then refines these draft predictions based on the explicit co-occurrence relationships derived from the memory network. In addition, to mitigate the side effects of incorrect draft labels, Bayesian neural networks are used to flag labels with a high probability of being wrong, which helps prevent the incorrect refinement of correct draft labels. Experimental results on three named entity recognition benchmarks demonstrate that the proposed method significantly outperforms state-of-the-art methods.
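To make the two-stage idea concrete, the following is a deliberately simplified sketch, not the paper's actual model: draft labels from a base tagger are recorded in a key-value memory keyed by token sequence, and spans whose draft label falls below a confidence threshold (standing in for the Bayesian uncertainty estimate) are refined by majority vote over the other occurrences of the same sequence. The data structures, threshold, and voting rule are illustrative assumptions.

```python
# Simplified sketch of two-stage label refinement for document-level
# label consistency (illustrative only; the paper uses a key-value
# memory network plus a multi-channel Transformer, not a vote).
from collections import Counter, defaultdict


def refine_labels(spans, threshold=0.5):
    """spans: list of (tokens, draft_label, confidence) for one document.

    Stage 1: record every draft label in a memory keyed by token sequence.
    Stage 2: for low-confidence drafts, replace the label with the most
    frequent draft label among all occurrences of the same sequence.
    """
    memory = defaultdict(Counter)  # token sequence -> counts of draft labels
    for tokens, label, conf in spans:
        memory[tokens][label] += 1

    refined = []
    for tokens, label, conf in spans:
        if conf < threshold:  # uncertain draft: candidate for refinement
            label = memory[tokens].most_common(1)[0][0]
        refined.append((tokens, label))
    return refined


# Two confident PER drafts for "Washington" outvote one uncertain LOC
# draft, enforcing a consistent label across the document.
spans = [(("Washington",), "PER", 0.9),
         (("Washington",), "PER", 0.8),
         (("Washington",), "LOC", 0.3)]
print(refine_labels(spans))
# → [(('Washington',), 'PER'), (('Washington',), 'PER'), (('Washington',), 'PER')]
```

Gating the refinement on confidence mirrors the role of the Bayesian uncertainty estimates in the paper: high-confidence drafts are left untouched so that correct predictions are not overwritten by noisy co-occurrences.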
Keywords:
Natural Language Processing: Named Entities
Natural Language Processing: Tagging, chunking, and parsing