Neighborhood-Aware Attentional Representation for Multilingual Knowledge Graphs

Qiannan Zhu, Xiaofei Zhou, Jia Wu, Jianlong Tan, Li Guo

Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
Main track. Pages 1943-1949. https://doi.org/10.24963/ijcai.2019/269

Multilingual knowledge graphs constructed by entity alignment are indispensable resources for numerous AI-related applications. Most existing entity alignment methods use only triplet-based knowledge to find aligned entities across multilingual knowledge graphs, ignoring the neighborhood subgraph knowledge of entities, which carries richer alignment information. In this paper, we incorporate neighborhood subgraph-level information of entities and propose NAEA, a neighborhood-aware attentional representation method for multilingual knowledge graphs. NAEA devises an attention mechanism that learns neighbor-level representations by aggregating neighbors' representations with a weighted combination. The attention mechanism enables entities not only to capture the different impacts of their neighbors on themselves, but also to attend over their neighbors' feature representations with different degrees of importance. We evaluate our model on two real-world datasets, DBP15K and DWY100K, and the experimental results show that NAEA significantly and consistently outperforms state-of-the-art entity alignment models.
Keywords:
Knowledge Representation and Reasoning: Knowledge Representation and Decision; Utility Theory
Machine Learning: Knowledge-based Learning
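
To make the neighbor-level aggregation described in the abstract concrete, the snippet below is a minimal sketch of attention-weighted neighbor aggregation. It is not the paper's exact NAEA formulation: the pairwise scoring function, the shared transform W, and the attention vector a are illustrative assumptions.

```python
# A minimal sketch of attention-weighted neighbor aggregation, in the spirit of
# the mechanism described in the abstract. NOT the authors' exact NAEA model;
# the scoring function, W, and a are assumed for illustration.
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def neighbor_level_representation(entity_emb, neighbor_embs, W, a):
    """Aggregate neighbor embeddings into one representation for the entity.

    entity_emb:    (d,)   embedding of the target entity
    neighbor_embs: (n, d) embeddings of its neighbors
    W:             (d, d) shared linear transform (assumed)
    a:             (2d,)  attention vector scoring entity-neighbor pairs (assumed)
    """
    h_e = W @ entity_emb                      # transformed target entity, (d,)
    h_n = neighbor_embs @ W.T                 # transformed neighbors, (n, d)
    # Score each neighbor by its relevance to the target entity.
    scores = np.array([a @ np.concatenate([h_e, h]) for h in h_n])
    alpha = softmax(scores)                   # attention weights over neighbors
    # Weighted combination of neighbor representations.
    return alpha @ h_n

# Toy usage: one entity with three neighbors in a 4-dimensional embedding space.
rng = np.random.default_rng(0)
d, n = 4, 3
rep = neighbor_level_representation(
    rng.normal(size=d), rng.normal(size=(n, d)),
    rng.normal(size=(d, d)), rng.normal(size=2 * d))
print(rep.shape)  # (4,)
```

The weights alpha let neighbors that are more informative for alignment contribute more to the aggregated representation, which is the intuition behind the neighbor-level attention the abstract describes.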