Modeling Precursors for Temporal Knowledge Graph Reasoning via Auto-encoder Structure

Yifu Gao, Linhui Feng, Zhigang Kan, Yi Han, Linbo Qiao, Dongsheng Li

Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence
Main Track. Pages 2044-2051. https://doi.org/10.24963/ijcai.2022/284

Temporal knowledge graph (TKG) reasoning, which infers missing facts in the future, is an essential and challenging task. When predicting a future event, there must be a narrative evolutionary process composed of closely related historical facts that supports the event's occurrence, namely fact precursors. However, most existing models employ a sequential reasoning process in an auto-regressive manner and therefore cannot capture this precursor information. This paper proposes a novel auto-encoder architecture that introduces a relation-aware graph attention layer into the transformer (rGalT) to accommodate inference over the TKG. Specifically, we first calculate the correlation between historical and predicted facts through multiple attention mechanisms along intra-graph and inter-graph dimensions, and then group these mutually related facts into diverse fact segments. Next, we borrow the translation-generation idea to decode in parallel the precursor information associated with a given query, which enables our model to infer future unknown facts by progressively generating graph structures. Experimental results on four benchmark datasets demonstrate that our model outperforms other state-of-the-art methods, and that precursor identification provides supporting evidence for prediction.
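To make the "relation-aware graph attention" idea in the abstract concrete, the following is a minimal, self-contained sketch of such a layer: attention scores over an entity's neighbors are conditioned not only on the neighbor embeddings but also on the connecting relation embeddings. All dimensions, weight matrices, and the single-head design here are illustrative assumptions, not the authors' actual rGalT implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

d = 8  # embedding dimension (illustrative choice)
W = rng.standard_normal((d, d)) * 0.1   # shared projection for entity embeddings
Wr = rng.standard_normal((d, d)) * 0.1  # projection for relation embeddings
a = rng.standard_normal(3 * d) * 0.1    # attention vector over [W h_i || Wr r || W h_j]

def relation_aware_attention(h_i, neighbors):
    """Aggregate neighbor entity messages, weighting each neighbor by an
    attention score that also depends on the connecting relation.

    h_i: (d,) embedding of the central entity
    neighbors: list of (relation_embedding, neighbor_entity_embedding) pairs
    """
    scores, messages = [], []
    for r, h_j in neighbors:
        z = np.concatenate([W @ h_i, Wr @ r, W @ h_j])
        s = a @ z
        scores.append(np.maximum(0.2 * s, s))  # LeakyReLU activation
        messages.append(W @ h_j)
    alpha = softmax(np.array(scores))          # normalize over neighbors
    return sum(w * m for w, m in zip(alpha, messages))

# Toy usage: one entity with three (relation, neighbor) edges.
h = rng.standard_normal(d)
nbrs = [(rng.standard_normal(d), rng.standard_normal(d)) for _ in range(3)]
out = relation_aware_attention(h, nbrs)
print(out.shape)
```

In the paper's setting such a layer would operate inside a transformer encoder, with the intra-graph attention shown here complemented by inter-graph attention across timestamps; this sketch only covers the single-graph, single-head case.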
Keywords:
Data Mining: Knowledge Graphs and Knowledge Base Completion
Knowledge Representation and Reasoning: Learning and reasoning
Machine Learning: Representation learning