DyGRAIN: An Incremental Learning Framework for Dynamic Graphs

Seoyoon Kim, Seongjun Yun, Jaewoo Kang

Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence
Main Track. Pages 3157-3163. https://doi.org/10.24963/ijcai.2022/438

Graph-structured data provide a powerful representation of complex relations and interactions. Many variants of graph neural networks (GNNs) have emerged to learn from graph-structured data under the assumption that the underlying graphs are static, although graphs in many real-world applications are dynamic (e.g., their structure evolves over time). To account for the dynamic nature of a graph that changes over time, the need to apply incremental learning (i.e., continual learning or lifelong learning) to the graph domain has been emphasized. However, unlike incremental learning on Euclidean data, graph-structured data contain dependencies between existing nodes and newly appearing nodes, so the receptive fields of existing nodes vary as new inputs (e.g., nodes and edges) arrive. In this paper, we identify time-varying receptive fields as a crucial challenge of incremental learning on dynamic graphs, and propose a novel incremental learning framework, DyGRAIN, to mitigate both time-varying receptive fields and catastrophic forgetting. Specifically, our proposed method incrementally learns dynamic graph representations by reflecting influential changes in the receptive fields of existing nodes and by maintaining previous knowledge of informative nodes prone to being forgotten. Our experiments on large-scale graph datasets demonstrate that our proposed method improves performance by effectively capturing pivotal nodes and preventing catastrophic forgetting.
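The time-varying receptive field phenomenon described above can be illustrated with a toy sketch (plain Python, illustrative only, not the authors' implementation): a node's k-hop receptive field, which determines which neighbors a k-layer GNN aggregates from, changes once a new node attaches nearby.

```python
from collections import deque


def k_hop_receptive_field(adj, node, k):
    """Return the set of nodes within k hops of `node` in the adjacency
    dict `adj` -- i.e., the receptive field of a k-layer GNN at `node`."""
    visited = {node}
    frontier = {node}
    for _ in range(k):
        frontier = {nbr for u in frontier for nbr in adj.get(u, set())} - visited
        visited |= frontier
    return visited


# Graph at time t: a chain 1 - 2 - 3.
adj = {1: {2}, 2: {1, 3}, 3: {2}}

before = k_hop_receptive_field(adj, 1, k=2)
print(before)  # {1, 2, 3}

# At time t+1, a new node 4 attaches to node 2. No existing edge of
# node 1 changed, yet node 1's 2-hop receptive field is altered.
adj[2].add(4)
adj[4] = {2}

after = k_hop_receptive_field(adj, 1, k=2)
print(after)  # {1, 2, 3, 4}
```

Because node 1's receptive field silently absorbs node 4, a model trained only on the old neighborhood produces stale representations for node 1; this is the structural dependency that Euclidean incremental learning does not face.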
Keywords:
Machine Learning: Sequence and Graph Learning
Data Mining: Mining Graphs
Machine Learning: Incremental Learning