Learning Continuous Graph Structure with Bilevel Programming for Graph Neural Networks

Minyang Hu, Hong Chang, Bingpeng Ma, Shiguang Shan

Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence
Main Track. Pages 3057-3063. https://doi.org/10.24963/ijcai.2022/424

Learning graph structure for graph neural networks (GNNs) is crucial for downstream GNN-based learning tasks. It is challenging because the discrete graph structure is non-differentiable and no ground-truth structure is available. In this paper, we address these problems and propose a novel graph structure learning framework for GNNs. First, we directly model a continuous graph structure with dual-normalization, which implicitly imposes a sparsity constraint and reduces the influence of noisy edges. Second, we formulate the whole training process as a bilevel programming problem, where the inner objective optimizes the GNN given the learned graph, while the outer objective optimizes the graph structure to minimize the generalization error on the downstream task. For the bilevel optimization, we propose an improved Neumann-IFT algorithm to obtain an approximate solution, which is more stable and accurate than existing optimization methods; it also makes the bilevel optimization memory-efficient and scalable to large graphs. Experiments on node classification and scene graph generation show that our method outperforms related methods, especially on noisy graphs.
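The core of any Neumann-IFT-style hypergradient is approximating an inverse-Hessian-vector product H^{-1}v using only Hessian-vector products, via the truncated Neumann series H^{-1}v ≈ α Σ_{k=0}^{K} (I − αH)^k v (valid when the spectral radius of I − αH is below 1). As a minimal sketch of that standard building block (not the paper's full improved algorithm, whose stabilizations are not reproduced here; the function name `neumann_inverse_hvp` and the toy matrix are our own):

```python
import numpy as np

def neumann_inverse_hvp(hvp, v, alpha=0.1, K=20):
    """Approximate H^{-1} v with the truncated Neumann series
    H^{-1} v ~= alpha * sum_{k=0}^{K} (I - alpha*H)^k v,
    using only Hessian-vector products hvp(x) = H @ x."""
    p = v.copy()     # current term (I - alpha*H)^k v
    acc = v.copy()   # running partial sum, starting from k = 0
    for _ in range(K):
        p = p - alpha * hvp(p)  # advance to the next power of (I - alpha*H)
        acc = acc + p
    return alpha * acc

# Toy check against a direct solve on a small SPD matrix (hypothetical example).
H = np.array([[3.0, 1.0], [1.0, 2.0]])
v = np.array([1.0, 1.0])
approx = neumann_inverse_hvp(lambda x: H @ x, v, alpha=0.2, K=200)
exact = np.linalg.solve(H, v)
```

In a GNN setting `hvp` would come from automatic differentiation of the inner training loss, so the Hessian is never materialized; this is what makes the approach memory-efficient on large graphs.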
Keywords:
Machine Learning: Hyperparameter Optimization
Computer Vision: Scene analysis and understanding   
Data Mining: Mining Graphs