Blocking-based Neighbor Sampling for Large-scale Graph Neural Networks
Kai-Lang Yao, Wu-Jun Li
Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence
Main Track. Pages 3307-3313.
https://doi.org/10.24963/ijcai.2021/455
The exponential increase in computation and memory complexity with network depth has become the main impediment to the successful application of graph neural networks (GNNs) to large-scale graphs, such as graphs with hundreds of millions of nodes. In this paper, we propose a novel neighbor sampling strategy, dubbed blocking-based neighbor sampling (BNS), for efficient training of GNNs on large-scale graphs. Specifically, BNS adopts a policy that stochastically blocks the ongoing expansion of neighboring nodes, which reduces the rate of the exponential growth in the computation and memory complexity of GNNs. Furthermore, a reweighting policy is applied to graph convolution to adjust the contributions of blocked and non-blocked neighbors to their central nodes. We theoretically prove that BNS provides an unbiased estimate of the original graph convolution operation. Extensive experiments on three benchmark datasets show that, on large-scale graphs, BNS is 2X~5X faster than state-of-the-art methods when achieving the same accuracy. Moreover, even on small-scale graphs, BNS demonstrates the advantage of low time cost.
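The core idea of the abstract can be illustrated with a minimal sketch. The function below is a hypothetical simplification (the graph representation, the `block_prob` parameter, and the illustrative reweighting factor are all assumptions, not the paper's actual algorithm): at each hop, each sampled neighbor is blocked with some probability; blocked neighbors still appear in the sampled subgraph but are not expanded further, which slows the exponential growth of the sampled computation graph with depth.

```python
import random


def bns_sample(adj, seeds, fanout, block_prob, depth, rng=random):
    """Hypothetical sketch of blocking-based neighbor sampling (BNS).

    adj: dict mapping node -> list of neighbor nodes.
    Returns one edge list per hop; each edge is (src, dst, weight, blocked).
    Blocked neighbors are kept in the sample (with an illustrative
    reweighting factor) but are never expanded in later hops.
    """
    layers = []
    frontier = list(seeds)
    for _ in range(depth):
        layer = []
        next_frontier = []
        for u in frontier:
            neigh = adj.get(u, [])
            if not neigh:
                continue
            k = min(fanout, len(neigh))
            for v in rng.sample(neigh, k):
                blocked = rng.random() < block_prob
                # Illustrative reweighting: upweight non-blocked neighbors
                # to compensate for the expected fraction that is blocked.
                w = 1.0 if blocked else 1.0 / max(1.0 - block_prob, 1e-9)
                layer.append((u, v, w, blocked))
                if not blocked:
                    next_frontier.append(v)  # only non-blocked nodes expand
        layers.append(layer)
        frontier = next_frontier
    return layers
```

With `block_prob = 0` this degenerates to plain recursive neighbor sampling (frontier grows roughly by a factor of `fanout` per hop); with `block_prob = 1` the expansion stops after the first hop, which shows how blocking trades depth-wise growth for cheaper computation.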
Keywords:
Machine Learning: Relational Learning
Machine Learning Applications: Big data; Scalability
Data Mining: Mining Graphs, Semi Structured Data, Complex Data