Multi-Channel Pooling Graph Neural Networks

Jinlong Du, Senzhang Wang, Hao Miao, Jiaqiang Zhang

Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence
Main Track. Pages 1442-1448. https://doi.org/10.24963/ijcai.2021/199

Graph pooling is a critical operation for downsampling a graph in graph neural networks. Existing coarsening pooling methods (e.g. DiffPool) mostly focus on capturing the global topology structure by assigning the nodes to several coarse clusters, while dropping pooling methods (e.g. SAGPool) try to preserve the local topology structure by selecting the top-k representative nodes. However, an effective way to integrate the two types of methods, so that both the local and the global topology structure of a graph can be well captured, is still lacking. To address this issue, we propose a Multi-channel Graph Pooling method named MuchPool, which captures the local structure, the global structure, and the node features simultaneously in graph pooling. Specifically, we use two channels to conduct dropping pooling based on the local topology and the node features respectively, and one channel to conduct coarsening pooling. A cross-channel convolution operation is then designed to refine the graph representations of the different channels. Finally, the pooling results are aggregated into the final pooled graph. Extensive experiments on six benchmark datasets demonstrate the superior performance of MuchPool. The code of this work is publicly available on GitHub.
Keywords:
Data Mining: Mining Graphs, Semi Structured Data, Complex Data
Machine Learning: Deep Learning
Machine Learning: Classification
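
As a rough illustration of the multi-channel idea described in the abstract, the following is a minimal PyTorch sketch, not the authors' released code: one channel drops nodes by learned feature scores, one drops nodes by a topology score (here, node degree), and one coarsens the graph with a soft cluster assignment, after which the three results are aggregated onto the retained nodes. The class name, parameters (n_clusters, ratio), and the simple aggregation scheme are illustrative assumptions rather than the paper's exact cross-channel convolution.

```python
import torch
import torch.nn as nn


class MuchPoolSketch(nn.Module):
    """Illustrative sketch of multi-channel pooling (not the paper's implementation).

    Channel 1: dropping pool scored by learned node features (SAGPool-like).
    Channel 2: dropping pool scored by local topology (node degree).
    Channel 3: coarsening pool via a soft cluster assignment (DiffPool-like).
    The channels are aggregated onto the nodes retained by channel 1,
    a simplification of the paper's cross-channel convolution.
    """

    def __init__(self, in_dim, n_clusters, ratio=0.5):
        super().__init__()
        self.ratio = ratio
        self.score = nn.Linear(in_dim, 1)            # feature-based node scores
        self.assign = nn.Linear(in_dim, n_clusters)  # soft cluster assignment

    def forward(self, x, adj):
        # x: (N, F) node features, adj: (N, N) dense adjacency matrix
        n = x.size(0)
        k = max(1, int(self.ratio * n))

        # Channel 1: keep top-k nodes by learned feature scores
        s_feat = self.score(x).squeeze(-1)
        idx_feat = torch.topk(s_feat, k).indices

        # Channel 2: keep top-k nodes by degree (local topology)
        s_topo = adj.sum(dim=1)
        idx_topo = torch.topk(s_topo, k).indices

        # Channel 3: coarsen into clusters with a soft assignment matrix
        assign = torch.softmax(self.assign(x), dim=-1)   # (N, C)
        x_coarse = assign.t() @ x                        # (C, F) cluster features
        adj_coarse = assign.t() @ adj @ assign           # (C, C), unused downstream here

        # Aggregate: project coarse features back to nodes and combine the
        # three channels on the nodes kept by channel 1.
        x_back = assign @ x_coarse                       # (N, F)
        keep = idx_feat
        x_pool = x[keep] * torch.sigmoid(s_feat[keep]).unsqueeze(-1) \
            + x_back[keep] + x[idx_topo].mean(0, keepdim=True)
        adj_pool = adj[keep][:, keep]
        return x_pool, adj_pool


# Toy usage on a random 6-node graph
if __name__ == "__main__":
    x = torch.randn(6, 8)
    adj = (torch.rand(6, 6) > 0.5).float()
    adj = ((adj + adj.t()) > 0).float()
    pool = MuchPoolSketch(in_dim=8, n_clusters=3)
    x_p, adj_p = pool(x, adj)
    print(x_p.shape, adj_p.shape)  # torch.Size([3, 8]) torch.Size([3, 3])
```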