Learning Attributed Graph Representation with Communicative Message Passing Transformer

Jianwen Chen, Shuangjia Zheng, Ying Song, Jiahua Rao, Yuedong Yang

Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence
Main Track. Pages 2242-2248. https://doi.org/10.24963/ijcai.2021/309

Constructing appropriate representations of molecules lies at the core of numerous tasks in materials science, chemistry, and drug design. Recent studies abstract molecules as attributed graphs and employ graph neural networks (GNNs) for molecular representation learning, achieving remarkable results in molecular graph modeling. Albeit powerful, current models either rely on local aggregation operations and thus miss higher-order graph properties, or focus only on node information without fully exploiting the edge information. To address this, we propose a Communicative Message Passing Transformer (CoMPT) neural network that improves molecular graph representation by reinforcing message interactions between nodes and edges on top of the Transformer architecture. Unlike previous transformer-style GNNs that treat a molecule as a fully connected graph, we introduce a message diffusion mechanism that leverages the graph connectivity inductive bias and reduces the message enrichment explosion. Extensive experiments demonstrate that the proposed model achieves superior performance (around 4% improvement on average) over state-of-the-art baselines on seven chemical property datasets (graph-level tasks) and two chemical shift datasets (node-level tasks). Further visualization studies also indicate the better representation capacity achieved by our model.
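The abstract contrasts fully connected attention with a message diffusion mechanism that respects graph connectivity. As a rough intuition only (the paper's actual formulation may differ), one way to inject this inductive bias is to attenuate scaled dot-product attention scores by a decay factor raised to the shortest-path distance between atoms, so that nearby nodes dominate while unreachable pairs are masked out. The sketch below illustrates this idea; the `decay` parameter and the exponential-distance weighting are illustrative assumptions, not the authors' method.

```python
import numpy as np

def shortest_path_distances(adj):
    """All-pairs shortest-path hop counts via Floyd-Warshall
    on a binary adjacency matrix (np.inf = unreachable)."""
    n = adj.shape[0]
    dist = np.where(adj > 0, 1.0, np.inf)
    np.fill_diagonal(dist, 0.0)
    for k in range(n):
        dist = np.minimum(dist, dist[:, [k]] + dist[[k], :])
    return dist

def diffusion_attention(Q, K, V, adj, decay=0.5):
    """Scaled dot-product attention where each (i, j) score is
    down-weighted by decay**d(i, j), with d the hop distance.
    Unreachable node pairs receive zero attention weight.
    NOTE: an illustrative sketch of distance-aware attention,
    not the CoMPT model itself."""
    d = shortest_path_distances(adj)
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights * np.where(np.isinf(d), 0.0, decay ** np.where(np.isinf(d), 0.0, d))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Toy 3-atom path graph: 0 - 1 - 2
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((3, 4)) for _ in range(3))
out = diffusion_attention(Q, K, V, adj)  # shape (3, 4)
```

With `decay < 1`, attention between atoms 0 and 2 (two hops apart) is damped relative to bonded neighbors, which is one way to avoid the "message enrichment explosion" of treating the molecule as fully connected.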
Keywords:
Machine Learning: Deep Learning
Multidisciplinary Topics and Applications: Biology and Medicine
Machine Learning Applications: Bio/Medicine
Machine Learning Applications: Applications of Supervised Learning