OS-GCL: A One-Shot Learner in Graph Contrastive Learning

Cheng Ji, Chenrui He, Qian Li, Qingyun Sun, Xingcheng Fu, Jianxin Li

Proceedings of the Thirty-Fourth International Joint Conference on Artificial Intelligence
Main Track. Pages 2964-2972. https://doi.org/10.24963/ijcai.2025/330

Graph contrastive learning (GCL) enhances self-supervised learning for graph representation. However, prior research has overlooked a fundamental property of GCL: guided by the widely used noise contrastive estimation objective (e.g., the InfoNCE loss), GCL operates as a one-shot learner. To investigate the factors behind this one-shot nature, we theoretically analyze the InfoNCE-based objective and derive its equivalent form as a softmax-based cross-entropy function. We conclude that InfoNCE-based GCL is in effect a (2n-1)-way 1-shot classifier, where n is the number of nodes: each sample represents a distinct conceptual class, and each class contains only one sample. Consequently, the one-shot nature of GCL leads to the issue of limited self-supervised signal. To address this issue, we propose OS-GCL, a One-Shot Learner in Graph Contrastive Learning. First, we estimate the underlying probability distributions of the deterministic node features and the discrete graph topology. Second, we develop a probabilistic message-passing mechanism that propagates the feature distributions over the topology distribution. Third, we propose the ProbNCE loss to contrast distributions. Extensive experimental results demonstrate the superiority of OS-GCL. To the best of our knowledge, this is the first study to examine the one-shot learning nature of GCL and its resulting limited self-supervised signal.
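
To make the (2n-1)-way 1-shot reading concrete, the following is a sketch of the standard inter- and intra-view InfoNCE objective common in GCL (e.g., GRACE); the symbols theta (similarity) and tau (temperature) follow convention and may differ from the paper's own notation:

\[
\ell(u_i) = -\log \frac{e^{\theta(u_i, v_i)/\tau}}{e^{\theta(u_i, v_i)/\tau} + \sum_{k \neq i} e^{\theta(u_i, v_k)/\tau} + \sum_{k \neq i} e^{\theta(u_i, u_k)/\tau}}
\]

where u_i and v_i are the two augmented views of node i. The denominator spans 1 + (n-1) + (n-1) = 2n-1 candidates, so minimizing this loss is exactly the softmax cross-entropy of a (2n-1)-way classification in which every candidate forms its own class containing a single sample.

The sketch below illustrates the three-step recipe under simplifying assumptions of our own, not necessarily the paper's design: node features modeled as diagonal Gaussians, topology as independent Bernoulli edges, one closed-form probabilistic message-passing step, and a ProbNCE-style loss that scores similarity by the negative squared 2-Wasserstein distance between node distributions. All function names here are hypothetical.

import numpy as np

def prob_message_passing(mu, var, p_adj):
    """One probabilistic message-passing step (illustrative assumption):
    Gaussian features x_j ~ N(mu_j, var_j) propagated over Bernoulli
    edges b_ij ~ Bern(p_ij). For h_i = sum_j b_ij x_j (all independent):
        E[h_i]   = sum_j p_ij * mu_j
        Var[h_i] = sum_j p_ij * var_j + p_ij * (1 - p_ij) * mu_j**2
    """
    new_mu = p_adj @ mu
    new_var = p_adj @ var + (p_adj * (1.0 - p_adj)) @ (mu ** 2)
    return new_mu, new_var

def pairwise_w2_sq(mu_a, var_a, mu_b, var_b):
    """Pairwise squared 2-Wasserstein distance between diagonal Gaussians."""
    d_mu = ((mu_a[:, None, :] - mu_b[None, :, :]) ** 2).sum(-1)
    d_sd = ((np.sqrt(var_a)[:, None, :] - np.sqrt(var_b)[None, :, :]) ** 2).sum(-1)
    return d_mu + d_sd

def probnce(mu1, var1, mu2, var2, tau=0.5):
    """ProbNCE-style loss (our stand-in, not the paper's exact form):
    InfoNCE with similarity = -W2^2 / tau between node distributions.
    Each anchor faces its positive in the other view against 2(n-1)
    inter- and intra-view negatives, i.e., a (2n-1)-way softmax."""
    n = mu1.shape[0]
    inter = -pairwise_w2_sq(mu1, var1, mu2, var2) / tau   # (n, n)
    intra = -pairwise_w2_sq(mu1, var1, mu1, var1) / tau   # (n, n)
    np.fill_diagonal(intra, -np.inf)                      # mask self term
    logits = np.concatenate([inter, intra], axis=1)       # (n, 2n)
    m = logits.max(axis=1, keepdims=True)                 # stable log-sum-exp
    log_denom = m[:, 0] + np.log(np.exp(logits - m).sum(axis=1))
    pos = np.diag(inter)
    return float(np.mean(log_denom - pos))

# Toy run: 6 nodes, 4-dim features, two views from perturbed edge probabilities.
rng = np.random.default_rng(0)
n, d = 6, 4
mu = rng.normal(size=(n, d))
var = rng.uniform(0.1, 0.5, size=(n, d))
p_adj = rng.uniform(size=(n, n)); p_adj = (p_adj + p_adj.T) / 2
mu1, var1 = prob_message_passing(mu, var, p_adj)
mu2, var2 = prob_message_passing(mu, var, np.clip(p_adj + 0.05, 0.0, 1.0))
print("ProbNCE:", probnce(mu1, var1, mu2, var2))

Note that each anchor's softmax in probnce runs over 2n logits with the self term masked out, i.e., over 2n-1 finite candidates, mirroring the one-shot classification view derived above.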
Keywords:
Data Mining: DM: Mining graphs