Multi-Task Curriculum Graph Contrastive Learning with Clustering Entropy Guidance
Chusheng Zeng, Bocheng Wang, Jinghui Yuan, Mulin Chen, Xuelong Li
Proceedings of the Thirty-Fourth International Joint Conference on Artificial Intelligence
Main Track. Pages 6949-6957.
https://doi.org/10.24963/ijcai.2025/773
Recent advances in unsupervised deep graph clustering have been significantly promoted by contrastive learning. Despite these strides, most graph contrastive learning models face two challenges: 1) graph augmentation is used to improve learning diversity, but commonly used random augmentation methods may destroy inherent semantics and introduce noise; 2) the fixed positive and negative sample selection strategy ignores the difficulty distribution of samples when dealing with complex real-world data, impeding the model's ability to capture fine-grained patterns and trapping it in sub-optimal solutions for clustering. To address these problems, we propose the Clustering-guided Curriculum Graph contrastive Learning (CurGL) framework. CurGL uses clustering entropy to guide both graph augmentation and contrastive learning. Specifically, according to the clustering entropy, intra-class edges and important features are emphasized during augmentation. Then, a multi-task curriculum learning scheme is proposed, which employs the clustering guidance to shift the focus from the discrimination task to the clustering task. In this way, the sample selection strategy of contrastive learning is adjusted adaptively from the early to the late training stage, enhancing the model's flexibility on complex data structures. Experimental results demonstrate that CurGL achieves excellent performance compared with state-of-the-art competitors.
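To make the clustering-entropy guidance concrete, the sketch below illustrates one plausible reading of the abstract: per-node entropy is computed from soft cluster assignments, and a curriculum weight gradually shifts emphasis from confident (low-entropy) nodes toward harder ones as training progresses. The function names, the normalization, and the linear scheduling are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def clustering_entropy(soft_assign, eps=1e-12):
    """Per-node entropy of soft cluster assignments.

    soft_assign: (N, K) tensor of cluster probabilities per node.
    Low entropy -> confident (easy) node; high entropy -> ambiguous (hard) node.
    """
    p = soft_assign.clamp_min(eps)
    return -(p * p.log()).sum(dim=1)  # shape (N,)

def curriculum_weight(entropy, epoch, total_epochs):
    """Hypothetical curriculum schedule: favor low-entropy (easy) nodes early,
    then flatten the weights so harder nodes contribute more later on."""
    e = (entropy - entropy.min()) / (entropy.max() - entropy.min() + 1e-12)
    progress = epoch / max(total_epochs - 1, 1)
    return (1.0 - e) * (1.0 - progress) + progress  # shape (N,)

# Toy usage: 5 nodes, 3 clusters
q = F.softmax(torch.randn(5, 3), dim=1)   # stand-in for a model's soft assignments
H = clustering_entropy(q)
w = curriculum_weight(H, epoch=10, total_epochs=100)
```

In such a scheme, the weights w could scale per-sample contrastive losses or gate which positive/negative pairs are selected, which is one way the focus could move from the discrimination task to the clustering task over training.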
Keywords:
Machine Learning: ML: Clustering
Machine Learning: ML: Sequence and graph learning
