Mitigating Over-Smoothing in Graph Neural Networks via Separation Coefficient-Guided Adaptive Graph Structure Adjustment
Hanyang Meng, Jielong Yang, Li Peng
Proceedings of the Thirty-Fourth International Joint Conference on Artificial Intelligence
Main Track. Pages 5959-5966.
https://doi.org/10.24963/ijcai.2025/663
As the number of layers in Graph Neural Networks (GNNs) increases, over-smoothing becomes more severe: intra-class feature distances shrink, and representations of different classes (heterogeneous representations) converge. Most existing methods attempt to address this issue with heuristic shortcut mechanisms or with optimization objectives that constrain inter-class feature differences. However, these approaches do not establish a theoretical connection between message passing and the change in inter-class feature differences, making it difficult to design methods that target the key influencing factors. To close this gap, this paper first introduces the separation coefficient, which quantifies how inter-class feature distances contract during multi-layer message passing. Building on this analysis, we propose a low-complexity, pluggable, pseudo-label-based adaptive graph structure adjustment method. The method increases the separation coefficient of inter-class features while preserving intra-class compactness, thereby alleviating the convergence of heterogeneous representations caused by multi-layer aggregation. Experimental results show that the proposed method significantly improves the discriminability of node representations and boosts node classification performance across various datasets and backbone models.
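The abstract does not give the formal definition of the separation coefficient or the exact structure-adjustment rule, so the sketch below is only an illustration of the general idea under stated assumptions: it uses a centroid-distance ratio as a proxy for the coefficient and a simple down-weighting of edges between nodes with differing pseudo-labels as a stand-in for the adaptive adjustment. The function names (separation_ratio, reweight_edges_by_pseudo_labels) and the down_weight parameter are hypothetical and not taken from the paper.

import numpy as np

def normalized_adjacency(A):
    # Symmetrically normalized adjacency with self-loops: D^{-1/2} (A + I) D^{-1/2}.
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def mean_interclass_distance(X, labels):
    # Average Euclidean distance between class centroids.
    classes = np.unique(labels)
    centroids = np.stack([X[labels == c].mean(axis=0) for c in classes])
    pairs = [(i, j) for i in range(len(classes)) for j in range(i + 1, len(classes))]
    return float(np.mean([np.linalg.norm(centroids[i] - centroids[j]) for i, j in pairs]))

def separation_ratio(A, X, labels, num_layers):
    # Proxy "separation coefficient": inter-class centroid distance after
    # num_layers parameter-free propagation steps, divided by the initial one.
    # Values well below 1 reflect the contraction associated with over-smoothing.
    P = normalized_adjacency(A.astype(float))
    H = X.astype(float).copy()
    d0 = mean_interclass_distance(H, labels)
    for _ in range(num_layers):
        H = P @ H  # feature averaging, the smoothing component of message passing
    return mean_interclass_distance(H, labels) / d0

def reweight_edges_by_pseudo_labels(A, pseudo_labels, down_weight=0.1):
    # Assumed adjustment rule: shrink edges whose endpoints receive different
    # pseudo-labels while keeping intra-class edges intact, so that propagation
    # mixes classes less aggressively.
    A_new = A.astype(float).copy()
    different = pseudo_labels[:, None] != pseudo_labels[None, :]
    A_new[(A_new > 0) & different] *= down_weight
    return A_new

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy graph: two 20-node communities, denser within classes than between them.
    n, labels = 40, np.repeat([0, 1], 20)
    same = labels[:, None] == labels[None, :]
    A = (rng.random((n, n)) < np.where(same, 0.3, 0.1)).astype(float)
    A = np.triu(A, 1); A = A + A.T                      # symmetric, no self-loops
    X = rng.normal(size=(n, 8)) + labels[:, None]       # class-shifted features

    before = separation_ratio(A, X, labels, num_layers=8)
    A_adj = reweight_edges_by_pseudo_labels(A, pseudo_labels=labels)  # oracle pseudo-labels for the demo
    after = separation_ratio(A_adj, X, labels, num_layers=8)
    print(f"separation ratio before adjustment: {before:.3f}, after: {after:.3f}")

In this toy setup, the ratio after edge reweighting is larger than before, which is the qualitative effect the paper attributes to its method; the actual approach is pluggable into trained GNN backbones and derives pseudo-labels from model predictions rather than from ground truth.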
Keywords:
Machine Learning: ML: Semi-supervised learning
Machine Learning: ML: Representation learning
Machine Learning: ML: Theory of deep learning
Machine Learning: ML: Explainable/Interpretable machine learning
