Inconsistency-Based Federated Active Learning
Chen-Chen Zong, Tong Jin, Sheng-Jun Huang
Proceedings of the Thirty-Fourth International Joint Conference on Artificial Intelligence
Main Track. Pages 7300-7308.
https://doi.org/10.24963/ijcai.2025/812
Federated learning (FL) enables distributed collaborative learning across local clients while preserving data privacy. However, its practical application in weakly supervised learning (WSL), where only a small subset of data is labeled, remains underexplored. Active learning (AL) is a promising solution for label-limited scenarios, but its adaptation to federated settings presents unique challenges, such as data heterogeneity and noise. In this paper, we propose Inconsistency-based Federated Active Learning (IFAL), a novel approach to address these challenges. First, we introduce a data-driven probability formulation that aligns the biases between local and global models in heterogeneous FL settings. Next, to mitigate noise, we propose an inter-model inconsistency criterion that filters out noisy examples and focuses on those with beneficial prediction discrepancies. Additionally, we introduce an intra-model inconsistency criterion to query examples that help refine the model’s decision boundaries. By combining these strategies with clustering, IFAL effectively selects a diverse and informative query set. Extensive experiments on benchmark datasets demonstrate that IFAL outperforms state-of-the-art methods.
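To make the query-selection pipeline described above more concrete, the sketch below illustrates one way the inter-model and intra-model inconsistency scores could be combined with clustering. The paper's exact probability formulation, inconsistency criteria, and noise-filtering step are not given on this page, so the specific choices here (symmetric KL divergence between local and global predictions, a top-2 margin score, and k-means over feature embeddings) are assumptions for illustration, not the authors' method.

```python
# Illustrative sketch only: the concrete scores below are assumed, not taken
# from the paper. The noise-filtering component of IFAL is omitted here.
import numpy as np
from sklearn.cluster import KMeans

def inter_model_inconsistency(p_local, p_global, eps=1e-12):
    """Symmetric KL divergence between local and global softmax outputs."""
    kl_lg = np.sum(p_local * np.log((p_local + eps) / (p_global + eps)), axis=1)
    kl_gl = np.sum(p_global * np.log((p_global + eps) / (p_local + eps)), axis=1)
    return 0.5 * (kl_lg + kl_gl)

def intra_model_inconsistency(p):
    """Small top-1 vs. top-2 margin => example lies near the decision boundary."""
    part = np.partition(p, -2, axis=1)
    return 1.0 - (part[:, -1] - part[:, -2])

def select_queries(p_local, p_global, features, budget, alpha=0.5):
    """Rank unlabeled examples by a combined inconsistency score, then pick
    one representative per k-means cluster to obtain a diverse query set."""
    score = alpha * inter_model_inconsistency(p_local, p_global) \
            + (1 - alpha) * intra_model_inconsistency(p_local)
    pool = np.argsort(-score)[: budget * 5]               # candidate pool
    km = KMeans(n_clusters=budget, n_init=10).fit(features[pool])
    queries = []
    for c in range(budget):
        members = pool[km.labels_ == c]
        queries.append(members[np.argmax(score[members])])  # most inconsistent per cluster
    return np.array(queries)
```

In this hedged reading, `p_local` and `p_global` are the softmax outputs of a client's local model and the aggregated global model on its unlabeled pool; examples where the two models disagree (and where the local model is itself uncertain) are prioritized, and clustering keeps the queried batch diverse.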
Keywords:
Machine Learning: ML: Federated learning
Machine Learning: ML: Active learning
Machine Learning: ML: Classification
Machine Learning: ML: Weakly supervised learning
