One-step Label Shift Adaptation via Robust Weight Estimation
Ruidong Fan, Xiao Ouyang, Tingjin Luo, Lijun Zhang, Chenping Hou

Proceedings of the Thirty-Fourth International Joint Conference on Artificial Intelligence
Main Track. Pages 5074-5082. https://doi.org/10.24963/ijcai.2025/565

Label shift is a prevalent phenomenon in open environments, characterized by a notable discrepancy between the label distributions of the source (training) and target (test) domains, while the conditional distributions given the labels remain invariant. Existing label shift methods adopt a two-step strategy: first estimating the importance weights, then using them to calibrate the target outputs. However, this conventional strategy overlooks the intricate interplay between output adjustment and weight estimation. In this paper, we introduce a novel approach termed One-step Label Shift Adaptation (OLSA). Our method jointly learns the predictive model and the corresponding weights through a bi-level optimization framework, with the objective of minimizing an upper bound on the target risk. To enhance the robustness of the proposed model, we incorporate a debiasing term into the upper-level classifier training and devise a regularization term for the lower-level weight estimation. Furthermore, we present theoretical analyses of the generalization bounds, offering guarantees for the model's performance. Extensive experimental results substantiate the efficacy of our proposal.
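To make the setting concrete, the following is a minimal sketch of the conventional two-step strategy the abstract contrasts against (confusion-matrix-based weight estimation in the style of Black Box Shift Estimation), not the paper's OLSA method. The data, classifier, and all names here are illustrative assumptions: two Gaussian classes with invariant p(x|y) but shifted label priors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative label-shift setup: p(x|y) is invariant (class-conditional
# Gaussians), only the label prior p(y) differs between domains.
def sample(n, prior):
    y = (rng.random(n) < prior).astype(int)          # P(y=1) = prior
    x = rng.normal(loc=np.where(y == 1, 2.0, -2.0), scale=1.0)
    return x, y

xs, ys = sample(20000, 0.5)    # source domain: balanced labels
xt, _ = sample(20000, 0.8)     # target domain: 80% class 1, labels unseen

# A fixed source-trained classifier (thresholding at 0 is near-optimal here).
def pred(x):
    return (x > 0).astype(int)

# Step 1 of the two-step strategy: estimate importance weights w(y) =
# p_target(y) / p_source(y) by solving C w = mu, where
#   C[i, j] = P_source(pred = i, y = j)   (source confusion matrix)
#   mu[i]   = P_target(pred = i)          (target prediction distribution)
C = np.zeros((2, 2))
for i in range(2):
    for j in range(2):
        C[i, j] = np.mean((pred(xs) == i) & (ys == j))
mu = np.array([np.mean(pred(xt) == i) for i in range(2)])
w = np.linalg.solve(C, mu)

# Step 2 would reweight the classifier's outputs by w. True weights here
# are [0.2/0.5, 0.8/0.5] = [0.4, 1.6]; the estimate should be close.
print(w)
```

Because the weights are estimated first and only then used to adjust the classifier, errors in step 1 propagate uncorrected into step 2; this decoupling is exactly what the bi-level, one-step formulation above is designed to avoid.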
Keywords:
Machine Learning: ML: Learnware/model reuse/transfer learning
Machine Learning: ML: Semi-supervised learning