Exploring Safety Supervision for Continual Test-time Domain Adaptation
Xu Yang, Yanan Gu, Kun Wei, Cheng Deng
Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence
Main Track. Pages 1649-1657.
https://doi.org/10.24963/ijcai.2023/183
Continual test-time domain adaptation aims to adapt a source pre-trained model to a continually changing target domain without using any source data. Unfortunately, existing methods based on pseudo-label learning suffer in a changing target environment: the quality of the generated pseudo-labels degrades under domain shift, leading to instantaneous negative learning and long-term knowledge forgetting. To solve these problems, in this paper, we propose a simple yet effective framework for exploring safety supervision with three elaborate strategies: Label Safety, Sample Safety, and Parameter Safety. Firstly, to select reliable pseudo-labels, we define and adjust the confidence threshold in a self-adaptive manner according to the test-time learning status. Secondly, a soft-weighted contrastive learning module is presented to explore highly correlated samples and discriminate uncorrelated ones, improving the instantaneous efficiency of the model. Finally, we propose a Soft Weight Alignment strategy that normalizes the distance between the parameters of the adapted model and those of the source pre-trained model, which alleviates long-term knowledge forgetting and significantly improves the accuracy of the adapted model in the late adaptation stage. Extensive experimental results demonstrate that our method achieves state-of-the-art performance on several benchmark datasets.
Keywords:
Computer Vision: CV: Transfer, low-shot, semi- and un-supervised learning
Computer Vision: CV: Representation learning