Few-Shot Adaptation of Pre-Trained Networks for Domain Shift
Wenyu Zhang, Li Shen, Wanyue Zhang, Chuan-Sheng Foo
Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence
Main Track. Pages 1665-1671.
https://doi.org/10.24963/ijcai.2022/232
Deep networks are prone to performance degradation when there is a domain shift between the source (training) data and target (test) data. Recent test-time adaptation methods update the batch normalization layers of pre-trained source models deployed in new target environments using streaming data. Although these methods can adapt on the fly without first collecting a large target domain dataset, their performance depends on streaming conditions such as mini-batch size and class distribution, which can be unpredictable in practice. In this work, we propose a framework for few-shot domain adaptation to address the practical challenges of data-efficient adaptation. Specifically, we propose a constrained optimization of feature normalization statistics in pre-trained source models, supervised by a small target domain support set. Our method is easy to implement and improves source model performance with as little as one sample per class for classification tasks. Extensive experiments on 5 cross-domain classification and 4 semantic segmentation datasets show that our proposed method achieves more accurate and reliable performance than test-time adaptation, while not being constrained by streaming conditions.
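The core idea of adapting feature normalization statistics toward a target domain can be illustrated with a minimal sketch. The snippet below is not the paper's implementation: it simply blends source-domain normalization statistics with statistics estimated from a small labeled support set, with a clipped interpolation coefficient `alpha` standing in for the paper's constrained optimization. All function and parameter names are illustrative assumptions.

```python
import numpy as np

def adapt_norm_stats(source_mean, source_var, support_feats, alpha=0.5):
    """Blend source-domain normalization statistics with statistics
    estimated from a small target-domain support set.

    alpha in [0, 1] controls how far the statistics move toward the
    target domain; clipping it is a simplified stand-in for the
    constrained optimization described in the paper.
    """
    alpha = float(np.clip(alpha, 0.0, 1.0))  # keep the update constrained
    target_mean = support_feats.mean(axis=0)
    target_var = support_feats.var(axis=0)
    new_mean = (1 - alpha) * source_mean + alpha * target_mean
    new_var = (1 - alpha) * source_var + alpha * target_var
    return new_mean, new_var

def normalize(feats, mean, var, eps=1e-5):
    # Apply feature normalization with the adapted statistics.
    return (feats - mean) / np.sqrt(var + eps)
```

With `alpha = 0` the source statistics are kept unchanged, while `alpha = 1` fully re-estimates them from the support set; intermediate values trade off between the two, which is useful when the support set is as small as one sample per class.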
Keywords:
Computer Vision: Transfer, low-shot, semi- and un- supervised learning
Machine Learning: Robustness