ESBN: Estimation Shift of Batch Normalization for Source-free Universal Domain Adaptation

Jiao Li, Houcheng Su, Bingli Wang, Yuandong Min, Mengzhu Wang, Nan Yin, Shanshan Wang, Jingcai Guo

Proceedings of the Thirty-Fourth International Joint Conference on Artificial Intelligence
Main Track. Pages 1386-1394. https://doi.org/10.24963/ijcai.2025/155

Domain adaptation (DA) aims to transfer a model trained on one domain so that it performs well on a different, often unseen, domain. Traditional methods, including unsupervised domain adaptation (UDA) and source-free domain adaptation (SFDA), have made significant progress. However, most existing DA methods rely heavily on Batch Normalization (BN) layers, which are suboptimal in source-free settings, where the source domain is unavailable for comparison. In this study, we propose a novel method, ESBN, which addresses the challenge of domain shift by adjusting the placement of normalization layers and replacing BN with Batch-free Normalization (BFN). Unlike BN, BFN is less dependent on batch statistics and provides more robust feature representations through instance-specific statistics. We systematically investigate the effects of different BN layer placements across various network configurations and demonstrate that selective replacement with BFN improves generalization performance. Extensive experiments on multiple domain adaptation benchmarks show that our approach outperforms state-of-the-art methods, particularly in challenging scenarios such as Open-Partial Domain Adaptation (OPDA).
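The contrast the abstract draws between BN's batch statistics and BFN's instance-specific statistics can be illustrated with a minimal NumPy sketch. The `batch_free_norm` function below is a hypothetical stand-in using per-instance, per-channel statistics; the paper's actual BFN formulation is not given in the abstract, so this only demonstrates the batch-independence property, not the published method.

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # Standard BN: each channel is normalized with statistics
    # computed over the whole batch (axes N, H, W).
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def batch_free_norm(x, eps=1e-5):
    # Hypothetical BFN stand-in: each sample is normalized with its
    # own per-instance, per-channel statistics (axes H, W only), so
    # the output does not depend on the other samples in the batch.
    mean = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

# One target-domain sample, normalized alone vs. inside a batch whose
# other samples come from a shifted distribution (simulated domain shift).
rng = np.random.default_rng(0)
sample = rng.normal(loc=2.0, size=(1, 3, 8, 8))
shifted = rng.normal(loc=-3.0, size=(7, 3, 8, 8))
batch = np.concatenate([sample, shifted], axis=0)

print(np.allclose(batch_free_norm(sample)[0],
                  batch_free_norm(batch)[0]))   # True: batch-independent
print(np.allclose(batch_norm(sample)[0],
                  batch_norm(batch)[0]))        # False: BN output shifts
```

The second comparison shows why BN is fragile in source-free settings: the same sample produces different features depending on which other samples share its batch, whereas the instance-statistics variant does not.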
Keywords:
Computer Vision: CV: Transfer, low-shot, semi- and un-supervised learning
Computer Vision: CV: Recognition (object detection, categorization)