Clarinet: A One-step Approach Towards Budget-friendly Unsupervised Domain Adaptation
Yiyang Zhang, Feng Liu, Zhen Fang, Bo Yuan, Guangquan Zhang, Jie Lu
Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 2526-2532.
https://doi.org/10.24963/ijcai.2020/350
In unsupervised domain adaptation (UDA), classifiers for the target domain are trained with massive true-label data from the source domain and unlabeled data from the target domain. However, it may be difficult to collect fully-true-label data in a source domain given a limited budget. To mitigate this problem, we consider a novel problem setting, budget-friendly UDA (BFUDA), where the classifier for the target domain has to be trained with complementary-label data from the source domain and unlabeled data from the target domain. The key benefit is that collecting complementary-label source data (required by BFUDA) is much less costly than collecting true-label source data (required by ordinary UDA). To this end, the complementary label adversarial network (CLARINET) is proposed to solve the BFUDA problem. CLARINET maintains two deep networks simultaneously: one focuses on classifying complementary-label source data, and the other takes care of source-to-target distributional adaptation. Experiments show that CLARINET significantly outperforms a series of competent baselines.
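A complementary label marks one class an example does *not* belong to. To make this concrete, here is a minimal NumPy sketch of one common complementary-label loss, which penalizes the probability mass a classifier places on each example's complementary class. This is an illustrative assumption for exposition, not necessarily the exact objective used in CLARINET (`softmax` and `complementary_loss` are hypothetical helper names):

```python
import numpy as np

def softmax(logits):
    # Numerically stable row-wise softmax.
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def complementary_loss(logits, comp_labels):
    """Mean of -log(1 - p_{y_bar}): discourage probability mass on
    the class each example is known NOT to belong to."""
    p = softmax(logits)
    n = logits.shape[0]
    p_comp = p[np.arange(n), comp_labels]
    return float(-np.log(1.0 - p_comp + 1e-12).mean())

# A classifier that avoids the complementary class incurs a small loss;
# one that concentrates on it incurs a large loss.
good = np.array([[5.0, 0.0, 0.0]])  # mass away from class 1
bad = np.array([[0.0, 5.0, 0.0]])   # mass on class 1
loss_good = complementary_loss(good, np.array([1]))
loss_bad = complementary_loss(bad, np.array([1]))
```

Training only needs to know which class is ruled out per example, which is why such labels are cheaper to collect than true labels.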
Keywords:
Machine Learning: Transfer, Adaptation, Multi-task Learning
Machine Learning: Semi-Supervised Learning
Machine Learning: Deep Learning