Metric Learning in Optimal Transport for Domain Adaptation
Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 2162-2168. https://doi.org/10.24963/ijcai.2020/299
Domain Adaptation aims at benefiting from a labeled dataset drawn from a source distribution to learn a model from examples generated from a different but related target distribution. Creating a domain-invariant representation of the source and target domains is the most widely used technique. A simple and robust way to perform this task consists in (i) representing the two domains by subspaces described by their respective eigenvectors and (ii) seeking a mapping function which aligns them. In this paper, we propose to use Optimal Transport (OT) and its associated Wasserstein distance to perform this alignment. While the idea of using OT in domain adaptation is not new, the original contribution of this paper is two-fold: (i) we derive a generalization bound on the target error involving several Wasserstein distances. This prompts us to optimize the ground metric of OT to reduce the target risk; (ii) from this theoretical analysis, we design an algorithm (MLOT) which optimizes a Mahalanobis distance, leading to a transportation plan better suited to adaptation. Extensive experiments demonstrate the effectiveness of this original approach.
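The core ingredient described above, an OT plan computed under a Mahalanobis ground metric, can be illustrated with a minimal NumPy sketch. This is not the authors' MLOT implementation: the metric matrix `M` is fixed to the identity here (MLOT would learn it from the data), the function names are ours, and entropic regularization via plain Sinkhorn iterations stands in for whatever solver the paper uses.

```python
import numpy as np

def mahalanobis_cost(Xs, Xt, M):
    # Pairwise squared Mahalanobis distances (x - y)^T M (x - y)
    # between source rows Xs and target rows Xt.
    diff = Xs[:, None, :] - Xt[None, :, :]
    return np.einsum('ijk,kl,ijl->ij', diff, M, diff)

def sinkhorn(a, b, C, reg=1.0, n_iter=200):
    # Entropy-regularized OT: alternate scaling of the Gibbs kernel
    # until the plan's marginals match the weight vectors a and b.
    K = np.exp(-C / reg)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

rng = np.random.default_rng(0)
Xs = rng.normal(size=(5, 3))                 # source samples
Xt = rng.normal(loc=1.0, size=(6, 3))        # shifted target samples
M = np.eye(3)                                # identity metric; MLOT learns M instead
C = mahalanobis_cost(Xs, Xt, M)
a = np.full(5, 1 / 5)                        # uniform source weights
b = np.full(6, 1 / 6)                        # uniform target weights
P = sinkhorn(a, b, C)                        # transportation plan, shape (5, 6)
```

With the identity metric this cost reduces to the squared Euclidean distance; learning `M` reshapes the cost matrix `C`, and hence the plan `P`, which is what lets the transport adapt to the task.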
Machine Learning: Transfer, Adaptation, Multi-task Learning
Machine Learning: Unsupervised Learning