General Heterogeneous Transfer Distance Metric Learning via Knowledge Fragments Transfer

Yong Luo, Yonggang Wen, Tongliang Liu, Dacheng Tao

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 2450-2456. https://doi.org/10.24963/ijcai.2017/341

Transfer learning aims to improve the performance of a target learning task by leveraging information from (or transferring knowledge learned in) other related tasks. Recently, transfer distance metric learning (TDML) has attracted considerable interest, but most existing methods assume that the source and target learning tasks share the same feature representation. Hence, they are not suitable for applications in which the data come from heterogeneous domains (different feature spaces, modalities, or even semantics). Although some existing heterogeneous transfer learning (HTL) approaches are able to handle such domains, they lack flexibility in real-world applications, and the learned transformations are often restricted to be linear. We therefore develop a general and flexible heterogeneous TDML (HTDML) framework based on a knowledge fragment transfer strategy. In the proposed HTDML, any (linear or nonlinear) distance metric learning algorithm can be employed to learn the source metric beforehand. A set of knowledge fragments is then extracted from the pre-learned source metric to help target metric learning. In addition, either a linear or a nonlinear distance metric can be learned for the target domain. Extensive experiments on both scene classification and object recognition demonstrate the superiority of the proposed method.
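The pipeline sketched in the abstract (pre-learn a source metric, extract fragments from it, then fit the target metric to those fragments on co-occurrence data) can be illustrated with a minimal NumPy sketch. This is not the authors' algorithm: the fragment extraction (Cholesky factor of a Mahalanobis metric), the synthetic paired data, and the least-squares fragment fitting are all illustrative assumptions standing in for the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical co-occurrence data: n instances observed in both domains
# (e.g., the same image described by two different feature extractors).
n, d_s, d_t = 200, 10, 15
X_s = rng.standard_normal((n, d_s))          # source-domain features
X_t = X_s @ rng.standard_normal((d_s, d_t))  # target features, linked to source
X_t += 0.1 * rng.standard_normal((n, d_t))   # plus noise

# Step 1: pre-learn a source metric M_s = L_s^T L_s with any DML algorithm.
# Here we simply use the inverse covariance (a plain Mahalanobis metric).
cov = np.cov(X_s, rowvar=False) + 1e-3 * np.eye(d_s)
M_s = np.linalg.inv(cov)
L_s = np.linalg.cholesky(M_s).T              # "knowledge fragments": rows of L_s

# Step 2: transfer the fragments. Fit a target transformation U so that the
# target projection x_t @ U mimics the fragment outputs L_s x_s on paired data.
F = X_s @ L_s.T                              # fragment outputs on the source side
U, *_ = np.linalg.lstsq(X_t, F, rcond=None)  # least squares: X_t @ U ~= F

# Step 3: the learned target metric is M_t = U U^T, so that
# (x - y) @ M_t @ (x - y) approximates source-metric distances.
M_t = U @ U.T

a, b = X_t[0], X_t[1]
d_target = (a - b) @ M_t @ (a - b)
sa, sb = X_s[0], X_s[1]
d_source = (sa - sb) @ M_s @ (sa - sb)
print(f"source-metric distance: {d_source:.3f}, transferred: {d_target:.3f}")
```

Because the fragments are fixed functions of the source metric, any nonlinear regressor could replace the least-squares step to obtain a nonlinear target metric, which is the flexibility the abstract emphasizes.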
Keywords:
Machine Learning: Machine Learning
Machine Learning: Transfer, Adaptation, Multi-task Learning