Completely Heterogeneous Transfer Learning with Attention - What And What Not To Transfer

Seungwhan Moon, Jaime Carbonell

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 2508-2514. https://doi.org/10.24963/ijcai.2017/349

We study a transfer learning framework where the source and target datasets are heterogeneous in both feature and label spaces. Specifically, we do not assume explicit relations between source and target tasks a priori, and thus it is crucial to determine what and what not to transfer from source knowledge. Towards this goal, we propose a new heterogeneous transfer learning approach that (1) selects and attends to an optimized subset of source samples to transfer knowledge from, and (2) builds a unified transfer network that learns from both source and target knowledge. This method, termed "Attentional Heterogeneous Transfer", along with a newly proposed unsupervised transfer loss, improves upon previous state-of-the-art approaches on extensive simulations as well as a challenging hetero-lingual text classification task.
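To give a flavor of component (1), the sketch below shows one simple way to attend over source samples: score each source sample against a target representation and normalize the scores with a softmax, so uninformative source samples receive near-zero weight. This is an illustrative sketch only, using cosine similarity in an assumed shared embedding space; the function names and scoring choice are hypothetical and do not reproduce the paper's actual attention mechanism or transfer loss.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_weights(source_embs, target_emb):
    # Hypothetical scoring: cosine similarity between each source-sample
    # embedding and a target representation, normalized so the weights
    # form a distribution over source samples (near-zero weight means
    # "do not transfer from this sample").
    src = source_embs / np.linalg.norm(source_embs, axis=1, keepdims=True)
    tgt = target_emb / np.linalg.norm(target_emb)
    return softmax(src @ tgt)

# Toy example: 4 source samples in an assumed shared 3-d embedding space.
source_embs = np.array([[1.0, 0.0, 0.0],
                        [0.9, 0.1, 0.0],
                        [0.0, 1.0, 0.0],
                        [0.0, 0.0, 1.0]])
target_emb = np.array([1.0, 0.0, 0.0])

w = attention_weights(source_embs, target_emb)
print(w)  # weights sum to 1; the first two (target-aligned) samples dominate
```

In a full transfer network, such weights would reweight each source sample's contribution to the transfer loss, letting the model learn what and what not to transfer rather than treating all source data uniformly.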
Keywords:
Machine Learning: Transfer, Adaptation, Multi-task Learning
Machine Learning: Deep Learning
Natural Language Processing: Text Classification
Machine Learning: Neural Networks