Exploiting High-Order Information in Heterogeneous Multi-Task Feature Learning

Yong Luo, Dacheng Tao, Yonggang Wen

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 2443-2449. https://doi.org/10.24963/ijcai.2017/340

Multi-task feature learning (MTFL) aims to improve the generalization performance of multiple related learning tasks by sharing features among them. It has been successfully applied to many pattern recognition and biometric prediction problems. Most current MTFL methods assume that all tasks share the same feature representation, and thus are not applicable to scenarios where data are drawn from heterogeneous domains. Existing heterogeneous transfer learning (including multi-task learning) approaches usually handle multiple heterogeneous domains by learning feature transformations across them, but they ignore the high-order statistics (correlation information) that can only be discovered by exploring all domains simultaneously. We therefore develop a tensor-based heterogeneous MTFL (THMTFL) framework to exploit such high-order information. Specifically, the feature transformations of all domains are learned together and then used to derive new representations. A connection among the domains is built by using the transformations to project the pre-learned predictive structures of the different domains into a common subspace and minimizing their divergence in that subspace. By exploring the high-order information, the proposed THMTFL obtains more reliable feature transformations than existing heterogeneous transfer learning approaches. Extensive experiments on both text categorization and social image annotation demonstrate the superiority of the proposed method.
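
As a rough illustration of the coupling step described in the abstract, the Python sketch below (not the authors' code) learns one projection U_t per heterogeneous domain and pulls the projected, pre-learned weight vectors w_t toward each other in a shared k-dimensional subspace. The mean-based divergence and the alternating gradient step are deliberately simplified stand-ins for the paper's tensor-based high-order coupling; all dimensions, variable names, and hyperparameters are illustrative assumptions.

# Minimal sketch of subspace coupling across heterogeneous domains.
# Simplified stand-in for THMTFL's tensor-based high-order term:
# here the projected predictive weights are pulled toward their mean
# instead of being coupled through a tensor over all domains.
import numpy as np

rng = np.random.default_rng(0)
dims = [50, 80, 30]   # heterogeneous feature dimensions d_t (illustrative)
k = 10                # shared subspace dimension (assumption)
lam = 1e-2            # Frobenius regularization weight (assumption)
lr = 1e-2             # gradient step size (assumption)

# Pre-learned predictive structures, e.g. ridge-regression weights per domain.
w = [rng.standard_normal(d) for d in dims]

# One feature transformation U_t per domain, mapping R^{d_t} -> R^k.
U = [0.1 * rng.standard_normal((d, k)) for d in dims]

for step in range(500):
    # Project each domain's weights into the common subspace: z_t = U_t^T w_t.
    z = [Ut.T @ wt for Ut, wt in zip(U, w)]
    mean_z = np.mean(z, axis=0)
    for t in range(len(U)):
        # Gradient of sum_t ||z_t - mean_z||^2 + lam * ||U_t||_F^2 w.r.t. U_t,
        # treating mean_z as fixed within the step (block-coordinate style).
        grad = 2.0 * np.outer(w[t], z[t] - mean_z) + 2.0 * lam * U[t]
        U[t] -= lr * grad

# New representations: project each domain's data with its learned U_t,
# e.g. X_t_new = X_t @ U[t] for a data matrix X_t of shape (n_t, d_t).
# In the full method the transformations are learned jointly with each
# domain's empirical loss, which also rules out the trivial solution U_t = 0.

In practice one would optimize such a divergence term jointly with the per-domain learning objectives, as the abstract indicates the transformations and new representations are learned together; the sketch isolates only the cross-domain coupling for readability.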
Keywords:
Machine Learning: Machine Learning
Machine Learning: Transfer, Adaptation, Multi-task Learning