Amalgamating Filtered Knowledge: Learning Task-customized Student from Multi-task Teachers
Jingwen Ye, Xinchao Wang, Yixin Ji, Kairi Ou, Mingli Song
Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
Main track. Pages 4128-4134.
https://doi.org/10.24963/ijcai.2019/573
Many well-trained Convolutional Neural Network (CNN) models have been released online by developers to enable effortless reproduction. In this paper, we treat such pre-trained networks as teachers and explore how to learn a target student network for customized tasks, using multiple teachers that handle different tasks. We assume no human-labelled annotations are available, and each teacher model can be either a single- or a multi-task network, where the former is a degenerate case of the latter. The student model, depending on the customized tasks, learns the related knowledge filtered from the multiple teachers, and eventually masters the complete set or a subset of the expertise of all teachers. To this end, we adopt a layer-wise training strategy that entangles each student network block to be learned with the corresponding teacher blocks. As demonstrated on several benchmarks, the learned student network achieves very promising results, even outperforming the teachers on the customized tasks.
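The abstract's core idea — training a student block against the task-relevant features of several teachers without labels — can be illustrated with a minimal sketch. This is a hypothetical toy version, not the authors' exact method: the function `amalgamation_loss`, its channel-mask "filtering", and all variable names below are illustrative assumptions made for this example.

```python
import numpy as np

def amalgamation_loss(student_feat, teacher_feats, masks):
    """Toy layer-wise amalgamation loss (illustrative only).

    The student's block output is regressed onto each teacher's block
    features, but only over the channels relevant to the customized
    task ("filtered" knowledge), selected by a boolean mask per teacher.

    student_feat : (C, H, W) array from the student block
    teacher_feats: list of (C, H, W) arrays from the teacher blocks
    masks        : list of boolean (C,) channel masks per teacher
    """
    losses = []
    for feat, mask in zip(teacher_feats, masks):
        diff = student_feat[mask] - feat[mask]  # compare filtered channels only
        losses.append(np.mean(diff ** 2))      # MSE over the kept channels
    return float(np.mean(losses))              # average across teachers

# Toy usage: two teachers with disjoint task-relevant channels.
rng = np.random.default_rng(0)
s = rng.standard_normal((4, 8, 8))                  # student block features
t1 = rng.standard_normal((4, 8, 8))                 # teacher 1 features
t2 = rng.standard_normal((4, 8, 8))                 # teacher 2 features
m1 = np.array([True, True, False, False])           # teacher 1's task channels
m2 = np.array([False, False, True, True])           # teacher 2's task channels
loss = amalgamation_loss(s, [t1, t2], [m1, m2])
```

In a real layer-wise training loop, this loss would be minimized block by block, with each student block tied to the corresponding blocks of the teachers, as the abstract describes.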
Keywords:
Machine Learning: Developmental Learning
Machine Learning: Classification
Machine Learning: Learning Graphical Models
Machine Learning: Deep Learning
Computer Vision: Recognition: Detection, Categorization, Indexing, Matching, Retrieval, Semantic Interpretation
Computer Vision: Structural and Model-Based Approaches, Knowledge Representation and Reasoning
Machine Learning: Dimensionality Reduction and Manifold Learning