Regularising Knowledge Transfer by Meta Functional Learning

Pan Li, Yanwei Fu, Shaogang Gong

Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence
Main Track. Pages 2687-2693. https://doi.org/10.24963/ijcai.2021/370

A machine learning classifier's capability depends largely on the scale of available training data and is limited by model overfitting in data-scarce learning tasks. To address this problem, this work proposes a novel Meta Functional Learning (MFL) approach that meta-learns a generalisable functional model from data-rich tasks whilst simultaneously regularising knowledge transfer to data-scarce tasks. MFL computes meta-knowledge on functional regularisation that generalises across learning tasks, by which functional training on limited labelled data promotes learning more discriminative functions. Moreover, we adopt an Iterative Update strategy on MFL (MFL-IU), which improves the knowledge transfer regularisation of MFL by progressively learning the functional regularisation during knowledge transfer. Experiments on three Few-Shot Learning (FSL) benchmarks (miniImageNet, CIFAR-FS and CUB) show that meta functional learning for regularised knowledge transfer improves FSL classifiers.
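To make the idea of functional regularisation concrete, the following is a minimal sketch (not the authors' implementation): a few-shot linear classifier is trained with its usual loss plus an L2 penalty pulling its weights toward a meta-learned weight vector, and an MFL-IU-style loop then progressively blends the regulariser with the newly learned function. All data, the meta-learned vector `w_meta`, and the hyper-parameters are hypothetical placeholders.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_regularised(X, y, w_meta, lam=0.5, lr=0.1, steps=200):
    """Gradient descent on logistic loss + lam * ||w - w_meta||^2.

    The second term is the 'functional regularisation': it keeps the
    data-scarce classifier close to the meta-learned function w_meta.
    """
    w = list(w_meta)  # warm-start from the meta-learned function
    for _ in range(steps):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)))
            for j, xj in enumerate(xi):
                grad[j] += (p - yi) * xj
        for j in range(len(w)):
            # Average data gradient + gradient of the L2 pull toward w_meta.
            grad[j] = grad[j] / len(X) + 2.0 * lam * (w[j] - w_meta[j])
            w[j] -= lr * grad[j]
    return w

# Toy 2-D few-shot task: two labelled samples per class (hypothetical data).
X = [[1.0, 0.2], [0.9, 0.1], [0.1, 1.0], [0.2, 0.9]]
y = [1, 1, 0, 0]
w_meta = [0.5, -0.5]  # stand-in for a meta-learned regulariser

# Iterative-update sketch in the spirit of MFL-IU: alternately train the
# classifier and refine the regulariser (a simple blend, chosen for brevity).
for _ in range(3):
    w = train_regularised(X, y, w_meta)
    w_meta = [0.5 * (a + b) for a, b in zip(w_meta, w)]
```

The key design point this illustrates is that the regulariser is itself a learned function rather than a fixed prior such as plain weight decay, so limited labelled data is pulled toward a decision function shaped by the data-rich tasks.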
Keywords:
Machine Learning: Classification
Machine Learning: Transfer, Adaptation, Multi-task Learning
Machine Learning: Weakly Supervised Learning