Synthesizing Samples for Zero-shot Learning

Yuchen Guo, Guiguang Ding, Jungong Han, Yue Gao

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 1774-1780. https://doi.org/10.24963/ijcai.2017/246

Zero-shot learning (ZSL) aims to construct recognition models for unseen target classes that have no labeled training samples. It uses class attributes or semantic vectors as side information and transfers supervision from related source classes with abundant labeled samples. Existing ZSL approaches adopt an intermediate embedding space in which the similarity between a sample and the attributes of a target class is measured to perform zero-shot classification. However, this approach may suffer from information loss caused by the embedding process, and the similarity measure cannot fully exploit the data distribution. In this paper, we propose a novel approach that turns the ZSL problem into a conventional supervised learning problem by synthesizing samples for the unseen classes. First, the probability distribution of an unseen class is estimated using knowledge from the seen classes and the class attributes. Second, samples are synthesized from this distribution for the unseen class. Finally, any supervised classifier can be trained on the synthesized samples. Extensive experiments on benchmarks demonstrate the superiority of the proposed approach over state-of-the-art ZSL approaches.
Keywords:
Machine Learning: Feature Selection/Construction
Robotics and Vision: Robotics and Vision
Machine Learning: Classification
Machine Learning: Transfer, Adaptation, Multi-task Learning
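
The pipeline outlined in the abstract (estimate a distribution for each unseen class from seen-class data and attributes, synthesize samples from it, then train an ordinary supervised classifier) can be illustrated with a minimal sketch. The code below is not the authors' method from the paper: it assumes each class distribution is Gaussian, predicts unseen-class means from attributes with ridge regression fit on seen-class means, pools a shared covariance over seen classes, and trains logistic regression on the synthesized features; the function names, the Gaussian assumption, and parameters such as n_per_class are illustrative choices only.

# Hypothetical sketch of "estimate unseen-class distributions, synthesize samples,
# train a supervised classifier"; all modeling choices here are assumptions.
import numpy as np
from sklearn.linear_model import Ridge, LogisticRegression

def estimate_unseen_distributions(X_seen, y_seen, A_seen, A_unseen, alpha=1.0):
    """Estimate a (mean, shared covariance) Gaussian for each unseen class."""
    classes = np.unique(y_seen)
    seen_means = np.stack([X_seen[y_seen == c].mean(axis=0) for c in classes])
    # Attribute-to-class-mean regressor learned on seen classes (assumption).
    reg = Ridge(alpha=alpha).fit(A_seen, seen_means)
    unseen_means = reg.predict(A_unseen)
    # Shared covariance pooled over seen classes (assumption).
    centered = X_seen - seen_means[np.searchsorted(classes, y_seen)]
    shared_cov = np.cov(centered, rowvar=False) + 1e-6 * np.eye(X_seen.shape[1])
    return unseen_means, shared_cov

def synthesize_and_train(unseen_means, shared_cov, n_per_class=200, seed=0):
    """Draw synthetic samples for each unseen class and fit a standard classifier."""
    rng = np.random.default_rng(seed)
    X_syn, y_syn = [], []
    for c, mu in enumerate(unseen_means):
        X_syn.append(rng.multivariate_normal(mu, shared_cov, size=n_per_class))
        y_syn.append(np.full(n_per_class, c))
    X_syn, y_syn = np.vstack(X_syn), np.concatenate(y_syn)
    return LogisticRegression(max_iter=1000).fit(X_syn, y_syn)

Given seen-class features X_seen, labels y_seen, seen-class attribute vectors A_seen, and unseen-class attributes A_unseen, calling estimate_unseen_distributions followed by synthesize_and_train yields a classifier whose predict method can be applied directly to test samples from the unseen classes, which is the sense in which the ZSL problem is reduced to conventional supervised learning.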