Learning From Multi-Dimensional Partial Labels

Haobo Wang, Weiwei Liu, Yang Zhao, Tianlei Hu, Ke Chen, Gang Chen

Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 2943-2949. https://doi.org/10.24963/ijcai.2020/407

Multi-dimensional classification (MDC) has attracted considerable attention from the community. Although most studies assume fully annotated data, obtaining fully labeled data for MDC tasks is usually intractable in practice. In this paper, we propose a novel learning paradigm, Multi-Dimensional Partial Label Learning (MDPL), in which the ground-truth labels of each instance are concealed in multiple candidate label sets. We first introduce the partial Hamming loss for MDPL, which incurs a large loss if the predicted labels are not in the candidate label sets, and provide an empirical risk minimization (ERM) framework. Theoretically, we rigorously prove the conditions for ERM learnability of MDPL in both the independent and dependent cases. Furthermore, we present two MDPL algorithms under the proposed ERM framework. Comprehensive experiments on both synthetic and real-world datasets validate the effectiveness of our proposals.
Keywords:
Machine Learning: Multi-instance; Multi-label; Multi-view learning
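
To make the abstract's notion of a partial Hamming loss concrete, here is a minimal sketch in Python of the general idea: average, over the label dimensions of one instance, an indicator of whether the predicted label falls outside that dimension's candidate label set. This is an illustrative assumption about the loss, not the paper's exact definition; the function name `partial_hamming_loss` and the toy data are hypothetical.

```python
def partial_hamming_loss(predictions, candidate_sets):
    """Illustrative partial Hamming loss for multi-dimensional partial labels.

    predictions: list of length d, the predicted label for each dimension.
    candidate_sets: list of length d, each a set of candidate labels for
        that dimension (the ground-truth label is concealed in this set).

    Counts, per dimension, whether the prediction lies outside the
    candidate set, then averages over the d dimensions.
    """
    d = len(predictions)
    misses = sum(pred not in cands
                 for pred, cands in zip(predictions, candidate_sets))
    return misses / d


# Hypothetical example: a 3-dimensional instance with candidate sets per dimension.
preds = ["red", "circle", "small"]
candidates = [{"red", "blue"}, {"square"}, {"small", "large"}]
print(partial_hamming_loss(preds, candidates))  # 1/3: only the second prediction misses
```

Under this reading, an ERM procedure would choose the predictor minimizing the average of this loss over the training instances, using the candidate sets in place of the concealed ground-truth labels.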