Learning from Complementary Labels via Partial-Output Consistency Regularization

Deng-Bao Wang, Lei Feng, Min-Ling Zhang

Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence
Main Track. Pages 3075-3081. https://doi.org/10.24963/ijcai.2021/423

In complementary-label learning (CLL), a multi-class classifier is learned from training instances each associated with complementary labels, which specify classes that the instance does not belong to. Previous studies focus on unbiased risk estimators or surrogate losses while neglecting the importance of regularization in the training phase. In this paper, we make the first attempt to leverage regularization techniques for CLL. By decoupling a label vector into complementary labels and partially unknown labels, we simultaneously inhibit the outputs on the complementary labels with a complementary loss and penalize the sensitivity of the classifier to the partial outputs on the unknown classes via consistency regularization. We then unify the complementary loss and the consistency loss through a specially designed dynamic weighting factor. A series of experiments shows that the proposed method achieves highly competitive performance in CLL.
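The two-term objective described in the abstract can be illustrated with a minimal NumPy sketch. This is an assumed formulation, not the paper's exact method: the KL-based consistency term, the two-view (weak/strong perturbation) setup, the epsilon smoothing, and the names `pocr_loss` and `lam` are all illustrative choices for how a complementary loss and a partial-output consistency penalty might be combined under a weighting factor.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D logit vector.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def pocr_loss(logits_a, logits_b, comp_label, lam, eps=1e-12):
    """Hypothetical sketch of a CLL objective with consistency regularization.

    logits_a, logits_b: classifier outputs for two perturbed views of one instance.
    comp_label: index of a complementary class (one the instance does NOT belong to).
    lam: dynamic weighting factor between the two loss terms (assumed scalar here).
    """
    p_a = softmax(logits_a)
    p_b = softmax(logits_b)

    # Complementary loss: push down the probability assigned to the
    # complementary label (the instance is known not to be this class).
    comp_loss = -np.log(1.0 - p_a[comp_label] + eps)

    # Partial outputs: probabilities restricted to the unknown
    # (non-complementary) classes, renormalized to sum to 1.
    mask = np.ones_like(p_a, dtype=bool)
    mask[comp_label] = False
    q_a = p_a[mask] / p_a[mask].sum()
    q_b = p_b[mask] / p_b[mask].sum()

    # Consistency regularization: penalize sensitivity of the partial
    # outputs to perturbation, here measured with a KL divergence.
    cons_loss = np.sum(q_a * (np.log(q_a + eps) - np.log(q_b + eps)))

    return comp_loss + lam * cons_loss
```

When the two views agree, the consistency term vanishes and only the complementary loss remains; as the views diverge, the penalty grows, discouraging predictions on the unknown classes from being sensitive to perturbation.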
Keywords:
Machine Learning: Classification
Machine Learning: Weakly Supervised Learning