Label Distribution for Learning with Noisy Labels

Yun-Peng Liu, Ning Xu, Yu Zhang, Xin Geng

Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 2568-2574. https://doi.org/10.24963/ijcai.2020/356

The performance of deep neural networks (DNNs) crucially relies on the quality of labeling. In some situations labels are easily corrupted, so that a portion of the observed labels becomes noisy. Designing algorithms that cope with noisy labels is therefore of great importance for learning robust DNNs. However, distinguishing clean labels from noisy ones is difficult and becomes the bottleneck of many methods. To address this problem, this paper proposes a novel method named Label Distribution based Confidence Estimation (LDCE). LDCE estimates the confidence of the observed labels based on label distribution; the boundary between clean labels and noisy labels then becomes clear according to the confidence scores. To verify the effectiveness of the method, LDCE is combined with an existing learning algorithm to train robust DNNs. Experiments on both synthetic and real-world datasets substantiate the superiority of the proposed algorithm over state-of-the-art methods.
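To make the idea of confidence-based separation concrete, the following is a minimal, schematic sketch rather than the authors' LDCE implementation: it assumes the label distribution is approximated by a model's softmax output, defines a sample's confidence as the probability that distribution assigns to the observed label, and uses an arbitrary threshold of 0.5 to split likely-clean from likely-noisy labels. All function names and the threshold are illustrative assumptions.

```python
# Schematic sketch of confidence-based clean/noisy label separation.
# NOT the authors' LDCE algorithm: the label distribution here is a
# softmax prediction and the 0.5 threshold is an assumed placeholder.
import numpy as np


def estimate_confidence(pred_dist: np.ndarray, observed: np.ndarray) -> np.ndarray:
    """Confidence of each observed label: the probability that the
    predicted label distribution assigns to that label."""
    return pred_dist[np.arange(len(observed)), observed]


def split_clean_noisy(pred_dist: np.ndarray, observed: np.ndarray,
                      threshold: float = 0.5):
    """Treat labels whose confidence exceeds the threshold as (likely) clean."""
    conf = estimate_confidence(pred_dist, observed)
    clean_mask = conf >= threshold
    return clean_mask, conf


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy predicted label distributions for 6 samples over 3 classes.
    logits = rng.normal(size=(6, 3))
    pred_dist = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    observed = rng.integers(0, 3, size=6)
    clean_mask, conf = split_clean_noisy(pred_dist, observed)
    print("confidence of observed labels:", np.round(conf, 3))
    print("treated as clean:", clean_mask)
```

In a robust-training pipeline of this kind, the resulting clean mask would typically be used to select or reweight samples for the next training round; the paper itself derives the confidence from label distributions rather than raw softmax outputs.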
Keywords:
Machine Learning: Classification
Machine Learning: Deep Learning