CTW: Confident Time-Warping for Time-Series Label-Noise Learning
Peitian Ma, Zhen Liu, Junhao Zheng, Linghao Wang, Qianli Ma

Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence
Main Track. Pages 4046-4054. https://doi.org/10.24963/ijcai.2023/450

Noisy labels severely degrade the generalization ability of Deep Neural Networks (DNNs) in various classification tasks. Existing studies on label-noise learning focus mainly on computer vision, yet time series suffer from the same problem. Directly transferring methods from computer vision to time series may disrupt temporal dependencies because the data characteristics differ. How to exploit the properties of time series so that DNNs learn robust representations in the presence of noisy labels has not been fully explored. To this end, this paper proposes a method that expands the distribution of Confident instances by Time-Warping (CTW) to learn robust representations of time series. Specifically, since applying augmentation to all data may introduce additional mislabeled instances, we apply time-warping only to confident instances. In addition, we normalize the distribution of the training loss within each class to eliminate the model's selection preference for instances of different classes, alleviating the class imbalance caused by sample selection. Extensive experimental results show that CTW achieves state-of-the-art performance on the UCR datasets under different types of label noise. Moreover, t-SNE visualizations of our method verify that augmenting confident data improves generalization. Our code is available at https://github.com/qianlima-lab/CTW.
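To make the two ingredients of the abstract concrete, the sketch below illustrates (a) a generic time-warping augmentation and (b) small-loss confident-instance selection with per-class loss normalization. This is a minimal illustration of the general ideas, not the authors' implementation: the function names, knot-based warp, z-score normalization, and `keep_ratio` parameter are all assumptions; the paper's actual procedure may differ (see the linked repository).

```python
import numpy as np

def time_warp(x, n_knots=4, strength=0.2, rng=None):
    """Warp a univariate series x (shape [T]) along the time axis by
    randomly perturbing a smooth, monotone mapping of time indices.
    (Hypothetical variant of time-warping; parameters are illustrative.)"""
    rng = np.random.default_rng(rng)
    T = len(x)
    # Anchor points of the warp; interior anchors are jittered randomly.
    orig = np.linspace(0, T - 1, n_knots + 2)
    warped = orig.copy()
    warped[1:-1] += rng.normal(0, strength * T / (n_knots + 1), n_knots)
    warped = np.sort(warped)  # keep the time mapping monotone
    # Map each output index back to a (fractional) source index, then
    # linearly interpolate the series values at those source positions.
    src_idx = np.clip(np.interp(np.arange(T), orig, warped), 0, T - 1)
    return np.interp(src_idx, np.arange(T), x)

def select_confident(losses, labels, keep_ratio=0.5):
    """Small-loss selection with per-class loss normalization: z-score
    the losses within each class so no class is favored, then keep the
    indices of the lowest normalized losses overall."""
    z = np.empty_like(losses, dtype=float)
    for c in np.unique(labels):
        m = labels == c
        mu, sd = losses[m].mean(), losses[m].std() + 1e-8
        z[m] = (losses[m] - mu) / sd
    k = max(1, int(keep_ratio * len(losses)))
    return np.argsort(z)[:k]
```

In a training loop, one would compute per-sample losses, call `select_confident` to pick likely-clean instances, and apply `time_warp` only to those before the next update, so the augmentation does not amplify mislabeled data.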
Keywords:
Machine Learning: ML: Classification
Machine Learning: ML: Representation learning
Machine Learning: ML: Time series and data streams