RTdetector: Deep Transformer Networks for Time Series Anomaly Detection Based on Reconstruction Trend
Xinhong Liu, Xiaoliang Li, Yangfan Li, Fengxiao Tang, Ming Zhao
Proceedings of the Thirty-Fourth International Joint Conference on Artificial Intelligence
Main Track. Pages 5788-5796.
https://doi.org/10.24963/ijcai.2025/644
Anomaly detection in multivariate time series data is critical across a wide range of real-life applications. The predominant anomaly detection techniques currently rely on reconstruction-based methods. However, these methods often overfit abnormal patterns and consequently fail to detect the anomalies. Although some studies have attempted to prevent the incorrect fitting of anomalous data by enabling models to learn the trend of data variations, they fail to account for the dynamic nature of the data distribution. This oversight can lead to the erroneous reconstruction of anomalies that do not exist. To address these challenges, we propose RTdetector, a Transformer-based time series anomaly detection model leveraging reconstruction trends. RTdetector employs a novel global attention mechanism based on reconstruction trends to learn distinguishable attention from the original sequence, thereby preserving the global trend information intrinsic to the time series. Additionally, it incorporates a self-conditioning Transformer based on reconstruction-trend enhancement to achieve superior predictive performance. Extensive experiments on four datasets demonstrate that RTdetector achieves state-of-the-art results in multivariate time series anomaly detection. Our code is available at https://github.com/CSUFUNLAB/RTdetector.
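The repository linked above contains the official implementation. For readers unfamiliar with the reconstruction-based paradigm the abstract builds on, the following is a minimal, hypothetical sketch of Transformer-based reconstruction and anomaly scoring; it is not RTdetector itself, and all module names and hyperparameters (d_model, window size, etc.) are illustrative assumptions rather than the authors' choices.

```python
# Minimal sketch of reconstruction-based anomaly scoring with a Transformer
# encoder. NOT the authors' RTdetector; names and hyperparameters are
# illustrative assumptions only.
import torch
import torch.nn as nn

class ReconstructionTransformer(nn.Module):
    def __init__(self, n_features: int, d_model: int = 64, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)            # project each time step into model space
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)  # self-attention over the input window
        self.decode = nn.Linear(d_model, n_features)           # reconstruct the original features

    def forward(self, x):                                      # x: (batch, window, n_features)
        return self.decode(self.encoder(self.embed(x)))

def anomaly_scores(model, window):
    """Per-time-step anomaly score = reconstruction error of the window."""
    model.eval()
    with torch.no_grad():
        recon = model(window)
    return ((window - recon) ** 2).mean(dim=-1)                # (batch, window)

if __name__ == "__main__":
    batch = torch.randn(8, 100, 25)                            # 8 windows, 100 steps, 25 variables
    model = ReconstructionTransformer(n_features=25)
    print(anomaly_scores(model, batch).shape)                  # torch.Size([8, 100])
```

In this family of methods, time steps whose reconstruction error exceeds a threshold are flagged as anomalous; RTdetector's contribution, per the abstract, is to regularize this reconstruction with global trend information so that abnormal patterns are not simply memorized.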
Keywords:
Machine Learning: ML: Time series and data streams
Machine Learning: ML: Unsupervised learning
