Time-Aware Multi-Scale RNNs for Time Series Modeling

Zipeng Chen, Qianli Ma, Zhenxi Lin

Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence
Main Track. Pages 2285-2291. https://doi.org/10.24963/ijcai.2021/315

Multi-scale information is crucial for modeling time series. Although most existing methods consider multiple scales in time-series data, they assume that all scales are equally important for every sample, which prevents them from capturing the dynamic temporal patterns of time series. To this end, we propose Time-Aware Multi-Scale Recurrent Neural Networks (TAMS-RNNs), which disentangle representations of different scales and adaptively select the most important scale for each sample at each time step. First, the hidden state of the RNN is disentangled into multiple small hidden states that are updated independently at different frequencies, so as to model the multi-scale information of the time series. Then, at each time step, temporal context information is used to modulate the features of different scales and select the most important scale. The proposed model can therefore capture the multi-scale information of each time series adaptively at each time step. Extensive experiments demonstrate that the model outperforms state-of-the-art methods on multivariate time series classification and human motion prediction tasks. Furthermore, a visualized analysis on music genre recognition verifies the effectiveness of the model.
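The core idea described above can be illustrated with a minimal sketch. This is not the authors' implementation: the sub-state update periods, the simple tanh cells per scale, and the use of the raw input as a stand-in for "temporal context" in the scale-selection softmax are all assumptions made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

class MultiScaleRNNSketch:
    """Illustrative sketch (not the paper's code): the hidden state is split
    into one small sub-state per scale; sub-state k is updated only every
    periods[k] steps, and a softmax over scale scores (computed here from the
    current input, a stand-in for temporal context) weights the scales."""

    def __init__(self, input_size, scale_size, periods):
        self.periods = periods          # hypothetical update periods per scale
        self.num_scales = len(periods)
        self.scale_size = scale_size
        # one simple tanh-RNN cell per scale (hypothetical parameterization)
        self.W_x = [rng.standard_normal((scale_size, input_size)) * 0.1
                    for _ in periods]
        self.W_h = [rng.standard_normal((scale_size, scale_size)) * 0.1
                    for _ in periods]
        # scale-selection scores from the input (assumed context signal)
        self.W_s = rng.standard_normal((self.num_scales, input_size)) * 0.1

    def forward(self, xs):
        h = [np.zeros(self.scale_size) for _ in range(self.num_scales)]
        outputs = []
        for t, x in enumerate(xs):
            # update sub-state k only at its own frequency
            for k in range(self.num_scales):
                if t % self.periods[k] == 0:
                    h[k] = np.tanh(self.W_x[k] @ x + self.W_h[k] @ h[k])
            # softmax over scales: adaptively emphasize scales per time step
            scores = self.W_s @ x
            w = np.exp(scores - scores.max())
            w /= w.sum()
            outputs.append(sum(w[k] * h[k] for k in range(self.num_scales)))
        return np.stack(outputs)
```

Slow scales (larger periods) retain older information because their sub-states are refreshed less often, while the per-step softmax decides how much each scale contributes to the output, mirroring the adaptive scale selection the abstract describes.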
Keywords:
Machine Learning: Deep Learning
Machine Learning: Time-series; Data Streams