Proceedings Abstracts of the Twenty-Fourth International Joint Conference on Artificial Intelligence

Model Metric Co-Learning for Time Series Classification / 3387
Huanhuan Chen, Fengzhen Tang, Peter Tino, Anthony G. Cohn, Xin Yao

We present a novel model-metric co-learning (MMCL) methodology for sequence classification which learns in the model space — each data item (sequence) is represented by a predictive model from a carefully designed model class. MMCL encourages sequences from the same class to be represented by ‘close’ model representations, well separated from those for different classes. Existing approaches to the problem either fit a single model to all the data, or fit a (predominantly linear) model to each sequence. We introduce a novel hybrid approach spanning the two extremes. The model class we use is a special form of adaptive high-dimensional non-linear state space model with a highly constrained and simple dynamic part. The dynamic part is identical for all data items and acts as a temporal filter providing a rich pool of dynamic features that can be selectively extracted by individual (static) linear readout mappings representing the sequences. Alongside learning the dynamic part, we also learn the global metric in the model readout space. Experiments on synthetic and benchmark data sets confirm the effectiveness of the algorithm compared to a variety of alternative methods.
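The core representation idea can be sketched in a few lines: drive a fixed non-linear state-space filter (shared by all sequences) with each sequence, fit a per-sequence linear readout predicting the next observation, and use the readout weights as that sequence's point in model space. The sketch below is illustrative only — it uses a randomly initialised filter and a plain Euclidean distance between readouts, whereas MMCL learns both the dynamic part and the metric; all names and parameter values are assumptions, not the paper's algorithm.

```python
import numpy as np

def reservoir_states(seq, W, V, washout=5):
    """Run the shared tanh state-space filter over a scalar sequence;
    return post-washout states and matching next-step targets."""
    x = np.zeros(W.shape[0])
    states, targets = [], []
    for t in range(len(seq) - 1):
        x = np.tanh(W @ x + V * seq[t])  # shared dynamic part
        if t >= washout:
            states.append(x.copy())
            targets.append(seq[t + 1])
    return np.array(states), np.array(targets)

def readout_representation(seq, W, V, ridge=1e-2):
    """Represent a sequence by its ridge-regression readout weights:
    w = (S^T S + ridge*I)^{-1} S^T y (closed-form ridge solution)."""
    S, y = reservoir_states(seq, W, V)
    n = S.shape[1]
    return np.linalg.solve(S.T @ S + ridge * np.eye(n), S.T @ y)

rng = np.random.default_rng(0)
n_units = 20
W = rng.normal(size=(n_units, n_units))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # scale for stable dynamics
V = rng.normal(scale=0.5, size=n_units)

t = np.arange(200)
sine_a = np.sin(0.3 * t)            # class A
sine_b = np.sin(0.3 * t + 1.0)      # class A, phase-shifted
saw = (0.1 * t) % 2.0 - 1.0         # class B

r_a, r_b, r_saw = (readout_representation(s, W, V)
                   for s in (sine_a, sine_b, saw))
# Distances between readouts; MMCL would instead learn a global
# metric in this readout space jointly with the dynamic part.
print(np.linalg.norm(r_a - r_b), np.linalg.norm(r_a - r_saw))
```

Even with a random, untrained filter, the two same-class sequences tend to map to nearby readouts because they share the same underlying dynamics; co-learning the filter and the metric, as in MMCL, is what sharpens this separation into a classifier.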