Fast Recursive Low-rank Tensor Learning for Regression

Ming Hou, Brahim Chaib-draa

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 1851-1857. https://doi.org/10.24963/ijcai.2017/257

In this work, we develop a fast sequential low-rank tensor regression framework, namely recursive higher-order partial least squares (RHOPLS). It addresses the challenges of limited storage space and the fast processing time required by dynamic environments when dealing with large-scale, high-speed, general tensor sequences. By integrating a low-rank Tucker modification strategy into a PLS-based framework, we efficiently update the regression coefficients, merging new data into the previous low-rank approximation of the model at the small-scale factor (feature) level rather than at the large raw-data (observation) level. Unlike batch models, which require access to the entire dataset, RHOPLS follows a blockwise recursive calculation scheme, so only a small set of factors needs to be stored. Our approach is orders of magnitude faster than all other methods while maintaining predictive accuracy highly comparable to that of the cutting-edge batch methods, as verified on challenging real-life tasks.
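The abstract describes the recursive update only at a high level. As an illustration of the general idea, and not the authors' RHOPLS algorithm, the following minimal Python sketch maintains a rank-R SVD of the feature-target cross-covariance and merges each incoming data block through a Brand-style low-rank modification, so that only small factor matrices are ever stored. The class name RecursiveLowRankPLS, the truncation rank R, and the Gram-matrix regression step are all illustrative assumptions.

import numpy as np


class RecursiveLowRankPLS:
    """Blockwise recursive rank-R regression sketch (illustrative, not RHOPLS).

    Maintains a rank-R SVD  U diag(S) V^T  of the cross-covariance X^T Y and
    merges each new data block with a Brand-style low-rank SVD update, so
    only small factor matrices are ever stored.
    """

    def __init__(self, d_x, d_y, rank):
        self.R = rank
        self.U = np.zeros((d_x, rank))    # feature-side factor matrix
        self.S = np.zeros(rank)           # singular values
        self.V = np.zeros((d_y, rank))    # target-side factor matrix
        self.Gtt = 1e-6 * np.eye(rank)    # accumulated T^T T (ridge-regularized)
        self.Gty = np.zeros((rank, d_y))  # accumulated T^T Y

    def partial_fit(self, Xb, Yb):
        """Merge one block (n_b x d_x, n_b x d_y) at the factor level."""
        # Cross-covariance modification C <- C + A B^T with A = Xb^T, B = Yb^T,
        # applied to the factors directly; C itself is never formed.
        A, B = Xb.T, Yb.T
        UA, VB = self.U.T @ A, self.V.T @ B
        Qa, Ra = np.linalg.qr(A - self.U @ UA)   # directions outside span(U)
        Qb, Rb = np.linalg.qr(B - self.V @ VB)   # directions outside span(V)
        k = self.R
        K = np.zeros((k + Qa.shape[1], k + Qb.shape[1]))
        K[:k, :k] = np.diag(self.S) + UA @ VB.T
        K[:k, k:] = UA @ Rb.T
        K[k:, :k] = Ra @ VB.T
        K[k:, k:] = Ra @ Rb.T
        Uk, Sk, Vkt = np.linalg.svd(K)           # small (R + block)-sized SVD
        self.U = np.hstack([self.U, Qa]) @ Uk[:, :k]    # truncate back to rank R
        self.S = Sk[:k]
        self.V = np.hstack([self.V, Qb]) @ Vkt.T[:, :k]
        # Accumulate small Gram matrices for the regression step
        # (approximation: earlier blocks are not re-projected onto the new U).
        T = Xb @ self.U
        self.Gtt += T.T @ T
        self.Gty += T.T @ Yb

    def predict(self, X):
        # Coefficients live in the rank-R subspace: W = U (T^T T)^{-1} T^T Y.
        return X @ (self.U @ np.linalg.solve(self.Gtt, self.Gty))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W_true = rng.standard_normal((64, 3))            # hidden low-rank linear map
    model = RecursiveLowRankPLS(d_x=64, d_y=3, rank=5)
    for _ in range(50):                              # stream of small blocks
        Xb = rng.standard_normal((32, 64))
        Yb = Xb @ W_true + 0.01 * rng.standard_normal((32, 3))
        model.partial_fit(Xb, Yb)
    Xt = rng.standard_normal((8, 64))
    print(np.linalg.norm(model.predict(Xt) - Xt @ W_true))  # small residual

The design point this sketch mirrors is the one stressed in the abstract: each block is absorbed at the factor level through a small SVD whose size depends on the rank and the block size, so per-block cost and memory stay independent of the total number of observations seen so far.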
Keywords:
Machine Learning: Machine Learning
Machine Learning: Online Learning
Machine Learning: Time-series/Data Streams
Machine Learning: Structured Learning