A Convergent Solution to Tensor Subspace Learning

Huan Wang, Shuicheng Yan, Thomas Huang, Xiaoou Tang

Recently, substantial effort has been devoted to subspace learning techniques based on tensor representations, such as 2DLDA, DATER, and Tensor Subspace Analysis (TSA). In this context, a vital yet unsolved problem is that the computational convergence of these iterative algorithms is not guaranteed. In this work, we present a novel solution procedure for general tensor-based subspace learning, followed by a detailed proof that both the solution projection matrices and the objective function value converge. Extensive experiments on real-world databases verify the fast convergence of the proposed procedure, as well as its superiority in classification capability over traditional solution procedures.
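To make the setting concrete, the sketch below shows a generic alternating-optimization scheme of the kind the abstract refers to: for second-order tensors (e.g., images), two projection matrices are updated in turn, each via an eigendecomposition while the other is held fixed. This is an illustrative variance-maximization example under assumed notation (`U1`, `U2`, mode-wise scatter matrices `S1`, `S2`), not the authors' actual procedure or convergence-guaranteed solution.

```python
import numpy as np

def tensor_subspace_learning(X, d1, d2, n_iter=10, seed=0):
    """Illustrative alternating scheme for 2nd-order tensor data.

    X : array of shape (n_samples, m1, m2)
    Learns column-orthonormal U1 (m1 x d1) and U2 (m2 x d2) that
    maximize the total scatter of the projections U1.T @ X_i @ U2.
    """
    n, m1, m2 = X.shape
    rng = np.random.default_rng(seed)
    # Random orthonormal initialization for the mode-2 projection.
    U2 = np.linalg.qr(rng.standard_normal((m2, d2)))[0]
    U1 = None
    for _ in range(n_iter):
        # Fix U2; U1 spans the top eigenvectors of the mode-1 scatter.
        S1 = sum(Xi @ U2 @ U2.T @ Xi.T for Xi in X)
        _, V = np.linalg.eigh(S1)
        U1 = V[:, -d1:]          # eigh sorts ascending; take top d1
        # Fix U1; U2 spans the top eigenvectors of the mode-2 scatter.
        S2 = sum(Xi.T @ U1 @ U1.T @ Xi for Xi in X)
        _, V = np.linalg.eigh(S2)
        U2 = V[:, -d2:]
    return U1, U2
```

Each half-step solves an ordinary eigenproblem exactly, so the shared objective is monotonically non-decreasing across iterations; the open question the paper addresses is convergence of the projection matrices themselves, which such naive alternation does not guarantee.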