Robust Kernel Dictionary Learning Using a Whole Sequence Convergent Algorithm / 3678
Huaping Liu, Jie Qin, Hong Cheng, Fuchun Sun
Kernel sparse coding is an effective strategy for capturing the non-linear structure of data samples. However, how to learn a robust kernel dictionary remains an open problem. In this paper, we propose a new optimization model that learns a robust kernel dictionary while isolating outliers in the training samples. The model is based on decomposing the reconstruction error into small dense noise and large sparse outliers. The outlier error term is formulated as the product of the sample matrix in the feature space and a diagonal coefficient matrix, which facilitates kernelized dictionary learning. To solve the resulting non-convex optimization problem, we develop a whole-sequence convergent algorithm that guarantees the obtained solution sequence is a Cauchy sequence. Experimental results show that the proposed robust kernel dictionary learning method yields significant performance improvements.
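A core ingredient of kernel dictionary learning is that the dictionary can be written as D = Φ(Y)A, a combination of the training samples in the feature space, so the feature-space reconstruction error is computable purely from kernel values. The sketch below (an illustration of this kernel trick, not the paper's full algorithm; the function name and shapes are our own) verifies the identity ||Φ(y) − Φ(Y)Ax||² = k(y,y) − 2k(y,Y)Ax + xᵀAᵀK(Y,Y)Ax using a linear kernel, where Φ is the identity and a direct check is possible:

```python
import numpy as np

def recon_error_sq(k_yy, k_yY, K, A, x):
    """Squared feature-space reconstruction error of one sample,
    evaluated via the kernel trick:
      k(y,y) - 2 k(y,Y) A x + x^T A^T K(Y,Y) A x
    k_yy: scalar k(y, y); k_yY: vector of k(y, y_i);
    K: Gram matrix of training samples; A: atom coefficients; x: code."""
    Ax = A @ x
    return k_yy - 2.0 * (k_yY @ Ax) + Ax @ K @ Ax

rng = np.random.default_rng(0)
Y = rng.standard_normal((5, 3))   # 3 training samples as columns in R^5
y = rng.standard_normal(5)        # test sample
A = rng.standard_normal((3, 2))   # 2 dictionary atoms as sample combinations
x = rng.standard_normal(2)        # coding vector (dense here for illustration)

# Linear kernel: Phi is the identity, so the error can also be computed directly.
K = Y.T @ Y
err_kernel = recon_error_sq(y @ y, y @ Y, K, A, x)
err_direct = np.linalg.norm(y - Y @ A @ x) ** 2
print(err_kernel, err_direct)  # the two values agree
```

With a non-linear kernel (e.g. Gaussian), only the kernel-trick expression remains computable, which is why formulating the outlier term as Φ(Y) times a coefficient matrix keeps the whole objective expressible in kernel values.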