Abstract

Proceedings Abstracts of the Twenty-Fourth International Joint Conference on Artificial Intelligence

Multi-view Self-Paced Learning for Clustering / 3974
Chang Xu, Dacheng Tao, Chao Xu

Exploiting the information from multiple views can improve clustering accuracy. However, most existing multi-view clustering algorithms are non-convex and are thus prone to getting stuck in bad local minima, especially when there are outliers and missing data. To overcome this problem, we present a new multi-view self-paced learning (MSPL) algorithm for clustering that learns the multi-view model by progressing not only from 'easy' to 'complex' examples, but also from 'easy' to 'complex' views. Instead of binarily separating the examples or views into 'easy' and 'complex', we design a novel probabilistic smoothed weighting scheme. Employing multiple views for clustering and defining complexity across both examples and views are shown theoretically to be beneficial to optimal clustering. Experimental results on toy and real-world data demonstrate the efficacy of the proposed algorithm.
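To make the self-paced idea concrete, below is a minimal sketch of the general framework the abstract builds on: an alternating loop that re-weights examples by their current loss and gradually admits harder ones as an age parameter grows. The binary weighting, the logistic "smoothed" weighting, the k-means-style update, and all parameter names (lam, growth) are illustrative assumptions for a single view; they are not the paper's exact MSPL objective, which additionally assigns weights to whole views.

```python
import numpy as np

def spl_weights_binary(losses, lam):
    """Classical self-paced weighting: an example is 'easy' (weight 1)
    if its loss is below the age parameter lam, otherwise 'complex' (weight 0)."""
    return (losses < lam).astype(float)

def spl_weights_smoothed(losses, lam):
    """A smoothed, probability-like alternative: weights decay continuously
    with loss instead of switching off at a hard threshold.
    (Illustrative logistic form only; not the paper's exact scheme.)"""
    return 1.0 / (1.0 + np.exp(losses - lam))

def self_paced_kmeans(X, k, lam=1.0, growth=1.3, iters=10, seed=0):
    """Toy single-view self-paced clustering loop (assumed setup):
    re-weight examples, refit centroids on the weighted data,
    then raise lam so harder examples enter the objective."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)].astype(float)
    assign = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # Squared distances to each centroid and current hard assignments.
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        assign = d.argmin(1)
        losses = d[np.arange(len(X)), assign]
        # Per-example self-paced weights (smoothed variant).
        v = spl_weights_smoothed(losses, lam)
        # Weighted centroid update; small constant avoids all-zero weights.
        for j in range(k):
            m = assign == j
            if m.any():
                centers[j] = np.average(X[m], axis=0, weights=v[m] + 1e-12)
        lam *= growth  # admit progressively 'harder' examples
    return centers, assign
```

A multi-view extension in the spirit of the abstract would maintain one such loss per view and apply the smoothed weighting along both axes, so that unreliable views (e.g., those with many outliers or missing entries) contribute little early on and are phased in as the model matures.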