Partial Multi-Label Learning via Multi-Subspace Representation

Ziwei Li, Gengyu Lyu, Songhe Feng

Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 2612-2618. https://doi.org/10.24963/ijcai.2020/362

Partial Multi-Label Learning (PML) aims to learn from training data where each instance is associated with a set of candidate labels, of which only a subset are relevant. Existing PML methods mainly focus on label disambiguation while neglecting noise in the feature space. To tackle this problem, we propose a novel framework named partial multi-label learning via MUlti-SubspacE Representation (MUSER), in which redundant labels and noisy features are jointly taken into consideration during training. Specifically, we first decompose the original label space into a latent label subspace and a label correlation matrix to reduce the negative effects of redundant labels; we then exploit the correlations among features to project the original noisy feature space onto a feature subspace that resists noisy feature information. Afterwards, we introduce a graph Laplacian regularization that constrains the label subspace to preserve the intrinsic structure among features, and impose an orthogonality constraint on the feature correlations to guarantee the discriminability of the feature subspace. Extensive experiments conducted on various datasets demonstrate the superiority of our proposed method.
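To make the pipeline concrete, below is a minimal NumPy sketch of one way such an objective could be assembled. It is an illustrative guess, not the authors' formulation: the candidate label matrix Y is factorized as Y ≈ U C (U a latent label subspace, C a label correlation matrix), features X are projected to a subspace Z = X P with an orthogonal P, a regressor W maps Z to U, and a k-NN graph Laplacian L keeps U smooth. All symbols, the squared-loss terms, the alternating gradient solver, and the hyper-parameters are assumptions introduced here for illustration.

```python
# Hypothetical MUSER-style sketch (NOT the paper's exact method):
#   min_{U,C,P,W}  0.5||UC - Y||^2 + lam*0.5||XPW - U||^2 + beta*0.5*tr(U^T L U)
#   s.t.  P^T P = I   (orthogonality on the feature projection)
import numpy as np

def knn_laplacian(X, k=5):
    """Unnormalized graph Laplacian from a simple k-NN similarity graph."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    S = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:k + 1]                  # k nearest neighbours, skip self
        S[i, idx] = np.exp(-d2[i, idx] / (d2[i, idx].mean() + 1e-12))
    S = (S + S.T) / 2                                     # symmetrize the graph
    return np.diag(S.sum(1)) - S

def muser_sketch(X, Y, k_lab=8, k_feat=20, lam=1.0, beta=0.1, iters=200, lr=1e-3, seed=0):
    """Alternating gradient updates for the assumed objective (illustrative only)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    q = Y.shape[1]
    U = rng.standard_normal((n, k_lab)) * 0.01            # latent label subspace
    C = rng.standard_normal((k_lab, q)) * 0.01            # label correlation matrix
    P = np.linalg.qr(rng.standard_normal((d, k_feat)))[0] # orthogonal feature projection
    W = rng.standard_normal((k_feat, k_lab)) * 0.01       # subspace regressor
    L = knn_laplacian(X)
    for _ in range(iters):
        Z = X @ P
        E_lab = U @ C - Y                                 # label-reconstruction error
        E_fit = Z @ W - U                                 # regression error in the subspace
        grad_U = E_lab @ C.T - lam * E_fit + beta * (L @ U)
        grad_C = U.T @ E_lab
        grad_W = lam * Z.T @ E_fit
        grad_P = lam * X.T @ (E_fit @ W.T)
        U -= lr * grad_U
        C -= lr * grad_C
        W -= lr * grad_W
        P -= lr * grad_P
        # re-impose P^T P = I by projecting onto the nearest orthonormal matrix
        Uo, _, Vt = np.linalg.svd(P, full_matrices=False)
        P = Uo @ Vt
    return P, W, C                                        # label scores: X @ P @ W @ C

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((60, 30))
    Y = (rng.random((60, 6)) < 0.3).astype(float)         # toy candidate labels with noise
    P, W, C = muser_sketch(X, Y)
    print("predicted score matrix shape:", (X @ P @ W @ C).shape)
```

The alternating scheme mirrors the description in the abstract: the label factorization handles redundant candidate labels, the orthogonal projection denoises the feature space, and the Laplacian term ties the latent labels to the local structure of the data.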
Keywords:
Machine Learning: Multi-instance; Multi-label; Multi-view learning
Data Mining: Feature Extraction, Selection and Dimensionality Reduction