Discriminative Feature Selection via A Structured Sparse Subspace Learning Module

Zheng Wang, Feiping Nie, Lai Tian, Rong Wang, Xuelong Li

Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 3009-3015. https://doi.org/10.24963/ijcai.2020/416

In this paper, we first propose a novel Structured Sparse Subspace Learning (S^3L) module to address the long-standing subspace sparsity issue. Building on the proposed module, we design a new discriminative feature selection method, named Subspace Sparsity Discriminant Feature Selection (S^2DFS), which enables the following new functionalities: 1) The proposed S^2DFS method directly combines a trace ratio objective with a structured sparse subspace constraint via the L2,0-norm to learn a row-sparse subspace, which improves the discriminability of the model and avoids the parameter-tuning burden of methods that rely on L2,1-norm regularization; 2) An alternating iterative optimization algorithm based on the proposed S^3L module is presented to solve the proposed problem explicitly, with a closed-form solution and a strict convergence proof. To the best of our knowledge, this objective function and solver are proposed here for the first time, which provides a new direction for the development of feature selection methods. Extensive experiments conducted on several high-dimensional datasets demonstrate the discriminability of the features selected by S^2DFS in comparison with several related state-of-the-art feature selection methods. Source MATLAB code: https://github.com/StevenWangNPU/L20-FS.
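To illustrate the row-sparsity idea behind the L2,0-norm constraint, the sketch below shows how a projection matrix with at most k nonzero rows directly induces feature selection: the surviving row indices are the selected features. This is a minimal illustration of the constraint itself, not the authors' S^2DFS algorithm; the function name and the Euclidean-projection strategy (keep the k rows with largest L2 norms) are our own assumptions for exposition.

```python
import numpy as np

def project_row_sparse(W, k):
    """Project W onto the set of matrices with at most k nonzero rows
    (the L2,0-norm constraint ||W||_{2,0} <= k): keep the k rows with
    the largest L2 norms and zero out the rest.

    Note: this is an illustrative projection step, not the full
    S^2DFS trace-ratio solver described in the paper."""
    row_norms = np.linalg.norm(W, axis=1)          # L2 norm of each row
    keep = np.sort(np.argsort(row_norms)[-k:])     # indices of top-k rows
    W_sparse = np.zeros_like(W)
    W_sparse[keep] = W[keep]                       # selected features survive
    return W_sparse, keep

# Usage: rows of W correspond to original features; nonzero rows of the
# projected matrix give the selected feature indices.
W = np.array([[1.0, 0.0],
              [0.0, 3.0],
              [0.5, 0.5],
              [2.0, 2.0]])
W_sparse, selected = project_row_sparse(W, k=2)
print(selected)  # indices of the 2 features with largest row norms
```

Because the nonzero rows are chosen jointly for the whole projection matrix (rather than entry-wise), the resulting sparsity pattern is structured: each retained feature contributes to every subspace dimension, which is what makes the nonzero-row indices directly interpretable as selected features.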
Keywords:
Machine Learning: Feature Selection; Learning Sparse Models
Data Mining: Feature Extraction, Selection and Dimensionality Reduction