Direct Sparsity Optimization Based Feature Selection for Multi-Class Classification / 1918
Hanyang Peng, Yong Fan
A novel sparsity optimization method is proposed to select features for multi-class classification problems by directly optimizing an ℓ2,p-norm (0 < p ≤ 1) based sparsity function subject to data-fitting inequality constraints, which yields large between-class margins. The direct sparse optimization circumvents the empirical tuning of regularization parameters required by existing feature selection methods that adopt the sparsity model as a regularization term. To solve the direct sparsity optimization problem, which is non-smooth and non-convex when 0 < p < 1, we propose an efficient iterative algorithm with proven convergence that converts it to a convex, smooth optimization problem at each iteration step. The proposed algorithm has been evaluated on publicly available datasets. The experiments demonstrate that our algorithm achieves feature selection performance competitive with state-of-the-art algorithms.
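To make the central quantity concrete, the following sketch computes the ℓ2,p-norm of a feature-weight matrix as it is typically defined in this setting: the ℓ2 norm is taken over each row (one feature's weights across all classes), and the p-norm over the resulting row norms promotes row-wise sparsity, so entire features are discarded. This is an illustrative helper, not the paper's solver; the variable names and layout (features × classes) are assumptions.

```python
import numpy as np

def l2p_norm(W, p=1.0):
    """l2,p-norm of a weight matrix W (rows = features, cols = classes),
    computed as (sum_i ||w_i||_2^p)^(1/p) for 0 < p <= 1.
    Small p pushes whole rows (features) toward zero."""
    row_norms = np.linalg.norm(W, axis=1)       # ||w_i||_2 for each feature row
    return np.sum(row_norms ** p) ** (1.0 / p)  # p-norm over the row norms

# Hypothetical weights: a row-sparse W (only one active feature) scores
# lower under small p than a dense W, which is what drives feature selection.
W_sparse = np.array([[3.0, 4.0], [0.0, 0.0], [0.0, 0.0]])
W_dense  = np.array([[1.0, 2.0], [2.0, 2.0], [2.0, 1.0]])
print(l2p_norm(W_sparse, p=0.5))  # smaller: weight mass concentrated in one row
print(l2p_norm(W_dense,  p=0.5))  # larger: weight mass spread over all rows
```

At p = 1 this reduces to the familiar ℓ2,1 norm (the sum of row ℓ2 norms); for 0 < p < 1 the objective is non-convex, which is why the paper's iterative convexification is needed.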