Feature Selection via Scaling Factor Integrated Multi-Class Support Vector Machines

Jinglin Xu, Feiping Nie, Junwei Han

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 3168-3174. https://doi.org/10.24963/ijcai.2017/442

In data mining, we often encounter high-dimensional and noisy features, which not only increase the computational load but can also lead to model overfitting. Feature selection is often adopted to address this issue. In this paper, we propose a novel feature selection method based on multi-class SVM, which introduces a scaling factor with a flexible parameter to reshape the distribution of feature weights and select the most discriminative features. Concretely, the proposed method designs a scaling factor raised to the power p/2 to adaptively control the weight distribution and search for the optimal sparsity of the weighting matrix. In addition, to solve the proposed model, we provide an alternating, iterative optimization method. It not only decouples the solutions of the weighting matrix and the scaling factor, but also offers a better way to handle the ℓ2,0-norm minimization problem. Comprehensive experiments on six datasets demonstrate that this work obtains better performance than a number of existing state-of-the-art multi-class feature selection methods.
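As an illustration of the general idea, feature selection with a row-sparsity-inducing ℓ2,p penalty can be approached by iteratively reweighting a diagonal scaling term and ranking features by the row norms of the learned weighting matrix. The sketch below is a minimal, hypothetical example using a least-squares surrogate loss rather than the paper's multi-class SVM hinge loss; the function name, the IRLS-style update, and all parameter choices are assumptions for illustration only.

```python
import numpy as np

def select_features_l2p(X, Y, n_select, p=0.8, lam=1.0, n_iter=30, eps=1e-8):
    """Rank features by the l2 row norms of a weighting matrix W learned
    under an l2,p row-sparsity penalty, using iteratively reweighted
    least squares (IRLS).

    X: (n, d) data matrix; Y: (n, c) one-hot label matrix.
    Returns (indices of the n_select top-scoring features, all d scores).
    NOTE: this is a hypothetical least-squares surrogate, not the
    paper's SVM-based objective or optimization scheme.
    """
    n, d = X.shape
    D = np.ones(d)  # diagonal reweighting (scaling) factors
    W = np.zeros((d, Y.shape[1]))
    for _ in range(n_iter):
        # Solve (X^T X + lam * diag(D)) W = X^T Y for fixed D.
        A = X.T @ X + lam * np.diag(D)
        W = np.linalg.solve(A, X.T @ Y)
        # IRLS update: d_jj = (p/2) * ||w_j||^(p-2) majorizes the
        # l2,p penalty; eps guards against vanishing row norms.
        row_norms = np.linalg.norm(W, axis=1) + eps
        D = (p / 2.0) * row_norms ** (p - 2.0)
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(scores)[::-1][:n_select], scores
```

Smaller values of p push more rows of W toward zero, so the surviving large-norm rows identify the discriminative features; this mirrors, in spirit, how a p/2-power scaling factor can steer the sparsity of the weighting matrix.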
Keywords:
Machine Learning: Classification
Machine Learning: Feature Selection/Construction
Machine Learning: Machine Learning