Multiple Indefinite Kernel Learning for Feature Selection

Hui Xue, Yu Song, Hai-Ming Xu

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 3210-3216. https://doi.org/10.24963/ijcai.2017/448

Multiple kernel learning for feature selection (MKL-FS) utilizes kernels to capture complex properties of features and performs well among embedded methods. However, the kernels in MKL-FS are generally required to be positive definite. In practice, indefinite kernels often arise in real applications and can achieve better empirical performance. Yet, owing to the non-convexity induced by indefinite kernels, existing MKL-FS methods are usually inapplicable, and research on this setting remains scarce. In this paper, we propose a novel multiple indefinite kernel feature selection method (MIK-FS) based on the primal framework of the indefinite kernel support vector machine (IKSVM), which assigns an indefinite base kernel to each feature and imposes an l1-norm constraint on the kernel combination coefficients so that features are selected automatically. A two-stage algorithm is further presented to alternately optimize the IKSVM coefficients and the kernel combination coefficients. In this algorithm, we reformulate the non-convex primal IKSVM optimization problem as a difference of convex functions (DC) program and transform it into a convex problem via an affine minorization approximation. Experiments on real-world datasets demonstrate that MIK-FS is superior to some related state-of-the-art methods in both feature selection and classification performance.
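To make the alternating two-stage idea in the abstract concrete, the sketch below illustrates only the alternation between fitting a kernel machine and updating l1-penalized per-feature kernel weights whose sparsity pattern selects features. It is a minimal simplification, not the paper's algorithm: it assumes positive-definite per-feature RBF base kernels and a kernel-ridge surrogate with a proximal l1 step in place of the indefinite kernels, the primal IKSVM, and the DC-programming machinery described above. The function names (rbf_kernel_1d, mik_fs_sketch) and all hyperparameter values are hypothetical.

```python
import numpy as np


def rbf_kernel_1d(x, gamma=1.0):
    """Positive-definite RBF base kernel built from a single feature column x of shape (n,)."""
    d = x[:, None] - x[None, :]
    return np.exp(-gamma * d ** 2)


def mik_fs_sketch(X, y, rho=0.1, lam=1.0, eta=0.1, n_iter=100):
    """Simplified two-stage alternation (a sketch, not the paper's IKSVM/DC method):
    stage 1 fits a kernel-ridge model on the combined kernel (alpha-step);
    stage 2 takes one proximal-gradient step on the nonnegative kernel weights mu
    with an l1 penalty (mu-step). Features whose weight stays nonzero are selected."""
    n, p = X.shape
    base = [rbf_kernel_1d(X[:, j]) for j in range(p)]   # one base kernel per feature
    mu = np.full(p, 1.0 / p)                            # uniform initial weights
    for _ in range(n_iter):
        # alpha-step: closed-form kernel ridge on K(mu) = sum_j mu_j * K_j
        K = sum(m * Kj for m, Kj in zip(mu, base))
        alpha = np.linalg.solve(K + lam * np.eye(n), y)
        # mu-step: gradient of (1/n) * ||y - K(mu) alpha||^2 w.r.t. each mu_j,
        # followed by soft-thresholding that enforces sparsity and mu >= 0
        resid = y - K @ alpha
        grad = np.array([-(2.0 / n) * (Kj @ alpha) @ resid for Kj in base])
        mu = np.maximum(0.0, mu - eta * grad - eta * rho)
    return mu, np.flatnonzero(mu > 1e-8)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((80, 10))
    # synthetic labels: only the first two features carry signal
    y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.standard_normal(80))
    mu, selected = mik_fs_sketch(X, y)
    print("kernel weights:", np.round(mu, 3))
    print("selected features:", selected)
```

The design point the sketch preserves is that sparsity in the kernel combination coefficients directly yields feature selection: the l1 proximal step drives the weights of uninformative per-feature kernels toward zero while the alternation refits the learner on the surviving kernels.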
Keywords:
Machine Learning: Feature Selection/Construction
Machine Learning: Kernel Methods
Machine Learning: Machine Learning