Spectral Feature Scaling Method for Supervised Dimensionality Reduction

Momo Matsuda, Keiichi Morikuni, Tetsuya Sakurai

Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Main track. Pages 2560-2566. https://doi.org/10.24963/ijcai.2018/355

Spectral dimensionality reduction methods enable linear separation of complex, high-dimensional data in a reduced space. However, these methods do not always give the desired results because of irregularities or uncertainties in the data. We therefore consider aggressively modifying the scales of the features to obtain the desired classification. Using prior knowledge of the labels of a subset of the samples to specify the Fiedler vector, we formulate an eigenvalue problem for a linear matrix pencil whose eigenvector contains the feature scaling factors. The resulting factors modify the features of all samples so that they form clusters in the reduced space according to the known labels. In this study, we propose new supervised dimensionality reduction methods based on this feature scaling combined with spectral clustering. Numerical experiments show that the proposed methods outperform well-established supervised methods on toy problems with more samples than features, and are more robust in clustering than existing methods. The proposed methods also outperform existing methods in classification on real-world problems with more features than samples, namely gene expression profiles of cancer diseases. Furthermore, the feature scaling tends to improve the clustering and classification accuracies of existing unsupervised methods as the proportion of training data increases.
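As background to the abstract, the sketch below illustrates the standard spectral-clustering pipeline the paper builds on: form an RBF affinity matrix, take the unnormalized graph Laplacian L = D - W, and use the Fiedler vector (the eigenvector of the second-smallest eigenvalue) as a one-dimensional embedding whose sign pattern separates the clusters. This is a minimal, assumed illustration of the underlying machinery only; the paper's contribution, solving a matrix-pencil eigenproblem for the feature scaling factors, is not reproduced here, and the function name and parameters are hypothetical.

```python
import numpy as np

def fiedler_embedding(X, sigma=1.0):
    """One-dimensional spectral embedding via the Fiedler vector.

    X: (n_samples, n_features) data matrix.
    sigma: RBF bandwidth (an assumed, illustrative default).
    """
    # Pairwise squared Euclidean distances
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    # RBF affinity matrix with zero diagonal
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Unnormalized graph Laplacian L = D - W
    L = np.diag(W.sum(axis=1)) - W
    # eigh returns eigenvalues in ascending order; the second
    # eigenvector is the Fiedler vector.
    _, vecs = np.linalg.eigh(L)
    return vecs[:, 1]

# Two well-separated Gaussian blobs in 5 dimensions: the sign of the
# Fiedler vector recovers the two groups.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (20, 5)),
               rng.normal(3.0, 0.1, (20, 5))])
labels = (fiedler_embedding(X) > 0).astype(int)
```

On such well-separated data the affinity between the blobs is nearly zero, so the Fiedler vector is close to a signed indicator of the two components and thresholding it at zero yields the clustering.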
Keywords:
Machine Learning: Classification
Machine Learning: Data Mining
Machine Learning: Knowledge-based Learning
Machine Learning: Dimensionality Reduction and Manifold Learning
Machine Learning: Clustering