Flexible Orthogonal Neighborhood Preserving Embedding

Tianji Pang, Feiping Nie, Junwei Han

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 2592-2598. https://doi.org/10.24963/ijcai.2017/361

In this paper, we propose a novel linear subspace learning algorithm called Flexible Orthogonal Neighborhood Preserving Embedding (FONPE), which is a linear approximation of the Locally Linear Embedding (LLE) algorithm. Our objective function integrates two terms, one capturing manifold smoothness and a flexible penalty defined on the projection fitness. Unlike Neighborhood Preserving Embedding (NPE), we relax the hard constraint by modeling the mismatch between the approximate linear embedding and the original nonlinear embedding rather than forcing them to be equal, which allows the model to better cope with data sampled from a nonlinear manifold. Moreover, instead of enforcing orthogonality among the projected points, we enforce the mapping itself to be orthogonal. In this way, FONPE tends to preserve distances, so the overall geometry of the data can be preserved. Unlike LLE, FONPE has an explicit linear mapping between the input and the reduced spaces, so it can handle novel test data straightforwardly. Moreover, when the projection matrix in our model becomes an identity matrix, the model reduces to denoising LLE (DLLE). Compared with standard LLE, we demonstrate that DLLE handles noisy data better. Comprehensive experiments on several benchmark databases demonstrate the effectiveness of our algorithm.
Keywords:
Machine Learning: Data Mining
Machine Learning: Machine Learning
Machine Learning: Unsupervised Learning
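
The abstract describes FONPE without formulas; the following minimal NumPy sketch illustrates one plausible reading of it: LLE-style reconstruction weights W, a smoothness matrix M = (I - W)^T (I - W), and an objective of the form tr(F^T M F) + beta * ||X P - F||_F^2 with an orthogonal projection P (P^T P = I). The closed-form elimination of F and the eigen-solver below are our own reconstruction under these assumptions, not the authors' published optimization, and names such as fonpe_fit are hypothetical.

import numpy as np
from scipy.spatial.distance import cdist

def lle_weights(X, k=10, reg=1e-3):
    # Barycentric LLE reconstruction weights: each sample is reconstructed
    # from its k nearest neighbors (rows of X are samples).
    n = X.shape[0]
    D = cdist(X, X)
    np.fill_diagonal(D, np.inf)
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D[i])[:k]
        Z = X[idx] - X[i]                        # neighborhood centered at x_i
        C = Z @ Z.T
        C += reg * np.trace(C) * np.eye(k)       # regularize for stability
        w = np.linalg.solve(C, np.ones(k))
        W[i, idx] = w / w.sum()                  # weights sum to one
    return W

def fonpe_fit(X, d=2, k=10, beta=1.0):
    # Hypothetical FONPE-style fit: returns an orthogonal projection P (D x d)
    # and the flexible embedding F (n x d). X is n x D (rows = samples).
    n = X.shape[0]
    W = lle_weights(X, k)
    I_n = np.eye(n)
    M = (I_n - W).T @ (I_n - W)                  # manifold smoothness matrix
    # Eliminating F in closed form (F = beta * (M + beta I)^{-1} X P) turns the
    # objective into tr(P^T X^T S X P) with S = beta * M (M + beta I)^{-1}
    # (our derivation under the assumed objective, not taken from the paper).
    S = beta * M @ np.linalg.inv(M + beta * I_n)
    G = X.T @ S @ X
    G = (G + G.T) / 2                            # symmetrize against round-off
    _, eigvecs = np.linalg.eigh(G)
    P = eigvecs[:, :d]                           # orthonormal columns: P^T P = I
    F = beta * np.linalg.inv(M + beta * I_n) @ X @ P
    return P, F

# Toy usage, for illustration only: the explicit linear map P handles
# out-of-sample points directly, the property the abstract emphasizes.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 20))
X_test = rng.normal(size=(5, 20))
P, F = fonpe_fit(X_train, d=2, k=8, beta=1.0)
Y_test = X_test @ P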