Robust Learning from Noisy Side-information by Semidefinite Programming
En-Liang Hu, Quanming Yao

Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
Main track. Pages 2514-2520. https://doi.org/10.24963/ijcai.2019/349

Robustness has recently become a major concern in the machine learning community, since learning algorithms are often vulnerable to outliers and corruptions. Motivated by this trend, we pursue robustness in semidefinite programming (SDP). Specifically, we replace the commonly used squared loss with the more robust L1-loss in low-rank SDP. However, the resulting objective is neither convex nor smooth, so no existing algorithms can be applied. We therefore design an efficient algorithm, based on majorization-minimization, to optimize the objective. The proposed algorithm not only has cheap iterations and low space complexity but also provably converges to a critical point. Finally, an empirical study shows that the new objective, armed with the proposed algorithm, outperforms the state-of-the-art in both speed and accuracy.
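To make the key algorithmic idea concrete, here is a hedged sketch (not the paper's actual low-rank SDP solver): the non-smooth term |r| can be majorized at the current residual r_k by the quadratic r^2 / (2 max(|r_k|, eps)) plus a constant, so each majorization-minimization step reduces to a weighted least-squares problem. The toy robust regression below, the function name `l1_regression_mm`, and all parameters are illustrative assumptions.

```python
import numpy as np

def l1_regression_mm(A, b, iters=50, eps=1e-8):
    """Minimize ||A x - b||_1 by majorization-minimization (IRLS).

    Each |r_i| is majorized by r_i^2 / (2 max(|r_i|, eps)) + const,
    so every MM step is a weighted least-squares solve.
    This is only an illustration of the MM principle, not the
    paper's low-rank SDP algorithm.
    """
    # Warm start from the ordinary squared-loss solution.
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(iters):
        r = A @ x - b
        # Square roots of the MM weights 1 / (2 max(|r_i|, eps)),
        # applied row-wise so lstsq solves the weighted problem.
        w = 1.0 / np.sqrt(2.0 * np.maximum(np.abs(r), eps))
        x = np.linalg.lstsq(w[:, None] * A, w * b, rcond=None)[0]
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 3))
    x_true = np.array([1.0, -2.0, 0.5])
    b = A @ x_true
    b[:20] += 50.0 * rng.standard_normal(20)  # gross outliers
    print(l1_regression_mm(A, b))
```

Because the clean equations are exactly consistent in this toy setup, the L1 solution essentially recovers `x_true`, while a plain squared-loss fit is pulled away by the outliers; this is the robustness gap the abstract refers to.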
Keywords:
Machine Learning: Semi-Supervised Learning
Machine Learning: Feature Selection; Learning Sparse Models
Machine Learning: Dimensionality Reduction and Manifold Learning
Machine Learning Applications: Big Data; Scalability