Multi-Class Support Vector Machine via Maximizing Multi-Class Margins

Jie Xu, Xianglong Liu, Zhouyuan Huo, Cheng Deng, Feiping Nie, Heng Huang

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 3154-3160. https://doi.org/10.24963/ijcai.2017/440

Support Vector Machine (SVM) was originally proposed as a binary classification model and has achieved great success in many applications. In practice, however, problems with more than two classes arise frequently, so it is natural to extend SVM to a multi-class classifier. Many approaches have been proposed to construct multi-class classifiers from binary SVMs, such as the one-versus-all strategy, the one-versus-one strategy, and Weston's multi-class SVM. The one-versus-all and one-versus-one strategies split the multi-class problem into multiple binary classification subproblems, which requires training multiple binary classifiers. Weston's multi-class SVM is formed by enforcing risk constraints and imposing a specific regularization, such as the Frobenius norm; it is not derived by maximizing the margin between the hyperplane and the training data, which is the original motivation of SVM. In this paper, we propose a multi-class SVM model from the perspective of maximizing the margin between the training points and the hyperplane, and we analyze the relation between our model and other related methods. Experiments show that our model achieves better or comparable results compared with other related methods.
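For concreteness, one common statement of Weston and Watkins' multi-class SVM (a sketch of the standard formulation, not reproduced from this paper) couples a Frobenius-norm regularizer with per-sample risk constraints:

\min_{W,\,b}\;\frac{1}{2}\sum_{k=1}^{K}\lVert w_k\rVert^2 + C\sum_{i=1}^{n}\sum_{k\neq y_i}\xi_i^k
\quad\text{s.t.}\quad w_{y_i}^{\top}x_i + b_{y_i} \;\geq\; w_k^{\top}x_i + b_k + 2 - \xi_i^k,\qquad \xi_i^k \geq 0,\;\;\forall\, k\neq y_i,

where \sum_k \lVert w_k\rVert^2 = \lVert W\rVert_F^2. The constraints only require the true class score to dominate every rival class score; no margin between a hyperplane and the training points is maximized directly, which is the gap the abstract points out.

The two decomposition strategies can also be tried directly with off-the-shelf binary SVMs. Below is a minimal sketch using scikit-learn; the dataset, estimator choice, and default hyperparameters are illustrative assumptions, not the paper's experimental setup:

# Contrast the one-versus-all and one-versus-one decompositions
# mentioned in the abstract, each built on a binary linear SVM.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier, OneVsOneClassifier
from sklearn.svm import LinearSVC

X, y = load_iris(return_X_y=True)  # small 3-class toy problem
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# One versus all: trains one binary SVM per class (3 classifiers here).
ova = OneVsRestClassifier(LinearSVC()).fit(X_tr, y_tr)

# One versus one: trains one binary SVM per class pair (3 pairs here).
ovo = OneVsOneClassifier(LinearSVC()).fit(X_tr, y_tr)

print("one-vs-all accuracy:", ova.score(X_te, y_te))
print("one-vs-one accuracy:", ovo.score(X_te, y_te))

In general, one-versus-all trains K binary classifiers for K classes while one-versus-one trains K(K-1)/2; both leave the underlying model a binary SVM, in contrast with the single multi-class margin objective proposed in the paper.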
Keywords:
Machine Learning: Classification
Machine Learning: Semi-Supervised Learning