Incremental and Decremental Optimal Margin Distribution Learning

Li-Jun Chen, Teng Zhang, Xuanhua Shi, Hai Jin

Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence
Main Track. Pages 3523-3531. https://doi.org/10.24963/ijcai.2023/392

Incremental and decremental learning (IDL) deals with tasks where new data arrives sequentially as a stream or old data becomes unavailable continually due to privacy protection. Existing IDL methods mainly focus on the support vector machine and its variants with linear-type losses. There are few studies on quadratic-type losses, whose Lagrange multipliers are unbounded and much more difficult to track. In this paper, we take the latest statistical learning framework, the optimal margin distribution machine (ODM), which involves a quadratic-type loss due to the optimization of the margin variance, as an example, and equip it with the ability to handle IDL tasks. Our proposed ID-ODM avoids updating the Lagrange multipliers over an infinite range by determining their optimal values beforehand, and thus enjoys much higher efficiency. Moreover, ID-ODM is also applicable when multiple instances arrive and leave simultaneously. Extensive empirical studies show that ID-ODM achieves a 9.1x speedup on average with almost no generalization loss compared to retraining ODM on the new data set from scratch.
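For context, the quadratic-type loss mentioned above arises because ODM penalizes the squared deviation of each instance's margin from a target band, rather than using a hinge-style linear penalty. A sketch of the standard ODM primal problem is given below; the abstract itself does not state the formulation, so the symbols (deviation parameter θ, trade-off parameters λ and μ, feature map φ) and the exact scaling follow the commonly cited ODM formulation and should be read as assumptions:

```latex
\min_{\mathbf{w},\,\boldsymbol{\xi},\,\boldsymbol{\epsilon}}\;
\frac{1}{2}\lVert\mathbf{w}\rVert^{2}
+ \frac{\lambda}{m}\sum_{i=1}^{m}\frac{\xi_i^{2} + \mu\,\epsilon_i^{2}}{(1-\theta)^{2}}
\quad\text{s.t.}\quad
1-\theta-\xi_i \;\le\; y_i\,\mathbf{w}^{\top}\phi(\mathbf{x}_i) \;\le\; 1+\theta+\epsilon_i,
\qquad i=1,\dots,m.
```

Because the slack variables are penalized quadratically, the dual Lagrange multipliers carry no box upper bound (unlike the hinge-loss SVM, where they are confined to [0, C]); this unboundedness is the tracking difficulty that ID-ODM sidesteps by determining the multipliers' optimal values beforehand.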
Keywords:
Machine Learning: ML: Classification
Machine Learning: ML: Incremental learning