Accelerated Difference of Convex functions Algorithm and its Application to Sparse Binary Logistic Regression

Duy Nhat Phan, Hoai Minh Le, Hoai An Le Thi

Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Main track. Pages 1369-1375. https://doi.org/10.24963/ijcai.2018/190

In this work, we present a variant of DCA (Difference of Convex functions Algorithm) with the aim of improving its convergence speed. The proposed algorithm, named Accelerated DCA (ADCA), incorporates Nesterov's acceleration technique into DCA. We first investigate ADCA for solving the standard DC program and rigorously study its convergence properties and convergence rate. Secondly, we develop ADCA for a special case of the standard DC program whose objective function is the sum of a differentiable function with L-Lipschitz gradient (possibly nonconvex) and a nonsmooth DC function. We exploit the special structure of the problem to propose an efficient DC decomposition for which the corresponding ADCA scheme is inexpensive. As an application, we consider the sparse binary logistic regression problem. Numerical experiments on several benchmark datasets illustrate the efficiency of our algorithm and its superiority over well-known methods.
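The abstract's recipe, Nesterov-style extrapolation wrapped around a DCA step for a smooth L-Lipschitz-gradient loss plus a nonsmooth regularizer, can be sketched as follows. This is a simplified illustration, not the paper's exact scheme: the regularizer here is the plain convex l1 norm (the paper uses nonconvex sparsity-inducing DC penalties), the convex subproblem reduces to a proximal-gradient (soft-thresholding) step, and the function names, the extrapolation-acceptance window `q`, and all parameter values are assumptions for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_loss(X, y, w):
    # averaged logistic loss, labels y in {-1, +1}
    return np.mean(np.log1p(np.exp(-y * (X @ w))))

def grad_logistic(X, y, w):
    # gradient of the averaged logistic loss
    z = y * (X @ w)
    return -(X.T @ (y * sigmoid(-z))) / X.shape[0]

def soft_threshold(v, t):
    # proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def adca_sparse_logreg(X, y, lam=0.01, n_iter=200, q=2):
    """ADCA-style sketch: Nesterov extrapolation + a DCA/proximal step."""
    n, d = X.shape
    # Lipschitz constant of the logistic-loss gradient: ||X||_2^2 / (4n)
    L = 0.25 * np.linalg.norm(X, 2) ** 2 / n
    F = lambda w: logistic_loss(X, y, w) + lam * np.abs(w).sum()
    w_prev = w = np.zeros(d)
    t = 1.0
    hist = [F(w)]
    for _ in range(n_iter):
        t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
        v = w + (t - 1) / t_next * (w - w_prev)  # Nesterov extrapolation
        # accept the extrapolated point only if its objective does not
        # exceed the worst of the q most recent objective values
        u = v if F(v) <= max(hist[-q:]) else w
        # DCA step for this decomposition = one proximal-gradient step
        w_prev, w = w, soft_threshold(u - grad_logistic(X, y, u) / L, lam / L)
        t = t_next
        hist.append(F(w))
    return w, hist

# usage on synthetic sparse data
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
w_true = np.zeros(50)
w_true[:5] = 1.0
y = np.sign(X @ w_true + 0.1 * rng.standard_normal(200))
w_hat, hist = adca_sparse_logreg(X, y)
```

The acceptance test on the extrapolated point is what distinguishes this scheme from plain FISTA-style acceleration: when the candidate `v` increases the objective beyond the recent history, the iteration falls back to the last iterate, which is the kind of safeguard needed for nonconvex objectives.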
Keywords:
Constraints and SAT: Constraint Optimisation
Machine Learning: Classification
Machine Learning: Feature Selection; Learning Sparse Models