Proximal Gradient Algorithm with Momentum and Flexible Parameter Restart for Nonconvex Optimization

Yi Zhou, Zhe Wang, Kaiyi Ji, Yingbin Liang, Vahid Tarokh

Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 1445-1451. https://doi.org/10.24963/ijcai.2020/201

Various parameter restart schemes have been proposed for proximal gradient algorithms with momentum to facilitate their convergence in convex optimization. In nonconvex optimization, however, the convergence of the proximal gradient algorithm with momentum under parameter restart remains unclear. In this paper, we propose a novel proximal gradient algorithm with momentum and parameter restart for solving nonconvex and nonsmooth problems. Our algorithm is designed to 1) allow flexible parameter restart schemes that cover many existing ones; 2) achieve a global sub-linear convergence rate in nonconvex and nonsmooth optimization; and 3) guarantee convergence to a critical point, with various types of asymptotic convergence rates depending on the parameterization of the local geometry. Numerical experiments demonstrate the convergence and effectiveness of the proposed algorithm.
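To illustrate the kind of method the abstract describes, the sketch below shows a standard accelerated proximal gradient (FISTA-style momentum) loop with one common restart test, the O'Donoghue-Candes gradient-based restart, applied to an l1-regularized problem. This is a minimal illustrative sketch of the general algorithm family, not the paper's exact algorithm or restart conditions; the function names and the choice of restart test are assumptions for illustration.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def apg_with_restart(grad_f, prox, x0, step, n_iters=500):
    # Accelerated proximal gradient with momentum and a restart test.
    # Illustrative sketch only; the paper's algorithm admits a broader
    # class of flexible restart schemes than the single test used here.
    x_prev = x0.copy()
    x = x0.copy()
    t = 1.0  # momentum parameter
    for _ in range(n_iters):
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # Extrapolation (momentum) step.
        y = x + ((t - 1.0) / t_next) * (x - x_prev)
        # Proximal gradient step at the extrapolated point.
        x_next = prox(y - step * grad_f(y), step)
        # Restart: if the update moves against the momentum direction,
        # reset the momentum parameter (gradient-based restart test).
        if np.dot(y - x_next, x_next - x) > 0:
            t_next = 1.0
        x_prev, x, t = x, x_next, t_next
    return x

# Usage: minimize 0.5 * ||x - c||^2 + lam * ||x||_1, whose unique
# minimizer is soft_threshold(c, lam).
c = np.array([3.0, -0.5, 1.0])
lam = 1.0
x_star = apg_with_restart(
    grad_f=lambda x: x - c,
    prox=lambda v, s: soft_threshold(v, s * lam),
    x0=np.zeros_like(c),
    step=1.0,
)
```

For this separable quadratic the minimizer can be checked in closed form, which makes the sketch easy to verify; for the nonconvex, nonsmooth problems the paper targets, only convergence to a critical point is guaranteed.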
Keywords:
Data Mining: Theoretical Foundations
Constraints and SAT: Constraint Optimization