Differential Equations for Modeling Asynchronous Algorithms
Li He, Qi Meng, Wei Chen, Zhi-Ming Ma, Tie-Yan Liu
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Main track. Pages 2220-2226.
https://doi.org/10.24963/ijcai.2018/307
Asynchronous stochastic gradient descent (ASGD) is a popular parallel optimization algorithm in machine learning. Most theoretical analyses of ASGD take a discrete view and prove upper bounds on its convergence rate. However, the discrete view has intrinsic limitations: it provides no characterization of the optimization path, and the proof techniques are induction-based and thus usually complicated. Inspired by the recent successful adoption of stochastic differential equations (SDE) in the theoretical analysis of SGD, in this paper we study the continuous approximation of ASGD by stochastic differential delay equations (SDDE). We introduce the approximation method and study the approximation error. We then analyze the convergence rate of ASGD based on the continuous approximation. Two methods can be used to analyze the convergence rate: moment estimation and energy function minimization. Moment estimation depends on the specific form of the loss function, while energy function minimization only leverages the convexity of the loss function and does not depend on its specific form. Beyond the convergence analysis, the continuous view also helps us derive better convergence rates. All of this clearly shows the advantage of taking the continuous view of gradient descent algorithms.
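To make the continuous approximation concrete, the following is a minimal sketch that simulates ASGD with stale gradients alongside an Euler-Maruyama discretization of an SDDE of the form dX(t) = -∇f(X(t - τ)) dt + σ dW(t). The one-dimensional quadratic loss, the fixed staleness tau_steps, and the noise level sigma are illustrative assumptions for this sketch and are not taken from the paper.

```python
# Minimal sketch: ASGD with stale gradients vs. an Euler-Maruyama
# discretization of the approximating SDDE
#   dX(t) = -grad_f(X(t - tau)) dt + sigma dW(t).
# The quadratic loss, staleness, and noise level below are illustrative
# assumptions, not quantities taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def grad_f(x):
    # Gradient of the toy loss f(x) = x^2 / 2.
    return x

eta = 0.01        # learning rate, also used as the discretization step
tau_steps = 5     # gradient staleness, measured in update steps
sigma = 0.1       # strength of the stochastic gradient noise
num_steps = 2000

# Discrete ASGD: each update applies a noisy gradient computed on a
# parameter value that is tau_steps updates old.
x = np.full(tau_steps + 1, 1.0)   # history buffer, x[0] is the oldest iterate
asgd_path = []
for _ in range(num_steps):
    stale = x[0]                                   # iterate from tau_steps ago
    noise = sigma * rng.standard_normal()
    x_new = x[-1] - eta * (grad_f(stale) + noise)
    x = np.append(x[1:], x_new)
    asgd_path.append(x_new)

# Continuous view: Euler-Maruyama discretization of the SDDE with delay
# tau = tau_steps * eta.  The diffusion coefficient sqrt(eta) * sigma makes
# the per-step noise scale match the discrete ASGD update above.
y = np.full(tau_steps + 1, 1.0)
sdde_path = []
for _ in range(num_steps):
    delayed = y[0]
    dW = np.sqrt(eta) * rng.standard_normal()      # Brownian increment
    y_new = y[-1] - eta * grad_f(delayed) + np.sqrt(eta) * sigma * dW
    y = np.append(y[1:], y_new)
    sdde_path.append(y_new)

print("final ASGD iterate:", asgd_path[-1])
print("final SDDE iterate:", sdde_path[-1])
```

Running the sketch, both trajectories decay toward the minimizer at 0 and fluctuate at a comparable noise level, which is the qualitative correspondence the continuous approximation is meant to capture.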
Keywords:
Machine Learning: Learning Theory
Machine Learning: Machine Learning