Accelerating Stratified Sampling SGD by Reconstructing Strata

Weijie Liu, Hui Qian, Chao Zhang, Zebang Shen, Jiahao Xie, Nenggan Zheng

Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 2725-2731. https://doi.org/10.24963/ijcai.2020/378

In this paper, a novel stratified sampling strategy is designed to accelerate mini-batch SGD. We derive a new iteration-dependent surrogate that bounds the stochastic variance from above. To keep the strata minimizing this surrogate with high probability, a stochastic stratification algorithm is applied adaptively: in each iteration, the strata are reconstructed only if an easily verifiable condition is met. Based on this sampling strategy, we propose an accelerated mini-batch SGD algorithm named SGD-RS. Our theoretical analysis shows that the convergence rate of SGD-RS is superior to the state of the art. Numerical experiments corroborate our theory and demonstrate that SGD-RS achieves at least a 3.48x speed-up over vanilla mini-batch SGD.
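To make the idea of stratified mini-batch sampling concrete, the following is a minimal illustrative sketch, not the paper's SGD-RS algorithm: strata are built by a simple k-means-style clustering of example features, and each mini-batch draws from every stratum with allocation proportional to stratum size. The paper instead chooses and reconstructs strata to minimize an iteration-dependent variance surrogate; all function names and the proportional-allocation rule here are illustrative assumptions.

```python
import numpy as np

def build_strata(features, n_strata, n_iters=10, seed=0):
    """Partition examples into strata via a basic k-means loop.
    (Illustrative stand-in for the paper's stratification step.)"""
    rng = np.random.default_rng(seed)
    centers = features[rng.choice(len(features), n_strata, replace=False)].copy()
    for _ in range(n_iters):
        # Assign each example to its nearest center.
        dists = ((features[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = np.argmin(dists, axis=1)
        # Recompute each center as the mean of its assigned examples.
        for k in range(n_strata):
            if np.any(labels == k):
                centers[k] = features[labels == k].mean(axis=0)
    return labels

def stratified_minibatch(labels, batch_size, rng):
    """Draw a mini-batch with per-stratum sample counts proportional
    to stratum size (proportional allocation). Sampling within a
    stratum is without replacement, so indices are unique."""
    idx = []
    for k in np.unique(labels):
        members = np.flatnonzero(labels == k)
        m = max(1, round(batch_size * len(members) / len(labels)))
        idx.extend(rng.choice(members, size=min(m, len(members)), replace=False))
    return np.array(idx[:batch_size])
```

Sampling one point per stratum when the strata are homogeneous is what reduces the gradient-estimate variance relative to uniform sampling; SGD-RS additionally decides, per iteration, whether the current strata still (approximately) minimize the variance surrogate and reconstructs them only when they do not.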
Keywords:
Machine Learning: Deep-learning Theory
Machine Learning: Deep Learning
Machine Learning: Clustering
Machine Learning: Deep Learning: Convolutional networks