On the Convergence of Stochastic Compositional Gradient Descent Ascent Method

Hongchang Gao, Xiaoqian Wang, Lei Luo, Xinghua Shi

Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence
Main Track. Pages 2389-2395. https://doi.org/10.24963/ijcai.2021/329

The compositional minimax problem underlies a wide range of machine learning models, such as the distributionally robust compositional optimization problem. However, how to optimize the compositional minimax problem remains understudied. In this paper, we develop a novel, efficient stochastic compositional gradient descent ascent method for optimizing the compositional minimax problem. Moreover, we establish the theoretical convergence rate of our proposed method. To the best of our knowledge, this is the first work achieving such a convergence rate for the compositional minimax problem. Finally, we conduct extensive experiments to demonstrate the effectiveness of our proposed method.
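To illustrate the problem class, here is a minimal NumPy sketch of a stochastic compositional gradient descent ascent loop on a toy instance min_x max_y y·E[g(x)] − (λ/2)‖y‖², where the inner function g is observed only through noisy samples. The instance, step sizes, and the moving-average tracking rule are illustrative assumptions, not the paper's algorithm or experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy compositional minimax: min_x max_y  y·E[g(x)] - 0.5*lam*||y||^2,
# where the inner expectation E[g(x)] = A x is seen only via noisy samples.
# Hypothetical instance chosen for illustration only.
d, lam = 3, 1.0
A = np.eye(d)

def sample_g(x):
    """One stochastic sample of the inner function g(x) = A x + noise."""
    return A @ x + 0.1 * rng.standard_normal(d)

x = rng.standard_normal(d)
y = np.zeros(d)
u = sample_g(x)            # running estimate of the inner value g(x)
eta, beta = 0.05, 0.5      # step size and tracking weight (assumed values)

for _ in range(2000):
    # Track the inner function value with a moving average, a common
    # device in stochastic compositional optimization.
    u = (1 - beta) * u + beta * sample_g(x)
    # Outer gradients use the tracked estimate u in place of E[g(x)].
    grad_x = A.T @ y             # d/dx of y·(A x)
    grad_y = u - lam * y         # d/dy, strongly concave in y
    x -= eta * grad_x            # descent step on x
    y += eta * grad_y            # ascent step on y

# The unique saddle point of this toy instance is x = y = 0, so both
# iterates should end up close to the origin.
print(np.linalg.norm(x), np.linalg.norm(y))
```

With A = I and λ = 1 the continuous-time dynamics have eigenvalues with negative real part, so the simultaneous descent-ascent iterates spiral into the saddle point up to the noise floor set by the sampling variance.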
Keywords:
Machine Learning: Adversarial Machine Learning
Machine Learning: Cost-Sensitive Learning