HOUDINI: Escaping from Moderately Constrained Saddles

Dmitrii Avdiukhin, Grigory Yaroslavtsev

Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence
Main Track. Pages 3442-3450. https://doi.org/10.24963/ijcai.2023/383

We give polynomial time algorithms for escaping from high-dimensional saddle points under a moderate number of constraints. Given gradient access to a smooth function, we show that (noisy) gradient descent methods can escape from saddle points under a logarithmic number of inequality constraints. While analogous results exist for unconstrained and equality-constrained problems, we make progress on the major open question of convergence to second-order stationary points in the case of inequality constraints, without reliance on NP-oracles or altering the definitions to only account for certain constraints. Our results hold for both regular and stochastic gradient descent.
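For intuition only, below is a minimal generic sketch of the class of methods the abstract refers to: gradient descent that injects a small random perturbation near first-order stationary points so the iterates can escape a saddle, combined with a projection that keeps them feasible for the inequality constraints. This is not the paper's HOUDINI algorithm; the test function, box constraints, step size, and noise radius are illustrative assumptions.

```python
import numpy as np

def grad(x):
    # Gradient of f(x, y) = x^2 - y^2, which has a saddle point at the origin.
    return np.array([2.0 * x[0], -2.0 * x[1]])

def project(x, lo=-1.0, hi=1.0):
    # Projection onto the box [lo, hi]^2, a simple set of inequality constraints.
    return np.clip(x, lo, hi)

def perturbed_projected_gd(x0, eta=0.05, noise_radius=1e-3, steps=200):
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad(x)
        if np.linalg.norm(g) < 1e-4:
            # Near a first-order stationary point: add noise to escape the saddle.
            g = g + noise_radius * np.random.randn(*x.shape)
        x = project(x - eta * g)
    return x

# Starting exactly at the saddle (0, 0), the iterates drift toward a feasible
# point on the boundary with lower function value, e.g. roughly (0, +/-1).
print(perturbed_projected_gd(np.array([0.0, 0.0])))
```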
Keywords:
Machine Learning: ML: Optimization