Backpropagation of Unrolled Solvers with Folded Optimization
James Kotary, My H Dinh, Ferdinando Fioretto
Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence
Main Track. Pages 1963-1970.
https://doi.org/10.24963/ijcai.2023/218
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks. A central challenge in this setting is backpropagation through the solution of an optimization problem, which typically lacks a closed form. A common strategy is algorithm unrolling, which relies on automatic differentiation through the operations of an iterative solver. While flexible and general, unrolling can encounter accuracy and efficiency issues in practice. These issues can be avoided by analytical differentiation of the optimization, but current frameworks impose rigid requirements on the optimization problem's form. This paper provides theoretical insights into the backward pass of unrolled optimization, leading to a system for generating efficiently solvable analytical models of backpropagation. Additionally, it proposes a unifying view of unrolling and analytical differentiation through optimization mappings. Experiments over various model-based learning tasks demonstrate the advantages of the approach both computationally and in terms of enhanced expressiveness.
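To make the idea of unrolling concrete, the following is a minimal, hypothetical sketch (not the paper's system): we solve min_x (x - c)^2 by a fixed number of gradient-descent steps and propagate the derivative of the iterate with respect to the parameter c through each update, i.e., we differentiate through the operations of the iterative solver. The function name, objective, and hyperparameters are illustrative assumptions.

```python
# Toy illustration of algorithm unrolling: differentiate the output of an
# unrolled gradient-descent solver with respect to a problem parameter c,
# using hand-rolled forward-mode derivatives (no autodiff library needed).

def unrolled_solve(c, steps=100, lr=0.1):
    """Minimize (x - c)^2 by unrolled gradient descent.

    Returns the final iterate x and its derivative dx/dc, obtained by
    applying the chain rule through every solver update.
    """
    x, dx = 0.0, 0.0              # iterate and its derivative w.r.t. c
    for _ in range(steps):
        grad = 2.0 * (x - c)      # gradient of the objective in x
        dgrad = 2.0 * (dx - 1.0)  # derivative of that gradient w.r.t. c
        x = x - lr * grad         # solver update
        dx = dx - lr * dgrad      # chain rule through the update
    return x, dx

x_star, dx_star = unrolled_solve(3.0)
# The exact solution is x* = c, so dx*/dc converges to 1 as steps grow.
```

An analytical (implicit) approach would instead differentiate the optimality condition 2(x* - c) = 0 directly, giving dx*/dc = 1 in closed form; the contrast between these two routes is exactly the trade-off the abstract describes.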
Keywords:
Constraint Satisfaction and Optimization: CSO: Constraint optimization
Machine Learning: ML: Applications
Machine Learning: ML: Optimization