Faster Training Algorithms for Structured Sparsity-Inducing Norm

Bin Gu, Xingwang Ju, Xiang Li, Guansheng Zheng

Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Main track. Pages 2163-2169. https://doi.org/10.24963/ijcai.2018/299

Structured-sparsity regularization is popular for sparse learning because of its flexibility in encoding feature structures. This paper considers a generalized version of structured-sparsity regularization (in particular, the $l_1/l_{\infty}$ norm) with arbitrary group overlap. Due to the group overlap, solving the associated proximal operator is time-consuming. Although Mairal~\shortcite{mairal2010network} proposed a network-flow algorithm to solve the proximal operator, it remains time-consuming, especially in the high-dimensional setting. To address this challenge, we develop a more efficient solution for $l_1/l_{\infty}$ group lasso with arbitrary group overlap using an Inexact Proximal-Gradient method. In each iteration, our algorithm only requires an inexact solution to the proximal sub-problem, which can be computed efficiently. On the theoretical side, the proposed algorithm enjoys the same global convergence rate as exact proximal methods. Experiments demonstrate that our algorithm is much more efficient than the network-flow algorithm, while retaining similar generalization performance.
Keywords:
Constraints and SAT: Constraint Optimisation
Constraints and SAT: Constraints and Data Mining; Machine Learning
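
To illustrate the inexact proximal-gradient idea from the abstract, here is a minimal Python sketch. It is not the authors' implementation: the squared loss, the dual block-coordinate approximation of the proximal operator, the step size, and the toy data are illustrative assumptions. Each outer iteration takes a gradient step and then approximates the prox of $\lambda \sum_g \|w_g\|_{\infty}$ with overlapping groups by a few passes of dual block updates, where each block update is an l1-ball projection (the Moreau conjugate of the $l_{\infty}$ norm).

import numpy as np

def project_l1_ball(v, radius):
    # Euclidean projection of v onto the l1-ball {z : ||z||_1 <= radius}.
    if np.abs(v).sum() <= radius:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    idx = np.arange(1, len(u) + 1)
    rho = np.nonzero(u * idx > css - radius)[0][-1]
    theta = (css[rho] - radius) / (rho + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def inexact_prox_overlap_linf(v, groups, lam, n_passes=3):
    # Approximate prox of lam * sum_g ||w_g||_inf with overlapping groups:
    # a few passes of block-coordinate updates on the dual, where each
    # block update is an l1-ball projection. Stopping after n_passes
    # yields an inexact prox solution rather than the exact one.
    Z = [np.zeros(len(g)) for g in groups]   # one dual block per group
    w = v.copy()                             # w = v - sum of dual contributions
    for _ in range(n_passes):
        for i, g in enumerate(groups):
            r = w[g] + Z[i]                  # remove group i's current contribution
            Z[i] = project_l1_ball(r, lam)   # re-solve this dual block exactly
            w[g] = r - Z[i]
    return w

def inexact_proximal_gradient(X, y, groups, lam, n_iter=200):
    # Inexact proximal-gradient for
    #   min_w 0.5 * ||X w - y||^2 + lam * sum_g ||w_g||_inf.
    L = np.linalg.norm(X, 2) ** 2            # Lipschitz constant of the gradient
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)
        w = inexact_prox_overlap_linf(w - grad / L, groups, lam)
    return w

# Toy usage: 20 features covered by overlapping groups of size 5 with stride 3.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 20))
y = X[:, :5] @ rng.standard_normal(5)
groups = [np.arange(s, s + 5) for s in range(0, 16, 3)]
w_hat = inexact_proximal_gradient(X, y, groups, lam=0.5)

Running more inner passes tightens the prox approximation; the paper's point is that a controlled level of inexactness preserves the global convergence rate of exact proximal methods while avoiding the cost of solving each prox sub-problem exactly (e.g., by network flow).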