Accelerating Convolutional Networks via Global & Dynamic Filter Pruning
Shaohui Lin, Rongrong Ji, Yuchao Li, Yongjian Wu, Feiyue Huang, Baochang Zhang
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Main track. Pages 2425-2432.
https://doi.org/10.24963/ijcai.2018/336
Accelerating convolutional neural networks has recently received increasing research attention. Among the various approaches proposed in the literature, filter pruning has been regarded as a promising solution, owing to the significant speedup and memory reduction it yields for both the network model and the intermediate feature maps. To this end, most approaches prune filters in a layer-wise, fixed manner, which can neither dynamically recover previously removed filters nor jointly optimize the pruned network across layers. In this paper, we propose a novel global & dynamic pruning (GDP) scheme to prune redundant filters for CNN acceleration. In particular, GDP first globally prunes unsalient filters across all layers by proposing a global discriminative function based on prior knowledge of the filters. Second, it dynamically updates the filter saliency over the entire pruned sparse network and recovers mistakenly pruned filters, followed by a retraining phase to improve model accuracy. Specifically, we effectively solve the corresponding non-convex optimization problem of the proposed GDP via stochastic gradient descent with greedy alternative updating. Extensive experiments show that, compared to state-of-the-art filter pruning methods, the proposed approach achieves superior performance in accelerating several cutting-edge CNNs on the ILSVRC 2012 benchmark.
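To make the global & dynamic pruning idea concrete, the sketch below illustrates the two key mechanisms described in the abstract: ranking filters by a saliency score pooled globally across all layers (rather than layer by layer), and re-deriving the pruning masks during training so that a mistakenly pruned filter can be recovered. This is a minimal illustration in PyTorch, not the paper's implementation: the per-filter L2 norm used here is a simple stand-in for the paper's global discriminative function, and the function names (filter_saliency, global_masks, apply_masks) and the keep_ratio parameter are hypothetical.

```python
import torch
import torch.nn as nn

def filter_saliency(conv: nn.Conv2d) -> torch.Tensor:
    # Per-filter L2 norm as a stand-in saliency score; the paper instead
    # uses a global discriminative function based on prior knowledge.
    return conv.weight.detach().flatten(1).norm(dim=1)

def global_masks(model: nn.Module, keep_ratio: float = 0.7) -> dict:
    # Pool saliency scores from every conv layer and threshold them
    # GLOBALLY, so layers compete for the filter budget jointly rather
    # than being pruned with per-layer fixed ratios.
    convs = [m for m in model.modules() if isinstance(m, nn.Conv2d)]
    scores = torch.cat([filter_saliency(c) for c in convs])
    k = max(1, int(keep_ratio * scores.numel()))
    threshold = scores.topk(k).values.min()
    return {c: (filter_saliency(c) >= threshold).float() for c in convs}

def apply_masks(masks: dict) -> None:
    # Zero out pruned filters but keep their weights in the model, so
    # their saliency can still change and a filter can be "un-pruned"
    # when the masks are re-derived on a later iteration.
    for conv, mask in masks.items():
        conv.weight.data.mul_(mask.view(-1, 1, 1, 1))
```

In training, one would alternate SGD weight updates with calls to global_masks and apply_masks, loosely mirroring the greedy alternative updating the paper uses to solve its non-convex objective; because the masks are recomputed from the current weights each time, the dynamic recovery of pruned filters falls out naturally.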
Keywords:
Machine Learning: Classification
Machine Learning: Deep Learning