Feature Statistics Guided Efficient Filter Pruning
Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 2619-2625. https://doi.org/10.24963/ijcai.2020/363
Building compact convolutional neural networks (CNNs) with reliable performance is a critical but challenging task, especially when deploying them in real-world applications. As a common approach to reducing the size of CNNs, pruning methods delete some of the CNN filters according to a metric such as the ℓ1-norm. However, previous methods rarely leverage the information diversity within a single feature map or the similarity among feature maps. In this paper, we propose a novel filter pruning method that incorporates two kinds of feature map selection: diversity-aware feature selection (DFS) and similarity-aware feature selection (SFS). DFS aims to discover feature maps with low information diversity, while SFS removes feature maps that are highly similar to others. We conduct extensive empirical experiments with various CNN architectures on publicly available datasets. The experimental results demonstrate that our method obtains up to a 91.6% parameter decrease and an 83.7% FLOPs reduction with almost no accuracy loss.
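To make the two selection criteria concrete, the following is a minimal sketch (not the paper's actual algorithm) of how one might score a layer's feature maps: a diversity proxy per channel (here, the standard deviation of activations, a stand-in assumption for the paper's information-diversity measure) and a redundancy flag for channels whose flattened activations are nearly collinear with an earlier channel. The function name, the threshold, and the specific statistics are illustrative assumptions.

```python
import numpy as np

def prune_scores(feature_maps, similarity_threshold=0.9):
    """Toy channel scoring for pruning, assuming one sample's activations.

    feature_maps: array of shape (C, H, W).
    Returns:
      diversity: per-channel std of activations (low -> DFS candidate).
      redundant: boolean mask; True where a channel's cosine similarity
                 to an earlier channel exceeds the threshold (SFS candidate).
    """
    C = feature_maps.shape[0]
    flat = feature_maps.reshape(C, -1)
    # Diversity proxy: spread of activation values within each map.
    diversity = flat.std(axis=1)
    # Pairwise cosine similarity between flattened maps.
    norms = np.linalg.norm(flat, axis=1, keepdims=True) + 1e-8
    unit = flat / norms
    sim = unit @ unit.T
    redundant = np.zeros(C, dtype=bool)
    for i in range(C):
        if redundant[i]:
            continue  # keep one representative of each similar group
        for j in range(i + 1, C):
            if sim[i, j] > similarity_threshold:
                redundant[j] = True
    return diversity, redundant
```

For example, a channel that is a scaled copy of another has cosine similarity 1 and is flagged by the similarity pass, while a constant channel has zero standard deviation and ranks lowest on the diversity score.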
Machine Learning: Deep Learning: Convolutional networks
Machine Learning: Feature Selection; Learning Sparse Models
Machine Learning: Deep Learning
Data Mining: Feature Extraction, Selection and Dimensionality Reduction