Improving Deep Neural Network Sparsity through Decorrelation Regularization
Xiaotian Zhu, Wengang Zhou, Houqiang Li
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Main track. Pages 3264-3270.
https://doi.org/10.24963/ijcai.2018/453
Modern deep learning models usually suffer from high complexity in model size and computation when deployed on resource-constrained platforms. To this end, many works are dedicated to compressing deep neural networks. Adding group LASSO regularization is one of the most effective model compression methods, since it produces structured sparse networks. We investigate deep neural networks trained under the group LASSO constraint and observe that, even with strong sparsity regularization imposed, substantial correlation remains among the convolution filters, which is undesirable for a compact neural network. We propose to suppress such correlation with a new kind of constraint, called decorrelation regularization, which explicitly forces the network to learn a set of less correlated filters. Experiments on the CIFAR10/100 and ILSVRC2012 datasets show that when our decorrelation regularization is combined with group LASSO, the correlation between filters is effectively weakened, which increases the sparsity of the resulting model and leads to better compression performance.
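To illustrate the idea described in the abstract, the following is a minimal sketch, not the authors' exact formulation: a group LASSO term that sums per-filter L2 norms to encourage structured sparsity, combined with a decorrelation penalty that discourages pairwise correlation between flattened convolution filters. The penalty form (squared off-diagonal entries of a cosine-similarity Gram matrix), the regularization weights lambda_gl and lambda_dc, and the helper names are assumptions for illustration only; the paper's precise definition may differ.

```python
import torch

def decorrelation_penalty(weight: torch.Tensor) -> torch.Tensor:
    """Penalize pairwise correlation between filters of one conv layer.

    weight: tensor of shape (out_channels, in_channels, kH, kW).
    """
    n = weight.size(0)
    flat = weight.view(n, -1)
    # Normalize each filter so the Gram matrix holds cosine similarities.
    flat = flat / (flat.norm(dim=1, keepdim=True) + 1e-8)
    gram = flat @ flat.t()
    off_diag = gram - torch.eye(n, device=weight.device)
    # Sum of squared off-diagonal similarities (zero when filters are orthogonal).
    return (off_diag ** 2).sum()

def group_lasso_penalty(weight: torch.Tensor) -> torch.Tensor:
    """Group LASSO over filters: sum of per-filter L2 norms (structured sparsity)."""
    n = weight.size(0)
    return weight.view(n, -1).norm(dim=1).sum()

def regularized_loss(task_loss, conv_weights, lambda_gl=1e-4, lambda_dc=1e-4):
    """Add both penalties, summed over all convolution layers, to the task loss."""
    reg = sum(lambda_gl * group_lasso_penalty(w) + lambda_dc * decorrelation_penalty(w)
              for w in conv_weights)
    return task_loss + reg
```

In a typical training loop, `conv_weights` would be the weight tensors of all convolution layers, and `regularized_loss` would replace the plain task loss before calling `backward()`.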
Keywords:
Machine Learning: Neural Networks
Machine Learning: Feature Selection; Learning Sparse Models
Machine Learning: Deep Learning