Neurons Merging Layer: Towards Progressive Redundancy Reduction for Deep Supervised Hashing

Chaoyou Fu, Liangchen Song, Xiang Wu, Guoli Wang, Ran He

Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
Main track. Pages 2322-2328. https://doi.org/10.24963/ijcai.2019/322

Deep supervised hashing has become an active topic in information retrieval. It generates hashing bits from the output neurons of a deep hashing network. During binary discretization, there often exists considerable redundancy between hashing bits, which degrades retrieval performance in terms of both storage and accuracy. This paper proposes a simple yet effective Neurons Merging Layer (NMLayer) for deep supervised hashing. A graph is constructed to represent the redundancy relationship between hashing bits and is then used to guide the learning of the hashing network. Specifically, the graph is dynamically learned through a novel mechanism defined by our active and frozen phases. According to the learned relationship, the NMLayer merges redundant neurons together to balance the importance of each output neuron. Moreover, multiple NMLayers are progressively trained so that the deep hashing network learns a more compact hashing code from a long redundant one. Extensive experiments on four datasets demonstrate that our proposed method outperforms state-of-the-art hashing methods.
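To make the merging idea concrete, below is a minimal PyTorch sketch of one redundancy-reduction step. It is not the paper's method: the paper learns the redundancy graph jointly with the network via its active and frozen phases, whereas this sketch substitutes a simple stand-in measure (absolute correlation between pre-binarization activations) to pick the most redundant pair of bits, then merges the corresponding output neurons of a linear hash layer by averaging their weights, shrinking the code by one bit. All function names here are hypothetical.

    import torch

    def most_redundant_pair(codes: torch.Tensor) -> tuple[int, int]:
        # codes: (batch, num_bits) pre-binarization activations.
        # Correlation is a stand-in redundancy measure; the paper
        # instead learns the redundancy graph during training.
        centered = codes - codes.mean(dim=0, keepdim=True)
        cov = centered.t() @ centered
        std = centered.norm(dim=0).clamp_min(1e-8)
        corr = cov / (std.unsqueeze(0) * std.unsqueeze(1))
        corr.fill_diagonal_(0.0)  # ignore self-correlation
        flat = torch.argmax(corr.abs()).item()
        return divmod(flat, corr.size(1))  # (row, col) of max entry

    def merge_neurons(hash_layer: torch.nn.Linear, i: int, j: int) -> torch.nn.Linear:
        # Merge output neurons i and j by averaging their weights and
        # biases, then drop neuron j, reducing the code length by one.
        w, b = hash_layer.weight.data, hash_layer.bias.data
        w[i] = 0.5 * (w[i] + w[j])
        b[i] = 0.5 * (b[i] + b[j])
        keep = [k for k in range(w.size(0)) if k != j]
        merged = torch.nn.Linear(hash_layer.in_features, len(keep))
        merged.weight.data = w[keep].clone()
        merged.bias.data = b[keep].clone()
        return merged

Applying such a step repeatedly, with retraining in between, mirrors the progressive use of multiple NMLayers described above: a long redundant code is gradually compressed into a shorter one while the surviving neurons absorb the merged bits' contribution.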
Keywords:
Machine Learning: Deep Learning
Multidisciplinary Topics and Applications: Information Retrieval
Computer Vision: Computer Vision