Learning to Hash Naturally Sorts
Jiaguo Yu, Yuming Shen, Menghan Wang, Haofeng Zhang, Philip H.S. Torr
Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence
Main Track. Pages 1587-1593.
https://doi.org/10.24963/ijcai.2022/221
Learning to hash is, at heart, a list-wise sorting problem. Its evaluation metrics, e.g., mean average precision, are computed over a candidate list sorted by pair-wise code similarity. However, deep hashing models are rarely trained end-to-end on these sorted results because the sorting operation is non-differentiable. This inconsistency between the training and test objectives may lead to sub-optimal performance, since the training loss often fails to reflect the actual retrieval metric. In this paper, we tackle this problem by introducing Naturally-Sorted Hashing (NSH). We sort the Hamming distances of samples' hash codes and accordingly gather their latent representations for self-supervised training. Thanks to recent advances in differentiable sorting approximations, the hash head receives gradients from the sorter, so the hash encoder can be optimized along with the training procedure. Additionally, we describe a novel Sorted Noise-Contrastive Estimation (SortedNCE) loss that selectively picks positive and negative samples for contrastive learning, which allows NSH to mine semantic relations among data during training in an unsupervised manner. Our extensive experiments show that the proposed NSH model significantly outperforms existing unsupervised hashing methods on three benchmark datasets.
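The key enabler mentioned above is a differentiable sorting approximation that lets gradients flow from the ranked list back into the hash encoder. As a minimal sketch of this idea, the snippet below uses a NeuralSort-style softmax relaxation of the sorting permutation (one common differentiable sorter; the abstract does not specify which approximation NSH uses, so this choice, and all function names here, are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def soft_sort(s, tau=0.1):
    """Continuous relaxation of the sorting permutation
    (NeuralSort-style; an *assumed* choice of differentiable sorter).
    Returns a row-stochastic matrix P whose i-th row softly selects
    the i-th largest entry of s; as tau -> 0, P approaches the hard
    descending-sort permutation matrix."""
    n = s.shape[0]
    A = np.abs(s[:, None] - s[None, :])            # pairwise |s_j - s_k|
    B = A.sum(axis=1)                              # row sums of A
    scaling = n + 1 - 2 * np.arange(1, n + 1)      # (n + 1 - 2i), i = 1..n
    logits = (scaling[:, None] * s[None, :] - B[None, :]) / tau
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    P = np.exp(logits)
    return P / P.sum(axis=1, keepdims=True)

# Toy scores: in NSH one would sort *negative* Hamming distances so
# that nearer candidates rank first; the soft permutation then gathers
# their latent representations differentiably (e.g., P @ latents).
s = np.array([0.3, 2.0, -1.0, 0.9])
P = soft_sort(s, tau=1e-3)
order = P.argmax(axis=1)   # at small tau: descending argsort of s
```

Because every operation above is a smooth function of `s`, a loss defined on the softly sorted list (such as a SortedNCE-style objective treating the top-ranked candidates as positives) can back-propagate through `P` into the scores, and hence into the hash encoder.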
Keywords:
Computer Vision: Image and Video retrieval
Computer Vision: Representation Learning
Computer Vision: Transfer, low-shot, semi- and un-supervised learning