SitNet: Discrete Similarity Transfer Network for Zero-shot Hashing

Yuchen Guo, Guiguang Ding, Jungong Han, Yue Gao

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 1767-1773. https://doi.org/10.24963/ijcai.2017/245

Hashing has been widely utilized for fast image retrieval in recent years. With semantic information as supervision, hashing approaches perform much better, especially when combined with deep convolutional neural networks (CNNs). However, in practice, new concepts emerge every day, which makes it infeasible to collect supervised information and re-train the hashing model for them. In this paper, we propose a novel zero-shot hashing approach, called Discrete Similarity Transfer Network (SitNet), to preserve the semantic similarity between images from both "seen" concepts and new "unseen" concepts. Motivated by zero-shot learning, semantic vectors of concepts are adopted to capture the similarity structure among classes, so that a model trained on seen concepts generalizes well to unseen ones, benefiting from the transferability of the semantic vector space. We adopt a multi-task architecture to exploit the supervised information of seen concepts and the semantic vectors simultaneously. Moreover, a discrete hashing layer is integrated into the network for hash code generation, avoiding the information loss caused by real-value relaxation during training, which is a critical problem in existing works. Experiments on three benchmarks validate the superiority of SitNet over state-of-the-art methods.
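The multi-task design with a discrete hashing layer described in the abstract can be illustrated with a minimal PyTorch sketch. This is not the authors' released code: the class names (SemanticHashNet, SignSTE), the straight-through sign estimator, and the layer sizes are assumptions made purely for illustration of the general idea of binarized codes feeding a classification head (seen concepts) and a semantic-vector regression head.

```python
# Minimal sketch (assumed names and sizes; not the authors' implementation).
import torch
import torch.nn as nn

class SignSTE(torch.autograd.Function):
    """Binarize with sign() in the forward pass; pass gradients straight through
    in the backward pass so the discrete layer stays trainable."""
    @staticmethod
    def forward(ctx, x):
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output

class SemanticHashNet(nn.Module):
    def __init__(self, backbone, feat_dim, code_len, num_seen_classes, sem_dim):
        super().__init__()
        self.backbone = backbone                              # any CNN feature extractor
        self.hash_layer = nn.Linear(feat_dim, code_len)       # projects features to code logits
        self.cls_head = nn.Linear(code_len, num_seen_classes) # supervised task on seen concepts
        self.sem_head = nn.Linear(code_len, sem_dim)          # regression onto semantic vectors

    def forward(self, x):
        feat = self.backbone(x)
        codes = SignSTE.apply(self.hash_layer(feat))          # discrete {-1, +1} hash codes
        return codes, self.cls_head(codes), self.sem_head(codes)
```

In such a setup, a classification loss on seen labels and a regression loss toward class semantic vectors would be combined during training, so the binary codes remain predictive of the semantic space and can generalize to unseen concepts at retrieval time.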
Keywords:
Machine Learning: Data Mining
Machine Learning: Feature Selection/Construction
Machine Learning: Deep Learning