One-Shot Texture Retrieval with Global Context Metric

Kai Zhu, Wei Zhai, Zheng-Jun Zha, Yang Cao

Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
Main track. Pages 4461-4467. https://doi.org/10.24963/ijcai.2019/620

In this paper, we tackle one-shot texture retrieval: given an example of a new reference texture, detect and segment all pixels of the same texture category within an arbitrary image. To address this problem, we present an OS-TR network that encodes both the reference patch and the query image, yielding texture segmentation with respect to the reference category. Unlike existing texture encoding methods that integrate a CNN with orderless pooling, we propose a directionality-aware network that captures texture variations in each direction, resulting in a spatially invariant representation. To segment new categories given only a few examples, we incorporate a self-gating mechanism into a relation network, exploiting global context information to adjust the per-channel modulation weights of local relation features. Extensive experiments on benchmark texture datasets and real scenarios demonstrate that our proposed method achieves above-par segmentation performance and robust generalization across domains.
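The abstract describes the self-gating mechanism only at a high level. The following is a minimal sketch of one plausible reading, assuming a squeeze-and-excitation-style design in which globally pooled context produces per-channel modulation weights for the local relation features; the class name, bottleneck structure, and reduction factor are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a self-gating global-context block (PyTorch), assuming a
# squeeze-and-excitation-style mechanism: global average pooling summarizes
# the relation features, and a small bottleneck MLP outputs per-channel
# modulation weights. Hypothetical names; not the paper's code.
import torch
import torch.nn as nn


class GlobalContextGate(nn.Module):
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # squeeze: global context vector
        self.fc = nn.Sequential(             # excitation: per-channel gates
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, relation_feats: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = relation_feats.shape
        context = self.pool(relation_feats).view(b, c)   # (B, C) global summary
        weights = self.fc(context).view(b, c, 1, 1)      # gates in (0, 1)
        return relation_feats * weights                  # re-weighted local relation features


# Usage: modulate local relation features (e.g., fused reference/query encodings).
feats = torch.randn(2, 256, 32, 32)
gated = GlobalContextGate(256)(feats)
```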
Keywords:
Machine Learning: Deep Learning
Computer Vision: Perception
Computer Vision: Computer Vision