Disentangled Face Attribute Editing via Instance-Aware Latent Space Search
Yuxuan Han, Jiaolong Yang, Ying Fu
Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence
Main Track. Pages 715-721.
https://doi.org/10.24963/ijcai.2021/99
Recent works have shown that a rich set of semantic directions exists in the latent space of Generative Adversarial Networks (GANs), enabling various facial attribute editing applications. However, existing methods may suffer from poor disentanglement of attribute variations, leading to unwanted changes of other attributes when altering the desired one. The semantic directions used by existing methods are defined at the attribute level, which makes it difficult to model complex attribute correlations, especially in the presence of attribute distribution bias in the GAN's training set. In this paper, we propose a novel framework (IALS) that performs Instance-Aware Latent-Space Search to find semantic directions for disentangled attribute editing. Instance information is injected by leveraging supervision from a set of attribute classifiers evaluated on the input images. We further propose a Disentanglement-Transformation (DT) metric to quantify attribute transformation and disentanglement efficacy, and use it to find the optimal control factor between attribute-level and instance-specific directions. Experimental results on both GAN-generated and real-world images collectively show that our method outperforms recent state-of-the-art methods by a wide margin. Code is available at https://github.com/yxuhan/IALS.
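To make the idea of blending attribute-level and instance-specific directions concrete, the sketch below illustrates one plausible realization: the instance-specific direction is taken as the normalized gradient of an attribute classifier's score with respect to the image's latent code, and is interpolated with a global attribute-level direction by a control factor. This is a minimal illustration under assumed interfaces (a differentiable generator `generator`, an attribute classifier `classifier`, and names such as `attribute_level_dir` and `control_factor` are illustrative), not the authors' exact implementation; see the official repository for the real API.

```python
import torch

def instance_aware_direction(latent, generator, classifier,
                             attribute_level_dir, control_factor=0.5):
    """Blend a global attribute-level direction with an instance-specific one.

    The instance-specific direction is the normalized gradient of the target
    attribute classifier's score with respect to this image's latent code;
    the final editing direction interpolates between the global and
    instance-specific directions. Names and signatures are illustrative.
    """
    latent = latent.clone().detach().requires_grad_(True)
    image = generator(latent)                      # synthesize image from latent code
    score = classifier(image).sum()                # target-attribute score for this instance
    grad, = torch.autograd.grad(score, latent)     # instance-specific signal
    instance_dir = grad / (grad.norm() + 1e-8)
    global_dir = attribute_level_dir / (attribute_level_dir.norm() + 1e-8)
    direction = control_factor * global_dir + (1.0 - control_factor) * instance_dir
    return direction / (direction.norm() + 1e-8)

# usage sketch: move the latent code along the blended direction
# edited_latent = latent + step_size * instance_aware_direction(
#     latent, generator, classifier, attribute_level_dir, control_factor=0.5)
```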
Keywords:
Computer Vision: 2D and 3D Computer Vision
Machine Learning: Explainable/Interpretable Machine Learning