Text Style Transfer via Learning Style Instance Supported Latent Space

Xiaoyuan Yi, Zhenghao Liu, Wenhao Li, Maosong Sun

Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 3801-3807. https://doi.org/10.24963/ijcai.2020/526

Text style transfer aims to alter the style of a sentence while keeping its main content unchanged. Due to the lack of parallel corpora, most recent work focuses on unsupervised methods and has achieved noticeable progress. Nonetheless, the intractability of completely disentangling content from style in text leads to a trade-off between content preservation and style transfer accuracy. To address this problem, we propose a style instance supported method, StyIns. Instead of representing styles with embeddings or latent variables learned from single sentences, our model leverages the generative flow technique to extract underlying stylistic properties from multiple instances of each style, which form a more discriminative and expressive latent style space. By combining such a space with an attention-based structure, our model can better preserve content while simultaneously achieving high transfer accuracy. Furthermore, the proposed method can be flexibly extended to semi-supervised learning to exploit the limited paired data that is available. Experiments on three transfer tasks, sentiment modification, formality rephrasing, and poeticness generation, show that StyIns obtains a better balance between content and style, outperforming several recent baselines.
Keywords:
Natural Language Processing: Natural Language Generation
Natural Language Processing: Natural Language Processing
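To illustrate the core idea behind the abstract, a minimal sketch of how a style representation might be built from multiple style instances rather than a single sentence. This is not the authors' implementation: the toy "encoder", the Gaussian aggregation, and the single affine flow step are all illustrative assumptions standing in for the paper's learned networks and full generative flow.

```python
import numpy as np

rng = np.random.default_rng(0)

def style_posterior(instances):
    # Aggregate K style-instance vectors into Gaussian parameters (mu, sigma).
    # The intuition (an assumption of this sketch) is that statistics pooled
    # over many instances of one style are more discriminative than a latent
    # variable inferred from a single sentence.
    mu = instances.mean(axis=0)
    sigma = instances.std(axis=0) + 1e-6  # avoid zero variance
    return mu, sigma

def affine_flow_step(z, scale, shift):
    # One invertible affine transform z' = z * exp(scale) + shift.
    # Stacking such steps (with learned parameters) is the basic recipe of
    # generative-flow models, which the paper uses to shape the style space.
    return z * np.exp(scale) + shift

K, d = 8, 4                                  # K style instances, d-dim features
instances = rng.normal(loc=1.0, scale=0.5, size=(K, d))
mu, sigma = style_posterior(instances)
z0 = mu + sigma * rng.normal(size=d)         # reparameterized sample
z = affine_flow_step(z0, scale=np.zeros(d), shift=np.zeros(d))  # identity step
```

In the paper's setting, `z` would then condition an attention-based decoder, steering generation toward the target style while the source sentence supplies the content.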