A Survey on Low-Resource Neural Machine Translation

Rui Wang, Xu Tan, Renqian Luo, Tao Qin, Tie-Yan Liu

Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence
Survey Track. Pages 4636-4643. https://doi.org/10.24963/ijcai.2021/629

Neural approaches have achieved state-of-the-art accuracy on machine translation but suffer from the high cost of collecting large-scale parallel data. Consequently, much research has been conducted on neural machine translation (NMT) with very limited parallel data, i.e., the low-resource setting. In this paper, we provide a survey of low-resource NMT and classify related work into three categories according to the auxiliary data used: (1) exploiting monolingual data of the source and/or target languages, (2) exploiting data from auxiliary languages, and (3) exploiting multi-modal data. We hope this survey helps researchers better understand the field and inspires them to design better algorithms, and helps industry practitioners choose appropriate algorithms for their applications.
Keywords:
Natural language processing: General
Machine learning: General