Maximum Expected Likelihood Estimation for Zero-resource Neural Machine Translation

Hao Zheng, Yong Cheng, Yang Liu

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 4251-4257. https://doi.org/10.24963/ijcai.2017/594

While neural machine translation (NMT) has recently made remarkable progress on a handful of high-resource language pairs, parallel corpora are unavailable for many other pairs, which we refer to as zero-resource language pairs. To address this problem, we propose an approach to zero-resource NMT via maximum expected likelihood estimation. The basic idea is to train the intended source-to-target model on a pivot-target parallel corpus by maximizing an expectation taken with respect to a pivot-to-source translation model. Because this expectation is intractable to compute exactly, we propose two methods for connecting the pivot-to-source and source-to-target models to approximate it. Experiments on two zero-resource language pairs show that the proposed approach yields substantial gains over baseline methods. We further observe that, when trained jointly with the source-to-target model, the pivot-to-source translation model also improves over independent training.
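The training objective described above can be sketched in notation. This is an inference from the abstract alone, not the paper's exact formulation: let z denote a pivot sentence, x a source sentence, y a target sentence, and D_{z,y} the pivot-target parallel corpus.

```latex
% Expected-likelihood objective for zero-resource NMT (sketch inferred from
% the abstract). \theta_{z \to x}: pivot-to-source model (assumed pre-trained
% or jointly trained); \theta_{x \to y}: the intended source-to-target model.
J(\theta_{x \to y}) =
  \sum_{(z, y) \in D_{z,y}}
  \mathbb{E}_{x \sim P(x \mid z;\, \theta_{z \to x})}
  \bigl[ \log P(y \mid x;\, \theta_{x \to y}) \bigr]

% The expectation ranges over all possible source sentences x and is therefore
% intractable; one standard way to approximate it (an illustrative choice, not
% necessarily one of the paper's two methods) is Monte Carlo sampling from the
% pivot-to-source model:
\mathbb{E}_{x \sim P(x \mid z)} \bigl[ \log P(y \mid x) \bigr]
  \approx \frac{1}{K} \sum_{k=1}^{K} \log P\bigl(y \mid x^{(k)}\bigr),
  \qquad x^{(k)} \sim P(x \mid z;\, \theta_{z \to x})
```

Under this reading, the source-to-target model is trained without any source-target parallel data: source sentences exist only as latent hypotheses drawn from the pivot-to-source model.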
Keywords:
Natural Language Processing: Machine Translation
Natural Language Processing: Natural Language Processing