Sequence Generation: From Both Sides to the Middle

Long Zhou, Jiajun Zhang, Chengqing Zong, Heng Yu

Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
Main track. Pages 5471-5477. https://doi.org/10.24963/ijcai.2019/760

The encoder-decoder framework has achieved promising progress on many sequence generation tasks, such as neural machine translation and text summarization. Such a framework usually generates a sequence token by token from left to right, so (1) the autoregressive decoding procedure becomes time-consuming as the output sentence grows longer, and (2) it lacks the guidance of future context, which is crucial for avoiding under-translation. To alleviate these issues, we propose a synchronous bidirectional sequence generation (SBSG) model that predicts its outputs from both sides to the middle simultaneously. In the SBSG model, the left-to-right (L2R) and right-to-left (R2L) generation processes help and interact with each other through an interactive bidirectional attention network. Experiments on neural machine translation (En-De, Ch-En, and En-Ro) and text summarization show that the proposed model significantly speeds up decoding while improving generation quality compared to the autoregressive Transformer.
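The core decoding idea can be illustrated with a toy sketch: the L2R and R2L hypotheses each emit one token per step and meet in the middle, roughly halving the number of sequential decoding steps. This is only a minimal illustration of the scheduling, not the paper's model; `predict_next` is a hypothetical stand-in for the Transformer's interactive bidirectional attention, and here it simply copies tokens from the source so the example is runnable.

```python
def predict_next(prefix, suffix, direction, source):
    # Hypothetical stand-in for the model's prediction step: a real SBSG
    # decoder would attend over the source and over BOTH partial
    # hypotheses. Here we just copy from the source to stay runnable.
    if direction == "l2r":
        return source[len(prefix)]
    else:  # "r2l"
        return source[len(source) - 1 - len(suffix)]


def sbsg_decode(source):
    """Generate the output from both ends toward the middle.

    At each sequential step, the left-to-right and right-to-left
    hypotheses each emit one token "simultaneously", so the number of
    sequential steps is about half that of pure L2R decoding.
    """
    prefix, suffix = [], []
    while len(prefix) + len(suffix) < len(source):
        prefix.append(predict_next(prefix, suffix, "l2r", source))
        if len(prefix) + len(suffix) == len(source):
            break  # the two halves have met in the middle
        suffix.insert(0, predict_next(prefix, suffix, "r2l", source))
    return prefix + suffix
```

For a 5-token output, this loop runs 3 sequential steps instead of 5, which is the source of the decoding speed-up; the quality gain comes from each direction conditioning on the other's partial output, which this toy stand-in does not model.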
Keywords:
Natural Language Processing: Natural Language Generation
Natural Language Processing: Machine Translation
Natural Language Processing: Natural Language Summarization