Infobox-to-text Generation with Tree-like Planning based Attention Network

Yang Bai, Ziran Li, Ning Ding, Ying Shen, Hai-Tao Zheng

Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 3773-3779. https://doi.org/10.24963/ijcai.2020/522

We study the problem of infobox-to-text generation, which aims to generate a textual description from a key-value table. Previous neural methods represent the input infobox as a sequence and generate end-to-end without order-planning, so they suffer from incoherence and poor adaptability to disordered input. Recent planning-based models perform only static order-planning to guide the generation, which can cause error propagation between the planning and generation stages. To address these issues, we propose a Tree-like PLanning based Attention Network (Tree-PLAN) that leverages both static order-planning and dynamic tuning to guide the generation. A novel tree-like tuning encoder dynamically tunes the static order-plan for better planning by merging the most relevant attributes together layer by layer. Experiments conducted on two datasets show that our model outperforms previous methods on both automatic and human evaluation, and that it adapts better to disordered input.
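The "layer by layer" merging described above can be illustrated with a small sketch. This is not the paper's neural encoder: the relevance score (cosine similarity), the fusion operation (element-wise mean), and all function names here are hypothetical stand-ins that only show the bottom-up, tree-like control flow of repeatedly merging the most relevant pair of attribute representations until a single root remains.

```python
# Hypothetical sketch of tree-like bottom-up merging of attribute vectors.
# The real Tree-PLAN encoder learns the relevance and fusion functions;
# here cosine similarity and averaging are illustrative placeholders.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return sum(a * a for a in u) ** 0.5

def cosine(u, v):
    denom = norm(u) * norm(v)
    return dot(u, v) / denom if denom else 0.0

def merge(u, v):
    # Placeholder fusion: element-wise mean of the two attribute vectors.
    return [(a + b) / 2 for a, b in zip(u, v)]

def tree_merge(vectors):
    """Repeatedly fuse the most relevant pair until one root vector remains."""
    nodes = [list(v) for v in vectors]
    while len(nodes) > 1:
        # Find the most similar pair across the current layer.
        i, j = max(
            ((a, b) for a in range(len(nodes)) for b in range(a + 1, len(nodes))),
            key=lambda ab: cosine(nodes[ab[0]], nodes[ab[1]]),
        )
        fused = merge(nodes[i], nodes[j])
        nodes = [n for k, n in enumerate(nodes) if k not in (i, j)] + [fused]
    return nodes[0]

# Example: three toy attribute embeddings; the two nearly parallel ones
# are merged first, then the result is merged with the remaining vector.
root = tree_merge([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]])
```

In this toy run, `[1.0, 0.0]` and `[0.9, 0.1]` are fused in the first layer, and the result is fused with `[0.0, 1.0]` in the second, mirroring how the tuning encoder groups the most relevant attributes before higher-level merging.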
Keywords:
Natural Language Processing: Natural Language Generation
Natural Language Processing: Natural Language Processing