Better AMR-To-Text Generation with Graph Structure Reconstruction
Tianming Wang, Xiaojun Wan, Shaowei Yao
Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 3919-3925.
https://doi.org/10.24963/ijcai.2020/542
AMR-to-text generation is the challenging task of generating texts from graph-based semantic representations. Recent studies formalize this task as a graph-to-sequence learning problem and use various graph neural networks to model the graph structure. In this paper, we propose a novel approach that generates texts from AMR graphs while reconstructing the input graph structures. Our model employs a graph attention mechanism to aggregate information for encoding the inputs. Moreover, better node representations are learned by optimizing two simple but effective auxiliary reconstruction objectives: a link prediction objective, which requires predicting the semantic relation between nodes, and a distance prediction objective, which requires predicting the distance between nodes. Experimental results on two benchmark datasets show that our proposed model improves considerably over strong baselines and achieves new state-of-the-art results.
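To make the two auxiliary objectives concrete, the following is a minimal PyTorch-style sketch (not the authors' code) of how relation labels and pairwise graph distances could be predicted from encoder node representations and turned into auxiliary losses; the module name, the bilinear scorers, and parameters such as `num_relations` and `max_distance` are illustrative assumptions.

```python
import torch
import torch.nn as nn


class GraphReconstructionLosses(nn.Module):
    """Illustrative sketch of link-prediction and distance-prediction auxiliary losses."""

    def __init__(self, hidden_dim: int, num_relations: int, max_distance: int):
        super().__init__()
        # Scores a relation label for each (source, target) node pair.
        self.link_scorer = nn.Bilinear(hidden_dim, hidden_dim, num_relations)
        # Scores a bucketed graph distance for each node pair.
        self.dist_scorer = nn.Bilinear(hidden_dim, hidden_dim, max_distance + 1)
        self.ce = nn.CrossEntropyLoss()

    def forward(self, node_repr, pair_src, pair_tgt, relation_labels, distance_labels):
        # node_repr: (num_nodes, hidden_dim) node representations from the graph encoder
        # pair_src, pair_tgt: (num_pairs,) indices of the node pairs to score
        src = node_repr[pair_src]
        tgt = node_repr[pair_tgt]
        link_loss = self.ce(self.link_scorer(src, tgt), relation_labels)
        dist_loss = self.ce(self.dist_scorer(src, tgt), distance_labels)
        # During training these auxiliary losses would be added to the generation loss.
        return link_loss, dist_loss
```

In this sketch, both objectives are framed as classification over node pairs, so the same encoder representations are reused for generation and for reconstructing the input graph structure.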
Keywords:
Natural Language Processing: Natural Language Generation
Natural Language Processing: Natural Language Processing