Dialogue State Induction Using Neural Latent Variable Models
Qingkai Min, Libo Qin, Zhiyang Teng, Xiao Liu, Yue Zhang

Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 3845-3852. https://doi.org/10.24963/ijcai.2020/532

A dialogue state module is a useful component of a task-oriented dialogue system. Traditional methods obtain dialogue states by manually labeling training corpora, on which neural models are then trained. However, the labeling process is costly, slow, and error-prone, and, more importantly, cannot cover the vast range of domains found in real-world customer service dialogues. We propose the task of dialogue state induction, building two neural latent variable models that mine dialogue states automatically from unlabeled customer service dialogue records. Results show that the models can effectively find meaningful dialogue states. In addition, equipped with induced dialogue states, a state-of-the-art dialogue system achieves better performance than a counterpart without a dialogue state module.
Keywords:
Natural Language Processing: Dialogue