Dialog-to-Action: Conversational Question Answering Over A Large-Scale Knowledge Base

Authors:
Daya Guo, Sun Yat-Sen University
Duyu Tang, Microsoft Research
Nan Duan, Microsoft Research
Ming Zhou, Microsoft Research
Jian Yin, Sun Yat-Sen University

Introduction:

The authors present an approach that maps utterances in a conversation to logical forms, which are then executed against a large-scale knowledge base.

Abstract:

We present an approach to map utterances in conversation to logical forms, which are executed on a large-scale knowledge base. To handle the pervasive ellipsis phenomena in conversation, we introduce dialog memory management to manipulate historical entities, predicates, and logical forms when inferring the logical form of the current utterance. Dialog memory management is embodied in a generative model, in which a logical form is interpreted in a top-down manner following a small and flexible grammar. We learn the model from denotations, without explicit annotation of logical forms, and evaluate it on a large-scale dataset consisting of 200K dialogs over 12.8M entities. Results verify the benefits of modeling dialog memory, and show that our semantic parsing-based approach outperforms a memory network based encoder-decoder model by a large margin.
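To make the idea of dialog memory management concrete, the sketch below shows one way such a memory could be structured: a store of entities, predicates, and past logical-form fragments from earlier turns, which a grammar-guided decoder could copy from when the current utterance is elliptical (e.g. "And his wife?"). This is a minimal illustration under assumed names, not the authors' implementation.

```python
from dataclasses import dataclass, field

@dataclass
class DialogMemory:
    """Illustrative dialog memory: records entities, predicates, and
    logical-form fragments mentioned in previous turns (all names here
    are hypothetical, not from the paper's code)."""
    entities: list = field(default_factory=list)
    predicates: list = field(default_factory=list)
    subsequences: list = field(default_factory=list)  # past logical-form fragments

    def update(self, entities, predicates, logical_form):
        # After each turn, store what was mentioned so that later,
        # elliptical utterances can refer back to it.
        self.entities.extend(e for e in entities if e not in self.entities)
        self.predicates.extend(p for p in predicates if p not in self.predicates)
        self.subsequences.append(logical_form)

    def copy_candidates(self):
        # Items a top-down generative decoder could choose to copy
        # instead of instantiating fresh from the current utterance.
        return {
            "entities": list(self.entities),
            "predicates": list(self.predicates),
            "subsequences": list(self.subsequences),
        }

memory = DialogMemory()
memory.update(["United States"], ["president_of"],
              "find(United States, president_of)")
candidates = memory.copy_candidates()
```

A follow-up turn such as "Where was he born?" would then resolve "he" by copying the entity produced in the previous turn from `candidates` rather than re-deriving it from scratch.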
