
Dialog Response Generation Using Adversarially Learned Latent Bag-of-Words


Date

2020-08-28

Authors

Khan, Kashif

Publisher

University of Waterloo

Abstract

Dialog response generation is the task of generating a response utterance given a query utterance. Beyond producing relevant and coherent responses, a dialog generation model should also produce diverse and informative sentences. In this work, we propose and explore a novel multi-stage dialog response generation approach. In the first stage, we construct a variational latent space over the bag-of-words representations of the query and response utterances. In the second stage, the transformation from the query latent code to the response latent code is learned through an adversarial process. The final stage fine-tunes a pretrained transformer-based model, the Text-to-Text Transfer Transformer (T5) (Raffel et al., 2019), using a novel training regimen that generates the response utterance by conditioning on the query utterance and the response words learned in the previous stage. We evaluate our approach on two popular dialog datasets. It outperforms the baseline transformer model on multiple quantitative metrics, including an overlap metric (BLEU), diversity metrics (Distinct-1 and Distinct-2), and a fluency metric (perplexity).
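The sketch below illustrates the kind of components the abstract describes: a variational autoencoder over bag-of-words vectors (stage 1) and an adversarial mapping from query latents to response latents (stage 2). It is not the author's implementation; the vocabulary size, latent dimension, and layer widths are arbitrary placeholders chosen for illustration.

```python
# Minimal illustrative sketch (assumptions marked below), not the thesis code.
import torch
import torch.nn as nn

VOCAB, LATENT = 10000, 64  # assumed sizes, for illustration only


class BoWVAE(nn.Module):
    """Stage 1: variational latent space over bag-of-words vectors."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(VOCAB, 256), nn.ReLU())
        self.mu = nn.Linear(256, LATENT)
        self.logvar = nn.Linear(256, LATENT)
        self.dec = nn.Linear(LATENT, VOCAB)  # logits over the vocabulary

    def forward(self, bow):
        h = self.enc(bow)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization
        return self.dec(z), mu, logvar


class LatentMapper(nn.Module):
    """Stage 2 generator: maps a query latent code to a response latent code."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(LATENT, 128), nn.ReLU(),
                                 nn.Linear(128, LATENT))

    def forward(self, z_query):
        return self.net(z_query)


class Critic(nn.Module):
    """Stage 2 discriminator: scores mapped latents against real response latents."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(LATENT, 128), nn.ReLU(),
                                 nn.Linear(128, 1))

    def forward(self, z):
        return self.net(z)


# Stage 3 (sketch only): decode the mapped latent to a set of likely response
# words and condition a pretrained seq2seq model such as T5 on both the query
# and those words, e.g. by forming an input string like
#   "query: <query text> words: <w1 w2 ... wk>"
# before fine-tuning on the reference responses.
```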

Keywords

natural language generation, dialog response generation
