Neural Text Generation from Structured and Unstructured Data


Date

2019-08-28

Authors

Shahidi, Hamidreza

Advisor

Li, Ming
Lin, Jimmy

Publisher

University of Waterloo

Abstract

A number of researchers have recently questioned the necessity of increasingly complex neural network (NN) architectures. In particular, several recent papers have shown that simpler, properly tuned models are at least competitive across several natural language processing tasks. In this thesis, we show that this is also the case for text generation from structured and unstructured data. Specifically, we consider neural table-to-text generation and neural question generation (NQG) as representative tasks for text generation from structured and unstructured data, respectively. Table-to-text generation aims to produce a description of a given table, while NQG is the task of generating, from a given passage, a question that can be answered by a specific sub-span of that passage. Experiments demonstrate that a basic attention-based sequence-to-sequence model, trained with the exponential moving average (EMA) technique, achieves state-of-the-art results on both tasks. We further investigate the use of reinforcement learning with different reward functions to refine our pre-trained model for both tasks.
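
To make the role of the exponential moving average (EMA) concrete, the sketch below shows one common way EMA is applied to model parameters during training. The class name, decay value, and PyTorch-style parameter interface are illustrative assumptions rather than details taken from the thesis.

```python
# Minimal sketch of maintaining an exponential moving average (EMA) of model
# weights; decay value and interface are assumptions for illustration only.
class ExponentialMovingAverage:
    def __init__(self, model, decay=0.9999):
        self.decay = decay
        # Keep a shadow copy of every trainable parameter.
        self.shadow = {name: p.detach().clone()
                       for name, p in model.named_parameters() if p.requires_grad}

    def update(self, model):
        # After each optimizer step, blend the current weights into the shadow copy:
        # shadow = decay * shadow + (1 - decay) * current
        for name, p in model.named_parameters():
            if p.requires_grad:
                self.shadow[name].mul_(self.decay).add_(p.detach(), alpha=1 - self.decay)

    def copy_to(self, model):
        # At evaluation time, load the averaged weights into the model.
        for name, p in model.named_parameters():
            if p.requires_grad:
                p.data.copy_(self.shadow[name])

# Typical usage (assuming a PyTorch model and a standard training loop):
#   ema = ExponentialMovingAverage(model)
#   ... call ema.update(model) after each optimizer.step()
#   ... call ema.copy_to(model) before evaluation
```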

Keywords

deep learning, reinforcement learning, natural language processing, text generation
