
Future Sight: Dynamic Story Generation with Large Pretrained Language Models

Date

2022-08-23

Authors

Zimmerman, Brian

Publisher

University of Waterloo

Abstract

Automated story generation has been an open problem in computing for many decades. Only with the recent wave of deep learning research have neural networks been applied to automated story generation tasks. Current deep learning agents for automated story generation typically ingest a prompt or storyline on which to condition the generated text. This approach lacks the dynamism to incorporate story elements that are decided only during inference. We build an interactive system using pretrained transformers finetuned on a novel objective to temporally interpolate between a story context c and a future plot event f. At inference time, users can suggest future plot events along with a distance, in sentences, to coerce a transformer decoder towards generating sentences that both remain consistent with the story context and logically conclude with the future event. The results of our experiments demonstrate that there is a notion of adherence to both context and future in some, but not all, cases. We discuss in detail potential explanations as to why the model fails to condition on some contexts and futures with respect to the data and the parameters of our model, and include examples sampled from our model to motivate this discussion.
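The interpolation objective described above can be illustrated with a minimal sketch: each training example pairs a story context c, a future plot event f, and a distance d (in sentences) with the d intervening sentences as the target. The marker tokens (`<ctx>`, `<fut>`, `<d=...>`) and the exact serialization are illustrative assumptions, not the thesis's actual vocabulary or data format.

```python
# Hypothetical sketch of the temporal-interpolation training format:
# the model conditions on a context c and a future event f, plus a
# distance d in sentences, and must generate the d sentences in between.

def make_interpolation_example(context, future, intermediate):
    """Build one (source, target) pair for the interpolation objective.

    context:      list of sentences the story starts with (c)
    future:       a single future plot event sentence (f)
    intermediate: the sentences the model should learn to generate
    """
    d = len(intermediate)  # distance, in sentences, from context to future
    source = f"<ctx> {' '.join(context)} <fut> {future} <d={d}>"
    target = " ".join(intermediate)
    return source, target


story = [
    "Mara found a sealed letter under the floorboards.",
    "She recognized her grandmother's handwriting.",
    "The letter mentioned a key hidden in the garden.",
    "By dusk she was digging beneath the rose bush.",
    "The key opened a chest no one knew existed.",
]

# Condition on the first sentence, take the last sentence as the future
# event, and ask the model to fill in the 3 sentences between them.
src, tgt = make_interpolation_example(story[:1], story[-1], story[1:4])
```

At inference time the same serialization would let a user supply their own future event and distance; a decoder finetuned on such pairs would then generate the intervening sentences rather than continuing the context freely.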

Keywords

machine learning, natural language processing, story generation, pretrained language models
