 

Future Sight: Dynamic Story Generation with Large Pretrained Language Models

dc.contributor.author: Zimmerman, Brian
dc.date.accessioned: 2022-08-23T20:21:36Z
dc.date.available: 2022-08-23T20:21:36Z
dc.date.issued: 2022-08-23
dc.date.submitted: 2022-08-12
dc.description.abstract: Automated story generation has been an open problem in computing for many decades. Only with the recent wave of deep learning research have neural networks been applied to automated story generation tasks. Current deep learning agents for automated story generation typically ingest a prompt or storyline on which to condition generated text. This approach lacks the dynamism to include elements of a story only decided by the model during inference. We build an interactive system using pretrained transformers finetuned on a novel objective to temporally interpolate between a story context c and a future plot event f. At inference time, users can suggest future plot events along with a distance, in sentences, to coerce a transformer decoder towards generating sentences that would both remain consistent with a story context and logically conclude with the future event. The results of our experiments demonstrate that there is a notion of adherence to both context and future in some, but not all, cases. We discuss in detail potential explanations as to why the model fails to condition on some contexts and futures with respect to the data and the parameters of our model. We include examples sampled from our model to motivate this discussion.
dc.identifier.uri: http://hdl.handle.net/10012/18628
dc.language.iso: en
dc.pending: false
dc.publisher: University of Waterloo
dc.subject: machine learning
dc.subject: natural language processing
dc.subject: story generation
dc.subject: pretrained language models
dc.title: Future Sight: Dynamic Story Generation with Large Pretrained Language Models
dc.type: Master Thesis
uws-etd.degree: Master of Mathematics
uws-etd.degree.department: David R. Cheriton School of Computer Science
uws-etd.degree.discipline: Computer Science
uws-etd.degree.grantor: University of Waterloo
uws-etd.embargo.terms: 0
uws.contributor.advisor: Vechtomova, Olga
uws.contributor.affiliation1: Faculty of Mathematics
uws.peerReviewStatus: Unreviewed
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.scholarLevel: Graduate
uws.typeOfResource: Text
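The abstract describes conditioning a transformer decoder on a story context, a user-suggested future plot event, and a distance in sentences. As a rough illustration only — the separator tokens and function below are hypothetical, not taken from the thesis — such a conditioning input might be serialized like this:

```python
def build_future_sight_input(context: str, future_event: str, distance: int) -> str:
    """Serialize (context, future event, sentence distance) into one
    conditioning string. The <ctx>/<fut>/<dist> separator tokens are
    invented placeholders for illustration, not the thesis's actual format."""
    return f"<ctx> {context} <fut> {future_event} <dist> {distance}"

prompt = build_future_sight_input(
    context="The knight rode out at dawn.",
    future_event="The dragon lay slain.",
    distance=3,  # generate roughly three sentences between context and future
)
```

A finetuned decoder would then be asked to generate the intervening sentences so that the story both stays consistent with the context and logically concludes with the future event.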

Files

Original bundle
Name: Zimmerman_Brian.pdf
Size: 1.1 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 6.4 KB
Description: Item-specific license agreed upon to submission