Variational Inference for Text Generation: Improving the Posterior

dc.contributor.author: Balasubramanian, Vikash
dc.date.accessioned: 2020-08-10T20:17:45Z
dc.date.available: 2020-08-10T20:17:45Z
dc.date.issued: 2020-08-10
dc.date.submitted: 2020-08-05
dc.description.abstract: Learning useful representations of data is a crucial task in machine learning, with wide-ranging applications. In this thesis we explore improving the representations learned by variational-inference-based models by improving the posterior. We pursue two approaches: 1) auxiliary losses that regularize the latent space and enforce desired properties, and 2) normalizing flows that provide more flexible posteriors for variational inference. We propose a proximity-based loss function that aids disentanglement by regularizing the latent space according to a similarity criterion. We evaluate our model on the task of disentangling semantics and syntax in sentences and show empirically that it learns independent subspaces capturing semantics and syntax, respectively. Comparisons against existing approaches using automated metrics and human evaluation show that our model is competitive. We also explore the effectiveness of normalizing flows for representation learning and generative modeling: our experiments show that variational inference with normalizing flows outperforms standard approaches based on simple posteriors across various metrics in text generation and language modeling. Finally, we propose a variant of planar normalizing flows, called block planar normalizing flows, for use in disentanglement tasks (a minimal sketch of planar and block planar flows follows this record). Ablation experiments show that the proposed block planar flows improve disentanglement.
dc.identifier.uri: http://hdl.handle.net/10012/16107
dc.language.iso: en
dc.pending: false
dc.publisher: University of Waterloo
dc.subject: Machine Learning
dc.subject: Natural Language Processing
dc.subject: Natural Language Generation
dc.subject: Artificial Intelligence
dc.subject: Normalizing Flows
dc.subject: Disentanglement
dc.subject: Variational Inference
dc.title: Variational Inference for Text Generation: Improving the Posterior
dc.type: Master Thesis
uws-etd.degree: Master of Mathematics
uws-etd.degree.department: David R. Cheriton School of Computer Science
uws-etd.degree.discipline: Computer Science
uws-etd.degree.grantor: University of Waterloo
uws.contributor.advisor: Vechtomova, Olga
uws.contributor.affiliation1: Faculty of Mathematics
uws.peerReviewStatus: Unreviewed
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.scholarLevel: Graduate
uws.typeOfResource: Text
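
Sketch of planar and block planar flows (referenced from the abstract above): a standard planar flow (Rezende and Mohamed, 2015) transforms a latent sample as f(z) = z + u * tanh(w^T z + b), with log-determinant log|1 + u^T psi(z)| where psi(z) = (1 - tanh^2(w^T z + b)) w. The block variant below is only an assumption about the construction, not the thesis's actual implementation: it applies independent planar flows to disjoint latent subspaces, and the names BlockPlanarFlow and block_dims are hypothetical.

    import torch
    import torch.nn as nn

    class PlanarFlow(nn.Module):
        # One planar flow step: f(z) = z + u * tanh(w^T z + b).
        # NOTE: the reparameterization of u that guarantees invertibility
        # (w^T u >= -1) is omitted here for brevity.
        def __init__(self, dim):
            super().__init__()
            self.u = nn.Parameter(torch.randn(dim) * 0.01)
            self.w = nn.Parameter(torch.randn(dim) * 0.01)
            self.b = nn.Parameter(torch.zeros(1))

        def forward(self, z):                      # z: (batch, dim)
            lin = z @ self.w + self.b              # (batch,)
            f_z = z + self.u * torch.tanh(lin).unsqueeze(-1)
            # log |det df/dz| = log |1 + u^T psi(z)|
            psi = (1.0 - torch.tanh(lin) ** 2).unsqueeze(-1) * self.w
            log_det = torch.log(torch.abs(1.0 + psi @ self.u) + 1e-8)
            return f_z, log_det

    class BlockPlanarFlow(nn.Module):
        # Hypothetical block variant: independent planar flows on disjoint
        # latent subspaces (e.g. a "semantic" and a "syntactic" block), so
        # the transform never mixes the subspaces.
        def __init__(self, block_dims):            # e.g. block_dims = [32, 32]
            super().__init__()
            self.block_dims = list(block_dims)
            self.flows = nn.ModuleList(PlanarFlow(d) for d in self.block_dims)

        def forward(self, z):
            outs, log_det = [], 0.0
            for z_b, flow in zip(z.split(self.block_dims, dim=-1), self.flows):
                f_b, ld = flow(z_b)
                outs.append(f_b)
                log_det = log_det + ld             # log-dets add across blocks
            return torch.cat(outs, dim=-1), log_det

    # usage: flow = BlockPlanarFlow([32, 32]); z_k, log_det = flow(torch.randn(8, 64))

Because each flow touches only its own block, the overall Jacobian is block-diagonal and the transform never mixes, say, a semantic subspace with a syntactic one, which is plausibly why a block structure would help disentanglement.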

Files

Original bundle
Name: Balasubramanian_Vikash.pdf
Size: 2.11 MB
Format: Adobe Portable Document Format
Description: Main thesis document

License bundle
Name: license.txt
Size: 6.4 KB
Format: Item-specific license agreed upon to submission