Simple item record

dc.contributor.author: Balasubramanian, Vikash
dc.date.accessioned: 2020-08-10 20:17:45 (GMT)
dc.date.available: 2020-08-10 20:17:45 (GMT)
dc.date.issued: 2020-08-10
dc.date.submitted: 2020-08-05
dc.identifier.uri: http://hdl.handle.net/10012/16107
dc.description.abstract: Learning useful representations of data is a crucial task in machine learning, with wide-ranging applications. In this thesis we explore improving the representations learned by models based on variational inference by improving the posterior. We pursue two approaches toward this goal: 1) auxiliary losses that regularize the latent space and enforce desired properties, and 2) normalizing flows that yield more flexible posteriors for variational inference. We propose a proximity-based loss function that aids disentanglement by regularizing the latent space according to a similarity criterion. We evaluate our model on the task of disentangling semantics and syntax in sentences, and show empirically that it learns independent subspaces capturing semantics and syntax, respectively. Comparing against existing approaches using automated metrics and human evaluation, we show that our model is competitive. We also study the effectiveness of normalizing flows for representation learning and generative modeling. Our experiments show empirically that variational inference with normalizing flows outperforms standard approaches based on simple posteriors across several metrics in text generation and language modeling. Finally, we propose a variant of planar normalizing flows, called block planar normalizing flows, for use in disentanglement tasks, and show through ablation experiments that the proposed block planar flows improve disentanglement. (A minimal flow sketch follows this record.)
dc.language.iso: en
dc.publisher: University of Waterloo
dc.subject: Machine Learning
dc.subject: Natural Language Processing
dc.subject: Natural Language Generation
dc.subject: Artificial Intelligence
dc.subject: Normalizing Flows
dc.subject: Disentanglement
dc.subject: Variational Inference
dc.title: Variational Inference for Text Generation: Improving the Posterior
dc.type: Master Thesis
dc.pending: false
uws-etd.degree.department: David R. Cheriton School of Computer Science
uws-etd.degree.discipline: Computer Science
uws-etd.degree.grantor: University of Waterloo
uws-etd.degree: Master of Mathematics
uws.contributor.advisor: Vechtomova, Olga
uws.contributor.affiliation1: Faculty of Mathematics
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.typeOfResource: Text
uws.peerReviewStatus: Unreviewed
uws.scholarLevel: Graduate
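
The abstract above builds on planar normalizing flows and proposes a block variant. As a rough illustration of the underlying technique, below is a minimal NumPy sketch of a single planar flow step in Rezende and Mohamed's standard formulation, plus a hypothetical block-wise wrapper. The record gives no implementation details, so the block partitioning, parameter shapes, and the block_planar_flow helper are illustrative assumptions, not the thesis's actual code.

import numpy as np

def planar_flow(z, u, w, b):
    """One planar flow step f(z) = z + u_hat * tanh(w.z + b).

    Returns the transformed sample and log|det Jacobian|, which a
    flow-based variational posterior needs to evaluate its density.
    """
    wu = w @ u
    # Reparameterize u so that w.u_hat >= -1, keeping the map invertible.
    u_hat = u + (np.log1p(np.exp(wu)) - 1.0 - wu) * (w / (w @ w))
    a = w @ z + b                       # scalar pre-activation
    f_z = z + u_hat * np.tanh(a)        # transformed sample
    psi = (1.0 - np.tanh(a) ** 2) * w   # tanh'(a) * w
    log_det = np.log(np.abs(1.0 + u_hat @ psi))
    return f_z, log_det

def block_planar_flow(z, blocks, params):
    """Assumed block variant: an independent planar flow per latent block,
    so the Jacobian is block-diagonal and blocks cannot mix, which is
    plausibly why such a flow suits disentangled subspaces."""
    z_out, total_log_det = z.copy(), 0.0
    for (lo, hi), (u, w, b) in zip(blocks, params):
        z_out[lo:hi], ld = planar_flow(z[lo:hi], u, w, b)
        total_log_det += ld
    return z_out, total_log_det

# Toy usage: an 8-d latent split into two 4-d blocks
# (e.g. a "semantic" and a "syntactic" subspace).
rng = np.random.default_rng(0)
z0 = rng.standard_normal(8)             # sample from the base posterior
blocks = [(0, 4), (4, 8)]
params = [(rng.standard_normal(4), rng.standard_normal(4),
           float(rng.standard_normal())) for _ in blocks]
z1, log_det = block_planar_flow(z0, blocks, params)

Stacking K such steps on a base Gaussian sample z0 gives log q(z_K) = log q(z0) minus the sum of the K log-determinants, which is the density the flow-based ELBO uses in place of the simple posterior's.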

