Distributions in Semantic Space

dc.contributor.author: Selby, Kira
dc.date.accessioned: 2024-04-26T18:41:46Z
dc.date.available: 2024-04-26T18:41:46Z
dc.date.issued: 2024-04-26
dc.date.submitted: 2024-04-22
dc.description.abstract: This thesis is an investigation of the powerful and flexible applications of analyzing empirical distributions of vectors within latent spaces. These methods have historically been applied with great success to the domain of word embeddings, leading to improvements in robustness against polysemy and to unsupervised inference of hierarchical relationships between words; they have even been used to shatter existing benchmarks on unsupervised translation. This work extends these existing lines of inquiry, with a focus on two key areas of further research:

a) Probabilistic approaches to robustness in natural language.
b) Approximating general distance functions between distributions in order to infer general hierarchical relationships between words from their distributions over contexts.

Motivated by these initial research directions, the resulting investigations demonstrate novel and significant contributions to a diverse range of problems across many different fields and domains, far beyond the narrow scope of word embeddings. The key contributions of this work are threefold:

1. Proposing a probabilistic, model-agnostic framework for robustness in natural language models. The proposed model improves performance on a wide range of downstream tasks compared to existing baselines.
2. Constructing a general architecture for modelling distance functions between multiple permutation-invariant sets. The proposed architecture is proved to be a universal approximator for all partially permutation-invariant functions and outperforms all existing baselines on a number of set-based tasks, as well as approximating distance functions such as KL divergence and mutual information.
3. Leveraging this architecture to define a novel, set-based approach to few-shot image generation. The proposed approach outperforms all existing image-to-image baselines without making restrictive assumptions about the structure of the training and evaluation sets that might limit its ability to generalize, making it a promising candidate for scaling to true zero-shot generation.
dc.identifier.uri: http://hdl.handle.net/10012/20506
dc.language.iso: en
dc.pending: false
dc.publisher: University of Waterloo
dc.subject: machine learning
dc.subject: deep learning
dc.subject: generative models
dc.subject: natural language processing
dc.title: Distributions in Semantic Space
dc.type: Doctoral Thesis
uws-etd.degree: Doctor of Philosophy
uws-etd.degree.department: David R. Cheriton School of Computer Science
uws-etd.degree.discipline: Computer Science
uws-etd.degree.grantor: University of Waterloo
uws-etd.embargo.terms: 0
uws.contributor.advisor: Poupart, Pascal
uws.contributor.affiliation1: Faculty of Mathematics
uws.peerReviewStatus: Unreviewed
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.scholarLevel: Graduate
uws.typeOfResource: Text

Files

Original bundle

Name: Selby_Kira.pdf
Size: 4.7 MB
Format: Adobe Portable Document Format
Description: Revised thesis

License bundle

Name: license.txt
Size: 6.4 KB
Description: Item-specific license agreed upon to submission