Browsing Theses by Supervisor "Vechtomova, Olga"
Now showing items 1-13 of 13
Adaptive Fusion Techniques for Effective Multimodal Deep Learning
(University of Waterloo, 2020-08-28) Effective fusion of data from multiple modalities, such as video, speech, and text, is a challenging task due to the heterogeneous nature of multimodal data. In this work, we propose fusion techniques that aim to model ...
Controlled Generation of Stylized Text Using Semantic and Phonetic Representations
(University of Waterloo, 2022-01-21) Neural networks are a popular choice of models for the purpose of text generation. Variational autoencoders have been shown to be good at reconstructing text and generating novel text. However, controlling certain aspects ...
Creating an Emotion Responsive Dialogue System
(University of Waterloo, 2018-10-19) The popularity of deep neural networks and the vast amounts of readily available multi-domain textual data have seen the advent of various domain/task-specific and domain-agnostic dialogue systems. In our work, we present a ...
Dialog Response Generation Using Adversarially Learned Latent Bag-of-Words
(University of Waterloo, 2020-08-28) Dialog response generation is the task of generating a response utterance given a query utterance. Apart from generating relevant and coherent responses, one would like the dialog generation model to generate diverse and ...
Disentangled Representation Learning for Stylistic Variation in Neural Language Models
(University of Waterloo, 2018-08-14) The neural network has proven to be an effective machine learning method over the past decade, prompting its usage for modelling language, among several other domains. However, the latent representations learned by these ...
Disentangled Syntax and Semantics for Stylized Text Generation
(University of Waterloo, 2020-09-21) Neural network based methods are widely used in text generation. End-to-end training of neural networks, which directly optimizes the text generation pipeline, has proved powerful in various tasks, including machine ...
Disentanglement of Syntactic Components for Text Generation
(University of Waterloo, 2022-02-18) Modelling human-generated text, i.e., natural language data, is an important challenge in artificial intelligence. A good AI program should be able to understand and analyze natural language, and generate fluent and accurate ...
Future Sight: Dynamic Story Generation with Large Pretrained Language Models
(University of Waterloo, 2022-08-23) Automated story generation has been an open problem in computing for many decades. Only with the recent wave of deep learning research have neural networks been applied to automated story generation tasks. Current deep ...
Natural Language Generation with Neural Variational Models
(University of Waterloo, 2018-08-13) Automatic generation of text is an important topic in natural language processing, with applications in tasks such as machine translation and text summarization. In this thesis, we explore the use of deep neural networks ...
Supporting Exploratory Search Tasks Through Alternative Representations of Information
(University of Waterloo, 2020-05-14) Information seeking is a fundamental component of many of the complex tasks presented to us, and is often conducted through interactions with automated search systems such as Web search engines. Indeed, the ubiquity of Web ...
Towards Measuring Coherence in Poem Generation
(University of Waterloo, 2023-01-11) Large language models (LLMs) based on the transformer architecture and trained on massive corpora have gained prominence as text-generative models in the past few years. Even though large language models are very adept at ...
Variational Inference for Text Generation: Improving the Posterior
(University of Waterloo, 2020-08-10) Learning useful representations of data is a crucial task in machine learning with wide-ranging applications. In this thesis we explore improving representations of models based on variational inference by improving the ...
Wasserstein Autoencoders with Mixture of Gaussian Priors for Stylized Text Generation
(University of Waterloo, 2021-01-28) Probabilistic text generation is an important application of Natural Language Processing (NLP). Variational autoencoders and Wasserstein autoencoders are two widely used methods for text generation. New research efforts ...