Browsing Theses by Subject "pretrained language models"
Now showing items 1-2 of 2
- Future Sight: Dynamic Story Generation with Large Pretrained Language Models
  (University of Waterloo, 2022-08-23) Automated story generation has been an open problem in computing for many decades. Only with the recent wave of deep learning research have neural networks been applied to automated story generation tasks. Current deep ...
- Towards Effective Utilization of Pretrained Language Models — Knowledge Distillation from BERT
  (University of Waterloo, 2020-09-02) In the natural language processing (NLP) literature, neural networks are becoming increasingly deeper and more complex. Recent advancements in neural NLP are large pretrained language models (e.g. BERT), which lead to ...