QAVSA: Question Answering Using Vector Symbolic Algebras

Date

2024-11-29

Advisor

Eliasmith, Chris

Publisher

University of Waterloo

Abstract

With the advancement of large pretrained language models (PLMs), many question answering (QA) benchmarks have been developed to evaluate the capabilities of these models. Augmenting PLMs with external knowledge in the form of Knowledge Graphs (KGs) is a popular method to improve their question-answering capabilities, and a common way to incorporate KGs is through Graph Neural Networks (GNNs). As an alternative to GNNs for augmenting PLMs, we propose a novel graph reasoning module that uses Vector Symbolic Algebra (VSA) graph representations and a k-layer MLP. We demonstrate that our VSA-based model performs as well as QA-GNN, a model that combines a PLM with a GNN module, on three multiple-choice question answering (MCQA) datasets. Our model has a simpler architecture than QA-GNN, converges 37% faster during training, and has constant memory requirements as knowledge graph size increases. Furthermore, we present a novel method to analyze the VSA-based outputs of QAVSA.
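The abstract's claim of constant memory as the KG grows follows from how VSAs encode graphs: every triple is bound into a single fixed-dimensional vector by superposition. The sketch below illustrates this general idea using Holographic Reduced Representations (circular convolution as the binding operator); the toy knowledge graph, vocabulary names, and dimensionality are illustrative assumptions, not details taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 1024  # vector dimensionality; graph memory is O(D), independent of KG size

def bind(a, b):
    # HRR binding: circular convolution, computed via FFT
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def unbind(a, b):
    # Approximate inverse of binding: circular correlation with b
    return np.real(np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b))))

def rand_vec():
    # Random unit vector to represent a symbol
    v = rng.normal(0.0, 1.0 / np.sqrt(D), D)
    return v / np.linalg.norm(v)

# Hypothetical toy KG of (subject, relation, object) triples
vocab = {name: rand_vec() for name in
         ["cat", "mammal", "fur", "is_a", "has"]}
triples = [("cat", "is_a", "mammal"), ("cat", "has", "fur")]

# Superpose the bound triples into one fixed-size graph vector
graph = sum(bind(bind(vocab[s], vocab[r]), vocab[o]) for s, r, o in triples)

# Query: what does (cat, is_a) point to? Unbind, then compare by dot product
probe = unbind(graph, bind(vocab["cat"], vocab["is_a"]))
sims = {name: float(probe @ v) for name, v in vocab.items()}
best = max(sims, key=sims.get)  # expected to recover "mammal"
```

Because `graph` stays a single D-dimensional vector no matter how many triples are superposed, a downstream MLP over it has fixed input size, which is consistent with the constant-memory property the abstract describes.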

Keywords

language models, vector symbolic algebra, multiple choice question answering, knowledge graph, artificial intelligence, machine learning
