
On Enabling Layer-Parallelism for Graph Neural Networks using IMEX Integration


Date

2024-06-20

Authors

Kara, Omer Ege

Publisher

University of Waterloo

Abstract

Graph Neural Networks (GNNs) are a class of neural networks designed to perform machine learning tasks on graph data. Recently, several works have trained differential equation-inspired GNN architectures, which are suitable for robust training when equipped with a relatively large number of layers. Neural networks with more layers are potentially more expressive; however, training time increases linearly with the number of layers. Parallel-in-layer training is a method developed to overcome this increase in training time for deeper networks and was first applied to training residual networks. In this thesis, we first give an overview of existing work on layer-parallel training and on graph neural networks inspired by differential equations. We then discuss issues that arise when these graph neural network architectures are trained parallel-in-layer and propose solutions to address them. Finally, we present and evaluate experimental results on layer-parallel GNN training using the proposed approach.
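
As a rough illustration of the implicit-explicit (IMEX) integration the title refers to, the sketch below shows one IMEX Euler step for a graph-diffusion ODE dX/dt = -L X + f(X), where the stiff linear diffusion term -L X is handled implicitly and the nonlinearity f explicitly. This is a minimal sketch under assumed notation (the Laplacian L, step size h, and nonlinearity f are illustrative choices), not the architecture or implementation developed in the thesis.

# Minimal IMEX Euler sketch for a graph-diffusion ODE (illustrative, not the thesis's code).
import numpy as np

def imex_euler_step(X, L, h, f):
    """Advance node features X by one IMEX Euler step of size h.

    X : (num_nodes, num_features) node feature matrix
    L : (num_nodes, num_nodes) graph Laplacian (stiff linear operator)
    h : step size, playing the role of the layer spacing in a DE-inspired GNN
    f : callable for the explicit (non-stiff) term, e.g. a pointwise nonlinearity
    """
    n = L.shape[0]
    # Explicit part: evaluate the non-stiff term at the current state.
    rhs = X + h * f(X)
    # Implicit part: solve (I + h L) X_new = rhs for the stiff diffusion term.
    return np.linalg.solve(np.eye(n) + h * L, rhs)

# Example usage on a tiny path graph with 3 nodes and 2 feature channels.
A = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
L = np.diag(A.sum(axis=1)) - A                      # combinatorial Laplacian
X = np.random.default_rng(0).normal(size=(3, 2))    # initial node features
X = imex_euler_step(X, L, h=0.5, f=np.tanh)         # hypothetical nonlinearity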

Keywords

Parallel-in-Layer Training, Implicit-Explicit Integration
