
Dynamical Systems in Spiking Neuromorphic Hardware

dc.contributor.author: Voelker, Aaron Russell
dc.date.accessioned: 2019-05-10T18:29:03Z
dc.date.available: 2019-05-10T18:29:03Z
dc.date.issued: 2019-05-10
dc.date.submitted: 2019-04-25
dc.description.abstract: Dynamical systems are universal computers. They can perceive stimuli, remember, learn from feedback, plan sequences of actions, and coordinate complex behavioural responses. The Neural Engineering Framework (NEF) provides a general recipe to formulate models of such systems as coupled sets of nonlinear differential equations and compile them onto recurrently connected spiking neural networks – akin to a programming language for spiking models of computation. The Nengo software ecosystem supports the NEF and compiles such models onto neuromorphic hardware. In this thesis, we analyze the theory driving the success of the NEF, and expose several core principles underpinning its correctness, scalability, completeness, robustness, and extensibility. We also derive novel theoretical extensions to the framework that enable it to leverage a wide variety of dynamics in digital hardware far more effectively, and to exploit the device-level physics in analog hardware. At the same time, we propose a novel set of spiking algorithms that recruit an optimal nonlinear encoding of time, which we call the Delay Network (DN). Backpropagation across stacked layers of DNs dramatically outperforms stacked Long Short-Term Memory (LSTM) networks—a state-of-the-art deep recurrent architecture—in accuracy and training time on a continuous-time memory task and on a chaotic time-series prediction benchmark. The basic component of this network is shown to function on state-of-the-art spiking neuromorphic hardware, including Braindrop and Loihi. This implementation approaches the energy-efficiency of the human brain in the former case, and the precision of conventional computation in the latter case. [An illustrative code sketch of the NEF recipe described here appears after the metadata listing below.]
dc.identifier.uri: http://hdl.handle.net/10012/14625
dc.language.iso: en
dc.pending: false
dc.publisher: University of Waterloo
dc.subject: neural engineering
dc.subject: spiking networks
dc.subject: neuromorphics
dc.subject: recurrent neural networks
dc.subject: dynamical systems
dc.subject: temporal representation
dc.subject: nengo
dc.subject: reservoir computing
dc.subject: long short-term memory
dc.subject: force learning
dc.subject: theoretical neuroscience
dc.subject: computational neuroscience
dc.subject: loihi
dc.subject: spinnaker
dc.subject: braindrop
dc.title: Dynamical Systems in Spiking Neuromorphic Hardware
dc.type: Doctoral Thesis
uws-etd.degree: Doctor of Philosophy
uws-etd.degree.department: David R. Cheriton School of Computer Science
uws-etd.degree.discipline: Computer Science
uws-etd.degree.grantor: University of Waterloo
uws.comment.hidden: My abstract contains long dashes. I'm fairly sure these are Unicode-compliant, but I'm not sure how to verify this when submitting via text box. I've also attached a copyright release for one of the figures, for your information. It is *not* necessary to publish the copyright release alongside the thesis.
uws.contributor.advisor: Eliasmith, Chris
uws.contributor.affiliation1: Faculty of Mathematics
uws.peerReviewStatus: Unreviewed
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.scholarLevel: Graduate
uws.typeOfResource: Text
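
Illustrative sketch (not taken from the thesis): the abstract describes the NEF as a recipe for compiling differential equations onto recurrently connected spiking networks via the Nengo software. The snippet below shows that recipe for the simplest dynamical system, a one-dimensional integrator dx/dt = u(t). The neuron count, synaptic time constant, and step input are assumed placeholder values, not choices from the thesis.

import nengo

tau = 0.1  # synaptic time constant used by the NEF mapping (assumed value)

with nengo.Network() as model:
    # Step input u(t): drives the integrator for the first 100 ms
    u = nengo.Node(lambda t: 1.0 if t < 0.1 else 0.0)

    # Population of spiking neurons representing the scalar state x
    x = nengo.Ensemble(n_neurons=100, dimensions=1)

    # NEF mapping of dx/dt = u onto a low-pass synapse with time constant tau:
    # feed the input scaled by tau, and feed x back to itself recurrently.
    nengo.Connection(u, x, transform=tau, synapse=tau)
    nengo.Connection(x, x, synapse=tau)

    # Record the decoded estimate of x
    p = nengo.Probe(x, synapse=0.01)

with nengo.Simulator(model) as sim:
    sim.run(1.0)

# sim.data[p] holds the spiking network's estimate of the integral of u(t)

The Delay Network mentioned in the abstract follows the same recipe, applied to a particular linear system that approximates a pure time delay; only the generic integrator case is sketched here.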

Files

Original bundle
Name: Voelker_Aaron.pdf
Size: 19.64 MB
Format: Adobe Portable Document Format
Description: PhD Thesis
License bundle
Name: license.txt
Size: 6.08 KB
Format: Item-specific license agreed upon to submission