Show simple item record

dc.contributor.author: Shaw, Nolan
dc.date.accessioned: 2019-10-17 13:18:17 (GMT)
dc.date.available: 2019-10-17 13:18:17 (GMT)
dc.date.issued: 2019-10-17
dc.date.submitted: 2019-09-30
dc.identifier.uri: http://hdl.handle.net/10012/15206
dc.description.abstract: In this work, I study the relationship between a local, intrinsic update mechanism and a synaptic, error-based learning mechanism in ANNs. I present a local intrinsic rule that I developed, dubbed IP, inspired by the Infomax rule. Like Infomax, the IP rule works by controlling the gain and bias of a neuron to regulate its firing rate. I discuss the biological plausibility of this rule and compare it to batch normalisation. This work demonstrates that local information maximisation can work in conjunction with synaptic learning rules to improve learning. I show that the IP rule makes deep networks more robust to increases in synaptic learning rates, and that it increases the average slope of the activation functions. I also compare IP to batch normalisation and Infomax, which were shown to share the same family of solutions. In addition, an alternative rule is developed that shares many of IP's properties, but uses a weighted moving average, rather than the Adamised update rules used by IP, to compute the desired values for the neuronal gain and bias. This rule, dubbed WD, demonstrates universally superior performance compared to both IP and standard networks. In particular, it shows faster learning and increased robustness to increases in synaptic learning rates. The gradients of the activation function are compared to those in standard networks, and the WD method shows drastically larger gradients on average, suggesting that this intrinsic, information-theoretic rule solves the vanishing gradient problem. The WD method also outperforms Infomax and a weighted-moving-average version of batch normalisation. Supplementary analysis reinforces the relationship between intrinsic plasticity and batch normalisation: specifically, the IP method centers its activation over the median of its input distribution, which, for symmetric distributions, is equivalent to centering it over the mean. This analysis is offered as a contribution to the theory of deep ANNs. Further analysis demonstrates that the IP rule produces neuronal activities with levels of entropy similar to those of Infomax when tested on a fixed input distribution. The same analysis shows that the WD version of intrinsic plasticity also improves information potential, but fails to reach the levels of IP and Infomax. Interestingly, batch normalisation was also observed to improve information potential, suggesting that this may contribute to the efficacy of batch normalisation, an open problem at the time of this writing.
dc.language.iso: en
dc.publisher: University of Waterloo
dc.relation.uri: http://yann.lecun.com/exdb/mnist/
dc.relation.uri: https://www.cs.toronto.edu/~kriz/cifar.html
dc.subject: artificial neural networks
dc.subject: machine learning
dc.subject: deep learning
dc.subject: intrinsic plasticity
dc.subject: synergistic learning
dc.subject: information theory
dc.title: The Computational Advantages of Intrinsic Plasticity in Neural Networks
dc.type: Master Thesis
dc.pending: false
uws-etd.degree.department: David R. Cheriton School of Computer Science
uws-etd.degree.discipline: Computer Science
uws-etd.degree.grantor: University of Waterloo
uws-etd.degree: Master of Mathematics
uws.contributor.advisor: Orchard, Jeff
uws.contributor.affiliation1: Faculty of Mathematics
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.typeOfResource: Text
uws.peerReviewStatus: Unreviewed
uws.scholarLevel: Graduate
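
The abstract describes two gain-and-bias adaptation schemes, an Infomax-inspired IP rule with Adamised updates and a weighted-moving-average WD rule, but this record does not reproduce their update equations. As a rough, hypothetical sketch of the general idea only, the snippet below implements (a) the classic Infomax gradient for a single logistic neuron (Bell and Sejnowski, 1995), which the abstract names as the inspiration for IP, and (b) a generic running-average gain/bias update in the spirit of the WD description. All function names, learning rates, and decay constants are illustrative assumptions, not the thesis's actual method.

# Illustrative sketch only: not the thesis's IP or WD rules, whose exact
# update equations are not given in this record.
import numpy as np

def infomax_step(x, gain, bias, lr=1e-3):
    """One Infomax-style update of a logistic neuron's gain and bias.

    For y = sigmoid(gain * x + bias), gradient ascent on log |dy/dx|
    (Bell & Sejnowski, 1995) gives:
        d_gain = 1/gain + x * (1 - 2y)
        d_bias = 1 - 2y
    """
    y = 1.0 / (1.0 + np.exp(-(gain * x + bias)))
    d_gain = 1.0 / gain + x * (1.0 - 2.0 * y)
    d_bias = 1.0 - 2.0 * y
    return gain + lr * d_gain.mean(), bias + lr * d_bias.mean()

def wma_step(x, stats, decay=0.99, eps=1e-5):
    """Weighted-moving-average ("WD-like") gain/bias: track running mean and
    variance of the neuron's input and choose a gain and bias that centre and
    rescale the pre-activation, akin to running-statistics normalisation."""
    mean, var = stats
    mean = decay * mean + (1.0 - decay) * x.mean()
    var = decay * var + (1.0 - decay) * x.var()
    gain = 1.0 / np.sqrt(var + eps)
    bias = -gain * mean
    return gain, bias, (mean, var)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(loc=2.0, scale=3.0, size=10_000)

    gain, bias = 1.0, 0.0
    for batch in x.reshape(100, 100):
        gain, bias = infomax_step(batch, gain, bias)
    print("Infomax-style gain/bias:", gain, bias)

    stats = (0.0, 1.0)
    for batch in x.reshape(100, 100):
        gain, bias, stats = wma_step(batch, stats)
    print("Running-average gain/bias:", gain, bias)

The running-average variant makes the abstract's comparison with batch normalisation concrete: with gain 1/sqrt(var) and bias -mean/sqrt(var), the neuron's pre-activation is standardised using running statistics rather than per-batch statistics.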

