Global-connected network with generalized ReLU activation

dc.contributor.author: Chen, Zhi
dc.contributor.author: Ho, Pin-Han
dc.date.accessioned: 2019-12-04T17:35:09Z
dc.date.available: 2019-12-04T17:35:09Z
dc.date.issued: 2019-12
dc.description: The final publication is available at Elsevier via https://doi.org/10.1016/j.patcog.2019.07.006. © 2019. This manuscript version is made available under the CC-BY-NC-ND 4.0 license http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.description.abstract: Recent progress has shown that exploiting hidden-layer neurons in convolutional neural networks (CNNs), combined with a carefully designed activation function, can yield better classification results in computer vision. This paper first introduces a novel deep learning (DL) architecture that mitigates the gradient-vanishing problem by connecting earlier hidden-layer neurons directly to the last hidden layer and feeding them into the softmax layer for classification. We then design a generalized linear rectifier as the activation function, which can approximate arbitrarily complex functions by training its parameters. We show that our design achieves comparable performance on a number of object recognition and video action benchmarks, including MNIST, CIFAR-10/100, SVHN, Fashion-MNIST, STL-10, and the UCF YouTube Action Video dataset, with significantly fewer parameters and a shallower network structure. This is not only promising for training in terms of computation burden and memory usage, but also applicable to low-computation, low-memory mobile inference scenarios.
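The abstract describes two ideas: a learnable, ReLU-like activation and "global" connections that feed earlier hidden layers directly into the classifier. The following is a minimal PyTorch-style sketch of those two ideas as the abstract states them, not the authors' actual implementation; the class and parameter names (GeneralizedReLU, GlobalConnectedNet, pos_slope, neg_slope) are hypothetical.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GeneralizedReLU(nn.Module):
    """Piecewise-linear activation with learnable per-channel slopes on both sides of zero."""
    def __init__(self, num_channels):
        super().__init__()
        self.pos_slope = nn.Parameter(torch.ones(num_channels))   # slope for x > 0
        self.neg_slope = nn.Parameter(torch.zeros(num_channels))  # slope for x <= 0

    def forward(self, x):
        # Broadcast the per-channel slopes over (N, C, H, W) feature maps.
        pos = self.pos_slope.view(1, -1, 1, 1)
        neg = self.neg_slope.view(1, -1, 1, 1)
        return torch.where(x > 0, pos * x, neg * x)

class GlobalConnectedNet(nn.Module):
    """Small CNN whose earlier feature maps are pooled and concatenated
    with the last hidden layer before the softmax classifier."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 32, 3, padding=1)
        self.act1 = GeneralizedReLU(32)
        self.conv2 = nn.Conv2d(32, 64, 3, padding=1)
        self.act2 = GeneralizedReLU(64)
        # The classifier sees both the early (32-d) and late (64-d) features,
        # giving the earlier layer a direct gradient path from the loss.
        self.fc = nn.Linear(32 + 64, num_classes)

    def forward(self, x):
        h1 = self.act1(self.conv1(x))
        h2 = self.act2(self.conv2(F.max_pool2d(h1, 2)))
        # Global average pooling collapses each feature map to one value.
        g1 = F.adaptive_avg_pool2d(h1, 1).flatten(1)
        g2 = F.adaptive_avg_pool2d(h2, 1).flatten(1)
        return self.fc(torch.cat([g1, g2], dim=1))  # logits for softmax/cross-entropy

Example use: logits = GlobalConnectedNet(num_classes=10)(torch.randn(8, 3, 32, 32)).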
dc.identifier.uri: https://doi.org/10.1016/j.patcog.2019.07.006
dc.identifier.uri: http://hdl.handle.net/10012/15277
dc.language.iso: en
dc.publisher: Elsevier
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: computer vision
dc.subject: deep learning
dc.subject: activation
dc.subject: convolutional neural network
dc.title: Global-connected network with generalized ReLU activation
dc.type: Article
dcterms.bibliographicCitation: Zhi Chen, Pin-Han Ho, Global-Connected Network With Generalized ReLU Activation, Pattern Recognition (2019), doi: https://doi.org/10.1016/j.patcog.2019.07.006
uws.contributor.affiliation1: Faculty of Engineering
uws.contributor.affiliation2: Electrical and Computer Engineering
uws.peerReviewStatus: Reviewed
uws.scholarLevel: Faculty
uws.scholarLevel: Post-Doctorate
uws.typeOfResource: Text

Files

Original bundle
Name: zhi_chen.pdf
Size: 938.97 KB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 4.47 KB
Format: Item-specific license agreed upon to submission