Global-connected network with generalized ReLU activation. (December 2019)
- Record Type:
- Journal Article
- Title:
- Global-connected network with generalized ReLU activation. (December 2019)
- Main Title:
- Global-connected network with generalized ReLU activation
- Authors:
- Chen, Zhi
Ho, Pin-Han
- Abstract:
- Highlights: This work presents a novel deep-connected CNN architecture with detailed analytical treatment and extensive experiments on several datasets. A new activation function is presented that can approximate arbitrary complex functions, with analysis of both the forward and backward passes. The experiments show the competitive performance of the designed network, which uses fewer parameters and a shallower architecture than other state-of-the-art models.
Abstract: Recent progress has shown that exploiting hidden-layer neurons in convolutional neural networks (CNNs), combined with a carefully designed activation function, can yield better classification results in computer vision. This paper first introduces a novel deep learning (DL) architecture that aims to mitigate the gradient-vanishing problem, in which earlier hidden-layer neurons can be connected directly to the last hidden layer and fed into the softmax layer for classification. We then design a generalized linear rectifier as the activation function, which can approximate arbitrary complex functions through training of its parameters. We show that our design achieves comparable performance on a number of object-recognition and video-action benchmark tasks, such as MNIST, CIFAR-10/100, SVHN, Fashion-MNIST, STL-10, and the UCF YouTube Action Video dataset, with significantly fewer parameters and a shallower network architecture, which is not only promising for training in terms of computation burden and memory usage, but is also applicable to low-computation, low-memory mobile scenarios for inference.
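The abstract describes a rectifier activation whose trainable parameters let it approximate a wider family of functions, with analysis of both the forward and backward passes. The paper's exact formulation is not given in this record; the following minimal NumPy sketch assumes a PReLU-style two-slope form (the parameter names `alpha` and `beta` and the piecewise-linear shape are assumptions, not the authors' definition) purely to illustrate how learnable slopes generalize the standard ReLU.

```python
import numpy as np

def generalized_relu(x, alpha=0.1, beta=1.0):
    """Hedged sketch of a parametric rectifier: beta * x for x > 0,
    alpha * x otherwise. In the paper's setting such slope parameters
    would be learned during training alongside the network weights."""
    return np.where(x > 0, beta * x, alpha * x)

def generalized_relu_grad(x, alpha=0.1, beta=1.0):
    """Gradient of the activation w.r.t. its input, as used in the
    backward pass: beta on the positive side, alpha on the negative."""
    return np.where(x > 0, beta, alpha)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(generalized_relu(x))       # slopes applied piecewise
print(generalized_relu_grad(x))  # piecewise-constant gradient
```

With `alpha = 0` this reduces to the standard ReLU, and with `alpha > 0` gradients also flow through negative pre-activations, which is one common way parametric rectifiers ease the gradient-vanishing problem the abstract mentions.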
- Is Part Of:
- Pattern recognition. Volume 96(2019:Dec.)
- Journal:
- Pattern recognition
- Issue:
- Volume 96(2019:Dec.)
- Issue Display:
- Volume 96 (2019)
- Year:
- 2019
- Volume:
- 96
- Issue Sort Value:
- 2019-0096-0000-0000
- Page Start:
- Page End:
- Publication Date:
- 2019-12
- Subjects:
- CNN -- Computer vision -- Deep learning -- Activation
Pattern perception -- Periodicals
Perception des structures -- Périodiques
Patroonherkenning
006.4
- Journal URLs:
- http://www.sciencedirect.com/science/journal/00313203
http://www.sciencedirect.com/
- DOI:
- 10.1016/j.patcog.2019.07.006
- Languages:
- English
- ISSNs:
- 0031-3203
- Deposit Type:
- Legal deposit
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library DSC - BLDSS-3PM
British Library HMNTS - ELD Digital store
- Ingest File:
- 11627.xml