Research on a New Convolutional Neural Network Model Combined With Random Edges Adding

Jin Zhang, Sen Tian, XuanYu Shu, Sheng Chen, LingYu Chen
Copyright: © 2021 | Pages: 10
DOI: 10.4018/IJDST.2021010105

Abstract

Improving the accuracy of convolutional neural network models and speeding up their convergence remain persistent and difficult problems. Based on the idea of the small-world network, a random edge-adding algorithm is proposed to improve the performance of the convolutional neural network model. The algorithm takes a convolutional neural network model as a baseline and randomly adds backward and cross-layer connections with probability p to form a new convolutional neural network model. The proposed idea optimizes cross-layer connectivity by changing the topological structure of the convolutional neural network and provides a new direction for improving the model. Simulation results on the Fashion-MNIST and CIFAR-10 datasets show that recognition accuracy and training convergence speed are greatly improved by models reconstructed with random edge adding at probability p = 0.1.
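
The abstract describes the reconstruction only at a high level. As a rough illustration, the sketch below (PyTorch, an assumption rather than the authors' code) builds a plain stack of convolutional blocks and then adds each candidate cross-layer connection independently with probability p, merging the skipped features into the receiving block through a 1x1 projection and elementwise sum; the block widths, the merge rule, and the projections are all illustrative choices, not details taken from the paper.

```python
# Minimal sketch of the random edge-adding idea: a plain CNN backbone whose
# cross-layer (skip) edges are sampled independently with probability p.
import random
import torch
import torch.nn as nn

class RandomEdgeCNN(nn.Module):
    def __init__(self, channels=(16, 32, 64, 128), p=0.1, seed=0):
        super().__init__()
        rng = random.Random(seed)
        self.blocks = nn.ModuleList()
        in_ch = 3
        for out_ch in channels:
            self.blocks.append(nn.Sequential(
                nn.Conv2d(in_ch, out_ch, 3, padding=1),
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True)))
            in_ch = out_ch
        # Each candidate cross-layer edge (block i -> block j, j > i + 1)
        # is added independently with probability p.
        self.edges = [(i, j) for i in range(len(channels))
                      for j in range(i + 2, len(channels))
                      if rng.random() < p]
        # 1x1 projections so skipped features match the target block's width.
        self.proj = nn.ModuleDict({
            f"{i}_{j}": nn.Conv2d(channels[i], channels[j], 1)
            for i, j in self.edges})

    def forward(self, x):
        outs = []
        for j, block in enumerate(self.blocks):
            x = block(x)
            # Merge any randomly added incoming edges by elementwise sum.
            for i, k in self.edges:
                if k == j:
                    x = x + self.proj[f"{i}_{j}"](outs[i])
            outs.append(x)
        return x

# Example: rebuild the backbone with p = 0.1 and run a dummy forward pass.
model = RandomEdgeCNN(p=0.1)
y = model(torch.randn(2, 3, 32, 32))
print(y.shape, model.edges)
```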
Article Preview

Introduction

Convolutional neural networks (CNNs) have become increasingly popular in various learning tasks, especially in visual computing applications. Their greatest advantage is representation learning: they can learn features from the data themselves, avoiding the difficulty of traditional manual feature extraction. The application of convolutional neural networks and the study of their model structure have therefore been active topics for researchers. In the era of big data, the amount and variety of data in the visual field keep increasing. On the one hand, this increases the computation time of the model; on the other hand, a simple neural network model struggles to capture the essential characteristics of the data. To reduce computation time, researchers can draw on parallel software frameworks or programming models to process the model in parallel (Jing, et al, 2017; Song, Wang, Li & Gao, 2016). To ensure that the model can better learn the characteristics of the data, model scale has also grown more complex, and training overhead has increased accordingly. As the depth and complexity of the model increase, a series of problems arise that make the model difficult to train, e.g., the vanishing gradient (Glorot & Bengio, 2010; Bengio, Simard & Frasconi, 1994). To address the gradient problem, researchers proposed the Highway network (Srivastava, Greff & Schmidhuber, 2015), which adds bypass connections to the CNN architecture to enhance the information flow between layers. However, this method is controlled by a data-dependent gating function, so it is harder to train. ResNet (He, Zhang, Ren & Sun, 2016) uses identity or skip connections to avoid the gating-parameter problem of the Highway network. However, the connection pattern used in this network follows the modular structure too strictly, and deep ResNets contain considerable redundancy. Researchers have since improved the structure based on ResNet (Targ, Almeida & Lyman, 2016; Xie, Girshick, Dollar, Tu & He, 2017) to further improve image classification accuracy. Szegedy et al. introduced the Inception structure into short connections (Szegedy, Ioffe, Vanhoucke & Alexander, 2017), which improved the model's convergence speed. The introduction of short connections in a model therefore greatly benefits its performance.
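
For context, the identity/skip connection mentioned above can be summarized in a few lines. The block below is a generic PyTorch sketch of a ResNet-style residual block, not the specific configuration used in any of the cited papers; the layer sizes and ordering are illustrative.

```python
# Generic residual block: the skip connection adds the input back to the
# block's output, so gradients can flow directly through the identity path.
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels))
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(self.body(x) + x)
```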

Convolutional neural networks simulate the structural characteristics of biological neural networks well in terms of network depth, but their performance is still far from the advanced capabilities of real biological neural networks. Neurological research shows that the brain's neural network has inherent random characteristics to a certain extent (Hu, Liao & Mao, 2003). The connectivity of the brain's network structure (Achard, 2006) belongs to a type of network proposed by Watts and Strogatz (Watts & Strogatz, 1998), the small-world network, which lies between the regular network and the random network. The small-world property improves the propagation efficiency of the network structure. A convolutional neural network, however, is a feedforward neural network with a deep structure, and its connection architecture is approximately regular. How to give the network structure random features and improve network efficiency in a convolutional neural network is therefore a key issue.
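
The small-world model referenced above (Watts & Strogatz, 1998) can be illustrated directly: a regular ring lattice is rewired with probability p, interpolating between a regular network (p = 0) and a random network (p = 1). The short sketch below uses networkx purely for illustration; the parameter values are arbitrary and not taken from the paper.

```python
# Watts-Strogatz small-world model: rewire a regular ring lattice with
# probability p and observe average path length and clustering.
import networkx as nx

for p in (0.0, 0.1, 1.0):
    # connected_watts_strogatz_graph retries until the rewired graph is connected.
    g = nx.connected_watts_strogatz_graph(n=100, k=4, p=p, seed=42)
    print(f"p={p:.1f}  "
          f"avg shortest path={nx.average_shortest_path_length(g):.2f}  "
          f"clustering={nx.average_clustering(g):.3f}")
```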
