Input Space Partitioning for Neural Network Learning

Shujuan Guo, Sheng-Uei Guan, Weifan Li, Ka Lok Man, Fei Liu, A. K. Qin
Copyright: © 2013 | Pages: 11
DOI: 10.4018/jaec.2013040105

Abstract

To improve the learning performance of neural networks (NNs), this paper introduces an NN ensemble method based on grouping input attributes. All input attributes are partitioned into exclusive groups according to the degree of inter-attribute promotion or correlation, which quantifies the supportive interactions between attributes. After partitioning, multiple NNs are trained, each taking one group of attributes as its input. The final classification result is obtained by integrating the outputs of all the NNs. Experimental results on several UCI datasets demonstrate the effectiveness of the proposed method.
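As a rough illustration of the pipeline the abstract outlines, the sketch below trains one NN per attribute group and integrates their outputs. The base learner (scikit-learn's MLPClassifier) and the integration rule (class-probability averaging) are assumptions made here for concreteness, not choices stated in this preview.

```python
# Minimal sketch of the grouped-attribute NN ensemble described in the abstract.
# Assumptions not fixed by the paper: scikit-learn's MLPClassifier as the base NN
# and class-probability averaging as the integration rule.
import numpy as np
from sklearn.neural_network import MLPClassifier

def train_grouped_ensemble(X, y, groups):
    """Train one NN per attribute group; `groups` is a list of column-index lists."""
    models = []
    for cols in groups:
        nn = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
        nn.fit(X[:, cols], y)
        models.append((cols, nn))
    return models

def predict_grouped_ensemble(models, X):
    """Integrate the per-group NNs by averaging their predicted class probabilities."""
    avg = np.mean([nn.predict_proba(X[:, cols]) for cols, nn in models], axis=0)
    classes = models[0][1].classes_  # identical class ordering across ensemble members
    return classes[avg.argmax(axis=1)]
```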

Terminology and Concepts

Let $r_{ij}$ denote the promotion rate of two attributes $i$ and $j$, which is defined by:

$$r_{ij} = \begin{cases} 1, & \text{if } E_{ij} < \min(E_i, E_j) \\ 0, & \text{otherwise} \end{cases} \quad (1)$$

where $E_i$ represents the classification error obtained by training with the single attribute $i$, and $E_{ij}$ represents the classification error obtained by training with the two attributes $i$ and $j$. When the promotion rate of two attributes is 1, these two attributes are considered mutually supportive for classification; otherwise, they are considered to interfere with each other.
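A minimal sketch of how the promotion rate in Eq. (1) could be computed is given below. The hold-out error estimate and scikit-learn's MLPClassifier are assumptions made for illustration; the paper only specifies that $E_i$ and $E_{ij}$ are classification errors obtained from training on one attribute and on the pair, respectively.

```python
# Hedged sketch of the promotion rate r_ij in Eq. (1). The base classifier and the
# hold-out error estimate are assumptions; the paper only defines E_i and E_ij as
# classification errors from training on one attribute and on the pair.
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def classification_error(X_cols, y):
    """Hold-out classification error of an NN trained on the given attribute columns."""
    X_tr, X_te, y_tr, y_te = train_test_split(X_cols, y, test_size=0.3, random_state=0)
    nn = MLPClassifier(hidden_layer_sizes=(8,), max_iter=1000, random_state=0)
    nn.fit(X_tr, y_tr)
    return 1.0 - nn.score(X_te, y_te)

def promotion_rate(X, y, i, j):
    """Return (r_ij, E_ij): r_ij = 1 when the pair beats both single attributes."""
    E_i = classification_error(X[:, [i]], y)
    E_j = classification_error(X[:, [j]], y)
    E_ij = classification_error(X[:, [i, j]], y)
    return (1 if E_ij < min(E_i, E_j) else 0), E_ij
```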

To take full advantage of inter-attribute promotions, we compute the average value of $E_{ij}$ over all pairs with $r_{ij} = 1$. Any pair of attributes whose classification error is less than this average value is considered to promote each other significantly. The smaller the corresponding classification error, the more significant the promotion.
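Continuing the sketch above (and reusing its promotion_rate helper), the following illustrates the averaging step: among pairs with $r_{ij} = 1$, those whose $E_{ij}$ falls below the mean are kept as significantly promoting.

```python
# Sketch of the significance test: average E_ij over all mutually supportive pairs
# (r_ij = 1) and keep the pairs whose error falls below that average. Reuses the
# promotion_rate helper from the previous sketch.
from itertools import combinations

def significant_pairs(X, y):
    promoted = {}
    for i, j in combinations(range(X.shape[1]), 2):
        r, e_ij = promotion_rate(X, y, i, j)
        if r == 1:
            promoted[(i, j)] = e_ij
    if not promoted:
        return []
    threshold = sum(promoted.values()) / len(promoted)
    return [pair for pair, e in promoted.items() if e < threshold]
```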

In statistics, correlation measures the strength and direction of the linear relationship between two random variables. There are many ways of calculating correlation; this paper employs Pearson's correlation coefficient (Sedgwick, 2012).
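For completeness, a short helper for Pearson's correlation coefficient between two attribute columns, using NumPy's corrcoef.

```python
# Pearson's correlation coefficient between two attribute columns.
# np.corrcoef returns the 2x2 correlation matrix; the off-diagonal entry is r.
import numpy as np

def pearson_correlation(x, y):
    return float(np.corrcoef(x, y)[0, 1])
```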
