Development of a New Means to Improve the Performance of Self-Organizing Maps

Vijaya Prabhagar Murugesan, Punniyamoorthy M.
Copyright: © 2022 | Pages: 16
DOI: 10.4018/IJDA.307065

Abstract

In machine learning, the self-organizing map (SOM) plays a significant role in finding hidden patterns or intrinsic structures in data. In this study, a new modified expression is derived to update the radius of the neighbourhood of the best-matching unit (BMU) in the SOM. Further, a new approach is introduced to identify the nodes eligible for an update in the SOM. We have also incorporated previous work, such as new initialization algorithms to find the initial weight vectors, a method to place the weight vectors in each node of the grid, and a method to identify the number of clusters, to improve the performance of the proposed SOM algorithm. The performance of the proposed SOM in terms of quantization error (QE), convergence time (CT), and modified semantic relevance index (MSRI) is compared with that of the conventional SOM on both class-labelled and unlabelled datasets. In addition to the above measures, classification accuracy (CA) is used to evaluate performance on the class-labelled datasets. The proposed SOM algorithm shows better performance in all cases.
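The quantization error mentioned above is conventionally computed as the mean distance between each input sample and the weight vector of its best-matching unit. The paper's exact formulation is not reproduced in this preview; the following is a minimal generic sketch, with all names illustrative:

```python
import numpy as np

def quantization_error(data, weights):
    """Mean Euclidean distance from each sample to the weight
    vector of its best-matching unit (BMU).

    data    : (n_samples, dim) array of input vectors
    weights : (grid_h, grid_w, dim) array of SOM node weights
    """
    # Flatten the grid so every node is one candidate prototype.
    flat = weights.reshape(-1, weights.shape[-1])
    # Pairwise distances: (n_samples, n_nodes).
    d = np.linalg.norm(data[:, None, :] - flat[None, :, :], axis=-1)
    # For each sample, keep only the distance to its BMU, then average.
    return d.min(axis=1).mean()
```

A lower QE indicates that the trained map's prototypes sit closer to the data, which is why the paper uses it (alongside CT, MSRI, and CA) as a quality measure.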

1. Introduction

In data clustering applications, SOM is one of the most popular unsupervised learning algorithms in machine learning. The discretized representation of the input space of the training samples in the SOM algorithm is called a map. SOM differs from other artificial neural networks in that it uses a neighbourhood function to preserve the topological properties of the input space. Teuvo Kohonen introduced the SOM algorithm in the 1980s, which is why it is also called a Kohonen map. It consists of a grid of nodes, where each node is called a neuron. Each node in the grid is associated with a weight vector, which is a position in the input space (i.e., it has the same dimension as each input vector). During the SOM process, the weight vectors move towards the input data without disturbing the map's topology. Consequently, the SOM describes a mapping from a higher-dimensional input space to a lower-dimensional map space. Once trained, the map can classify an input vector by finding the node whose weight vector is closest to it (Apostolakis 2010; Han et al. 2012; Shieh and Liao 2012; Bhatia 2019).
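The training process described above — find the best-matching unit, then pull the weight vectors of nodes within a shrinking neighbourhood towards the input — can be sketched as a generic Kohonen SOM. Note that this is a conventional baseline, not the modified radius-update expression the paper proposes; the decay schedules and parameter names are illustrative assumptions:

```python
import numpy as np

def train_som(data, grid_h=4, grid_w=4, epochs=20,
              lr0=0.5, radius0=2.0, seed=0):
    """Conventional SOM training loop.

    Each grid node holds a weight vector with the same dimension as
    the input; updates are damped by a Gaussian neighbourhood
    centred on the best-matching unit (BMU).
    """
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    weights = rng.random((grid_h, grid_w, dim))
    # Grid coordinates of every node, used for neighbourhood distances.
    coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w),
                                  indexing="ij"), axis=-1)
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)          # decaying learning rate
        radius = radius0 * np.exp(-t / epochs)  # shrinking neighbourhood
        for x in data:
            # BMU: the node whose weight vector is closest to the input.
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Gaussian neighbourhood on the grid, centred on the BMU.
            grid_d2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
            h = np.exp(-grid_d2 / (2 * radius ** 2))
            # Move every node's weights towards x, scaled by h.
            weights += lr * h[..., None] * (x - weights)
    return weights
```

Because the neighbourhood radius shrinks over time, early epochs order the map globally while later epochs fine-tune individual nodes — the property the paper targets with its modified radius-update expression.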

SOM finds a wide range of applications in various domains, including speech recognition, image data compression, robot control, pattern recognition, and medical diagnosis (Murtagh 1995; Chen et al. 2000; Lapidot et al. 2002; Neagoe and Ropot 2002; Ressom et al. 2003, 2015; Chow and Rahman 2007; Petrilis and Halatsis 2008; Santos et al. 2008; Pölzlbauer et al. 2010; Shieh and Liao 2012). In the medical field, cardiac disease is diagnosed through a classification method based on a hybrid neural fuzzy-logic system with a self-organizing map (Lee et al. 2020). In the banking sector, bank performance is assessed through an integrated multi-criteria decision-making model combined with a self-organizing map (Ozcalici and Bumin 2020). Wang et al. (2020) proposed a new ensemble model to evaluate class-imbalanced credit risk, integrating multiple sampling, a multiple-kernel fuzzy self-organizing map, and a local accuracy ensemble.

Appiah et al. (2012) presented a tri-state self-organizing map (bSOM) model, which takes a binary input vector and maintains tri-state weights. The bSOM model was designed and implemented on field-programmable gate arrays (FPGAs), achieving very high training and execution speeds, and is easily integrated into a larger on-chip system. It has been applied to two tasks: hand-written character recognition and moving-object identification (Appiah et al. 2012). Arous et al. (2010) developed evolutionary learning of a SOM for speech signals. Evolutionary learning requires a genetic representation (chromosome), a selection mechanism, genetic operators (such as crossover and mutation), and a fitness function; their study introduces an organization measure for a SOM, which is used as the fitness value by an evolutionary algorithm such as a GA (Arous and Ellouze 2010). Ayadi et al. (2012) proposed a growing variant of SOM called MIGSOM, Multilevel Interior Growing SOMs, for high-dimensional data clustering, to overcome the limits of the classical SOM's predefined structure (Ayadi et al. 2012). Chi et al. (2006) proposed and evaluated a two-stage clustering method that combines an ant-based SOM and K-means. Their ant-based SOM clustering model, ABSOM, embeds the exploitation and exploration rules of state transition into the conventional SOM algorithm to avoid falling into local minima (Chi and Yang 2006).
