Proficient Normalised Fuzzy K-Means With Initial Centroids Methodology

Deepali Virmani, Nikita Jain, Ketan Parikh, Shefali Upadhyaya, Abhishek Srivastav
Copyright © 2018 | Pages: 18
DOI: 10.4018/IJKDB.2018010104

Abstract

This article describes how data becomes relevant when it can be organized, linked with other data, and grouped into clusters. Clustering is the process of organizing a given set of objects into disjoint groups called clusters. A number of clustering algorithms exist, such as k-means, k-medoids, and normalized k-means, so the focus falls on the efficiency and accuracy of these algorithms, as well as on the time clustering takes and on reducing overlap between clusters. K-means is one of the simplest unsupervised learning algorithms that solves the well-known clustering problem. The k-means algorithm partitions data into K clusters, but its centroids are chosen at random, and its reliance on numeric values prevents it from clustering real-world data containing categorical attributes. Poor selection of initial centroids can result in poor clustering. This article proposes a variant of k-means that achieves better clustering, reduced overlap, and shorter clustering time by selecting the initial centres deliberately and normalizing the data.
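The two modifications named in the abstract, normalizing the data and choosing initial centroids deliberately rather than at random, can be illustrated with a minimal sketch. The specific normalization (min-max scaling) and seeding heuristic (picking k points at evenly spaced ranks when sorted by distance from the origin) below are assumptions for illustration; the paper's exact procedure may differ.

```python
import numpy as np

def normalize(data):
    # Min-max normalisation: scale every feature into [0, 1] so that
    # no single attribute dominates the Euclidean distance.
    mins = data.min(axis=0)
    spans = data.max(axis=0) - mins
    spans[spans == 0] = 1  # avoid division by zero for constant features
    return (data - mins) / spans

def initial_centroids(data, k):
    # Deterministic seeding (assumed heuristic): sort points by distance
    # from the origin of the normalised space and pick k points at evenly
    # spaced ranks, spreading the seeds across the data instead of
    # choosing them at random.
    order = np.argsort(np.linalg.norm(data, axis=1))
    idx = order[np.linspace(0, len(data) - 1, k, dtype=int)]
    return data[idx].copy()

def kmeans(data, k, iters=100):
    data = normalize(data)
    centroids = initial_centroids(data, k)
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        dists = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each centroid as the mean of its assigned points;
        # keep the old centroid if a cluster becomes empty.
        new = np.array([data[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids
```

Because the seeds are spread across the normalised data rather than drawn at random, repeated runs give the same partition, which is one way the sensitivity to poor initial centroids can be reduced.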

Literature Review

This section compares various clustering algorithms, for instance the k-means, k-medoids, normalized k-means, and fuzzy k-means algorithms, and discusses their advantages and limitations. A new algorithm is proposed in the following section.
