Load Balancing Approaches in Cloud and Fog Computing Environments: A Framework, Classification, and Systematic Review

Hiba Shakeel, Mahfooz Alam
Copyright: © 2022 | Pages: 24
DOI: 10.4018/IJCAC.311503

Abstract

Cloud and fog computing are modern technologies that handle multiple dynamic user requests. The cloud provides demand-based services to users over the internet on a pay-as-you-go basis, while fog handles real-time requests received from smart devices. Millions of requests arrive at the cloud-fog layer, often leading to overloaded virtual machines (VMs). Load balancing (LB) is an important issue for cloud-fog systems and has been proven NP-hard. It is essential because it distributes the load equally among VMs to properly utilize resources and improve quality of service (QoS). This paper therefore presents a complete classification of LB algorithms together with a comprehensive study of heuristic, meta-heuristic, and hybrid approaches in cloud and fog computing environments. The main goal of this paper is to highlight the importance of LB in overcoming the challenges of these systems. The study reviews papers from the last seven years and discusses them systematically using various tables and pie charts. Finally, the paper concludes with research gaps and future insights.

1. Introduction

Recent computing technologies such as cloud, fog, the internet of things (IoT), and the internet of everything (IoE) have helped us deal with a myriad of complex real-world problems. Often, they are combined for improved results. Technologies are advancing, and the landscape of solvable problems is widening; no domain in today's world is devoid of computing services. Cloud and fog computing have pushed technological advancement to a whole new level and have therefore captured the attention of many researchers (Bittencourt et al., 2018).

Cloud computing (CC) is an environment that provides on-demand acquisition of system resources on a pay-as-you-use basis, typically for computation, data storage, and data management. Users utilize these resources remotely without actually owning them. The prime benefit over an on-premises environment is the lower cost and time for availing services such as servers, memory elements, databases, security, and software. CC saves cloud users and organizations the additional expense of setting up local servers, purchasing hardware and software, storing and managing data, supplying electricity, and managing the human resources needed to take care of everything; all of these are handled by remote servers over the internet (Alam & Khan, 2017; Alam et al., 2021). Some of the other advantageous features of the cloud are as follows:

  • Scalability - The ability of the cloud to handle website traffic by distributing and redirecting user requests across various servers and network devices.

  • Rapid elasticity - Ensures that resources are available to clients anytime, anywhere, and in any quantity. Resources are scaled up and down according to the client's needs without compromising quality and without the client being aware of it.

  • Flexibility - Facilitates shifting of load from a failed node to other active nodes.

  • Self-healing - In case of application failure, the cloud provides one of multiple updated copies of the data, saving the application from crashing.

A closely related concept to CC is fog computing (FC), which emerged after the cloud. The concept behind the metaphors is that clouds float high in the sky while fog lies closer to Earth. Similarly, FC is a flexible, decentralized architecture in which storage, management, and computation occur somewhere between the cloud and the data source (end devices and sensors). Fog brings analytics closer to where data is generated, thereby reducing latency, bandwidth consumption, and network traffic while improving security and QoS. Any device that can compute, store data, and connect to a network can act as a fog node. Fog nodes process requests without the intervention of a third party, and dispersed fog nodes communicate with one another to improve services. Some of the advantageous features of fog are location awareness and low latency, mobility support, handling of real-time requests, geo-distributed nodes, and linkage to the cloud (Bittencourt et al., 2018; Beraldi et al., 2020). The combined cloud-fog architecture handles the tremendous number of requests reaching it: fog addresses real-time requests and forwards the rest to the cloud for further processing and storage. IoT (smart) devices rely on fog technology, and as their number increases at a surprising rate, a massive amount of data reaches the network and faces various threats. Important issues faced by the cloud-fog paradigm include security, performance, data leakage, resource utilization, cost, network management, load balancing, and many more.
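To make the load balancing problem discussed above concrete, the sketch below shows one of the simplest heuristic approaches surveyed in the literature: greedily assigning each incoming task to the currently least-loaded VM. This is an illustrative example only, not an algorithm taken from the paper; the task sizes, VM count, and function name are hypothetical.

```python
# Illustrative sketch of a greedy least-loaded heuristic for LB.
# Task sizes and the number of VMs are hypothetical inputs.

def assign_tasks(task_sizes, num_vms):
    """Assign each task to the VM with the smallest current load.

    Returns (loads, placement): the final load on each VM and the
    index of the VM chosen for each task, in arrival order.
    """
    loads = [0] * num_vms
    placement = []
    for size in task_sizes:
        # Pick the least-loaded VM (ties broken by lowest index).
        vm = min(range(num_vms), key=lambda i: loads[i])
        loads[vm] += size
        placement.append(vm)
    return loads, placement

if __name__ == "__main__":
    loads, placement = assign_tasks([5, 3, 8, 2, 7, 4], num_vms=3)
    print("per-VM loads:", loads)
    print("task placement:", placement)
```

Heuristics of this kind are fast but offer no optimality guarantee, which is why the meta-heuristic and hybrid approaches reviewed later in the paper are often preferred for larger problem instances.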
