Load and Cost-Aware Min-Min Workflow Scheduling Algorithm for Heterogeneous Resources in Fog, Cloud, and Edge Scenarios

Jyoti Bisht, Venkata Subrahmanyam Vampugani
Copyright: © 2022 | Pages: 20
DOI: 10.4018/IJCAC.2022010105

Abstract

Fog computing and edge computing are among the latest technologies offered as solutions to the challenges faced in cloud computing. Instead of offloading all tasks to centralized cloud servers, some tasks can be scheduled at intermediate fog servers or edge devices. Although this resolves many of the problems faced in the cloud, it also encounters traditional resource-related problems such as load balancing and scheduling. To address task scheduling and load balancing in cloud-fog-edge collaboration among servers, we propose an improved version of the min-min algorithm for workflow scheduling that considers cost, makespan, energy, and load balancing in a heterogeneous environment. The algorithm is implemented and tested in different offloading scenarios: cloud only, fog only, cloud-fog, and cloud-fog-edge collaboration. The proposed approach performs better, yielding minimum makespan, lower energy consumption, and load balancing, along with marginally lower cost, when compared with the min-min and ELBMM algorithms.
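
For readers unfamiliar with the baseline heuristic, the sketch below shows the classical min-min scheduling loop that the proposed algorithm extends: at each step, the task with the smallest expected completion time across all resources is assigned to the resource yielding that minimum. The ETC matrix, function name, and example sizes are illustrative assumptions only; they are not the authors' exact formulation, which additionally weighs cost, energy, and load balance.

```python
# Minimal sketch of the classical min-min heuristic (baseline, not the
# paper's improved algorithm). Task/resource counts and the ETC matrix
# below are assumed example values.

def min_min_schedule(etc, num_resources):
    """etc[t][r] = expected time to compute task t on resource r."""
    ready_time = [0.0] * num_resources      # when each resource becomes free
    unscheduled = set(range(len(etc)))
    schedule = {}                            # task -> (resource, finish time)

    while unscheduled:
        best = None                          # (finish_time, task, resource)
        for t in unscheduled:
            # completion time of task t on its fastest-finishing resource
            r, ct = min(
                ((r, ready_time[r] + etc[t][r]) for r in range(num_resources)),
                key=lambda x: x[1],
            )
            if best is None or ct < best[0]:
                best = (ct, t, r)
        ct, t, r = best
        schedule[t] = (r, ct)                # assign the min-min task
        ready_time[r] = ct                   # update the chosen resource's load
        unscheduled.remove(t)
    return schedule, max(ready_time)         # schedule and resulting makespan


if __name__ == "__main__":
    # 4 tasks x 3 heterogeneous resources (e.g. cloud, fog, edge)
    etc = [
        [4.0, 6.0, 9.0],
        [2.0, 3.0, 5.0],
        [8.0, 7.0, 12.0],
        [3.0, 5.0, 6.0],
    ]
    sched, makespan = min_min_schedule(etc, 3)
    print(sched, "makespan:", makespan)
```

Because min-min repeatedly favours the fastest resource, it tends to overload it; the paper's variant mitigates this by also accounting for load, cost, and energy when breaking ties.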

Introduction

Cloud computing is a technology that offers computation services over the Internet on a pay-per-use basis. It offers various types of resources, such as storage, compute, and networking, which can be dynamically provisioned according to user demand. It also frees users from the burden of managing, updating, and upgrading resources, and from various other technical issues. The initial cost and time required to set up a working environment are greatly reduced at the user's end. Although this technology offers multiple benefits, it also has some shortcomings, such as increasing response time, growing demand for network bandwidth, user security, heavily loaded networks, and energy requirements. Gou et al. (2016) presented various security challenges in the cloud environment. The network is also being flooded with data as the number of IoT devices increases. Since this technology is not new, intensive research has been done and is still ongoing to overcome these issues, but many of the problems persist because of the cloud's inherent architecture.

In the cloud-only scenario, many researchers are working to overcome problems associated with scheduling, migration, and serverless computing. Dad et al. (2020) proposed VM scheduling techniques to minimize energy consumption in the cloud. Jeba et al. (2019) also proposed a VM migration technique in the cloud for reducing power consumption. Soltani et al. (2020) proposed a migration-based technique to support long-running functions in serverless computing in the cloud. Li et al. (2018) proposed cooperative computation for two-party computation using social conformity.

Fog computing and edge computing are newer technologies offered as solutions to some of the challenges in the cloud. These technologies do not exist independently; they are added to the traditional cloud computing environment to solve these problems and improve services. They provide improved services because they overcome the architectural limitations of the cloud. Both technologies try to offload the workload closer to the location where data is generated in order to reduce latency. Adding them to the traditional cloud environment can provide multiple benefits, along with support for real-time applications that require strict deadlines to be met. Sarkar et al. (2015) and Bonomi et al. (2012) presented various application scenarios of fog computing and its suitability for use with IoT devices. Jalali et al. (2016) suggested that the use of fog servers may reduce energy consumption in cloud computing. Al-Qerem et al. (2019) proposed a technique to reduce communication delay in the fog-cloud environment using partial validation at the fog layer.

Fog computing has a three-layer architecture. The topmost layer contains the cloud servers, which hold a huge collection of resources and can serve a large number of users. The middle layer contains fog servers, which provide features similar to those of cloud servers but are limited in resources. The bottom-most layer is the end-device layer, where the data is actually generated. In the fog computing or fog-cloud collaboration scenario, data processing or analytics can be done collectively at the cloud (top) or fog (middle) layer. Edge computing has a similar architecture with one difference: data processing or analytics can also be done at the end-device layer where the data is generated, without being offloaded; hence the name edge computing. In the cloud-fog-edge scenario, computations can be done in all three layers. Figure 1 represents the different offloading scenarios where computations can be performed, and a small modelling sketch is given below.
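
As a rough illustration of the layered offloading scenarios described above, the sketch below models cloud, fog, and edge nodes and restricts the candidate placement layers per scenario. The node names, capacities, and costs are assumed values for illustration only and are not taken from the paper.

```python
# Illustrative model of the three offloading layers; all concrete values
# (MIPS, cost per second, node names) are assumptions for the example.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    layer: str          # "cloud", "fog", or "edge"
    mips: float         # processing capacity
    cost_per_sec: float # usage cost

nodes = [
    Node("cloud-vm-1",   "cloud", mips=4000, cost_per_sec=0.020),
    Node("fog-server-1", "fog",   mips=1500, cost_per_sec=0.008),
    Node("edge-dev-1",   "edge",  mips=400,  cost_per_sec=0.001),
]

def candidates(scenario):
    """Return the nodes allowed by a given offloading scenario."""
    allowed = {
        "cloud-only":     {"cloud"},
        "fog-only":       {"fog"},
        "cloud-fog":      {"cloud", "fog"},
        "cloud-fog-edge": {"cloud", "fog", "edge"},
    }[scenario]
    return [n for n in nodes if n.layer in allowed]

print([n.name for n in candidates("cloud-fog")])   # ['cloud-vm-1', 'fog-server-1']
```

Restricting the candidate set in this way is how the four evaluated scenarios (cloud only, fog only, cloud-fog, and cloud-fog-edge) could be simulated with the same scheduling loop.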
