Towards Efficient Resource Management in Fog Computing: A Survey and Future Directions
M. Sudhakara, K Dinesh Kumar, Ravi Kumar Poluru, R Lokesh Kumar, S Bharath Bhushan
Copyright: © 2020 | Pages: 25
DOI: 10.4018/978-1-7998-0194-8.ch010

Abstract

Cloud computing is an emerging field; despite its key features and natural advantages, it has faced several difficulties in recent years. In latency-sensitive applications (e.g., disaster management), the gap between the cloud and the end devices must be reduced. Fog computing is currently an advanced mechanism for reducing latency and congestion in IoT networks. It emphasizes processing data as close as possible to the edge of the network, using a large number of fog nodes, instead of sending and receiving all data through the data centre. The virtualization of these fog nodes (i.e., the nodes are invisible to users) across numerous locations has made fog computing increasingly popular. End users purchase computing resources from the cloud providers to process their excess workload. Since computing resources are heterogeneous, resource-constrained, and dynamic in nature, allocating these resources to users is an open research issue and must be addressed as a first priority.

Introduction

The Internet of Things (IoT) is now a major trend in technology; by 2025, the number of connected devices is expected to grow to an estimated 26 billion installed gadgets. IoT deployments create huge repositories of data that must be processed and analysed in real time. This increases the workloads on data centres, requiring a wide range of computing resources and confronting providers with new challenges. To meet these challenges and deliver computing services, several large-scale data centres, or clouds, have been deployed. Cloud data centres pool shareable computing resources (such as servers, storage, applications, services, and networks), which users can access easily according to their requirements. However, clouds are commonly built in remote locations far away from users, which leads to higher transmission cost and congestion on the network; it is also unsuitable for applications that need real-time collaboration in IoT settings (IDC, Worldwide Internet of Things Forecast 2015-2019).

Fog computing in IoT was introduced by Cisco as a promising solution to these challenges. In this paradigm, only the relevant information is forwarded to the data centres, while computing, decision making, and related tasks are performed by various low-power computing devices. These devices, known as Fog Nodes (FNs), are deployed at the network edge to offload data-computing services from the cloud; fog computing thus yields lower latency and faster response. Network virtualization is employed in fog computing, so fog nodes are not visible to users, who simply buy computing resources from the cloud data centres to process their excess workloads. Users send requests to the data centres, and each data centre can gather computing resources from its pool of FNs. Notably, every cloud data centre allocates different computing resources from multiple FNs to multiple users, so that the computing resources are used efficiently (B. P. Rimal et al., 2014).
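As a rough illustration of why edge-side processing reduces latency, the sketch below compares the round-trip latency of processing a payload in a distant cloud data centre versus a nearby fog node. All numbers (payload size, bandwidths, delays) are entirely hypothetical and chosen only to make the trade-off concrete: the fog node has a slower processor but sits on a faster, shorter link.

```python
# Illustrative (hypothetical) latency model: uplink transfer time plus
# two-way propagation delay plus processing time.

def round_trip_latency_ms(payload_kb, bandwidth_kbps, propagation_ms, processing_ms):
    """Total latency in ms = transfer time + propagation (both ways) + processing."""
    transfer_ms = payload_kb * 8 / bandwidth_kbps * 1000  # KB -> kilobits -> ms
    return transfer_ms + 2 * propagation_ms + processing_ms

# Cloud: distant data centre, long propagation delay, congested WAN link.
cloud = round_trip_latency_ms(payload_kb=500, bandwidth_kbps=10_000,
                              propagation_ms=50, processing_ms=5)

# Fog: node at the network edge, short propagation delay, fast local link,
# but a weaker processor (higher processing time).
fog = round_trip_latency_ms(payload_kb=500, bandwidth_kbps=50_000,
                            propagation_ms=2, processing_ms=20)

print(f"cloud: {cloud:.0f} ms, fog: {fog:.0f} ms")
```

Under these assumed numbers the fog node wins despite its slower processor, because transfer and propagation dominate the total; this is the core argument the chapter makes for placing computation near the edge.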

Fog computing thus offers an alternative to conventional cloud computing that minimizes data-centre workloads while supporting locally distributed, low-latency, QoS-aware IoT applications. As mentioned earlier, fog computing was proposed by Cisco to extend cloud computing to the edge of the network. In simple terms, the 'fog' is 'a cloud somewhat closer to the ground', shifting from core to edge computing to enable better services and applications. Fog computing is an essential platform that provides computing, storage, and networking services between the end users (EU) and the data centres (DC) of conventional cloud computing (L. M. Vaquero et al., 2014).

In general, data created by user devices, such as smartphones and other wearables in a smart city, is transferred to clouds for processing and storage. For real-time applications, however, this approach will become ineffective, because communication latencies grow as huge numbers of devices connect to the Internet (M. Aazam et al., 2016). Rising communication latencies adversely affect applications and degrade the overall Quality of Service and Quality of Experience. An alternative form of computing that mitigates this issue is to place computing resources near the user devices and sensors and use them to process the data locally. This reduces the volume of data sent to the cloud and consequently lowers communication latencies. To realize this kind of computing, the current research idea is to move some of the computing resources available at the many data centres closer to the network edge, nearer to the users and sensors. Computing that uses resources close to the network edge is known as "edge computing". In contrast to cloud resources, edge resources are:

  1. Resource Constrained: Computing resources are limited, since edge devices contain small processors and operate under a constrained power budget.

  2. Heterogeneous: The processors come from varied systems and architectures.

Hence, resource management is considered one of the major challenges in both fog and edge computing.
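To make the resource-management challenge concrete, the following minimal sketch assigns user workload requests to heterogeneous, resource-constrained fog nodes using a first-fit strategy, falling back to the remote cloud when no node has spare capacity. The capacities, request sizes (in abstract "compute units"), and the first-fit policy itself are hypothetical illustrations, not the allocation scheme of any surveyed work.

```python
# First-fit allocation of user requests to fog nodes with limited,
# heterogeneous capacities. Requests that fit nowhere are offloaded
# to the cloud (represented as None).

def allocate(requests, capacities):
    """Assign each request to the first fog node with enough spare capacity.

    Returns {request_index: node_index or None}."""
    free = list(capacities)          # remaining capacity per fog node
    placement = {}
    for i, demand in enumerate(requests):
        for j, cap in enumerate(free):
            if demand <= cap:
                free[j] -= demand    # reserve capacity on this node
                placement[i] = j
                break
        else:
            placement[i] = None      # no node fits: offload to the cloud
    return placement

print(allocate([3, 4, 2, 6], [4, 4, 4]))
```

Even this toy version exposes the tensions the chapter highlights: capacities differ per node (heterogeneity), demands can exceed every node (resource constraints), and a smarter policy (best-fit, latency-aware, or dynamic re-allocation) could place more work at the edge, which is exactly the open research space the survey goes on to discuss.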
