An Efficient Markov Chain Model Development based Prefetching in Location-Based Services

Ajay Kumar Gupta, Udai Shanker
Copyright: © 2021 |Pages: 17
DOI: 10.4018/978-1-7998-7756-1.ch005

Abstract

A significant issue for current location-based services is storing information on the network so that users can access data items quickly. One way to do this is to store data items that have a high likelihood of being requested next. This strategy, known as proactive caching or prefetching, is a technique in which selected information is cached before it is actually needed. However, past proactive caching strategies incurred high overhead in terms of computing costs. Using a Markov chain model, this work therefore addresses these problems with an efficient strategy for predicting a user's future position. To evaluate the feasibility of accessing information on the network for location-based applications, the proposed system is modeled in this chapter with a client-server queuing model. The observational findings indicate substantial improvements in caching efficiency over previous caching policies that did not use a prefetch module.

Introduction

In the modern age, access to information is quite easy (Ben et al., 2017; Gupta & Shanker, 2020a, 2020d). Users can quickly access data stored as videos, images, and sounds, enabling internet service providers to serve users better. Proactive caching is a well-known performance-optimization methodology that significantly improves data look-ups. Past research claimed that latency (Cao, 2003) could be greatly decreased by embedding proactive caching, since it lets the consumer access data easily without requesting it from the server and reduces unnecessary overhead. Proactive caching (Darwish et al., 2018) anticipates users' mobility trends as they travel from one place to another in order to decide which data objects should be placed on which cache node, depending on the frequency of content demanded by users. Traditional prefetching strategies address only processor latency; the aim of applying prefetching to mobile devices, however, is to minimize both processing time and communication time (Gupta & Shanker, 2020b, 2020c). The prefetching problem statement is thus to propose a new approach that reinforces prefetch processing on mobile device systems with limited network communication (Gupta & Prakash, 2018). With the support of machine learning techniques, such an approach improves customer satisfaction and optimizes cache access. Recent studies report prefetching to be a hot subject for research on computer-based web services. An informed mobile prefetching (IMP) architecture was designed in (Patterson et al., 1995) to prefetch data using a least recently used (LRU) cache, employing cost-benefit analysis to manage the allocation of the disk buffer among competing users.
Previous prefetching research was built specifically for desktop environments and was agnostic to the resource limitations imposed by the CPU, memory, and battery power of mobile devices (Gupta & Shanker, 2018a). As the main aim of prefetching was to minimize access latency, all of the prefetching logic could historically operate on the end computer. For smartphones and tablets, however, the prefetching approach must also minimize the consumption of system resources. Therefore, this chapter presents a prefetching process appropriate for content caching in the mobile world. The chapter's key contribution is a model that decides which data objects to prefetch using the Markov model. Using the user's movement trajectory, next-location prediction through the mobility Markov chain model achieves better accuracy and lower error than previous forecasting methods used in caching.
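To make the idea concrete, the mobility Markov chain described above can be sketched as a first-order model over discretized locations: transition counts are learned from the user's movement trajectory, and the most probable next cells become prefetch candidates. The class and cell names below are illustrative assumptions, not the chapter's actual implementation.

```python
from collections import defaultdict


class MobilityMarkovChain:
    """First-order Markov chain over discretized locations (cells)."""

    def __init__(self):
        # counts[a][b] = number of observed transitions a -> b
        self.counts = defaultdict(lambda: defaultdict(int))

    def train(self, trajectory):
        """Update transition counts from a sequence of visited cells."""
        for a, b in zip(trajectory, trajectory[1:]):
            self.counts[a][b] += 1

    def transition_prob(self, a, b):
        """Maximum-likelihood estimate of P(next = b | current = a)."""
        total = sum(self.counts[a].values())
        return self.counts[a][b] / total if total else 0.0

    def predict_next(self, current, k=1):
        """Return the k most likely next cells, i.e. prefetch candidates."""
        nxt = self.counts[current]
        return sorted(nxt, key=nxt.get, reverse=True)[:k]


# Hypothetical trajectory of one user over symbolic cells
mmc = MobilityMarkovChain()
mmc.train(["home", "cafe", "office", "cafe", "office",
           "home", "cafe", "office"])
print(mmc.predict_next("cafe"))             # -> ['office']
print(mmc.transition_prob("cafe", "office"))  # -> 1.0
```

A caching layer would then prefetch the data objects associated with the predicted cells before the user arrives, trading a small amount of bandwidth for lower access latency.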
