A Novel Architectural Model for Dynamic Updating and Verification of Data Storage in Cloud Environment

Dharmendra Singh Rajput, Praveen Kumar Reddy M., Ramasubbareddy Somula, Bharath Bhushan S., Ravi Kumar Poluru
Copyright: © 2021 |Pages: 9
DOI: 10.4018/IJGHPC.2021100105

Abstract

Cloud computing is a rapidly emerging computing model in the IT industry. With the rapid growth of the technology, many clients want to store multiple copies of the same data in multiple data centers. Clients outsource their data to cloud service providers and enjoy a high quality of service. Cloud service providers (CSPs) charge extra for storing multiple copies, so they must give a firm guarantee that all the copies are actually stored. This paper proposes a new system model for storing and verifying multiple copies; the model identifies corrupted copies transparently to the clients. It also supports dynamic data control in the cloud with optimal results.

Introduction

Cloud computing (CC) is a significant information technology (IT) shift and a new model of computing over shared resources such as bandwidth, storage, servers, processing power, services, and applications (Reddy & Babu, 2018). Today, this paradigm has become popular and is receiving considerable attention from researchers in academia and industry. A recent survey indicates that more than 79% of organizations attempt to outsource their data because it relieves the burden of maintenance cost as well as the overhead of storing data locally. Moreover, clients can access their data from anywhere at any time instead of using dedicated machines. When clients store data in the cloud, their main concern is whether the cloud can maintain data integrity and whether the data can be recovered after a server failure or data loss. It is beneficial for private organizations to outsource data to remote cloud service providers (CSPs), as this allows them to store more data than their own computer systems could hold. It also lets organizations concentrate on innovation and frees them from tedious tasks such as constant server updates and other computing chores, while making the stored data accessible to users in various parts of the world. However, the data outsourced to a CSP is usually not fully trusted, so the security and privacy of user data are in question. This lack of control over sensitive data raises concerns about the integrity and confidentiality guarantees of the CSP. Such risks can be mitigated by encrypting important data before outsourcing it to remote servers. One of the most important demands of customers is strong evidence that the cloud server still possesses their data, with the content neither tampered with nor deleted over time.
To minimize storage, a cloud service provider (CSP) may be tempted to discard data or data copies that are rarely used. Data availability in the cloud is achieved through replication: replicated data is stored on different servers located in different parts of the world. Clients must regularly check that the CSP maintains multiple copies of the same data on different data servers, because a CSP could cheat by storing only a single copy. The client's central task is to verify that the data is replicated and that a corrupted or deleted file can be recovered; if it can, the data is safe, since it can be restored from another replica. Outsourcing huge files to data centers imposes another constraint: to validate stored data, the client should not have to download all of it, which would be prohibitive in terms of time, storage, and bandwidth. Securing the data is also a major concern in the cloud, and several researchers have proposed security schemes for it. Schwarz and Miller (2006) proposed an algebraic-signature approach for checking remote data. Prajapat et al. (2013) proposed a new security algorithm using a time-variant approach. Subashini and Kavitha (2011) introduced a security model for storing metadata in the cloud. Authenticated data structures (Tamassia, 2003) cannot be applied when the client performs continuous checks on the data. Ateniese et al. (2011) developed the Provable Data Possession (PDP) model, in which the data is preprocessed by the client and metadata is produced for verification. The file is stored on an untrusted server, the client deletes its local copy, and the server must later prove that the data has not been tampered with.
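The PDP idea described above can be sketched in a few lines. This is not the scheme proposed in the paper, only a minimal illustration of the generic challenge-response pattern: the client tags each block with a keyed MAC before upload, keeps only the key and tags, and later challenges the server on a random subset of blocks. The function names and the block/tag layout are illustrative assumptions.

```python
import hashlib
import hmac
import os
import random

def make_tags(key: bytes, blocks: list) -> list:
    """Client-side preprocessing: one keyed tag per data block.

    The block index is bound into each tag so the server cannot
    answer a challenge on block i with a different block.
    """
    return [hmac.new(key, i.to_bytes(4, "big") + blk, hashlib.sha256).digest()
            for i, blk in enumerate(blocks)]

def verify_challenge(key: bytes, indices: list, returned: list, tags: list) -> bool:
    """Client recomputes the tags for the challenged blocks and compares."""
    return all(
        hmac.compare_digest(
            hmac.new(key, i.to_bytes(4, "big") + blk, hashlib.sha256).digest(),
            tags[i])
        for i, blk in zip(indices, returned))

# Demo: client preprocesses, uploads, deletes local copy, then audits.
key = os.urandom(32)
data = [os.urandom(64) for _ in range(16)]   # file split into 16 blocks
tags = make_tags(key, data)                  # metadata kept by the client
# (server stores `data`; client keeps only `key` and `tags`)
challenge = random.sample(range(16), 4)      # random subset of block indices
response = [data[i] for i in challenge]      # an honest server's answer
assert verify_challenge(key, challenge, response, tags)
# A server that tampered with a challenged block fails verification.
bad = [b"\x00" * 64] + response[1:]
assert not verify_challenge(key, challenge, bad, tags)
```

Because only a random sample of blocks is challenged, the client avoids downloading the whole file, which addresses the time, storage, and bandwidth constraint noted above.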
Several PDP schemes (Ateniese et al., 2011; Ateniese et al., 2007; Gazzoni Filho & Barreto, 2006; Golle et al., 2002; Mykletun et al., 2006; Sebé et al., 2008; Shah et al., 2007; Shah et al., 2008; Zeng, 2008) deal with static data, where the outsourced data never changes on the servers. Other schemes (Ateniese et al., 2008; Erway et al., 2015; Hao et al., 2011; Wang, Wang, Ren et al., 2009; Wang, Wang, Li et al., 2009) focus on a single copy of dynamic data. Later work (Barsoum & Hasan, 2010; Curtmola et al., 2008; Hao & Yu, 2010) concentrated on multiple copies of static data. Reddy and Babu (2017) investigated the auditing problem that enables data owners to check the integrity of remote data. Various protocols have been proposed, e.g., Remote Integrity Checking (RIC), Proof of Retrievability (POR), and Provable Data Possession (PDP). A framework involving only the data owner and the distributed server is known as a private verification structure. The roles of the two entities in a private verification structure are as follows: (a) Data owner: the owner of the data, which may be an individual or an organization; the data owner depends on the cloud service provider for proper maintenance of the data. (b) Cloud Storage Server (CSS): maintains repository space for the data owner; the cloud service provider is responsible for managing the distributed storage servers.
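A key difficulty with multiple copies, mentioned earlier, is that a cheating CSP could store one copy and present it as all replicas. A common countermeasure in multi-copy PDP schemes is to make each replica a distinct bit pattern that only the owner can generate and invert. The sketch below, a hedged illustration rather than the paper's construction, derives each replica by XOR-masking the blocks with a keyed pseudo-random stream parameterized by the replica number; the helper names are assumptions.

```python
import hashlib
import hmac
import os

def mask(key: bytes, replica: int, block_idx: int, size: int) -> bytes:
    """Keyed pseudo-random mask, distinct per (replica, block)."""
    out = b""
    ctr = 0
    while len(out) < size:
        msg = f"{replica}:{block_idx}:{ctr}".encode()
        out += hmac.new(key, msg, hashlib.sha256).digest()
        ctr += 1
    return out[:size]

def make_replica(key: bytes, replica: int, blocks: list) -> list:
    """XOR each block with its replica-specific mask before upload."""
    return [bytes(a ^ b for a, b in zip(blk, mask(key, replica, i, len(blk))))
            for i, blk in enumerate(blocks)]

key = os.urandom(32)
blocks = [os.urandom(32) for _ in range(4)]
r1 = make_replica(key, 1, blocks)
r2 = make_replica(key, 2, blocks)
assert r1 != r2   # replicas are distinguishable bit patterns
# The owner can still recover the original file from any single replica
# by applying the same mask again (XOR is its own inverse).
recovered = [bytes(a ^ b for a, b in zip(r1[i], mask(key, 1, i, 32)))
             for i in range(4)]
assert recovered == blocks
```

Since a challenge on replica 2 can only be answered with replica 2's masked blocks, a CSP that kept a single copy cannot pass audits on all replicas, which is exactly the guarantee clients need when paying for multiple copies.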
