A Method of Sanitizing Privacy-Sensitive Sequence Pattern Networks Mined From Trajectories Released

Haitao Zhang, Yunhong Zhu
Copyright: © 2019 | Pages: 27
DOI: 10.4018/IJDWM.2019070104

Abstract

Mobility patterns mined from released trajectories can help allocate resources and provide personalized services, but they also pose a threat to personal location privacy. Because existing sanitization methods cannot counter location privacy inference attacks based on privacy-sensitive sequence pattern networks, the authors propose a method that sanitizes the privacy-sensitive sequence pattern networks mined from released trajectories by identifying and removing influential nodes from the networks. Extensive experiments showed that, by adjusting the proportional factor parameter, the proposed method can thoroughly sanitize privacy-sensitive sequence pattern networks and achieve optimal values for the security degree and connectivity degree measurements. In addition, the method's performance was shown to be stable across multiple networks with essentially the same privacy-sensitive node ratio, and scalable to batches of networks with different sensitive node ratios.
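The core idea of the abstract, removing the most influential privacy-sensitive nodes from a sequence pattern network, can be illustrated with a minimal sketch. The graph, the node names, the degree-based influence measure, and the way the proportional factor selects how many nodes to drop are all illustrative assumptions, not the paper's exact algorithm.

```python
# Minimal sketch: sanitize a sequence pattern network by removing the most
# influential privacy-sensitive nodes. The degree-based influence measure and
# the proportional factor semantics are assumptions for illustration.

def sanitize(network, sensitive, factor):
    """Remove the top `factor` fraction of sensitive nodes ranked by degree."""
    # Influence is approximated here by out-degree (number of neighbours).
    degree = {n: len(neigh) for n, neigh in network.items()}
    ranked = sorted(sensitive, key=lambda n: degree.get(n, 0), reverse=True)
    k = max(1, int(len(ranked) * factor))
    to_remove = set(ranked[:k])
    # Drop the removed nodes and every edge incident to them.
    return {n: [m for m in neigh if m not in to_remove]
            for n, neigh in network.items() if n not in to_remove}

# Toy sequence pattern network as an adjacency list; 'B' and 'D' stand for
# privacy-sensitive regions, with 'B' the more connected (influential) one.
net = {
    'A': ['B'], 'B': ['C', 'D', 'E'], 'C': ['B'],
    'D': ['E'], 'E': [],
}
clean = sanitize(net, sensitive=['B', 'D'], factor=0.5)
print(sorted(clean))  # → ['A', 'C', 'D', 'E']
```

With `factor=0.5`, only the most influential sensitive node ('B') is removed; raising the factor to 1.0 would remove 'D' as well, trading connectivity for a higher security degree, which mirrors the parameter trade-off the experiments evaluate.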

Introduction

With the widespread use of GPS (Global Positioning System) devices in vehicles and mobile terminals, along with the rapid development of social networks and location-based services, industry sectors can collect and store large amounts of trajectories in a variety of ways (Zhu, Zheng, & Wong, 2019), and this type of data is growing rapidly in daily life (Giannotti, 2011; Williams, Thomas, Dunbar, Eagle, & Dobra, 2015; Dobra, Williams, & Eagle, 2015). Analyzing trajectories with data mining tools can uncover interesting patterns and regularities, which help provide decision support for relevant industry applications (Gabrielli, Fadda, Rossetti, Nanni, Piccinini, Pedreschi et al., 2018; Blondel, Decuyper, & Krings, 2015) and promote personalized medical care and precision marketing. In addition, trajectory data, as a new type of data, can also support research in intelligent transportation (Kujala, Aledavood, & Saramäki, 2016), urban planning (Louail, Lenormand, Ros, Picornell, Herranz, Friasmartinez et al., 2014; Li, Sun, Cao, He, & Zhu, 2016) and other areas (Ortale, Ritacco, Pelekis, Trasarti, Costa, Giannotti et al., 2008).

As technologies are intended to be neutral, they harbor neither benevolent nor malevolent intent toward the individuals using them. In particular, a curious or malicious user can also apply trajectory data mining tools to find undesirable patterns. These include privacy-sensitive mobility patterns (i.e., mobility patterns involving privacy-sensitive spatial regions, such as military restricted areas, religious sites, private houses, private clubs, red-light districts, etc.), which pose a threat to the location privacy of specific users (Giannotti & Pedreschi, 2008; de Montjoye, Hidalgo, Verleysen, & Blondel, 2013). Privacy-preserving data mining outsourcing (Liu, Wang, Shang, Li, & Zhang, 2017; Monreale, Rinzivillo, Pratesi, Giannotti, & Pedreschi, 2014) and privacy-preserving distributed data analytics (Monreale, Rinzivillo, Pratesi, Giannotti, & Pedreschi, 2014) are two methods of ensuring that privacy-sensitive patterns cannot be detected by attackers in systems with trusted and untrusted central servers, respectively. However, when a collector (i.e., a location service provider) of trajectories wants to release (i.e., publish and share) the trajectories to a third party, sanitization methods based on the knowledge-hiding strategy must be adopted: the collector must sanitize the trajectories to eliminate privacy-sensitive mobility patterns and so prevent a threat to the privacy of the users whose trajectories were collected.

The existing sanitization methods for privacy-sensitive mobility patterns mainly aim to hide those patterns (Rajesh, Sujatha, & Lawrence, 2017; Bonchi & Ferrari, 2010; Aggarwal & Yu, 2008) while changing the original trajectories as little as possible. In addition, these methods are specific to certain types of mining techniques, including association rule hiding (Tsai, Wang, Song, & Ting, 2016), sequence pattern hiding (Quang, Tai, Huynh, & Le, 2016), sequence rule hiding (Zhang, Wu, Chen, Liu, & Zhu, 2017) and so on.
