Introduction
Big data technology is now applied ever more deeply and widely (Wang et al., 2013; Tao et al., 2013), and artificial intelligence technologies such as machine learning and deep learning are developing rapidly (Wang et al., 2019). Delivering convenient digital services inevitably requires building a “portrait” (user profile) from individuals' personal data. However, portraits in different fields impose different data-collection requirements; combined with the varying security levels of data collectors' storage and the strong background knowledge of attackers, this means that data leakage, data selling and other data security problems (Wang et al., 2019) pose significant risks to users, developers and society (Siau et al., 2020). To reduce the risk of privacy disclosure (Dumbill et al., 2013; Meng et al., 2013), researchers have proposed privacy protection technologies such as anonymization (Sweeney, 2002; Machanavajjhala et al., 2007; Li et al., 2007; Xiao et al., 2007), cryptography (Clifton et al., 2002; Rothe, 2002; Jiang et al., 2006; Ishai et al., 2006), differential privacy (Dwork, 2006; Dwork, 2008; Dwork et al., 2009) and blockchain (Turesson et al., 2021). Among these, differential privacy is currently the mainstream privacy protection technology: because noise is added to query results on the raw data, it is difficult for an attacker, even with background knowledge, to infer an individual's sensitive attributes, confining the disclosure of personal privacy to a small, controllable range (Li et al., 2012; Xiong et al., 2014).
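To make the noise-addition idea concrete, the sketch below (not from the source; names and parameter values are illustrative) shows the classic Laplace mechanism from the differential privacy literature: a numeric query result is perturbed with Laplace noise scaled to sensitivity/ε, so that any single individual's presence changes the output distribution by at most a factor of e^ε.

```python
import math
import random

def laplace_noise(scale):
    # Sample from Laplace(0, scale) via the inverse CDF:
    # for u uniform on (-0.5, 0.5), x = -scale * sgn(u) * ln(1 - 2|u|)
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)

def laplace_mechanism(true_value, sensitivity, epsilon):
    # epsilon-differentially-private release of a numeric query result;
    # larger epsilon -> less noise -> weaker privacy guarantee
    return true_value + laplace_noise(sensitivity / epsilon)

# Example: privatize a count query (sensitivity 1), epsilon = 0.5
noisy_count = laplace_mechanism(1000, sensitivity=1.0, epsilon=0.5)
```

The attacker sees only `noisy_count`; with small ε the noise is large relative to any one person's contribution, so background knowledge about other records reveals little about the remaining individual.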