Comprehension, Perception, and Projection: The Role of Situation Awareness in User Decision Autonomy When Providing eConsent

Wendy Rowan, Yvonne O'Connor, Laura Lynch, Ciara Heavin
Copyright: © 2021 | Pages: 31
DOI: 10.4018/JOEUC.286766

Abstract

Health social networks (HSNs) allow individuals with health information needs to connect and discuss health-related issues online. The intertwinement of policy and technology (e.g., the GDPR and digital technology) means that users need to be aware of, understand, and be willing to provide electronic consent (eConsent) when sharing personal information online. The objective of this study is to explore the ‘As-Is’ factors which impact individuals’ decisional autonomy when consenting to the privacy policy (PP) and terms and conditions (T&Cs) on an HSN. We use a Situational Awareness (SA) lens to examine decision autonomy when providing eConsent. A mixed-methods approach reveals that technical and privacy comprehension, user perceptions, and projection of future consequences impact participants’ decision autonomy in providing eConsent. Unless the privacy paradox is addressed at the outset, decision awareness, and subsequently decision satisfaction, are negatively impacted. Moving away from clickwrap online consent towards customised two-way engagement is the way forward for the design of eConsent.

1. Introduction

Decision making in the use of technology is important, with concerns over whether users are truly informed about the choices they make online (Williams, Burnap, Sloan, Jessop, & Lepps, 2017). Many social networking sites exist where individuals create public profiles within a service and connect with other users (Boyd, 2007; Jeong & Kim, 2017; Li, Cheng, & Teng, 2020; Ortiz, Chih, & Tsai, 2018; Rathore, Sharma, Loia, Jeong, & Park, 2017). A growing number of users leverage online fora to find information relevant to their healthcare needs (Choi et al., 2017). This has led to the development and proliferation of Health Social Networks (HSNs).

HSNs are online services where people connect and share relevant health data (Li, 2013). HSNs offer users emotional support, Q&A with physicians, quantified self-tracking and/or access to clinical trials (Choi et al., 2017; O'Leary, Coulson, Perez-Vallejos, & McAuley, 2020). The key value for users is their ability to connect with others with similar health situations (Choi et al., 2017; Meng, 2016; Swan, 2009). When individuals share personal health information (PHI) online, they contribute to ‘big data sets’ that could potentially be used for medical research or by other third parties (Lee, Park, Chang, & Ko, 2019; Leon-Sanz, 2019; Murdoch & Detsky, 2013). Some HSNs sell anonymized PHI data to pharmaceutical companies, universities, and research labs (Bouraga, Jureta, & Faulkner, 2019; Kotsilieris, Pavlaki, Christopoulou, & Anagnostopoulos, 2017; Swan, 2009).

Existing research recognises the affordances of ‘big data’ in the health domain, extolling big data as an opportunity to leverage patient and practitioner data as a means of improving the quality and efficiency of healthcare systems (Horehájová & Marasová, 2020; McAfee, Brynjolfsson, & Davenport, 2012; Milenkovic, Vukmirovic, & Milenkovic, 2019). Big data has the potential to improve problem solving by providing greater insight into complex issues (Madden, Gilman, Levy, & Marwick, 2017). Given the existence of these big data sets, the evolution of big data analytics is inevitable, bringing with it several challenges, including the need to establish robust privacy and security standards and governance to protect patients and their PHI (Price, 2020; Price & Cohen, 2019; Raghupathi & Raghupathi, 2014; Sharma, Singh, & Rehman, 2020). Madden et al. (2017) remind us that big data carries the risk of information misuse: a “black box society” (Pasquale, 2015), a “transparency paradox” (Richards & King, 2013), and a lack of “algorithmic accountability” (Rosenblat, Kneese, & Boyd, 2014), where the individual is oblivious as to how their data is being manipulated.
