What Can Fitness Apps Teach Us About Group Privacy?

Miriam J. Metzger, Jennifer Jiyoung Suh, Scott Reid, Amr El Abbadi
DOI: 10.4018/978-1-7998-3487-8.ch001

Abstract

This chapter begins with a case study of Strava, a fitness app that inadvertently exposed sensitive military information even while protecting individual users' information privacy. The case study is analyzed as an example of how recent advances in algorithmic group inference technologies threaten privacy, both for individuals and for groups. It then argues that while individual privacy from big data analytics is well understood, group privacy is not. Results of an experiment to better understand group privacy are presented. Findings show that group and individual privacy are psychologically distinct and uniquely affect people's evaluations of, use of, and tolerance for a fictitious fitness app. The chapter concludes with a discussion of the ethics of group-inference technologies and offers recommendations for fitness app designers.
Chapter Preview

Introduction

In November 2017, Strava, a popular fitness app that allows users to record and share their exercise routes or routines via smartphone and fitness trackers, published a global heatmap based on user data. The data were collected from individual users between 2015 and 2017 and consisted of one billion user activities comprising three trillion GPS data points covering a distance of 10 billion miles (Drew, 2017; Hern, 2018). The following year, Nathan Ruser, a 20-year-old college student from Australia, posted on Twitter (see Figure 1) that the heatmap Strava produced revealed the locations and routines of military bases and personnel around the world (Pérez-Peña & Rosenberg, 2018; Tufekci, 2018).

Figure 1.

Nathan Ruser’s tweet showing Strava heatmaps (Source: https://twitter.com/Nrg8000/status/957318498102865920?s=20)


While the heatmap was built from anonymized data and thus did not reveal personal information about any individual, aggregating those individual data revealed the locations and routines of identifiable groups, including U.S. military personnel who use Strava and are based in various countries around the world. Ruser’s Twitter post quickly caught the attention of the press, and BBC News reported that Strava’s heatmap revealed the potential exercise routes of U.S. soldiers in countries such as Syria, Yemen, Niger, Afghanistan, and Djibouti (BBC News, 2018). Because Strava had 27 million users around the world by 2018, the heatmap, Ruser noted, also showed the perimeters and possible patrol routes of known and secret military bases of other countries, such as Russia and Turkey. “The revelation that individual data collection that may have seemed harmless in isolation could upend closely-guarded government secrets was a wakeup call to many people who never considered what the larger ramifications of sharing their location data with apps like Strava could be” (Romano, 2018, n.p.). In response to the Strava incident, U.S. troops and civilian Defense Department employees are now prohibited from using geolocation features on both government-issued and personal devices in locations identified as “operational areas” (Lamothe, 2018).

The Strava example shows that while aggregating anonymized individual data can protect specific individuals’ identity information, such data still have privacy implications for groups that are identified or profiled by the technology. The revelation of where a military group is located puts both the group as a whole, as well as individual members of that group, at risk. So, by threatening group privacy (e.g., revealing the location of a secret military base), the privacy and safety of individual group members (e.g., soldiers stationed on that base) are also threatened, even when the data are not linked to any of those individuals’ identities.
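To make this dynamic concrete, the following is a minimal sketch in Python, using hypothetical synthetic coordinates rather than any real Strava data. Each record is fully anonymized (latitude and longitude only, no user identifiers), yet once the records are aggregated into a heatmap, the densest grid cell still pinpoints where the group exercises.

```python
import numpy as np

# Hypothetical anonymized GPS fixes: (latitude, longitude) only, no user IDs.
# A cluster of activity around a single site (e.g., a running track on a base)
# is mixed with scattered background activity in the surrounding region.
rng = np.random.default_rng(0)
base_activity = rng.normal(loc=[34.500, 69.200], scale=0.002, size=(500, 2))
background = np.column_stack([
    rng.uniform(34.0, 35.0, 200),
    rng.uniform(68.5, 69.9, 200),
])
points = np.vstack([base_activity, background])

# Aggregate into a coarse heatmap: count activity fixes per grid cell.
lat_bins = np.linspace(34.0, 35.0, 101)
lon_bins = np.linspace(68.5, 69.9, 141)
heatmap, _, _ = np.histogram2d(points[:, 0], points[:, 1], bins=[lat_bins, lon_bins])

# The brightest cell reveals the group's exercise location, even though
# every individual record was stripped of identity before aggregation.
i, j = np.unravel_index(np.argmax(heatmap), heatmap.shape)
print(f"Hotspot near lat {lat_bins[i]:.3f}, lon {lon_bins[j]:.3f} "
      f"with {int(heatmap[i, j])} activity fixes")
```

At Strava’s scale, the same aggregation logic applied to billions of activities is what turned individually innocuous, anonymized records into a map of base perimeters and patrol routes.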

Key Terms in this Chapter

True Categorization Threat: Threat arising from correctly and publicly categorizing someone who is part of a group as being a member of the group against their will.

FitNow: A fictitious fitness app used for the purpose of our experiments.

Passive Groups: Groups or collectivities determined by algorithmically extracting subsets or classes of similar individuals who share common habits and characteristics.

False Categorization Threat: Threat arising from incorrectly categorizing someone who is not part of a group as being a member of the group.

Data Ethics: The branch of ethics that studies and evaluates moral problems related to data in order to achieve morally good solutions.

Active Groups: Groups or collectivities that are identifiable by members of society and whose members know they are members.

Anonymization: A process of removing personally identifying information from data for the purpose of preserving the privacy of the people whom the data describe.

Strava: A popular fitness tracking app.

Group-Inference Algorithm: A form of machine learning and data mining that produces group-level inferences by algorithmic aggregation of individual-level data.

Entitativity: A psychological perception of the extent to which a collection of people constitutes a group.

Group Privacy: The privacy of groups as a whole; the right for groups to control data about themselves.
