The Dark Side of Fitness Tracking Apps


Fitness tracking applications and devices have grown tremendously in popularity in recent years as more people seek to improve their health and wellness. By tracking metrics like steps, heart rate, sleep patterns and more, these tools provide insights that can help motivate lifestyle changes. However, along with these benefits come growing privacy and security risks that users should be aware of when deciding whether and how to utilize fitness tracking technology.

Revealing Secret Locations

One of the starkest reminders of privacy and security concerns with fitness tracking data came in 2018, when an investigative report from The Guardian revealed how user activity shared to the popular Strava fitness tracking app inadvertently exposed the locations of secret U.S. military bases. By analyzing aggregated and anonymized activity maps published by Strava, which included heatmap representations of thousands of activities like running and cycling routes, analysts were able to identify several restricted areas corresponding to bases in countries like Afghanistan and Syria.

This incident underscored how even data shared at an anonymized, global level can potentially reveal sensitive information given the right analysis and context. Individual user profiles were not exposed in this case, but it highlighted how meticulously logged GPS routes can expose hidden locations if proper privacy and security practices aren't followed. It also served as an example of how user expectations of privacy may differ between general social sharing scenarios and activities that take place in secure government or military facilities.

Subsequent research has shown that privacy and anonymization methods employed by even the largest fitness apps can fail to fully protect users' sensitive location information. A 2020 study from the University of Illinois examined the implementation of "Privacy Zones" offered by popular apps like Strava and Garmin, which are intended to hide portions of exercise routes near private locations like a user's home. Through an interactive online survey, researchers had over 600 participants attempt to deduce protected locations based on visualized exercise routes with Privacy Zones applied.

Even with the smallest zone size (1/8 mile radius) and just 3 routes displayed, 68% of participants were able to guess the hidden location within 50 meters, approximately the size of a residential property. Having more routes available actually increased accuracy, as patterns could be more easily inferred. The study concluded that users' privacy-sensitive locations remain at risk despite Privacy Zone protections. Meanwhile, a separate 2022 report by Dark Reading found popular fitness apps continued leaking users' precise GPS location data, even when Privacy Zones were enabled, due to vulnerabilities in app code and inappropriate handling of sensitive tracking information.
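The inference the study's participants performed can be illustrated with a minimal geometric sketch (this is an illustration of the general idea, not code from the study or any app): if routes are clipped at a circular Privacy Zone, the visible route endpoints lie on the zone's boundary, and three such endpoints are enough to recover the circle's center, which is the hidden location.

```python
import math

# Hypothetical sketch: exercise routes clipped at a circular Privacy Zone
# leave visible endpoints on the zone's boundary circle. Three endpoints
# determine that circle, so its centre (the hidden location) can be
# recovered with the standard circumcenter formula.

def circumcenter(a, b, c):
    """Centre of the circle passing through three (x, y) points."""
    (ax, ay), (bx, by), (cx, cy) = a, b, c
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return ux, uy

# Simulated zone: radius 200 m around a hidden point at (500, 300).
hidden, r = (500.0, 300.0), 200.0
endpoints = [
    (hidden[0] + r * math.cos(t), hidden[1] + r * math.sin(t))
    for t in (0.3, 1.4, 2.6)  # three routes exiting the zone at different angles
]

print(circumcenter(*endpoints))  # recovers approximately (500.0, 300.0)
```

This is why more routes made guessing easier in the study: each additional clipped route adds another boundary point, over-determining the hidden circle.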

Privacy Concerns of Fitness Tracking Users

Qualitative research has also provided insights into individuals' privacy perceptions and behaviors related to sharing fitness tracking data online. Interviews conducted by the American Heart Association found users have nuanced views, expressing both enthusiasm for connected social features and concerns about unintended access to sensitive health information. A lack of control over data use and fears of profiling by advertisers or employers were commonly cited.

Studies have additionally shown user privacy expectations don't always align with how personal fitness data is actually handled. A 2021 analysis of 35 diabetes management mobile apps by University of Texas Medical Branch researchers found all transmitted user data to third parties, contradicting many apps' privacy policies. With no specific legal protections for health data collected outside clinical settings, this analysis serves as a cautionary reminder that consumers must actively scrutinize how apps may monetize and share their personal information behind the scenes.

Case Study - Strava

Strava is one of the most popular fitness tracking apps, used by millions worldwide to log runs, bike rides, and other activities. On the surface, it functions like a standard health and fitness app, monitoring biometrics, pace, distance and more. However, Strava takes things a step further by integrating robust social networking features that actively encourage users to connect, friend each other, comment on activities and more.

This is where things get problematic from a privacy perspective. By default, Strava publicly shares all user activity routes, metrics and location data unless privacy settings are deliberately adjusted. So anyone can view detailed maps of another person's exact routes, along with photos and start/stop locations - potentially revealing home addresses.

Even adjusting the general profile visibility setting doesn't fully protect privacy, as features like "Segments" and the "Flyby" option override these preferences by default. Segments allows anyone to create leaderboards for physical challenges that publicly expose the names and profiles of all participants, even if their profile is set to private. The sinister "Flyby" option notifies users if they pass by another Strava user during an activity, revealing their photo, profile and home location without consent.

Advice from Experts on Enhancing Digital Health Privacy

Given these known security and privacy risks, cybersecurity specialists and medical researchers recommend a range of best practices users can adopt when choosing and utilizing fitness trackers and apps:

Use established, reputable brands
Choose fitness trackers and apps from well-known companies that have invested significant resources into security. Large brands face greater reputational risk if privacy issues arise, so they have more incentive to properly secure user data and promptly address vulnerabilities. Smaller startups may not have the same security priorities or ability to respond adequately in a breach situation.

Closely review privacy policies
It's important for users to take time reading an app's privacy policy fully to understand exactly what data is collected, how it's used internally by the app developer, and whether any information is shared with third parties. Services like Exodus Privacy, which reports the trackers and permissions embedded in apps, can help with this assessment before installing. Default settings should also be checked, as sensitive sharing options like geolocation are sometimes set to "on" without explicit consent. Awareness of potential data flows allows users to make informed participation decisions.

Employ strong passwords and authentication
Strong, unique passwords for each account help prevent hackers from accessing personal fitness profiles through breaches of other services. Multifactor authentication adds another layer of security by requiring a code sent to a separate device during login. These precautions place hurdles before malicious actors aiming to intercept health and location data.

Limit location sharing carefully
Publicly broadcasting one's home address, workplace or frequent exercise routes through GPS data poses potential privacy and safety risks. An assailant, stalker or thief could glean patterns to target specific individuals. Adjusting privacy settings to opt out of location sharing, or releasing only minimal GPS points, can help manage exposure of sensitive places.

Keep apps and devices updated
Like all software, fitness apps and the operating systems on paired devices will occasionally require updates to address newly discovered flaws that can enable unauthorized access if left unpatched. Downloading updates promptly after release helps bolster protection of personal information over time, as new vulnerabilities are continually found and sealed.

Avoid commercialization of sensitive data
Apps funded through ads or the resale of health metrics present conflicts of interest, as users have less control over how their intimate data may be packaged and distributed commercially without consent. Monetization also provides weaker incentives to prioritize privacy over profits.

Advocate for stronger legal protections
Push policymakers to strengthen legal protections for health data beyond clinical care settings and to regulate the data broker industries that thrive on trading personal information.

Continued Growth of Connected Health Devices Demands Strong Protections

As more personal devices, apps, wearables and even implantable sensors enter the healthcare ecosystem, capturing ever richer and more intimate types of personal medical and wellness data, strong regulatory guardrails and cultural shifts will be needed to ensure individual privacy and consent are protected. Current piecemeal sectoral laws in countries like the U.S., and a generally lax approach taken by technology platforms and data brokers, leave much consumer health information vulnerable or even commoditized without individuals' knowledge.

Yet consumers also increasingly demand interconnectivity and insights from their digital devices. Finding balanced, equitable and practical solutions will require input from technology innovators, medical experts, privacy advocates, policymakers and, most importantly, individuals sharing their own priorities, boundaries and lived experiences navigating this new frontier of connected care. With open dialogue and a shared commitment to ethical data practices, the promise of digital health tools to improve lives need not come at the cost of user trust, autonomy or well-being. But meaningful change will take ongoing, cooperative effort from all stakeholders.