Strava and Behavioral Economics

I am a self-described health and fitness nut, and in the years since smartphones became essential to our day-to-day lives, technology has also slowly infiltrated my daily fitness regimen.  With such pervasive use of apps to track one’s own health and lifestyle choices, is it any wonder that companies are collecting the data we freely give them, with the potential to monetize that information in unexpected ways?  Ten years ago, when I went outside for a run, I would try to keep to daylight hours and busy streets, worried that something could happen to me and no one would know.  Now the worry is completely different: if I use my GPS-enabled running app, my location (along with my heart rate and running speed) is saved and stored in some unknown database, to be used in some unknown manner.


Recently, a fitness app called Strava made headlines after it published a heat map showing the locations and workouts of users who had made their data public (which is the default setting) – and inadvertently revealed the locations of secret military bases and the daily habits of personnel, including certain jogging routes of military personnel in the Middle East.  It was a harsh reminder of how the seemingly innocuous use of an everyday tool can have serious consequences – not just personally, but professionally, and even for one’s own safety.  Strava’s response to the debacle was to release a statement saying they were reviewing their features, but also directing users to review their own privacy settings – thus the burden remains on the user to opt out, for now.


Fitness apps don’t just have the problem of oversharing their users’ locations.  Apps and devices like Strava or Fitbit are in the business of collecting a myriad of health and wellness data, from sleep patterns and heart rate to what the user eats in a day.  Such data is especially sensitive because it relates to a user’s health – but because the user is not sharing it with a doctor or hospital, they may not even realize the extent to which others may be able to infer private, sensitive information from it.


One of the biggest issues here is the default setting.  Behavioral economics research shows that status quo bias is a powerful influence on how we humans make (or fail to make) decisions, and most users simply fail to read and understand privacy statements when they sign up to use an app.  So why do some companies still choose to make the default setting “public” for users of their app – especially in cases where it is not necessary?  If Strava’s default had required users to opt in before sharing their location and fitness tracking data with the public, its heatmap would have looked very different.
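
To make the contrast concrete, here is a minimal sketch of what a privacy-protective default could look like in code.  This is purely illustrative – the class and field names are invented for this example and are not Strava’s actual code or API – but it shows how little engineering effort an “opt in” default actually requires.

    from dataclasses import dataclass
    from enum import Enum

    class Visibility(Enum):
        PRIVATE = "private"        # only the user can see the activity
        FRIENDS_ONLY = "friends"   # visible to approved followers only
        PUBLIC = "public"          # visible to everyone, eligible for heatmaps

    @dataclass
    class SharingSettings:
        # Privacy-protective defaults: a new account shares nothing until
        # the user explicitly opts in to public sharing.
        activity_visibility: Visibility = Visibility.PRIVATE
        include_in_heatmap: bool = False

    # Status quo bias predicts most users keep whatever defaults they are
    # given, so an account created this way stays out of public heatmaps.
    settings = SharingSettings()
    assert settings.activity_visibility is Visibility.PRIVATE
    assert not settings.include_in_heatmap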


It is not in companies’ interest to make the default settings anything other than public.  The fewer people who share data, the less data the company holds, and the less able it is to use that data to its benefit – whether for targeted marketing or for developing additional features for individual users.  Companies could thus argue that collecting users’ data on a widespread basis benefits their users in the long run (as well as their own revenues).  However, headlines like this one erode public trust in technology companies – and companies such as Strava would do well to remember that their revenues also depend on that trust.  Short of making “private” or “friends only” the default, these companies should at least analyze the potential consequences before releasing the data they collect about their users.

