Strava has recently launched an updated set of privacy controls for managing the visibility of activities on the platform. I welcome changes like this: I like granular privacy controls, so long as they come with conservative defaults. Building applications on the assumption that your users want their data to travel as little as possible is a solid and user-respecting foundation.
Strava's new privacy controls mostly let you set what kind of data other Strava users can see about you—you can restrict viewing of your activities, your profile, or other metrics that Strava keeps on you to your followers only. These are great controls, and very useful for preventing malicious behaviour such as stalking.
It made me think a little, without particular resolution, about the different kinds of privacy controls that applications can give you—specifically, privacy from other users vs. privacy from the application developers themselves. Google, for instance, gives you some pretty granular controls over the historical data that they keep, allowing you, nominally, to erase your history on their platform (whether they're just setting a deleted=true flag in their database, we'll never know). Strava allows you to opt out of being included in Strava Metro and the Global Heatmap, and anonymises any data they do include.
My personal threat profile doesn't include stalkers, and I suppose that my activities could be used by a clever individual for some kind of phishing, but my concerns are far more related to massive data-harvesting organisations and the machine-learning models that they're training on me. That was the thinking behind Jernl, a proof-of-concept journaling app I made, which uses each user's password to encrypt all of their data, so that not even I, who owns the database, can view it. I don't just want to restrict access by other users, but by the owners of the platform themselves.
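Jernl's actual implementation isn't reproduced here, but the general pattern is simple: stretch the password into an encryption key, encrypt everything client-side (or at least before it hits storage), and keep only the ciphertext in the database. Below is a minimal, stdlib-only Python sketch of that idea—PBKDF2 for key derivation, a toy SHA-256 counter-mode stream cipher, and an HMAC integrity tag. All function names are illustrative, and a production app would use a vetted AEAD cipher such as AES-GCM from a real crypto library instead of the toy stream:

```python
import hashlib
import hmac
import os


def derive_key(password: str, salt: bytes) -> bytes:
    # Stretch the password so that brute-forcing a stolen database dump
    # is expensive. The salt is stored alongside the ciphertext.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)


def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy counter-mode keystream built from SHA-256 — for illustration
    # only; real code would use AES-GCM or ChaCha20-Poly1305.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def encrypt(password: str, plaintext: bytes) -> bytes:
    salt, nonce = os.urandom(16), os.urandom(16)
    key = derive_key(password, salt)
    ks = _keystream(key, nonce, len(plaintext))
    ct = bytes(a ^ b for a, b in zip(plaintext, ks))
    # Authenticate the ciphertext so tampering (or a wrong password)
    # is detected before decryption returns garbage.
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return salt + nonce + tag + ct


def decrypt(password: str, blob: bytes) -> bytes:
    salt, nonce, tag, ct = blob[:16], blob[16:32], blob[32:64], blob[64:]
    key = derive_key(password, salt)
    expected = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("wrong password or tampered ciphertext")
    ks = _keystream(key, nonce, len(ct))
    return bytes(a ^ b for a, b in zip(ct, ks))
```

The key point is that the server only ever stores the output of encrypt(); without the password, the derived key—and therefore the plaintext—is unrecoverable, even by whoever runs the database.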