Should apps share details of women’s menstruation and sex lives with Facebook and other sites? Some already do

Posted on Sep 13, 2019 by Glyn Moody

Back in January, Privacy News Online wrote about some important research from Privacy International. It found that 61% of the Android apps it investigated automatically transfer data to Facebook the moment a user opens them. This happens whether or not people have a Facebook account, and regardless of whether they are logged in. In March, Privacy International re-tested the most popular apps to see whether companies had improved their data protection: “Two thirds of all apps we retested, including Spotify, Skyscanner and KAYAK, have updated their apps so that they no longer contact Facebook when you open the app.” By implication, the other third were still sending data to Facebook whether you wanted that or not.

As a follow-up to this work, the organization looked at a class of apps where privacy is particularly important because they handle some of the most personal and intimate data: period tracker apps. These are typically used by women to track their menstrual cycles, often with the aim of maximizing the chance of conceiving a child, or of avoiding pregnancy. Once again, a key problem is that some apps pass data to Facebook even before the user gives permission to do so:

Our traffic analysis reveals, first of all, that Maya informs Facebook when you open the app. There is already a lot of information Facebook can assume from that simple notification: that you are probably a woman, probably menstruating, possibly trying to have (or trying to avoid having) a baby. Moreover, even though you are asked to agree to their privacy policy, Maya starts sharing data with Facebook before you get to agree to anything.

The Maya app also shares this information with a site called wzrkt.com. Wzrkt stands for “Wizard Rocket”, the former name of a company now known as CleverTap. In its response to Privacy International’s report, CleverTap describes itself as “a customer retention platform that helps consumer brands maximize user lifetime value, optimize key conversion metrics, and boost retention rates.” Similarly, another menstruation app, MIA, shares everything you enter with both Facebook and AppsFlyer – “a service that enables app owners to analyse and interpret the performance of their marketing efforts.” With Maya:

it is not just your mood, medical data, sexual intercourse and personal notes that gets shared with Facebook. In fact, it is every single interaction between you and the app. When you open the app, how you navigate through the app, the dates of your menstruation cycle, and so on.

The researchers point out why this kind of highly personal information is so sought after by marketing companies:

understanding when a person is in a vulnerable state of mind means you can strategically target them. Knowing when a teenager is feeling low means an advertiser might try and sell them a food supplement that is supposed to make them feel strong and focused. Understanding people’s mood is an entry point for manipulating them.

Advertisers like to claim that they need this kind of fine-grained information about people in order to offer them appropriate advertising. But passing on details about what we like and who we are also lets advertisers exploit our weak points, nudging us into clicking on ads and buying products. Moreover, the Cambridge Analytica scandal showed that it is not just companies selling products and services who want to know how people are feeling at a particular moment. When there are upcoming elections, political parties may want to know if potential voters feel anxious, stressed or excited, so that they can adapt their narratives accordingly – exactly as Cambridge Analytica did in both the US and UK, and probably elsewhere too.

Aside from the high level of intrusion this kind of tracking represents, there’s another worrying aspect. Judging by the 187,000 reviews of Maya on Google Play, almost nobody is aware of how their most personal information is being passed around. That’s not a surprise: Privacy International had to use some fairly sophisticated software tools in order to study the data flows from these period tracking apps. Few general users would be able to do that, even if it occurred to them to try. But the more sensitive the personal data that is being collected, the stronger should be the protections to keep it safe at all times, and the greater should be the transparency about how it is used.
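To give a sense of what that kind of analysis involves: researchers typically route a test phone’s traffic through an interception proxy and watch which third-party hosts each app contacts. The sketch below is a minimal add-on for the open-source mitmproxy tool, not Privacy International’s own toolchain, and the list of tracker domains is illustrative, based on the companies named in this article.

```python
"""Flag requests to known tracking hosts.

A minimal sketch of the kind of traffic inspection described above,
written as a mitmproxy add-on. The domain list is illustrative only.
"""
from mitmproxy import http

# Hosts associated with the trackers mentioned in this article; a real
# audit would use a much longer, regularly updated list.
TRACKER_HOSTS = ("graph.facebook.com", "wzrkt.com", "appsflyer.com")


class TrackerSpotter:
    def request(self, flow: http.HTTPFlow) -> None:
        # mitmproxy calls this once for every request the proxied device makes.
        host = flow.request.pretty_host
        if any(host == t or host.endswith("." + t) for t in TRACKER_HOSTS):
            print(f"[tracker] {flow.request.method} {flow.request.pretty_url}")


addons = [TrackerSpotter()]
```

Run with mitmdump -s tracker_spotter.py while the test device’s traffic is routed through the proxy (and the proxy’s certificate is trusted on the device): every request an app sends to those hosts is then printed, which is essentially how data flows like the ones described above come to light.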

Many might feel that legislation is required to prevent the abuse of particularly intimate information. The EU has had the far-reaching GDPR legislation in place for over a year now, designed to rein in precisely these practices. But the Maya app discussed above shows why passing laws, no matter how strong and well intentioned, may not help much. Maya comes from a company called Plackal, which is based in Bangalore, India. Enforcing those EU rules in India is hard.

The central problem is not legal, but economic. Advertisers are demanding too much information about people because they view micro-targeted advertising as the best way to make money. Internet giants like Facebook and Google are currently happy to provide that level of detail because their users are either unaware of how their privacy is being undermined, or feel powerless to do anything about it.

As this blog has pointed out before, the solution is to move away from micro-targeted ads towards contextual marketing. It’s an approach that has been used by traditional media with great success for over a century. Ads are chosen based on the surrounding editorial, and have no need for intimate details about the person viewing them. The fact that someone is reading a particular post, article or Web page means they are likely to be interested in the topics it discusses, and that provides enough information for targeted marketing. In this situation, there is no justification for apps to send any personal information to Facebook or anyone else, much less intimate details about people’s sex lives.
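To make the contrast concrete, here is a toy sketch of contextual ad selection; the page text, ad names and topic tags are invented for illustration, and no real ad platform is this simple. The point is that the only input is the content of the page being read, never a profile of the reader.

```python
# Toy illustration of contextual ad selection: ads are matched to the words
# on the page itself, so no data about the reader is needed at all.
PAGE_TEXT = "Tips for tracking your cycle and planning a pregnancy naturally"

# Hypothetical inventory: each ad is tagged with the topics it suits.
AD_INVENTORY = {
    "prenatal-vitamins": {"pregnancy", "cycle", "health"},
    "running-shoes": {"fitness", "running"},
    "budget-flights": {"travel", "flights"},
}


def pick_ads(page_text: str, inventory: dict[str, set[str]]) -> list[str]:
    words = set(page_text.lower().split())
    # Rank ads by how many of their topic tags appear on the page.
    scored = ((sum(tag in words for tag in tags), ad) for ad, tags in inventory.items())
    return [ad for score, ad in sorted(scored, reverse=True) if score > 0]


print(pick_ads(PAGE_TEXT, AD_INVENTORY))  # -> ['prenatal-vitamins']
```

Everything the selection needs is already on the page; nothing about the reader’s identity, mood or menstrual cycle ever has to leave their device.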

Featured image by Cheryl Holt.