No Body's Business But Mine: How Menstruation Apps Are Sharing Your Data


Photo by Erol Ahmed on Unsplash

 

In December 2018, Privacy International exposed the dubious practices of some of the most popular apps in the world.

Out of the 36 apps we tested, we found that 61% automatically transfer data to Facebook the moment a user opens the app. This happens whether the user has a Facebook account or not, and whether they are logged into Facebook or not. We also found that some of those apps routinely send Facebook incredibly detailed and sometimes sensitive personal data. Again, it didn’t matter if people were logged out of Facebook or didn’t have an account.

This sharing happens through the Facebook Software Development Kit (SDK), a set of software development tools that can be used to develop apps for a specific operating system. In an email to us on 29 December 2018, Facebook described how their product works:

“Developers can receive analytics that allow them to understand what the audience of their app enjoys and improve their apps over time. Developers may also use Facebook services to monetise their apps through Facebook Audience Network. Subject to that Facebook user's prior consent, Facebook may also use this data to provide that user with more personalised ads.”

Facebook routinely receives data about users, non-users and logged-out users outside its platform through Facebook Business Tools. For instance, any website that has integrated a Facebook “Like” button or a tracking pixel automatically sends data to Facebook.

Facebook's SDK for Android allows app developers to integrate their apps with Facebook’s platform and contains a number of core components: Analytics, Ads, Login, Account Kit, Share, Graph API, App Events and App Links. For example, using Facebook's SDK allows apps to use a "Login with Facebook" based authentication, meaning users can log in using their Facebook account.
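For readers curious what this traffic actually looks like on the wire: the automatic app events described throughout this report arrive at Facebook’s Graph API as small structured payloads. The Python sketch below is a hypothetical reconstruction of their approximate shape, based on publicly documented App Events fields; exact field names vary across SDK versions and this is not a verbatim copy of any single capture in this report.

```python
import json

def build_app_event(app_id: str, advertiser_id: str, event_name: str) -> dict:
    """Illustrative sketch of the approximate shape of an 'app event' that
    the Facebook SDK posts to graph.facebook.com/<app_id>/activities.
    Field names are based on public App Events documentation and may vary
    between SDK versions."""
    return {
        "url": f"https://graph.facebook.com/{app_id}/activities",
        "body": {
            "event": "CUSTOM_APP_EVENTS",
            # The device's resettable advertising ID travels with every event,
            # which is what lets events from one device be tied together.
            "advertiser_id": advertiser_id,
            "advertiser_tracking_enabled": "1",
            "application_tracking_enabled": "1",
            "custom_events": json.dumps([{"_eventName": event_name}]),
        },
    }

# "fb_mobile_activate_app" is the SDK's standard app-opened event name.
event = build_app_event("1234567890", "aaaa-bbbb-cccc", "fb_mobile_activate_app")
```

Note that the payload carries the device’s resettable advertising ID, which can be used to link events back to a device, whether or not the person has a Facebook account.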

Following the revelations, two thirds of the companies we exposed have updated their apps. This year, we decided to follow the same methodology to look into the apps we share some of our most sensitive data with: menstruation apps.

Menstruation apps are not just concerned with your menstruation cycles. As our partner organisation Coding Rights showed in their research, Menstruapps – How to Turn Your Period Into Money (For Others), they collect information about your health, your sexual life, your mood and more – all in exchange for telling you what day of the month you’re most fertile or the date of your next period. In fact, the data you share with your menstruation app is probably information you would not share with others.

We therefore wanted to make sure that they keep this information to themselves, rather than sharing it with other companies. We initially looked at the most popular apps: Period Tracker by Leap Fitness Group; Period Tracker Flo by Flo Health, Inc.; Period Tracker by Simple Design Ltd.; and Clue Period Tracker by Biowink.

We did a dynamic analysis of the apps using our data interception environment (available here; see Annex 1 for methodology) to look at the data those apps share with Facebook. We were pleased to see that none of them shared data with Facebook, including Clue, which changed its practices after we called them out in our first round of checks.
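Conceptually, the core of such a dynamic analysis is straightforward: proxy the phone’s traffic and check the destination host of each request. The Python sketch below illustrates that filtering step in highly simplified form; the host list is illustrative, and the function name is our own rather than part of any real tool.

```python
from typing import Optional

# Illustrative mapping of tracker hostnames to the company behind them.
# A real analysis environment would carry a much longer list.
TRACKER_HOSTS = {
    "graph.facebook.com": "Facebook",
}

def flag_request(host: str) -> Optional[str]:
    """Return the company behind a known tracker host, or None if the
    destination is not a known third-party tracker."""
    for tracker_host, company in TRACKER_HOSTS.items():
        # Match the host itself and any subdomain of it.
        if host == tracker_host or host.endswith("." + tracker_host):
            return company
    return None
```

With every intercepted request run through a filter like this, any data an app sends to a known third party stands out immediately, regardless of what the app’s privacy policy claims.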

But what about other apps? The ones that may not be the biggest players but still boast millions of users? We set out to look at apps we noticed were popular in different parts of the world and decided to look at Maya by Plackal Tech, MIA by Mobapp Development Limited, My Period Tracker by Linchpin Health, Ovulation Calculator by Pinkbird, Period Tracker by GP International LLC and Mi Calendario by Grupo Familia.

Period Tracker by GP International LLC did not appear to share any data with Facebook. The other apps we looked at, on the other hand, turned out to be a little more indiscreet.

As we will expose in this report, Maya by Plackal Tech and MIA by Mobapp Development Limited conducted – at the time of the research – what we believe to be extensive sharing of sensitive personal data with third parties, including Facebook. However, we are pleased to announce after we shared this report with Maya by Plackal Tech, they said:

“We understand your concern that in addition to providing the analytics SDK, Facebook is also a social network and an ad network. We have hence removed both the Facebook core SDK and Analytics SDK from Maya. Version 3.6.7.7 with these changes is live on the Google Play Store and will be submitted for review to the Apple App Store by this weekend. We continue to use the Facebook Ad SDK, post opt-in to our terms and conditions and privacy policy. Maya does not share any personally identifiable data or medical data with the Facebook Ad SDK. The Ad SDK helps us earn revenue by displaying ads that our users can opt out of by subscribing to Maya's premium subscription.”  (See Annex 2)

The full responses from all the companies that replied to us are available in the annex.

Linchpin Health did not respond. MIA did not wish to have their response published.

 

Feeling anxious? Got lucky last night? Having some health issues? Tell Maya and they’ll let Facebook and others know (oh, and they’ll share your diary too…)

 

Maya by Plackal Tech (over 5 million downloads on Google Play) is the kind of app that wants you to share. A lot. The problem is what you share won’t stay between you and Maya. Our traffic analysis reveals, first of all, that Maya informs Facebook when you open the app. There is already a lot of information Facebook can infer from that simple notification: that you are probably a woman, probably menstruating, possibly trying to have (or trying to avoid having) a baby. Moreover, even though you are asked to agree to their privacy policy, Maya starts sharing data with Facebook before you get to agree to anything. This raises some serious transparency concerns.

Maya informs Facebook the app has been opened

 

Medical data is among the most sensitive data one can collect. Confidentiality is at the heart of medical ethics, and countries that have data protection laws traditionally have a separate regime for health data, which is considered sensitive data. Thus, when Maya asks you to enter how you feel and offers suggestions of symptoms you might have - suggestions like blood pressure, swelling or acne - one would hope this data would be treated with extra care. But no: that information is shared with Facebook.

The Maya app interface: entering health data
Maya lets Facebook know you have edited your health data
How your medical data is shared with Facebook

 

When it comes to your medical data, Maya is not just concerned with your symptoms. They also want to know about your use of contraception. And that too is shared with Facebook.

"Pill data edited": how Maya shares your contraceptive practice with Facebook
"Pill data edited": how Maya shares your contraceptive practice with Facebook

 

Beyond your medical data, Maya also asks for other health related information about your mood and how you feel. If you are feeling happy, anxious or excited, you can let Maya know and they will share it with Facebook.

The Maya app interface: entering your mood
How Maya shares your mood with Facebook

 

Although Maya’s privacy policy says that no personal data is disclosed to advertisers, it then states that users' personal data may be used "to comply with our advertisers’ wishes by displaying their advertisement to that target audience". They don’t specify whether this involves health-related data.

 

There is a reason why advertisers are so interested in your mood; understanding when a person is in a vulnerable state of mind means you can strategically target them. Knowing when a teenager is feeling low means an advertiser might try and sell them a food supplement that is supposed to make them feel strong and focused. Understanding people’s mood is an entry point for manipulating them. And that is all the more worrying in an age when Facebook is having so much impact on our democracies, as the Cambridge Analytica scandal revealed. Indeed, it is not just advertisers that will want to know how we feel; as elections approach, political parties may want to know if we feel anxious, stressed or excited so that they can adapt their narratives accordingly.

 

Like other menstruation apps, Maya is also gathering data about our intimate life - requesting information about when you have had sex and whether the intercourse was protected or not. 

Protected or unprotected: the Maya interface expects data about your sexual life

 

Unlike the other data you enter, which is shared with Facebook in human-readable text, the app encodes this data, representing protected sex as “2” and unprotected sex as “3.”
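Decoding this scheme is trivial for any recipient who knows the mapping, so the encoding offers no meaningful protection. A minimal sketch (the dictionary and function names are ours; the two codes are those we observed in the traffic):

```python
# The numeric codes Maya used for intercourse data, as observed in our
# traffic capture: "2" for protected sex, "3" for unprotected sex.
INTERCOURSE_CODES = {
    "2": "protected sex",
    "3": "unprotected sex",
}

def decode_intercourse(code: str) -> str:
    """Map the observed numeric code back to its meaning."""
    return INTERCOURSE_CODES.get(code, "unknown code")
```

In other words, anyone holding both the captured traffic and this two-entry table can read the data as easily as plain text.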

What Facebook sees when you enter "protected sex" in Maya
What Facebook sees when you enter "protected sex" in Maya
What Facebook sees when you enter "unprotected sex" in Maya
What Facebook sees when you enter "unprotected sex" in Maya

In their response to us Maya states:

“All data accessed by Maya are also essential to the proper functioning of the product. Predicting information pertaining to menstrual cycles is complex and dependent on thousands of variables.” (See Annex 2)

We understand that certain personal data is necessary to provide the service to users. It is hard to see, however, how whether you’ve had unprotected sex or not is relevant to predicting menstruation cycles.

Maya is not just asking you to click to enter information. It’s also encouraging you to enter your own notes and comments in sections like “Reminders” or in a diary-like section. Considering the nature of the app, we would expect the information users enter to be of a sensitive nature. As we conducted traffic analysis, we entered “something very sensitive entered here” in the diary section of the app to see what would happen. The result? What we wrote was shared with Facebook.

How the text in your diary is shared with Facebook

 

So far, we have highlighted what we perceive as the most sensitive data that Maya shares with Facebook. But it is worth remembering that it is not just your mood, medical data, sexual intercourse and personal notes that get shared with Facebook. In fact, it is every single interaction between you and the app: when you open the app, how you navigate through it, the dates of your menstruation cycle, and so on.

But as we did the traffic analysis, we noticed Facebook was not the only one getting that data. Everything was also shared with another third party that appeared as “wzrkt.com”.

Your medical data is shared with "wrkt.com"
Your medical data is shared with "wrkt.com"

 

In the picture above, you can see what data sharing with wzrkt.com looks like. Here, we went to the “symptoms” section of the app and clicked on “diarrhea” and “nausea.”

So, who is “wzrkt.com”? Wzrkt stands for “Wizard Rocket”, the former name of a company now known as CleverTap. In their response to our report, CleverTap describe themselves as “a customer retention platform that helps consumer brands maximize user lifetime value, optimize key conversion metrics, and boost retention rates.”

As we mentioned above, the sharing starts before the user has time to even consent to it. Once you do agree to their privacy policy, what is it that you are effectively agreeing to? When it comes to third-party data sharing, Maya’s privacy policy (as of August 19th 2019) contains the following statement:

“We may share Your information with our sponsors, and/or business partners. Your Information could be shared so that you may receive newsletters, offers, information about new services, and other information, if applicable. The information collected from You and other users may be analysed in different manners.”

Besides this general information, no other information is provided to users about the exact recipients with whom data might be shared, what this data could entail, or what these “different manners” are. Users are not even told whether this data is anonymised or not.

 

If you have unprotected sex, MIA will tell you what to do. And share it with Facebook and others

 

MIA Fem by Mobapp Development Limited (over 1 million downloads on Google Play) was the next app we looked at. Maya, MIA - different companies, similar names and – at the time of research – similar practices.

Like Maya, MIA wants you to agree to their privacy policy when you first sign up, but they don’t wait for you to agree to start sharing your data with Facebook. Data about you and your device are relayed to Facebook the moment you open the app, thereby letting them know you are using a menstruation app.

And, just like Maya, Facebook is not the only third party that will get access to your data if you use MIA. Everything that is shared with Facebook is also shared with AppsFlyer. AppsFlyer is “a service that enables app owners to analyse and interpret the performance of their marketing efforts.” (cf. AppsFlyer’s response, Annex 6)

MIA shares your menstruation cycle data with AppsFlyer

 

Before you start, MIA wants to know if you intend to use the app as a regular period tracker, or if you are trying to get pregnant and using it to maximise your chances. Effectively, this does not make any difference in terms of how you use the app or what the app has to offer. The big difference, of course, is for the advertisers. The moment you click on the icon to let the app know you are trying to get pregnant, you are immediately targeted with an ad for a premium version of the app to help you conceive. The information is also shared with Facebook.

With MIA Facebook gets to know why you use a menstruation app

 

The data of pregnant women is particularly valuable to advertisers: expecting parents are consumers who are likely to change their purchasing habits. In the US, for instance, an average person’s data is worth $0.10, while a pregnant woman’s is worth $1.50.

Like all menstruation apps, MIA starts by asking you for the date of your last period, the duration of your periods and the duration of your cycle. This is all shared with Facebook and AppsFlyer.

Here MIA shares the date of our last period with Facebook
And here it shares the duration of our cycle as we have entered it with Facebook
The data we have entered about our menstruation cycle is also shared with AppsFlyer

 

Now that MIA, Facebook and AppsFlyer know everything there is to know about your menstruation cycle, it is time for them to ask for your date of birth. Note that when this data is shared with Facebook, it also lets Facebook know what phase of your cycle you are currently in.

The current phase of your cycle and your date of birth are shared with Facebook

 

MIA also gives you the option to enter all sorts of data - not just about your health and your moods, but more broadly about your lifestyle, such as your use of coffee, alcohol, cigarettes, tampons…

Cigarettes and coffee: all the lifestyle data MIA wants to know about you

 

Now, this data does not immediately get shared with Facebook when you make a selection. However, upon selecting one or more options MIA offers to “analyse [your] symptoms.” 

Stomach, feminine tools and hair: medical and lifestyle data you can enter when using MIA

 

When you click on it, you will be presented with a collection of articles that have been tailored for you based on what you have selected, and occasionally based on other information that MIA has inferred about you, like your menstruation cycle phase. For instance, we selected ‘masturbated’ in the section on sex and were recommended an article called “Masturbation: What You Want to Know But Are Ashamed to Ask.”

How the data you enter (here masturbation) is used to suggest articles

 

This is where Facebook comes in. The app lets Facebook know what articles are featured in your personal feed, thereby letting Facebook know what you have entered into the app. You can see it here, after we entered masturbation:

Facebook gets data on the articles that are suggested based on the information you entered

 

And this is how MIA gets to share your most intimate data with Facebook. See below how MIA shares information about your alcohol consumption, the ups and downs of your sex life, or when you experience cramps during your ovulation phase (you may not even know you are ovulating, but MIA has figured that out for you and it is letting Facebook know).

In this picture, it is clear to Facebook that the article we got is targeted based on our alcohol consumption data
Here, the article is targeted based on our sexual life information
Here, the article is targeted based on profiling MIA does with our data (i.e. we are ovulating) and data we have entered directly (i.e. cramps)

 

This is also how AppsFlyer gets to find out about you.

The targeted articles are also shared with AppsFlyer

 

Note that, again, when you select “cramps”, AppsFlyer gets some extra details about your menstruation cycle, in this case “no period, no ovulation.”

No period, no ovulation: the extra details AppsFlyer gets about you

 

Beyond the health and lifestyle questions that shape your “Personal feed,” MIA has a separate section for “Reminders” - by reminders, they mean a reminder to take your birth control pill. By asking people to enter this data, MIA is, again, collecting medical data. Far from treating it with the utmost care, MIA shares it with Facebook and AppsFlyer. Here, we entered “Name of my Pill” where MIA asked for the name of our pill. In their response to these findings, AppsFlyer stated:

“In this case we have reached out to the app developer and reminded them of this and will work with them to ensure our services are not used to collect any such personal information.” (See Annex 6)

What Facebook sees when you enter your birth control pill reminder

 

It should be underlined that the observations mentioned above refer to the practices of MIA at the time of the research, namely in May 2019.

 

What about the other apps?

 

We also looked at the following apps and found that they all informed Facebook when you open the app:

  • My Period Tracker by Linchpin Health (over 1 million downloads on Google Play),
  • Ovulation Calculator by Pinkbird (over 500,000 downloads on Google Play),
  • Mi Calendario by Grupo Familia (over 1 million downloads on Google Play)

As we highlighted earlier, this alone reveals information that could potentially be used for advertising purposes, and it is all the more worrying that it happens without users’ consent.

Mi Calendario by Grupo Familia was also using an outdated version of the Facebook SDK, which presents security concerns.

 

What does the law say about all this?

 

When it comes to data protection, the big divide is whether an app is based in the European Union or offers services to users in the European Union, or whether it is based outside the European Union and not aimed at EU users. If you are in the EU, you are protected by the General Data Protection Regulation (GDPR).

Privacy International has been calling out the practices of companies that set different standards for their EU customers and their non-EU customers, as we believe everyone should benefit from the high standards of protection GDPR has set.

GDPR obliges data controllers (in this case, the company that owns the app) to provide adequate information to data subjects (the users) so that they are properly informed about the use of their personal data. In practice this is mostly done through privacy policies, which provide some basic information to users regarding possible uses, purposes and transfers, among other things, of people’s personal data. These policies need to be written in concise, plain, understandable and user-friendly language. However, the problem most of the time is that these policies contain vague and generic wording or merely provide indicative or non-exhaustive lists of what the company can do with your data.

European data protection laws, namely the GDPR, oblige controllers to provide data subjects with information relating, at least, to: the contact details of the controller; the purposes and legal bases under which their personal data will be processed; the recipients to which their personal data will be disclosed, including third-country transfers; as well as basic information regarding the exercise of their data protection rights, such as the right to access their personal data, request their erasure or lodge a complaint with their regulator. This information needs to be provided at the point of collection of personal data from users.

 

Maya by Plackal Tech

Maya, like every app we have reviewed for this research, processes large amounts of personal data, including data relating to health, which could be deemed special category data (sensitive data) under EU data protection laws, as we highlighted before.

In their privacy policy (as of August 19th 2019), Plackal Tech is explicit that Maya collects information about "notes, symptoms, or moods" as well as "information that you enter into the App, including the length of your menstrual cycles, and general information about your health such as your weight, mood, temperature and/or any physical intimacy".

Plackal Tech is located in India. However, it is serving EU users as it is available on the Google Play Store UK, which means that a UK user can download and use the app in the EU. Although they do not specifically mention use by EU users, the Terms and Privacy Policy of the app state that the app is available in India or in other jurisdictions (sic).

EU data protection law forbids the processing of special category data, except under specific circumstances, such as with the explicit consent of the user. In this case, it is questionable whether Maya could claim to have obtained users’ informed, unambiguous and explicit consent for its data sharing, considering that personal data is shared before users even get to see, let alone agree to, the privacy policy. In other words, it is hard to see how an average user would even implicitly agree to an app sharing such intimate details of their health and sexual life with Facebook, as this goes beyond what one would reasonably expect in this context.

Plackal Tech also states that they “may also collect the precise location of your device when the app is running in the foreground or background”. They “may also derive your approximate location from your IP address”.

It is questionable whether this extensive data collection is strictly necessary for providing the service requested by users and, accordingly, raises a series of questions regarding the compatibility of these apps with EU data protection law. For example, the principle of data minimisation requires controllers to process the minimum amount of personal data that is necessary for providing the service.

While Maya’s privacy policy states that information might be disclosed to third parties, it does not provide precise information about the categories of personal data of users that it is disclosing or any precise information about who these third parties might be.

Although the privacy policy mentions that no personal data is disclosed to advertisers, it also states that users' personal data may be used "to comply with our advertisers’ wishes by displaying their advertisement to that target audience”. It does not specify whether this also involves health-related data.

It is also worth noting that Maya does not seem to provide adequate information regarding the rights of EU users. For example, the privacy policy does not provide adequate information about users’ rights to rectify their personal data or any information about their right to lodge a complaint with the supervisory authority.

 

MIA by Mobapp Development Limited

As we highlighted before, the question of the collection of “sensitive data” is raised again with MIA’s privacy policy (as of August 19th 2019), which at the time of research, clearly stated that the app may collect “menstrual cycles dates, symptoms related to menstrual cycle, information about health and activities (sleep, mood, diseases, sex, steps etc.), body measurements, which may include information about personal health issues you provide, including information about your physical states”.

Additionally, MIA mentioned in their privacy policy that it could use the personal data it collected for a number of purposes, including for “training of machine-learning algorithms” and “performing background checks on users”. However, the privacy policy did not specify what exact categories of personal data could be used for these purposes and whether this included sensitive data relating to sexual health. This raises serious transparency concerns, as users need to be given meaningful information about the use of algorithms by these services, and especially how this use might affect them.

GDPR applies to MIA as the data controller is based in the EU (Cyprus) and the app is available for download on the Google Play Store UK. In other words, as EU users located in the UK are able to download and use the app, MIA seems to be offering its services to EU users and therefore needs to abide by its GDPR obligations.

 

My Period Tracker by Linchpin Health

My Period Tracker by Linchpin Health is also available for download by EU users, as it is featured on the Google Play Store UK, which might mean that it is serving an EU audience and thus needs to comply with EU data protection laws (GDPR). However, there is no functioning link to the app’s privacy policy, or even a website, on the Google Play Store, which might constitute a breach of GDPR and a failure of the controller to adequately inform data subjects about the uses of their data.

 

Mi Calendario

It should be mentioned that, based on its privacy policy, Mi Calendario seems to be targeting a Latin American audience.

It is worth noting that, at the time of writing, the link to their privacy policy on the Google Play Store was not working. Following the sharing of our report with Mi Calendario, the link has been fixed.

 

Conclusion

The wide reach of the apps that our research has looked at might mean that intimate details of the private lives of millions of users across the world are shared with Facebook and other third parties without those users’ free, unambiguous and informed consent - or, in the case of special-category (sensitive) personal data, such as data relating to a user’s health or sex life, their explicit consent.

Our research highlights that the apps we have exposed raise serious concerns when it comes to compliance with their GDPR obligations, especially around consent and transparency. Indeed, EU data protection law seeks to ensure that users maintain control over their personal data at all times and are aware of the exact and specific purposes for which these data might be used by controllers, namely companies. It applies equally to controllers that process data within the EU/EEA and to controllers based outside the EU/EEA that target EU users with their services.

This raises interesting points. First, even when GDPR applies, for example in EU/EEA countries, this does not mean that controllers abide by the regulation. As our research illustrates, apps targeting EU users need to comply with, among others, strict consent and transparency obligations regarding the processing of personal data, but they often fail to do so. This should lead to a call for stronger enforcement: EU data protection laws have always been there; what is needed is effective and fruitful investigation by regulators.

Secondly, while apps located in Europe might be failing to meet their GDPR obligations, EU users still have an appropriate right of redress, such as the possibility of raising the issue with the controller directly, filing a complaint with their national supervisory authority, or even bringing a case against the controller before national courts. However, the same is not true for users based in countries without proper data protection laws, or with data protection laws that lack effective enforcement. The practices highlighted by this research should serve as an example of abuse that prompts law-makers and regulators to uphold users’ rights.

Companies should also not escape their responsibilities. Facebook have announced they will launch a tool that will enable their users to stop apps and businesses sharing their data with the social network, which will address the problem for some users. However, it is insufficient, as it will fail to protect app users who do not have a Facebook profile.

The responsibility should not be on users to worry about what they are sharing with the apps they have chosen. The responsibility should be on the companies to comply with their legal obligations and live up to the trust that users will have placed in them when deciding to use their service. In order to guide best practices, we are suggesting the following recommendations:

 

Recommendations for menstruation apps

  • Undertake in-depth privacy and risk impact assessments when designing their applications with consideration for their users and the potential harms they could experience.
  • Limit the data collected: many menstruation apps appear to request superfluous data - including sensitive personal data - to build a profile of their users. Only data that is necessary for the app's stated purpose should be collected.
  • Limit data sharing to only what is strictly necessary for the purpose of providing the services. This requires checking the default data-sharing settings of tools provided by third parties, such as Facebook's SDK or third-party data management tools.

 

Recommendations for non-EU governments

  • Implement effective data protection legislation which complies with internationally recognised data protection standards and aligns with their national and international human rights obligations to protect people's dignity and autonomy, in order to ensure that the processing of personal data by public and private entities is effectively regulated.

 

Recommendations for Facebook

  • Facebook needs to better explain how it uses the data that it automatically receives through the Facebook SDK, how long the data is stored and if it is being shared. 
  • Facebook should do more to offer products and services that make it as easy as possible for developers to protect the privacy of their users by design and by default. For instance, the default implementation of the SDK should not automatically transmit data the second an app is launched. 
  • Facebook should take steps to make it easier for people to exercise their data rights on all personal data that Facebook stores, whether they have a Facebook account or not. 

 

Recommendations for regulators

  • Ensure data protection laws are properly enforced.
  • Give extra scrutiny to apps that under the pretence of necessity disproportionately collect vast amounts of health data (including sexual health data) and share it without the explicit consent of users.
  • Ensure app developers abide by the transparency requirements of EU data protection laws.
  • Make sure users maintain control over their data and can meaningfully exercise their data protection rights.

 

Recommendations for users

Even if they will not affect the kind of tracking that we have described in this report, we recommend that people make full use of all existing privacy settings, including:

  • Resetting your advertising ID regularly. This can be found on most Android devices under Settings > Google > Ads > Reset Advertising ID.
  • Limiting ad personalisation by opting out of it in the Android settings. This can be found on most Android devices under Settings > Google > Ads > Opt out of personalized Advertising.
  • Regularly reviewing the permissions you have given to different apps and limiting them to what is strictly necessary for the way you want to use each app. This can be found on most Android devices under Settings > Apps or Application Manager (this may look different depending on your device) > tap the app you want to review > Permissions.