Covid-19 response: Corporate Exploitation

Companies all over the world are pitching data products, services and solutions in response to Coronavirus - from big tech to companies that might not be household names, but that PI has long challenged for their exploitative data practices. Here we set out examples and the key points for companies to consider.

Key points
  • Companies all over the world are pitching data products, services and solutions in response to Coronavirus.
  • Examples include use of location data by Big Tech and Telecommunications companies as well as mobile advertising, data analysis and visualisation companies.
  • Companies already infamous for exploitative data practices are taking advantage to peddle their services.
  • We set out key points that companies must consider in responding to this crisis, if they do not want to make things worse.

These are difficult and challenging times around the world.

In this global crisis, businesses are stepping in to support efforts by Governments and public health authorities to control the impact of the virus.

This is important, as help of all kinds is sorely needed. However, we must also be wary that industry initiatives and public-private partnerships are not used as an opportunity to profit from this crisis or to exploit data without legal safeguards.

Companies all over the world are pitching their data products, services and solutions - from big tech, to telcos, to companies that might not be household names but that PI and others have long challenged for their exploitative data practices.

Since early March, PI has been working with our network of partners around the world to track measures launched by both governments and companies.

Some of the company examples are set out below, together with key points that any company should keep in mind when considering its approach to using data to support responses to this public health emergency.

Examples of what companies have done so far

Companies of all sorts have for years been gathering our location data, often unnecessarily. Now the question of whether to share and how to use location data is at the centre of the debate around monitoring the spread of the virus and contact tracing:

  • Telecommunications companies are entering into agreements to share location data with public authorities or even with third-party analytics companies.
  • Facebook is sharing aggregate data with researchers - reportedly from Facebook’s private vault of location data collected by the company’s apps. Facebook is working as part of the COVID-19 Mobility Data Network, a collaboration between Facebook, Camber Systems, Cuebiq, and health researchers from 13 universities, to use corporate location data from mobile devices to give local officials “consolidated daily situation reports” about “social distancing interventions.” Facebook later announced the release of various tools as part of its Data for Good programme, including new types of disease prevention maps (co-location, movement range trends and the social connectedness index) and prompts on Facebook encouraging people in the US to participate in a survey from the Carnegie Mellon University Delphi Research Center designed to help health researchers identify COVID-19 hotspots earlier.

  • Google, which gathers detailed location data from millions of people who use Android phones and some Google apps, will be publishing Community Mobility Reports charting movement trends over time by geography, across different categories of places such as retail and recreation, groceries and pharmacies, parks, transit stations, workplaces, and residential. Google's response to supporting government efforts includes working with the White House to develop new data mining techniques to examine the COVID-19 Open Research Dataset and collaborating on the development of apps in the US, Spain and Australia. Google has also joined the COVID-19 Healthcare Coalition, a group of healthcare technology and research organisations, to build a data exchange.

  • Mobile advertising companies - which have long been quietly collecting vast amounts of location data - are now reportedly sharing this with Governments, in part helping those governments work around the more regulated telecoms companies.

Location data can also be abused to shame and to increase fear:

  • For example, a visualisation used the location data of young people partying in Florida during spring break. Data visualisation company Tectonix reportedly used cell phone location data collected by another company, X-Mode, to map out the travels of thousands of spring breakers using geo-spatial big-data analysis software. In their own words, X-Mode specialise in the curation and use of precise location data. As explained in X-Mode’s privacy policy, they receive most of this location information from an “SDK” that they provide to apps. PI has previously written in detail about the problems of data sharing facilitated by SDKs, in particular Facebook’s. X-Mode provide their SDK to publishers to ‘monetise’ their services, i.e. so they can get access to the location data of app users. They also get data from other ‘data compilers’ and the US Postal Service. The uses include making inferences, building consumer profiles, categorising users into “interest segments” and facilitating ad targeting. Tectonix receives data, including precise location data and advertising IDs, to make inferences, curate data products and create visualisations - as was the case with the spring breakers’ data. Such indiscriminate collection, analysis, sharing and re-purposing of unique identifiers such as Ad IDs and location data - which provide intimate insights into our lives - raises a raft of privacy and data protection issues.

It's not just companies with location data pitching in:

  • For example, Kinsa Health, a company that produces internet-connected thermometers, created a map of fever levels in the US. The New York Times reported that Kinsa’s thermometers upload users’ temperature readings to a centralised database, that the company has a million thermometers in circulation, and that it is in talks with six states about distributing more.

Companies already infamous for exploitative practices are also taking advantage of this crisis as an opportunity to peddle their services:

  • NSO Group, the Israeli company known for the targeted spyware it sells to governments, which has been used to target journalists and activists, has taken the opportunity to pitch data analysis products.

  • Clearview AI, which came to global attention earlier this year for providing facial recognition tools to law enforcement, is reportedly negotiating a partnership with state agencies to monitor infected people and the individuals they interacted with.

  • Palantir, the US-based company that sells data software and has been at the centre of numerous scandals, is reportedly collaborating with authorities on the development of new tools to monitor resources.

Then there are some less well known but potentially equally exploitative companies:

  • Banjo, a company that allegedly aims to report crimes as they happen by combining social media and satellite data with public information - like CCTV camera footage, 911 calls, and vehicle location - to detect criminal or suspicious activity, will now be releasing a tool designed to respond to the outbreak.

  • StatSocial, a data broker, has announced Crisis Insights, built on StatSocial’s Silhouette social data platform, which monitors and analyses more than 1.3 billion social accounts covering more than 70% of US households. Crisis Insights seeks to identify the changing dynamics of customers and consumers who are engaging with 30 topics across four major categories:

    • Coronavirus/COVID-19 (e.g. medical influencers, elder care, health treatments)
    • Preparedness (home security, social isolation, stockpilers, parenting during crisis)
    • Psychographics (anxiety/stress, #FlattenTheCurve, #ReturnToWork, Urban Exodus)
    • Economy (Business Travellers, Working From Home, Job Security, Stock Market)

What companies need to consider

As these examples demonstrate, companies around the world, big and small, are using this crisis as the impetus to gather more data, do more analysis and share more data - and this has knock-on consequences for all of our rights, and for society, both now and in the long term.

Before steaming ahead, gathering more data, analysing more data and sharing and offering more data, companies should at least be mindful of the following five points:

1. What is your objective?

What is the problem that you are trying to solve? Is what you are proposing going to help towards resolving this public health crisis? Is it something that public health officials and experts are asking for? Is there a demonstrable public health benefit, as defined by public health experts?

Or is your intervention mostly aimed at promoting your business? In which case it’s important to know that in this crucial period with stretched resources and an anxious public, anything which distracts from responding effectively puts people’s lives at risk.

2. Is it lawful?

You must have a clear and lawful justification for whatever you are doing with data. You must be aware of and respect the legal frameworks, including human rights laws and data protection laws, that apply to you and those you may be sharing data with. This includes consideration of the necessity and proportionality of any measures. In practice, this means seeking to achieve your aim in the way that least interferes with people’s rights.

3. Are you being transparent?

Be clear with everyone what you are proposing, what data you are gathering and from where, who it will be shared with and on what basis, how data will be used and for what purposes. Be clear about what the consequences might be and how you will monitor, mitigate and report on potential harms. Carry out human rights impact assessments and data protection impact assessments, and publish these. If you’re claiming something has been anonymised, explain how it has been anonymised and be clear about how it could be de-anonymised. Be open to scrutiny and audit.

4. Does it respect data protection principles?

Data protection laws around the world protect personal data. They should not be seen as a barrier to protecting public health; rather, they are a framework within which data can be shared in a way that is secure and fosters trust. Even if there is no such comprehensive legal protection in place in a country or jurisdiction, you should still seek to respect the core principles.

First, a warning: if you consider that the data you are sharing is not personal data, think again. Data is notoriously hard to anonymise, and anonymisation requires more than simply removing obvious identifiers. It is therefore important to be transparent about data models and anonymisation techniques.
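
To see why removing names alone is not enough, here is a minimal sketch in Python, using entirely made-up records and hypothetical field names: the quasi-identifiers left behind (postcode, birth date, sex) can be linked to another dataset to put names back on the records.

```python
# Illustrative sketch only, with made-up data: even with names removed, a
# "de-identified" dataset can often be re-identified by linking the remaining
# quasi-identifiers (postcode, birth date, sex) against another dataset.

health_records = [  # nominally "anonymised": names removed, quasi-identifiers kept
    {"postcode": "SW1A 1AA", "birth_date": "1985-03-14", "sex": "F", "diagnosis": "COVID-19"},
    {"postcode": "EC1A 1BB", "birth_date": "1990-07-02", "sex": "M", "diagnosis": "influenza"},
]

public_register = [  # e.g. a marketing database or electoral roll that includes names
    {"name": "Jane Example", "postcode": "SW1A 1AA", "birth_date": "1985-03-14", "sex": "F"},
    {"name": "John Example", "postcode": "EC1A 1BB", "birth_date": "1990-07-02", "sex": "M"},
]

# Linking the two datasets on the shared quasi-identifiers re-identifies people.
for record in health_records:
    for person in public_register:
        if all(record[k] == person[k] for k in ("postcode", "birth_date", "sex")):
            print(f"{person['name']} re-identified with diagnosis: {record['diagnosis']}")
```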

Second, put into practice the following principles:

  • data minimisation - not gathering more data than needed;
  • purpose limitation - limiting the purposes for which data can be further used;
  • security - ensuring data is secure with limited access, and that any third parties are taking equivalent measures; and
  • data retention - making clear that special services/measures and data use are being deployed because of a specific crisis and are extraordinary and temporary in character (a brief illustrative sketch of minimisation and retention in practice follows this list).
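
Purely as an illustration - a minimal Python sketch with hypothetical field names and an arbitrary 30-day period, not a recommendation of either - data minimisation and retention limits could be applied along these lines:

```python
# Illustrative sketch only (hypothetical field names and retention period):
# keep only the fields needed for the stated purpose, and delete records once
# the crisis-specific retention period has passed.
from datetime import datetime, timedelta, timezone

ALLOWED_FIELDS = {"coarse_location", "timestamp"}   # data minimisation
RETENTION_PERIOD = timedelta(days=30)               # temporary, crisis-specific

def minimise(record: dict) -> dict:
    """Drop every field that is not needed for the stated public health purpose."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

def purge_expired(records: list, now: datetime) -> list:
    """Delete records once they are older than the retention period."""
    return [r for r in records if now - r["timestamp"] <= RETENTION_PERIOD]

raw = {
    "coarse_location": "Greater Manchester",
    "timestamp": datetime.now(timezone.utc),
    "device_id": "ad-id-1234",     # not needed for the purpose, so dropped
    "contact_list": ["..."],       # not needed for the purpose, so dropped
}

stored = [minimise(raw)]
stored = purge_expired(stored, datetime.now(timezone.utc))
print(stored)  # only coarse location and timestamp remain, and only for 30 days
```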

5. Are you profiting from the pandemic?

If the answer is yes, think very carefully about the benefit you are providing. Ensure that you don’t lock public authorities into exclusive long-term contracts - there must be sunsets. Firewall data used to address the crisis from other business and commercial interests, put limits on its further use and do not monetise data derived from activity related to coronavirus.

Listen to calls to protect rights

As countries and companies around the world have turned to invasive surveillance techniques to fight the crisis, over 100 Civil Society Organisations are calling for these measures to respect human rights.

A number of these points have been emphasised by public health officials:

Dr. Michael Ryan, a key advisor for the World Health Organization, emphasised the need to safeguard privacy and data protection in the responses to the coronavirus:

“We take the issues of personal data protection and intrusion very, very seriously,” he said, adding that the WHO is working to ensure that “all of the initiatives we’re involved with, while aiming to develop good public health information, in no way interfere with the individual rights to privacy and protections under the law. It is important when we talk about surveillance and the surveillance society that in the case of public health the gathering of information about individuals, their movements must be done with the consent of the community and in many cases of the individual themselves.”

As well as data protection oversight bodies:

The Council of Europe Convention 108 Committee emphasised:

Transparency and “explainability” of analytics or AI solutions, a precautionary approach and a risk management strategy (including the risk of re-identification in the case of anonymised data), a focus on data quality and minimisation, and the role of human oversight are some of the key points to take into account in the development of innovative solutions to fight against COVID-19.

Stop exploiting data!

So as a company, if you find you are doing something that doesn’t seem right, or isn’t in accordance with the key points we’ve outlined above, STOP.

Ask, why are you doing it? Is it to help people or to report on them? To help society or to scare people? To minimise the impact of a severe public health crisis or to legitimise your highly problematic and exploitative business model?