What we need to see: user control over intelligence

We should be able to know what metadata and other observed and derived data are generated through our interactions, where this data leaks to, and who has access to it. For example, in WhatsApp, SMS, or financial transactions: what does each provider have access to, and what does that allow them to infer?

Individuals should be asked for their consent whenever their data is to be used to generate analytics for purposes beyond their own direct advantage and legitimate interests, even if the data taken from their use is de-identified or anonymised.

Individuals should be able to filter out metadata and other observed data and prevent its processing on platforms, e.g. photo metadata should be stripped before upload unless an individual wishes for it to be disclosed.
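As an illustration, here is a minimal sketch of client-side photo metadata stripping in Python, using the Pillow imaging library; the file names are hypothetical, and any EXIF-aware library would serve equally well. Re-encoding only the pixel data drops embedded GPS coordinates, timestamps, and device identifiers before an image ever leaves the device.

    from PIL import Image

    def strip_metadata(src_path: str, dst_path: str) -> None:
        """Save a copy of an image that carries pixels only, with no EXIF block."""
        with Image.open(src_path) as img:
            clean = Image.new(img.mode, img.size)
            # Copy raw pixels only; GPS tags, timestamps and maker notes are not carried over.
            clean.putdata(list(img.getdata()))
            clean.save(dst_path)

    strip_metadata("holiday.jpg", "holiday_clean.jpg")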

Where systems are de facto compulsory and it is impossible for individuals to object, they should be able to be pseudonymous, and they must be able to represent themselves as their interests require, which in inflexible systems would mean the ability to lie and fabricate data. The exception would be systems with a legitimate and specific purpose, since those systems would minimise the data they collect, which should be the default.

Even as the user interface of devices and services disappears and background processing takes more data than individuals can knowledgeably consent to, we need more transparency over the data processing on devices and the data emerging from devices and services. Just as firewalls are able to identify and interfere with flows of data from computers, we want to see innovations that give individuals control over data emerging from other technologies, from scripts running on websites to IoT devices calling home.
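A minimal sketch of what such user-side egress auditing could look like, assuming the Python psutil library: it reports which local processes hold connections to remote hosts the user has not approved. A real tool would block flows rather than only report them, listing connections may require elevated privileges on some systems, and the allowlist addresses here are placeholders.

    import psutil

    # Remote endpoints the user has explicitly approved (example addresses).
    ALLOWED_REMOTES = {"192.0.2.10", "198.51.100.7"}

    def audit_outbound_connections() -> None:
        """Flag every live connection to a remote host outside the allowlist."""
        for conn in psutil.net_connections(kind="inet"):
            if conn.raddr and conn.raddr.ip not in ALLOWED_REMOTES:
                name = psutil.Process(conn.pid).name() if conn.pid else "unknown"
                print(f"unapproved flow: {name} -> {conn.raddr.ip}:{conn.raddr.port}")

    audit_outbound_connections()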

A deeper understanding of 'identity' is needed: not just something that you register, but something that emerges from your behaviour and conduct. And we should be able to determine what identities and assigned characteristics, or profiles, others generate about us, and maintain the right to object.

What this will mean

Competition law would have to consider the dominance a company achieves through the knowledge it has of individuals' activities, the intelligence and insight it possesses on individuals, groups, and whole societies, and the choices it makes through the design of its systems.

User-generated content systems will permit users to control data disclosure, including through the restraint and even fabrication of observed data. A location tracking and communications tool should allow the user to misrepresent their location to others instead of relying only on system-generated GPS data, and exceptions to this rule must be clear, e.g. gaming. By design these systems must allow for the reduction of observed, derived, and inferable data, e.g. in photos and posts.
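A minimal sketch of user-controlled location disclosure, in Python: the value shared with others is whatever the user sets, and sensor data is used only when the user has not overridden it. The class and method names are hypothetical, not drawn from any existing tool.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Location:
        latitude: float
        longitude: float

    class LocationReporter:
        """Shares the user's chosen location, falling back to the GPS reading."""

        def __init__(self) -> None:
            self._override: Optional[Location] = None

        def set_user_location(self, loc: Location) -> None:
            self._override = loc  # the user's claim always wins over the sensor

        def clear_override(self) -> None:
            self._override = None

        def location_to_share(self, gps_reading: Location) -> Location:
            return self._override if self._override is not None else gps_reading

    reporter = LocationReporter()
    reporter.set_user_location(Location(51.5074, -0.1278))        # claim to be in London
    print(reporter.location_to_share(Location(48.8566, 2.3522)))  # GPS says Paris; London is shared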

Platforms should limit the ability of third parties to conduct unlawful surveillance, and these third parties should not be able to collect personal data (e.g. photo location) except when necessary and proportionate to a legitimate aim. Platforms should also inform users what data is accessible to third parties, how, and under what circumstances.

Essential reform actions

Regulators will need to broaden their remits around data, intelligence, and power, e.g. competition regulators need to reflect on the role of data, and data protection regulators need to widen the scope of their work to consider analytics, anonymised data, and group privacy.

Stronger controls on social media platforms to prevent the generation of SOCMINT (social media intelligence), and stronger rules on access by third parties.

Cases of positive steps

Laws restricting the use of social media data by employers.
Google Latitude permitted you to misrepresent your location.
Apple collecting analytics only with consent and then applying differential privacy (a sketch of the underlying technique follows after this list).
ICRC questioning the metadata of messaging applications.
Mixnet technology development.
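On the differential privacy point above: a minimal sketch of local differential privacy via randomised response, a textbook illustration of the family of techniques Apple draws on rather than Apple's actual implementation. Each user's true yes/no answer is randomised before it leaves the device, so any individual report is deniable, while the aggregate rate remains estimable.

    import random

    def randomised_response(truth: bool) -> bool:
        """With probability 1/2 report the truth, otherwise report a fair coin flip."""
        if random.random() < 0.5:
            return truth
        return random.random() < 0.5

    def estimate_true_rate(reports: list) -> float:
        # Reports say "yes" at rate p/2 + 1/4 when the true rate is p,
        # so invert: p = 2 * (observed - 1/4).
        observed = sum(reports) / len(reports)
        return 2 * (observed - 0.25)

    # Simulate 10,000 users, 30% of whom would truthfully answer "yes".
    reports = [randomised_response(random.random() < 0.3) for _ in range(10_000)]
    print(f"estimated true rate: {estimate_true_rate(reports):.3f}")  # close to 0.30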