Submission to the Scottish Parliament’s Justice Sub-Committee on Policing inquiry into facial recognition policing


On November 1, 2019, we submitted evidence to an inquiry carried out by the Scottish Parliament into the use of Facial Recognition Technology (FRT) for policing purposes.

In our submissions, we noted that rapid advances in artificial intelligence and machine learning, and the police deployment of new technologies that seek to analyse, identify, profile and predict, have had and will continue to have a seismic impact on the way society is policed.

The implications come not solely from privacy and data protection perspectives, but from the ethical question for a democratic society of permitting the roll out of such intrusive technology.

As the European Data Protection Supervisor recently confirmed:

A person’s face is a precious and fragile element of her identity and sense of uniqueness. It will change in appearance over time and she might choose to obscure or to cosmetically change it – that is her basic freedom. Turning the human face into another object for measurement and categorisation by automated processes controlled by powerful companies and governments touches the right to human dignity – even without the threat of being used as a tool for oppression by an authoritarian state.

Moreover, such technology tends to be tested on the poorest and most vulnerable in society: ethnic minorities, migrants and children.

We are deeply concerned that the use of FRT by the police, including Police Scotland, raises significant problems for fundamental rights and individual freedoms. So far, this technology has offered law enforcement new opportunities to experiment with or engage in novel forms of surveillance in an arbitrary or unlawful fashion, which lacks transparency and proper justification and fails to satisfy both international and European human rights law standards.

Due to the impermissibly intrusive nature of this technology, Privacy International submits that live or real-time FRT should never be deployed. Moreover, police should not be allowed to use this technology without a justification based on concrete and specific threats to national security or public safety, nor in the absence of legal safeguards such as publicly available legal frameworks, the existence of reasonable suspicion, independent authorisation and ex post notification. Such requirements effectively mean that the situations in which this technology can be used in accordance with the law are so few as to render any further testing or expenditure of resources on FRT disproportionate.

Only once these bare minimum safeguards are in place is it possible to take an informed approach as to whether FRT should ever be deployed. As the European Court of Human Rights has underlined:

a state that claims a pioneer role in the development of new technologies bears special responsibility for striking the right balance in this regard.