UK MPs ignore concerned constituents' questions about Facial Recognition Technology
In this piece, we unpack the responses received from UK Members of Parliament between November 2023 and June 2024 following the initial launch of our campaign “The End of Privacy in Public”, and discuss the current state of regulation of FRT in the UK. In doing so, we reiterate our call for FRT’s use to be effectively regulated.
- Cross-party MPs sidestepped direct questions about the use of FRT in their constituencies.
- Some MPs expressed concern and showed some top-level understanding of the risks of FRT.
- There was some recognition amongst MPs of the need to strike a balance.
- Some MPs asserted that FRT is only used by police when necessary.
- Most MPs attempted to assure constituents that the use of FRT was being regulated by data protection laws.
In the wake of Privacy International's (PI) campaign against the unfettered use of Facial Recognition Technology in the UK, MPs gave inadequate responses to concerns raised by members of the public about the roll-out of this pernicious mass-surveillance technology in public spaces. Their responses also sidestepped calls on them to take action.
The UK is sleepwalking towards the end of privacy in public. The spread of insidious Facial Recognition Technology (FRT) in public spaces across the country will mean we can no longer walk down our local high streets without being subject to constant monitoring and identification by dystopian surveillance technology.
This alarming trajectory led PI to launch a campaign entitled “The End of Privacy in Public” in November 2023. The aim was to raise awareness of these concerns in order to mobilise and empower members of the public. It encouraged the public to take action by writing to their elected representatives in the UK, Members of Parliament (MPs), to urge them to act and to provide answers about how FRT is being used in their constituencies.
The campaign was launched after our research revealed a concerning lack of knowledge among MPs, including that 70% of MPs don’t even know whether FRT is being used in their own constituency.
As part of this ongoing campaign, some members of the public not only emailed their MPs, but also shared the responses with PI. We are very appreciative, and thank those who have engaged in our campaign so far.
In this article, we unpack the responses received from MPs between November 2023 and June 2024, and discuss the current state of regulation of FRT in the UK. In doing so, we reiterate our call for FRT's use to be effectively regulated.
In light of the continued harms posed by the unregulated deployment of FRT, Privacy International is relaunching this campaign. We hope to continue to highlight the dangers that this surveillance technology poses and bring it to the attention of MPs.
Key highlights from the MPs' responses
Overall, most of the responses from MPs acknowledged their constituents’ concerns with the deployment of FRT, and showed a basic understanding of some general developments in this area. Some didn’t engage with the harms of FRT at all, however, and none of them provided specific answers about the actual use of FRT in their constituency. Alarmingly, as far as we know, none of the MPs took any further action as requested by their constituents.
Some of the highlights from the MPs' responses are discussed in more detail below.
Cross-party MPs sidestep direct questions about the use of FRT in their constituencies
A major shortcoming is that all of the MPs' responses ignored the questions that were actually put to them by their constituents.
None of the MPs acknowledged, let alone responded to, the clearly listed key requests from their constituents. These included asking the MPs to (1) confirm whether or not FRT is being used in their constituency; (2) contact the local retail consortium and/or write to the largest local retailers and event spaces, to ascertain if they are using FRT in their constituency; and (3) contact the Chief Constable to demand information about the local police force’s deployment of FRT in local public spaces.
Some MPs did take their constituents' emails more seriously and recognised their concerns, with one MP stating:
“I am opposed to the use of facial recognition technology in public surveillance […] I believe that the privacy and freedom from surveillance are fundamental democratic rights and must be protected”.
Expressions of concern and some top-level understanding of the risks
Some MPs acknowledged real concerns about the use of FRT.
For example, one response noted that use by the private sector is growing, stating “I am concerned that private use of biometrics is not subject to the required level of regulatory oversight or due process”, and further recognised concerns about unintended consequences of the use of FRT, such as biases against people from black and ethnic minority backgrounds, and women.
A minority of responses also explicitly opposed the use of live facial recognition technology in public surveillance and detailed how they had taken steps to voice that opposition, such as signing an open letter. Furthermore, two of the responses explicitly asserted the importance of the protection of our privacy, stating:
“As digital technology becomes increasingly prevalent in our day-to-day lives, it is critical that our privacy is protected.”
Recognition of the need to strike a balance
There was also some recognition of the need to strike a balance between “individual liberty and collective safety” when it comes to the use of FRT in public spaces in the UK. One response acknowledged the need for “safeguards and limitations, such as appropriate transparency and accountability measures”. The same MP’s response also urged caution around changes to legislation that could lead to the “proactive surveillance of people”.
Only used by the police when necessary
Two MPs asserted that FRT is only used by police when necessary for the investigation of crimes and to protect the public. Specifically, they stated:
“[t]he Home Office is working with police forces to enable searching of relevant images only where it is necessary for them to do so to investigate crime and protect the public.”
Aside from this broad assertion, however, these responses failed to grapple with their constituents’ concerns that there are not specific safeguards in place to ensure this necessity assessment is being made transparently and appropriately.
The UK Information Commissioner's Office (ICO) similarly opined that proportionality considerations regarding law enforcement use of live FRT must be clearly defined, or “we are likely to continue to see inconsistency across police forces and other law enforcement organisations in terms of necessity and proportionality determinations relating to the processing of personal data”, and warned of concerns about “inconsistency and compliance failures”.
Don’t worry, it’s regulated by data protection laws
Most of the responses from MPs attempted to assure constituents that the use of FRT in the UK is regulated by the General Data Protection Regulation 2016 (GDPR) and the Data Protection Act 2018. They also placed a lot of emphasis on the role of the Information Commissioner's Office. For example, one MP noted: “I want to reassure you that the use of biometric data (including facial images) by private companies to identify individuals is already regulated by the General Data Protection Regulations and the Data Protection Act 2018. Under the legislation, data processing must be fair, lawful and transparent. In addition, individuals who believe their data has been misused can make a complaint to the Information Commissioner's Office (ICO), the independent regulator of the legislation.” The MP went on to note that “[t]he ICO continues to keep a watching eye on companies who fail to observe relevant data protection regulation when it comes to facial recognition.”
Multiple responses made similar references to the regulation of FRT by data protection laws. While this is not incorrect (data protection laws do regulate FRT), they only regulate certain aspects of it. And that regulation differs depending on whether private companies or law enforcement are deploying the FRT. As we detail in the next section, data protection legislation alone is not sufficient to effectively regulate FRT.
How FRT is currently (un)regulated and what PI is doing about it
The MPs who pointed out that FRT is regulated by data protection laws were not wrong: these laws do regulate FRT. FRT entails the processing of personal data, specifically biometric data, and public and private bodies in the UK that process this data are required to comply with data protection laws. But these laws regulate only a narrow aspect of FRT: concerns related to data protection. The deployment of FRT raises significantly more concerns than just ones related to personal data. The indiscriminate mass surveillance of public spaces raises issues related to bias and discrimination, and infringements of the rights to privacy, freedom of expression and freedom of association. The pervasive use of FRT poses additional ethical and legal considerations for our conceptions of privacy in public spaces. Data protection laws do not sufficiently address these concerns.
The deployment of FRT by law enforcement is, arguably, further controlled by the police common law powers, the Surveillance Camera Code (POFA 2012), the Police and Criminal Evidence Act 1984 and the Human Rights Act. However, this patchwork of powers, codes and laws poses some significant challenges that undermine effective regulation. First, none were drafted and implemented to specifically deal with FRT, which often leads to clumsy, not-fit-for-purpose application. Second, although they respond to certain elements, as discussed above with data protection laws, such a patchwork can lead to gaps and overlaps which result in inadequate coverage of the issues concerning FRT. Third, it creates a very complex legal environment which makes it difficult to ascertain obligations and responsibilities, posing a particular challenge for compliance and oversight. Fourth, this lack of legal certainty can result in inconsistent or arbitrary application, which in turn undermines public trust.
This patchwork of laws is accordingly insufficient. It does not effectively regulate all aspects of FRT, creates legal uncertainty and does not provide sufficient safeguards against the risks and harms posed by this intrusive technology. PI is accordingly calling for the effective regulation of FRT.
The Information Commissioner's Office (ICO) has similarly called for a statutorily binding code of practice to provide further safeguards that address biometric technology such as live FRT. As noted by the ICO: “a code of practice would offer law enforcement agencies and the public alike a highly desirable level of clarity and consistency. It would also contribute to the degree of transparency necessary as the use of LFR expands.”
Conclusion
In light of the concerns arising from the MPs' responses, and the continued rollout of FRT in public spaces across the UK since these responses were received, we are relaunching our campaign to highlight the dangers that this surveillance technology poses to society and our right to privacy, and to bring this to the attention of Members of Parliament.