
Privacy International issues complaint to the UK regulator over the deployment of two immigration algorithms suspected of failing to adequately comply with data protection law
- The UK Home Office uses two algorithms for immigration purposes, seemingly without sufficient safeguards to protect the right to privacy and meet data protection standards.
- Migrants appear to be subject to automated decision-making without adequate human review processes.
- Migrants are not adequately informed and are therefore unable to challenge the invasive data processing activities of these automated tools.

On 18th August 2025, Privacy International (PI) issued a formal complaint to the UK Information Commissioner (ICO) about the Home Office’s (HO) use of two automated tools in immigration enforcement operations, which PI argues do not adequately comply with the UK General Data Protection Regulation (GDPR) and Data Protection Act 2018 (DPA 2018). The Identify and Prioritise Immigration Cases (IPIC) tool and the Electronic Monitoring Review Tool (EMRT) appear to be used to make life-altering decisions, including detention, removal from the UK, or electronic monitoring via GPS tags, in ways PI argues lack sufficient safeguards for the right to privacy and data protection standards.
The complaint is based on crucial evidence and valuable resources provided by the Public Law Project, Duncan Lewis Solicitors and Wilson Solicitors.
The complaint highlights significant concerns with these tools, including the collection, retention, and processing of data in presumed violation of many of the legal requirements set out in the UK GDPR and the DPA 2018. From PI’s perspective, the HO’s use of these tools is opaque, lacks a clear and foreseeable justification, processes unjustifiable volumes of personal data, is subject to inadequate human review, and may be adversely affecting the migrants subject to them, amongst many other concerns. Had the Home Office carried out a proper Data Protection Impact Assessment (DPIA), as it is legally required to do, it might have identified these issues.
PI is asking the ICO to issue an enforcement notice requiring the HO to cease all data processing activities carried out through these tools, to investigate the HO’s use of the tools, to assess their compliance with the UK GDPR, and to ensure the HO complies with data protection laws.
This complaint is part of wider efforts by Privacy International to expose and challenge the invasive and abusive technologies and automated systems to which migrants are exposed, in the UK and beyond, in an increasingly hostile environment. The UK Home Office has already been the subject of various legally binding decisions ruling some of its policies and practices, including the electronic monitoring of migrants and the seizure and extraction of data from migrants’ mobile phones, to be unlawful.
Serhat Ozturk, Legal Officer, Privacy International said:
“It is concerning that once again we have found that the Home Office is resorting to using invasive and secretive tactics as part of a wider hostile environment policy. In addition to the Home Office’s disregard to its legal obligations, we are particularly worried that the Home Office is trusting opaque automated systems to make decisions that will forever alter people’s lives, by deciding whether they can stay in the UK, and whether they will be subject to detention or 24/7 monitoring if they do get to stay. And worryingly, those affected are not even informed that these tools are being used which is leaving them in the dark and without the ability to challenge decisions that profoundly affect their lives.”
Arianne Griffith, Research Director, Public Law Project said:
“The Home Office’s use of technology to make decisions, including around electronic monitoring as a condition of bail in immigration cases, has been worryingly opaque. These are decisions with incredibly high stakes. One algorithmic tool - the Electronic Monitoring Review Tool - for example, recommends whether somebody should be fitted with a GPS tracking device which affects how they go about their day. Research by PLP, Bail for Immigration Detainees and Medical Justice has shown that these devices are dehumanising and can cause considerable psychological harm. We welcome Privacy International’s efforts to hold the Home Office to account for its use of this technology. Transparency is critical for us to be sure that government decisions are accurate, lawful, and fair.”
Jonah Mendelsohn, Solicitor, Public Law & Human Rights, Wilson Solicitors LLP said:
“The accelerating deployment of AI in immigration enforcement raises profound questions about fairness, accountability and transparency. Immigration decisions, which are often life-changing for our clients, should not be made by systems that are impossible to scrutinise or challenge. Without greater clarity on when and how such data-intensive technology is used in the immigration system, there will always be risks to the fundamental rights of migrants.”
Jeremy Bloom, Consultant Solicitor at Duncan Lewis said:
“We are deeply concerned about the lawfulness of the Home Office’s use of these algorithms in cases involving our clients. The lack of detail in Home Office records when the IPIC algorithm is used does nothing to allay those concerns. IPIC recommendations are sometimes accepted by officials without any reasons given, and the use of the algorithm is clearly being used in decision-making relating to individuals held in immigration detention. We haven’t seen evidence that individuals are being made aware that these tools are being used in the processing of their data, nor have we seen assessments of when, how and why the tools will be used, and what measures are in place to safeguard individual privacy and data rights. We are grateful to Privacy International for bringing this complaint and urge the ICO to act robustly to protect those affected by these tools.”
Notes to Editors
The use of automated tools to assist or replace human decision-making has grown considerably across the public sector in the UK. This complaint focuses on two algorithms which appear to be used in immigration enforcement operations: the Identify and Prioritise Immigration Cases (IPIC) tool and the Electronic Monitoring Review Tool (EMRT).
About IPIC
The IPIC tool generates automated recommendations and prioritises casework for immigration enforcement purposes, based on predefined “business rules.” These recommendations inform decisions related to deportation, assessing barriers to removal, determining suitability for digital reporting, detention, removal, voluntary departure, emergency travel documents, denial of services, and referral to other government departments.
It can be deployed on specific groups based on filtering criteria. The tool is used on individuals who are subject to immigration control and who are therefore liable to detention and removal pursuant to the Immigration Act 1971 and subsequent legislation.
About EMRT
The EMRT is designed to assist in the context of quarterly Electronic Monitoring (EM) reviews carried out by the Home Office to decide if GPS tracking remains appropriate as a condition of immigration bail pursuant to paragraph 4 of Schedule 10 to the Immigration Act 2016.
First, it can determine, via an automated harm score, the minimum period an individual will remain subject to an ankle tag, after which they may be ‘transitioned’ to a non-fitted device (NFD). NFDs are handheld devices equipped with a fingerprint scanner that require the subject to submit biometric information several times a day. Second, it can generate automated recommendations as to whether an individual should remain subject to an ankle tag or be transitioned to an NFD.
Concerningly, the automatically generated harm score appears to determine the total length of time that an individual remains subject to both an ankle tag and an NFD.
Human involvement in the harm score seems limited to inputting information into the tool and checking whether the minimum amount of time relative to an individual’s harm tier has expired before transitioning them to an NFD.
PI has previously successfully challenged the use of GPS tagging of migrants released on immigration bail, with the ICO and two courts ruling against the Home Office’s policy.
Specific violations
Based on information obtained by Privacy International through Freedom of Information requests, documents shared by other civil society organisations (CSOs), and evidence gathered by legal representatives, PI has submitted a complaint to the ICO highlighting concerns with these tools, including:
- No transparency (and/or inadequate information) is provided to data subjects as to the nature and extent of data collection and processing.
- There is an absence of a clear, accessible and foreseeable legal basis authorising the processing in violation of the lawfulness principle.
- The processing does not comply with the fairness principle and in particular falls outside the reasonable expectations of data subjects.
- The extent of data collected and the uses of these tools do not comply with the principles of necessity and proportionality.
- The re-purposing of input datasets to generate automated recommendations is incompatible with the purpose limitation principle.
- The retention of certain data is unjustified and in breach of the storage limitation principle.
- The HO has failed to carry out a lawful Data Protection Impact Assessment (DPIA), and in the case of the EMRT has failed to undertake a DPIA at all. It has also failed to demonstrate compliance with the data protection principles pursuant to the accountability principle.
- The human review processes implemented by the HO are inadequate, as the HO may in certain cases be carrying out solely automated decision-making in breach of Article 22(1) of the UK GDPR.
For additional follow up please contact: [email protected]
For additional information:
- Visit the case page for this complaint: https://www.privacyinternational.org/legal-action/ico-complaint-against-uks-automated-recommendation-tools-immigration-operations
- WhatDoTheyKnow, ‘Privacy International request to Home Office’, (27 July 2024), https://www.whatdotheyknow.com/request/clarification_regarding_uses_of.
- Read our summary of the complaint: https://www.privacyinternational.org/long-read/5639/pi-rings-alarm-bell-and-alerts-ico-about-use-algorithms-home-office-and-their-impact
- Learn about Data Protection: https://privacyinternational.org/taxonomy/term/512
Privacy International (PI) is a London-based charity that campaigns against companies and governments who exploit our data and technologies. We expose harm and abuses, mobilise allies globally, campaign with the public for solutions, and pressure companies and governments to change.
Visit our website at www.privacyinternational.org