PI rings alarm bells and alerts the ICO about the Home Office’s use of algorithms and their impact on migrants

PI filed a complaint with the UK Information Commissioner’s Office (ICO) against the Home Office’s policy and practice of collecting and processing data through two algorithmic tools used in immigration enforcement operations. 

Key findings
  1. The UK Home Office uses two algorithms for immigration purposes, seemingly without sufficient safeguards to protect the right to privacy and meet data protection standards.
  2. Migrants appear to be subject to automated decision-making without adequate human review processes.
  3. Migrants are not adequately informed and are therefore unable to challenge the invasive data processing activities of these automated tools.
Long Read

On the basis of a year of legal research by PI, documents obtained by other civil society organisations (CSOs), and evidence provided by legal representatives fighting these automated systems on behalf of their clients, on 18 August 2025 we issued a formal complaint to the UK Information Commissioner’s Office (ICO) regarding the Home Office’s use of two ‘automated recommendation-making tools’ (ARMTs): the Identify and Prioritise Immigration Cases tool (IPIC) and the Electronic Monitoring Review Tool (EMRT).

Both tools appear to be used in immigration enforcement operations to assist or even replace human decision-making in ways that could lead to life-changing outcomes for migrants, seemingly without sufficient safeguards to protect the right to privacy and meet data protection standards. This raises concerns as to whether the Home Office is adequately complying with the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018, which both regulate the processing of personal data in the UK.

It’s time for the Home Office to be held to account.

Press release

On 18 August 2025, PI issued a formal complaint to the UK Information Commissioner’s Office (ICO) about the Home Office’s use of two automated tools in immigration enforcement operations, which PI argues do not adequately comply with the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018 (DPA 2018).

Shedding light on secretive practices

Unfortunately, there is very little publicly available information about how the UK Home Office is using innovation and new technologies to sustain its dehumanising ‘hostile environment’. This lack of transparency poses a constant challenge not only for migrants themselves but also for those advocating for their rights, including other CSOs and the legal community.

The use of AI and automation is emerging in a variety of sectors. When we started to uncover their use for immigration purposes and saw how they were potentially being deployed in ways that directly harm migrants and undermine their rights and dignity, we committed to investigating further. We wanted to reveal what tools the government is using, understand their impact, and build an evidence base to systematically challenge harmful policies—while empowering migrants to reclaim their rights and dignity against the UK Home Office.

To shed light on this issue, we spent a year submitting freedom of information requests about a tool called “Identify and Prioritise Immigration Cases” (known as IPIC), eventually filing a complaint with the ICO, which finally led the Home Office to disclose some information about the functioning of this AI tool. Even this was not enough, however, as the Home Office refused to give us any specific information, arguing migrants would use it to ‘game’ the system.

Although we obtained only partial information, it was more than had previously been publicly available. Combined with documents made available to PI thanks to the incredible work of the Public Law Project as well as Duncan Lewis Solicitors and Wilson Solicitors, it allowed us to put the pieces of the puzzle together about how the two algorithms - IPIC and the EMRT - operate.

News & Analysis

“IPIC” (“Identify and Prioritise Immigration Cases”) is an algorithm utilised by the UK Home Office that automatically identifies and recommends migrants for particular immigration decisions or enforcement action. After a year of submitting Freedom of Information Act requests, we finally received some information on this secretive AI tool used to decide the fate of migrants.

The Identify and Prioritise Immigration Cases tool (IPIC)

The IPIC tool generates automated recommendations and prioritises casework for immigration enforcement purposes, based on predefined “business rules.” These recommendations inform decisions related to detention, removal, denial of services, voluntary departure, and referral to other government departments. The tool can also target specific groups based on ten different filtering criteria, including location. It is used on individuals who are subject to immigration control and who are therefore liable to detention and removal in accordance with the Immigration Act 1971 and subsequent legislation.
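
To make the mechanics more concrete, here is a minimal illustrative sketch of how a rule-based prioritisation tool of this general kind could work. Every class, rule, field and threshold below is hypothetical: the Home Office has not published IPIC’s actual business rules or code, and this sketch is not a reconstruction of them.

```python
# Hypothetical illustration only: a minimal rule-based prioritisation engine.
# Nothing here reflects IPIC's actual rules, fields, or code, which the
# Home Office has not published.
from dataclasses import dataclass, field

@dataclass
class Case:
    case_id: str
    nationality: str               # one reported kind of filtering criterion
    location: str                  # location is among the ~10 reported criteria
    flags: set = field(default_factory=set)

def apply_business_rules(case: Case) -> list[str]:
    """Generate automated recommendations from predefined 'business rules'."""
    recommendations = []
    if "overstay" in case.flags:           # invented rule, for illustration
        recommendations.append("refer_for_removal")
    if "failed_reporting" in case.flags:   # invented rule, for illustration
        recommendations.append("consider_detention")
    return recommendations

def filter_cases(cases, *, location=None, nationality=None):
    """Target subsets of cases via filtering criteria such as location."""
    for case in cases:
        if location and case.location != location:
            continue
        if nationality and case.nationality != nationality:
            continue
        yield case, apply_business_rules(case)

# Example: prioritise all flagged cases in one city (all data invented).
cases = [Case("A1", "X", "London", {"overstay"}),
         Case("B2", "Y", "Leeds", {"failed_reporting"})]
for c, recs in filter_cases(cases, location="London"):
    print(c.case_id, recs)   # -> A1 ['refer_for_removal']
```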

The Electronic Monitoring Review Tool (EMRT)

The EMRT is designed to assist with the quarterly Electronic Monitoring (EM) reviews carried out by the Home Office to decide whether GPS tracking remains appropriate. First, it can be used to generate an automated harm score, which sets the minimum period an individual will remain subject to an ankle tag before they may be ‘transitioned’ to a non-fitted device (NFD). Second, it can generate automated recommendations as to whether an individual should remain subject to an ankle tag or be transitioned to an NFD.
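
Again purely as a hypothetical sketch, the following shows those two reported functions side by side: a harm score that fixes a minimum ankle-tag period, and a binary recommendation to continue tagging or transition to an NFD. All weightings, bandings and field names are invented for illustration; the EMRT’s actual scoring model is not public.

```python
# Hypothetical sketch of the EMRT's two reported functions: (1) a harm
# score that sets a minimum ankle-tag period, and (2) a recommendation to
# keep the ankle tag or transition to a non-fitted device (NFD).
# All weights, thresholds and field names are invented.

def harm_score(record: dict) -> int:
    """Combine weighted risk factors into a single score (illustrative)."""
    score = 10 * len(record.get("convictions", []))    # invented weighting
    score += 5 if record.get("breach_history") else 0  # invented weighting
    return score

def minimum_tag_months(score: int) -> int:
    """Map the harm score to a minimum period on an ankle tag."""
    if score >= 20:        # invented banding
        return 12
    if score >= 10:
        return 6
    return 3

def review_recommendation(record: dict, months_tagged: int) -> str:
    """Recommend keeping the ankle tag or transitioning to an NFD."""
    if months_tagged >= minimum_tag_months(harm_score(record)):
        return "transition_to_NFD"
    return "continue_ankle_tag"

# Example (invented data): one conviction, no breach history, tagged 6 months.
print(review_recommendation({"convictions": ["x"]}, months_tagged=6))
# -> transition_to_NFD (score 10 -> minimum 6 months, already served)
```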

PI has previously successfully challenged the use of GPS tagging of migrants released on immigration bail, with the ICO and two courts ruling against the Home Office’s policy.

The people impacted by these tools

Behind what the Home Office sees as shiny, sexy tools are people: not mere data points, but people whose lives are being turned upside down. These automated systems appear to be used to decide whether to detain migrants and remove them from the UK, or to subject them to GPS tags based on automated “harm scores”.

This is yet another example of how the UK Home Office appears to believe it can make use of new technologies, such as automated decision-making, without having to uphold its obligations to protect the rights of migrants.

Case Study

Privacy is a fundamental human right that applies to everyone, regardless of where they come from or why they move. Migrants and refugees are no exception. They have the same right to a private life and to be free from intrusive surveillance as anyone else. Yet, for people on the move, this right to privacy is routinely undermined.

Every migrant has the right to dignity, safety, and justice, and these rights must be protected.

No one is above the law. The UK Home Office has an obligation to comply with the Data Protection Act 2018 and the UK General Data Protection Regulation (UK GDPR). It must be held accountable.

What we uncovered raised some serious concerns

We invite you to read our 100-page complaint if you would like more details, but here is a summary of what we uncovered about these two tools, which forms the basis of the complaint PI filed with the ICO on 18 August 2025.

The lack of transparency underpinning their deployment

These two tools have been rolled out in a non-transparent way. Whilst tag wearers subject to the EMRT were provided with information about the tool, that information appeared to be uneven, inconsistent and incomplete; tag wearers were not informed about the nature and extent of the collection and processing of their personal data by the tool.

Those subject to IPIC were not given any information about how this tool was used as part of the decision-making process they were subject to, or how it may process their data.

Our assessment, based on the information we have had access to, is that the way the two tools have been deployed by the Home Office raises concerns about whether the Home Office is facilitating individuals’ right to be informed about how their personal data is being collected and used, in line with the data protection principle of transparency. This is a fundamental protection under the UK GDPR and the DPA 2018, core to ensuring effective protection of people’s data. A failure to respect it leaves those affected in the dark, without meaningful information about how these tools operate and how their data is being processed. It also denies people the ability to challenge decisions that profoundly affect their lives. In a nutshell, it not only strips them of their agency to control how their data is used and what decisions are made about them, but also prevents them from seeking redress when those rights are violated.

There is no clear and foreseeable lawful justification for their use

Our assessment is that the legal justification used by the Home Office for the deployment of these tools is overly broad and ill-defined. It lacks clarity, precision, and foreseeability - arguably meaning the use of the tools is in violation of the data protection principle of lawfulness.

The Home Office has done little to explain how the legal basis and the condition for processing sensitive data it selected for the processing of personal data by IPIC - namely “performance of a public task” and “substantial public interest” respectively - apply in practice. Given the potential intrusiveness of these tools and the sensitivity of the data they could be processing, we would expect a greater level of detail about their legal justification to be provided to individuals whose data is being processed.

Regarding data processing with EMRT, the Home Office does not appear to have considered whether the use of this tool requires a separate legal basis. Instead, it has relied on the legal basis used for IPIC, justifying this by claiming that the functionalities of the two tools align. This is also how the Home Office rationalised its decision not to carry out a separate Data Protection Impact Assessment (DPIA) for the EMRT, relying instead on the DPIA conducted for IPIC, despite significant differences between the two tools.

Given that GPS tagging is governed by a distinct legal framework - the Immigration Act 2016, as compared to the Immigration Act 1971, under which IPIC falls - the Home Office’s current position of using the same legal basis for both tools appears to fall short of the requirement to provide an adequate lawful basis. This failure means individuals do not have a clear and publicly accessible indication that their data is processed by IPIC and/or the EMRT, which contravenes the UK’s obligations regarding limitations on the right to privacy, as well as its obligations under UK data protection law.

News & Analysis

In the span of three months, two UK courts and one regulatory authority handed down rulings on the UK's GPS tagging of migrants, dealing serious blows to the legality of the policy. We delve into these three rulings and their implications for people and the wider policy.

The tools’ use and their large-scale processing, including of sensitive data, are not justified as necessary and proportionate

We found that the Home Office had seemingly failed to carry out necessity and proportionality assessments for either tool - assessments that examine whether the processing of personal data is needed to achieve a legitimate aim and is not unnecessarily intrusive. First, according to the documents we have reviewed, no justification was provided for the tools’ deployment, and neither tool appeared subject to any limitations on its use. Second, there appears to be no mechanism to limit the volume and categories of personal data processed, namely the input data used by each tool to generate its decisions. In our assessment, the stated objectives fail to justify the collection and use of data in the large-scale processing activities that IPIC and the EMRT are seemingly used for, making those activities unnecessary and disproportionate.

The failure to minimise the amount of data processed and to justify the use of the tools themselves is all the more concerning given that these tools are processing vast amounts of sensitive, so-called ‘special category data’, such as health information, vulnerabilities, ethnic or racial origin and genetic or biometric data used for unique identification, as well as records of criminal convictions. Such data is afforded additional safeguards because its misuse can lead to discrimination prohibited under human rights instruments and contrary to constitutional protections of non-discrimination. In the case of these tools, the Home Office does not appear to have considered the risks associated with processing such data, particularly in such large volumes, and we believe it has fallen short of its obligations to safeguard the rights and interests of individuals.

Data being re-purposed to facilitate the tools

The data fed into the tools for decision-making, such as an individual’s name, gender, nationality, date of birth and other data associated with their immigration case as outlined above, includes data from varied and broad datasets (e.g. ‘detention details’), which in several cases may have been collected for wholly different purposes.

The re-purposing of datasets outside their original or primary purpose to generate automated recommendations is incompatible with the data protection principle of purpose limitation. This principle requires that personal data be collected for a specific, clearly defined purpose and not re-used in a manner incompatible with that purpose. Any use of that data for an incompatible purpose must be supported by a new lawful basis and an updated data protection impact assessment, which does not seem to have occurred here.

This concern is compounded by the fact that the retention of certain data is unjustified and in breach of the storage limitation principle, which requires that personal data be kept only for as long as is necessary to achieve the purpose for which it was collected, after which it should be deleted or anonymised. In the case of IPIC, some data may be retained for dubious purposes for at least five years after all other relevant personal data has been deleted. Data processed under the EMRT appears to have no retention limits at all. The mere availability of this data raises concerns about future mission and function creep, where information collected for one purpose is gradually used for unrelated purposes without proper legal authority.

Discriminatory impacts resulting from the design and use of business rules

From the information gathered, it appears that the current uses of IPIC and the EMRT may be adversely impacting individuals who are subject to those tools.

IPIC’s automated prioritisation function enables case filtering based on nationality, which may result in direct discrimination because it can introduce biases and lead to individuals being treated less favourably solely on the basis of a protected characteristic. In addition, the use of ‘associations data’ — linking individuals to others who have interacted with the criminal justice system — may result in indirect discrimination, as it can disproportionately affect certain groups who are overrepresented in law enforcement datasets due to structural inequalities. These features risk disproportionately targeting certain nationalities and/or ethnic groups, without adequate safeguards or justification being apparent.

Video

Migrants in the UK are forced by the Home Office to wear GPS tracking devices - humiliated, controlled, spied on, stigmatised - even if they’ve lived in the UK for years.

Capita PLC is one of the companies contracted by the UK Home Office to deliver this inhumane policy - earning £38 million a year for it.

They’re profiting from the hostile environment.

We’re appalled at this treatment. With Bail for Immigration Detainees and Migrants Organise, we’re calling on Capita PLC to stop providing services to the Home Office.

Regarding the EMRT, there is a significant risk that the tool systematically profiles individuals, for example by incorporating biases related to protected characteristics such as nationality, resulting in continuous 24/7 GPS tracking (whether by way of ankle devices or non-fitted devices (NFDs)). This occurs despite vulnerabilities that may make the use of such intrusive tracking technology disproportionate, and potentially without people’s knowledge. Such tagging practices are well-documented to have a detrimental impact on individuals’ well-being: wearers live in constant fear of triggering a breach alert or running out of battery, and worry that the device will malfunction and their movements will be misinterpreted.

Processing activities subvert reasonable expectations and may be unfair to those affected

The processing activities of the Home Office also do not appear to comply with the data protection principle of fairness, which requires that personal data be processed in a way that individuals would reasonably expect and that does not cause unjustified harm; the use of both tools is very likely to fall outside the reasonable expectations of individuals.

Firstly, individuals are unlikely to anticipate how much of their data is seemingly being used to inform the tools’ decision-making. Moreover, the clear design nudges implemented across IPIC’s recommendations are incompatible with the fairness requirement: caseworkers are required to provide a justification when rejecting a recommendation, but not when accepting one, and they have more time to revise rejected recommendations than accepted ones. Together, these features discourage adequate and fair scrutiny during human review.

In the case of the EMRT, individuals subject to GPS tagging would reasonably expect quarterly reviews of electronic monitoring (EM) to be conducted by a human decision-maker who considers whether GPS tracking remains appropriate. However, the EMRT restricts reviewers to either accepting or rejecting a recommendation, typically to continue tagging or to downgrade to an NFD, without broader discretion. Even more concerningly, an automatically generated harm score determines the total length of time that an individual remains subject to both an ankle tag and an NFD. PI also uncovered an unpublished policy which seems to indicate that periods in which an individual’s tag is out of contact (OOC) are automatically treated as breaches of bail conditions and used as data by the EMRT when assessing compliance. This is despite well-documented concerns regarding both GPS tagging and NFDs.
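
To illustrate why that unpublished policy matters, here is a hypothetical sketch of the logic PI describes, in which every out-of-contact period is recorded as a breach and folded into the compliance data the EMRT draws on. The `max_gap` threshold and data layout are invented; the point is that nothing in this logic distinguishes a flat battery or lost signal from deliberate evasion.

```python
# Hypothetical sketch of the reported OOC policy: every gap in device
# contact is logged as a breach of bail conditions and fed into the
# EMRT's compliance data, whatever its actual cause.
from datetime import datetime, timedelta

def count_ooc_breaches(contact_windows: list[tuple[datetime, datetime]],
                       max_gap: timedelta = timedelta(hours=1)) -> int:
    """Count gaps between consecutive contact windows exceeding `max_gap`.

    Under the reported policy there is no step that examines *why* the
    device was out of contact before a breach is recorded.
    """
    breaches = 0
    for (_, prev_end), (next_start, _) in zip(contact_windows,
                                              contact_windows[1:]):
        if next_start - prev_end > max_gap:
            breaches += 1   # cause of the gap is never considered
    return breaches
```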

Explainer

Electronic tags have been a key part of criminal justice for many years throughout the world. As traditional radio-frequency tags are replaced by GPS ankle tags, we examine how these different technologies work and the seismic shift that will result from 24/7 location monitoring and data analytics, enabled by GPS tags.

As we noted above, there is a lack of transparency about how this system works and how data is processed. This means that an individual will not know when they will be treated as having breached their bail conditions, and as a result may not be able to challenge any decisions made using this data.

The tools are put in control of migrants’ lives without adequate supervision

Recognising the potential risks associated with automated decision-making (ADM), including concerns around inaccuracy, unfairness, and discrimination, it is essential to embed safeguards within any ADM process. Such safeguards should ensure meaningful human intervention and uphold individuals’ rights not to be subject to decisions based solely on automated processing.

In the case of both IPIC and the EMRT, we have found that the human review processes implemented by the Home Office appear inadequate, and the lack of meaningful human intervention in some features of the tools seemingly amounts to decisions made solely by automated processing - such as the harm score generated by the EMRT.

The documents seemingly reveal a practice of automated recommendations being accepted with limited human oversight, by design. This issue is compounded by unclear and inconsistent guidance provided to caseworkers, as well as “design nudges” that encourage acceptance of EMRT and IPIC recommendations with minimal scrutiny, as explained above.

Ultimately, this has led to a situation in which the Home Office appears to be outsourcing complex decisions with direct and immediate life-altering implications for migrants, including children, to opaque tools, while those adversely affected are not informed how they can challenge these decisions and seek recourse.

Not only is solely automated decision-making of this type non-compliant with data protection laws, but it is also unacceptable that a person’s life, dignity, and future are being determined in this way.

The Home Office acted as if it was above the law

The above findings all point to the Home Office’s failure to demonstrate compliance with the data protection principles, as required by the accountability principle.

This is also evidenced, in our view, by the Home Office’s failure to carry out a comprehensive and adequate Data Protection Impact Assessment (DPIA) in the case of IPIC, and to undertake one at all for the EMRT. Undertaking such an assessment is crucial to safeguarding individuals’ rights by identifying and mitigating anticipated negative impacts.

The Home Office’s decision to use these tools in the way it has shows a complete disregard for the legal obligations that protect migrants’ rights - in this case, the data protection laws that regulate the processing of personal data in the UK. This disregard also undermines the dignity of individuals, as robust data protection is essential to safeguarding their privacy, autonomy, and freedom from unjustified surveillance and exploitation by governments.

Automation: yet another threat for migrants

This is yet another example of how migrants continue to face heightened human rights violations through hostile immigration policies and practices. At borders and beyond, their fundamental human rights and dignity are being violated through old and new technologies alike. These systems reinforce the dehumanising rhetoric about migrants.

Long Read

On International Migrants Day, we reflect on wins and losses in the fight against violations of migrants’ rights.

Authorities like the UK Home Office must stop testing invasive and secretive tools on communities in vulnerable positions who depend on them for decisions about where they can live safely, whether they can remain in the UK or face deportation, whether they will be subjected to 24/7 surveillance, or whether their freedoms will be restricted through detention.

The use of ADM is yet another tool in the wider hostile environment, and a very worrying one that must be scrutinised and challenged now, before it is deployed further across immigration enforcement operations and decision-making.

The UK Home Office must be held accountable

We want to see the Home Office put in check and subjected to scrutiny. It cannot get away with deploying invasive practices with such limited transparency.

As a result of our complaint, if the ICO finds that the Home Office has breached the UK GDPR and the Data Protection Act 2018, a few avenues are available to it.

In the first instance, PI is requesting that the ICO issue an enforcement notice requiring the Home Office to cease all data processing activities through the use of the EMRT and IPIC. At a minimum, should it agree with our findings, we would expect the ICO to require the Home Office to bring those processing activities into compliance with the UK GDPR.

Beyond this complaint, PI will continue to call for a humane and dignified approach to immigration enforcement and border management that protects migrants. Immigration policies and practices must be designed and deployed within a legal framework based on principles including fairness and accessibility, and on respect for human rights and dignity. Authorities must also be transparent about the new ways in which technologies are being used to make decisions about migrants, so that these uses can be scrutinised for compliance with existing laws and safeguards. Governments and public authorities responsible for immigration must stop using invasive techniques for immigration control.

To keep up to date with our work on this issue, please sign up to our mailing list.