Producing real change: key highlights of our 2025 results by season

Our key achievements since the beginning of 2025.


We continue producing real change by challenging governments and corporations that use data and technology to exploit us.

Though 2025 has been a tumultuous year, we’ve achieved some wins and we would like to share them with you.

Creating real change is hard, and worthwhile change takes time. We uncover problems, draw attention to them, and press for improvement. In the third quarter of the year, we helped ensure Clearview AI’s business falls under the GDPR; challenged facial recognition searches of passport and immigration databases in the UK; saw how our advocacy around banning FRT in schools can be useful for others; pushed for more transparency from the European Commission; and tested a “learning through playing” approach to Data and Elections.

Take a look below for a quick overview of the results we produced or contributed towards, by season.

Autumn 2025

UK tribunal confirms that Clearview AI’s business should comply with GDPR

On 7th October, the UK Upper Tribunal (UT) ruled that Clearview AI’s controversial facial recognition business is subject to the GDPR. This decision follows the ICO’s appeal of a 2023 ruling that suspended Clearview’s £7.5 million fine, which the ICO had imposed following PI’s complaint. Intervening in the appeal, PI and our lawyers played a crucial role in arguing for the interpretation of the GDPR provisions that the court ultimately accepted.

What it means in short: Private companies can’t escape data protection law just because they work closely with foreign law enforcement or national security agencies and should be accountable for their practices.

UK Government takes steps to regulate secret facial recognition searches of passport and immigration databases

For years the UK Home Office has secretly allowed police forces to search UK passport and immigration database photos using facial recognition technology. PI and Big Brother Watch took action calling for a moratorium on this practice. Consequently, in September 2025, the Home Office published new guidance on “Handling facial image search requests from law enforcement organisations”, attempting to address the concerns we raised with them. We are reviewing the guidance and considering potential further action.

What it means in short: Although the UK Government has taken steps to regulate secret facial recognition searches of its passport and immigration databases, we remain committed to challenging this problematic practice.

Campaign against FRT in Indian schools

A coalition of educationists, teachers’ unions, parents’ groups and civil society organisations from India launched a campaign against the Indian government’s plan to introduce facial recognition systems in schools. Their campaign has been grounded in the UN Special Rapporteur on the Right to Education’s call to ban the use of FRT in schools, which was informed by PI’s advocacy.

What it means in short: Facial recognition in schools should be banned. Our advocacy won support from the UN and has now prompted practical action at the national level.

More transparency from European Commission

In August 2025 PI submitted a complaint to the European Ombudsman over the European Commission’s (EC) refusal to give public access to documents concerning various aspects of a series of mergers. As a result, on 17 November, the European Ombudsman opened an investigation into the case.

What it means in short: The European Ombudsman’s investigation is an important step towards greater transparency and accountability in the EC’s practices.

Data and Elections - Learning through playing

Following the publication of our Technology, Data and Elections Checklist, we have created a deck of Technology, Data and Elections Playing Cards. Both the checklist and the deck of cards aim to give electoral observers, civil society organisations and the general public the tools they need to understand and question the role of technologies in the electoral process. A number of CSOs and electoral observers have played the cards and used them in their educational programmes.

What it means in short: Learning about data and elections can be engaging and fun - try it yourself.

Summer 2025

Our Menstruation app research gets further attention from industry

Following the launch of our research on menstruation apps’ data-sharing practices, we have been approached by two app providers. One of the apps was not part of our analysis, but after reading the report its provider offered to meet with us and have an open dialogue about their app and privacy policy.

What this means in short: Period-tracking apps must protect their users. Our research initiated conversations on how firms can make their products better.

US Congress asked PI to testify on the UK Government’s secret surveillance powers (TCN)

In June, the US House Judiciary Subcommittee on Crime and Federal Government Surveillance held a hearing on how foreign governments might influence Americans’ data through the CLOUD Act. The spotlight was on a controversial data-sharing agreement between the US and the UK, and a secret UK order (known as a ‘Technical Capabilities Notice’) that allegedly required Apple to create a backdoor into its iCloud storage to grant potential access to users’ private data. PI was one of two civil society organisations invited to testify, a strong recognition of our expertise and leadership on these critical issues. Meanwhile, our legal challenge against the secret UK order is moving forward. The Investigatory Powers Tribunal (IPT) expressed interest in hearing in public as much as possible of Apple’s and PI’s claims in early 2026.

What this means in short: A secret order that can be used to force the re-architecture of entire systems represents a dangerous, disproportionate and intrusive surveillance power, which needs to be restrained. PI’s extensive experience equips us to challenge that power.

ILO agreed to develop binding standards for platform workers

On 12th June 2025, the General Conference of the International Labour Organization (ILO) agreed a resolution committing to adopt a binding Convention, supplemented by a Recommendation, on decent work in the platform economy. Discussions will continue with a view to adopting these new standards at the International Labour Conference in 2026.

This decision came a couple of weeks after a joint declaration made by PI and 32 other organisations. In our declaration we asked the ILO to protect workers from algorithmic harms by adopting legally binding standards on decent work in the platform economy.

What this means in short: Platform workers should be protected by international labour standards. PI is working with other organisations to demand the necessary standards.

UN expert calls for ban of Facial Recognition Technology in all educational settings and for rights-based approach to safety in education

In June, the UN Special Rapporteur on Education issued her report on “Safety in education”. The document declares that the “right to be safe in education requires an all-encompassing rights-based approach to safety, for all rights-holders, in all contexts, for all hazards” and calls on states to ensure all security measures respect human rights. The report also states that “facial recognition must be banned in all education”. These were among the key advocacy points made in our submission to the report.

What this means in short: Students should be protected from abusive use of tech, particularly Facial Recognition Technology.

Spring 2025

UK tribunal pushes for more transparency and will hear our challenge against secret order

In March 2025, PI launched a legal challenge against the UK government’s use of a Technical Capability Notice. Following our complaints, on April 7 the Investigatory Powers Tribunal confirmed it will hear our challenge to the legality of the Home Secretary’s decision to use a Notice to allegedly force Apple, in secret, to reduce security in order to give the UK Government access to users’ secured data stored on its iCloud service. The Tribunal also rejected the Government’s request to keep basic details of Apple’s case secret. We launched this case in partnership with the UK campaign group Liberty.

What this means in short: The tribunal agreed that it’s in everyone’s interest to have an open examination of the government’s obscure surveillance powers. The next stage of the case will delve into the substance of our legal objections.

Our Menstruation app research gets industry attention

As part of our revisiting of earlier research on the data practices of period-tracking apps, we contacted the companies whose apps we tested in this latest round. We received significant feedback from them. One company said in its reply that it had slightly revised its privacy policy to reflect our position, and others provided relevant comments or recommendations. Given that we only recently launched our report, this is a good indication that our research can drive change.

What this means in short: Amid heightened concerns in a challenging policy environment, period-tracking apps must reconsider their data-sharing practices in order to better protect their users. Getting feedback from companies on our research is a first step towards making their products better.

Lawsuit against use of facial recognition in Brazilian schools

In April 2025, the Public Prosecutor’s Office of Paraná, Brazil, sued the state government for using facial recognition in public schools. The Prosecutor’s Office argues that the program violates children’s data protection rights, especially given their vulnerability and the lack of informed consent. The lawsuit demands: (i) the immediate suspension of biometric data collection from nearly 1 million students, and (ii) R$15 million in damages for collective moral harm. This came after the UN Special Rapporteur on the right to education published her report on academic freedom, which recommends that states ban facial recognition technologies from educational institutions. We have been advocating for this by exposing the problem and calling for the Special Rapporteur to recommend a ban on FRT in schools.

What this means in short: Children in school in Brazil should be free from harmful uses of Facial Recognition Technology, and children’s rights should be properly protected. Our demands materialised in concrete action from an independent legal body.

Kenyan High Court rules against Worldcoin in case brought by ICJ Kenya

On 5 April 2025, the International Commission of Jurists (ICJ) Kenya, our partner organisation, together with the Katiba Institute filed a case against Worldcoin in the High Court of Kenya. They challenged Worldcoin’s collection, processing, and transfer of biometric data (such as iris and facial scans) without proper consent or a legally required Data Protection Impact Assessment. On 5 May, the court ruled that the company had violated Kenya’s Data Protection Act and ordered the deletion of illegally collected data. PI was not directly involved in the case, but supported the advocacy and research around it.

What this means in short: The ruling sets a legal precedent for how tech companies must handle sensitive personal data responsibly, and protects people’s biometric data from misuse by private companies.

Please consider supporting us as we continue our work to protect privacy and human rights around the world.