Our key achievements from 2025

Here is a selection of our biggest achievements in 2025.


Last year was challenging and brought more uncertainty into our lives. Despite this, in 2025 we made visible progress towards making the world a better place for all of us. We challenged governments and corporations that exploit data and technology, pushed for new national and international policy standards, and drove standard-setting action by courts and regulators. We also educated the public and campaigned alongside others.

As a result, we produced significant impacts that directly affect people across the world. Here are some of our biggest achievements from last year.

Challenging governments and corporations

UK tribunal confirms that Clearview AI’s business should comply with GDPR

On 7 October, the UK Upper Tribunal (UT) ruled that Clearview AI’s controversial facial recognition business is subject to the GDPR. This decision follows the ICO’s appeal of a 2023 ruling that suspended Clearview’s £7.5 million fine, which the ICO had imposed following PI’s complaint. Intervening in the appeal, PI and our lawyers played a crucial role in arguing for the interpretation of the GDPR provisions that the court ultimately accepted.

For the last five years, we have been successfully challenging Clearview’s unlawful practices, seeking accountability in five European countries and protecting hundreds of millions of people whose pictures have been scraped, without their knowledge, to provide facial recognition services to law enforcement authorities.

What it means in short: Private companies can’t escape data protection law just because they work closely with foreign law enforcement or national security agencies; they must be held accountable for their practices.

UK Government takes steps to regulate secret facial recognition searches of passport and immigration databases

For years, the UK Home Office secretly allowed police forces to search UK passport and immigration database photos using facial recognition technology. PI and Big Brother Watch took action, calling for a moratorium on this practice. Consequently, in September 2025, the Home Office published new guidance on “Handling facial image search requests from law enforcement organisations”, seeking to address the concerns we had raised. We are reviewing the guidance and considering potential further action. In December 2025, the Home Office went a step further by launching a consultation to help develop a new legal framework for the use of facial recognition and similar technologies by law enforcement. We responded to that consultation and continue our fight.

What it means in short: Although the UK Government has taken steps to regulate secret facial recognition searches of its passport and immigration databases, we remain committed to challenging this problematic practice.

Higher international policy standards

New Guidance for Observing Personal Data Use in Elections

In December 2025, election observation organisations from all regions of the world met at the UN to reaffirm their commitment to the principles of international election observation and published the Principles and Guidance for Observing Personal Data Use in Elections. These principles and guidance are essential for strengthening the capacity of election observers to assess how personal data is used throughout the election process.

For more than five years, PI has worked to strengthen oversight of data and technology in elections, partnering with global and regional election observation organisations and developing practical tools for monitoring data use in elections. This sustained effort is reflected in the Guidance, which acknowledges our contribution and highlights our checklist as a useful resource for electoral observers.

What it means in short: Publication of the Principles and Guidance for Observing Personal Data Use in Elections by the DoP is an important recognition that modern election observation needs to scrutinise how personal data and technologies are regulated and used during and around elections. This is exactly what we have been advocating for over the last five years.

ILO agreed to develop binding standards for platform workers

On 12 June 2025, the General Conference of the International Labour Organization (ILO) agreed a resolution committing to the adoption of a binding Convention, supplemented by a Recommendation, concerning decent work in the platform economy. Discussions have been initiated with a view to adopting these new standards at the International Labour Conference in 2026.

This decision came a couple of weeks after a joint declaration by PI and 32 other organisations, in which we asked the ILO to protect workers from algorithmic harms by adopting legally binding standards on decent work in the platform economy.

Later in the year, PI and global allies presented proposals to improve key provisions of the draft Convention and Recommendation.

What this means in short: Platform workers should be protected by international labour standards. PI is working with other organisations to demand the necessary standards.

Shaping agendas and inspiring others

PI’s work on Facial Recognition Technology (FRT) in schools gets support at the UN and shapes change in Brazil and India

In 2025, our advocacy against FRT in schools brought some important achievements:

  • Following our earlier advocacy efforts and their results, in April 2025, the Public Prosecutor’s Office of Paraná, Brazil sued the state government for using facial recognition in public schools. The Prosecutor’s Office argues that the program violates children’s data protection rights, especially given their vulnerability and the lack of informed consent.
  • In June 2025, the UN Special Rapporteur on the Right to Education issued her report on “Safety in education”. The report declared that the “right to be safe in education requires an all-encompassing rights-based approach to safety, for all rights-holders, in all contexts, for all hazards”; called on states to ensure all security measures respect human rights; and stated that “facial recognition must be banned in all education”. These were among the key advocacy points made in our submission informing the report.
  • A couple of months later, a coalition of educators, teachers’ unions, parents’ groups and civil society organisations in India launched a campaign against the Indian government’s plan to introduce facial recognition systems in schools. Their campaign is grounded in the UN Special Rapporteur on the Right to Education’s call to ban the use of FRT in schools, which was informed by PI’s advocacy.

What it means in short: Facial recognition in schools should be banned. Our advocacy got support from the UN and resulted in practical action at the national level in India and Brazil.

Courts and regulators adopt decisions setting new standards

More transparency from the European Commission

In August 2025, PI submitted a complaint to the European Ombudsman about the European Commission’s (EC) refusal to give public access to documents concerning various aspects of a series of mergers. As a result, on 17 November, the European Ombudsman opened an investigation into the case.

What it means in short: The European Ombudsman’s investigation is an important step towards better transparency and accountability in the EC’s practices.

Challenging the UK Government’s secret order against Apple

In March 2025, following media reports that the UK Government had issued Apple with an order (known as a “Technical Capability Notice” (TCN)) to maintain the capability to provide access to data stored on its iCloud system by Apple users anywhere in the world, PI sought to intervene in Apple’s case before the UK Investigatory Powers Tribunal and to challenge the TCN regime in general. Since then:

  • Following our complaints, on 7 April 2025, the Tribunal confirmed it will hear our challenge to the legality of the Home Secretary’s decision to use a TCN to create the ability to access users’ secured data stored on Apple’s iCloud service. We launched this case in partnership with the UK campaign group Liberty. The Tribunal also rejected the UK Government’s request to keep basic details of Apple’s case secret.
  • On 23 July 2025, after case management submissions from all parties, the Tribunal issued a case management order. The Tribunal directed the UK Government to agree “assumed facts” with Apple which were to form the basis of a hearing to be scheduled in early 2026.
  • In June, the US House Judiciary Subcommittee on Crime and Federal Government Surveillance held a hearing on how foreign governments might influence Americans’ data through the CLOUD Act. The spotlight was on this data-sharing agreement between the US and the UK, and its interaction with the UK’s order on Apple. PI was one of two civil society organisations invited to testify, a strong recognition of our expertise and leadership on these critical issues.
  • In August 2025, the Investigatory Powers Tribunal (IPT) dismissed Apple’s legal challenge to the TCN regime, following a “change in circumstances”. The case brought by PI and Liberty against the TCN regime will carry on, however, and is moving toward a hearing in 2026.

What this means in short: A secret order that can be used to force the re-architecture of entire systems represents a dangerous, disproportionate and intrusive surveillance power, which needs to be restrained. PI’s vast experience equips us to challenge that power.

Education and working with partners

PI’s research and educational materials are relevant and useful to others

Over the last year, a number of civil society organisations, academics, journalists and experts expressed interest in, or directly used, our materials for educational or communications purposes. Among the most frequently used and referenced materials in 2025 were our Guides (particularly the Social media and messaging apps settings and LLM Guides); our Learning resources (particularly those about Mass Surveillance); our Technology, Data and Elections checklist; our Militarisation of Tech resources; our second Menstruation apps report; and our Technical Capability Notice (TCN) resources. And, of course, we’ve seen stable interest in our Podcasts and Legal action developments.

What this means in short: Our research and educational materials are valued and continue to be a reliable resource for a broad range of audiences.

Supporting partners to achieve change and help affected people

In 2025, we were happy to support our partners’ work and celebrate their achievements. For instance, the court case brought by ICJ Kenya against Worldcoin in the High Court of Kenya resulted in a historic decision: the court ruled that Worldcoin violated Kenya’s Data Protection Act and ordered the deletion of illegally collected data. Also in Kenya, Haki na Sheria, through “art-as-advocacy” events, helped members of local communities understand the Maisha Namba identity system, as well as the various risks associated with it. In Colombia, Karisma worked on the ground with actual and potential beneficiaries of the Sisben welfare system, helping them understand and act upon potential situations of discrimination and exclusion resulting from its design. Our partners Paradigm Initiative and Transparencia Electoral trained election observers and ensured that protecting personal data and critically examining the use of technology in elections are part of their methodologies. This is only an illustrative subset of the much-needed and impactful work done by our network of partners.

What this means in short: Working together with organisations around the world is an effective way to address global privacy challenges arising from the disproportionate and abusive use of technology across various social domains.

Please consider supporting us as we continue our work to protect privacy and human rights around the world.