Advanced Search
Content Type: Examples
Documents obtained under a FOIA request show that Washington, DC police have for years used online surveillance tools to monitor social media activity, collect data on individual users and their social graphs, and monitor public protests. The police departments using these techniques offer little transparency. The documents reveal, however, the claims companies like Dataminr and Voyager make in marketing their technology to police - for example, that their tools can compile a list of thought leaders…
Content Type: Examples
US students demonstrating over the war in Gaza wear masks and blankets to block counter-protesters from filming them or posting images of them online in the hope of identifying them, as has happened repeatedly since the protests began. In some places, university policies or state laws ban wearing masks, even though many protesters prefer wearing masks to help avoid Covid infection.
https://www.theguardian.com/us-news/2024/apr/30/why-are-pro-palestinian-students-wearing-masks-campus
Publication: …
Content Type: Examples
The Metropolitan Police used live facial recognition and attacked a crowd of trans rights campaigners, solidarity activists and anti-fascists protesting a conference on conversion therapy. Participants report being sprayed in their faces with PAVA at close range and subjected to personal physical attacks.
https://netpol.org/2024/03/27/police-surveillance-and-use-of-pepper-spray-at-trans-solidarity-protest-condemned/
Publication: Network for Police Monitoring
Writer: NetPol
Publication date: 2024-03-27
Content Type: Examples
The spread of facial recognition technology - government agencies in 78 countries use facial recognition systems, according to figures from the Carnegie Endowment for International Peace - is changing the risk of participating in protests by making it impossible to count on remaining anonymous or on being part of a group too large to arrest. In one example, police in Moscow reportedly use facial recognition to identify and preemptively arrest people who might be on their way to join protests. In other cases,…
Content Type: Examples
Chinese students and newly graduated activists in London report that they frequently see middle-aged Chinese men at protests watching without participating, that they receive strange phone calls, and that their families have been threatened by local authorities in China. Experts say they may be experiencing an escalation of surveillance that was previously limited to photographs and monitoring. Many of these activists are newcomers to protest and were unaware of these risks. There were more than 151,000…
Content Type: Examples
The UK's political and cultural institutions are increasingly joining the police and private intelligence companies in tracking peaceful activists without transparency or accountability. The intelligence company Welund lists among its customers BP and many other oil and gas giants, as well as public authorities including the Greater London Authority. It provides them with a daily dashboard listing planned demonstrations, events, and protests that may be disruptive. https://www.…
Content Type: Examples
Student protesters accused Harvard administrators of attempting to surveil and identify students participating in a vigil for 100-plus Palestinians who died under Israeli attack while awaiting humanitarian aid. Current Harvard policy prohibits classroom disruptions and reflects a lowered tolerance for protest in general.
https://www.thecrimson.com/article/2024/3/4/protesters-accuse-harvard-surveillance/
Publication: Harvard Crimson
Writer: Sally E. Edwards
Publication date: 2024-03-04
Content Type: Examples
Section 702 of the US Foreign Intelligence Surveillance Act, due to expire in April 2024 unless renewed, is intended to allow intelligence agencies to surveil foreigners overseas, but under the rubric of "foreign influence" or "foreign intelligence gathering" it can easily be abused to surveil Americans at home in response to political leaders' current obsessions. In 2023, government documents showed that the FBI had misused Section 702 to search Black Lives Matter protesters' communications between 2020…
Content Type: Advocacy
In January 2024, the ILO published a report, Realizing Decent Work in the Platform Economy, following a decision by the ILO Governing Body that the 2025 and 2026 International Labour Conferences would discuss standard-setting on decent work in the platform economy. The report - and the new ILO standard in development - are of interest to Privacy International because of the impacts on workers' privacy and autonomy that arise from the growing use of invasive surveillance practices and…
Content Type: Advocacy
In May 2024, we made a submission for the forthcoming report of the UN Special Rapporteur on the right to education to the General Assembly in October 2024.
Among other things, we recommend that the UN Special Rapporteur, in this upcoming report:
Underline the need for a human rights-based approach to all AI systems in the education sector and describe the necessary measures to achieve it.
Reassert that any interference with the right to privacy and the advancement of the right to education due to…
Content Type: External content
The public register is key to addressing the information imbalance of algorithmic management by allowing workers (and candidates) and their representatives to understand what algorithms are being used and how they work. In order to do this, the register must be in accessible, non-technical language and kept up to date. It must include a list of all algorithms that affect workers' treatment while at work. For each listed algorithm, the following information must be included:
This is different…
Content Type: Explainer
Behind every machine is a human person who makes the cogs in that machine turn - there's the developer who builds (codes) the machine, the human evaluators who assess the basic machine's performance, even the people who build the physical parts for the machine. In the case of large language models (LLMs) powering your AI systems, these 'human persons' are the invisible data labellers from all over the world who manually annotate the datasets that train the machine to recognise what is the colour…
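To make the labellers' work a little more concrete, here is a minimal, purely hypothetical sketch of what one manually annotated record and the training pair derived from it might look like. The names used (annotated_example, annotator_id, to_training_pair) are invented for illustration and do not reflect any particular labelling platform or company's pipeline.

```python
# Hypothetical sketch: a single manually annotated training example.
# All field names and values are invented for illustration only.

annotated_example = {
    "text": "The sky over the harbour turned a deep orange at sunset.",
    "labels": {
        "colour_mentions": ["orange"],   # what the human labeller marked up
        "toxicity": "none",              # a typical safety-style annotation
    },
    "annotator_id": "labeller_0042",     # the (usually invisible) human behind the label
    "time_spent_seconds": 37,            # annotation is paid, timed piecework
}

def to_training_pair(example):
    """Turn one annotated record into a (prompt, target) pair for supervised training."""
    prompt = f"What colour is mentioned in: {example['text']}"
    target = ", ".join(example["labels"]["colour_mentions"]) or "none"
    return prompt, target

print(to_training_pair(annotated_example))
```

A model is typically trained on millions of such pairs, each one the product of a few seconds or minutes of human judgment like the record sketched above.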
Content Type: Explainer
Introduction
The emergence of large language models (LLMs) in late 2022 has changed people’s understanding of, and interaction with, artificial intelligence (AI). New tools and products that use, or claim to use, AI can be found for almost every purpose – they can write you a novel, pretend to be your girlfriend, help you brush your teeth, take down criminals or predict the future. But LLMs and other similar forms of generative AI create risks – not just big theoretical existential ones – but…
Content Type: Long Read
Introduction
For years PI has been documenting the market dominance and associated power of Big Tech over the digital economy, and the threats this poses to our privacy and wider rights. The digital economy is characterised by a handful of Big Tech companies that have established and maintained dominance over the digital market through opaque and exploitative practices. Big Tech exploits the data of those who use their platforms in ways which interfere with our privacy and wider rights. In…
Content Type: Advocacy
In an increasingly digitised world, automation, artificial intelligence and sensitive data processing present new and rapidly shifting challenges which underscore the urgent need for states to ensure that the rights of persons with disabilities are explicitly addressed and centred when it comes to the use of data and technology. Digital technologies can offer important opportunities for accessibility and the realisation of human rights of persons with disabilities, but can also present…
Content Type: Video
Links
- Andres Freund's Mastodon - where he revealed the backdoor
- Read more in Ars Technica's article about it
- Read more in The Verge's article
- Read more in Wired's article about it
- Check out this excellent and very helpful infographic
- The XKCD comic we mention
Content Type: Advocacy
As part of our campaign 'The End of Privacy in Public' and our wider work monitoring developments of facial recognition technology (FRT) in the UK, we continue to challenge the government, the police and the private sector regarding their unfettered roll-out of FRT in the UK. To this end, we co-signed a letter sent on 4 June 2024, alongside UK civil society organisations campaigning against the use of facial recognition, to retailers across the UK calling on them not to use live FRT within…
Content Type: Advocacy
As part of our campaign 'The End of Privacy in Public' and our wider work monitoring developments of facial recognition technology (FRT) in the UK, we continue to challenge the government, the police and the private sector regarding their unfettered roll-out of FRT in the UK. In May 2024, we co-signed a letter with a coalition of UK-based NGOs regarding a recent investigation that exposed the Metropolitan Police's (the Met) use of the website PimEyes. PimEyes acts as a facial recognition ‘…
Content Type: Long Read
Social media is now undeniably a significant part of many of our lives, in the UK and around the world. We use it to connect with others and share information in public and private ways. Governments and companies have, of course, taken note and built fortunes or extended their power by exploiting the digital information we generate. But should the power to use the information we share online be unlimited, especially for governments who increasingly use that information to make material…
Content Type: Long Read
Table of contents
Introduction
Weighing the (potential) benefits with the risks
Privacy rights and the right to health
The right to health
Privacy, data-protection and health data
The right to health in the digital context
Why the drive for digital
Improved access to healthcare
Patient empowerment and remote monitoring
But these same digital solutions carry magnified risks…
More (and more connected) data
Data leaks and breaches
Data sharing without informed consent
Profiling and manipulation
Tools are not…
Content Type: People
Tara is a Legal Officer at Privacy International. She works on legal advocacy and supports PI’s litigation efforts in the areas of surveillance, the authoritarian use of technology and social and economic justice. Tara was admitted as an attorney in South Africa in March 2018, and holds a Master of Laws (LLM), Bachelor of Laws (LLB) and undergraduate degrees in law and political science. Before joining PI, Tara worked as a human rights attorney in South Africa, specialising in information…
Content Type: Advocacy
What's happening with digital ID in Kenya?
In 2018, the Kenyan government tried to introduce the Huduma Namba project. Among other things, the project established the National Identity Integrated Management System (NIIMS), a centralised database intended to consolidate all government records about an individual into a single ID system. In April 2019, PI submitted an expert affidavit challenging the NIIMS. In 2020, the High Court of Kenya acknowledged several key issues raised by PI in its…
Content Type: Long Read
In 2024, Privacy International continued to produce real change by challenging governments and corporations that use data and technology to exploit us. Since the beginning of the year, we’ve achieved some big wins and would like to share them with you. Take a look below for a quick overview of the results we produced or contributed towards, by season.
Winter & Spring 2024
New EU regulation empowers consumers
On 17 January 2024, the European Parliament adopted the Directive on empowering…
Content Type: Advocacy
Generative AI models cannot rely on untested technology to uphold people's rights
The development of generative AI has been dependent on secretive scraping and processing of publicly available data, including personal data. However, AI companies have to date had an unacceptably poor approach towards transparency and have sought to rely on unproven ways to fulfill people's rights, such as the rights to access, rectify, and request deletion of their data. Our view is that the ICO should adopt a stronger…
Content Type: Advocacy
At PI we have been observing with concern the rapid expansion of technologies in educational settings, which has included a wide array of tools that allow the surveillance of students and academic staff, to the detriment of their privacy and academic freedom. We consider this upcoming report as an essential platform to examine the intricate interplay between academic freedom, freedom of expression, and surveillance conducted by both public and private entities through Education…
Content Type: Advocacy
While PI recognises the threats posed by cybercrime, PI reiterates the need both for a narrow scope for the proposed Convention, focusing solely on core cyber-dependent crimes, and for effective safeguards throughout the entire treaty to ensure human rights are respected and protected, especially in the areas of privacy and freedom of expression. Throughout the negotiations, most of the proposals by Member States and other stakeholders aimed at restricting the scope of the treaty and…
Content Type: Video
Links
- Read more about PI's work on encryption
- Matt Blaze and crypto.com; you can now find Matt at mattblaze.org
- More about ITAR and the export of cryptography
- More about France's ban on encryption ending in this 1999 article from the Register
- More about the Data Encryption Standard
- Find out more about the Clipper Chip or take a look at this NY Times article from 1994 (paywalled)
- Matt Blaze's flaw in the Clipper Chip
- NSA Data Center and NSA holding data
- An…
Content Type: Advocacy
On 18 October 2023, the Inter-American Court of Human Rights (IACtHR or Court) issued a historic judgment declaring the Republic of Colombia internationally responsible for human rights violations against several members of the human rights non-profit Colectivo de Abogados y Abogadas José Alvear Restrepo (CAJAR) and their relatives. This groundbreaking decision marks the first acknowledgment within the inter-American context of a state’s international responsibility for violating the right to…
Content Type: News & Analysis
Is the AI hype fading? Consumer products with AI assistants are disappointing across the board. Tech CEOs are struggling to give examples of use cases that justify spending billions on Graphics Processing Units (GPUs) and model training. Meanwhile, data protection concerns are still far from being addressed.
Yet, the believers remain. OpenAI's presentation of ChatGPT was reminiscent of the movie Her (with Scarlett Johansson's voice even being replicated à la the movie), Google…