Get out of our face, Clearview!

Achieved Results

On 29 November 2021, the UK data protection authority (ICO) found "alleged serious breaches of the UK's data protection laws" by Clearview AI, and issued a provisional notice to stop further processing of the personal data of people in the UK and to delete it. It also announced its "provisional intent to impose a potential fine of just over £17 million" on Clearview AI. On 23 May 2022, the ICO issued its final decision, imposing a fine of over £7.5 million on the company, and ordering it to delete and stop processing data of UK residents.

On 16 December 2021, the French data protection authority (CNIL) found Clearview's data processing of French residents illegal and ordered it to cease processing and to delete the data within two months.

On 10 February 2022, the Italian data protection authority (Garante) found "several infringements by Clearview AI", fined the company €20 million, and ordered it to delete and stop processing data of Italian residents.

On 13 July 2022, the Greek data protection authority (Hellenic DPA) imposed a €20 million fine on Clearview AI, the highest fine it has ever imposed, and also required the company to delete and stop processing the data of data subjects located in Greece.

Our legal action against a company that collects photos of you and your loved ones online.


Companies like Clearview AI are in the business of hunting faces. They trawl through sites like Instagram, YouTube and Facebook, as well as personal blogs and professional websites, and save a copy of public photos that contain a face. They then use facial recognition technology to extract the unique features of people’s faces, effectively building a gigantic database of our biometrics. We have taken legal action to stop their practices in Europe.

Why do they do this?

Because they make money by giving police forces, and even private companies, access to their huge database of faces.

What this means is that without you knowing, your face could be stored indefinitely in Clearview AI’s face database, accessed by a wide variety of strangers, and linked to all kinds of other online information about you.

PI thinks what Clearview AI is doing goes against privacy laws and is incredibly invasive and dangerous. It also threatens the freedom of opinion and expression of many. That’s why we, together with three other organisations, have taken legal action against the company.

What’s the problem with companies using public images like this?

Almost everyone has photos of themselves online, whether they know it or not. You could be uploading photos of yourself daily on your social media accounts. You could appear in photos from a work conference you recently attended. You could be sitting at a cafe and end up in the background of some other customer's photo. Or, you could be participating in a street protest and end up in a journalist's photo coverage of it. In many cases, you did not choose for your face to appear online, and where it appears can say a lot about you and your life. But Clearview's technology allows its clients to, at the click of a button, identify you and retrieve all of that information about you.

This form of surveillance constitutes a serious interference with privacy rights. More generally, the development and deployment of this sort of surveillance by private actors has a chilling effect on people’s willingness to express themselves online, and can be a threat to people going about their lives freely. It is crucial for a healthy, thriving and open Internet that people feel free to share personal information and photos however and wherever they want, without the fear that they might be 'grabbed' by private companies and shared with strangers.

These systems can also cause particular harm to vulnerable communities, who are at heightened risk of harassment and discrimination. In the hands of law enforcement and authorities, tools like these could potentially enable the grouping of people based on their ethnicity or other characteristics, opening the door to discriminatory tracking and monitoring, or practices like predictive policing.

What you can do!

While we take legal action to ask regulators to stop Clearview AI’s practices, people in the EU can ask the company whether their face is in its database, and request that it no longer be included in the searches that the company’s clients perform.

While Clearview AI has now removed from its website the information on how to submit an access or deletion request, we believe the company remains legally obliged to respond to people’s requests. PI has provided general guidance on exercising your data access rights.