King’s Cross has been watching you - and the police helped

Police forces in the UK began trialling facial recognition technology at public events a few years ago, but they are now taking the questionable use of this extremely intrusive technology a step further.

Key findings
  • Police in the UK have been using facial recognition at crowded events since 2016.
  • The property owner of a King’s Cross site was revealed to have secretly operated CCTV cameras equipped with facial recognition technology (FRT).
  • It was later reported that these cameras were the result of a partnership between the site’s owner and the Metropolitan Police.
  • A report by the Met Police shows that there was no oversight of this agreement, even though local police had shared images of citizens with a private company.
Case Study
police serving ice cream

By assigning policing functions to private companies, governments put our everyday lives and interactions under constant surveillance.

Facial recognition technology (FRT) is increasingly present in our daily lives, for example as an authentication method to unlock phones. Despite its useful applications, FRT can also be just another technology used by those in power to undermine our democracies and carry out mass surveillance. The biometric data collected by FRT can be as uniquely identifying as a fingerprint or DNA. The use of this technology by third parties, especially without your consent, violates your right to privacy.

The proliferation of surveillance technologies, and consequently of the companies that develop them, has provided governments with a myriad of opportunities to outsource surveillance. Companies are invited, compelled or even volunteer to team up with police and law enforcement agencies to install CCTV systems, facilitate smart cities, provide access to personal data, or even carry out policing functions traditionally entrusted to the state.

In this piece we look at what the first trials of FRT by police in the UK were like. We then delve into a secret deal reportedly struck between London’s Metropolitan Police and a private company at King’s Cross in London.

Police and FRT in the UK

The first trials of FRT by police forces in the UK achieved nothing but mediocre results.

In August 2016, the Metropolitan Police conducted the UK's first trial of a facial recognition system for crowd control. The trial took place at the 50th anniversary of the Notting Hill Carnival, an event attended by more than one million people each year. The Met Police used a database of faces of individuals who were either banned from attending the event or wanted by police at the time. The trial failed to pick out any suspects in the crowd.

The following year, the Met Police tried the facial recognition system again at the same event, hoping they might be lucky this time. The system did identify individuals, but it did so with a failure rate of 98%, wrongly flagging a total of 102 people as suspects.

South Wales Police was given £2.1m by the Home Office to deploy and test FRT at mass gatherings such as concerts and royal family visits. But again, the technology reportedly misled the police 91% of the time.

Despite the unimpressive statistics gathered from these short-term FRT operations, we have seen the police go a step further in the use of the technology, turning to private entities for long-term secret deals.

 

person being handed coffee by a police officer
The last thing we want, by entrusting companies with these intrusive tools, is creating one more reality full of exploitation and abuse.

Partnering with the private sector

King’s Cross Central Limited Partnership (KCCLP), the private owner of the 67-acre King’s Cross Central site in London, secretly ran CCTV cameras equipped with FRT on its premises for nearly two years, between May 2016 and March 2018. This operation involved a deal between KCCLP and the Metropolitan Police, which was kept from the public.

When the use of FRT in King’s Cross was first reported, a spokeswoman said it was meant to “ensure public safety” and that it had only been used to help the Metropolitan and British Transport Police “prevent and detect crime in the neighbourhood”. Both police forces first told BBC News that they were unaware of any partnerships or involvement in the case, but the collaboration had been running since 2016.

As the UK Information Commissioner's Office (ICO) commenced an investigation, the Mayor of London asked for an explanation. The Metropolitan Police handed the Mayor a report disclosing that the agreement had been struck at borough level and had never reached higher levels of the force.

The first agreement to enable image sharing with Kings Cross Estate Services was signed in 2016. At that time the MPS did not operate the Basic Command Unit structure. In 2016, Facial Recognition for purposes connected to law enforcement was also in its infancy. It is within this context that Camden Borough entered into an information sharing agreement to enable images to be shared with Kings Cross Estate Services.
Source: Report to the Mayor of London by the Metropolitan Police

The fact that a technology is not mainstream, or is in "its infancy" in the context of law enforcement, does not mean it can be used without oversight or reporting. On the contrary, such an intrusive technology should demand extra transparency and scrutiny.

The report mentions that “Camden Police provided images of wanted individuals, known offenders and missing people to Kings Cross Estate Services”. Who are the people in these databases? Researchers have complained about the lack of clarity in defining the term “wanted” in facial recognition databases in the past.

The condition of being ‘wanted’ was consistently stated as a criterion for being enrolled on a watchlist. Those included on the watchlist thus apparently ranged from individuals wanted by the courts to those wanted for questioning, across a range of different offences.
Source: quote from two academics at the University of Essex, Daragh Murray and Pete Fussey, for Wired.

This ambiguity means that any of us could end up on one of these watchlists. Having sensitive biometric data, such as facial images, exchanged by the police in secret and in the absence of any proper oversight or safeguards is definitely not something to be taken lightly.

Following the debate that surfaced after the above revelations, King’s Cross Central Limited Partnership announced that it had dropped its plans to reintroduce any form of FRT at the King’s Cross Estate.

We believe that the use of FRT by private companies is extremely intrusive, unnecessary and disproportionate. The privatisation of surveillance is nothing more than an effort to distort long-established societal premises of privacy and perceptions of authority, at the expense of our dignity. The maintenance or endorsement of secret watchlists by private companies is fundamentally against democratic principles.

Governments often abuse their powers to carry out unlawful surveillance; PI has long been campaigning against governments' use of IMSI catchers, indiscriminate data retention schemes and government hacking, to name a few practices that can seriously undermine both our privacy and our security as Internet users.

The last thing we want, by entrusting companies with these intrusive tools, is to create one more reality full of exploitation and abuse. And, just as governments fail to meet their transparency and accountability obligations, companies often hide behind confidentiality and trade-secret exceptions to refuse to provide us with information about, for example, their data practices.

Police should not have dual loyalty to a private company and the public; police should be loyal to the public. A private company's aim is to generate profit, not to protect us. Such companies do not act in the public interest, and this increases the risk of abuse and exploitation.

Is it over then?

Despite the announcement made by KCCLP, public-private surveillance partnerships are still lurking around the globe. Recently, Amazon announced a one-year suspension on sales of its facial recognition technology to law enforcement. But what about Amazon Ring and its collaborations with police forces? We must keep an eye on partnerships like these and stop them from trampling on our rights under the justification of serving a greater good.

There is power in our hands to challenge abuse. If you want to know more about how to request information from public and private entities, check our Freedom of Information and Data Subject Access Request tips. These are tools that we as citizens can use to exercise control over our data and demand that both governments and companies are held to account!