Content type: Long Read
TAKE ACTION TO STOP THE END OF PRIVACY IN PUBLIC
1. Introduction
The use of facial recognition technology (FRT) by law enforcement and private companies in public spaces throughout the UK is on the rise. In August 2023, the government announced that it is looking to expand its use of FRT, which it considers “an increasingly important capability for law enforcement and the Home Office”. The indiscriminate use of this dystopian biometric technology to identify individuals in public spaces is a form…
Content type: Examples
Based on a draft methodology from Russia's Emergency Situations Ministry, the business daily Kommersant reports that Rostec's data subsidiary, Natsionalny Tsentr Informatizatsii, is developing software that will use machine learning to detect and prevent mass unrest. The software will analyse news reports, social media postings, public transport data, and video surveillance footage; if it fails to prevent mass unrest, the software is expected to direct the crowd's movements to stop the unrest from escalating. The…
Content type: News & Analysis
As Amnesty International and Forbidden Stories continue to publish crucial information about the potential targets of NSO Group’s spyware, we know this much already: something needs to be done.
But what exactly needs to be done is less obvious. Even though this is not the first time that the world has learned about major abuses by the surveillance industry (indeed, it’s not even the first time this month), it’s difficult to know what needs to change.
So how can the proliferation and use of…
Content type: Explainer
What is predictive policing?
Predictive policing programs are used by the police to estimate where and when crimes are likely to be committed – or who is likely to commit them. These programs work by feeding historic policing data through computer algorithms.
For example, a program might evaluate data about past crimes to predict where future crimes will happen – identifying ‘hot spots’ or ‘boxes’ on a map. But the data these programs use can be incomplete or biased, leading to a ‘feedback…
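The 'hot spot' approach described above can be sketched as a simple counting model. This is a hypothetical illustration of the general technique, not any vendor's actual system: the grid cells, data, and function names are invented for the example.

```python
from collections import Counter

def predict_hot_spots(incidents, top_n=2):
    """Rank map grid cells by historic incident counts.

    incidents: list of (x, y) grid cells where past crimes were recorded.
    Returns the top_n most frequently recorded cells -- the predicted 'hot spots'.
    """
    counts = Counter(incidents)
    return [cell for cell, _ in counts.most_common(top_n)]

# Historic (and possibly biased) policing data: recording is heaviest in cell (1, 1).
history = [(1, 1), (1, 1), (1, 1), (4, 2), (4, 2), (0, 3)]
print(predict_hot_spots(history))  # [(1, 1), (4, 2)]
```

The sketch also makes the feedback loop visible: if police patrol the predicted cells more heavily, they record more incidents there, those incidents are appended to `history`, and the same cells are predicted again — regardless of where crime actually occurs.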