Content type: Examples
In 2018, a report from the Royal United Services Institute found that UK police were testing automated facial recognition, crime location prediction, and decision-making systems while offering little transparency about how they were evaluated. An automated facial recognition system trialled by the South Wales Police incorrectly identified 2,279 of 2,470 potential matches. In London, where the Metropolitan Police deployed facial recognition at the Notting Hill Carnival in 2017, the system was wrong 98% of…
Content type: Impact Case Study
What is the problem?
For over two decades we have been documenting an alarming use and spread of surveillance. It is no longer just the wars on terror, drugs, or migration that are driving this trend. The management of health crises and the distribution of welfare, among others, are regularly being used to justify this turn to increasingly invasive forms of surveillance. From country to country we see the same ideas and the same profiteers expanding their reach.
When we first released our report on…