Amazon’s facial recognition software, Rekognition, incorrectly matched 28 US lawmakers to mugshots


The American Civil Liberties Union (ACLU) used Rekognition, Amazon’s facial recognition software, to compare images of US lawmakers against a publicly available database of 25,000 mugshot photos. The software incorrectly matched 28 lawmakers to people in mugshots, and these false matches fell disproportionately on members of minority communities, including six lawmakers who were Black. While lawmakers from minority groups made up 20 percent of Congress, they accounted for 40 percent of the false matches. The result corroborates prior research showing that facial recognition technology is more likely to produce false matches for women and people with darker skin.

The ACLU’s study raises concerns that Amazon’s software will disproportionately subject women and members of minority communities to wrongful arrests and further exacerbate racial disparities in the criminal justice system. Oregon police are already using Rekognition, and Amazon is actively pitching the software for real-time use by law enforcement in other jurisdictions. Equipping law enforcement with this technology could make minority communities more vulnerable to abuses of power and deter people in those communities from exercising their rights to freedom of expression and freedom of religion.



Author: Jacob Snow

Publication: ACLU of Northern California
