After three years, Amazon stopped using an AI-based hiring tool that discriminated against women


From 2014 to early 2017, Amazon used an artificial intelligence (AI) hiring tool to review applicants’ resumes and select qualified candidates. The tool was trained on ten years of Amazon’s previous hiring decisions; because the company had hired predominantly male candidates over that period, the tool learned to favor male candidates rather than the most qualified ones. It penalized resumes containing the word “women’s” (as in membership in women’s organisations or attendance at all-women’s colleges) and preferred resumes that used “masculine” language. Amazon discovered this term-based discrimination in 2015 and edited the tool to treat women-related terms as neutral. By early 2017, however, the company abandoned the tool entirely because it could not ensure that the model would not learn other ways of discriminating against otherwise qualified candidates. Amazon’s botched experiment illustrates the transparency problems posed by AI and how it can encode and entrench existing human biases.
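The underlying failure mode can be sketched with a toy model. The code below is a hypothetical illustration, not Amazon’s actual system: a simple term-frequency scorer trained on biased historical labels learns a negative weight for the word “women’s,” and zeroing out that one term does not remove the bias, because correlated proxy terms (here, the invented example “soccer”) still carry it.

```python
# Hypothetical sketch: a toy resume scorer trained on biased historical
# hiring labels. All resumes and labels below are invented examples.
from collections import defaultdict

def train_term_scores(resumes, hired):
    """Weight each term by how much more often it appears in hired
    resumes than in rejected ones (a crude frequency-difference score)."""
    hired_counts, rejected_counts = defaultdict(int), defaultdict(int)
    for text, label in zip(resumes, hired):
        for term in set(text.lower().split()):
            (hired_counts if label else rejected_counts)[term] += 1
    n_hired = sum(hired) or 1
    n_rejected = (len(hired) - sum(hired)) or 1
    terms = set(hired_counts) | set(rejected_counts)
    return {t: hired_counts[t] / n_hired - rejected_counts[t] / n_rejected
            for t in terms}

def score(resume, weights):
    """Score a resume as the sum of its terms' learned weights."""
    return sum(weights.get(term, 0.0) for term in set(resume.lower().split()))

# Toy historical data in which past hires skew male.
history = [
    ("captain men's chess club", 1),
    ("software engineer men's rugby team", 1),
    ("software engineer", 1),
    ("captain women's chess club", 0),
    ("software engineer women's soccer team", 0),
]
weights = train_term_scores([r for r, _ in history], [h for _, h in history])

# The model has learned to penalize "women's" purely from biased labels.
assert weights["women's"] < 0

# "Fixing" the model by neutralizing that single term...
weights["women's"] = 0.0
# ...leaves bias in correlated proxy terms, which still drag the score down.
assert weights["soccer"] < 0
```

This mirrors the article’s point: editing the tool to treat women-related terms as neutral could not guarantee fairness, because the training data itself encoded the bias through many other features.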

Source: https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G/

Writer: Jeffrey Dastin

Publication: Reuters
