Researchers scraped videos of transgender vloggers from YouTube without their knowledge to train facial recognition software


In August 2017, it was reported that a researcher had scraped videos of transgender YouTubers documenting their transition process, without informing them or asking their permission, as part of an attempt to train artificial intelligence facial recognition software to identify transgender people after they have transitioned.

These videos were primarily of transgender people sharing the progress and results of hormone replacement therapy, including video diaries and time-lapse videos. The researcher extracted photos from the videos to build a training dataset for facial recognition software intended to match someone mid- or post-transition to photos or videos taken before they began hormone replacement therapy. Some people included in this dataset even had their transition photos published in subsequent scientific research papers.

The researcher justified the research and the development of this technology by citing the unlikely scenario of a terrorist undergoing hormone replacement therapy to evade facial recognition when crossing a border.

This incident is an example of the hasty approach too many artificial intelligence researchers take: using people's data without their permission first and asking questions later. This approach is particularly harmful to marginalized communities, such as transgender people, who already face heightened harassment, discrimination, and violence that could be intensified by violations of their privacy and by facial recognition technology.

Writer: James Vincent

Publication: The Verge