Emotional recognition technology enters recruitment

Recruiters are beginning to incorporate emotion recognition technology into the processes they use to assess video-based job applications. Human, a London-based start-up, claims its algorithms can match the subliminal facial expressions of prospective candidates to personality traits, then scores the results against characteristics the recruiter specifies. HireVue, which sells its service to Unilever, uses the emotion database of Affectiva, an emotion recognition specialist that works in market research and advertising. The AI companies argue that the technology helps remove bias from recruiting. However, it is unclear what happens to the data these services collect, whether the data used to train emotion recognition algorithms counts as "personal data" under data protection laws, and whether employers will go on to use this technology to track employees' moods.

https://www.ft.com/content/e2e85644-05be-11e8-9650-9c0ad2d7c5b5
Writer: Patricia Nilsson
Publication: Financial Times

What is Privacy International calling for?

People must know

People must be able to know what data is being generated by their devices, by the networks and platforms they use, and by the infrastructure in which those devices are embedded. People should be able to know, and ultimately determine, the manner of processing.

Limit data analysis by design

As nearly every human interaction now generates some form of data, systems should be designed to limit the invasiveness of data analysis by all parties to a transaction or network interaction.

Control over intelligence

Individuals should have control over the data generated about their activities, conduct, devices, and interactions, and be able to determine who is gaining this intelligence and how it is to be used.

Identities under our control

Individuals must be able to selectively disclose their identity, generate new identities, pseudonyms, and/or remain anonymous. 

We should know all our data and profiles

Individuals need to have full insight into their profiles, including full access to derived, inferred, and predicted data about them.

We may challenge consequential decisions

Individuals should be able to know about, understand, question, and challenge consequential decisions that are made about them and their environment. This means that controllers, too, must have insight into and control over this processing.