Blank Space: Why Taylor Swift's use of facial recognition is a bad idea

Taylor Swift holding a phone (from the music video)

Taylor Swift may be tracking you, particularly if you were at her Rose Bowl show in May.

According to an article published by Vanity Fair, at Swift’s concert at the California stadium, fans were drawn to a kiosk where they could watch rehearsal clips. At the same time – and without their knowledge – facial-recognition cameras were scanning them, and the scans were then reportedly sent to a “command post” in Nashville, where they were compared to photos of people known to be Swift’s stalkers.

While Swift is rightly concerned for her safety, the increasing use of facial recognition in public spaces, like concerts or protests, is deeply disturbing. Swift’s hit song “Blank Space” offers a good opportunity to explain why.

Nice to meet you, where you been?
I could show you incredible things

New technologies can represent great opportunities, but without strong safeguards they pose serious threats to the very people they are supposed to empower. Nowhere is this more evident than in the rapid and widespread deployment of biometric technology.

Biometric technology can authenticate an individual to a system or device through a “one-to-one” check, carried out with their legitimately obtained consent and with the option to opt out. Such a system compares a (hopefully decentralised) source of stored biometric data with a person’s physical features – unlocking our mobile devices, for example.

But the use of biometric data in a “one-to-many” identification system violates our right to privacy. Rather than confirming the identity of a known individual, such systems aim to match the biometric measurements of an unknown person against a mass biometric database – and they can be used to pick individuals out of a crowd through facial recognition.
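To make the distinction concrete, here is a minimal Python sketch of the two modes. Everything in it – the function names, the similarity threshold, and the idea of comparing fixed-length face-embedding vectors – is an illustrative assumption, not a description of any real vendor’s system.

```python
import numpy as np

# Illustrative similarity cut-off; real systems tune this value carefully.
THRESHOLD = 0.8

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray) -> bool:
    """One-to-one check: does this face match the single template the
    user knowingly enrolled, e.g. to unlock their own phone?"""
    return similarity(probe, enrolled) >= THRESHOLD

def identify(probe: np.ndarray, database: dict[str, np.ndarray]) -> list[str]:
    """One-to-many search: compare an unknown face against every template
    in a mass database and return all candidate identities."""
    return [name for name, template in database.items()
            if similarity(probe, template) >= THRESHOLD]
```

The privacy asymmetry is visible in the signatures alone: verify needs only the one template its user chose to enrol, while identify presupposes a database containing everyone it might ever want to name.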

Magic, madness, heaven, sin
Saw you there and I thought
Oh my God, look at that face
You look like my next mistake

Yes, Taylor, this could well be a mistake.

The varying accuracy and failure rates of such technologies can lead to misidentification, discrimination and many other harms. In criminal investigations, an individual could risk becoming a suspect based on wrongly identified biometric data. Misidentification could even hinder an asylum seeker’s fundamental right to seek asylum if they are wrongly identified as someone else who has already had their claim rejected.
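The scale of the problem is easy to underestimate. The sketch below is a rough, hypothetical calculation – the crowd size, false-match rate, and number of genuine targets are all assumed figures chosen for illustration, not measurements of any deployed system – showing how even a seemingly accurate system floods its operators with innocent “matches” at stadium scale.

```python
# All figures below are hypothetical, chosen only to illustrate
# the base-rate problem with one-to-many facial recognition.
crowd_size = 60_000        # assumed attendance at a stadium show
false_match_rate = 0.001   # assumed 0.1% chance an innocent face "matches"
genuine_targets = 5        # assumed number of real targets in the crowd

expected_false_matches = (crowd_size - genuine_targets) * false_match_rate
print(f"Innocent people flagged, on average: {expected_false_matches:.0f}")
# ~60 innocent fans flagged for a handful of genuine matches.
```

Under these assumptions, the overwhelming majority of alerts point at innocent people – every one of whom was scanned without consent.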

Numerous examples of bias in facial recognition systems have surfaced: systems that failed to recognise non-white faces, labelled non-white people as "gorillas", "animals", or "apes" (Google, Flickr), told Asian users their eyes were closed when taking photographs (Nikon), or tracked white faces but could not see black ones (HP).

In other words, your non-white fans are disproportionately likely to be affected by bias in facial recognition, the same technology you are using to protect yourself.

So it's gonna be forever
Or it's gonna go down in flames

Biometric data can identify a person for their entire lifetime. This makes the creation of a biometric database especially problematic, because its risks must be anticipated far into the future: a change in the political situation, a data breach, or new technology that allows biometrics to be used for more purposes, revealing more about individuals than is currently possible or foreseeable.

You can tell me when it's over
If the high was worth the pain

Whilst biometric technology is improving, it is not infallible: its conceptual weaknesses, vulnerability to fraud and misuse, margin of error, and suitability for an ever-wider range of purposes remain deeply contested.

The privacy risks associated with ID and biometric systems are numerous, ranging from identity theft and fraud to social sorting and persecution. When adopted in the absence of strong legal frameworks and strict safeguards, biometric technologies pose grave threats to privacy and personal security, as their application can be broadened to facilitate discrimination, profiling and mass surveillance. 

'Cause we're young and we're reckless
We'll take this way too far

Databases of facial images are proliferating. Few people realise how many databases may include images of their face. These may be owned by data brokers, social media companies such as Facebook and Snapchat, and governments. In Russia, the FindFace app – which matched photos against profiles on VKontakte, a local social network – allowed online vigilantes to uncover the social media profiles of female porn actors and harass them.

As of 2017, the FBI's latest system gave it the ability to scan the images of millions of ordinary Americans collected from mugshots and driver's licence databases in 18 states. Both the extent and the accuracy of the FBI's system were questioned by the House Committee on Oversight and Government Reform, which was told that roughly one in seven searches returned a list made up entirely of innocent candidates, even though the target of the search was actually in the database.

It'll leave you breathless
Or with a nasty scar

The use of biometric data does not guarantee the protection of one’s identity – rather the opposite. It raises additional concerns and can have irreversible consequences, because such data is absolutely unique to an individual. A stolen password can be reset; if one’s biometric data is stolen or misused, it might compromise their legal identity, yet they cannot be given a new one.

Got a long list of ex-lovers
They'll tell you I'm insane

The collection and (indefinite) storage of biometric data raises two distinct yet interlinked questions: first, why it needs to be stored at all, and second, who manages and owns the data and for what purpose it will be used.

Recognising the risks of mass data retention, opponents of biometric databases have argued that their creation is not necessary to achieve the intended purpose of identification. Furthermore, the retention of data in databases raises questions as to who can access this information, under what circumstances, and for what period of time.

But I've got a blank space baby
And I'll write your name

There is huge pressure to identify people in more and more ways, with more data gathered about them. This pressure comes from governments as they introduce giant identification schemes, often deploying biometrics. It also comes from private companies, which attempt to identify us with or without our knowledge in order to track our movements and profile our behaviour.

It comes from the international community as well, where ID schemes are considered a developmental goal in themselves, often meeting the needs of donors rather than people. At the heart of this lies the question: why do you need this proposed identity system, and why do you need it at a particular moment for a particular transaction?

'Cause darling I'm a nightmare dressed like a daydream

Your face is a defining feature of your identity. But it’s also just another data point waiting to be collected and processed. At a time when cameras are ubiquitous and individual data collection is baked into nearly every transaction a person can make, faces are increasingly up for grabs, and generally without our knowledge or consent.

Privacy is a fundamental human right. In today’s digital world, it is the cornerstone that safeguards who we are and reinforces our autonomy and self-determination in the face of increasing state power. By its very nature, biometric data is intrinsically linked to what constitutes us as ‘humans’: it brings together the various elements that make up our respective and unique identity (gender, size, skin colour, ethnic origin, etc.). It has been argued that the collection, analysis and storage of such innate and personal data is “de-humanising”, as it reduces the individual – the human being – to a number.

You are rightly concerned about your safety and you care about your privacy. Surely, you also care about the safety and privacy of your fans. Tracking their facial features without their knowledge or consent is not a good idea, and it will have severe and harmful consequences.

Don't say I didn't say I didn't warn ya