Discord pushes back age verification after security risks revealed

Discord’s halted rollout exposes the inherent privacy risks of biometric and ID‑based checks.

News & Analysis

Discord’s decision to pause its rollout of age verification underscores what privacy advocates have long been saying about the technology: it creates a honeypot of sensitive data, with no guarantee for users that the company they’re uploading their face to is safe.

Earlier this month, the messaging app said that users globally would have to verify their age with a face scan or by uploading a form of ID if they want to access adult content.

It was planning to use third-party vendor Persona - until security researchers discovered that Persona’s frontend code was exposed on the open internet.

With that code, attackers could work out how requests are structured, how data flows between services, and how Persona validates identities.

That information could serve as the basis for fake verification scripts, or for bypassing the safeguards entirely.

Discord has already put its users’ privacy at risk once before: official ID photos of around 70,000 people were potentially leaked after a third-party company - which Discord refused to name - was hacked, drawing sharp criticism from users.

Now, the company has had to push its plans back while it develops "more verification options" that would not require facial or ID scans, such as credit card verification.

The takeaway is simple: age‑verification schemes introduce structural risks that cannot be mitigated merely by assurances from vendors or platform operators.

Age checks that demand biometric scans, government ID uploads, or behavioural profiling expand the data surface available for exploitation.

Biometric data is especially valuable - unlike a password, your face and fingerprints cannot be changed. Once they are leaked, they are compromised for good.

Age verification also normalises digital identity checks across the internet, when people should be free to use the web as they please - without being forced to present their papers.

Discord’s shift towards credit‑card verification does not resolve these structural issues.

It still relies on third‑party processing, and still creates new hubs of sensitive data about users.

It's essential that tech companies protect their users' privacy even while complying with other legal requirements.

It should not be the default that a myriad of actors can get their hands on your personal information.

You should complain to any platform that requests this information from you, and voice your opposition before it is deployed, before it grows out of control, and before it’s too late.