Communities at risk: How security fails are endangering the LGBTIQ+ community

News & Analysis

Earlier this month, Brunei attracted international condemnation for a new law that will make gay sex punishable by death. While this is clearly abhorrent, Brunei is not the only country with explicit anti-gay laws.

Homosexuality is criminalised in over 70 countries around the world. And even in countries where gay sex is legal, such as the US, the LGBTIQ+ community still faces discriminatory surveillance and profiling by law enforcement agencies.

By using the internet and mobile apps, many people have found safe spaces online where they can freely express themselves. However, a lack of foresight and commitment to strong data protection standards by app developers has resulted in a series of security fails that have put the LGBTIQ+ community at serious risk.

The LGBTIQ+ community must now reckon with the serious consequences that their online lives can have for their offline lives, as their data is used against them to track their location and movements, and to gain extremely personal insights into their preferences, connections, and even their medical history.

This enables governments and companies to construct profiles of them, using these highly sensitive details to make inferences or predictions that may or may not be accurate. Increasingly, profiles are being used to make or inform consequential decisions, from credit scoring, to hiring, to policing.

My location is none of your business

In 2016, it was revealed that the location of users of gay dating apps such as Grindr and Hornet could be pinpointed even when they had turned on features intended to hide it. Beyond the technical implications, this security flaw put people's lives at risk – particularly those who had not come out publicly as LGBTIQ+ or who lived in a hostile location and could face persecution.
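The mechanism researchers demonstrated was trilateration: even with location sharing disabled, the apps still reported each user's *distance* to anyone nearby, so an attacker querying from three spoofed positions could intersect the three distance circles and recover the user's coordinates. A minimal sketch of that geometry (the function name and flat x/y coordinates are illustrative, not from any app's API):

```python
import math

def trilaterate(p1, d1, p2, d2, p3, d3):
    """Recover an unknown position from three known query points
    (p1, p2, p3) and the distances (d1, d2, d3) the app reports
    from each. Subtracting the circle equations pairwise yields
    two linear equations in (x, y), solved here by Cramer's rule."""
    x1, y1 = p1
    x2, y2 = p2
    x3, y3 = p3
    # Circle i: (x - xi)^2 + (y - yi)^2 = di^2. Subtracting circle 2
    # from circle 1 (and 3 from 2) cancels the quadratic terms:
    A = 2 * (x2 - x1)
    B = 2 * (y2 - y1)
    C = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    D = 2 * (x3 - x2)
    E = 2 * (y3 - y2)
    F = d2**2 - d3**2 - x2**2 + x3**2 - y2**2 + y3**2
    # Solve A*x + B*y = C and D*x + E*y = F.
    x = (C * E - F * B) / (E * A - B * D)
    y = (C * D - A * F) / (B * D - A * E)
    return x, y

# A user at (3, 4) is located from three spoofed query positions
# using only the distances the app would report from each.
target = (3.0, 4.0)
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [math.dist(a, target) for a in anchors]
x, y = trilaterate(anchors[0], dists[0],
                   anchors[1], dists[1],
                   anchors[2], dists[2])
```

In practice, apps that rounded or fuzzed the reported distance were still vulnerable: repeated queries from many positions let an attacker average out the noise.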

The issue regained prominence in 2018, when new research highlighted a vulnerability that exposed the information of all three million Grindr users, including the location data of people who had opted out of sharing such information. The sensitivity of this data for the LGBTIQ+ community cannot be overstated – Grindr has users in 234 countries and territories, and homosexuality is illegal in approximately a third of these places.

Repressive regimes benefit from the leaking or indiscriminate sharing of this data. For instance, police in Egypt used gay dating apps such as Grindr and Hornet to find targets for arrest and imprisonment. Between October 2013 and March 2017, the Egyptian Initiative for Personal Rights documented more than 230 LGBTIQ+-related arrests. 

But companies also benefit from this data, in ways that we have only scratched the surface of. The woes of Grindr users were compounded when it came to light that the app was sending their HIV status and last tested date along with their GPS data, phone ID, and email to two app-optimising companies, Apptimize and Localytics. This data-sharing practice raises issues of informed consent and shows a lack of openness in how exactly user data is being processed.

Reneging on responsibility  

Companies and mobile app developers are building systems that accumulate vast amounts of our data without proper regard to risk or security. They have a responsibility to protect the privacy and data of their users, especially the most vulnerable among us. Instead, we have seen a series of security failures that expose the LGBTIQ+ community to increased persecution and the potential for further discrimination.

We live and share a huge amount of our lives online. This dependence means there is much at stake and much to protect. It is time that companies take security seriously and ensure that their services do not place users at further risk of harm.
