Cloud Storage App ‘Ever’ Has Been Weaponising People’s Photos


Photo by Mike MacKenzie (via www.vpnsrus.com)

Ever, a cloud storage app, is an example of how facial recognition technology can be developed in ways people do not expect and that risk amplifying discrimination.

Ever is a cloud storage app that brands itself as “helping you capture and rediscover your life’s memories” by uploading and storing personal photos. What Ever does not advertise is that it uses the millions of photos people upload to train its facial recognition software, which it then aims to sell to law enforcement agencies, private companies, and the military.

It is of serious concern that Ever is using its platform, in a way that violates people’s right to privacy, to create an inherently discriminatory, error-prone, and forensically unreliable technology. Furthermore, Ever’s process of creating that technology is opaque and unaccountable.

While many companies such as Amazon and Microsoft train facial recognition technology on publicly available datasets (which may nonetheless include photos obtained without people’s knowledge or permission), Ever uses photos of its own customers and their friends and family, a practice it only alluded to in its privacy policy after a query from NBC News. Ever’s updated policy provides that it “uses facial recognition technologies” and that people’s files “may be used to help improve and train our products and these technologies.”

Despite the privacy policy change, Ever uses people’s photos in ways they likely do not anticipate, training software that can ultimately be used for surveillance or to produce discriminatory outcomes that people would not necessarily condone. Furthermore, friends and family members who have not signed up to Ever, but who appear in photos uploaded by others, do not know that their images are being used in this way. Using people’s photos to build software for sale to outside parties is not primarily about protecting customers’ interests or improving their cloud-storage experience. Ever’s practice violates the right to privacy of people who use its services and of people whose images appear in photos uploaded to Ever.

There is a lack of transparency and accountability in the technology Ever is developing. The dataset of photos that Ever uses could be unrepresentative of the broader population. As a result, the technology could have higher error rates for groups whose photos are absent from, or underrepresented in, the photos Ever uses to train its software. Because Ever’s dataset is derived from the photos people store, it cannot be examined by independent outside groups to uncover such problems and demand change. By contrast, researchers have been able to expose errors and biases in Amazon’s Rekognition software, spurring demands that Amazon stop selling it to law enforcement.
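
To illustrate the statistical point in general terms, the following is a simplified Python sketch, not Ever’s actual system: it uses synthetic, made-up data and a basic classifier to show how a model trained on data dominated by one group can end up with much higher error rates for an underrepresented group whose data looks different.

import numpy as np

rng = np.random.default_rng(0)

def make_group(n, mean0, mean1):
    # Synthetic 2-D stand-ins for face features of one demographic group,
    # split into two classes (e.g. "match" vs "non-match").
    x0 = rng.normal(mean0, 1.0, size=(n, 2))
    x1 = rng.normal(mean1, 1.0, size=(n, 2))
    return np.vstack([x0, x1]), np.concatenate([np.zeros(n), np.ones(n)])

# Group A dominates the training data; group B is heavily underrepresented,
# and its two classes are separated along a different direction.
XA, yA = make_group(1000, (-2, 0), (2, 0))
XB, yB = make_group(50, (0, -2), (0, 2))
X = np.vstack([XA, XB])
y = np.concatenate([yA, yB])

# Plain logistic regression trained by gradient descent on the pooled data.
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

def error_rate(Xg, yg):
    pred = (Xg @ w + b > 0).astype(float)
    return np.mean(pred != yg)

XA_test, yA_test = make_group(1000, (-2, 0), (2, 0))
XB_test, yB_test = make_group(1000, (0, -2), (0, 2))
print("error on well-represented group A:", error_rate(XA_test, yA_test))
print("error on underrepresented group B:", error_rate(XB_test, yB_test))
# Typical output: a few percent error for group A, close to chance for group B,
# because the learned decision boundary reflects the group that dominates the data.

Real face recognition models are far more complex than this toy, but the underlying point is the same: whoever is missing from the training photos tends to bear the errors, and with Ever’s closed dataset no outside researcher can measure how large those errors are.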

Furthermore, it is unclear how Ever is labelling the shape and composition of people’s facial features in the photos it feeds to its facial recognition technology; the technology depends on those labels to learn to create detailed biometric maps of people’s faces and to match photos. Mislabelling features, employing gender binaries, and failing to recognise physical differences such as disability or injury could lead to discriminatory outcomes and amplify discrimination.
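
Again purely as a hypothetical illustration (Ever has not published its labelling scheme), a short Python sketch shows how a rigid annotation schema, for example one that only admits binary gender labels, ensures that people the schema cannot describe are either mislabelled or excluded before training even begins:

from dataclasses import dataclass
from enum import Enum
from typing import Optional

# Assumed, simplified annotation schema; Ever's actual labelling pipeline is not public.
class Gender(Enum):
    MALE = "male"
    FEMALE = "female"   # the schema has no way to represent anyone outside the binary

@dataclass
class FaceAnnotation:
    image_id: str
    gender: Gender      # a forced binary label
    # no fields at all for disability, injury, or other physical differences

def load_annotation(record: dict) -> Optional[FaceAnnotation]:
    # Anything the schema cannot express is silently dropped,
    # so the trained model never learns from it.
    try:
        return FaceAnnotation(record["image_id"], Gender(record["gender"]))
    except ValueError:
        return None     # e.g. "non-binary" simply disappears from the training set

records = [
    {"image_id": "img_001", "gender": "female"},
    {"image_id": "img_002", "gender": "non-binary"},
]
kept = [a for r in records if (a := load_annotation(r)) is not None]
print(kept)   # only img_001 survives; img_002 is invisible to the trained model

The point of the sketch is that discriminatory assumptions can be baked in at the labelling stage, before any model is trained, and from the outside there is no way to audit whether Ever’s labels have this problem.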

Ever serves as an example of how companies are able to exploit people’s data for their own purposes, in part due to the absence of laws restricting the development and use of facial recognition technology. Around the world, such technology is being rolled out and used for a wide range of purposes, often in secret and with highly discriminatory effects.

San Francisco is taking a step in the right direction and serves as a blueprint for how jurisdictions could respond to the type of problem Ever embodies: on May 14, 2019, legislators voted in favour of the “Stop Secret Surveillance Ordinance” to prevent local law enforcement from using facial recognition, making San Francisco the first city in the United States to do so. However, the ordinance applies only to public entities, not to private companies.

San Francisco is one of a number of jurisdictions that have recognised the potentially dangerous applications of such technology and sought to provide safeguards. Privacy International calls on others to do the same, and is actively campaigning for increased transparency and accountability over the use of such technology as part of our Neighbourhood Watched campaign.

Further reading:

How Privacy International is working to ensure new technologies, including facial recognition, are governed and used in ways that protect our privacy, preserve our civic spaces, and support democracy: https://privacyinternational.org/feature/2852/protecting-civic-spaces

Privacy International’s Response to the Westminster Hall Debate on Facial Recognition: https://privacyinternational.org/advocacy-briefing/2835/our-response-westminster-hall-debate-facial-recognition

Privacy International’s Neighbourhood Watched Campaign: https://privacyinternational.org/campaigns/neighbourhood-watched