Privacy International http://privacyinternational.org/rss.xml en Open letter to EU Member States: Deliver ePrivacy now! http://privacyinternational.org/news-analysis/3255/open-letter-eu-member-states-deliver-eprivacy-now <div class="layout layout--onecol"> <div class="layout__region layout__region--content"> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>On 11 October 2019, Privacy International, together with EDRi, BEUC, AccessNow and the Open Society European Policy Institute, <a href="https://privacyinternational.org/sites/default/files/2019-10/ePrivacy_NGO_letter_20191011.pdf">sent an open letter</a> to EU Member States urging them to conclude the negotiations on the ePrivacy Regulation.</p> <p>The letter highlights the urgent need for a strong ePrivacy Regulation to tackle the problems created by commercial surveillance business models, and expresses deep concern that the Member States, represented in the Council of the European Union, have still not made decisive progress more than two and a half years after the Commission presented the proposal.</p></div> <div class="field field--name-field-large-image field--type-image field--label-above"> <div class="field__label">Large Image</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-10/Screenshot%202019-10-11%20at%2010.04.33.png" width="1459" height="678" alt="civil society" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-list-image field--type-image field--label-above"> <div class="field__label">List Image</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-10/Screenshot%202019-10-11%20at%2010.04.33_0.png" width="1459" height="678" alt="civil society" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-list-icon field--type-image field--label-above"> <div class="field__label">List Icon</div> <div 
class="field__item"> <img src="/sites/default/files/flysystem/2019-10/Screenshot%202019-10-11%20at%2010.04.33_1.png" width="1459" height="678" alt="civil society" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Learn more</div> <div class="field__items"> <div class="field__item"><a href="/topics/adtech" hreflang="en">AdTech</a></div> <div class="field__item"><a href="/topics/e-privacy" hreflang="en">e-privacy</a></div> </div> </div> <div class="field field--name-field-programme field--type-entity-reference field--label-above"> <div class="field__label">Strategic Area</div> <div class="field__items"> <div class="field__item"><a href="/strategic-areas/challenging-corporate-data-exploitation" hreflang="en">Challenging Corporate Data Exploitation</a></div> </div> </div> <div class="field field--name-field-attachments field--type-file field--label-hidden field__items"> <div class="field__item"><table data-striping="1"> <thead> <tr> <th>Attachment</th> <th>Size</th> </tr> </thead> <tbody> <tr class="odd"> <td> <span class="file file--mime-application-pdf file--application-pdf"> <a href="http://privacyinternational.org/sites/default/files/2019-10/ePrivacy_NGO_letter_20191011.pdf" type="application/pdf; length=138774">ePrivacy_NGO_letter_20191011.pdf</a></span> </td> <td>135.52 KB</td> </tr> </tbody> </table> </div> </div> </div> </div> Fri, 11 Oct 2019 09:04:46 +0000 staff 3255 at http://privacyinternational.org The Identity Gatekeepers and the Future of Digital Identity http://privacyinternational.org/long-read/3254/identity-gatekeepers-and-future-digital-identity <span class="field field--name-title field--type-string field--label-hidden">The Identity Gatekeepers and the Future of Digital Identity</span> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span lang="" about="/user/43" typeof="schema:Person" property="schema:name" 
datatype="">staff</span></span> <span class="field field--name-created field--type-created field--label-hidden">Thursday, October 10, 2019</span> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>Photo by Nadine Shaabana on Unsplash</p> <h2><strong>Digital identity providers</strong></h2> <p>Around the world, we are seeing the growth of digital IDs, and companies looking to offer ways for people to prove their identity online and off. The UK is no exception; indeed, the trade body for the UK tech industry is calling for the development of a <a href="https://www.techuk.org/images/documents/digital_id_FINAL_WEBSITE.pdf">“digital identity ecosystem”</a>, with private companies playing a key role. Having a role for private companies in this sector is not necessarily a problem: after all, <a href="https://privacyinternational.org/topics/identity">government control over ID is open to abuse and exploitation</a>. But we need to ask what we want this industry to look like, and how we can build one without exploitation. These companies are the new digital gatekeepers to our lives.</p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>In our <a href="https://privacyinternational.org/advocacy/3215/response-uks-call-evidence-digital-identity">response to a recent UK government consultation on digital identity</a>, we highlighted the imperative of avoiding a future for digital identity that exploits the data of individuals. 
As we wrote in the response, </span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <blockquote> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>“It would be positive for both the UK, and the development of identity systems around the globe, if the UK builds a digital identity ecosystem that becomes a world-leader in respecting the rights of individuals and communities. Yet the risks of digital identity are large, from dangers surrounding the curtailing of people’s rights and state surveillance, to the exploitation of their data by private companies. As a result, the highest standards must be in place to meet the promise of a world-leading system.”</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> </blockquote> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>This is an imperative, given how these digital identity companies are becoming gatekeepers to access key services, both online and off. People increasingly either have to use their services to go about their lives, or life becomes difficult without them. Thus, they are in a powerful position. At the same time, proving your identity is something that most people don’t really want to spend a lot of time thinking about. This is a powerful combination that leaves opportunities for abuse.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>As we imagine what the future will look like for digital ID, we don't want to see one in which companies in the digital ID industry are able to exploit our trust and take advantage of their position in the market. 
The burgeoning digital ID industry deserves our attention, and potential abuses must be brought to light.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p>It is essential that we question how this industry <em>should </em>behave. One of the behaviours to query, as we've seen in other sectors, is businesses using your data for other purposes.</p> <h2><strong>Using your data for other things: the example of Yoti</strong></h2> <p>An example of this is the UK-based digital identity provider Yoti. Since the introduction of the Yoti app in 2017, it had been <a href="https://www.biometricupdate.com/201903/yoti-partners-with-facetec-for-biometric-authentication-with-liveness-detection">downloaded 3.7 million times</a> by May 2019. According to Yoti, this figure is now over 5 million.</p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>Within its app, users can choose to add an ID document. Yoti specify these as <a href="https://www.yoti.com/wp-content/uploads/2019/08/Yoti-privacy-information-Yoti-app-19-August-2019.pdf">“Government-issued or other official identity documents (for example, passport, driving licence)”</a>. 
Yoti makes use of the government-issued ID document to provide <a href="https://www.yoti.com/wp-content/uploads/2019/08/Yoti-privacy-information-Yoti-app-19-August-2019.pdf">verified identity for its users</a>.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p>Users can also take a ‘selfie’ for the purpose of having a <a href="https://www.yoti.com/wp-content/uploads/2019/08/Yoti-privacy-information-Yoti-app-19-August-2019.pdf">photo on the account</a>. This photo can be <a href="https://www.yoti.com/wp-content/uploads/2019/08/Yoti-privacy-information-Yoti-app-19-August-2019.pdf">“shared as part of proving your identity”</a>.</p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>Yoti <a href="https://www.yoti.com/wp-content/uploads/2019/08/Yoti-privacy-information-Yoti-app-19-August-2019.pdf">states</a> the purpose of their biometric identity app is to “provide you with a quick, easy, secure and privacy-friendly way to prove your age and / or identity, online and in person”. If an organisation accepts Yoti, you can share details that have been verified by Yoti and taken from the ID documents you have uploaded to the Yoti app. The app includes welcome features, such as the ability to share only particular attributes – for example, the fact that the user is ‘over 18’ – rather than all the information on their ID. 
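The selective-disclosure idea described above (releasing a derived ‘over 18’ claim instead of the whole document) can be sketched in a few lines of Python. This is an illustrative sketch of the concept only, not Yoti’s actual implementation, and every name in it is hypothetical:

```python
from datetime import date

def derive_over_18(date_of_birth: date, today: date) -> bool:
    """Derive a single boolean claim from a verified date of birth."""
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day))
    return age >= 18

def share_attributes(verified_document: dict, requested: list, today: date) -> dict:
    """Release only the claims a verifier asked for, never the raw document."""
    derivable = {
        "over_18": lambda doc: derive_over_18(doc["date_of_birth"], today),
        "nationality": lambda doc: doc["nationality"],
    }
    return {name: rule(verified_document)
            for name, rule in derivable.items() if name in requested}

# A retailer checking age learns a single bit, not the full ID document.
document = {"name": "A. Person", "date_of_birth": date(1990, 5, 1),
            "nationality": "GB", "document_number": "123456789"}
print(share_attributes(document, ["over_18"], today=date(2019, 10, 10)))
# {'over_18': True}
```

The point of the design is that the verifier’s view contains no date of birth at all: only the derived claim leaves the user’s wallet.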
</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p>Yoti is used not only by private businesses, but also by the <a href="https://www.yoti.com/blog/yoti-chosen-as-the-official-identity-provider-for-the-government-of-jersey/">States of Jersey</a> (Government of Jersey) and by the Improvement Service (for local government) <a href="https://www.yoti.com/blog/yoti-partners-with-the-improvement-service-to-help-deliver-digital-services-to-scottish-citizens/">in Scotland</a>; and Yoti is a pre-ticked option when applying for <a href="https://www.citizencard.com/">CitizenCard</a> proof of age cards. Yoti also works globally, with an <a href="https://www.businesstoday.in/technology/news/digital-identity-app-yoti-enters-india-to-profile-users-of-truly-madly/story/274459.html">office in India</a> and user experience research <a href="https://www.yoti.com/blog/digital-identity-in-the-last-mile-lessons-from-africa/">in Africa</a>.</p> <p><img alt="Pre-ticked box" data-entity-type="file" data-entity-uuid="4ed38372-5b66-4a83-b8c9-cbeb4988020d" src="/sites/default/files/inline-images/Wire%202019-10-10%20at%205_13%20PM.png" /></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>The concept behind the core Yoti offering is not unproblematic. Big Brother Watch has criticised Yoti’s part in a growing “<a href="https://www.theregister.co.uk/2018/02/02/facial_recognition_age_verification/">casual use of biometric checks</a>”. 
Privacy International has critiqued the extent to which identification systems can truly capture a complex, changing, essential thing like <a href="https://privacyinternational.org/feature/2274/identity-discrimination-and-challenge-id">‘identity’</a>, resulting in discrimination; concerns echoed by researcher <a href="https://deepdives.in/can-data-ever-know-who-we-really-are-a0dbfb5a87a0">Zara Rahman</a>. While Yoti is keen to emphasise the work they are doing on issues surrounding inclusion, and are aware of some of the complexities, the fact remains that the credentials that are valid in the current Yoti app are based largely on state ID documents. Privacy International’s research has shown that identification requirements are a major source of exclusion for those who <a href="https://privacyinternational.org/feature/2544/exclusion-and-identity-life-without-id">lack access to identification</a>.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>But we must also ask what Yoti is doing with the data they obtain from users of their app. This includes data from government issued identity documents, like passports: the ‘gold-standard’ for an identity credential. </span></span></span></span></span></span></span></span></span></span></span></span></span></span></span><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>Similarly, they have data including the image of a person’s face (‘the selfie’), verified to be theirs against that same document. </span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>That they get these sources of data from users places them in a privileged position compared to many other apps. 
</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>What they do with this, besides their core offering of identity and attribute verification, needs a closer look.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <h3><em><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span><strong>Yoti Age Scan</strong></span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></em></h3> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>In April 2019, Yoti launched a new initiative, and potential income stream for the company: <a href="//www.yoti.com/yoti-age-scan-whitepaper-deep-dives-into-yotis-revolutionary-age-estimation-technology/">Yoti Age Scan technology</a>. This product - described by Yoti as <a href="https://www.yoti.com/yoti-age-scan-whitepaper/">“using Artificial Intelligence (AI) for good</a>” - <a href="https://s3-eu-west-1.amazonaws.com/prod.marketing.asset.imgs/yoti-website/Yoti-Age-Scan_Digital.pdf">estimates an individual’s age based on their image</a>. This is used, for example, within the Yoti app for those who have not uploaded a verified ID document that contains their age; at self-service checkouts to see if an individual is <a href="https://www.yoti.com/blog/streamlining-the-self-checkout-experience-with-ncr/">old enough to buy alcohol</a>; to <a href="https://www.yoti.com/blog/using-ai-for-good-with-yubo/">access social media services</a> aimed at teenagers; or to access <a href="https://www.yoti.com/age-verification-on-adult-websites/">adult content online</a>. 
Yoti charge businesses to estimate the age of a face.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p>When Yoti is used outside of the app, a photo of the individual is analysed by Yoti with no other identifying information, and the algorithm decides whether the person is over a certain age threshold. The photo of the individual is then deleted and not stored further.</p> <p>But how did Yoti train its algorithm? As outlined in <a href="https://s3-eu-west-1.amazonaws.com/prod.marketing.asset.imgs/yoti-website/Yoti-Age-Scan_Digital.pdf">Yoti’s white paper on Age Scan</a>, the data used to train their <a href="https://s3-eu-west-1.amazonaws.com/prod.marketing.asset.imgs/yoti-website/Yoti-Age-Scan_Digital.pdf">algorithm comes from three sources</a>: data from Yoti users; the APPA-Real database, which Yoti state is a “public domain source”; and volunteer data collection activities in Kenya and, according to Yoti, also in the UK.</p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>In relation to data from Yoti users, if you have downloaded the Yoti app, at the point you add your verified identity document and it is accepted, your data becomes part of the training dataset. 
Specifically, this includes the <a href="https://s3-eu-west-1.amazonaws.com/prod.marketing.asset.imgs/yoti-website/Yoti-Age-Scan_Digital.pdf">year of birth</a> and <a href="https://s3-eu-west-1.amazonaws.com/prod.marketing.asset.imgs/yoti-website/Yoti-Age-Scan_Digital.pdf">gender</a> taken from your verified identity document; your ‘selfie’ photo taken <a href="https://s3-eu-west-1.amazonaws.com/prod.marketing.asset.imgs/yoti-website/Yoti-Age-Scan_Digital.pdf">when you set up the account</a>; and other tags/attributes added by Yoti researchers, for example by <a href="https://s3-eu-west-1.amazonaws.com/prod.marketing.asset.imgs/yoti-website/Yoti-Age-Scan_Digital.pdf">tagging skin colour</a>.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p>As of July 2019, Yoti had data from <a href="https://s3-eu-west-1.amazonaws.com/prod.marketing.asset.imgs/yoti-website/Yoti-Age-Scan_Digital.pdf">over 72,000 users</a> that they were using to build and test their model. Yoti have told us that this data is held on a separate R&amp;D server, where it is not stored alongside data like the name of the user.</p> <p>Privacy International have engaged with Yoti and raised concerns about Yoti’s actions when we met in person. These include:</p> <ul><li><span><span>At the point an individual has a verified ID document on their Yoti account, they are added to the training dataset. 
Yet once you have a verified ID document linked to the Yoti App, not only would you have no need to use Age Scan within the App, but there would also be vanishingly few scenarios in which you would need to use Age Scan to prove your age when buying age-restricted goods. This is because you can simply show the retailer your <a href="https://agecheck.yoti.com/">verified age in the App</a>.</span></span><br /><br /><span>Yoti dispute that there are only vanishingly few scenarios in which you would need to use Age Scan if you have the App. For instance, an individual might wish to buy age-restricted goods at a self-service checkout without taking out their phone, or may not have their phone on them.</span><br />  </li> <li><span><span>At the time Privacy International spoke to Yoti, in Privacy International’s view their <a href="https://www.yoti.com/wp-content/uploads/2019/07/Yoti-privacy-information-Yoti-app-22-July-2019.pdf">July 2019 Privacy Policy</a> lacked transparency as to the use of Yoti users’ data for the purposes of age verification, and the quality of information provided was poor. There was little clarity as to how users’ data was used as part of the Age Scan dataset.</span></span><br /><br /><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>Privacy International’s concerns were based on a detailed review of Yoti’s Privacy Policy, as well as our experience of signing up to the app and going through the ‘on-boarding’ process. Yoti have made improvements to their Privacy Policy following our conversation. 
</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span><br />  </li> <li><span><span>In Privacy International’s opinion, in adding Yoti users’ data to a training dataset used to train age verification technology, Yoti were using their customers’ data in a way they would not reasonably expect and for a purpose other than that for which they provided it, raising questions as to the fairness of this use and the principle of purpose limitation, both of which are core tenets of data protection.</span></span><br />  </li> <li><span><span>When we first approached Yoti, there was no accessible way for Yoti users to opt out of the use of their data in the training dataset, and no accessible way for Yoti App users to request that their data be deleted from the training set without losing the ability to use the app altogether. Yoti have subsequently introduced an opt-out for “Our internal research” of biometric data. As they state on the options screen, “We need to use data to test our security checks and technology. Don’t worry, our research team has no other data to identify you personally. Once you withdraw your consent, this can’t be undone.” Of course, apart from your photograph, Yoti do have access to ‘other data’ that they use for Yoti Age Scan: the date of birth from your passport.</span></span><br />  </li> <li><span><span>This opt-out is buried deep in Yoti’s settings menu, and is far from obvious. As a result, while it is technically true that a user could opt out prior to uploading their documents and adding a profile picture, it seems unlikely that many would stumble across that option. 
Rather, an opt-in would be a better solution.</span></span></li> </ul><p><img alt="biometrics settings" data-entity-type="file" data-entity-uuid="a256577c-c89a-42ce-a8e8-0867e5c82e9d" src="/sites/default/files/inline-images/Wire%202019-10-10%20at%205_21%20PM.png" /></p> <p>Since talking to Privacy International, Yoti have made welcome <a href="https://www.yoti.com/wp-content/uploads/2019/08/Yoti-privacy-information-Yoti-app-19-August-2019.pdf">improvements to their privacy policy</a>, which go some way towards clarifying how they use a user’s photo and passport data.</p> <p>We would encourage Yoti to actively communicate with customers whose data formed part of the training dataset, to ensure they are aware of how their data has been used. We do not believe that the more than 70,000 users whose data has been used to train the algorithm were adequately informed about the use of their photographs and the data from their passport or other identity document. Yoti have informed us they have no way to contact their users to do this. Even so, Yoti could take steps to reach out to their users: through a notice in the app, public communication, and notices at the places where people use Yoti.</p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>Yoti have replied at length to this piece <a href="/sites/default/files/2019-10/ReplytoPI091019.pdf">[see below]</a>. In addition to feedback on this piece, Privacy International sought further clarification on a number of points, to which Yoti also replied. 
One such point was Yoti’s reliance on the ‘legitimate interests’ justification for the use of data in the dataset. Yoti were not forthcoming with a copy of their legitimate interests assessment, on the basis that this was an internal, company-confidential document. Whilst we acknowledge that there may be a justification for not publishing such assessments in full, we would encourage companies, including Yoti, to lead the way in publishing legitimate interests assessments, or at least a summary.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p>This approach mirrors the Article 29 Working Party guidance on data protection impact assessments, endorsed by the EDPB, which states:</p> <blockquote> <p>“...controllers should consider publishing at least parts, such as a summary or conclusion of their DPIA. The purpose of such a process would be to foster trust in the controller’s processing operations and demonstrate accountability and transparency. It is particularly good practice to publish a DPIA where members of the public are affected by the processing operation.”</p> </blockquote> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>Yoti Age Scan is just one example of digital identity. The issues outlined above can be used to reflect on wider issues relating to the use of data gathered in the course of identity services: how do we want the identity industry to treat our data? 
What is the future for this market, and how do we limit what these companies are doing with the data they gather in the course of their operations?</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <h2><strong>The future of digital identity</strong></h2> <p>As we look towards the future of a digital identity market, we cannot allow it to develop into one where players profit from the exploitation of the data of their users. If digital identity companies come to earn their keep from uses of data outside the core provision of identity, then we risk both losing public trust and distorting the market. These distortions will lead not to activities that benefit the user, but to a market in which the user is a mere product.</p> <p> </p></div> <div class="field field--name-field-topic field--type-entity-reference field--label-inline"> <div class="field__label">Learn more</div> <div class="field__items"> <div class="field__item"><a href="/topics/new-forms-identity" hreflang="en">New forms of identity</a></div> <div class="field__item"><a href="/topics/identity" hreflang="en">Identity</a></div> </div> </div> <div class="field field--name-field-issue field--type-entity-reference field--label-inline"> <div class="field__label">What PI is fighting for</div> <div class="field__items"> <div class="field__item"><a href="/what-we-do/id-identity-and-identification" hreflang="en">ID, Identity and Identification</a></div> </div> </div> <div class="field field--name-field-attachments field--type-file 
field--label-inline"> <div class="field__label">Attachments</div> <div class="field__items"> <div class="field__item"> <span class="file file--mime-application-pdf file--application-pdf"> <a href="http://privacyinternational.org/sites/default/files/2019-10/ReplytoPI091019.pdf" type="application/pdf; length=1745327" title="ReplytoPI091019.pdf">Yoti's Right of Reply Response to Privacy International</a></span> </div> </div> </div> <div class="field field--name-field-campaign-name field--type-entity-reference field--label-above"> <div class="field__label">What PI is Campaigning on</div> <div class="field__items"> <div class="field__item"><a href="/campaigns/demanding-identity-systems-our-terms" hreflang="en">Demanding identity systems on our terms</a></div> <div class="field__item"><a href="/campaigns/exposing-new-frontiers-identity" hreflang="en">Exposing new frontiers of identity</a></div> </div> </div> <div class="field field--name-field-principle-or-recommendatio field--type-entity-reference field--label-above"> <div class="field__label">What is PI calling for</div> <div class="field__items"> <div class="field__item"><a href="/recommendation-principle-or-safeguard/people-must-know" hreflang="en">People must know</a></div> <div class="field__item"><a href="/recommendation-principle-or-safeguard/limit-data-analysis-design" hreflang="en">Limit data analysis by design</a></div> <div class="field__item"><a href="/recommendation-principle-or-safeguard/identities-under-our-control" hreflang="en">Identities under our control</a></div> </div> </div> Thu, 10 Oct 2019 14:33:55 +0000 staff 3254 at http://privacyinternational.org Use of 2FA information for commercial purposes is unacceptable http://privacyinternational.org/news-analysis/3251/use-2fa-information-commercial-purposes-unacceptable <div class="layout layout--onecol"> <div class="layout__region layout__region--content"> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary 
field--label-hidden field__item"><p><span>The latest news of Twitter “inadvertently” <a href="https://help.twitter.com/en/information-and-ads">sharing email addresses or phone numbers</a> provided for safety or security purposes (for example, two-factor authentication) for advertising purposes is extremely concerning for several reasons.</span></p> <p><span>First of all, it is not the first time Twitter has used people's data in ways they wouldn't expect or that ignore their choices: in August, <a href="https://privacyinternational.org/news-analysis/3111/twitter-may-have-used-your-personal-data-ads-without-your-permission-time-fix">the company disclosed</a> that it may have shared data on users with advertising partners even if they had opted out from personalised ads, and that it may have shown people advertising based on inferences made about the devices they use, without their permission.</span></p> <p><span>In May 2019, <a href="https://help.twitter.com/en/location-data-collection">Twitter also disclosed</a> a bug that resulted in an account’s location data being shared with a Twitter ad partner in certain circumstances.</span></p> <p><span>As we wrote at the time, Twitter's latest disclosures show how urgently the industry needs to change, but in the meantime there is more that Twitter could do right now. 
We believe that social media platforms like Twitter need to do much more to increase transparency around how ads are targeted at users, something <a href="https://privacyinternational.org/strategic-areas/challenging-corporate-data-exploitation">we have been campaigning on for a long time</a>.</span><span> Our <a href="https://privacyinternational.org/long-read/3244/social-media-companies-have-failed-provide-adequate-advertising-transparency-users">recent analysis</a> shows just how far Twitter (as well as Google and Facebook) has to go when it comes to providing users with ads transparency, as well as the shocking disparity in the application of policies globally.</span></p> <p><span>When it comes to targeted ads, at a minimum, people should be able to understand how their data is being used and why they are seeing a particular ad, and have meaningful choices that are respected.</span></p> <p> </p> <p><span><strong>Undermining trust in 2FA</strong></span></p> <p><span>Second, and very importantly: practices such as those in the recent Twitter disclosure undermine people’s trust in two-factor authentication (2FA), a critical security feature, and make them less secure in the long term.</span></p> <p><span>This is concerning for everyone, and particularly worrying for activists, dissidents and communities at risk all over the world who have a clear need to protect their security and who use social networks to communicate and organise.</span></p> <p><span>And Twitter is far from being the only one: for instance, Facebook has a history of blurring the lines between contact information provided for security and contact information provided for other purposes. 
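As an aside on why 2FA need not involve a phone number at all: app-based authenticators implement the time-based one-time password (TOTP) scheme of RFC 6238, deriving short-lived codes from a shared secret and the current time. A minimal illustrative sketch in Python (not any platform's actual implementation):

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, at=None, digits: int = 6, step: int = 30) -> str:
    """Time-based one-time password (RFC 6238, HMAC-SHA1 variant)."""
    counter = int(time.time() if at is None else at) // step
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: ASCII secret "12345678901234567890",
# time 59 seconds -> "94287082"
print(totp(b"12345678901234567890", at=59, digits=8))
```

No contact detail ever changes hands here: the verifier only stores the shared secret, which is one reason security researchers recommend app-based codes over SMS-delivered ones.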
Earlier this year, it emerged that Facebook was making mobile phone numbers that users believed they had provided for the express purpose of 2FA both searchable and, by default, a target for advertising.</span></p> <p><span>One of the myriad ways Facebook displays targeted adverts to users is through so-called "Custom Audiences". These "custom audiences" are lists of contact details, including phone numbers and email addresses, uploaded by advertisers. Facebook then matches this "custom audience" against the details it holds, to target adverts at the accounts associated with this contact information.</span></p> <p><span><a href="https://privacyinternational.org/report/3025/facebook-must-explain-what-its-doing-your-phone-number-update">We asked Facebook to explain</a> what they did with people's phone numbers and had a long (and somewhat... confusing) exchange with them.</span></p> <p><span>We do not know exactly when Facebook started asking for users' mobile phone numbers, how many users provided them, or when. We are also unable to interrogate when and why each user uploaded their phone number, what percentage believed this was solely for the purposes of 2FA, and why they believed this. 
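The matching step described above can be sketched in a few lines. Facebook's advertiser documentation describes matching on normalised, SHA-256-hashed contact details; the exact normalisation rule and the function names below are illustrative assumptions, not Facebook's actual code:

```python
import hashlib

def hash_contact(contact: str) -> str:
    """Normalise (here: trim and lowercase, an assumed rule) and hash a contact detail."""
    return hashlib.sha256(contact.strip().lower().encode("utf-8")).hexdigest()

def match_custom_audience(uploaded_contacts, accounts):
    """Return the account IDs whose stored contact detail matches an uploaded one.

    uploaded_contacts: contact details (emails/phone numbers) from an advertiser.
    accounts: mapping of account ID -> contact detail held by the platform.
    """
    uploaded_hashes = {hash_contact(c) for c in uploaded_contacts}
    return {acct for acct, contact in accounts.items()
            if hash_contact(contact) in uploaded_hashes}

accounts = {"alice": "alice@example.com", "bob": "+15551234567"}
print(match_custom_audience([" Alice@Example.COM "], accounts))  # -> {'alice'}
```

The point of the sketch is that the matching works regardless of *why* a contact detail was stored: a phone number uploaded for 2FA is indistinguishable, at this step, from one volunteered for advertising.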
However, it is clear that many people provided their phone numbers believing that it would make their accounts more secure, and as a result, many companies were able to conduct targeted advertising based on this user data.</span></p> <p><span>Disclosing or using information provided for security purposes for any other purpose, including advertising, is unacceptable: we believe that companies should protect their users' safety and never use critical security features for profit.</span></p> <p> </p> <p><em>Photo: <a href="https://en.wikipedia.org/wiki/File:Twitter_Headquarters.jpg">Twitter Headquarters - Christinatt, photo by Troy Holden</a> (CC Attribution-Share Alike 3.0 Unported)</em></p></div> <div class="field field--name-field-large-image field--type-image field--label-above"> <div class="field__label">Large Image</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-10/Twitter_Headquarters.jpg" width="1024" height="683" alt="Twitter Headquarters - Christinatt (photo by Troy Holden)" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-list-image field--type-image field--label-above"> <div class="field__label">List Image</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-10/Twitter_Headquarters_0.jpg" width="1024" height="683" alt="Twitter Headquarters - Christinatt (photo by Troy Holden)" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-list-icon field--type-image field--label-above"> <div class="field__label">List Icon</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-10/Twitter_Headquarters_1.jpg" width="1024" height="683" alt="Twitter Headquarters - Christinatt (photo by Troy Holden)" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-issue field--type-entity-reference field--label-above"> <div class="field__label">What PI is fighting for</div> <div class="field__items"> <div class="field__item"><a 
href="/what-we-do/expose-data-exploitation-data-profiling-and-decision-making" hreflang="en">Expose Data Exploitation: Data, Profiling, and Decision Making</a></div> <div class="field__item"><a href="/what-we-do/protect-people-and-communities-online" hreflang="en">Protect People and Communities Online</a></div> </div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Learn more</div> <div class="field__items"> <div class="field__item"><a href="/topics/adtech" hreflang="en">AdTech</a></div> <div class="field__item"><a href="/topics/data-exploitation" hreflang="en">Data Exploitation</a></div> </div> </div> <div class="field field--name-field-programme field--type-entity-reference field--label-above"> <div class="field__label">Strategic Area</div> <div class="field__items"> <div class="field__item"><a href="/strategic-areas/challenging-corporate-data-exploitation" hreflang="en">Challenging Corporate Data Exploitation</a></div> <div class="field__item"><a href="/strategic-areas/safeguarding-peoples-dignity" hreflang="en">Safeguarding Peoples&#039; Dignity</a></div> </div> </div> <div class="field field--name-field-campaign-name field--type-entity-reference field--label-above"> <div class="field__label">What PI is Campaigning on</div> <div class="field__items"> <div class="field__item"><a href="/campaigns/tell-companies-stop-exploiting-your-data" hreflang="en">Tell companies to stop exploiting your data!</a></div> <div class="field__item"><a href="/campaigns/when-social-media-makes-you-target" hreflang="en">When Social Media makes you a target</a></div> </div> </div> <div class="field field--name-field-targeted-adversary field--type-entity-reference field--label-above"> <div class="field__label">More about this Adversary</div> <div class="field__items"> <div class="field__item"><a href="/taxonomy/term/577" hreflang="en">Facebook</a></div> </div> </div> </div> </div> Thu, 10 Oct 2019 09:42:30 
+0000 staff 3251 at http://privacyinternational.org PI response to confused governments’ confusing declaration of war and victory on encryption http://privacyinternational.org/news-analysis/3245/pi-response-confused-governments-confusing-declaration-war-and-victory <div class="layout layout--onecol"> <div class="layout__region layout__region--content"> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>Today’s announcement regarding the UK and US agreement signed pursuant to the US CLOUD Act is being touted on both sides of the Atlantic as a major victory for law enforcement and security. But it is a step backward for privacy.</p> <p>And it’s far more complicated than their press release and letter to industry.</p> <p>The agreement replaces the prior system, under which law enforcement agencies from around the world, including the UK, had to meet US legal standards in order to get access to content held by US service providers - like Facebook and Google - in the US. It turns out the US protections with regard to such content are in fact robust - requiring proof of probable cause before that content can be turned over to law enforcement. This was protecting many people across the world.</p> <p>The MLAT system, which the new UK-US agreement replaces with regard to the UK, is flawed.  But its flaws are mainly logistical. The US government does not devote enough resources to processing requests from other countries - leading to a backlog. Instead of choosing to spend more money, the US has decided to let other countries get access to content stored in the US under their own legal regimes - most of which are not as privacy protective. For instance, we’ve often written about the flaws in the UK regime and <a href="https://privacyinternational.org/news-analysis/3242/no-uk-hasnt-just-signed-treaty-meaning-end-end-end-encryption">do so again here</a>. 
Indeed, even in its first case study released today, the UK Home Office notes it struggled to meet the US probable cause standard. It is unfortunate that the US has chosen to permit the UK to sidestep such privacy protections instead of investing more in improving the MLAT process.</p> <p>The UK is also not satisfied. In addition to its newfound access under the CLOUD Act agreement, the UK is yet again calling for significant restrictions on the encryption of communications and devices. End-to-end encryption protects the communications of people across the world: from protestors in Hong Kong, to journalists in Colombia, to human rights defenders in Egypt, to aid workers and <a href="https://privacyinternational.org/report/2509/humanitarian-metadata-problem-doing-no-harm-digital-era">humanitarian operations</a>. Encryption protects all of them. Either it is turned on for everyone, or it is broken for everyone.</p> <p>Device encryption helps make sure that our lives are not laid bare to others when our laptops or mobile phones are lost or stolen. (See our guidance for politicians on <a href="https://staging.privacyinternational.org/blog/657/winning-debate-encryption-101-guide-politicians">understanding encryption policy</a>.)</p> <p>In seeking to improve the strength of the encryption applied to our communications, Facebook is trying to be a good actor (something <a href="https://privacyinternational.org/corporateabusetimeline?tid=440">we do not say often in the privacy context</a>). Building more secure systems is important for everyone and urgent. 
</p></div> <div class="field field--name-field-large-image field--type-image field--label-above"> <div class="field__label">Large Image</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-10/889DEFC1-E8CA-4AAD-B267-4C1C93AA595A.png" width="588" height="413" alt="Crypto" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-list-image field--type-image field--label-above"> <div class="field__label">List Image</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-10/8DC54613-4A74-4850-8A21-F9B9359B6043.png" width="588" height="413" alt="Crypto" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-list-icon field--type-image field--label-above"> <div class="field__label">List Icon</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-10/2DD2CEC1-D45C-4CDE-826A-E5D6AE3415B3.png" width="588" height="413" alt="Crypto" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Learn more</div> <div class="field__items"> <div class="field__item"><a href="/topics/cross-border-data-access" hreflang="en">Cross Border Data Access</a></div> <div class="field__item"><a href="/topics/encryption" hreflang="en">Encryption</a></div> </div> </div> <div class="field field--name-field-programme field--type-entity-reference field--label-above"> <div class="field__label">Strategic Area</div> <div class="field__items"> <div class="field__item"><a href="/strategic-areas/government-exploitation" hreflang="en">Government Exploitation</a></div> </div> </div> <div class="field field--name-field-type-of-intervention field--type-entity-reference field--label-above"> <div class="field__label">Related work PI does</div> <div class="field__item"><a href="/how-we-fight/educational" hreflang="en">Educational</a></div> </div> <div class="field field--name-field-type-of-impact 
field--type-entity-reference field--label-above"> <div class="field__label">Type of Impact</div> <div class="field__items"> <div class="field__item"><a href="/impact/winning-and-losing-and-still-fighting-crypto-wars" hreflang="en">Winning and losing and still fighting the crypto wars</a></div> </div> </div> <div class="field field--name-field-education-course field--type-entity-reference field--label-above"> <div class="field__label">Education material</div> <div class="field__items"> <div class="field__item"><a href="/education/data-and-surveillance" hreflang="en">Data and Surveillance</a></div> </div> </div> <div class="field field--name-field-date field--type-datetime field--label-above"> <div class="field__label">Date</div> <div class="field__item"><time datetime="2019-10-03T12:00:00Z" class="datetime">Thursday, October 3, 2019</time> </div> </div> </div> </div> Thu, 03 Oct 2019 21:55:34 +0000 tech-admin 3245 at http://privacyinternational.org Social media companies have failed to provide adequate advertising transparency to users globally http://privacyinternational.org/long-read/3244/social-media-companies-have-failed-provide-adequate-advertising-transparency-users <span class="field field--name-title field--type-string field--label-hidden">Social media companies have failed to provide adequate advertising transparency to users globally</span> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span lang="" about="/user/1" typeof="schema:Person" property="schema:name" datatype="">tech-admin</span></span> <span class="field field--name-created field--type-created field--label-hidden">Thursday, October 3, 2019</span> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p><span><span><em><span lang="EN-US" xml:lang="EN-US" xml:lang="EN-US"><span>An analysis of what Facebook, Google, and Twitter have done to provide users with political ad transparency as of 
September 2019. Our full analysis is linked below.</span></span></em></span></span></p> <p><span><span><span><span>Recently the role of social media and search platforms in political campaigning and elections has come under scrutiny. Concerns range from the spread of disinformation, to profiling of users without their knowledge, to micro-targeting of users with tailored messages, to interference by foreign entities, and more. Significant attention has been paid to the transparency of political ads, and more broadly to the transparency of online ads.</span></span></span></span></p> <p><span><span><span><span>Notably, in the lead up</span></span><span lang="EN-US" xml:lang="EN-US" xml:lang="EN-US"><span> to the 2019 EU Parliamentary elections Facebook, Google, and Twitter, as well as the Interactive Advertising Bureau and others, agreed to take a series of steps to prevent online disinformation on their respective platforms. These measures are reflected in a self-regulatory </span></span><a href="https://ec.europa.eu/digital-single-market/en/news/code-practice-disinformation"><span lang="EN-US" xml:lang="EN-US" xml:lang="EN-US"><span>Code of Practice on Disinformation</span></span></a> <span><span><span lang="EN-US" xml:lang="EN-US" xml:lang="EN-US"><span><span>and t</span></span></span></span></span><span lang="EN-US" xml:lang="EN-US" xml:lang="EN-US"><span><span>he </span></span></span><span lang="EN-US" xml:lang="EN-US" xml:lang="EN-US"><span>companies provided the European Commission with monthly updates on their progress. The Commission’s final report, due in<span><span> </span></span>November 2019, will indicate to what extent the companies met their commitments within the Code of Practice. 
Already, however, there has been </span></span><a href="https://blog.mozilla.org/blog/2019/04/29/facebooks-ad-archive-api-is-inadequate/"><span lang="EN-US" xml:lang="EN-US" xml:lang="EN-US"><span>criticism</span></span></a><span lang="EN-US" xml:lang="EN-US" xml:lang="EN-US"><span> of the companies’ actions, which as will be documented below, were comparatively minimal. Additionally, while the Code of Practice is intended to be applied within the EU, PI has also looked at the implementation of similar measures in other countries. We are concerned that the measures are enforced unequally in different parts of the world, leading to unfair treatment of users.</span></span></span></span></p> <p><span><span><span lang="EN-US" xml:lang="EN-US" xml:lang="EN-US"><span>This document outlines PI’s understanding of what steps Facebook, Google and Twitter have taken so far, PI’s analysis of the companies’ shortcomings in online political and issue-based advertising transparency, and suggestions for what companies should do moving forward.</span></span></span></span></p> <p> </p> <h3><span><span><strong><span><span>Main takeaways</span></span></strong></span></span></h3> <ul><li><span><span><span><span><span>Companies have applied different policies in different countries. In jurisdictions where companies are under pressure to act (by governments, institutions such as the EU, or civil society), they have adopted self-regulatory practices. Where such pressure is absent, they have, by and large, failed to act.</span></span></span></span></span></li> <li><span><span><span><span><span>Facebook, Google, and Twitter have taken a blatantly fragmented approach to providing users with political ad transparency. 
Most users around the world lack meaningful insight into how ads are being targeted through these platforms.</span></span></span></span></span> <ul><li><span><span><span><strong><span><span>Facebook</span></span></strong><span><span> provides heightened transparency for political ads in 35 countries (roughly 17% of the countries in the world). This means that for roughly 83% of the countries in the world, the company does not require political advertisers to become authorised, for political ads to carry disclosures, or for ads to be archived.</span></span></span></span></span></li> <li><span><span><span><strong><span><span>Google</span></span></strong><span><span> provides heightened transparency for political ads in 30 countries (roughly 15% of the countries in the world).</span></span></span></span></span></li> <li><span><span><span><strong><span><span>Twitter</span></span></strong><span><span> provides heightened transparency for ads tied to specific elections (rather than political ads more generally) in 32 countries (roughly 16% of the countries in the world).</span></span></span></span></span> <ul><li><span><span><span><span><span>Outside of the US,<strong> Twitter</strong> does not treat political ads or political issue ads differently from promoted tweets, meaning that these ads (which are political, but not tied to an election), run without heightened transparency.</span></span></span></span></span> <ul><li><span><span><span><span><span>Within the analysis, PI has provided an example of a UK Brexit party ad being run on Twitter without being marked political, and therefore with no targeting information provided. 
The ad has since been deleted.</span></span></span></span></span></li> </ul></li> </ul></li> </ul></li> <li><span><span><span><span><span>The companies do not provide meaningful transparency into political issue ads (which each platform defines separately, or not at all) that have run or are running on their platforms.</span></span></span></span></span> <ul><li><span><span><span><strong><span><span>Google</span></span></strong><span><span> has not defined what it considers to be "political issues", and therefore transparency into what political issue ads have run or are running, to whom they are being shown, how much was spent, etc., is impossible.</span></span></span></span></span></li> </ul></li> <li><span><span><span><span><span>The targeting provided by Facebook, Google, and Twitter is inadequate - it is still impossible to meaningfully understand who political advertisers are targeting across the three platforms.</span></span></span></span></span> <ul><li><span><span><span><span><span>The ad libraries of Facebook, Google, and Twitter, to varying degrees, provide broad ranges of targeting information on some ads in some countries, instead of meaningful insight into how an ad was targeted. This is especially problematic given the granularity with which advertisers, political or not, are able to micro-target ads at users.</span></span></span></span></span> <ul><li><span><span><span><span><span>Google is especially deficient given that it </span></span><a href="https://transparencyreport.google.com/political-ads/advertiser/AR492809907662225408/creative/CR248237839714615296"><span><span>only provides</span></span></a><span><span> broad ranges of insight, such as 100K-1M as the number of times an ad was shown, rather than meaningful information about how an ad or campaign was targeted.</span></span></span></span></span></li> </ul></li> </ul></li> </ul><p> </p> <p><em>Facebook asked PI to make its full comments publicly available. 
They are available below.</em></p></div> <div class="field field--name-field-topic field--type-entity-reference field--label-inline"> <div class="field__label">Learn more</div> <div class="field__items"> <div class="field__item"><a href="/learning-topics/data-and-elections" hreflang="en">Data and Elections</a></div> </div> </div> <div class="field field--name-field-issue field--type-entity-reference field--label-inline"> <div class="field__label">What PI is fighting for</div> <div class="field__items"> <div class="field__item"><a href="/what-we-do/expose-data-exploitation-data-profiling-and-decision-making" hreflang="en">Expose Data Exploitation: Data, Profiling, and Decision Making</a></div> </div> </div> <div class="field field--name-field-programme field--type-entity-reference field--label-inline"> <div class="field__label">Strategic Area</div> <div class="field__items"> <div class="field__item"><a href="/strategic-areas/defending-democracy-and-dissent" hreflang="en">Defending Democracy and Dissent</a></div> </div> </div> <div class="field field--name-field-attachments field--type-file field--label-inline"> <div class="field__label">Attachments</div> <div class="field__items"> <div class="field__item"> <span class="file file--mime-application-pdf file--application-pdf"> <a href="http://privacyinternational.org/sites/default/files/2019-10/cop-2019_0.pdf" type="application/pdf; length=936488">cop-2019_0.pdf</a></span> </div> <div class="field__item"> <span class="file file--mime-application-pdf file--application-pdf"> <a href="http://privacyinternational.org/sites/default/files/2019-10/facebook-102019.pdf" type="application/pdf; length=128868">facebook-102019.pdf</a></span> </div> </div> </div> <div class="field field--name-field-campaign-name field--type-entity-reference field--label-above"> <div class="field__label">What PI is Campaigning on</div> <div class="field__items"> <div class="field__item"><a href="/campaigns/empowering-people-advertising-transparency" 
hreflang="en">Empowering people with advertising transparency</a></div> </div> </div> <div class="field field--name-field-targeted-adversary field--type-entity-reference field--label-above"> <div class="field__label">More about this Adversary</div> <div class="field__items"> <div class="field__item"><a href="/taxonomy/term/577" hreflang="en">Facebook</a></div> <div class="field__item"><a href="/taxonomy/term/578" hreflang="en">Google</a></div> </div> </div> <div class="field field--name-field-date field--type-datetime field--label-above"> <div class="field__label">Date</div> <div class="field__item"><time datetime="2019-10-03T12:00:00Z" class="datetime">Thursday, October 3, 2019</time> </div> </div> Thu, 03 Oct 2019 18:10:35 +0000 tech-admin 3244 at http://privacyinternational.org No, the UK Hasn’t Just Signed a Treaty Meaning the End of End-to-End Encryption  http://privacyinternational.org/news-analysis/3242/no-uk-hasnt-just-signed-treaty-meaning-end-end-end-encryption <div class="layout layout--onecol"> <div class="layout__region layout__region--content"> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>A new UK Times <a href="https://www.thetimes.co.uk/article/police-can-access-suspects-facebook-and-whatsapp-messages-in-deal-with-us-q7lrfmchz">report</a> claims that “WhatsApp, Facebook and other social media platforms will be forced to disclose encrypted messages from suspected terrorists, paedophiles and other serious criminals under a new treaty between the UK and the US.”</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>Several other media outlets have followed up on the report, with headlines <a 
href="https://finance.yahoo.com/news/uk-us-set-sign-treaty-155558746.html?guccounter=1">such as</a> “UK and US set to sign treaty allowing UK police ‘back door’ access to WhatsApp and other ‘end to end encrypted’ messaging platforms”.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>While the Treaty is of concern, and while there are indeed a number of very real and imminent threats to the use of secure and encrypted applications, such headlines are misleading and need clarification and context.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>(TL/DR: No, this particular Treaty doesn’t mean the end of end-to-end encryption, but there are other threats out there you should be concerned about.)</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span><strong>What is the new “Treaty”?</strong></span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>The Times reports that the UK Home Secretary will sign an agreement next month “that compels US social media companies to hand over information to the police, security services and prosecutors”. Presumably, this refers to agreements signed under the Clarifying Lawful Overseas Use of Data <a href="https://epic.org/privacy/cloud-act/">(CLOUD) Act</a>, signed into law in the US in 2018. 
</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>Essentially, the CLOUD Act aims to allow non-US governmental agencies to request access to data stored in the US and to allow US agencies to request data from companies based abroad. </span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>The CLOUD Act was appended to another bill and <a href="https://www.eff.org/deeplinks/2018/03/responsibility-deflected-cloud-act-passes">rushed</a> through the US legislature with minimal scrutiny just as <em>United States v. Microsoft Corp.</em> was being heard before the US Supreme Court in 2018. The case would have determined whether or not US law enforcement bodies could compel communications service providers to hand over data held abroad, in this case whether Microsoft would have to hand over emails and other private information associated with a particular account based in Ireland. </span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>Privacy International intervened in the case, <a href="https://privacyinternational.org/legal-action/united-states-v-microsoft-corp-law-enforcement-cross-border-data-access">arguing</a> that if the US government were to succeed in unilaterally seizing data stored abroad, it would set the stage for repeated violations of data-protection laws around the world. 
</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>The passing of the CLOUD Act, however, meant the case was deemed moot by the court.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>The CLOUD Act allows the US executive to enter into agreements with other countries to allow their national agencies to make data requests directly to communications service providers located in the US, who otherwise may be prevented from complying with those requests because of US law protecting the privacy of communications. </span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>There are some safeguards. For example, the US Department of Justice states that any countries which enter into agreement “require significant privacy protections and a commitment to the rule of law.” </span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>This might sound reasonable enough, especially as the current system for international data requests known as the Mutual Legal Assistance Treaty is notoriously slow, but there are significant problems with the CLOUD Act (which could have been rectified within the law had more consultation and scrutiny been allowed). 
</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>For example, in the US, access to communications content requires a judicial finding of probable cause, which is not the case in the UK. British agencies are also allowed to use broad thematic warrants, which are not targeted at named individuals but rather at entire groups of people. While such thematic warrants are not permitted under the CLOUD Act, the oversight of those warrants, once served on US companies, is not sufficient to guarantee this prohibition will be followed. Therefore, UK agencies could make requests based on a legal process with significantly fewer safeguards than the US process.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>The agreement that the UK and US are about to sign may bolster some of these protections, but we do not know because we have not seen the text of the agreement. 
That is another reason for concern.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>Overall, the CLOUD Act is controversial: to read more, see our legal commentary available <a href="https://www.lawfareblog.com/weak-link-double-act-uk-law-inadequate-proposed-cross-border-data-request-deal">here</a> and <a href="https://privacyinternational.org/feature/2002/us-uk-deal-both-sides-deserve-scrutiny">here</a>, or other analyses by our friends at the <a href="https://www.aclu.org/blog/privacy-technology/internet-privacy/cloud-act-dangerous-piece-legislation">ACLU</a>, <a href="https://www.eff.org/deeplinks/2018/02/cloud-act-dangerous-expansion-police-snooping-cross-border-data">EFF</a>, and <a href="https://cdt.org/blog/cloud-act-implementation-issues/">CDT</a>.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span><strong>Does the CLOUD Act mandate encryption backdoors?</strong></span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>The implementation of the CLOUD Act does not, however, mean that communications service providers will have to decrypt people’s communications in order to comply.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>The Department of Justice has said that the CLOUD Act is “encryption neutral” – meaning it neither requires decryption nor stops governments from ordering decryption to the extent authorized by their laws. 
“This neutrality allows for the encryption issue to be discussed separately among governments, companies, and other stakeholders”, <a href="https://www.justice.gov/opa/press-release/file/1153446/download">according</a> to the Department.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>While the Act is neutral on this point, it’s important to remember that particularly for services such as WhatsApp, only the content of communications is end-to-end encrypted, not the metadata (the who, what, and where associated with the account and messages). This metadata, while commonly perceived to be less sensitive than the content of messages, is actually just as invasive: indeed, the European Court of Human Rights recently agreed with Privacy International in our challenge to the UK’s mass surveillance regime, <a href="https://privacyinternational.org/feature/2267/uk-mass-interception-law-violates-human-rights-fight-against-mass-surveillance">finding</a> that collection of such metadata could be as intrusive as the content of the communication itself.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span><strong>While the CLOUD Act doesn’t deal directly with end-to-end encryption, there are still threats to encryption coming from the UK and elsewhere</strong></span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>Make no mistake, government agencies are ramping up efforts to access secure end-to-end encrypted communications. 
Earlier this year, government representatives from Australia, Canada, New Zealand, the UK, and the US (“The 5 Eyes”) issued a <a href="https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/822818/Joint_Meeting_of_FCM_and_Quintet_of_Attorneys_FINAL.pdf">statement</a> calling on tech companies to “include mechanisms in the design of their encrypted products and services whereby governments, acting with appropriate legal authority, can obtain access to data in a readable and usable format”.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>There is even an idea on the table that would see tech companies forced to secretly add a law enforcement participant to group chats. Known as the Ghost Protocol, Privacy International <a href="https://privacyinternational.org/news-analysis/3002/ghosts-your-machine-spooks-want-secret-access-encrypted-messages">joined</a> other NGOs, security researchers, and companies earlier this year to express shared concerns about the serious threats to cybersecurity and fundamental human rights, including privacy and free expression, posed by the proposal.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>The Ghost Protocol is one possible instantiation of the UK’s broad power under the Investigatory Powers Act 2016 to serve “<a href="http://www.legislation.gov.uk/ukpga/2016/25/section/253/enacted">technical capability notices</a>” on service providers – even those who operate outside of the UK. 
These notices can be used when “necessary for securing that the operator has the capability to provide any assistance which the operator may be required to provide in relation to any relevant authorisation”, including interception warrants. They may include “obligations relating to the removal by a relevant operator of electronic protection applied by or on behalf of that operator to any communications or data.” So while the CLOUD Act may not threaten end-to-end encryption, UK law does.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span><strong>Going forward…</strong></span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>Secure encryption is vital to allow people to use technology without fear of governments, corporations, criminals or others accessing their private data. As tech plays a more central role in our everyday lives, from communicating to banking to using the fridge, encryption will become even more important in the future. </span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>While the CLOUD Act isn’t the end of end-to-end encryption, that doesn’t mean there is nothing to worry about. We will continue to monitor and challenge threats to people’s rights and cybersecurity, recognising that accurately reporting developments, especially developments that are happening purposefully out of public view, is incredibly difficult – but important. 
</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p></div> <div class="field field--name-field-large-image field--type-image field--label-above"> <div class="field__label">Large Image</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-10/Screenshot%202019-10-01%20at%2013.01.35.png" width="588" height="413" alt="e2e encryption" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-list-image field--type-image field--label-above"> <div class="field__label">List Image</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-10/Screenshot%202019-10-01%20at%2013.01.35_0.png" width="588" height="413" alt="e2e encryption" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-list-icon field--type-image field--label-above"> <div class="field__label">List Icon</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-10/Screenshot%202019-10-01%20at%2013.01.35_1.png" width="588" height="413" alt="e2e encryption" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-issue field--type-entity-reference field--label-above"> <div class="field__label">What PI is fighting for</div> <div class="field__items"> <div class="field__item"><a href="/what-we-do/fight-government-hacking-powers" hreflang="en">Fight Government Hacking Powers</a></div> </div> </div> <div class="field field--name-field-programme field--type-entity-reference field--label-above"> <div class="field__label">Strategic Area</div> <div class="field__items"> <div class="field__item"><a href="/strategic-areas/defending-democracy-and-dissent" 
hreflang="en">Defending Democracy and Dissent</a></div> <div class="field__item"><a href="/strategic-areas/government-exploitation" hreflang="en">Government Exploitation</a></div> </div> </div> <div class="field field--name-field-type-of-intervention field--type-entity-reference field--label-above"> <div class="field__label">Related work PI does</div> <div class="field__item"><a href="/how-we-fight/legal-brief-or-intervention" hreflang="en">Legal Brief or Intervention</a></div> </div> <div class="field field--name-field-campaign-name field--type-entity-reference field--label-above"> <div class="field__label">What PI is Campaigning on</div> <div class="field__items"> <div class="field__item"><a href="/campaigns/security-should-protect-people-not-exploit-them" hreflang="en">Security should protect people, not exploit them</a></div> </div> </div> </div> </div> Tue, 01 Oct 2019 12:01:56 +0000 staff 3242 at http://privacyinternational.org GFF Challenge to use of government spyware (Germany) http://privacyinternational.org/legal-case-description/3240/gff-challenge-use-government-spyware-germany <div class="node node--type-legal-case-description node--view-mode-token ds-2col-stacked-fluid clearfix"> <div class="group-header"> <div id="field-language-display"><div class="js-form-item form-item js-form-type-item form-type-item js-form-item- form-item-"> <label>Language</label> English </div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>Federal Constitutional Court (Germany)</p> <p>Case number: 2 BvR 1850/18</p> <p>Status: Open</p> <p>On 30 September 2019, Privacy International submitted a statement to the German Constitutional Court in a case concerning the government use of spyware, such as state trojans, in the context of criminal investigations. 
</p> <p>The case was originally brought by the German organisation <a href="https://freiheitsrechte.org/trojaner/">Gesellschaft für Freiheitsrechte (GFF)</a> in 2018. Specifically, on 22 August 2018, the GFF lodged a constitutional complaint in Karlsruhe against the use of so-called state trojans and the state’s handling of IT security gaps in the absence of proper accountability.</p> <p>In 2017, the German Code of Criminal Procedure (StPO) was amended to allow investigating authorities to "impinge" upon information technology systems in order to collect data from them. This, in turn, would require the installation of software that reads data from the device of the person being targeted and transmits it to law enforcement authorities. Such software is generally referred to as “state trojans”.</p> <p>As a form of government surveillance, state trojans present unique and grave threats to privacy and security. They have the potential to be far more intrusive than any other surveillance technique, permitting the government to remotely and surreptitiously access personal devices and all the intimate information they store. They also permit the government to conduct novel forms of real-time surveillance, by covertly turning on a device's microphone, camera, or GPS-based locator technology, or by capturing continuous screenshots or seeing anything input into and output from the device. The use of trojans allows governments to manipulate data on devices, by deleting, corrupting or planting data; recovering data that has been deleted; or adding or editing code to alter or add capabilities, all while erasing any trace of the intrusion. The targets of such surveillance are not confined to devices. 
They can also extend to communications networks and their underlying infrastructure.</p> <p>At the same time, the use of state trojans has the potential to undermine the security of targeted devices, networks or infrastructure, and potentially even the internet as a whole. Computer systems are complex and almost certainly contain vulnerabilities that third parties can exploit to compromise their security. Government use of state trojans often depends on exploiting vulnerabilities in systems to facilitate a surveillance objective. Government hacking may also involve manipulating people to interfere with their own systems. These latter techniques prey on user trust, the loss of which can undermine the security of systems and the internet.</p> <p>Focusing on the privacy and security concerns raised <span lang="EN-US" xml:lang="EN-US" xml:lang="EN-US">by the government use of state trojans, the statement <span lang="EN-US" xml:lang="EN-US" xml:lang="EN-US">puts forward the following submissions: </span></span></p> <p>1. The use of state trojans can threaten the essence of the rights to privacy and data protection under international and European human rights law, if not properly constrained;</p> <p>2. The use of state trojans violates states’ obligations to effectively guarantee the security and integrity of IT systems;</p> <p>3. The use of state trojans may be incompatible with the principles of necessity and proportionality under both international and European law.</p> <p>Privacy International believes that the manner in which Germany decides to deal with IT system vulnerabilities, in the context of state trojans, affects not only people residing in Germany, but also potentially everyone who uses the World Wide Web. 
We therefore hope that our statement will assist the Court in assessing the fundamental rights concerns raised by the use of state trojans.</p> <p>***</p> <p><span lang="DE" xml:lang="DE" xml:lang="DE">Bundesverfassungsgericht<span lang="DE" xml:lang="DE" xml:lang="DE"> (BVerfG)</span></span></p> <p><span lang="DE" xml:lang="DE" xml:lang="DE">2 BvR 1850/18</span></p> <p><span lang="DE" xml:lang="DE" xml:lang="DE">Stand: Offen</span></p> <p><span lang="DE" xml:lang="DE" xml:lang="DE">Am 30. September 2019 hat Privacy International mit einem Schreiben vor dem Bundesverfassungsgericht Stellung genommen, in dem es um den Einsatz von Staatstrojanern im Rahmen von Ermittlungsverfahren geht.</span></p> <p><span lang="DE" xml:lang="DE" xml:lang="DE">Die Verfassungsbeschwerde wurde ursprünglich von der <a href="https://freiheitsrechte.org/trojaner/">Gesellschaft für Freiheitsrechte (GFF)</a> im Jahr 2018 eingelegt. Im Einzelnen hat die GFF am 22. August 2018 <span lang="DE" xml:lang="DE" xml:lang="DE">Beschwerde gegen den Einsatz von sogenannten Staatstrojanern und den unverantwortlichen staatlichen Umgang mit IT-Sicherheitslücken eingelegt.</span></span></p> <p><span lang="DE" xml:lang="DE" xml:lang="DE">Im Jahr 2017 wurde die StPO dahingehend geändert, dass die Untersuchungsbehörden in die Informationstechnologiesysteme "eingreifen" können, um Daten von ihnen zu erheben. Dies wiederum würde die Installation von Software erfordern, die Daten aus dem Gerät der betroffenen Person ausliest und an die Strafverfolgungsbehörden überträgt. Diese Überwachungstechnologie wird allgemein als "Staatstrojaner" bezeichnet.</span></p> <p><span lang="DE" xml:lang="DE" xml:lang="DE">Als eine Form der staatlichen Überwachung stellt der Einsatz von Staatstrojanern eine einzigartige und schwerwiegende Bedrohung für die Privatsphäre und Sicherheit dar. 
Er hat das Potenzial, weitaus eindringlicher zu sein als jede andere Überwachungstechnik, da er es der Regierung ermöglicht, aus der Ferne und heimlich auf persönliche Geräte und alle darin gespeicherten vertraulichen Informationen zuzugreifen.</span></p> <p><span lang="DE" xml:lang="DE" xml:lang="DE">Der Einsatz von Staatstrojanern erlaubt der Regierung auch, neuartige Formen der Echtzeitüberwachung durchzuführen, indem sie im Geheimen die Mikrofon-, Kamera- oder GPS-basierte Ortungstechnologie eines Geräts einschaltet, kontinuierliche Screenshots macht oder alles sieht, was in das Gerät eingegeben und von ihm ausgegeben wird. Er ermöglicht es Regierungen, Daten auf Geräten zu manipulieren, indem sie Daten löschen, beschädigen oder einpflanzen, gelöschte Daten wiederherstellen oder Code-Änderungen oder -Editierungen vornehmen, um Fähigkeiten zu ändern oder hinzuzufügen, während sie gleichzeitig jede Spur des Eindringens löschen. Diese Ziele sind nicht auf Geräte beschränkt. Sie können sich auch auf Kommunikationsnetze und die ihnen zugrunde liegende Infrastruktur erstrecken.</span></p> <p><span lang="DE" xml:lang="DE" xml:lang="DE">Gleichzeitig hat der Einsatz von Staatstrojanern das Potenzial, die Sicherheit von Zielgeräten, Netzwerken oder Infrastrukturen und möglicherweise sogar des Internets als Ganzes zu gefährden. Computersysteme sind komplex und enthalten mit an Sicherheit grenzender Wahrscheinlichkeit Schwachstellen, die Dritte ausnutzen können, um ihre Sicherheit zu gefährden. Regierungshacken hängt oft davon ab, Schwachstellen in Systemen auszunutzen, um ein Überwachungsziel zu erreichen. Hacking kann auch die Manipulation von Menschen beinhalten, um in ihr eigenes System einzugreifen. 
Diese letztgenannten Techniken setzen auf das Vertrauen der Nutzer, dessen Verlust die Sicherheit der Systeme und des Internets untergraben kann.</span></p> <p><span lang="DE" xml:lang="DE" xml:lang="DE">In dieser Stellungnahme, die sich auf die Datenschutz- und Sicherheitsbedenken konzentriert, die durch den Einsatz von Staatstrojanern hervorgerufen werden, werden die folgenden Punkte aufgegriffen: </span></p> <p><span lang="DE" xml:lang="DE" xml:lang="DE">I. Der Einsatz von Staatstrojanern kann den Kern des Rechts auf Privatsphäre und Datenschutz nach internationalen und europäischen Menschenrechtsbestimmungen gefährden, wenn er nicht angemessen eingeschränkt wird. </span></p> <p><span lang="DE" xml:lang="DE" xml:lang="DE">II. Der Einsatz von Staatstrojanern verstößt gegen die Verpflichtung der Staaten, die Sicherheit und Integrität von IT-Systemen wirksam zu gewährleisten. </span></p> <p><span lang="DE" xml:lang="DE" xml:lang="DE">III. 
Der Einsatz von Staatstrojanern kann mit den Grundsätzen der Notwendigkeit und Verhältnismäßigkeit sowohl nach internationalem als auch nach europäischem Recht unvereinbar sein. </span></p> <p><span lang="DE" xml:lang="DE" xml:lang="DE">Privacy International ist der Auffassung, dass die Nutzung von Systemschwachstellen durch Staatstrojaner nicht nur Menschen mit Wohnsitz in Deutschland, sondern auch potenziell alle Internetnutzer betrifft, da die Ausnutzung von Systemschwachstellen in der Regel heißt, dass diese nicht zeitnah geschlossen werden. Wir hoffen, dass unsere Stellungnahme das Bundesverfassungsgericht bei der Beurteilung der Grundrechtsanliegen, die durch den Einsatz von Staatstrojanern aufkommen, unterstützen wird. 
</span></p> <p> </p> <p>List photo by Adam Jones, <a href="https://creativecommons.org/licenses/by-sa/2.0">CC BY-SA 2.0</a></p></div> <div class="field field--name-field-legal-proceedings field--type-entity-reference field--label-above"> <div class="field__label">Legal Action</div> <div class="field__items"> <div class="field__item"><a href="/legal-action/gff-challenge-use-government-spyware-germany" hreflang="en">GFF Challenge to use of government spyware (Germany)</a></div> </div> </div> <div class="field field--name-field-location-region-locale field--type-entity-reference field--label-above"> <div class="field__label">Location / Region / Locale</div> <div class="field__items"> <div class="field__item"><a href="/location/germany" hreflang="en">Germany</a></div> </div> </div> <div class="field field--name-field-icon field--type-image field--label-above"> <div class="field__label">Icon</div> <div class="field__item"> <img src="/sites/default/files/2019-09/Replica_of_Trojan_Horse_-_Canakkale_Waterfront_-_Dardanelles_-_Turkey_%285747677790%29.jpg" width="500" height="750" alt="trojan_horse_replica" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-list-image field--type-image field--label-above"> <div class="field__label">List Image</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-09/Replica_of_Trojan_Horse_-_Canakkale_Waterfront_-_Dardanelles_-_Turkey_%285747677790%29.jpg" width="500" height="750" alt="trojan_horse_replica" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-programme field--type-entity-reference field--label-above"> <div class="field__label">Strategic Area</div> <div class="field__items"> <div class="field__item"><a href="/strategic-areas/government-exploitation" hreflang="en">Government Exploitation</a></div> </div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Learning Topic</div> <div 
class="field__items"> <div class="field__item"><a href="/learning-topics/government-hacking" hreflang="en">Government Hacking</a></div> </div> </div> <div class="field field--name-field-issue field--type-entity-reference field--label-above"> <div class="field__label">Our Interventions</div> <div class="field__items"> <div class="field__item"><a href="/what-we-do/fight-government-hacking-powers" hreflang="en">Fight Government Hacking Powers</a></div> </div> </div> </div> <div class="group-footer"> </div> </div> Mon, 30 Sep 2019 17:17:27 +0000 staff 3240 at http://privacyinternational.org Buying a smart phone on the cheap? Privacy might be the price you have to pay http://privacyinternational.org/long-read/3226/buying-smart-phone-cheap-privacy-might-be-price-you-have-pay <span class="field field--name-title field--type-string field--label-hidden">Buying a smart phone on the cheap? Privacy might be the price you have to pay</span> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span lang="" about="/user/1" typeof="schema:Person" property="schema:name" datatype="">tech-admin</span></span> <span class="field field--name-created field--type-created field--label-hidden">Friday, September 20, 2019</span> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p><em>Research by Privacy International shows that cheap smartphones come with a hidden cost: pre-installed apps that can't be deleted and that leak your data.</em></p> <p> </p></div> <div class="field field--name-field-repeating-image-and-text field--type-entity-reference-revisions field--label-hidden field__items"> <div class="field__item"> <div class="paragraph paragraph--type--image-and-text-repeating paragraph--view-mode--default"> <div class="clearfix text-formatted field field--name-field-fieldset-text field--type-text-long field--label-hidden field__item"><p><span><span><span>Last year, a Privacy 
International member of staff travelled alone to the Philippines to meet with the Foundation for Media Alternatives (FMA), one of our <a href="https://www.fma.ph/"><span>partner organisations</span></a>. As she arrived in Manila, she realised her phone was broken - a fairly big problem for someone who's on their own, 6,666 miles from home. Her first stop, straight off the plane, was the closest shop to buy a cheap phone to tell her friends and loved ones she'd landed safely. That's how she ended up with a brand new MYA 2, a smart phone by the Filipino brand MyPhone that cost no more than 19 USD at the time.</span></span></span></p> <p><span><span><span>Over the past few years, smart phones have become incredibly inexpensive. Cheap smart phones are one of the reasons more than half the world's population is now online, very slowly closing the global digital divide. While growing connectivity is undeniably positive, some device vendors have <a href="https://www.wsj.com/articles/app-traps-how-cheap-smartphones-help-themselves-to-user-data-1530788404"><span>recently</span></a> come under scrutiny for siphoning user data and for their invasive data collection practices.</span></span></span></p> <p><span><span><span>That's why we decided to take a closer look at the MYA 2 once our colleague returned to the UK. We were particularly interested in the phone's pre-installed apps (often called 'bloatware'), what permissions these apps make use of, and how they behave. 
Earlier this year, the first <a href="https://arxiv.org/abs/1905.02713"><span>large-scale study</span></a> of pre-installed software on Android devices - from more than 200 vendors - found harmful behaviours and backdoored access to sensitive data and services without user consent or awareness.</span></span></span></p></div> </div> </div> <div class="field__item"> <div class="paragraph paragraph--type--image-and-text-repeating paragraph--view-mode--default"> <div class="field field--name-field-fieldset-image field--type-image field--label-hidden field__item"> <img src="/sites/default/files/flysystem/2019-09/PhoneFrontSmall.jpg" width="3167" height="2328" alt="Phone from the front" typeof="foaf:Image" /> </div> <div class="clearfix text-formatted field field--name-field-fieldset-text field--type-text-long field--label-hidden field__item"><p><span><span><strong><span><span>The basics</span></span></strong></span></span></p> <p><span><span><span>The first thing we noticed was that the phone was running Android 6.0, an outdated version of the operating system released in 2015. This is curious, given that MyPhone is a "certified Android Partner", meaning the device has been tested for security and performance, comes with Google apps pre-installed, and is Play Protect certified. 
Since the phone is running outdated software, users are being exposed to known security vulnerabilities, which have been patched in more up-to-date versions of Android.</span></span></span></p> <p><span><span><strong><span><span>When bloatware gets political</span></span></strong></span></span></p> <p><span><span><span>Besides the "usual apps" (calculator, clock, Google services…), our MYA 2 came with the following apps pre-installed:</span></span></span></p> <ul><li><span><span><span><span>MyPhoneRegistration - an app for registering the device with the manufacturer</span></span></span></span></li> <li><span><span><span><span>Pinoy - a portal app, providing various services, such as news, podcasts and well-being themed content</span></span></span></span></li> <li><span><span><span><span>Facebook Lite - a slimmed-down version of the Facebook app</span></span></span></span></li> <li><span><span><span><span>Brown Portal - a portal with a browser-like interface for MyPhone users</span></span></span></span></li> </ul><p><span><span><span>Additionally, the phone came with a number of pre-loaded apps that existed in the phone's filesystem but were not installed:</span></span></span></p> <ul><li><span><span><span><span>Baidu_Location – Location Service for Baidu Maps</span></span></span></span></li> <li><span><span><span><span>Xender – A tool for file synchronisation and transmission</span></span></span></span></li> </ul><p><span><span><span>As these pre-loaded apps were not active on the device, we didn't test them. 
These apps are likely destined for MyPhone's alternative market, China, rather than the domestic market of the Philippines.</span></span></span></p> <p><a href="https://privacyinternational.org/node/3229"><span><span><strong><span>MyPhoneRegistration app</span></strong></span></span></a></p> <p><span><span><span>MyPhoneRegistration is an app that allows you to register your device with MyPhone, making it easier for you to do things like access warranties or get software updates, and allowing MyPhone to send you advertisements and promotional material. The app gets permissions to make and manage phone calls, to send and view SMS messages, and to access storage.</span></span></span></p> <p><span><span><span>Pre-installed applications can be installed as "privileged apps", giving them far wider access to the phone than user permissions would give them. Also, because they're privileged, they often cannot be removed by the user. Since the app is not available on the Google Play Store, it cannot receive updates.</span></span></span></p> <p><span><span><span>MyPhone have confirmed this with the following response to this article: "For the models we have launched before, we have lost access and support to update the apps we have pre-installed, but we remain committed to provide a secure platform to our new and upcoming devices by complying to the latest Google requirements to keep the devices secure."</span></span></span></p> <p><span><span><span>Based on a network analysis we conducted using <a href="https://privacyinternational.org/node/2732"><span>our app testing environment</span></a>, we discovered that the app tries to contact the remote server without any security protocol (SSL/TLS). This means that personal information like your IMEI number (a unique number that identifies the device), name, date of birth and gender is shared without encryption with <a href="http://www.zed.com/"><span>Zed</span></a> (the company hosting MyPhoneRegistration’s server). 
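To illustrate why transmission without SSL/TLS matters, here is a minimal Python sketch of what a passive eavesdropper on the same network could recover from such a request. The path, hostname and field names below are hypothetical, for illustration only; they are not taken from the app itself.

```python
# Sketch: recovering personal data from a captured plain-HTTP request.
# The endpoint, path and form field names are hypothetical placeholders.
from urllib.parse import parse_qs

# A plaintext HTTP POST as it would appear on the wire to anyone
# sniffing the same (e.g. Wi-Fi) network.
captured_packet = (
    "POST /register HTTP/1.1\r\n"
    "Host: registration.example.com\r\n"
    "Content-Type: application/x-www-form-urlencoded\r\n"
    "\r\n"
    "imei=356938035643809&name=Jane+Doe&dob=1990-01-01&gender=F"
)

def extract_leaked_fields(packet: str) -> dict:
    """Split the headers from the body and decode the form-encoded fields."""
    _, _, body = packet.partition("\r\n\r\n")
    return {key: values[0] for key, values in parse_qs(body).items()}

leaked = extract_leaked_fields(captured_packet)
print(leaked)  # every field, including the device-identifying IMEI, is readable
```

With TLS in place, the same observer would see only ciphertext; here, no decryption step is needed at all.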
It appears that this server is no longer at the path hardcoded into the app. As a result, the phone endlessly tries to transmit the details, as the connection to the missing server always fails. It does so insecurely, giving away the user's name, gender, date of birth and IMEI to any eavesdropper on any network that the user connects to, for example Wi-Fi.</span></span></span></p></div> </div> </div> <div class="field__item"> <div class="paragraph paragraph--type--image-and-text-repeating paragraph--view-mode--default"> <div class="field field--name-field-fieldset-image field--type-image field--label-hidden field__item"> <img src="/sites/default/files/flysystem/2019-09/Screenshot%202019-08-15%20at%2017.21.33.png" width="1396" height="660" alt="data being transmitted" typeof="foaf:Image" /> </div> <div class="field field--name-field-caption field--type-string field--label-hidden field__item">Screenshot from mitmproxy, showing personal data being transmitted unencrypted</div> <div class="clearfix text-formatted field field--name-field-fieldset-text field--type-text-long field--label-hidden field__item"><p><span><span>We also identified vulnerabilities that could allow a malicious individual with physical access to the phone to run their own code in the MyPhoneRegistration app context, with the same privileges as the app itself. When combined with other known vulnerabilities within Android 6.x, this could compromise the device remotely. As this app cannot be updated or deleted by the user, this vulnerability threatens the user permanently.</span></span></p> <h4><a href="https://privacyinternational.org/node/3227"><span><span><strong>Brown Portal app</strong></span></span></a></h4> <p><span><span>Brown Portal, which is also not available on the Google Play Store, has permission to access the phone's storage, which means it has access to your photos and files. 
The pre-installed app, which can't be deleted, also gets access to your network information and has location permission, which means your location can be tracked at all times.</span></span></p> <p><span><span>Brown Portal is part of the Brown and Proud movement, a campaign launched by MyPhone's parent company, <a href="https://web.archive.org/web/20180527182652/http:/www.yugatech.com/mobile/brown-and-proud-a-new-local-tech-brand-with-an-mlm-model/">Solid Group Inc</a>. The campaign was designed to create smartphones and Internet of Things devices for the Philippines that celebrate Filipino identity and <a href="https://brown.com.ph/about">empower consumers to become leaders.</a> It's unclear what happened to the campaign, but Brown Portal is still included in <a href="https://gadgetpilipinas.net/2018/09/myphone-brown-2-review/">recent MyPhone devices</a>, at least as of September 2018.</span></span></p> <h4><a href="https://privacyinternational.org/node/3230"><span><span><strong>Pinoy app</strong></span></span></a></h4> <p><span><span>The Pinoy app is another app by MyPhone that comes <a href="https://www.pinoytechnoguide.com/2013/01/myphone-pinoy-content-download-and-setup.html">pre-installed with every MyPhone device</a>. Similar to Brown Portal, Pinoy – a nickname referring to Filipino people – alludes to a sense of national identity. The app offers a number of paid services, such as music or access to news and entertainment (jokes, horoscopes, "guidance", "experience", "advice"…) - much of which is religious or political.</span></span></p> <p><span><span>The guidance section, for instance, suggests that users subscribe to a platform called Reform Ph "focusing on reforming the Philippines through improvement of the political systems". 
Users can also sign up to a service that sends them a "quote to brighten up their day" or to a daily SMS service that "focuses on the beauty of life and the goodness/faithfulness of our God" for just 2.50 Philippine pesos (Php) a day. Other paid services include Bible comics and Christian music, for Php 5 three times a week.</span></span></p> <p><span><span>The Pinoy app also contains free services, such as "My Faith", a section that contains downloadable audio recordings of Christian prayers, and “My Country”, a section that celebrates Filipino culture with a wide range of content, including a Filipino history book, recordings of famous movie lines, Filipino quotes, riddles, games and "pick-up lines".</span></span></p> <p><span><span>Pinoy gets permission to access the phone's contacts, location, SMS and storage, and to make phone calls. The app cannot be deleted, and also communicates with Zed servers over an insecure channel. It isn't on the Google Play Store and therefore cannot receive updates.</span></span></p> <h4><a href="https://privacyinternational.org/node/3228"><span><span><strong>Facebook Lite app</strong></span></span></a></h4> <p><span><span>Facebook Lite, an Android app designed for low-speed connections and low-cost phones, also comes pre-installed with the phone. To function in those conditions, the app uses less RAM and CPU power than the regular Facebook app and is most popular in India and the United States. Lite exists so that Facebook users who are using old phones not supported by the regular app can still access Facebook.</span></span></p> <p><span><span>The app gets permission to access your calendar, camera, contacts, location, microphone, phone, SMS and storage. Facebook Lite is available on the Google Play Store and can be updated; however, when uninstalled through the Play Store, it just reverts to the pre-installed version, which still cannot be uninstalled. 
In other words, the app cannot be removed.</span></span></p> <p><span><span>Facebook Lite was in the news earlier this year, when Facebook left the passwords of between 200 and 600 million accounts <a href="https://www.vice.com/en_us/article/qvy9k7/facebook-hundreds-of-millions-user-passwords-plaintext-data-leak">exposed</a> to its 20,000 employees. The leak, which was revealed in 2019, had been happening since 2012, and affected users who had logged in at least once using Facebook Lite.</span></span></p></div> </div> </div> <div class="field__item"> <div class="paragraph paragraph--type--image-and-text-repeating paragraph--view-mode--default"> <div class="field field--name-field-fieldset-image field--type-image field--label-hidden field__item"> <img src="/sites/default/files/flysystem/2019-09/DSC_4026.jpg" width="4928" height="3264" alt="Phone from the rear" typeof="foaf:Image" /> </div> <div class="clearfix text-formatted field field--name-field-fieldset-text field--type-text-long field--label-hidden field__item"><h3><span><span><strong>Data exploitation by design and by default</strong></span></span></h3> <p><span><span>Our case study of a single low-cost smartphone shows how data exploitation and poor security are often built into the devices that people rely on as their only means of communication.</span></span></p> <p><span><span>We discovered multiple security issues with pre-installed apps that can't be updated or deleted. Since the phone is shipped with an outdated version of Android, it comes with known vulnerabilities that will not be patched and that can be exploited cheaply by anyone, from scammers to government agencies.</span></span></p> <p><span><span>More fundamentally, though, our findings raise the question of whether cheap phones are at least partially subsidised by exploitative data practices. Aside from Facebook Lite, the apps we highlighted above are all tied to the manufacturer, MyPhone. 
Some of them offer paid services, which means extra revenue for MyPhone; others, like Brown Portal, are there to promote MyPhone as a brand and encourage the purchase of other devices. Since these apps make use of extensive permissions, they also get access to a lot of user data. The fact that some apps contain religious and patriotic content raises questions as to the potential for political parties to exploit cheap phones in countries with limited democratic accountability.</span></span></p> <h3><span><span><strong>Privacy: a human right, not a luxury</strong></span></span></h3> <p><span><span>Privacy is a fundamental right guaranteed under the Universal Declaration of Human Rights, at least in theory. In reality, there are stark contrasts between regions that uphold high standards of data protection, and places where users are at the mercy of what we call the <a href="https://privacyinternational.org/long-read/2677/we-need-fix-data-wild-west">data wild west</a>. </span></span>In some places, like the Philippines, there might be a legal framework in place to regulate the processing of personal data, but accountability and enforcement mechanisms remain a challenge.</p> <p><span><span>For those who live in the data wild west and can only afford cheap phones as their <a href="https://www.idrc.ca/en/project/understanding-digital-access-and-use-global-south">sole way to access the internet</a>, we're now also seeing that privacy is becoming a luxury that few can afford. While buying a recent Apple phone will guarantee you a secure Operating System (OS) and good encryption, buying a brand new MyPhone, like we did, will leave you with an OS with vulnerabilities left unpatched for years, and apps like MyPhoneRegistration that share your personal data in plain text. 
Even downloading apps that offer secure communications proved extremely difficult.</span></span></p> <h3><span><span><strong>What Google and manufacturers should do</strong></span></span></h3> <p><span><span>It is time for this double punishment to end. Being economically vulnerable should not mean losing your fundamental rights, and companies have a responsibility to protect their consumers. In particular, it is time for Android to confront its duties and obligations: MyPhone is not a random company that happens to be using the open-source Android OS, it’s an official <a href="https://www.android.com/certified/partners/">Android certified partner</a>.</span></span></p> <p><span><span>Google claims that certified partners offer "Play Protect certified Android devices [that] are tested for security and performance and pre-loaded with Google apps". The device we looked at is not only insecure, but it's also pre-loaded with apps that cannot be found on the Google Play Store. This, and the fact that the phone comes with an outdated version of Android, raises questions about the criteria Google applies to certify partners.</span></span></p> <p><span><span>Ultimately, pre-installed apps undermine the Android brand, especially when certified partners pre-load their phones with insecure apps that scoop up large amounts of user data. It's up to Google to make sure that manufacturers using Google's trademarks don't sully the Android brand, and don't take advantage of customers who can only afford cheap phones.</span></span></p> <p><span><span>Phone companies themselves, however, should not escape responsibility. 
While technology needs to be accessible to all, our human rights should not be the price we have to pay for it.</span></span></p> <p><span><span>Jam Jacobs of the <a href="https://privacyinternational.org/partners/foundation-media-alternatives">Foundation for Media Alternatives</a> said the following about Privacy International's research:</span></span><br />  </p> <blockquote> <p>That affordable technology is facilitated by compromised individual rights is far too common a phenomenon these days. And while, as a problem, the risks it poses are not restricted to global south jurisdictions like the Philippines, these regions’ populations remain the most vulnerable both in terms of protections and available legal remedies. This report by Privacy International highlights these points and more.<br />  <br /> As a civil society organisation advocating for human rights in the digital realm, FMA echoes the call for private companies to take their responsibility of upholding customer rights more seriously. We would add that the government should also proactively take up the cudgels on behalf of its citizens most of whom find themselves beholden to the whims of big businesses. Regulators cannot expect the private sector to get things right all on their own. 
That would be a sure recipe for failure, and an abandonment of their clear mandate as public servants.</p> </blockquote></div> </div> </div> <div class="field__item"> <div class="paragraph paragraph--type--image-and-text-repeating paragraph--view-mode--default"> <div class="field field--name-field-fieldset-image field--type-image field--label-hidden field__item"> <img src="/sites/default/files/flysystem/2019-09/IMG_0443.jpg" width="3024" height="4032" alt="The MYA2 box with Play Protect label" typeof="foaf:Image" /> </div> <div class="field field--name-field-caption field--type-string field--label-hidden field__item">Google Play Protect is prominent on the phone's packaging</div> <div class="clearfix text-formatted field field--name-field-fieldset-text field--type-text-long field--label-hidden field__item"><p> </p> <p>In the course of creating this report, we contacted both MyPhone and Zed, highlighting our concerns. Zed didn't reply before publication. MyPhone sent the following statement via email on 17 September 2019:</p> <blockquote> <p>"With our goal to deliver a unique experience to our end-users, we have pre-installed our in-house apps like Brown Portal, Pinoy and Registration, which bring contents based on their interest, promoting Pinoy culture, send them our latest product updates and services. In this way, we can continue to improve our future products and services we offer.</p> </blockquote> <blockquote> <p>At the same time, as we value the privacy of our user base, we have now our Privacy Policy accessible in our latest and upcoming devices that covers the collection, use, disclosure, transfer, and storage of their personal information.</p> </blockquote> <blockquote> <p>We hope that you will reconsider the messaging that you want to convey on your article. 
We at MyPhone value the privacy of our consumers as well as you are and we are dedicated to improve our privacy controls towards acceptable standards."</p> </blockquote> <p> </p></div> </div> </div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-inline"> <div class="field__label">Learn more</div> <div class="field__items"> <div class="field__item"><a href="/learning-topics/smartphones-and-privacy" hreflang="en">Smartphones and Privacy</a></div> </div> </div> <div class="field field--name-field-type-of-intervention field--type-entity-reference field--label-inline"> <div class="field__label">Related work PI does</div> <div class="field__item"><a href="/how-we-fight/technical-analysis" hreflang="en">Technical Analysis</a></div> </div> <div class="field field--name-field-programme field--type-entity-reference field--label-inline"> <div class="field__label">Strategic Area</div> <div class="field__items"> <div class="field__item"><a href="/strategic-areas/challenging-corporate-data-exploitation" hreflang="en">Challenging Corporate Data Exploitation</a></div> </div> </div> <div class="field field--name-field-location-region-locale field--type-entity-reference field--label-inline"> <div class="field__label">Location</div> <div class="field__items"> <div class="field__item"><a href="/location/philippines" hreflang="en">Philippines</a></div> </div> </div> <div class="field field--name-field-targeted-adversary field--type-entity-reference field--label-above"> <div class="field__label">More about this Adversary</div> <div class="field__items"> <div class="field__item"><a href="/taxonomy/term/578" hreflang="en">Google</a></div> </div> </div> <div class="field field--name-field-device field--type-entity-reference field--label-above"> <div class="field__label">Device</div> <div class="field__item"><a href="/taxonomy/term/723" hreflang="en">MyPhone MYA2</a></div> </div> Fri, 20 Sep 2019 07:00:00 +0000 tech-admin 3226 at 
http://privacyinternational.org Buying a cheap smartphone? 7 things you need to know http://privacyinternational.org/node/3234 <div class="node node--type-frequently-asked-question node--view-mode-token ds-2col-stacked-fluid clearfix"> <div class="group-header"> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p><span><span><span><span>Privacy is a fundamental right and shouldn’t be a luxury. But if you have a cheap phone, your data might be at risk. </span></span></span></span></p></div> <div class="field field--name-field-question-1 field--type-faqfield field--label-hidden field__items"> <div class="field__item"><div id="faqfield_field_question_1_node_3234"><h3 class="faqfield-question">Cheap phones, how can they be so cheap?</h3><div class="faqfield-answer"><p><span><span><span><span>Over the past few years, smartphones have become incredibly inexpensive. So inexpensive that more than half the world’s population is now online, powered by new smartphones for as little as 15 USD.</span></span></span></span></p><p><span><span><span><span>Modern smartphones are often sold as a <a href="https://www.investopedia.com/terms/l/lossleader.asp">loss-leader</a>. In fact, a significant number of phone manufacturers <a href="https://thenextweb.com/apple/2016/08/22/apple-samsung-smartphone-income/">aren’t making a profit on their handsets</a>. Handsets are partially subsidised by the products and services that come with pre-installed apps – as well as the data that these harvest and share. </span></span></span></span></p><p><span><span><span><span>One of the biggest costs incurred by a manufacturer is long-term support, such as patching. That’s why most low-cost devices have limited warranties and rarely receive system updates.</span></span></span></span></p><p><span><span><span><span>Telecommunication companies also subsidise smartphones, particularly those sold on pay-monthly contracts. 
They will bundle their own apps and services.</span></span></span></span></p></div><h3 class="faqfield-question">What are preinstalled apps (aka bloatware) and why are they on my phone?</h3><div class="faqfield-answer"><p><span><span><span><span>When you first unbox your new phone and turn it on, it might come as a surprise to see that there are already a number of apps preinstalled on the device beyond what's part of Android. Such apps may include the manufacturer’s own app store, utilities and even games or social media apps that you've no intention of ever using.</span></span></span></span></p><p><span><span><span><span>Since iPhones are sold at a higher price point, cheap phones usually run on the Android Operating System (OS). The openness of the Android source code makes it possible for any manufacturer to ship a custom version of the OS along with proprietary pre-installed apps.</span></span></span></span></p><p><span><span><span><span>Pre-installed apps get added by a range of actors. Chipset makers (such as MediaTek, Qualcomm) add apps that are generally hidden from the user, as they provide APIs for other components in the phone. Manufacturers add apps that contribute to the "unique" selling point of the device, such as health apps, camera/video apps or audio services. Some apps are also bundled with system updates delivered over the air, piggybacking on a phone manufacturer’s Android system update. Finally, the telecommunications provider or the vendor may include their own apps, such as video-on-demand services, their own browsers or tools to check account information.</span></span></span></span></p></div><h3 class="faqfield-question">Why do some pre-installed apps come with privacy and security risks?</h3><div class="faqfield-answer"><p><span><span><span><span>While some apps may be useful to users, some pre-installed apps on your phone are used to offset the cost of the phone itself. 
Harmful pre-installed apps harvest and share data from the device, commit click fraud, or come with security vulnerabilities. Earlier this year, the first large-scale </span></span><span><span><span><a href="https://haystack.mobi/papers/preinstalledAndroidSW_preprint.pdf">study of pre-installed software on Android devices from more than 200 vendors</a> </span></span></span><span><span>found harmful behaviours and backdoored access to sensitive data and services without user consent or awareness.</span></span></span></span></p><p><span><span><span><span>One of the fundamental problems of pre-installed apps is that they can exist outside of standardised update processes. In other words: they don’t receive updates, even when vulnerabilities are discovered, which means that the apps could be compromised.</span></span></span></span></p><p><span><span><span><span>Such compromise may be worse for pre-installed apps than it is for the apps you choose to install yourself. This is because pre-installed apps often make use of "custom-permissions" that allow app developers to define activities on a device outside the scope of the standard permission suite that Android now uses. For example, when you install an app, you are usually asked whether you want the app to have access to your camera, microphone etc. But for pre-installed apps, a developer could specify a custom permission to access the camera, and then use the camera without the permission of the user.</span></span></span></span></p><p><span><span>Some pre-installed apps use exploits to root devices. Sometimes, it's not possible to delete them. Other malicious behaviour of pre-installed apps includes built-in malware, data exploitation or ad- and click-fraud. 
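</span></span></p> <p>As a sketch of the custom-permission mechanism described above, a manifest fragment along the following lines (the package and permission names are entirely hypothetical) shows how a pre-installed app can declare a permission of its own invention and request it for itself. Because it is not one of Android's standard permissions, the user is never prompted about it:</p>

```xml
<!-- Hypothetical AndroidManifest.xml fragment, for illustration only. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.preinstalled">

    <!-- The app declares its own custom permission... -->
    <permission
        android:name="com.example.preinstalled.permission.HIDDEN_ACCESS"
        android:protectionLevel="signature" />

    <!-- ...and requests it for itself. It never appears in the standard
         user-facing prompts, yet it can gate functionality shared between
         pre-installed components signed with the same key. -->
    <uses-permission
        android:name="com.example.preinstalled.permission.HIDDEN_ACCESS" />
</manifest>
```

<p><span><span>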
Bloatware can also make it impossible to install important security patches if it takes up too much storage and then can't be deleted.</span></span></p></div><h3 class="faqfield-question">Who is at fault?</h3><div class="faqfield-answer"><p><span><span><span><span>It’s the developers of pre-installed apps who make design decisions that undermine the security and privacy of your devices. That said, manufacturers of cheap phones often seem to be using access to user data as a way to subsidise the phone. In other words: you are paying with your data. Some manufacturers are also being deceived by malicious app developers. As Google's Android Security 2018 Year in Review remarks: "developers of pre-installed PHAs [Potentially Harmful Apps] only need to deceive the device manufacturer or another company in the supply chain instead of large numbers of users, so it’s easier to achieve large-scale distribution."</span></span></span></span></p><p><span><span><span><span>There’s also a broader issue here. <a href="https://haystack.mobi/papers/preinstalledAndroidSW_preprint.pdf">Academic research on pre-installed apps</a> has concluded that “the supply chain around Android’s open source model lacks transparency” and that this “has facilitated potentially harmful behaviours and backdoored access to sensitive data and services without user consent or awareness”.</span></span></span></span></p><p><span><span>We think that Google could do more to address the privacy and security concerns with pre-installed apps, for instance by banning pre-installed apps that can’t be deleted, by increasing transparency around the Android certification process and by better enforcing their own rules.</span></span></p></div><h3 class="faqfield-question">What’s the difference between Android (Open Source) and Google&#039;s Android? </h3><div class="faqfield-answer"><p><span><span><span><span>Android is a mobile operating system developed by Google. 
Google designs, develops and markets its own Android smartphones, such as the Pixel. The source code for Android is open-source, and since Google publishes most of the code under the non-copyleft Apache License version 2.0, anybody can modify and redistribute the code. The license does not grant rights to the "Android" trademark, so device manufacturers and wireless carriers have to license it from Google under individual contracts. </span></span></span></span></p><p><span><span><span><span>Android is also associated with a suite of proprietary software developed by Google, called Google Mobile Services, which very frequently comes pre-installed on devices. It usually includes the Google Chrome web browser and Google Search, and always includes core apps for services such as Gmail, Google Play Store, and Google Play Services. </span></span></span></span></p><p><span><span><span><span>This is where it gets complicated. Google licenses their Google Mobile Services software, along with Android trademarks, only to hardware manufacturers for devices that meet Google's compatibility standards specified in the Android Compatibility Program document. The Android team at Google certifies these devices to ensure they are secure and ready to run apps from Google and the Play Store. 
These devices are called Play Protect Certified Android devices and come with a Google Play Protect logo.</span></span></span></span></p></div><h3 class="faqfield-question">Why does Google allow harmful apps to be pre-installed, even on Play Protect Certified Android devices?</h3><div class="faqfield-answer"><p><span><span><span><span>Google is aware of the problem and has dedicated a significant share of its “Android Security and Privacy Year in Review 2018” report to the issue of Potentially Harmful Applications.</span></span></span></span></p><p><span><span><span><span>Since March 2018, Google has begun to block "uncertified" Android devices from using Google Mobile Services software, and now also displays a warning indicating that "the device manufacturer has preloaded Google apps and services without certification from Google".</span></span></span></span></p><p><span><span><span><span>In their 2018 security report, Google <a href="https://threatpost.com/google-warns-of-growing-android-attack-vector-backdoored-sdks-and-pre-installed-apps/143332/">declare</a>: </span></span><span><span><span>“We expanded the program in 2018 and now every new Android-certified device goes through the same app scanning process as apps on Google Play. Additionally, our security scanner looks for other common security and privacy issues and denies device certification until device manufacturers fix these problems.”</span></span></span></span></span></p><p><span><span><span><span><span>While there is certainly awareness, and while there seem to be improvements, our case studies of Android certified phones suggest that the certification process is not working as well as it should. In fact, it seems that people who want to buy a secure new phone that doesn’t violate their privacy can’t rely on the Play Protect logo. 
</span></span></span></span></span></p><p><span><span><span><span>Again, Google could do more to address these concerns – for instance by banning pre-installed apps that can’t be deleted, by increasing transparency around the Android certification process and by better enforcing their own rules.</span></span></span></span></p></div><h3 class="faqfield-question">I have an iPhone, does this mean my privacy is protected?</h3><div class="faqfield-answer"><p><span><span><span><span>iPhones and iOS devices still come with pre-installed apps, some of which you can’t delete, and many of which collect information about you. </span></span></span></span></p><p><span><span><span><span>Fundamentally, the difference between iOS and Android is that the former is a proprietary “closed” system, while Android is “open”. This comes with advantages and disadvantages. One advantage is that Apple controls the OS and also produces the hardware (the same holds true for Google’s own Android phones). As a result, such phones receive timely updates and always run the latest version of the OS. It also means that you know exactly which apps will come pre-installed with an iPhone, so there are fewer surprises.</span></span></span></span></p><p><span><span><span><span>A disadvantage of a closed system is that iDevices also lock users into an Apple ecosystem that can be expensive to leave. 
These lock-in practices can also put users at risk, as happens with Apple's <a href="https://www.wired.com/story/ios-security-imessage-safari/">web browser and messaging platforms</a>, which limit users' ability to adopt more secure alternatives.</span></span></span></span></p><p><span><span><span><span>Privacy International believe that privacy is a fundamental right that should be accessible to everyone.</span></span></span></span></p><p><span><span><span><span>As we’ve said in <a href="https://privacyinternational.org/sites/default/files/2018-12/How%20Apps%20on%20Android%20Share%20Data%20with%20Facebook%20-%20Privacy%20International%202018.pdf">previous research about apps and privacy</a>, both Apple and Google should do much better. They should allow users to block third-party tracking in Android and iOS. Users should be prompted to reset their advertising ID regularly, for instance – not only when resetting a device to its factory settings. We also think that Android and iOS can do better when it comes to device permissions, for instance by giving users the ability to authorise whether an app can connect to the Internet or use certain sensors in the device that can be used to fingerprint and/or profile users. 
</span></span></span></span></p><p><span><span><span><span>When it comes to cheap phones, and harmful pre-installed apps, we think it is Google’s responsibility (and also in the company’s interest) to ensure that manufacturers and telecommunications companies don’t exploit their customers and damage the Android brand in the process.</span></span></span></span></p></div></div></div> </div> </div> <div class="group-footer"> </div> </div> Thu, 19 Sep 2019 21:52:25 +0000 tech-admin 3234 at http://privacyinternational.org #LowCostTech - Philippines Phone Case Study Explained http://privacyinternational.org/video/3231/lowcosttech-philippines-phone-case-study-explained <span class="field field--name-title field--type-string field--label-hidden">#LowCostTech - Philippines Phone Case Study Explained</span> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span lang="" about="/user/1" typeof="schema:Person" property="schema:name" datatype="">tech-admin</span></span> <span class="field field--name-created field--type-created field--label-hidden">Thursday, September 19, 2019</span> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><div class="video-js vjs-default-skin vjs-big-play-centered" style="width: 100%; height: 0; position: relative; padding-bottom: 56.25%;"><iframe allowfullscreen="" frameborder="0" sandbox="allow-same-origin allow-scripts" src="https://media.privacyinternational.org/videos/embed/d8f54ea0-7b8f-417b-a0b9-8a6c3485c9c0?subtitle=en&amp;warningTitle=0" style="width: 100%; height: 100%; position: absolute; top: 0; left: 0;"></iframe></div> <p>Christopher Weatherhead and Eva Blum-Dumontet discuss the findings of Privacy International's report on the <a href="https://staging.privacyinternational.org/taxonomy/term/723">MyPhone MYA2</a> from the Philippines</p> </div> <div class="field field--name-field-large-image field--type-image field--label-above"> <div 
class="field__label">Large Image</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-09/MyPhone-LCT.jpg" width="1920" height="992" alt="image-video" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-list-icon field--type-image field--label-above"> <div class="field__label">List Icon</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-09/MyPhone-LCT_0.jpg" width="1920" height="992" alt="image-video" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-list-image field--type-image field--label-above"> <div class="field__label">List Image</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-09/MyPhone-LCT_1.jpg" width="1920" height="992" alt="image-video" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Learn more</div> <div class="field__items"> <div class="field__item"><a href="/learning-topics/smartphones-and-privacy" hreflang="en">Smartphones and Privacy</a></div> </div> </div> <div class="field field--name-field-type-of-intervention field--type-entity-reference field--label-above"> <div class="field__label">Related types of work PI does</div> <div class="field__item"><a href="/how-we-fight/technical-analysis" hreflang="en">Technical Analysis</a></div> </div> <div class="field field--name-field-device field--type-entity-reference field--label-above"> <div class="field__label">Device</div> <div class="field__item"><a href="/taxonomy/term/723" hreflang="en">MyPhone MYA2</a></div> </div> Thu, 19 Sep 2019 14:11:32 +0000 tech-admin 3231 at http://privacyinternational.org