Privacy International http://privacyinternational.org/rss.xml en Part 1: how anti-abortion activism is exploiting data http://privacyinternational.org/long-read/3096/part-1-how-anti-abortion-activism-exploiting-data <span class="field field--name-title field--type-string field--label-hidden">Part 1: how anti-abortion activism is exploiting data</span> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span lang="" about="/user/1" typeof="schema:Person" property="schema:name" datatype="">tech-admin</span></span> <span class="field field--name-created field--type-created field--label-hidden">Monday, July 22, 2019</span> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p><em>Photo by <a href="https://unsplash.com/@bigkids?utm_source=unsplash&amp;utm_medium=referral&amp;utm_content=creditCopyText">David Werbrouck</a> on <a href="https://unsplash.com/">Unsplash</a></em></p> <p> </p> <p><span><span><em>This is an ongoing series about the ways in which those searching for abortion information and procedures are being traced and tracked online. This work is part of a broader programme of work aimed at </em><a href="https://privacyinternational.org/strategic-areas/safeguarding-peoples-dignity"><em>safeguarding the dignity of people</em></a><em> by challenging current power dynamics, and redefining our relationship with governments, companies, and within our own communities. 
As an enabling right, privacy plays an important role in supporting the exercise of reproductive rights as </em><a href="https://www.ohchr.org/Documents/Publications/NHRIHandbook.pdf"><em>recognised</em></a><em> in international human rights law.</em></span></span></p> <p> </p> <h3><span><span>Intrusive data collection software and digital marketing systems are being developed and promulgated around the world by powerful and politically connected <a href="https://twitter.com/gavinsblog/status/990582509162418176">US-based</a> anti-abortion organisations. </span></span></h3> <p><span><span>As anti-abortion organisations wake up to the utility of personal data to tailor and target messages online, data-intensive technologies and tools are being specifically developed for crisis pregnancy centres – which reportedly sometimes <a href="https://www.huffingtonpost.co.uk/entry/crisis-pregancy-centers-supreme-court_n_5a09f40ae4b0bc648a0d13a2?guccounter=1&amp;guce_referrer=aHR0cHM6Ly9jb25zZW50LnlhaG9vLmNvbS8&amp;guce_referrer_sig=AQAAAH1adad1LR9ZOIdnAoGKFMFwyRWZl93RF7OPspdlMzIydLj63-EPzNAxr2imx0fswpG5qUEOmMSJCjfJ2ndHvcieAP1b1lmEisgi2UrXu_WPs-DSouu6IKfH8O6DtgzEPEym1pXNiUcpIN83icmMMSLOgYu1OrLNpPMQZdk_o7yk">masquerade</a> as licensed medical facilities and which have been criticised for providing those seeking medical help with <a href="https://www.telegraph.co.uk/women/womens-health/10622816/Abortion-scandal-abortions-increase-breast-cancer-risk-claims-counsellor.html">false</a> and <a href="https://rewire.news/article/2017/12/08/google-removes-misleading-anti-choice-fake-clinic-ads/">misleading</a> information. 
</span></span></p> <p><span><span>There have been a number of recent examples of data being used to target people seeking information about abortion online – through <a href="https://foreignpolicy.com/2018/06/01/abortion-referendum-how-ireland-resisted-bad-behaviour-online/">advertising</a>, <a href="https://www.maltatoday.com.mt/news/national/95506/antiabortion_groups_using_deception_and_intimidation_tactics_prochoice_coalition_warns?fbclid=IwAR23xTJEVkCsjoPAZFfKUbMutxK45GO23TOqBpKzRZkC4j9-qFiu2YZDil0#.XSiT45NKhxh">misleading information</a>, and <a href="https://www.thetimes.co.uk/article/fake-abortion-website-faces-legal-action-kmbmrl8f0">fake</a> websites. In this context, as more and more data is collected by and made available to anti-abortion organisations, understanding how they and their political allies aim to use this information is crucial. </span></span></p> <p><span><span>Heartbeat International is an international anti-abortion organisation that is <a href="https://www.nextlevelcms.com/about-us">particularly focused</a> on using data to understand the needs and trends of anti-abortion centres. The organisation <a href="https://www.nextlevelcms.com/about-us">powers</a> a data-intensive content management system that was developed <a href="https://www.nextlevelcms.com/better-together">specifically</a> to remove data silos between anti-abortion centres globally. </span></span></p> <p> </p> <blockquote> <p><span><span><em>“While in Washington, Heartbeat was invited to attend an intimate Pro-Life Advocate roundtable with Vice President Mike Pence. Sitting next to the Vice President of the United States of America, Heartbeat President Jor-El Godsey had the opportunity to share the… life-changing work of pregnancy help organisations.” </em></span></span></p> <p><span><span>This quote was featured in <a href="https://www.heartbeatinternational.org/images/AR18.pdf">Heartbeat International’s 2018 annual report</a><span><span>. 
</span></span><span><span><span><span><span>It illustrates Heartbeat’s close proximity to US political power.</span></span></span></span></span> Such political allies may limit the likelihood that the development and use of data-intensive systems by centres will be regulated or challenged.</span></span></p> </blockquote> <p> </p> <p><span><span>The ability to exercise reproductive rights, as recognised in international human rights law, depends in part on the political will of those in power in a particular country. As an enabling right, privacy plays an important role in supporting the exercise of reproductive rights. In countries where there is opposition to reproductive rights as well as limited data privacy laws, there is a significant risk of people’s data being exploited in an attempt to restrain reproductive rights.  </span></span></p> <p> </p> <h2> </h2> <h2><span><span><em>Heartbeat International</em></span></span></h2> <blockquote> <p><span><span><em>“We believe we’re better together, and so is our data. Knowing the real-time trends of the larger life-affirming community is a crucial, yet untapped gateway to breakthrough success on the local level—until now, that is.”</em></span></span></p> <p><span><span><a href="https://www.heartbeatservices.org/services-home">Source: Heartbeat International website</a></span></span></p> </blockquote> <p> </p> <p><span><span>Heartbeat International is an important player in the global anti-abortion scene. <a href="https://www.heartbeatservices.org/services-home">Self-described</a> as “the largest worldwide network of pregnancy help organizations”, Heartbeat International runs a network of “over 2,700 affiliated pregnancy help organizations worldwide and affiliated pregnancy help organizations in more than 60 countries” – it says it <a href="https://www.heartbeatservices.org/international/international-affiliates">has</a> “700 affiliate locations outside the US”. 
</span></span></p> <p><span><span>Becoming an affiliate <a href="https://www.heartbeatservices.org/about-us/why-affiliate/benefits">provides</a> discounted access to Heartbeat’s anti-abortion web design and digital marketing service, Extend Web Services, as well as its helpline, Option Line. Heartbeat <a href="https://www.heartbeatservices.org/services-home/">markets</a> its content management system, Next Level, to its network of affiliates; the system “harnesses the power of big data” and gives anti-abortion centres “the ability to enter and access information anywhere at any time”.</span></span></p> <p><span><span>Heartbeat also offers its training courses at a discount to its affiliates, which <a href="https://www.heartbeatservices.org/resources/resources-by-topic/networking/8-steps-for-advancing-your-social-media-strategy">include</a> courses such as “<a href="https://www.heartbeatservices.org/resources/resources-by-topic/networking/8-steps-for-advancing-your-social-media-strategy">8 steps for advancing your social media strategy</a>”, “<a href="https://www.heartbeatservices.org/resources/store/hb-conference-recordings/2019-conference-recordings/7-keys-to-google-ad-grants">7 keys to Google Ad Grants</a>”, “<a href="https://www.heartbeatinternational.org/resources/resources-by-topic/networking/search-engine-marketing-101">search engine marketing 101</a>”, and “<a href="https://www.heartbeatinternational.org/resources/store/heartbeat-academy/online-marketing-the-good-the-bad-the-ugly">online marketing</a>”.</span></span></p> <p> </p> <h2><span><span><em>Extend Web Services</em></span></span></h2> <blockquote> <p><span><span><em>“She’s looking online. 
Be there for her.” </em></span></span></p> <p><span><span>This is the message displayed on the homepage of Extend Web Services’ website, promoted by Heartbeat International, which builds campaign tools and websites for anti-choice pregnancy centres.</span></span></p> <p> </p> </blockquote> <p><span><span>In some cases, those searching for abortion information or procedures are in desperate circumstances. The Heartbeat International-supported Extend Web Services is developing websites that attract “<a href="https://extendwebservices.com/about">abortion-minded</a>” people, make them “<a href="https://extendwebservices.com/about">feel comfortable</a>”, and “<a href="https://extendwebservices.com/services/websites">effectively reach women in crisis online</a>”. This can become problematic when a person searching for abortion information online is targeted with misleading information. For <a href="https://rewire.news/article/2017/12/08/google-removes-misleading-anti-choice-fake-clinic-ads/">example</a>, a person may be delayed in obtaining abortion care if they see an ad for what they think is a medical clinic but is in reality a crisis pregnancy centre.</span></span></p> <p><span><span>Extend Web Services was developed to provide anti-abortion crisis pregnancy centres and other related organisations with websites, campaign optimisation tools, local search tools, and design services. 
The company’s mission in part <a href="https://extendwebservices.com/about">states</a>: <em>“We are experts at making sure your website is attracting the abortion-minded client and representing your center in a way that will make your clients feel comfortable with the service they will receive.”</em></span></span></p> <p><span><span>Extend’s services are used by a variety of anti-choice pregnancy centres (for example <a href="https://nlpregnancy.org/">here</a>, <a href="https://www.embarazoayuda.org/ser-informado/educacion-sobre-el-aborto">here</a>, <a href="https://www.carenetpcc.org/be-informed/abortion-information">here</a>, <a href="https://natlhousingcoalition.org/">here</a>, <a href="https://www.apcbrevard.com/">here</a>, <a href="https://www.aaapregnancyoptions.org/about-us/who-we-are">here</a>, <a href="https://www.jdwcenter.org/contact">here</a>) inside the US and globally, such as <a href="https://www.lifelinemalta.eu/contact-us">in Malta</a>, where abortion remains illegal. </span></span></p> <p><span><span>Extend’s out-of-the-box website templates suggest guarded anti-abortion language for the website homepage, navigation, and elsewhere. In response to a request for comment, an Extend representative told PI [emphasis added] that the company restricts its clients’ ability to change the language used on “<strong>5 medical pages</strong>” which are “provided and managed by Extend Web Services/Heartbeat International”. These pages, the representative told PI, include: “<strong>Abortion Information/Education”, “Abortion Recovery”, “Sexual Health”, “Pregnancy”, and “Emergency Contraception</strong>”. The representative further told PI “<em><span><strong>All other pages of the website are able to be fully customized by the client in terms of content, imagery, etc. They are able to request one of the 5 pages listed above to be completely removed from the site if they don't like the content. 
They can provide their own content to fully replace one of those pages as well - they just aren't allowed to edit the content on those pages that we provided.</strong>”</span></em></span></span></p> <p><span><span><span>The closeness of Extend’s relationship to Heartbeat International, which isn’t immediately made clear on their respective websites, is confirmed in the above exchange.</span></span></span></p> <p><span><span>Extend also <a href="https://extendwebservices.com/services/pay-per-click">offers</a> assistance to anti-abortion centres for obtaining Google’s AdWords Grant for Non-Profits. Earlier in 2019 it was <a href="https://www.theguardian.com/technology/2019/may/12/google-advertising-abortion-obria">reported</a> that another anti-choice network had been given $150,000 worth of free ads by Google. Google has been <a href="https://www.theguardian.com/technology/2019/may/12/google-advertising-abortion-obria">criticised</a> for allowing anti-choice organisations to run misleading advertisements, in violation of its own policies.</span></span></p> <p><span><span>Extend’s website design and other tools show a learned understanding of how to communicate with those seeking abortion information online. The increased collection, by those working at crisis pregnancy centres, of data about people seeking abortion information or procedures could be incredibly valuable to companies like Extend in honing their targeting techniques and templates online.</span></span></p> <p> </p> <h2><span><span><em>Option Line</em></span></span></h2> <p><span><span>Option Line is a <a href="https://optionline.org/">website</a>, chat service, and call line that was developed by Heartbeat International for deployment on anti-abortion websites. 
Extend Web Services <a href="https://extendwebservices.com/services/websites">includes</a> Option Line’s chat service, which runs on LiveChat software, in the website packages it provides by default, and the chat widget is visible on many of the anti-choice websites Extend builds for centres.</span></span></p></div> <div class="field field--name-field-repeating-image-and-text field--type-entity-reference-revisions field--label-hidden field__items"> <div class="field__item"> <div class="paragraph paragraph--type--image-and-text-repeating paragraph--view-mode--default"> <div class="field field--name-field-fieldset-image field--type-image field--label-hidden field__item"> <img src="/sites/default/files/flysystem/2019-07/1.png" width="616" height="960" alt="Image source: Extend Web Services website" typeof="foaf:Image" /> </div> <div class="clearfix text-formatted field field--name-field-fieldset-text field--type-text-long field--label-hidden field__item"><p><em><span><span>Source: Extend Web Services <a href="https://extendwebservices.com/services/websites">website</a></span></span></em></p> <p> </p> <p> </p></div> </div> </div> <div class="field__item"> <div class="paragraph paragraph--type--image-and-text-repeating paragraph--view-mode--default"> <div class="field field--name-field-fieldset-image field--type-image field--label-hidden field__item"> <img src="/sites/default/files/flysystem/2019-07/Picture2.png" width="1085" height="287" alt="Source: screenshots from two crisis pregnancy centre websites." 
typeof="foaf:Image" /> </div> <div class="clearfix text-formatted field field--name-field-fieldset-text field--type-text-long field--label-hidden field__item"><p><em><span><span>Source: screenshots from two crisis pregnancy centre websites.</span></span></em></p> <p> </p> <p><span><span>Before beginning a chat, the Option Line chat interface requires visitors to enter their name, demographic information, and location information, as well as whether they are considering an abortion. Only after submitting this personal information does the chat begin. It is unclear where the data submitted before the chat begins, as well as the data generated during the chat, ends up, and who has access to it. When PI raised this, a representative from Extend responded [emphasis added] “<em><span><strong>All information is safe and secure within Heartbeat International and only Heartbeat International has access to this information</strong>” </span></em><span>but it is still not clear who specifically at Heartbeat has access to this data, whether this also includes Heartbeat subsidiary programmes such as the Next Level Content Management System, how the data is used and stored, and for how long.</span></span></span></p> <p><span><span>This data has the potential to include medical and health data, which is subject to heightened privacy laws in the US and the EU, as well as other countries around the world. Option Line’s <a href="https://optionline.org/terms-of-use/">terms of use</a> state that “all remarks” sent through the website – other than information directly requested – can be used by Option Line “for any and all purposes” that it believes “to be appropriate to the mission and vision of Option Line”. 
</span></span></p></div> </div> </div> <div class="field__item"> <div class="paragraph paragraph--type--image-and-text-repeating paragraph--view-mode--default"> <div class="field field--name-field-fieldset-image field--type-image field--label-hidden field__item"> <img src="/sites/default/files/flysystem/2019-07/Picture3.png" width="947" height="641" alt="Image source: Option Line website" typeof="foaf:Image" /> </div> <div class="clearfix text-formatted field field--name-field-fieldset-text field--type-text-long field--label-hidden field__item"><p><em><span><span>Source: Option Line <a href="https://extendwebservices.com/services/websites">website</a></span></span></em></p> <p> </p> <p><span><span>While it is unclear how the data is used, and by whom, what is said in the chat could be very valuable to Heartbeat and its affiliate anti-choice organisations. For example, representatives from local anti-abortion clinics could be told the name of someone who is “abortion-minded”, and attempt to reach them by other <a href="https://www.maltatoday.com.mt/news/national/95506/antiabortion_groups_using_deception_and_intimidation_tactics_prochoice_coalition_warns?fbclid=IwAR23xTJEVkCsjoPAZFfKUbMutxK45GO23TOqBpKzRZkC4j9-qFiu2YZDil0#.XSiT45NKhxh">methods</a>.</span></span></p> <p> </p> <blockquote> <p><span><span><em>“The data your organisation collects needs to work not just for you, but for the rest of the pregnancy help movement.” Source: </em><a href="https://www.nextlevelcms.com/better-together"><em>Next Level website</em></a></span></span></p> </blockquote> <p> </p> <h2><span><span><em>Next Level Content Management Solution </em></span></span></h2> <p><span><span>It may not be clear to those visiting crisis pregnancy centres how the health and medical information collected from them will be used and shared. 
People – especially those desperate for medical help, those in areas with a limited number of abortion clinics, or those visiting a clinic where their first language is not spoken – are likely to provide whatever information is asked of them during a visit without question.</span></span></p> <p><span><span>In 2017, Heartbeat International <a href="https://pregnancyhelpnews.com/next-level">unveiled</a> the Next Level Content Management Solution (CMS). The system appears to unify what questions people are asked when seeking a centre’s help, and to centralise the information that visitors to anti-abortion centres are asked to provide during their visit. The types of information collected, which are visible in a promotional video on Next Level’s website, <a href="https://www.nextlevelcms.com/">include</a> name, address, email address, ethnicity, marital status, living arrangement, education, income source, alcohol, cigarette, and drug intake, medications and medical history, sexually transmitted disease history, name of the referring person/organisation, pregnancy symptoms, pregnancy history, medical testing information, and eventually even ultrasound photos.</span></span></p> <p><span><span>Next Level <a href="https://www.heartbeatinternational.org/next-level-supporter">markets</a> the software as a system that “[m]akes seamless data collection possible for pregnancy centres”. 
They say that it “allows information to move from the receptionist to the client, from the client to the coach or mentor, and from the mentor to the nurse’s office”, and visualise the system as data streams flowing from individual pregnancy centres to a centralised cloud.</span></span></p> <p> </p></div> </div> </div> <div class="field__item"> <div class="paragraph paragraph--type--image-and-text-repeating paragraph--view-mode--default"> <div class="field field--name-field-fieldset-image field--type-image field--label-hidden field__item"> <img src="/sites/default/files/flysystem/2019-07/Picture4.png" width="947" height="664" alt="Image source: Next Level website" typeof="foaf:Image" /> </div> <div class="clearfix text-formatted field field--name-field-fieldset-text field--type-text-long field--label-hidden field__item"><p><em><span><span>Source: Next Level <a href="https://www.nextlevelcms.com/better-together">website</a></span></span></em></p> <p> </p> <p><span><span>Next Level’s privacy policy <a href="https://www.nextlevelcms.com/privacy-policy.pdf">states</a> that the company “may share such information with Next Level affiliates, partners, vendors, or contract organizations, or as legally necessary”, but provides no further information about how people’s personal information is shared or analysed within Heartbeat’s network of 2,700 affiliate organisations and partners, or outside of this network.</span></span></p> <p><span><span>It’s also unclear to what degree Next Level personnel have access to centre client information. The company’s FAQ <a href="https://www.nextlevelcms.com/faq">states</a> that it does have administrator-level access to client information, but that “[a]ccess is granted only to a limited number of Next Level personnel as is necessary to perform the functions of the Next Level Software”. 
It provides no clarity as to what “a limited number” means or what is considered “necessary to perform the functions”.</span></span></p> <p><span><span>In an email to PI, Heartbeat President Jor-El Godsey said [emphasis added] that “<strong><em>All data of a personal identifying nature captured by Heartbeat International and its subsidiary programs is protected and kept confidential as a matter of our continuing </em><a href="https://www.heartbeatservices.org/about-us/commitment-of-care"><em>commitment</em></a></strong><em><strong> to confidentiality in accordance with applicable laws of the U.S. and relevant nations. To improve our services and understand the changing needs of those we seek to serve Heartbeat uses only aggregated and de-identified information to formulate and analyze trends.</strong>”</em></span></span></p> <p><span><span>It is unclear what Heartbeat’s subsidiary programmes are – these could include Next Level, Extend, and Option Line – as well as what they have access to. In 2018 Heartbeat <a href="https://www.heartbeatinternational.org/images/AR18.pdf">reported</a> that its affiliates had served 1.5 million clients. It’s also unclear how, and by whom, information provided to Heartbeat is “de-identified”, and at what point such de-identification occurs. Next Level, as a part of Heartbeat International, may have access to vast amounts of identifiable information, and therefore understanding how such de-identification occurs is crucial.</span></span></p> <p><span><span>In addition to the content management system, Next Level also provides centres with a mobile phone application, which gives them access to client information outside the office – specifically <a href="https://www.nextlevelcms.com/overview">saying</a> “Because God often leads you to do “pro-life work” when we’re away from the office, you need a tool that travels with you. 
Next Level’s native mobility turns an on-the-fly conversation into an open client file, to help you follow up on these divine, “unexpected” appointments”.</span></span></p> <p><span><span>Next Level also provides a client-side version of the app, which <a href="https://www.nextlevelcms.com/">allows</a> clients to “set and change appointments, look at ultrasounds pictures” and also “click in or text her mentor, and talk to someone at Option Line very quickly”. The Next Level website uses similar language to promote the client app, <a href="https://www.nextlevelcms.com/overview">saying </a>“send her home with an app that includes everything from her ultrasound image and baby’s heartbeat to her proof of pregnancy, and vitally connects her to your organization moving forward”.</span></span></p> <p><span><span>PI was unable to test what information is generated by the Next Level mobile application, which requires a centre-given username and password. </span></span></p> <p><span><span>Phone apps are <a href="https://privacyinternational.org/report/2647/how-apps-android-share-data-facebook-report">notorious</a>, however, for collecting data about users’ activity that most people would not expect. App permissions can reveal information about a user’s <a href="https://www.theguardian.com/technology/2019/jan/04/weather-channel-app-lawsuit-location-data-selling">live location,</a> <a href="https://www.wired.com/story/app-permissions/">access to photos,</a> <a href="https://www.wired.com/story/app-permissions/">contacts</a>, and <a href="https://www.wsj.com/graphics/how-pizza-night-can-cost-more-in-data-than-dollars/">much more</a>. </span></span></p> <p><span><span>Included on the Next Level intake form shown in the promotional video is an “authorisation to release records” disclaimer which states: “In signing this release, I waive any privilege or confidentiality rights that I may have with respect to the specific disclosure authorised herein. 
I also agree to release Heartbeat International from any and all liability relating to any disclosure made in accordance with this authorisation”.</span></span></p> <p> </p></div> </div> </div> <div class="field__item"> <div class="paragraph paragraph--type--image-and-text-repeating paragraph--view-mode--default"> <div class="field field--name-field-fieldset-image field--type-image field--label-hidden field__item"> <img src="/sites/default/files/flysystem/2019-07/Picture5.png" width="947" height="597" alt="Source: Screen grab from video available on Next Level’s website shows authorisation to release records field." typeof="foaf:Image" /> </div> <div class="clearfix text-formatted field field--name-field-fieldset-text field--type-text-long field--label-hidden field__item"><p><em><span><span>Source: Screen grab from video available on Next Level’s <a href="https://www.nextlevelcms.com/">website</a> shows authorisation to release records field.</span></span></em></p></div> </div> </div> <div class="field__item"> <div class="paragraph paragraph--type--image-and-text-repeating paragraph--view-mode--default"> <div class="field field--name-field-fieldset-image field--type-image field--label-hidden field__item"> <img src="/sites/default/files/flysystem/2019-07/Picture6.png" width="947" height="597" alt="Source: Screen grab from video available on Next Level’s website shows collection of whether a person is “abortion-minded”." 
typeof="foaf:Image" /> </div> <div class="clearfix text-formatted field field--name-field-fieldset-text field--type-text-long field--label-hidden field__item"><p><em><span><span>Source: Screen grab from video available on Next Level’s <a href="https://www.nextlevelcms.com/">website</a> shows collection of whether a person is “abortion-minded”.</span></span></em></p></div> </div> </div> <div class="field__item"> <div class="paragraph paragraph--type--image-and-text-repeating paragraph--view-mode--default"> <div class="field field--name-field-fieldset-image field--type-image field--label-hidden field__item"> <img src="/sites/default/files/flysystem/2019-07/Picture7.png" width="947" height="597" alt="Source: Screen grab from video available on Next Level’s website shows other data collected – including “Living Arrangements”." typeof="foaf:Image" /> </div> <div class="clearfix text-formatted field field--name-field-fieldset-text field--type-text-long field--label-hidden field__item"><p><em><span><span>Source: Screen grab from video available on Next Level’s <a href="https://www.nextlevelcms.com/">website</a> shows other data collected – including “Living Arrangements”.</span></span></em></p> <p> </p> <p><span><span>In an email to PI, Heartbeat President Jor-El Godsey concluded [emphases added], <em>“</em><em><span><strong>Actually, our primary focus is to support and promote alternatives to abortion. 
It is in service to that effort that data becomes a tool, among others, to help strengthen our services and nurture our network of affiliated locations.</strong>”</span></em></span></span></p> <p> </p> <p> </p></div> </div> </div> <div class="field__item"> <div class="paragraph paragraph--type--image-and-text-repeating paragraph--view-mode--default"> <div class="clearfix text-formatted field field--name-field-fieldset-text field--type-text-long field--label-hidden field__item"><h2><span><span><em>Looking forward</em></span></span></h2> <p><span><span>Privacy harms are a time-shifted risk. Information we are asked to provide that seems acceptable and harmless at one point in our lives may in the future be used against us in ways we cannot expect or prepare for. It is unclear how information collected from those visiting crisis pregnancy centres is combined, analysed, and shared with other centres, organisations, and networks. People visiting crisis pregnancy centres – intentionally or not – are left unaware of how their personal information is being accessed, analysed, stored, and shared. As people continue their lives after visiting a centre, it remains unclear how the information they provided during their visit will continue to exist and be used.</span></span></p> <p><span><span>Our online activity – increasingly blurred with our offline activity – is being <a href="https://privacyinternational.org/long-read/2433/i-asked-online-tracking-company-all-my-data-and-heres-what-i-found">tracked</a> by companies, advertisers, and others, who aim to collect information about our thoughts, fears, desires, and insecurities, and <a href="https://privacyinternational.org/explainer-graphic/2428/have-you-heard-these-companies-because-theyve-likely-heard-you">use these drivers to profile and target us</a>. 
Some of these companies are already using GPS locations to target advertisements at “abortion-minded women” based on <a href="https://rewire.news/article/2016/05/25/anti-choice-groups-deploy-smartphone-surveillance-target-abortion-minded-women-clinic-visits/">their proximity</a> to abortion clinics. Companies are tailoring online ads to <a href="https://foreignpolicy.com/2018/06/01/abortion-referendum-how-ireland-resisted-bad-behaviour-online/">influence</a> abortion legislation. It has been reported that anti-abortion organisations are pushing <a href="https://www.maltatoday.com.mt/news/national/95506/antiabortion_groups_using_deception_and_intimidation_tactics_prochoice_coalition_warns?fbclid=IwAR23xTJEVkCsjoPAZFfKUbMutxK45GO23TOqBpKzRZkC4j9-qFiu2YZDil0#.XSiT45NKhxh">misinformation</a> through advertising platforms. </span></span></p> <p><span><span>Organisations like Heartbeat International recognise the importance of out-of-the-box anti-abortion websites and templates. They also appear to understand the power of collecting and centralising data. Extend Web Services, Heartbeat, Next Level, and Option Line’s privacy policies and terms of use remain vague as to the extent to which data collected from individual Heartbeat affiliate crisis pregnancy centres is shared within or outside the larger network. This data could be highly valuable in understanding what messaging and which tactics are most effective in furthering Heartbeat’s mission – as well as shaping the larger movement’s political direction.</span></span></p> <p><span><span>The near-total lack of transparency around how data is being used and shared by anti-abortion networks such as Heartbeat International is troubling. Such data could be used in ways that those who provide it may not have anticipated or approved of, including to potentially undermine their reproductive rights. 
Privacy and strong data protection are therefore crucial, in many ways, to ensuring people are able to exercise their reproductive rights. It is important that light continues to shine on the technologies being developed to trace and track those seeking medical help online.</span></span></p></div> </div> </div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-inline"> <div class="field__label">Learn more</div> <div class="field__items"> <div class="field__item"><a href="/topics/data-exploitation" hreflang="en">Data Exploitation</a></div> <div class="field__item"><a href="/topics/gender" hreflang="en">Gender</a></div> <div class="field__item"><a href="/topics/health-data" hreflang="en">Health Data</a></div> </div> </div> <div class="field field--name-field-programme field--type-entity-reference field--label-inline"> <div class="field__label">Strategic Area</div> <div class="field__items"> <div class="field__item"><a href="/strategic-areas/safeguarding-peoples-dignity" hreflang="en">Safeguarding Peoples&#039; Dignity</a></div> </div> </div> <div class="field field--name-field-campaign-name field--type-entity-reference field--label-above"> <div class="field__label">What PI is Campaigning on</div> <div class="field__items"> <div class="field__item"><a href="/campaigns/free-choose" hreflang="en">Free to Choose?</a></div> </div> </div> Mon, 22 Jul 2019 17:39:14 +0000 tech-admin 3096 at http://privacyinternational.org #FacePalm: FaceApp's Terms of Use http://privacyinternational.org/news-analysis/3095/facepalm-faceapps-terms-use <div class="layout layout--onecol"> <div class="layout__region layout__region--content"> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>While people may think that providing their photos and data is a small price to pay for the 
entertainment FaceApp offers, the app raises concerns about privacy, manipulation, and data exploitation—although these concerns are not necessarily unique to FaceApp.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>According to FaceApp's </span><span><a href="https://faceapp.com/terms">terms of use</a> </span><span>and </span><a href="https://www.faceapp.com/privacy"><span>privacy policy</span></a><span>, people are giving FaceApp "a perpetual, irrevocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable sub-licensable license" to use or publish the content they upload, and FaceApp can track their location, what websites they visit, when they open the app, and other metadata. In applying filters to peoples’ photos, FaceApp will create a detailed </span><span><a href="https://privacyinternational.org/topics/biometrics">biometric</a> </span><span>map of their faces--which can be </span><a href="https://privacyinternational.org/advocacy/2835/our-response-westminster-hall-debate-facial-recognition"><span>as unique to them as their fingerprint or DNA</span></a><span>. </span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>People are right to be alarmed by terms of use like the one FaceApp has—as they should be with similar apps. It is not clear how FaceApp stores, uses, or manipulates peoples' data, including the detailed biometric maps of their faces, and this could change over time as profit incentives and technologies change. Even if you delete FaceApp, there is nothing in the terms of use that governs what the company will do with all the data they have collected about you. 
</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>Because of the uniquely identifying nature of our faces and our inability to change them, cataloguing and storing peoples' faces in a <a href="https://www.privacyinternational.org/explainer/1310/big-data">database that can be mined indefinitely</a> is problematic. A biometric map of someone's face isn't just used for unlocking smartphones, it is now a highly-prized commodity by governments and tech companies used to train <a href="https://privacyinternational.org/feature/863/algorithms-intelligence-and-learning-oh-my">algorithms</a> and for <a href="https://privacyinternational.org/feature/2726/police-are-increasingly-using-facial-recognition-cameras-public-spy-us">facial recognition</a>-enabled <a href="https://privacyinternational.org/topics/mass-surveillance">mass surveillance</a>. 
In the future, such biometric maps could be used for all sorts of purposes that people may not anticipate.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>The fact that the app is based in Russia, where companies are obliged by law to store their data in the territory and make it <a href="https://privacyinternational.org/blog/1296/lawful-interception-russian-approach">accessible to security agencies</a>, is also concerning, although it has been </span><span><a href="https://www.forbes.com/sites/thomasbrewster/2019/07/17/faceapp-is-the-russian-face-aging-app-a-danger-to-your-privacy/#2797f66b2755">reported</a> </span><span>that at least in some cases the data isn't being transferred to servers based there, but is stored on Amazon servers in the United States and Australia.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><strong>What can you do?</strong></p> <ul><li>If you've used FaceApp and are concerned about your data - <a href="http://twitter.com/home?status=Hey%20%40faceapp_ai%2C%20what%20happens%20to%20the%20pictures%20of%20my%20face%20when%20I%27m%20using%20your%20app%20%F0%9F%A7%90%3F%20Can%20I%20get%20it%20deleted%20from%20your%20server%20before%20I%20look%20ANYTHING%20like%20your%20app%20shows%20me%3F%20(via%20%40privacyint)">Tweet at them and ask for deletion</a>!</li> <li>Under some data protection laws, such as the EU's General Data Protection Regulation, you may have the right to request that a company deletes your personal data, under certain conditions. 
To find out more, read our <a href="https://privacyinternational.org/blog/2346/legal-tidbits">explainer</a>.</li> <li>You can find out more about what rights you have over your personal data in <a href="https://privacyinternational.org/sites/default/files/2018-09/Data%20Protection%20COMPLETE.pdf">our guide</a>.</li> <li>Tell companies to stop exploiting your data! We've made it easy for you <a href="https://privacyinternational.org/mydata">here</a>.</li> <li>Learn more about <a href="https://privacyinternational.org/long-read/3088/our-data-future">Our Data Future</a> and the steps you can take in shaping it: <span>we have to decide how to </span><span>protect, enhance, and </span><span>preserve our rights</span><span> in a world where technology is everywhere and data is generated by every action.</span><span> The future is not a given: our actions and decisions will take us where we want to go.</span></li> <li>Learn more about how <a href="https://privacyinternational.org/campaigns/when-social-media-makes-you-target">social media makes you a target</a> and solutions for stopping platforms and devices you use from being weaponised against you.</li> </ul><p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span><strong><span>What is Privacy International Doing?</span></strong></span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>We are pushing global companies to commit to the privacy of users around the world. 
We recommend such companies:</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <ul><li><span><span><span>Adopt the European Union’s General Data Protection Regulation (GDPR) as a baseline standard for all users, as well as comply with any national legislation that requires stronger safeguards for users.</span></span></span></li> <li><span><span><span>Demonstrate compliance with the </span><span><a href="https://privacyinternational.org/explainer/41/101-data-protection">data protection</a> </span><span>standards, including by being clear and transparent on what laws they apply to protect users’ data and how they have implemented them.</span></span></span></li> <li><span><span><span>Use reasonable security safeguards to protect personal information from loss, unauthorised access, destruction, use, modification, or disclosure.</span></span></span></li> </ul><p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>We recommend that governments:</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p> <ul><li><span><span><span>Make </span><span><a href="https://privacyinternational.org/topics/data-protection">data protection legislation</a> </span><span>as strong as possible and not undermined by legal loopholes and exemptions.</span></span></span></li> <li><span><span><span>Ensure that any data protection framework is effectively implemented and enforced, including through the appointment of an independent regulator or authority with an appropriate mandate and resources.</span></span></span></li> </ul><p><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span><span>The onus should be on the companies, institutions, and governments processing our data to protect it both by design and by 
default.</span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></span></p></div> <div class="field field--name-field-large-image field--type-image field--label-above"> <div class="field__label">Large Image</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-07/Paris_Tuileries_Garden_Facepalm_statue.jpg" width="1024" height="683" alt="im" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-list-image field--type-image field--label-above"> <div class="field__label">List Image</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-07/Paris_Tuileries_Garden_Facepalm_statue_0.jpg" width="1024" height="683" alt="im" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-list-icon field--type-image field--label-above"> <div class="field__label">List Icon</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-07/Paris_Tuileries_Garden_Facepalm_statue_1.jpg" width="1024" height="683" alt="im" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-issue field--type-entity-reference field--label-above"> <div class="field__label">What PI is fighting for</div> <div class="field__items"> <div class="field__item"><a href="/what-we-do/modernise-data-protection-law" hreflang="en">Modernise Data Protection Law</a></div> </div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Learn more</div> <div class="field__items"> <div class="field__item"><a href="/topics/data-protection" hreflang="en">Data Protection</a></div> <div class="field__item"><a href="/topics/facial-recognition" hreflang="en">Facial Recognition</a></div> <div class="field__item"><a href="/topics/general-data-protection-regulation-gdpr" hreflang="en">General Data Protection Regulation (GDPR)</a></div> </div> </div> <div class="field 
field--name-field-location-region-locale field--type-entity-reference field--label-above"> <div class="field__label">Location</div> <div class="field__items"> <div class="field__item"><a href="/location/russia-and-central-asia" hreflang="en">Russia and Central Asia</a></div> </div> </div> <div class="field field--name-field-programme field--type-entity-reference field--label-above"> <div class="field__label">Strategic Area</div> <div class="field__items"> <div class="field__item"><a href="/strategic-areas/challenging-corporate-data-exploitation" hreflang="en">Challenging Corporate Data Exploitation</a></div> </div> </div> <div class="field field--name-field-campaign-name field--type-entity-reference field--label-above"> <div class="field__label">What PI is Campaigning on</div> <div class="field__items"> <div class="field__item"><a href="/campaigns/when-social-media-makes-you-target" hreflang="en">When Social Media makes you a target</a></div> </div> </div> </div> </div> Wed, 17 Jul 2019 14:17:28 +0000 staff 3095 at http://privacyinternational.org Our Data Future http://privacyinternational.org/long-read/3088/our-data-future <span class="field field--name-title field--type-string field--label-hidden">Our Data Future</span> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span lang="" about="/user/45" typeof="schema:Person" property="schema:name" datatype="">harmitk</span></span> <span class="field field--name-created field--type-created field--label-hidden">Wednesday, July 17, 2019</span> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p><span>By Valentina Pavel, PI Mozilla-Ford Fellow, 2018-2019</span></p> <p><span>Our digital environment is changing, fast. Nobody knows exactly what it’ll look like in five to ten years’ time, but we know that how we produce and share our data will change where we end up. 
We have to decide how to </span><span>protect, enhance, and </span><span>preserve our rights</span><span> in a world where technology is everywhere and data is generated by every action. </span><span>Key battles will be fought over </span><span>who can access our data and how they </span><span>may </span><span>use it. It’s time to take action and shape our </span><span>future.</span></p></div> <div class="field field--name-field-repeating-image-and-text field--type-entity-reference-revisions field--label-hidden field__items"> <div class="field__item"> <div class="paragraph paragraph--type--image-and-text-repeating paragraph--view-mode--default"> <div class="clearfix text-formatted field field--name-field-fieldset-text field--type-text-long field--label-hidden field__item"><p><br /><strong><span>Here’s where we start off</span></strong></p> <p><span>We are at a crossroads. We need to address the challenges posed by technological choices made by governments and dominant companies – choices that are eroding privacy and centralising power. But there are also promising opportunities: fascinating technologies are being developed, and legal safeguards are still emerging – for instance, in Europe people have strong privacy rights embedded in the European Convention on Human Rights, and in the EU data protection rules provide a new base standard for protections with <a href="https://eur-lex.europa.eu/eli/reg/2016/679/oj">GDPR</a>. Courts across the world are responding to interesting challenges, whether India's 2017 ruling on the right to privacy, Jamaica's on the identification system, or the U.S. ruling on cell phone location data. 
</span><span>Countries around the world are</span><span> also</span><span> strengthening data protection laws – today </span><span><a href="https://ssrn.com/abstract=3386510">more countries have some form of data protection</a> than don’t. </span></p> <p><span>But most importantly, we have the power to unite and transform. The Internet and the future built upon it<span> </span>could be a collective of action-driven critical thinkers. We can change the course of our path. The future is not a given: our actions and decisions will take us where we want to go.</span></p> <p><span>In today’s digital environment, here’s what we’re not OK with:</span></p> <p><strong><span>Data monopolies and surveillance</span></strong></p> <p><span>First, we don't like data monopolies. Big tech companies harvest our data on a massive scale, </span><span>and exploit it </span><span>for their own interests. They are the new feudal lords. They <a href="https://www.politico.eu/article/down-with-the-data-monarchy-protection-platforms-facebook-whatsapp/ ">aspire to surveil, predict and automate</a> our lives and societies. </span></p> <p><span>Surveillance machines like <em>Alexa</em> and <em>Google Assistant</em> have entered into homes. They <a href="https://www.bloomberg.com/news/articles/2019-04-10/is-anyone-listening-to-you-on-alexa-a-global-team-reviews-audio">listen carefully</a> to what we say, and they watch closely <a href="https://www.technologyreview.com/f/612660/a-man-asked-for-his-data-from-amazon-and-they-sent-him-1700-recordings-of/">what we do</a>.</span></p> <p><span>Mobile devices and wearables travel with us everywhere we go, extracting and sharing data about our every footstep. Publicly and privately, we are being watched.</span></p> <p><span>This is not just passive surveillance, it's active control over our lives and social structures. 
Companies predict our behaviour and infer our interests, potentially <a href="https://www.ynharari.com/book/homo-deus/">knowing us better than we know ourselves</a>. They offer an illusion of choice, when in fact they can decide which information reaches us and which doesn't. They use <a href="https://fil.forbrukerradet.no/wp-content/uploads/2018/06/2018-06-27-deceived-by-design-final.pdf">dark patterns</a> to discourage us from exercising our rights.</span></p> <p><strong><span>Things happening behind our backs</span></strong></p> <p><span>It’s not just the data we knowingly generate – such as photos or posts on social media – that gets harvested. It’s also the data we indirectly generate: our location history, browsing activity, what devices we use and lots more information that has been derived, inferred or predicted from other sources. For example, from my browsing patterns, companies can predict my gender, income, number of children, shopping habits and interests, and draw insights about my social life. While in many countries data protection laws allow me to access all the data that online tracking companies collect, in most other places, <a href="https://privacyinternational.org/feature/2433/i-asked-online-tracking-company-all-my-data-and-heres-what-i-found">I can’t</a>. But things are not that simple. Even in the places where I can access data about me, I first need to know which company is collecting it. It’s no surprise that I am often unaware of the pervasive tracking. If I do find out, asking a company for access to data is not a trivial task. It might be full of obstacles and difficulties – some intentional, some because of bad design.</span></p> <p><strong><span>Restrictions on freedom of expression</span></strong></p> <p><span>Internet platforms limit our freedoms at an unprecedented scale. They monitor what we share on their apps and websites, and they build algorithms that decide what content is allowed and what isn’t. 
What’s worse, in Europe <a href="https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=uriserv:OJ.L_.2019.130.01.0092.01.ENG">a new copyright law</a>, adopted in March 2019, forces platforms to surveil, censor and dictate how we express ourselves and communicate with each other online.</span></p> <p><strong><span>'Free' services</span></strong></p> <p><span>We don’t believe in 'free' services anymore – we know our data is being exploited. Abusive data practices are used to track, target and influence our behaviour and opinions. In 2018, Cambridge Analytica showed us how this practice <a href="https://privacyinternational.org/news-analysis/2857/cambridge-analytica-gdpr-1-year-lot-words-and-some-action">undermines democratic processes and weakens democracies</a>. </span></p> <p><span>This is not to say that the solution to 'free' services is paying for privacy-enhanced products. Privacy should not be something you have to pay for. And companies offering 'free' services should not use them to mask data exploitation practices.</span></p> <p><span>How? Here’s where our journey begins. Below are four possible future scenarios. They’re presented through the eyes of Amtis, a persona representing an everyday individual caught up in the digital challenges of the future. </span></p> <p><span>Amtis travelled forward in time and found four different futures: one where data is treated like property and data markets are created; one where people are paid for data as labour; one where data is stored in nationalised funds; and one where users have clear rights concerning their data.</span></p> <p><span>Here’s what I collected from Amtis’ diary. 
I’ve added some of my own reflections at the end of each chapter as well, just to spice things up a bit :)</span></p> <p><em><span>If you want to engage with this work further and write a comic book, a song, make an art exhibition, an animation or a short video, drop me an email at <a href="mailto:valentinap@privacyinternational.org">valentinap@privacyinternational.org</a>.</span></em></p> <p><em><span>I’d love to hear your ideas or to pitch you mine ;)</span></em></p> <p><em><span>I want to give special thanks to everybody who helped me shape and improve this project: My Privacy International and Mozilla colleagues, my fellow fellows <a href="https://twitter.com/meochaidha">Maggie Haughey</a>, <a href="https://soupysecurity.com/">Slammer Musuta</a>, <a href="http://www.tinysubversions.com/">Darius Kazemi</a> and many other friends and wonderful people: <a href="https://twitter.com/EveDaRib">Andreea Belu</a>, <a href="https://twitter.com/catileptic">Alexandra Ștefănescu</a>, <a href="https://twitter.com/meronymon">Lucian Boca</a>, <a href="http://thisisjoshwoodard.com/">Josh Woodard</a>, <a href="https://www.philipsheldrake.com/">Philip Sheldrake</a>, <a href="https://www.hiig.de/en/christopher-olk/">Christopher Olk</a>, <a href="https://twitter.com/tlehtiniemi">Tuuka Lehtiniemi</a>, and Renate Samson.</span></em></p> <p><em><span>Illustrations by <a href="https://cuantikastudio.com/">Cuántika Studio</a> (Juliana López - Art Director and Illustrator, Sebastián Martínez - Creative Director).</span></em></p> <p><em><span>Copyediting by Sam DiBella.</span></em></p> <p><span>This work is licensed under a <a href="https://creativecommons.org/licenses/by-sa/4.0/">Creative Commons Attribution-ShareAlike 4.0 International License</a>.</span></p> <p><span><em>Suggested citation: <a href="https://privacyinternational.org/long-read/3088/our-data-future">Our Data Future</a>, by <a href="https://medium.com/@valentina_7678">Valentina Pavel</a>, former Mozilla Fellow embedded at 
Privacy International, project and illustrations licensed under <a href="https://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA 4.0</a>.</em></span><br />  </p> <h3><strong><span>SCENARIO 1 - DATA PROPERTY</span></strong></h3></div> </div> </div> <div class="field__item"> <div class="paragraph paragraph--type--image-and-text-repeating paragraph--view-mode--default"> <div class="field field--name-field-fieldset-image field--type-image field--label-hidden field__item"> <img src="/sites/default/files/flysystem/2019-07/Scenario_01.jpg" width="1800" height="1013" alt="Data ownership" typeof="foaf:Image" /> </div> <div class="clearfix text-formatted field field--name-field-fieldset-text field--type-text-long field--label-hidden field__item"><p><span>In 2030 Amtis finds a future where property rights for data were adopted. Here’s how this future plays out:</span></p> <p><em><span>My data, my turf. This was the first graffiti I saw as I was walking down the street and I said to myself, “Yeah, big corp, we’re going to get you good!”. I am fed up with companies making insane amounts of money from my data. If this is the game we’re playing, I want my fair share. </span></em></p> <p><em><span>I was not the only one thinking like this. A few years back there was a strong push towards adopting property rights for data. </span></em></p> <p><em><span>I was on my way to see a data property consultant. I just got fired from my job and I desperately needed a way to survive in this city. There was a big queue in front of the consultancy firm, in a tall glass building with a huge advertisement on it: 'Sell Your Data'. The first meeting was free.</span></em></p> <p><span><strong>OK, I have property rights over data. Where do I start?</strong> </span></p> <p><em><span>I decided to see a consultancy because I didn’t know what data property actually means. For example, can I rent my data? Can people now buy and sell data like on the stock market? Do I need a data broker to bid for me? 
How much money can I make from giving data to my day-to-day services? How much will my data be worth in the future? Can I leave it as an inheritance to my children? </span></em></p> <p><em><span>What I am most interested in now is the fastest way to profit. At the same time, I don’t want to be fooled like in the past with 'free services'. I may have lost my job, but I still have dignity left. I wouldn’t want my data to be used in any way companies please. Would I be able to set my own conditions?</span></em></p> <p><span><strong>Little money for a lot of data </strong></span></p> <p><em><span>While I was waiting for my turn at the consultancy, I received a notification that I had unlocked a special data payment feature on my social media platform. I was receiving this offer based on my location. The offer was valid only for people on the waiting list for this particular consultancy firm. They must be a hotshot company to have such personalised deals. I’m sure they receive a commission for this. They offered 0.50 units for every social interaction I made: posting, uploading photos, tagging people and objects, clicking on links. I just used the app as I normally do and in about 10 minutes I got close to earning 5 units – that’s about the price of a public transportation ticket. Not exactly a fortune, but in my situation, anything helps. I’m not sure if this is a one-off promotion or if they would offer the same reward tomorrow.</span></em></p> <p><span><strong>Credit backed by data</strong></span></p> <p><em><span>The social media company was also advertising data credits. They recently started operating like a <a href="https://www.theverge.com/2019/6/27/18760384/facebook-libra-currency-cryptocurrency-money-transfer-bank-problems-india-china">bank</a>. Their new product is a financial credit package which could be backed by data as collateral. 
Because data is seen as property, instead of guaranteeing to pay back the credit with my apartment or goods, I could vouch for it with my data. The company knew my exact identity, my financial transactions, but more importantly they knew my social behaviour, how trustworthy I was and how much my data could be worth. This sounds like a fast solution for my situation, but let’s see how the meeting with the consultant goes.</span></em></p> <p><span><strong>Full data scan for price simulations</strong></span></p> <p><em><span>It was almost dark when I finally entered the consultant’s office. The company had an entire floor of 'data officers'. They were basically sales people with programming skills. I was escorted to one of the data officers. She welcomed me and explained the first steps. In order to receive a full analysis of how much my data is worth, I needed to hand in my devices and wearables for data scanning. I was <a href="https://solid.inrupt.com/">storing all my data locally</a> for privacy reasons, but if I wanted to learn the entire worth of my data, I had to hand in the data. </span></em></p> <p><span><strong>Can’t get rid of 'free' services</strong></span></p> <p><em><span>She also explained the privacy policy for this simulation. The data that’s extracted from my devices would be shared with my social media company and partners. This is why the first consultancy session is free. Without a job, I didn’t really have a choice. I would have to give in, even though I felt used. Despite the bitterness of compromise, I decided to continue.</span></em></p> <p><em><span>While my data was scanned, she ran a virtual reality programme. She said the programme would walk me through the basics of data property and that I could ask all my questions in real time, receive feedback and get simulations based on the data analysis that was running in the background on my devices. 
</span></em></p> <p><em><span>The question that bugged me the most was whether there was any way to know where my data travels and which partner companies and affiliates get their hands on it. Is there some kind of body to which I can report data property abuses? And what would they do? In the case of physical property, I can imagine that if my bike goes missing, I can report it to the police and maybe they will try to find it. If they don't find it, nobody will give me a new bike. At most, I might receive compensation for the bike from an insurance company, for example. But it's unlikely that I will ever get back the same bike that I lost. Does the same apply to data property?</span></em></p> <p><span><strong>Tailor-made AI assistant for data transactions</strong></span></p> <p><em><span>At the end of the virtual programme, I could decide if I wanted her to programme a tailor-made AI assistant for my data transactions. For example, I could set up price alerts to see the best deals that companies offer for data batches, and I could also rent my data to different companies as a sort of subscription. The simulation explained everything for me:</span></em></p> <p><span><strong>Real-time bidding for selling data</strong></span></p> <p><em><span>To start, I learned how real-time bidding architectures for data work. First, you select which devices will join the bidding platform. Your devices are linked to your identity, and everything generated from them is considered your property. All the data generated from those devices feeds straight into a centralised platform. A number of companies registered with this bidding platform will have access to your data after the transaction is completed. Once your data gets on the bidding platform, you can immediately auction it. 
</span></em></p> <p><em><span>The AI assistant would help me set price ranges and tell me how others had sold similar data points historically, what types of data transactions were trending, and which data were likely to be more profitable.</span></em></p> <p><span><strong>Time-locked subscriptions for data</strong></span></p> <p><em><span>As an alternative model, I could put my data in subscription offers to companies. I can ask my AI assistant to set the subscription price. Companies can buy the data subscription from me for a limited period of time. Start-ups in particular like this model because it guarantees them a continuous stream of data. However, the downside is that as companies grow bigger or diversify their services, if my data becomes irrelevant to them, they can simply stop the subscription without explanation or notice. </span></em></p> <p><span><strong>Managing data transactions is overwhelming</strong></span></p> <p><em><span>The AI assistant was subscription-based. It wasn’t cheap, but not super expensive either. Given my situation, I couldn’t afford it. The consultant said they are partners with the social media company I frequently use, so she suggested that I apply for a credit for the assistant. </span></em></p> <p><em><span>Of course, I could personally manage my data transactions without the assistant, but that would mean doing it full-time, reading all the conditions and fine print, making sure I am not being scammed and spending a lot of time hunting for the best deals. I am not even sure I would be able to understand all the terms.</span></em></p> <p><em><span>In the end I took out the credit and got the AI assistant. I added only a few parameters, so the data officer didn’t have much to programme and customise in my case. I desperately needed money, so my bar was really low. I hope the assistant will help me maximise the value of my data so that I can pay off my credit in data in half the time. 
</span></em></p> <p><em><span>As I was leaving the consultancy, I started listening to the World Data News Channel. There was an interview with a girl named Lucy. She was talking about how inspired she was by <a href="http://jenniferlynmorone.com/">Jennifer Lyn Morone</a>, who came up with the concept of People Inc. Jennifer Lyn Morone was an art student when she incorporated herself as a business to protest against extreme capitalism. She believed that it was the only way to stop corporate exploitation. Jennifer Lyn Morone started the trend of Incorporated Persons and apparently she's been getting more and more followers. Lucy herself became one of them.</span></em></p> <p><span><strong>From individuals to Incorporated Persons</strong></span></p> <p><em><span>The concept of Incorporated Persons basically means that your identity, name, IP address and every bit of data you create is your company’s property. If others abuse your data or touch it without your permission, theoretically, you can sue them. Proponents of People Inc. claim it’s an effective way to put a fence around your data. Human Corporations have leverage over data-greedy companies. Also, if you want to make a profit, you can market all sorts of data services and products. </span></em></p> <p><em><span>For example, if someone wants a picture taken with their friends in a pub or asks for advice on where to eat in town, Jennifer Lyn Morone Inc. can <a href="http://we-make-money-not-art.com/jennifer_lyn_morone_inc/">offer this for a fee</a>. And she can increase her efficiency in life too. She says: “If my friends and family became corporations I knew exactly who I would use and for what and I know who I would invest in, not only because of what they can do but because of who they are.”</span></em></p> <p><em><span>It wasn’t exactly easy to start Your Name, Inc. You needed legal, entrepreneurial and technical skills to run your company. 
So this model was more popular with middle-class, educated people who were able to sustain this effort. In that part of the world, Human Corporations function in a totally different type of social dynamic. </span></em></p> <p><span>I don’t know how you feel, but for me, reading Amtis’ diary on data property was unsettling. </span></p> <p><span>Here’s what I make out of this story on a more objective and legal level. If you're eager to read Scenario 2, <a href="#scenario2">click here</a>.</span></p> <p><span><strong>Reflections on Scenario 1</strong></span></p> <p><span><strong>Data property does not mean more or better control </strong></span></p> <p><span>Creating a new law that attaches property rights to data is problematic. Due to the nature of data and how it's used in practice, it is very doubtful whether you can have exclusive property rights over it. It’s not as simple as with material goods, where you either have them or you don’t. Data can be here and everywhere, and it can be copied and transmitted at almost zero cost. The exclusivity that property rights are meant to provide is therefore very hard to enforce.</span></p> <p><span>The general idea behind property rights is for you to keep and enjoy a particular possession. Property usually comes from an approach of non-disclosure, not one where you want to disclose by default. If you decide to lease your property, the tenant is entitled to use and possess it until the end of the agreement; during the rental period, there are only limited claims you can make as the owner.</span></p> <p><span>Also, consider homeowners’ associations. You may be able to buy a house, but you might need to get permission if you want to redesign your garden or change the paint colour. You can’t always do what you want, even if you do have property rights; they don’t automatically imply absolute supremacy over what you possess. The same would apply to data property. 
Even if you have property rights over data, this will not necessarily mean you will have unlimited powers over it. </span></p> <p><span><strong>One-off transaction</strong></span></p> <p><span>Once you sell data, it’s a one-time transaction that can’t simply be reversed. Moreover, once you sell, the company can do whatever it wants with the data. There could be some conditions attached to the sale, but do we expect people to have real negotiation power with companies? Privacy is also a time-shifted risk: what you might be fine revealing today might not be a good idea to share tomorrow. If you transfer your property rights over data to others, there is no real way to ensure that the data won’t be abused. This argument is valid for all data property or data monetisation scenarios and will come back in some sections of the Reflections below.</span></p> <p><span><strong>Data monopolies don’t die</strong></span></p> <p><span>Property rights don’t change the fact that companies can amass large batches of user data. Nor do they undo the fact that certain companies have already done so - take for example Facebook, Amazon, Netflix and Google.</span></p> <p><span><strong>Accomplices to the same broken model</strong></span></p> <p><span>We don’t know what the main digital revenue model will be in the future, but we know that some of the biggest companies today rely on advertising. This means that every click or purchase that I make translates into money for them. The more I click and the more products I buy, the more money they get from advertisers. In a system where I get paid for data, the more profitable companies are, the better paid I am. In other words, it’s actually in my interest for them to make more money, so that I get more in return. This feeds the same old game; it makes us accomplices to a broken system, one that we’re trying to move away from. 
Do we want to legitimise questionable market practices and data abuses for a few pennies?</span></p> <p><span><strong>What is it exactly that I own? </strong></span></p> <p><span>Amtis only briefly hinted at it, but I don’t think there is an easy consensus on what exactly I can own. Is it my bank and credit statements, my smart meter readings, my GPS coordinates? How about my picture with my friends and family? If they are in the picture, do they own it too? What happens with genetic data? It contains information about my family, so if I reveal it, will my family also have property rights over it? What about my future children and grandchildren, too? Data about me is also data about other people.</span></p> <p><span>And what happens to the data about me that is generated without my knowledge? Would this be covered too? </span></p> <p><span>Designing a system of data property rights would require a classification and inventory of all possible data types that can be owned, along with their state (e.g., data in transit, data in storage). Questions would include: What data do we assign property rights to? Is it data that is collected, analysed, aggregated, or used for profiling? Would data in transit be owned as well, or only data that is already stored somewhere? Could the same data have multiple owners? </span></p> <p><span><strong>Property is not compatible with the nature of data</strong></span></p> <p><span>Here’s a legal argument to keep in mind. Intellectual property rights may at first seem akin to data ownership, but there is a fundamental difference. For example, copyright law protects the original expression of an idea, not the idea or information itself. The colours that make up a painting are not protected, but the original way in which they have been expressed can be. In the case of a Facebook post, the way I express myself can be protected by copyright, but I do not own each individual word that makes up that post. 
However, in the case of data gathered by sensors there is no creative effort involved; it is raw data. When we are talking about data, we can’t say that we have intellectual property rights over it, because data is not protected per se, especially if there is no intellectual work behind it. </span></p> <p><span>But let’s say we are doing something with the data. Let’s say we are collecting it in a database, and this takes quite a lot of effort and resources. Under EU database law, it is the substantial investment of resources in making the database that is protected - not the individual data entries themselves. </span></p> <p><span>Similarly, if we discuss compilations (such as an anthology, for example), which are generally protected under copyright law, the protection lies with the originality of the selection and arrangement of information – it does not cover the individual elements that make up the compilation.</span></p> <p><span>Aside from this, there is further criticism that can be levelled at intellectual property law. Authors are usually part of a long tail and receive only small compensation for their work. There is a lot of power asymmetry between authors and the beneficiaries of their work. Think for example of the music industry: record labels generally get all the money and authors get paid last. Few authors have real negotiation power, good legal representation and the possibility to enforce their rights. That is why many musicians earn more money through concerts than from their records.</span></p> <p><span>On a more fundamental level, property rights are alienable, which means you can transfer them from one person to the next. Human rights such as the right to privacy and data protection are inalienable. If you transfer them, they lose all meaning. 
<a href="https://www.nytimes.com/2019/07/05/opinion/health-data-property-privacy.html">What's the point</a> of freedom if you renounce it? And what is more: if you sell data but want to keep some basic rights over the uses of that data, you are actually thinking about a rights-based approach, not a property one.</span></p> <p><span><strong>Data markets in practical terms</strong></span></p> <p><span>Let’s imagine we somehow figured out all the questions related to property. We would still need to consider the practical aspects of a data market. Say that in a property system some individuals would sell their data. Who is going to set the price? Based on what criteria? Am I going to have negotiation power in relation to companies? How would the money actually be transferred in practice? Will I need to spend time brokering or auctioning data? Where would I go if I am not satisfied? Will I spend my time monitoring companies to make sure they actually respect the contractual agreement? Will I go to court if they don’t, and spend years awaiting a resolution (since court decisions generally take a long time)? Also, how would consent be managed without producing decision fatigue? Will we need to give instructions to our own bidding bots? Data property puts a lot of burden and responsibility on the individual to manage all data exchanges. Does this really mean more control?</span></p> <p><span>Additionally, if I start selling data, companies would have little incentive to promote data transfers or portability from one service to another, as they will be heavily invested in buying data. If we aren’t able to pull data from one company and move it to another in an easy way, we will just get trapped in the same corporate walled gardens of today. We won’t be able to complain to companies if we are dissatisfied. What’s more, start-ups and small companies would still rely on large investments to compete with big companies when buying data. 
Regulating the licensing of data would be extremely difficult, and start-ups would be loyal to their investors, not to innovation or social good.</span></p> <p><span>And in the end, how many companies can actually afford to pay me money? Aside from a few well-known big tech companies, there are a zillion small companies that won’t have the resources to pay me. How would a property rights or data monetisation system stop them from getting my data? And how will I know if they have my data or not? Who will enforce this? Will I be able to ask them if they have my data, as in Europe under the GDPR?</span><br />  </p> <h3 id="scenario2"><strong><span>SCENARIO 2 – DATA LABOUR</span></strong></h3></div> </div> </div> <div class="field__item"> <div class="paragraph paragraph--type--image-and-text-repeating paragraph--view-mode--default"> <div class="field field--name-field-fieldset-image field--type-image field--label-hidden field__item"> <img src="/sites/default/files/flysystem/2019-07/Scenario_02.jpg" width="1800" height="1013" alt="Data labour illustration" typeof="foaf:Image" /> </div> <div class="clearfix text-formatted field field--name-field-fieldset-text field--type-text-long field--label-hidden field__item"><p><span>In this next leap to the year 2030, Amtis lives the life of a data labourer, being paid wages for data inputs. Here’s how Amtis begins the story:</span></p> <p><em><span>I am in my green pyjamas, but I can’t say for sure if it’s morning or evening. My eyes are red from staring at screens. I am discouraged and very tired. Of course, all these emotions and reactions are registered by my Playbour – my pocket-sized smart console that has basically become my entire life. It’s my connection to family, friends and the world; my health and mood monitor; my life organiser and personal assistant; and basically how I earn my living. 
<a href="https://www.wiley.com/en-us/Uberworked+and+Underpaid%3A+How+Workers+Are+Disrupting+the+Digital+Economy-p-9780745653570">It’s my Play and it’s my Labour</a>.</span></em></p> <p><em><span>These days, fewer and fewer people go to work in an actual office. Everything happens through this device, which captures the data I generate on different platforms and pays me for all these data inputs. But in reality, everything goes to one single company, as most of the other platforms are its partners or affiliates.</span></em></p> <p><strong><span>Shit money, literally</span></strong></p> <p><em><span>Last month I enabled the 'health' add-on on my console, so now it's counting how frequently I go to the toilet and it <a href="//www.dailymail.co.uk/sciencetech/article-3306193/Smart-toilet-analyse-PEE-app-detects-depression-Japanese-expo-reveals-strange-health-gadgets.html">connects to the sensors</a> in the toilet to collect data about my pee and my poop. This data is sent to pharmaceutical companies so they can make better drugs. Maybe I’ll help cure the world of urinary infections, prostate disorders, digestion problems and haemorrhoids. Every now and then I’m tempted to make a bad joke about shit money, but health data pays better than most other types, so I’ll shut up. </span></em></p> <p><em><span>You know what else pays well? My 'entertainment' data. I get bonuses for juicier data such as my heart rate, eyeball movement and amount of perspiration when I watch movies, listen to an audiobook, play games or read articles. Data associated with political media pays even better. After I learned that trick, my behaviour changed a lot. I am watching all the movies recommended on my Playbour, I am frenetically consuming clickbait articles, and I am trying to produce as much health-related data as possible. My life and actions are all about how well they pay.</span></em></p> <p><em><span>One time I even took laxatives to get more 'results'. 
I was happy to see that I could trick the system, but after a few times I guess the algorithm detected a pattern and penalised me. It not only fined my account, but it also placed a ban with pharmacies so that I can no longer buy laxatives. Now, if I really have a digestion problem, I am screwed!</span></em></p> <p><strong><span>Training the AI overlord</span></strong></p> <p><em><span>Not many people know what all this is for. Everything that gets captured by this device is meant to train the world’s most powerful AI system. Human input is used to teach this machine all we know, to help it evolve. The master plan is to transform it into an eternal collective extension of our humanity and train it to make better decisions for us. We’re already putting it in charge of our daily, routine decisions. More and more decisions from politicians and our government rely on this supermachine. Would the next step be to give it full control?</span></em></p> <p><em><span>We’re giving away our ability to decide for ourselves and we are trusting the machine to govern our world. We are enslaving ourselves in order to feed it data, because that’s the best way to get paid these days. As people used to say: "<a href="https://www.vox.com/2019/3/27/18216072/boss-socialism-capitalism-neoliberalism">Better to be exploited in a capitalist society than unemployed and destitute</a>".</span></em></p> <p><strong><span>Both user and worker</span></strong></p> <p><em><span>People asked for data markets, so the data I contribute is now paid for as labour. I have full work documentation registered with the Playbour from the moment my first bit of data reached them. The interface of my console shows me how many tasks I have performed, the price that was paid for each, how many days off I am entitled to (calculated based on how hard-working I was) and what contributions go to my pension plan. 
It’s funny that I am a user of these platforms, but I am also their worker.</span></em></p> <p><strong><span>The Taskometer is the employer’s new evaluation metric</span></strong></p> <p><em><span>Every time a company needs something, a federated AI Manager splits the task into smaller chunks and sends alerts to workers to complete it. You have to be very fast when this happens, to make sure you get the task - just like you did a decade ago with 'crowdsourcing markets'. More advanced versions of the Playbour have an AI that selects the jobs for you, instead of you doing this manually. But this version of the console is more expensive. I am saving up to buy one later. The thing is, if you don’t complete 100,000 micro tasks per month, you don’t get paid at minimum wage level per task. The system works like this: you get paid by the task, but the price varies depending on the total number of tasks you complete. There are certain thresholds, and some evaluation criteria such as the quality of the data, so you can’t be sloppy. If you’re below 100,000 on your Taskometer, the price per task is so small that you can barely keep your head above water. But hey, now we can no longer say there is fake unemployment. The Taskometer certifies my labour and evaluates my work.</span></em></p> <p><strong><span>Data labour unions </span></strong></p> <p><em><span>I tried to speak with some of the union leaders about raising those thresholds. We're counting on them to represent us and protect our labour rights, but data labour unions are still quite young and weak. There aren't many young labourers like me joining unions; many people associate them with the 'old way' of doing things and don't see any value in joining. All this time, unions have struggled to maintain relevance and to adapt to the digital space. 
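</span></em></p> <p><span>Stepping outside the diary for a moment: the Taskometer pay rule described above – pay per task that collapses below a 100,000-task monthly threshold, scaled by a quality score – can be written down as a toy model. The threshold comes from the story; the rates and the quality scaling are invented assumptions for illustration.</span></p>

```python
# Toy model of the Taskometer pay rule from the story. The 100,000-task
# threshold is taken from the text; FULL_RATE, PENALTY_RATE and the
# quality scaling are invented assumptions.
THRESHOLD = 100_000     # monthly tasks needed to earn the full rate
FULL_RATE = 0.01        # pay per task at or above the threshold
PENALTY_RATE = 0.001    # pay per task below the threshold

def monthly_pay(tasks_completed, quality):
    """Monthly pay; `quality` in [0, 1] scales the per-task rate."""
    rate = FULL_RATE if tasks_completed >= THRESHOLD else PENALTY_RATE
    return round(tasks_completed * rate * quality, 2)

# One task short of the threshold, pay collapses despite similar effort:
below = monthly_pay(99_999, quality=0.9)    # 90.0
above = monthly_pay(100_000, quality=0.9)   # 900.0
```

<p><span>The cliff at the threshold is the point: almost identical effort, a tenfold difference in pay – which is why Amtis wants the unions to renegotiate those thresholds.</span></p> <p><em><span>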
But they haven't held their ground, so I am not sure how much pressure they can really apply after all.</span></em></p> <p><strong><span>Nobody escapes data exploitation</span></strong></p> <p><em><span>You might think that wealthier people escaped the data labour system. Actually, it’s more nuanced than that. It’s true that they could stay out of data labour and not hook their lives to a Playbour device - but they could not get away from sensors and tracking. The rich started building walled cities where nobody else could afford to live. The cities were sealed so that nobody outside their select group could come in. They used heavy surveillance infrastructure to achieve this. A truly smart city, some would say. And all the data produced by their houses, by their devices, by the sensors in their citadel, was captured by the AI overlord. They were just as trapped in the same AI ecosystem as everybody else, but they had the illusion of privacy and protection from the plebes.</span></em></p> <p><strong><span>Privacy for pennies</span></strong></p> <p><em><span>The 'data sharing economy' and automated services have displaced countless jobs. Most people now sell every bit of their data for a few pennies each. We can now see the true face of this economy based on 'sharing'.</span></em></p> <p><span>Here’s what I make out of this story on a more objective and critical level. If you want to jump straight to Scenario 3, <a href="#scenario3">click here</a>.</span></p> <p><strong><span>Reflections on Scenario 2 </span></strong></p> <p><span>It seems that monopolies cannot be combated through a data labour system. Monopolies don’t simply disappear, even if they start paying people wages - they adapt and persist. A market for data is complicated to achieve in practical terms, but there are other reasons why this model may not be what we are looking for.</span></p> <p><span>A data labour system runs on people fuelling it with data from all possible sources. 
This deepens the gap between rich and poor and entrenches inequality. While the rich can afford not to sell their data, the rest will be vulnerable, exposed, and will give in more quickly to exploitative systems. This looks much more like digital feudalism than individual empowerment.</span></p> <p><span>But the discussion goes beyond inequality. In a future where data labour is used to feed and train AI services in all aspects of our lives – from decisions about how we govern ourselves, to our legal system, education and immigration – nobody will win in the long run.</span></p> <p><span>And who knows, maybe in the near future machines will learn from each other and there will be no need for people to train and feed them data. Machine-to-machine learning might replace the human input AI services rely on, but today, as a Google employee puts it, “<a href="https://www.theguardian.com/technology/2019/may/28/a-white-collar-sweatshop-google-assistant-contractors-allege-wage-theft">Artificial intelligence is not that artificial; it’s human beings that are doing the work</a>.”</span></p> <p><strong><span>More tracking means more money</span></strong></p> <p><span>It’s true that data labour might be able to solve the problem of indirect data (e.g., data about you that you don’t know is extracted). But I am not sure this is the solution we are looking for. When all human actions, emotions and by-products can be monetised or labelled as labour, there is no longer any need for people to ask for transparency. In Amtis’ story, people already knew that tracking and sensitive information would bring them more money. Abuses and exploitative practices no longer count as such, because by putting a price tag on them we have acknowledged them and given them legitimacy. </span></p> <p><strong><span>In the long run, platforms crave quality data</span></strong></p> <p><span>The future will indisputably bring more AI services. 
To build better AI services, you not only need more data, but also data of a certain quality. It’s safe to assume that most workers who participate in a data labour system will be from marginalised, disadvantaged or poor communities. They can generally provide ordinary data, but from a certain point onwards this will not be enough. There will be certain types of data labour tasks that require specific skill sets, which won’t be easy for just anybody to perform. In other words, more educated workers could contribute to a bigger number of tasks, while less knowledgeable workers could pick from only a limited number of assignments. And this will create further discrepancies and inequality.</span></p> <p><strong><span>Privacy is a time-shifted risk</span></strong></p> <p><span>Needless to say, information that I am okay with sharing (or selling) today might get me in serious trouble tomorrow. A change of political regime is probably the most obvious example. Remember in our not-so-distant history how totalitarian regimes asked you to declare your religion in official documents and then used this information to persecute people? Or say you take a DNA test with a company. The police are already using <a href="https://www.bloomberg.com/news/articles/2019-02-01/major-dna-testing-company-is-sharing-genetic-data-with-the-fbi">DNA testing companies to find suspects</a>.</span></p> <p><strong><span>Human beings instead of human doings</span></strong></p> <p><span>In the end, do we really want to monetise all aspects of our lives? Will this bring us well-being and social self-determination? Do we want to define ourselves by our data (generated for others) or by who we are? And more importantly, how can we make things better if everybody is still looking for more financial gain? 
If we want to set ourselves up for a better future, we should probably look for ways to <a href="http://autonomy.work/wp-content/uploads/2018/05/Nick-Christine-Social-wealth.pdf">reduce the widespread monetisation of our lives</a>.</span><br />  </p> <h3 id="scenario3"><strong><span>SCENARIO 3 – NATIONAL DATA FUNDS</span></strong></h3></div> </div> </div> <div class="field__item"> <div class="paragraph paragraph--type--image-and-text-repeating paragraph--view-mode--default"> <div class="field field--name-field-fieldset-image field--type-image field--label-hidden field__item"> <img src="/sites/default/files/flysystem/2019-07/Scenario_03.jpg" width="1800" height="1013" alt="National Data Funds illustration" typeof="foaf:Image" /> </div> <div class="clearfix text-formatted field field--name-field-fieldset-text field--type-text-long field--label-hidden field__item"><p><span>In this third leap to 2030, Amtis sees that people have created national data funds, through which citizens and governments together own the data that is generated by sensors and by the services people use.</span></p> <p><span>Here’s how Amtis lives this time:</span></p> <p><strong><span>Smart commuto-mobile</span></strong></p> <p><em><span>In the busiest parts of the city there are no more cars. There are only special lanes for drones, houndopacks – fast <a href="https://www.bostondynamics.com/">robots that run like dogs</a> to deliver packages – and smart commuto-mobiles – slim electric booths where you can sit on your way to work and look at your phone without worrying about the traffic. It’s pretty cool – you can check your emails, take phone calls, schedule meetings or listen to a podcast while it takes the best route to work. The commuto-mobiles drive themselves and are connected to high-resolution cameras installed all over the city, so they have a 360-degree view of the road. 
This also means that every conversation you have while in the smart commuto-mobile is recorded and processed in real time by an <a href="https://www.nytimes.com/2012/03/04/business/ibm-takes-smarter-cities-concept-to-rio-de-janeiro.html">AI Traffic Grid</a>. This measure was introduced in order to ensure a high level of empathy in society. For example, if you get angry and raise your voice while talking in your commuto-mobile, the vehicle pulls over and gives you a 20-minute time out to cool off, while it plays music and guided meditation practices. I bet you wouldn’t want this to happen before an important meeting you can’t afford to be late for! There is no way to hop back in and get it moving again until you have calmed down. And it <a href="https://www.theatlantic.com/entertainment/archive/2016/10/black-mirror-nosedive-review-season-three-netflix/504668/">gets registered</a> in your 'good behaviour' record too! The AI Traffic Grid also has sensors that analyse your facial expression to <a href="http://www.engadget.com/2019/05/02/stealing-ur-feelings-ar-film-facial-recognition-tribeca-2019-kanye-pizza/">determine emotions</a>, your gestures to determine thought processes and intentionality, and your breathing patterns to determine your heart rate and anxiety level.</span></em></p> <p><strong><span>Employer knows if you worked during your commute</span></strong></p> <p><em><span>With the commuto-mobile, traffic is much faster and the risk of accidents is very low. Basically, while you are in your smart booth, you don’t need to worry about anything besides boosting your productivity. People started asking for this time to be included in their <a href="https://www.theguardian.com/technology/2019/may/28/a-white-collar-sweatshop-google-assistant-contractors-allege-wage-theft">working hours</a></span><span>. And why not? They jump from bed straight into their smart commuto-mobiles and get to work – who wants to lose time in traffic? 
This is precisely why the commuto-mobiles were brought to the market. The commuto-mobiles are connected to the Internet, and work reports are sent to employers to show exactly how you spent your time on your way to work. Everything gets recorded in the journey log anyway.</span></em></p> <p><strong><span>Data as a service for the private sector</span></strong></p> <p><em><span>All the information that is generated from the city and the services we use is amassed into a big database that is managed by the government and citizens. This fund makes things easier for new companies that want to enter the market and need a lot of data. The idea is that if you want to use any of the datasets, you have to pay to get access. We’re calling this Data as a Service because it turns companies into the customers of our city. Access to these datasets is highly regulated, and the more datasets you want to use, the more expensive it becomes. The price also goes way up depending on the sensitivity of the information: for example, driving information is less expensive than health data. The money that the government receives from companies gets redistributed back to the people. This redistribution is exactly why the government will always seek new ways to collect more data: it can use the data to generate the revenue it needs to maintain the 'smart' cities it has created. </span></em></p> <p><em><span>The data is anonymised, so everything is safe and nothing can go wrong. Except that <a href="https://www.nature.com/articles/srep01376">re-identification</a></span><span> of individuals from anonymised datasets can happen more easily than you think.</span></em></p> <p><em><span>Individuals and researchers can also apply to get the data they need. If the project benefits the entire community, your idea can be subsidised by the government. 
Through these subsidies, smaller, independent companies started to run good services, and it became harder to maintain a monopoly position in the market.</span></em></p> <p><strong><span>We can build our own services</span></strong></p> <p><em><span>If there is a gap in the market, we can build the services that are missing ourselves. We have all the data at our disposal, and I think gradually this is what we are moving towards. However, for the moment these so-called 'collective services' are not that great. Most of them are not easy to use and look really, really 2000.</span></em></p> <p><span>Here’s what I make out of this story on a more objective and critical level. If you’re excited to move to Scenario 4, <a href="#scenario4">click here</a>.</span><span> </span></p> <p><strong><span>Reflections on Scenario 3</span></strong></p> <p><span>Besides National Data Funds, there are many more models to explore. Data ownership can be addressed in a lot of ways. National Data Funds don't sound like a perfect solution, but they’re not destructive, either.</span></p> <p><strong><span>Governments will have more control</span></strong></p> <p><span>In the National Data Fund model, the government gets all the data we are generating. This means governments will also have more control. They will basically have every individual’s data and will be able to infer patterns, behaviours and intentions at the level of entire populations. Will people still dare to protest when abuses happen? We all know how fragile democracies are, and the risk of this turning into a techno-dictatorship is high. Around the world there are corrupt, oppressive and ruthless government regimes - and those that aren’t could become one very easily.</span></p> <p><span>We will need super advanced security measures to ensure that these databases are not leaked, hacked, or manipulated by foreign or domestic agents. 
Personal data will have to be anonymised, but <a href="https://science.sciencemag.org/content/347/6221/536.full">research already shows</a> we need to do a lot more to improve anonymisation techniques. And even so, handling anonymised data can still have a huge impact on individual lives because you can look at trends and control masses based on those insights. </span></p> <p><span>Moreover, National Data Funds based on access permission imply a type of infrastructure that will potentially take a significant amount of time to build and will involve a lot of resources.</span></p> <p><strong><span>Trusting governments with all our data</span></strong></p> <p><span>It would be very complicated to put all the necessary checks and balances in place, and to make sure decisions are transparent, without hidden agendas or secret deals between governments and companies. Even if there are no bad intentions at play, government structures will still have to change drastically. Entire teams of technical specialists will have to assess proposals coming from companies and take steps to ensure that security measures are in place for preventing abuse. Proposals would also need ethical, sustainability and environmental checks – so a lot of talent needs to be added to government. The public sector needs to get way more attractive in terms of financial rewards for its specialists. Where would this money come from? Would the government be tempted to allow more companies access to data in order to build its budget? Or outsource core services? These are hard operational and strategic decisions to take. Will governments have the backbone to make the correct decisions? What if government interests are aligned with a company’s interests?
Exploitative services should not be allowed in the first place, but preventing this from happening will require a total transformation of the mindset we operate in - more than just an independent body under civilian oversight.</span></p> <p><strong><span>Security for databases</span></strong></p> <p><span>Needless to say, having a centralised database as a single point of failure is a very bad idea. The database could potentially be decentralised to eliminate the central point of vulnerability, but technical challenges around decentralisation and data integrity are significant. When it comes to centralised databases, the <a href="https://www.washingtonpost.com/news/theworldpost/wp/2018/08/09/aadhaar/?noredirect=on&amp;utm_term=.20c68d07af7e">Aadhaar system</a> in India shows us the massive implications of managing such a database.</span></p> <p><strong><span>No incentives for companies</span></strong></p> <p><span>One of the ideas for the National Data Funds is to ask companies to pay a share of their profit. Of course, there could be other <a href="https://www.theguardian.com/technology/2017/jul/01/google-european-commission-fine-search-engines">models</a> such as subscription fees, subsidised access, completely free access, access based on income, or a mix of these. Companies may be required to pay twice: once for getting access to the data and again for the profit that data generates. But we already see business models that deliberately run on losses (Uber, Amazon) in order to consolidate their position, kill competition and promote the regulatory framework that’s best for them. Will the National Data Fund model solve this problem?
You don’t want to maximise the revenue of the companies using the fund, but at the same time, where are governments going to get the money from for all the costs involved with technical management, deliberation and social benefit analysis?</span></p> <p><strong><span>A lot of uncertainties</span></strong></p> <p><span>There are many more questions to think about. For example, will everybody’s data be captured by default? Is there a possibility not to contribute to the fund? And if I don’t contribute, will I be able to freely operate and participate in society without major negative consequences? There is also a lot to discuss about how this data pool is managed and how decisions are made by both governments and citizens. Will every point of view have the same weight? How can we make sure citizens will not be threatened, pressured or forced to back certain decisions? Also, what happens if a company gets access to a database, but then misbehaves? Even if you cut their access to the database, the damage is already done, potentially harming millions. What will be the measures to ensure that the risks of abuses are mitigated?</span><br />  </p> <h3 id="scenario4"><span><strong>SCENARIO 4 - DATA RIGHTS</strong></span></h3></div> </div> </div> <div class="field__item"> <div class="paragraph paragraph--type--image-and-text-repeating paragraph--view-mode--default"> <div class="field field--name-field-fieldset-image field--type-image field--label-hidden field__item"> <img src="/sites/default/files/flysystem/2019-07/Scenario_04.jpg" width="1800" height="1013" alt="Data rights illustration" typeof="foaf:Image" /> </div> <div class="clearfix text-formatted field field--name-field-fieldset-text field--type-text-long field--label-hidden field__item"><p><span>This time, Amtis travels to year 2030 to get a sense of how the data rights framework played out: </span></p> <p><em><span>I just moved into a new apartment and everything was a mess.
My stuff was all over the place and I couldn't find anything. I received a notification on my dashboard that a delivery drone had arrived with my package. </span></em></p> <p><span><strong>Data rights dashboard</strong></span></p> <p><em><span>The dashboard showed me a summary report with information about how my data was handled: which company processed my order, the type of data that was collected about me, why it needed that information, and who this information was shared with. I am getting more and more 'clean' reports nowadays, but ten years ago it was overwhelming to see these notifications. Few companies were good at implementing privacy-minded data operations, and I didn't feel like I really had a choice about what to do with my data. </span></em></p> <p><em><span>Something that pushed companies to be more serious about data protection was the fact that the dashboard could communicate</span><span> </span><span>with the regulator’s enforcement systems. Any time there was mistreatment of data, I would receive a notification which I could decide to forward to my regulator. I did that a lot of times! Regulators are more resourceful than they used to be, so when they receive notifications, they start an investigation. </span></em></p> <p><span><strong>Machine-readable privacy policy</strong></span></p> <p><em><span>I like the data rights dashboard because it saves me a lot of time. Before I made the order, it scanned the machine-readable privacy policy and terms and conditions and showed me a summary report of all relevant provisions related to how my data was being handled. It analysed not only the companies that were selling the kitchen appliances I needed for my new flat, but also the terms of the platforms the companies used to sell the products. The best part was that the summary didn't just rely on what the privacy policies said. It also queried an official public database to see if there were any investigations related to the product I ordered.
Additionally, it performed a deep search for any public information related to the product I wanted, to see if there were known breaches or news scandals connected to the product or to the company. I was shocked when I discovered that a team of investigative tech-journalists had revealed that the Smart Vacuum - which I was initially tempted to buy - automatically sent the blueprints of my house together with metadata to a foreign intelligence agency! </span></em></p> <p><span><strong>Data portability</strong></span></p> <p><em><span>What I am most grateful for nowadays is that I can easily move my data from one company to another. When I moved to the new place, I transferred all my utilities to the new address with a click of a button. I kept my rates and didn't have any headaches moving my subscriptions from one place to the other. </span></em></p> <p><em><span>In case I don't like a company's services anymore, or they change their terms and conditions, I can move to a different company without a second thought. Data portability allows me to pull all my data from one company and move it to another. </span></em></p> <p><em><span>I think that's very healthy. Smaller companies used to be cut off from the market simply because they couldn’t get enough data and customers. With data portability, this challenge is more manageable. Their major task is to convince people to trust them with their data and not to give them strong reasons to leave.</span></em></p> <p><span><strong>Open Hardware</strong></span></p> <p><em><span>My new apartment was empty when I moved in so I needed furniture. 3D printed furniture is a big thing now. You can find a 3D printing shop on almost every corner. I went down to my neighbourhood's 3D printing shop to browse their catalogue. I needed an ergonomic desk. </span></em></p> <p><em><span>To my surprise, they were not only doing furniture.
They had a separate line only for open hardware - anything from kettles, fridges and washing machines to cameras, audio systems and laptops. They were advertising an Open Hardware phone. The guy from the shop showed me how it works. You basically choose all the spare parts you want and they assemble the model for you. </span></em></p> <p><em><span>I went ahead and looked at the documentation for all the spare parts. It wasn't easy to understand all the blueprints of the different components and how everything fit together. I tried to make sense of it, but in the end I paid for technical consultancy at the shop before making the order. The phone I ordered is made out of 3D-printed material: it's 100% durable and recyclable! And it's sooo cheap, I can't believe it! I just became the biggest fan of open hardware! These products are transparent, highly customisable for any privacy needs, sustainable and affordable.</span></em></p> <p><span><strong>People's Digitalopoly</strong></span></p> <p><em><span>Doing some research into this, I found out that the Open Hardware phone was prototyped during a <a href="https://mydata2019.org/programme/the-first-data-futurathon/">futurathon</a>. Futurathons are self-organised meetings run by a grassroots movement of engineers, artists, philosophers, journalists, youth, economists, lawyers, policy makers, environmentalists, and LGBTIAQ+ people. Years back, they started to design different bits and pieces of a new architecture based on strong, enforceable rights over data. Their goal was to design the People's Digitalopoly - a new digital world, which is not based on financial gains, but on social contribution and civic participation. They believed in a model which empowers people with autonomy over data and new ways of looking at our relationship to data. </span></em></p> <p><em><span>I was intrigued by their vision and I basically spent the entire rest of the day reading their manifesto.
I also discovered some of the first transcripts of their meetings, where they said they had put their bet on the data rights model. They realised very early on that the true potential of the data rights model could only be achieved with open protocols and interoperability. From then on, they spent ten years of hard work developing the <a href="https://web3.foundation/">Web3</a> open protocols.</span></em></p> <p><span><strong>Big vision: Connect the decentralised with the centralised</strong></span></p> <p><em><span>Once the protocols were stable and reliable, the big vision was to connect decentralised services among themselves, but also with closed systems. They believed everybody should be able to run their own chat applications, but at the same time to <a href="https://matrix.org">communicate with others on closed services</a>. The idea was for everything to be possible in the same place, irrespective of the centralised, distributed or decentralised architectures. This left me thinking about how easy it would be for me to use such an interconnected digital ecosystem and what it would mean for the way I live and do business. </span></em></p> <p><span><strong>Reflections on Scenario 4</strong></span></p> <p><span>Amtis' story brings a bold vision. Open protocols and decentralised systems create a new universe of possibilities. Groups have been running <a href="https://runyourown.social/">peer-to-peer networks</a> for a long time, but decentralised file storage and moving towards a decentralised web will change the way we look at data structures.</span></p> <p><span>However, the most important lesson we need to understand is that there is no one-size-fits-all. There are always going to be different enclaves. People will organise themselves in different ways for different needs. We need a system that allows for these differences. There is no reason why different models shouldn't be able to speak to one another. Nobody needs to be left out.
</span></p> <p><span>We also need to acknowledge that even as devices and technology become cheaper, there are still going to be many people below the poverty line who won't be able to afford them. Amtis could pay for technical consultancy and order a privacy-enhanced phone, but not everybody is going to be able to do the same. As mentioned in previous sections, privacy should not be a right only the rich or more resourced people can enjoy. </span></p> <p><span>The data rights environment does not function in isolation. It seeks to provide people with choice and agency over data; it doesn't mean shutting off from the world in a 'safe' bubble. It is an autonomous, fully interoperable architecture with built-in protection. It uses strong cryptographic mechanisms and anonymisation techniques to protect individual privacy, while also enabling us to extract <a href="https://decodeproject.eu/what-decode">social benefits from collective data</a>.</span></p> <p><span>Fierce enforcement, open standards and interoperability protocols are key components of the data rights model. Strong protection mechanisms cannot rely on the individual having to make decisions at every step of the data flow. Protection needs to be ensured regardless of whether people know how to protect their data or not.</span></p> <p><span>That's why decentralised privacy-aware, censorship-resistant protocols are essential. Amtis used a data rights dashboard to analyse privacy policies and manage orders from companies that take privacy into account. As much as this sounds like a good tool, individuals shouldn't be forced to turn themselves into data vigilantes and be alert each time companies want to collect data from them. Also, it's not a good idea to rely on a device that's so intimately connected to your daily activities. In Amtis' case, we should be concerned about the producer of the data rights dashboard and how vulnerable it might be to attacks.
Could this device be used for mass surveillance purposes? Who is making this device, and who is responsible for its security measures?</span></p> <p><span>Data portability not only offers the possibility for people to move their data from one service to another, it also gives startups and small companies a chance to compete on the market. Their challenge is reduced because people can choose to move data to their service. However, companies can come up with all sorts of tricks to attract people to their platform, such as profiling, microtargeting or paying people for their data so that they join. Critics say that as long as we continue to encourage the market economy, a data rights system can only do damage control. A data rights system can only reach its true potential if other types of mechanisms and incentives are in place; and there are <a href="https://newleftreview.org/issues/II116/articles/evgeny-morozov-digital-socialism">more options to explore</a>.</span></p> <p><span>More work needs to be done. Data captured by sensors and the vulnerabilities of IoT (Internet of Things) are a growing concern. We're far from solving the challenges posed by Artificial Intelligence systems and we haven't even begun to seriously address biotech - which opens up an entirely new dimension of challenges at an unprecedented level of moral, ethical, societal and evolutionary complexity.</span></p> <p><span>All in all, we can still improve the data rights model, but it's the closest we have to a healthy, empowering and balanced architecture.</span><br />  </p> <p><span><strong>TO WRAP UP</strong></span></p> <p><span><strong>Data rights as a part of a comprehensive system of protection</strong></span></p> <p><span>Stepping out of the scenarios and grounding ourselves again in today’s realities, people are increasingly aware and angry about how data is harvested by data monopolies. We are disempowered: we have lost control over the data we are generating, and we are becoming more aware of exploitative practices.
In order to address the power imbalances that plague our digital sphere, I believe we need a system that provides individuals with clear, direct authority over their data. One that allows us to set the boundaries we want for our private space. But at the same time, we also need a system that enables us to extract the collective benefit from data.</span></p> <p><span>A data rights system allows us to access, change, move or delete data; to know who’s collecting it, where it’s being stored, where it’s going, who has access to it, and for what purposes. Data rights cover not only data that I voluntarily generate, but also data that is automatically collected or inferred about me. This includes location and browsing history, but also information that has been derived, inferred or predicted from other sources. In that sense, a data rights system offers a much more comprehensive architecture of control and protection than ownership. That’s why, when appropriate control rights are in place, we essentially don’t need property rights. It is true, though, that this model puts more responsibility on the user to manage and take informed decisions about what data goes where. </span></p> <p><span>However, principles such as data minimisation, fairness and purpose specification - if meaningfully implemented - have a strong chance to reverse this burden on the user. Privacy will not depend on how knowledgeable, informed or skilled the individual is, but on how well companies comply with their protection obligations. This way, the individual will not be the weakest link, having to struggle to understand all the complexities. Instead, companies design processes and data flows with privacy principles in mind, which reduces the need for individuals to invest time and effort in understanding how to protect their data, while at the same time respecting their agency and autonomy. Data rights are no substitute for proactive implementation and protection.
</span></p> <p><span>On its own, a data rights system is not enough; it needs to be reinforced by protection principles and clear legal obligations for all the actors involved. Coupled with a strong technical and enforcement layer, the potential of the data rights architecture is enormous.</span></p> <p><span><strong>Choice and transparency on top of well-designed data processes</strong></span></p> <p><span>A data rights system also places a higher burden of responsibility on organisations. Data rights demand that organisations create a secure and protected environment for data processing and adopt a very transparent set of rules. Data processes will be built to comply with the principles of privacy by design and by default, as well as data minimisation, from the outset. Protection is at the core and data needs to be collected in a meaningful and transparent way. Organisations also need to invest in data portability so that data can be easily moved from one service to another - the same way phone numbers can be ported between different telecommunications operators. </span></p> <p><span>Data rights offer a solid framework. They are the backbone of a healthy society, where individual empowerment and collective well-being are paramount. We need to advance and strengthen this architecture of rights with vigilant watchdogs and new socio-economic rationales — the muscles that keep the system in check. We need well-designed data processes and user-friendly data portability — the skin that ties everything together. </span></p> <p><span>If enforced, data rights would challenge the business models of tech giants. They would have to rethink their business philosophy and redesign their business processes to implement privacy throughout their operations.
Instead of allowing data exploitation, meaningful enforcement would lead to a diversification of offers in the digital markets and more genuine choice for individuals.</span></p> <p><span><strong>Develop new language</strong></span></p> <p><span>In my view, 'data ownership' fails to address the main challenges of today’s digital economy. Ownership certainly doesn’t capture the full spectrum of related issues. An ownership system, even though it sounds like a good idea, is incapable of stopping exploitative data practices and monopolies on its own - it would simply allow them to adapt and persist. If we keep our focus primarily on figuring out data ownership, we face the risk of sidetracking the discussion onto a very questionable path. This is an open invitation to develop new language for clearer conversations and to better shape our demands for the future we want to see. </span></p> <p><span>I believe the potential for the data rights architecture is huge, and there are many models to explore. Amtis' next journey is for you to imagine.</span><br />  </p> <p><strong><span>NEXT STEPS</span></strong></p> <p><span>How do you see YOUR future? Tell us your first step for shaping it.</span></p> <p><span>Don’t know exactly what your first step should be? Here are some suggestions:</span></p> <p><span>1. Elections around the world are coming up. Ask candidates what role they think data plays in the future. If you see a lack of foresight, engage in a deeper discussion with them. Make them listen and use the scenarios to explain how things might go wrong.</span></p> <p><span>2. Start a data rights hub in your community. There are a few discussion forums that are dedicated to thinking about what happens next and how to address our data challenges.
For example, check out the <a href="https://mydata.org/">My Data community</a> and the <a href="https://lists.theengineroom.org/lists/info/responsible_data">ResponsibleData.io mailing list</a> where you can engage in conversations on the themes discussed in the scenarios.</span></p> <p><span>3. Your creative skills are needed for spreading the data discussion more broadly. Reach out to <a href="mailto:valentinap@privacyinternational.org">valentinap@privacyinternational.org</a> if you want to contribute to building a toolkit to make it easier for your friends, your neighbours and the world to engage with the topic. We would all benefit from sharing this topic with more communities!</span></p></div> </div> </div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-inline"> <div class="field__label">Learn more</div> <div class="field__items"> <div class="field__item"><a href="/topics/adtech" hreflang="en">AdTech</a></div> <div class="field__item"><a href="/topics/biometrics" hreflang="en">Biometrics</a></div> <div class="field__item"><a href="/topics/data-exploitation" hreflang="en">Data Exploitation</a></div> <div class="field__item"><a href="/topics/data-intensive-systems" hreflang="en">Data Intensive Systems</a></div> <div class="field__item"><a href="/topics/data-protection" hreflang="en">Data Protection</a></div> <div class="field__item"><a href="/topics/fintech" hreflang="en">Fintech</a></div> <div class="field__item"><a href="/topics/internet-things" hreflang="en">Internet of Things</a></div> <div class="field__item"><a href="/topics/open-data" hreflang="en">Open Data</a></div> <div class="field__item"><a href="/topics/privacy" hreflang="en">Privacy</a></div> </div> </div> <div class="field field--name-field-issue field--type-entity-reference field--label-inline"> <div class="field__label">What PI is fighting for</div> <div class="field__items"> <div class="field__item"><a 
href="/what-we-do/expose-invisible-data-placed-beyond-our-control" hreflang="en">Expose Invisible Data Placed Beyond Our Control</a></div> <div class="field__item"><a href="/what-we-do/id-identity-and-identification" hreflang="en">ID, Identity and Identification</a></div> <div class="field__item"><a href="/what-we-do/research-advanced-surveillance-technologies" hreflang="en">Research Advanced Surveillance Technologies</a></div> <div class="field__item"><a href="/what-we-do/track-surveillance-industry-and-trade" hreflang="en">Track the Surveillance Industry and Trade</a></div> <div class="field__item"><a href="/what-we-do/modernise-data-protection-law" hreflang="en">Modernise Data Protection Law</a></div> <div class="field__item"><a href="/what-we-do/fight-data-retention-law" hreflang="en">Fight Data Retention Law</a></div> <div class="field__item"><a href="/what-we-do/realise-our-rights-live-dignity" hreflang="en">Realise Our Rights to Live with Dignity</a></div> </div> </div> <div class="field field--name-field-programme field--type-entity-reference field--label-inline"> <div class="field__label">Strategic Area</div> <div class="field__items"> <div class="field__item"><a href="/strategic-areas/challenging-corporate-data-exploitation" hreflang="en">Challenging Corporate Data Exploitation</a></div> <div class="field__item"><a href="/strategic-areas/contesting-government-data-and-system-exploitation" hreflang="en">Contesting Government Data and System Exploitation</a></div> <div class="field__item"><a href="/strategic-areas/defending-democracy-and-dissent" hreflang="en">Defending Democracy and Dissent</a></div> <div class="field__item"><a href="/strategic-areas/government-exploitation" hreflang="en">Government Exploitation</a></div> </div> </div> <div class="field field--name-field-campaign-name field--type-entity-reference field--label-above"> <div class="field__label">What PI is Campaigning on</div> <div class="field__items"> <div class="field__item"><a 
href="/campaigns/our-data-not-trade" hreflang="en">Our data is not for trade</a></div> <div class="field__item"><a href="/campaigns/tell-companies-stop-exploiting-your-data" hreflang="en">Tell companies to stop exploiting your data!</a></div> </div> </div> <div class="field field--name-field-education-course field--type-entity-reference field--label-above"> <div class="field__label">Education material</div> <div class="field__items"> <div class="field__item"><a href="/education/risks-data-intensive-systems" hreflang="en">The risks of data-intensive systems</a></div> </div> </div> <div class="field field--name-field-audience-and-purpose field--type-entity-reference field--label-above"> <div class="field__label">Audiences and Purpose</div> <div class="field__items"> <div class="field__item"><a href="/taxonomy/term/628" hreflang="en">Feeding our followers</a></div> <div class="field__item"><a href="/taxonomy/term/624" hreflang="en">Generalised audience education for inspiration</a></div> <div class="field__item"><a href="/taxonomy/term/625" hreflang="en">Generalised audience problem articulation</a></div> <div class="field__item"><a href="/taxonomy/term/630" hreflang="en">Helping experts with our analyses</a></div> <div class="field__item"><a href="/taxonomy/term/631" hreflang="en">Helping partners and other NGOs know our stance</a></div> <div class="field__item"><a href="/taxonomy/term/629" hreflang="en">Helping people understand our solutions</a></div> <div class="field__item"><a href="/taxonomy/term/627" hreflang="en">Informing the concerned</a></div> </div> </div> Wed, 17 Jul 2019 09:26:10 +0000 harmitk 3088 at http://privacyinternational.org Choose your own (data) future http://privacyinternational.org/news-analysis/3094/choose-your-own-data-future <div class="layout layout--onecol"> <div class="layout__region layout__region--content"> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden 
field__item"><p><em>This blogpost is a preview of the full '<a href="https://privacyinternational.org/long-read/3088/our-data-future">Our Data Future</a>' story, produced by Valentina Pavel, <span>PI Mozilla-Ford Fellow, 2018-2019</span>.</em></p> <p><span>2030.</span></p> <p><span>Four worlds.</span></p> <p><span>One choice. Which one is yours?</span></p> <p><span>All aboard! Time to step into the imaginarium. Explore four speculative future scenarios, examining how different ways of governing data create vastly different worlds. What is our digital environment going to look like in ten years' time? What’s going to be our relationship with data? </span></p> <p><span>Each of us has a role in shaping our data future. Let’s get proactive about addressing our digital challenges. Let’s start designing our data future. </span></p> <p><span>Today, we’re not OK with big tech companies harvesting our data on a massive scale. We’re not OK with companies predicting our behaviour and inferring our interests from data that we’re sharing with them – directly or without us even knowing. We’re not OK with companies offering “free” services that mask data exploitation.</span></p> <p><span>We want autonomy, choice and freedom over our data.</span></p> <p><span>Here’s where our journey begins. Our former Mozilla Fellow Valentina Pavel explores how different data models <a href="https://privacyinternational.org/long-read/3088/our-data-future">create different futures</a>. In Scenario 1, we see a future where data is treated like property. In Scenario 2, we’re being paid for data as labour, earning our monthly wage for the data we generate. In Scenario 3, we store data in national funds, managed by both citizens and governments. Companies pay access fees if they want to use data for building services.
In Scenario 4, we have rights to our data, allowing us agency and autonomy over the data we generate.</span></p> <p><span>The future is not a given: our actions and decisions will take us where we want to go. Where do </span><span><em>you</em></span><span> want to go? Tell us at <a href="mailto:valentinap@privacyinternational.org">valentinap@privacyinternational.org</a></span>.</p></div> <div class="field field--name-field-large-image field--type-image field--label-above"> <div class="field__label">Large Image</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-07/Scenario_02_0.jpg" width="1800" height="1013" alt="data future image" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-list-image field--type-image field--label-above"> <div class="field__label">List Image</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-07/Scenario_02_1.jpg" width="1800" height="1013" alt="data future image" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-list-icon field--type-image field--label-above"> <div class="field__label">List Icon</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-07/Scenario_02_2.jpg" width="1800" height="1013" alt="data future image" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-issue field--type-entity-reference field--label-above"> <div class="field__label">What PI is fighting for</div> <div class="field__items"> <div class="field__item"><a href="/what-we-do/expose-data-exploitation-data-profiling-and-decision-making" hreflang="en">Expose Data Exploitation: Data, Profiling, and Decision Making</a></div> <div class="field__item"><a href="/what-we-do/expose-invisible-data-placed-beyond-our-control" hreflang="en">Expose Invisible Data Placed Beyond Our Control</a></div> <div class="field__item"><a href="/what-we-do/id-identity-and-identification" hreflang="en">ID, Identity and Identification</a></div> <div 
class="field__item"><a href="/what-we-do/promote-strong-cyber-security-and-protections-people" hreflang="en">Promote Strong Cyber Security and Protections for People</a></div> <div class="field__item"><a href="/what-we-do/protect-people-and-communities-online" hreflang="en">Protect People and Communities Online</a></div> <div class="field__item"><a href="/what-we-do/realise-our-rights-live-dignity" hreflang="en">Realise Our Rights to Live with Dignity</a></div> </div> </div> <div class="field field--name-field-programme field--type-entity-reference field--label-above"> <div class="field__label">Strategic Area</div> <div class="field__items"> <div class="field__item"><a href="/strategic-areas/challenging-corporate-data-exploitation" hreflang="en">Challenging Corporate Data Exploitation</a></div> </div> </div> <div class="field field--name-field-type-of-impact field--type-entity-reference field--label-above"> <div class="field__label">Type of Impact</div> <div class="field__items"> <div class="field__item"><a href="/impact/fighting-identity-systems" hreflang="en">Fighting Identity Systems</a></div> </div> </div> <div class="field field--name-field-campaign-name field--type-entity-reference field--label-above"> <div class="field__label">What PI is Campaigning on</div> <div class="field__items"> <div class="field__item"><a href="/campaigns/demanding-identity-systems-our-terms" hreflang="en">Demanding identity systems on our terms</a></div> <div class="field__item"><a href="/campaigns/exposing-new-frontiers-identity" hreflang="en">Exposing new frontiers of identity</a></div> <div class="field__item"><a href="/campaigns/our-data-not-trade" hreflang="en">Our data is not for trade</a></div> <div class="field__item"><a href="/campaigns/tell-companies-stop-exploiting-your-data" hreflang="en">Tell companies to stop exploiting your data!</a></div> <div class="field__item"><a href="/campaigns/uncovering-hidden-data-ecosystem" hreflang="en">Uncovering the Hidden Data 
Ecosystem</a> </div> </div> </div> </div> </div> Wed, 17 Jul 2019 08:27:20 +0000 harmitk 3094 at http://privacyinternational.org Submission to CERD's Draft General Recommendation n° 36 on preventing and combating racial profiling. http://privacyinternational.org/advocacy/3090/submission-cerds-draft-general-recommendation-ndeg-36-preventing-and-combating-racial <div class="node node--type-advocacy-briefing node--view-mode-token group-one-column ds-2col-stacked-fluid clearfix"> <div class="group-header"> </div> <div class="group-left"> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p><span>During its 98th session, from 23 April to 10 May 2019, the UN Committee on the Elimination of Racial Discrimination (CERD) <a href="https://www.ohchr.org/EN/HRBodies/CERD/Pages/GC36.aspx">initiated the drafting process</a> of general recommendation n° 36 on preventing and combating racial profiling.</span></p> <p><span>As part of this process, CERD invited stakeholders, including States, UN and regional human rights mechanisms, UN organisations or specialised agencies, National Human Rights Institutions, Non-Governmental Organisations (NGOs), research institutions, and academics to send their comments.</span></p> <p><span>Privacy International sent its comments to the Committee as part of this process, aiming to offer suggestions that may help clarify the terminology used in Draft General Recommendation 36 and to inform the Committee’s future work.</span></p> <p><span>Privacy International recommends that the Committee on the Elimination of Racial Discrimination consider the following with regard to Draft General Recommendation 36:</span></p> <ul><li><span>To address the separate implications of the use of AI technologies for different law enforcement purposes;</span></li> </ul><ul><li><span>To consider examining the impact of practices similar to racial profiling beyond law
enforcement;</span></li> </ul><ul><li><span>To clarify the use of different terms invoked in the general recommendation and distinguish between general technologies, methods of treating data and making decisions, and specific technologies;</span></li> </ul><ul><li><span>To further reflect on the significant risks of racial discrimination in the use of predictive policing and facial recognition technologies</span>.</li> </ul><p> </p> <p> </p></div> <div class="field field--name-field-repeating-image-and-text field--type-entity-reference-revisions field--label-inline"> <div class="field__label">Repeating Image and Text</div> <div class="field__items"> <div class="field__item"><div class="paragraph-formatter"><div class="paragraph-info"></div> <div class="paragraph-summary"></div> </div> </div> </div> </div> </div> <div class="group-footer"> <div class="field field--name-field-topic field--type-entity-reference field--label-inline"> <div class="field__label">Learn more</div> <div class="field__items"> <div class="field__item"> <div class="layout layout--onecol"> <div class="layout__region layout__region--content"> </div> </div> </div> <div class="field__item"> <div class="layout layout--onecol"> <div class="layout__region layout__region--content"> </div> </div> </div> <div class="field__item"> <div class="layout layout--onecol"> <div class="layout__region layout__region--content"> <div class="clearfix text-formatted field field--name-description field--type-text-long field--label-hidden field__item"><p>Artificial Intelligence and its applications are a part of everyday life: from social media newsfeeds to mediating traffic flow in cities, from autonomous cars to connected consumer devices like smart assistants, spam filters, voice recognition systems, and search engines.</p> <p> </p></div> </div> </div> </div> </div> </div> <div class="field field--name-field-programme field--type-entity-reference field--label-inline"> <div class="field__label">Strategic Area</div>
<div class="field__items"> <div class="field__item"><div about="/strategic-areas/defending-democracy-and-dissent" id="taxonomy-term-585" class="taxonomy-term vocabulary-programmes"> <h2><a href="/strategic-areas/defending-democracy-and-dissent"> <div class="field field--name-name field--type-string field--label-hidden field__item">Defending Democracy and Dissent</div> </a></h2> <div class="content"> <div class="clearfix text-formatted field field--name-description field--type-text-long field--label-hidden field__item"><p>The seamless way we communicate using some of these technologies has helped many to organise politically and to express dissent online and offline. But the hidden data harvesting on which many of these technologies rely also threatens our ability to challenge power, no matter the type of government.</p></div> </div> </div> </div> <div class="field__item"><div about="/strategic-areas/safeguarding-peoples-dignity" id="taxonomy-term-586" class="taxonomy-term vocabulary-programmes"> <h2><a href="/strategic-areas/safeguarding-peoples-dignity"> <div class="field field--name-name field--type-string field--label-hidden field__item">Safeguarding Peoples&#039; Dignity</div> </a></h2> <div class="content"> <div class="clearfix text-formatted field field--name-description field--type-text-long field--label-hidden field__item"><p>The promises made with innovation have not been enjoyed by all equally. Innovative solutions can be designed to empower and serve individuals and communities, rather than state and corporate power. A new approach to data and technology must be established to make this a reality. 
</p></div> </div> </div> </div> </div> </div> <div class="field field--name-field-issue field--type-entity-reference field--label-inline"> <div class="field__label">What PI is fighting for</div> <div class="field__items"> <div class="field__item"><div about="/what-we-do/id-identity-and-identification" id="taxonomy-term-666" class="taxonomy-term vocabulary-issue"> <h2><a href="/what-we-do/id-identity-and-identification"> <div class="field field--name-name field--type-string field--label-hidden field__item">ID, Identity and Identification</div> </a></h2> <div class="content"> </div> </div> </div> <div class="field__item"><div about="/what-we-do/realise-our-rights-live-dignity" id="taxonomy-term-668" class="taxonomy-term vocabulary-issue"> <h2><a href="/what-we-do/realise-our-rights-live-dignity"> <div class="field field--name-name field--type-string field--label-hidden field__item">Realise Our Rights to Live with Dignity</div> </a></h2> <div class="content"> </div> </div> </div> </div> </div> </div> </div> Mon, 15 Jul 2019 11:41:48 +0000 staff 3090 at http://privacyinternational.org Minimum safeguards on intelligence sharing required under international human rights law http://privacyinternational.org/advocacy/3068/minimum-safeguards-intelligence-sharing-required-under-international-human-rights-law <div class="node node--type-advocacy-briefing node--view-mode-token group-one-column ds-2col-stacked-fluid clearfix"> <div class="group-header"> </div> <div class="group-left"> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p><span>Faced with the transnational dimension of terrorist-related activities, United Nations Security Council resolutions have emphasized the need for international cooperation in information-sharing, both for the purposes of collecting intelligence and judicial assistance</span><span>.</span></p> <p><span>Privacy International recognises the importance and benefit of 
intelligence sharing in the context of preventing and investigating terrorism or other genuine, serious threats to national security</span><span>. The organisation is concerned, however, that unregulated, unfettered and unwarranted intelligence sharing poses substantive risks to human rights and to the democratic rule of law. </span></p> <p><span>Privacy International’s research and comprehensive 2018 report shows that most countries around the world lack domestic legislation governing intelligence sharing, that most intelligence sharing agreements are secret and that <a href="https://privacyinternational.org/report/1741/secret-global-surveillance-networks-intelligence-sharing-between-governments-and-need">independent oversight of intelligence sharing is inadequate</a></span><span>.</span><span> </span></p> <p><span>UN Security Council resolutions recognize the need to ensure that measures taken to combat terrorism, including intelligence sharing, must comply with international human rights law. However, they give no indication of the safeguards necessary to ensure such compliance. </span></p> <p><span>Privacy International believes that there is an urgent need to provide guidance to states, particularly in light of the fact that the counter-terrorism measures envisaged in UN Security Council resolution 2396 (2017) were adopted under Chapter VII of the UN Charter. </span></p> <p><span>In its submission to the </span><span>UN Counter-Terrorism Committee Executive Directorate, Privacy International identifies some minimum safeguards that states must introduce in order to ensure their intelligence sharing laws and practices are compliant with applicable international human rights law.
The briefing focusses mainly on states’ obligation to respect and protect the right to privacy as enshrined in Article 12 of the Universal Declaration of Human Rights and Article 17 of the International Covenant on Civil and Political Rights</span><span>.</span></p> <p><span>Privacy International encourages the UN Security Council Counter-Terrorism Committee Executive Directorate (CTED) to consider these safeguards in their assessment of states’ measures on intelligence sharing and their compliance with the UN Security Council resolutions. </span></p></div> <div class="field field--name-field-repeating-image-and-text field--type-entity-reference-revisions field--label-inline"> <div class="field__label">Repeating Image and Text</div> <div class="field__items"> <div class="field__item"><div class="paragraph-formatter"><div class="paragraph-info"></div> <div class="paragraph-summary"></div> </div> </div> </div> </div> </div> <div class="group-footer"> <div class="field field--name-field-target field--type-entity-reference field--label-inline"> <div class="field__label">Target Stakeholders</div> <div class="field__items"> <div class="field__item"><div about="/target/united-nations" id="taxonomy-term-337" class="taxonomy-term vocabulary-target"> <h2><a href="/target/united-nations"> <div class="field field--name-name field--type-string field--label-hidden field__item">United Nations</div> </a></h2> <div class="content"> </div> </div> </div> </div> </div> <div class="field field--name-field-campaign-name field--type-entity-reference field--label-inline"> <div class="field__label">What PI is campaigning on</div> <div class="field__items"> <div class="field__item"><div about="/campaigns/scrutinising-global-counter-terrorism-agenda" id="taxonomy-term-672" class="taxonomy-term vocabulary-campaigns"> <h2><a href="/campaigns/scrutinising-global-counter-terrorism-agenda"> <div class="field field--name-name field--type-string field--label-hidden field__item">Scrutinising the 
global counter-terrorism agenda</div> </a></h2> <div class="content"> </div> </div> </div> </div> </div> <div class="field field--name-field-programme field--type-entity-reference field--label-inline"> <div class="field__label">Strategic Area</div> <div class="field__items"> <div class="field__item"><div about="/strategic-areas/safeguarding-peoples-dignity" id="taxonomy-term-586" class="taxonomy-term vocabulary-programmes"> <h2><a href="/strategic-areas/safeguarding-peoples-dignity"> <div class="field field--name-name field--type-string field--label-hidden field__item">Safeguarding Peoples&#039; Dignity</div> </a></h2> <div class="content"> <div class="clearfix text-formatted field field--name-description field--type-text-long field--label-hidden field__item"><p>The promises made with innovation have not been enjoyed by all equally. Innovative solutions can be designed to empower and serve individuals and communities, rather than state and corporate power. A new approach to data and technology must be established to make this a reality. </p></div> </div> </div> </div> </div> </div> <div class="field field--name-field-issue field--type-entity-reference field--label-inline"> <div class="field__label">What PI is fighting for</div> <div class="field__items"> <div class="field__item"><div about="/what-we-do/id-identity-and-identification" id="taxonomy-term-666" class="taxonomy-term vocabulary-issue"> <h2><a href="/what-we-do/id-identity-and-identification"> <div class="field field--name-name field--type-string field--label-hidden field__item">ID, Identity and Identification</div> </a></h2> <div class="content"> </div> </div> </div> </div> </div> </div> </div> Thu, 11 Jul 2019 16:26:07 +0000 staff 3068 at http://privacyinternational.org Have a Biometric ID System Coming Your Way? 
Key Questions to Ask and the Arguments to Make http://privacyinternational.org/long-read/3067/have-biometric-id-system-coming-your-way-key-questions-ask-and-arguments-make <span class="field field--name-title field--type-string field--label-hidden">Have a Biometric ID System Coming Your Way? Key Questions to Ask and the Arguments to Make </span> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span lang="" about="/user/43" typeof="schema:Person" property="schema:name" datatype="">staff</span></span> <span class="field field--name-created field--type-created field--label-hidden">Thursday, July 11, 2019</span> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p><em>Photo By: Cpl. Joel Abshier</em><br />  </p> <p><span><span>‘Biometrics’ describes the physiological and behavioural characteristics of individuals. This could be fingerprints, voice, face, retina and iris patterns, hand geometry, gait or DNA profiles. Because biometric data is particularly sensitive and revealing of an individual’s characteristics and identity, it can be applied in a massive number of ways – and has the potential to be gravely <a href="https://undocs.org/A/HRC/39/29">abused</a>.</span></span></p> <p><span><span>Identification systems across the world increasingly rely on biometric data. Not only do these pose a grave threat to people’s privacy and freedom, they also present severe threats to people’s and society’s security because of their vulnerability to being breached.</span></span></p> <p><span><span><span>Yet, despite these risks, international institutions and countries are pushing biometric ID systems to countries around the world; these systems are now being rolled out as a solution to a whole range of things, from providing refugees with aid, to registering people to vote, to providing everyday services to billions of people.
</span></span></span></p> <p><span><span><span>But not all identification systems are the same, and they pose different threats depending on their architecture and application. Below, Privacy International outlines some of the big things you need to consider when it comes to understanding any biometric identification system – and some of the questions which their proponents need to answer first.</span></span></span></p> <p><span><span><span>These considerations can be used by anyone, but may be particularly useful for activists, journalists, policy-makers, administrators, and lawyers. </span></span></span></p> <h2><span><span><strong><span>Is it Even Needed?</span></strong></span></span></h2> <p><span><span><span>Identities are extremely important: ensuring that everyone has a “legal identity”, including having their birth registered, is even one of the aims of the United Nations’ Sustainable Development Goals – the modern cornerstone of international efforts to eradicate poverty. But a smokescreen has been thrown up surrounding this term; it is used to justify any identity system, including biometric ID systems, when simpler and less invasive systems work just as well. </span></span></span></p> <p><span><span><span>Security or crime-prevention concerns are also frequently given as a motivation for states to introduce biometric identity schemes for their populations; these concerns are often presented in the abstract. They can lead to a ‘security’ argument being used even when there is actually little or no security advantage, for example when it comes to </span><a href="https://privacyinternational.org/topics/sim-card-registration"><span>biometric SIM card registration</span></a><span>.</span></span></span></p> <p><span><span><span>Policy makers and donors around the world are prone to suggesting high-tech solutions to a whole host of challenges when simpler ones would be more effective.
Instead, they should first answer:</span></span></span></p> <ul><li><span><span><span>What is the problem that the digital identity system is designed to solve; what evidence is there for the extent of the problem; and why would digital identity be a solution to that problem? </span></span></span></li> <li><span><span><span>What alternative solutions are possible? Are they less privacy invasive?</span></span></span></li> </ul><h2><span><span><strong><span>Who will actually benefit?</span></strong></span></span></h2> <p><span><span><span>Lucrative government contracts shielded from public scrutiny are prone to massive levels of waste or corruption. Biometric systems are sold by a powerful industry, often with close ties to governments – and it is far from transparent. For example, a leading player was </span><a href="https://www.worldbank.org/en/news/press-release/2017/11/30/world-bank-announces-settlement-with-oberthur-technologies-sa"><span>banned</span></a><span> from World Bank contracts for “corrupt and collusive practices” in Bangladesh. Establishing the key companies and domestic actors involved – and their links – is therefore key.</span></span></span></p> <p><span><span><span>Another thing to establish is whether the system is aimed at actually serving domestic needs – or whether it is designed to serve the interests of foreign powers. For example, the </span><a href="https://privacyinternational.org/news-analysis/3011/heres-surveillance-us-exports-central-america-aid-and-its-surviving-trumps-cuts"><span>United States’ Biometric Identification Transnational Migration Alert Program (BITMAP)</span></a><span> provides 14 countries with biometrics systems – </span><a href="https://www.aclu.org/sites/default/files/field_document/vote_recommendation_on_h.r._6439_the_bitmap_authorization_act_of_2018.pdf"><span>despite failing</span></a><span> to require adequate privacy protections.
The collected data is then </span><a href="https://www.biometricupdate.com/201809/u-s-house-passes-bill-to-expand-biometric-technology-training-data-sharing-with-foreign-partners"><span>shared</span></a><span> with US biometric databases, including a new system known as HART developed by arms company Northrop Grumman, which </span><a href="https://privacyinternational.org/blog/648/us-border-cops-set-use-biometrics-build-line-world"><span>according to a DHS presentation</span></a><span> seen by Privacy International will scoop up a whopping 180 million new biometric transactions per year by 2022. </span></span></span></p> <p><span><span><span>Similarly, European countries are </span><a href="https://privacyinternational.org/advocacy-briefing/2548/eus-next-budget-huge-threat-privacy-heres-what-must-be-done"><span>spending billions of Euros</span></a><span> transferring surveillance and border control capabilities to foreign countries to ensure they stop people migrating to their countries. For example, the European Union’s Trust Fund for Africa provided €28 million to develop a universal nationwide biometric ID system in Senegal by funding a central biometric identity database, the enrolment of citizens, and the interior ministry in charge of the system.</span></span></span></p> <h2><span><span><span><strong><span>Is there an adequate national legal framework?</span></strong></span></span></span></h2> <p><span><span><span>Processing of biometric data – including collection, analysis, storage and sharing – must be prescribed by law and limited to what is strictly and demonstrably necessary to achieve a legitimate aim. That law must be accessible to the public and sufficiently clear and precise to enable persons to foresee its application and the extent of the intrusion into someone’s privacy.</span></span></span></p> <p><span><span><span>Data protection law is a necessary but not sufficient safeguard against abuse.
As of January 2019, over 120 countries around the world have enacted comprehensive data protection </span><a href="https://ssrn.com/abstract=2992986"><span>legislation</span></a><span>. The most comprehensive data protection regulation in the world, the European Union General Data Protection Regulation (GDPR), treats biometric data used for identification purposes as “special category data”, meaning it is considered more sensitive and in need of more </span><a href="https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/lawful-basis-for-processing/special-category-data/"><span>protection</span></a><span>.</span></span></span></p> <p><span><span><span>However, many of these laws contain significant exemptions. Many do not apply to processing of data by intelligence agencies and law enforcement, and even when they do, they contain wide-reaching </span><a href="https://www.privacyinternational.org/blog/2074/uk-data-protection-act-2018-339-pages-still-falls-short-human-rights-protection"><span>exemptions</span></a><span> for purposes such as national security and the prevention or investigation of crime. Even countries with modern data protection legislation, such as the United Kingdom of Great Britain and Northern Ireland, do not adequately regulate the </span><a href="https://www.privacyinternational.org/advocacy-briefing/2835/our-response-westminster-hall-debate-facial-recognition"><span>processing</span></a><span> of biometric data, such as the use of facial recognition technology by the police in public places.</span></span></span></p> <p><span><span><span>Privacy International believes that in most countries national laws do not adequately regulate the use and sharing of biometric data.
They fall short of applicable international human rights law and they fail to effectively address the security risks arising from misuse of biometric data, especially at scale.</span></span></span><br />  </p> <h2><span><span><span><strong><span>Has there been a necessity and proportionality assessment?</span></strong></span></span></span></h2> <p><span><span><span>Under international law, any interference with the right to privacy needs to comply with the principles of necessity and proportionality.</span></span></span></p> <p><span><span><span>The use of biometrics presents a unique set of concerns. These are neatly summarised in the UN High Commissioner for Human Rights </span><a href="https://undocs.org/A/HRC/39/29"><span>report</span></a><span> on the right to privacy in the digital age, as biometric </span></span></span></p> <blockquote> <p><span><span><em><span>“data is particularly sensitive, as it is by definition inseparably linked to a particular person and that person’s life, and has the potential to be gravely abused. For example, identity theft on the basis of biometrics is extremely difficult to remedy and may seriously affect an individual’s rights. Moreover, biometric data may be used for different purposes from those for which it was collected, including the unlawful tracking and monitoring of individuals. Given those risks, particular attention should be paid to questions of necessity and proportionality in the collection of biometric data. 
Against that background, it is worrisome that some States are embarking on vast biometric data-base projects without having adequate legal and procedural safeguards in place”.</span></em></span></span></p> </blockquote> <p><span><span><span>The report recommends that States, inter alia, “Ensure that data-intensive systems, including those involving the collection and retention of biometric data, are only deployed when States can demonstrate that they are necessary and proportionate to achieve a legitimate aim”.</span></span></span></p> <p><span><span><span>It should be noted that the creation of a national biometric identification system is not, in itself, a legitimate aim for the collection of biometric data at scale. </span></span></span></p> <p><span><span><span>Modern standards of data protection recognise the need to afford extra protection to biometric data.<a href="#_ftn1"><span><span><span><span>[1]</span></span></span></span></a></span></span></span></p> <p><span><span><span>Many national laws, however, do not mention biometric data, and do not explicitly characterise biometric data as personal and sensitive data.</span></span></span></p> <p><span><span><span>In practice, applying the principles of necessity and proportionality means adopting the least intrusive means to achieve the relevant legitimate aim, in this context: the prevention and investigation of acts of terrorism.
It also requires that any measure is accompanied by legal, procedural and technical safeguards to minimise the interference with privacy.</span></span></span></p> <p><span><span><span>Necessity and proportionality assessments should play a significant role in relation to decisions to create centralised databases, and in the rules that govern retention of and access to biometric data.</span></span></span></p> <h2><span><span><span><strong><span>Is it a centralised database of biometric data?</span></strong></span></span></span></h2> <p><span><span><span>Governments and industry often support the creation of large centralised databases containing biometric information. For example, the Aadhaar biometric identification system in India contains the fingerprints, iris scans, and photographs of over 1.1 billion people. </span></span></span></p> <p><span><span>However, large <span>centralised databases of biometric data have often failed to pass a proportionality assessment under human rights law. As a London School of Economics </span><a href="http://www.lse.ac.uk/management/research/identityproject/identityreport.pdf"><span>report</span></a><span> on the UK Identity Card stated, “There is an enormous difference in the implications for the human right to privacy between this type of system, and one where a biometric is only stored locally in a smartcard”.</span></span></span></p> <p><span><span><span>That is because there is a significant difference between storing biometric data locally and storing it in a centralised database, with the latter being significantly more intrusive to privacy. </span><a href="https://researchbriefings.files.parliament.uk/documents/SN04126/SN04126.pdf"><span>For example</span></a><span>, the UK uses biometric passports that store the biometric details of an individual on a chip in the passport rather than in a centralised database.
Storing biometric data locally allows for the use of biometrics for <em>authentication</em> (to be able to be sure that the person with the document is who they claim to be) but prevents its use from the far more intrusive process of <em>identification</em> (finding the identity of a person when it is not known).</span></span></span></p> <p><span><span><span>Data protection authorities in Europe </span><a href="http://europa.eu.int/comm/internal_market/privacy/docs/wpdocs/2004/wp96_en.pdf"><span>have raised</span></a><span> grave reservations about the proportionality of proposals that would lead to the storage of biometric data on all non-nationals applying for a visa or residence permit in centralised databases for the purpose of carrying out subsequent checks on illegal immigrants (particularly those without documents).</span></span></span></p> <p><span><span><a href="https://undocs.org/en/CCPR/C/KWT/CO/3"><span>Commenting</span></a><span> on the Kuwaiti Law No. 78 (2015) on counter-terrorism, which requires nationwide compulsory DNA testing and the creation of a database under the control of the Minister of the Interior, the UN Human Rights Committee found that it imposes unnecessary and disproportionate restrictions on the right to privacy.</span></span></span></p> <p><span><span><span>It is worth noting that the recommendations of the UN Human Rights Committee to the Kuwaiti governments included amending the law “with a view to limiting DNA collection to individuals suspected of having committed serious crimes and on the basis of a court decision; (b) ensure that individuals can challenge in court the lawfulness of a request for the collection of DNA samples; (c) set a time limit after which DNA samples are removed from the database; and (d) establish an oversight mechanism to monitor the collection and use of DNA samples, prevent abuses and ensure that individuals have access to effective remedies.”</span></span></span></p> <h2><span><span><strong><span>How 
long is the data retained for?</span></strong></span></span></h2> <p><span><span><span>Under international law, indiscriminate retention of personal data, including biometric data, is never proportionate and necessary, even when governments seek to justify it on grounds of protecting national security, including the threat of terrorist acts.</span></span></span></p> <p><span><span><span>In the case of S v. Marper, the European Court of Human Rights found there had been a violation of the right to privacy by the UK, as a result of the blanket and indiscriminate nature of the powers of retention of the fingerprints, cellular samples and DNA profiles of persons suspected but not convicted of offences, which failed to strike a fair balance between the competing public and private interests.<a href="#_ftn3"><span><span><span><span>[2]</span></span></span></span></a></span></span></span></p> <p><span><span><span>One of the first things to establish is the time-frame during which the system is designed to be used, and to ensure that there exists a publicly accessible data retention policy.</span></span></span></p> <h2><span><span><strong><span>What other purposes could the data be used for?</span></strong></span></span></h2> <p><span><span><span>Strictly linked to the necessity and proportionality assessment are the concerns related to the repurposing of biometric databases (often described as ‘mission creep’).
The mere existence of biometric data in a centralised identification system could lead to the development of new justifications for its use and to attempts to broaden the authorities with access to it.</span></span></span></p> <p><span><span><span>Often in the name of national security and counter-terrorism, states have sought to allow law enforcement and security agencies access to databases designed for purposes unrelated to counter-terrorism and the prevention or investigation of crime.</span></span></span></p> <p><span><span><span>For example, in 2004, the European Asylum Dactyloscopy Database (“EURODAC”) was established to facilitate the application of the Dublin Regulation, which determines the EU Member State responsible for examining an asylum application. In 2009, EU Member States decided that EURODAC should be made accessible for law enforcement purposes in order to fight terrorism, a purpose for which the data processed was never intended, as noted by the European Data Protection Supervisor (“EDPS”) in its Opinion on the matter. The EDPS’s </span><a href="https://edps.europa.eu/sites/edp/files/publication/09-10-%2007_access_eurodac_en.pdf"><span>opinion</span></a><span> also warned that the use of EURODAC for law enforcement purposes, and specifically for terrorism, means that a particularly vulnerable group in society, namely applicants for asylum, could be exposed to further risks of stigmatisation, even though they are “not suspected of any crime” and “are in need of higher protection because they flee from persecution.”</span></span></span></p> <p><span><span><span>There have been some cases where privacy concerns about access to centralised databases by the police or security services have led to judgments limiting such access. 
For example, in India, Section 33(2) of the Aadhaar Act allowed, for the purpose of national security, access to the Aadhaar database (including biometrics) if authorised by an intelligence officer of the rank of Joint Secretary or above. The Aadhaar judgement </span><a href="https://www.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_26-Sep-2018.pdf"><span>ensured</span></a><span> that anybody whose data was accessed in this way would be entitled to a hearing.</span></span></span></p> <p><span><span><span>A key question which should be raised is:</span></span></span></p> <ul><li><span><span><span>Does the design of the system meet the goals established in the purpose of the system, or does its design exceed those purposes, creating additional privacy risks?</span></span></span></li> </ul><h2><span><span><strong><span>Could the data be connected to other databases?</span></strong></span></span></h2> <p><span><span><span>Similar concerns apply in relation to the trend by governments to develop ‘interconnectivity’ of different biometric databases. This trend is generally noted as positive by security actors. 
For example, the United Nations </span><a href="https://www.un.org/sc/ctc/wp-content/uploads/2018/06/Compendium-biometrics-final-version-LATEST_18_JUNE_2018_optimized.pdf"><span>Compendium of recommended practices for the responsible use and sharing of biometrics in counter-terrorism</span></a><span> talks positively of “the aggregation of disparate, single-mode databases” which “has evolved, in some countries and regions, into state-of-the-art, replacement networks that feature interconnected multi-modal databases designed to service a range of business needs across law enforcement, border management and other government functions at both a national and international level.”</span></span></span></p> <p><span><span><span>However, any such interoperability needs to consider the limits that human rights law, and in particular data protection standards, impose on such measures: limits that are necessary to prevent mission creep and the accompanying human rights abuses.</span></span></span></p> <h2><span><span><strong><span>Is it secure?</span></strong></span></span></h2> <p><span><span><span>Unlike a password, an individual’s biometrics cannot be easily changed. As a result, remedying unauthorised access to biometric data is either impossible or comes at significant cost.</span></span></span></p> <p><span><span><span>Biometric data breaches seriously affect individuals in a number of ways, whether through identity theft or fraud, financial loss or other damage. </span><span>The EU Fundamental Rights Agency </span><a href="https://fra.europa.eu/sites/default/files/fra_uploads/fra-2018-opinion-biometric-data-id-cards-03-%202018_en.pdf"><span>found</span></a><span>, in relation to a central national database, that “due to its scale and the sensitive nature of the data which would be stored, the consequences of any data breach could seriously harm a potentially very large number of individuals. 
If such information ever falls into the wrong hands, the database could become a dangerous tool against fundamental rights.”</span></span></span></p> <p><span><span><span>In January 2018, it was </span><a href="https://www.tribuneindia.com/news/nation/rs-500-10-minutes-and-you-have-access-to-billion-aadhaar-%20details/523361.html"><span>reported</span></a><span> that access to the entire Aadhaar database – including the names, addresses, phone numbers, and photographs, but not fingerprint or iris scan data – was being sold for 500 rupees on a WhatsApp group.</span></span></span></p> <p><span><span><span>A </span><a href="https://www.tribuneindia.com/news/nation/rs-500-10-minutes-and-you-have-access-to-billion-aadhaar-%20details/523361.html"><span>breach</span></a><span> of the US government’s Office of Personnel Management – the agency that handles the security clearances of civilian workers – was announced in 2015. The records of 21.5 million people were stolen, including the fingerprints of 5.6 million federal employees. A security expert </span><a href="https://money.cnn.com/2015/07/10/technology/opm-hack-fingerprints/"><span>said</span></a><span> that this put undercover operatives at risk: "A secret agent's name might be different. But they'll know who you are because your fingerprint is there. You'll be outed immediately." The breach of one of the most sensitive biometric databases, maintained by one of the most well-resourced and security-focused governments in the world with advanced access control protocols, raises urgent questions about the ability of less well-resourced actors to appropriately defend against such breaches. 
</span></span></span></p> <p><span><span><span>The risks associated with unauthorised access to biometric data also threaten the effectiveness of counter-terrorism measures, particularly when such breaches are not promptly reported and notified to independent oversight bodies and the individuals concerned.</span></span></span></p> <p><span><span><span>While regular</span><span> risk assessments of the end-to-end process of biometric applications should be fundamental, there is also a need to conduct risk assessments prior to the implementation and application of identification systems based on biometric data, and to embed privacy and security in the design of such systems.</span></span></span></p> <p><span><span><span>Questions which should be asked include:</span></span></span></p> <ul><li><span><span><span>Who holds the responsibility, and so the obligations, for the system (design, deployment, management, auditing, maintenance, etc.)?</span></span></span></li> <li><span><span><span>How do new systems operate in relation to existing non-humanitarian identity systems, i.e. 
national ID systems?</span></span></span></li> <li><span><span><span>What are the unintended consequences in the short-, mid- and long-term?</span></span></span></li> <li><span><span><span>Does the entity deploying the system have the expertise, tools and resources to undertake a well-informed risk assessment and to mitigate the risks identified?</span></span></span></li> <li><span><span><span>What minimum IT security measures should be implemented, and is there financial and technical support for such measures?</span></span></span></li> <li><span><span><span>What IT security measures will be provided in the future, for example if the providing company ceases support?</span></span></span></li> <li><span><span><span>What will happen if any data in the database is breached by various actors?</span></span></span></li> </ul><h2><span><span><strong><span>Who will the data be shared with?</span></strong></span></span></h2> <p><span><span><span>Privacy International recognises the importance and benefit of intelligence sharing in the context of preventing and investigating terrorism or other genuine, serious threats to national security. However, unregulated, unfettered and unwarranted intelligence sharing poses substantive risks to human rights and to the democratic rule of law. </span></span></span></p> <p><span><span><span>Sharing of personal data, such as biometric data, across jurisdictions can put people at high risk, and therefore must be regulated. Such sharing is within the purview of international human rights law, and in particular data protection. 
While such sharing is often said to be in line with existing obligations under international human rights law, there is little actual detail or guidance on how this can be achieved.</span></span></span></p> <p><span><span><span>The UN Compendium </span><a href="https://www.un.org/sc/ctc/wp-content/uploads/2018/06/Compendium-biometrics-final-version-LATEST_18_JUNE_2018_optimized.pdf"><span>identifies</span></a><span> some principles that should regulate such sharing of biometric data, focusing on the necessity of a clear legal framework and the limits on the use of such data. However, its recommended practices clearly favour the maximum sharing of biometric data across borders.</span></span></span></p> <p><span><span><span>Privacy International has published detailed recommendations on what adequate safeguards and oversight over intelligence sharing should look like, which can be accessed </span><a href="https://privacyinternational.org/feature/1742/new-privacy-international-report-reveals-dangerous-lack-oversight-secret-global"><span>here</span></a><span>.</span></span></span></p> <p><span><span><span>Biometric data, because of its sensitivity, requires <a href="https://staging.privacyinternational.org/sites/default/files/2019-07/Submission%20to%20UNCTED_Minimum%20standards%20on%20intelligence%20sharing.pdf">even stricter limitations and safeguards</a> to ensure its sharing across jurisdictions complies with international human rights law. 
Therefore, states must introduce some additional minimum safeguards in order to ensure their intelligence sharing laws and practices are compliant with applicable international human rights law (notably Article 12 of the Universal Declaration of Human Rights and Article 17 of the International Covenant on Civil and Political Rights).</span></span></span></p> <p> </p> <p> </p> <p><span><span><a href="#_ftnref1"><span><span lang="EN-US" xml:lang="EN-US" xml:lang="EN-US"><span><span><span lang="EN-US" xml:lang="EN-US" xml:lang="EN-US"><span>[1]</span></span></span></span></span></span></a> <span><span>The Council of Europe Modernised Convention for the Protection of Individuals with Regard to the Processing of Personal Data (“Convention 108+”), Article 6, provides that the processing of biometric data uniquely identifying a person shall only be allowed where appropriate safeguards are enshrined in law, complementing those of Convention 108+. The EU General Data Protection Regulation (“GDPR”), Article 9, prohibits the processing of biometric data for the purpose of uniquely identifying a natural person subject to limited exceptions. The Brazilian General Data Protection Law (“LGPD”), Federal Law no. 13,709/2018, Article 5, also provides special protections for biometric data.</span></span></span></span></p> <p><span><span><a href="#_ftnref3"><span><span lang="EN-US" xml:lang="EN-US" xml:lang="EN-US"><span><span><span lang="EN-US" xml:lang="EN-US" xml:lang="EN-US"><span>[2]</span></span></span></span></span></span></a> <span><span>The Court emphasised: “...The need for such safeguards is all the greater where the protection of personal data undergoing automatic processing is concerned, not least when such data are used for police purposes. 
The domestic law should notably ensure that such data are relevant and not excessive in relation to the purposes for which they are stored; and preserved in a form which permits identification of the data subjects for no longer than is required for the purpose for which those data are stored ... The domestic law must also afford adequate guarantees that retained personal data was efficiently protected from misuse and abuse ...The above considerations are especially valid as regards the protection of special categories of more sensitive data ...and more particularly of DNA information, which contains the person's genetic make-up of great importance to both the person concerned and his or her family” (S. and Marper v. The United Kingdom, App. Nos. 30562/04 and 30566/04, European Court of Human Rights, Judgment (4 December 2008), para 103)</span></span></span></span></p></div> <div class="field field--name-field-topic field--type-entity-reference field--label-inline"> <div class="field__label">Learn more</div> <div class="field__items"> <div class="field__item"><a href="/topics/biometrics" hreflang="en">Biometrics</a></div> <div class="field__item"><a href="/topics/identity" hreflang="en">Identity</a></div> <div class="field__item"><a href="/topics/surveillance-industry" hreflang="en">Surveillance Industry</a></div> </div> </div> <div class="field field--name-field-issue field--type-entity-reference field--label-inline"> <div class="field__label">What PI is fighting for</div> <div class="field__items"> <div class="field__item"><a href="/what-we-do/id-identity-and-identification" hreflang="en">ID, Identity and Identification</a></div> </div> </div> <div class="field field--name-field-programme field--type-entity-reference field--label-inline"> <div class="field__label">Strategic Area</div> <div class="field__items"> <div class="field__item"><a href="/strategic-areas/safeguarding-peoples-dignity" hreflang="en">Safeguarding Peoples&#039; Dignity</a></div> </div> </div> 
<div class="field field--name-field-attachments field--type-file field--label-inline"> <div class="field__label">Attachments</div> <div class="field__items"> <div class="field__item"> <span class="file file--mime-application-pdf file--application-pdf"> <a href="http://privacyinternational.org/sites/default/files/2019-07/PI%20briefing%20on%20biometrics%20final_0.pdf" type="application/pdf; length=231314">PI briefing on biometrics final_0.pdf</a></span> </div> <div class="field__item"> <span class="file file--mime-application-pdf file--application-pdf"> <a href="http://privacyinternational.org/sites/default/files/2019-07/Submission%20to%20UNCTED_Minimum%20standards%20on%20intelligence%20sharing.pdf" type="application/pdf; length=152617">Submission to UNCTED_Minimum standards on intelligence sharing.pdf</a></span> </div> </div> </div> <div class="field field--name-field-campaign-name field--type-entity-reference field--label-above"> <div class="field__label">What PI is Campaigning on</div> <div class="field__items"> <div class="field__item"><a href="/campaigns/scrutinising-global-counter-terrorism-agenda" hreflang="en">Scrutinising the global counter-terrorism agenda</a></div> </div> </div> Thu, 11 Jul 2019 16:15:29 +0000 staff 3067 at http://privacyinternational.org Briefing to the UN Counter-Terrorism Executive Directorate on biometric data http://privacyinternational.org/advocacy/3066/briefing-un-counter-terrorism-executive-directorate-biometric-data <div class="node node--type-advocacy-briefing node--view-mode-token group-one-column ds-2col-stacked-fluid clearfix"> <div class="group-header"> </div> <div class="group-left"> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p><span>Identification systems across the world increasingly rely on biometric data. 
In the context of border management, security and law enforcement, biometric data can play an important role in supporting the investigation and prevention of acts of terrorism. </span></p> <p><span>This is clearly reflected in UN Security Council resolutions on counter-terrorism. Notably, in Resolution 2396 (2017), the UN Security Council decided that states shall develop and implement systems to collect and share biometric data for purposes of counter-terrorism. Similarly, the 2018 Addenda to the Madrid Guiding Principles note the usefulness of biometric data. </span></p> <p><span>However, biometric data is particularly sensitive and revealing of an individual’s characteristics and identity. As such, it has the potential to be gravely abused.</span><span>1 </span><span>Identification systems relying on biometric data are also vulnerable to security breaches, whose consequences for the individuals concerned, and for the overall security of society, are extremely grave. </span></p> <p><span>The UN 2018 Addenda to the 2015 Madrid Guiding Principles agree that “biometric systems are a legitimate tool for the identification of terrorist suspects, but the expansive technical scope and rapid development of this technology deserves greater attention as it relates to the protection of human rights (including, but not limited to, the right to be free from arbitrary or unlawful interference with privacy).” </span></p> <p><span>This briefing aims to map out some of the implications of the adoption of identification systems based on biometrics.</span></p></div> <div class="field field--name-field-repeating-image-and-text field--type-entity-reference-revisions field--label-inline"> <div class="field__label">Repeating Image and Text</div> <div class="field__items"> <div class="field__item"><div class="paragraph-formatter"><div class="paragraph-info"></div> <div class="paragraph-summary"></div> </div> </div> </div> </div> </div> <div class="group-footer"> <div class="field 
field--name-field-campaign-name field--type-entity-reference field--label-inline"> <div class="field__label">What PI is campaigning on</div> <div class="field__items"> <div class="field__item"><div about="/campaigns/scrutinising-global-counter-terrorism-agenda" id="taxonomy-term-672" class="taxonomy-term vocabulary-campaigns"> <h2><a href="/campaigns/scrutinising-global-counter-terrorism-agenda"> <div class="field field--name-name field--type-string field--label-hidden field__item">Scrutinising the global counter-terrorism agenda</div> </a></h2> <div class="content"> </div> </div> </div> </div> </div> <div class="field field--name-field-programme field--type-entity-reference field--label-inline"> <div class="field__label">Strategic Area</div> <div class="field__items"> <div class="field__item"><div about="/strategic-areas/government-exploitation" id="taxonomy-term-587" class="taxonomy-term vocabulary-programmes"> <h2><a href="/strategic-areas/government-exploitation"> <div class="field field--name-name field--type-string field--label-hidden field__item">Government Exploitation</div> </a></h2> <div class="content"> </div> </div> </div> <div class="field__item"><div about="/strategic-areas/safeguarding-peoples-dignity" id="taxonomy-term-586" class="taxonomy-term vocabulary-programmes"> <h2><a href="/strategic-areas/safeguarding-peoples-dignity"> <div class="field field--name-name field--type-string field--label-hidden field__item">Safeguarding Peoples&#039; Dignity</div> </a></h2> <div class="content"> <div class="clearfix text-formatted field field--name-description field--type-text-long field--label-hidden field__item"><p>The promises made with innovation have not been enjoyed by all equally. Innovative solutions can be designed to empower and serve individuals and communities, rather than state and corporate power. A new approach to data and technology must be established to make this a reality. 
</p></div> </div> </div> </div> </div> </div> <div class="field field--name-field-issue field--type-entity-reference field--label-inline"> <div class="field__label">What PI is fighting for</div> <div class="field__items"> <div class="field__item"><div about="/what-we-do/id-identity-and-identification" id="taxonomy-term-666" class="taxonomy-term vocabulary-issue"> <h2><a href="/what-we-do/id-identity-and-identification"> <div class="field field--name-name field--type-string field--label-hidden field__item">ID, Identity and Identification</div> </a></h2> <div class="content"> </div> </div> </div> </div> </div> </div> </div> Thu, 11 Jul 2019 16:05:11 +0000 staff 3066 at http://privacyinternational.org Privacy International is joining migrant organisations to challenge the UK's "immigration control" data protection exemption - find out why! http://privacyinternational.org/news-analysis/3064/privacy-international-joining-migrant-organisations-challenge-uks-immigration <div class="layout layout--onecol"> <div class="layout__region layout__region--content"> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p><span><span><span>Increasingly every interaction migrants have within the immigration enforcement framework requires the processing of their personal data. 
The use of this data and of new technologies is today driving a revolution in immigration enforcement, which risks undermining people's rights and requires urgent attention.</span></span></span></p> <p><span><span><span>This is why Privacy International, and several migrant and digital rights organisations, joined <a href="https://picum.org/press-release-advocates-bring-first-gdpr-complaint-to-eu-against-uk-data-protection-law-for-violating-data-rights-of-foreigners/">a formal complaint filed</a> by the Platform for International Cooperation on Undocumented Migrants (PICUM) against the United Kingdom for failing to respect the General Data Protection Regulation (GDPR) by including an "immigration control" exemption in the Data Protection Act adopted in 2018.</span></span></span></p> <p><span><span><strong><span>Data processing for immigration purposes</span></strong></span></span></p> <p><span><span><span>At the border and beyond, vast amounts of data are being collected and processed to track and identify people for immigration purposes, from <a href="https://privacyinternational.org/feature/2079/updated-thomson-reuters-selling-us-immigration-and-customs-enforcement-ice-access-data">utility data</a> to people's <a href="https://privacyinternational.org/feature/2731/questions-new-company-vying-border-dominance-us-needs-answer">social media accounts</a>, and even the <a href="https://privacyinternational.org/long-read/2776/surveillance-company-cellebrite-finds-new-exploit-spying-asylum-seekers">entire contents of people's phones</a>. 
</span></span></span></p> <p><span><span><span>Within borders, government bodies - those with and those without mandates related to immigration enforcement - are collecting, analysing, sharing and accessing vast amounts of personal data and inferring intelligence, on the basis of which life-changing decisions are made, including eligibility for entry and residence and to <a href="https://www.libertyhumanrights.org.uk/policy/policy-reports-briefings/guide-hostile-environment-border-controls-dividing-our-communities-%E2%80%93">access public benefits</a>, amongst others. All of this is supported by <a href="https://privacyinternational.org/feature/2216/who-supplies-data-analysis-and-tech-infrastructure-us-immigration-authorities">an industry making millions</a> from selling data, analysis, and infrastructure to immigration enforcement authorities.  </span></span></span></p> <p><span><span><span>These practices impact all foreign residents of a country, including economic migrants, asylum seekers, refugees, seasonal workers, and undocumented migrants. 
Not only do these practices interfere with their rights, they potentially allow government authorities to interfere with the rights of UK citizens as long as they can claim they are doing so for immigration control purposes.</span></span></span></p> <p><span><span><span>In view of existing and expanding data processing policies and practices for immigration purposes, there is an urgent need to regulate and monitor entities who undertake or are involved in the processing of migrants' data to ensure they comply with internationally recognised data protection principles and standards, as well as human rights.</span></span></span></p> <p><span><span><strong><span>What does the complaint challenge?</span></strong></span></span></p> <p><span><span><span>Schedule 2, Part 1, paragraph 4 of the UK's Data Protection Act includes a broad exemption for “the maintenance of effective immigration control” or “the investigation or detection of activities that would undermine the maintenance of immigration control.” This was one of the main shortcomings of the Bill as it was being passed through the UK parliament, which Privacy International and other Civil Society Organisations (CSOs) <a href="https://privacyinternational.org/advocacy-briefing/1863/privacy-internationals-proposed-amendments-data-protection-bill-report-stage">challenged</a> during the various stages of the drafting of the law.</span></span></span></p> <p><span><span><span>This clause provides for a broad and wide-ranging exemption which is open to abuse and should be removed altogether. 
Instead, there already exist other exemptions within the Act that the immigration authorities can seek to rely on for the processing of personal data in accordance with their statutory duties or in the case of an offence.</span></span></span></p> <p><span><span><span>Fundamentally, the GDPR was drafted to strengthen the rights of individuals with regard to the protection of their data, impose more stringent obligations on those processing personal data, and provide for stronger regulatory enforcement powers. This broad exemption goes against these core principles and the standards of data protection and human rights. It curtails the rights of individuals and removes the obligations data processors should be subject to in their data processing activities.</span></span></span></p> <p><span><span><strong><span>What should the UK do?</span></strong></span></span></p> <p><span><span><span>In order for the United Kingdom to be compliant with the GDPR and other internationally recognised data protection standards, as well as to respect its obligations under other national and international frameworks upholding the rights of migrants, it must strike out this broad exemption.</span></span></span></p> <p><span><span><span>We look forward to working with PICUM, other co-submitters, and UK organisations working on the rights of migrants and on digital rights issues as part of this complaint, and <a href="https://www.openrightsgroup.org/blog/2018/what-is-at-stake-with-the-immigration-exemption-legal-challenge">other advocacy and legal initiatives</a> on-going in the UK.</span></span></span></p> <p><span><span><span>As we wait to hear back from the European Commission about the complaint, PI will continue to research, raise awareness, and help bring change to make sure fairness, human rights, and security are built into the next generation of data-driven immigration enforcement.</span></span></span></p></div> <div class="field field--name-field-large-image 
field--type-image field--label-above"> <div class="field__label">Large Image</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-07/Thin_Purple_Line_Patch.jpg" width="1200" height="675" alt="Uk immigration" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-list-image field--type-image field--label-above"> <div class="field__label">List Image</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-07/Thin_Purple_Line_Patch_0.jpg" width="1200" height="675" alt="Uk immigration" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-list-icon field--type-image field--label-above"> <div class="field__label">List Icon</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-07/Thin_Purple_Line_Patch_1.jpg" width="1200" height="675" alt="Uk immigration" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-issue field--type-entity-reference field--label-above"> <div class="field__label">What PI is fighting for</div> <div class="field__items"> <div class="field__item"><a href="/what-we-do/demand-humane-approach-immigration" hreflang="en">Demand a Humane Approach to Immigration</a></div> </div> </div> <div class="field field--name-field-programme field--type-entity-reference field--label-above"> <div class="field__label">Strategic Area</div> <div class="field__items"> <div class="field__item"><a href="/strategic-areas/safeguarding-peoples-dignity" hreflang="en">Safeguarding Peoples&#039; Dignity</a></div> </div> </div> <div class="field field--name-field-target field--type-entity-reference field--label-above"> <div class="field__label">Target Stakeholders</div> <div class="field__items"> <div class="field__item"><a href="/target/government" hreflang="en">Government</a></div> </div> </div> <div class="field field--name-field-campaign-name field--type-entity-reference field--label-above"> <div class="field__label">What PI is Campaigning on</div> 
<div class="field__items"> <div class="field__item"><a href="/campaigns/protecting-migrants-borders-and-beyond" hreflang="en">Protecting migrants at borders and beyond</a></div> </div> </div> </div> </div> Wed, 10 Jul 2019 19:50:51 +0000 staff 3064 at http://privacyinternational.org Deja Vu: UK's Science and Technology Committee suggests that everyone is made to have a unique ID number http://privacyinternational.org/news-analysis/3048/deja-vu-uks-science-and-technology-committee-suggests-everyone-made-have-unique <div class="layout layout--onecol"> <div class="layout__region layout__region--content"> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>In July 2019, the UK House of Commons' Science and Technology Committee published a report on Digital Government. Lying not so subtly amongst its <a href="https://publications.parliament.uk/pa/cm201719/cmselect/cmsctech/1455/1455.pdf">recommendations</a> is this: "<em>The Government should facilitate a national debate on single unique identifiers for citizens to use for accessing public services along with the right of the citizen to know exactly what the Government is doing with their data."</em> </p> <p>It's been <a href="https://news.sky.com/story/mps-call-for-national-debate-over-governments-digital-id-card-plans-11760246">pointed out</a> that this is basically some of the worst features of an ID card without the card. </p> <p>Britain has a history of opposition to these types of systems. As a British judge <a href="https://www.privacyinternational.org/legal-action/nubian-rights-forum-and-others-v-attorney-general-kenya">said</a> in 1992, "if the information obtained by the police, the Inland Revenue, the social security services, the health service and other agencies were to be gathered together in one file, the freedom of the individual would be gravely at risk. 
The dossier of private information is the badge of the totalitarian state."</p> <p>As the Science and Technology Committee pointed out in their report, the mid-2000s saw debate over the use of such unique identifiers in the ID number. The work conducted by the <a href="http://www.lse.ac.uk/management/research/identityproject/">London School of Economics</a> at this time highlighted many of the concerns, including how this can be used to profile individuals across multiple datasets. This power to see a 360-degree view of the individual is ripe for abuse. Fear of these systems is not abstract, but rather part of the reality of life in the UK for many. See, for example, how the sharing of data by hospitals and schools with the Home Office <a href="https://www.libertyhumanrights.org.uk/challenge-hostile-environment-data-sharing">facilitated the "hostile environment" policy</a> that caused such damage.</p> <p>The debate shouldn't be about having insight into how your identifier is used. It should be about making sure that identifiers are never usable.</p> <p>After all, any unique identifier will not be limited to government use. Whether through design or commercial necessity, any such number will also find its way into the private sector. This was <a href="http://www.lse.ac.uk/management/research/identityproject/">another fear highlighted</a> in the mid-2000s, but it has played out elsewhere. 
For example, the Indian Supreme Court, in its <a href="https://www.privacyinternational.org/long-read/2299/initial-analysis-indian-supreme-court-decision-aadhaar">ruling</a> on the Aadhaar system, which provided a unique number to more than a billion people, warned of the dangers of its use in the private sector: "Allowing private entities to use Aadhaar numbers will lead to commercial exploitation of an individual’s personal data without his/her consent and could lead to individual profiling."</p> <p>Given everything that's happened since, the 13 years since the 2006 ID Card Act (which was repealed in 2010) can seem like a lifetime. But it's clear that the concerns expressed then remain prescient now. Now that we know so much more about the risks that the exploitation of people's data poses - and the targeting, profiling and manipulation of individuals and groups - we should be even more fearful of such a system today than we were a decade ago. Furthermore, it's been shown that <a href="//eprints.lse.ac.uk/90577/1/Whitley_Trusted%20digital%20ID_2018.pdf,">we do not need</a> such a unique identifier for people to securely access government services online, and it's on such concepts that we must build going forward.</p> <p>Soon, there will be a further consultation on the future of digital identity in the UK. It's essential that our civil liberties are not infringed by these technologies, and that we do not take a step backwards through ideas like a single unique identification number. Rather, we should be looking at how we can further protect people while they access services going forward.</p> <p>So our question in return to the committee is this: can they foresee an identity system, involving identifiers, that is never used to create hostile environments for migrants, never targets people on benefits, and is never used by the private sector to profile and abuse people? 
Now that would be a fun debate.</p> <p><em>[Image source: <a href="https://commons.wikimedia.org/wiki/File:Civil_Defence_in_Britain-_National_Registration_Identity_Card_D1887.jpg">Imperial War Museum</a>]</em></p></div> <div class="field field--name-field-large-image field--type-image field--label-above"> <div class="field__label">Large Image</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-07/Civil_Defence_in_Britain-_National_Registration_Identity_Card_D1887.jpg" width="800" height="580" alt="Vintage identity card" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-list-image field--type-image field--label-above"> <div class="field__label">List Image</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-07/Civil_Defence_in_Britain-_National_Registration_Identity_Card_D1887_0.jpg" width="800" height="580" alt="Vintage Identity card" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-list-icon field--type-image field--label-above"> <div class="field__label">List Icon</div> <div class="field__item"> <img src="/sites/default/files/flysystem/2019-07/Civil_Defence_in_Britain-_National_Registration_Identity_Card_D1887_1.jpg" width="800" height="580" alt="Vintage identity card" typeof="foaf:Image" /> </div> </div> <div class="field field--name-field-issue field--type-entity-reference field--label-above"> <div class="field__label">What PI is fighting for</div> <div class="field__items"> <div class="field__item"><a href="/what-we-do/id-identity-and-identification" hreflang="en">ID, Identity and Identification</a></div> </div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Learn more</div> <div class="field__items"> <div class="field__item"><a href="/topics/identity" hreflang="en">Identity</a></div> </div> </div> <div class="field field--name-field-location-region-locale field--type-entity-reference 
field--label-above"> <div class="field__label">Location</div> <div class="field__items"> <div class="field__item"><a href="/location/united-kingdom" hreflang="en">United Kingdom</a></div> </div> </div> <div class="field field--name-field-programme field--type-entity-reference field--label-above"> <div class="field__label">Strategic Area</div> <div class="field__items"> <div class="field__item"><a href="/strategic-areas/safeguarding-peoples-dignity" hreflang="en">Safeguarding Peoples&#039; Dignity</a></div> </div> </div> <div class="field field--name-field-campaign-name field--type-entity-reference field--label-above"> <div class="field__label">What PI is Campaigning on</div> <div class="field__items"> <div class="field__item"><a href="/campaigns/demanding-identity-systems-our-terms" hreflang="en">Demanding identity systems on our terms</a></div> </div> </div> </div> </div> Wed, 10 Jul 2019 18:17:50 +0000 staff 3048 at http://privacyinternational.org