The US Department of Homeland Security awarded a $113 million contract to General Dynamics to carry out the Visa Lifecycle Vetting Initiative (VLVI), a renamed version of the Extreme Vetting Initiative and part of a larger effort called the National Vetting Enterprise. In May 2018, public outrage led the DHS to back away from a machine learning system that would monitor immigrants continuously; however, the reason it gave was that the technology to automate vetting did not yet exist. These…
VeriPol, a system developed at the UK's Cardiff University, analyses the wording of victim statements in order to help police identify fake reports. By January 2019, VeriPol was in use by Spanish police, who said it helped them identify 64 false reports in one week and was successful in more than 80% of cases. The basic claim is that AI can find patterns common to false statements; among the giveaways, experts say, false statements are likely to be shorter than genuine ones, focus…
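The description suggests a supervised text classifier trained on wording patterns. Below is a purely illustrative sketch of that kind of system, assuming a TF-IDF bag-of-words model and an invented toy dataset; VeriPol's actual features, model, and training data are not public.

```python
# Illustrative sketch only: a bag-of-words classifier over report texts,
# in the spirit of systems like VeriPol. The real model and training data
# are not public; the tiny labelled dataset here is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled statements: 1 = judged false, 0 = judged genuine.
statements = [
    "I was pickpocketed near the station around 6 pm; a witness saw it.",
    "Two men attacked me from behind; I saw nothing.",
    "My phone was stolen from my bag on the number 12 bus at 8:15.",
    "Someone broke the rear window and took the radio.",
]
labels = [0, 1, 0, 1]

# TF-IDF captures wording patterns; the reported giveaways (shorter,
# vaguer statements with fewer concrete details) would surface as such
# patterns in a real training set.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(statements, labels)

# Score a new statement: output is the estimated probability it is false.
print(model.predict_proba(["My wallet was taken; I didn't see anyone."])[0, 1])
```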
New workplace technologies are generating mountains of data on workers despite a lack of clarity over how the data is used and who owns it. In offices, smart badges track interactions and sensors track fitness and health; in trucks, sensors monitor drivers' performance in the name of safety. In the US state of Illinois, between July and October 2017, 26 lawsuits were filed by employees alleging that their employers had violated the state's Biometric Information Privacy Act, which requires a…
After four years of negotiation, in 2017 Google began paying Mastercard millions of dollars for access to the latter's piles of transaction data as part of its "Store Sales Measurement" service. Google, which claimed to have access to 70% of US credit and debit cards through partners, said that double-blind encryption prevents either partner from seeing the other's users' personally identifiable information. Mastercard said the company shares transaction trends with merchants and their service…
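"Double-blind encryption" is not further specified here. One common construction for this kind of private join is commutative (Diffie-Hellman-style) blinding, sketched below with toy, insecure parameters; nothing in this sketch is drawn from the actual Google-Mastercard protocol, which has not been published.

```python
# Sketch of one standard "double-blind" construction: DH-style commutative
# blinding, often used for private joins. The actual Google-Mastercard
# protocol is not public; parameters here are toy-sized for illustration.
import hashlib
import secrets

P = 2**127 - 1   # small demo prime, NOT cryptographically adequate
G = 5            # generator for the demo group

def h(identifier: str) -> int:
    """Hash an identifier (e.g. a hashed email or card token) to an exponent."""
    return int.from_bytes(hashlib.sha256(identifier.encode()).digest(), "big")

def blind(value: int, secret_key: int) -> int:
    """One party's blinding step: raise to its private exponent mod P."""
    return pow(value, secret_key, P)

# Each party holds a private key it never shares.
key_a = secrets.randbelow(P - 2) + 1   # e.g. the ad platform
key_b = secrets.randbelow(P - 2) + 1   # e.g. the card network

record = "user@example.com"            # hypothetical shared join key
base = pow(G, h(record) % (P - 1), P)

# Blinding commutes: A-then-B equals B-then-A, so both sides can match
# doubly-blinded values without ever seeing each other's raw identifiers.
assert blind(blind(base, key_a), key_b) == blind(blind(base, key_b), key_a)
```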
By September 2018, at least five local English councils had developed or implemented predictive analytics systems incorporating the data of at least 377,000 people with the intention of preventing child abuse. Advocates of these systems argue that they help councils struggling under budget cuts to better target their limited resources. The Hackney and Thurrock councils contracted the private company Xantura to develop a predictive model for them; Newham and Bristol have developed their own…
By 2018, the Danish municipality of Gladsaxe, in the Copenhagen area, had begun identifying children at risk of abuse so that flagged families could be targeted for early intervention. It did so by applying a set of specially designed algorithms to information already gathered by the centralised Udbetaling Danmark government agency, including health records and employment information, all linked to the personal identification number issued to each Dane at birth. The system raised concerns about mission creep; both the…
In October 2018, the Singapore-based startup LenddoEFL was one of a group of microfinance startups aimed at the developing world that used non-traditional types of data such as behavioural traits and smartphone habits for credit scoring. Lenddo's algorithm uses numerous data points, including the number of words a person uses in email subject lines, the percentage of photos in a smartphone's library that were taken with a front-facing camera, and whether they regularly use financial apps on…
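To make the idea concrete, here is a minimal sketch of how behavioural signals of this kind could feed a logistic scoring model. The features, weights, and scoring rule are all invented for illustration; Lenddo's actual model is proprietary.

```python
# Toy sketch of credit scoring on non-traditional behavioural features.
# Feature names and weights are hypothetical, not Lenddo's.
import math

def credit_score(features: dict) -> float:
    """Map behavioural signals to a 0-1 score via a toy logistic model."""
    weights = {                       # hypothetical learned weights
        "avg_subject_line_words": 0.08,
        "selfie_ratio": -0.5,         # share of photos from the front camera
        "uses_financial_apps": 1.2,
    }
    bias = -0.3
    z = bias + sum(weights[k] * features.get(k, 0.0) for k in weights)
    return 1.0 / (1.0 + math.exp(-z))  # logistic link: higher = lower risk

applicant = {"avg_subject_line_words": 4,
             "selfie_ratio": 0.35,
             "uses_financial_apps": 1}
print(round(credit_score(applicant), 3))
```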
In late 2017, the residents of the small town of Santa Maria Tonantzintla, about three hours from Mexico City, discovered their town was intended to become a pilot smart city in a collaboration between the state of Puebla and the organisation Alianza Smart Latam. The residents, who had already seen the town lose its distinctive cobblestones, its clock tower, and a stucco bridge in the early stages of the project, filed an injunction to halt the project, which had failed to obtain the…
In October 2018, documents released in response to a FOIA request filed by the Project on Government Oversight revealed that in June 2018 Amazon pitched its Rekognition facial recognition system to US Immigration and Customs Enforcement officials as a way to help them target or identify immigrants. Amazon has also marketed Rekognition to police departments, and it is used by officers in Oregon and Florida even though tests have raised questions about its accuracy. Hundreds of Amazon workers protested by writing a…
In October 2018, in response to questions from a committee of MPs, the UK-based Student Loans Company defended its practice of using "public" sources such as Facebook posts and other social media activity as part of the process of approving loans. In one case earlier in the year, a student was told that a parent's £70 Christmas present disqualified them from a maintenance loan without means testing because it showed the student was not estranged from their family. SLC insisted…
In November 2018, tests began of the €4.5 million iBorderCtrl project, which saw AI-powered lie detectors installed at airports in Hungary, Latvia, and Greece to question passengers travelling from outside the EU. The AI questioner was set to ask each passenger to confirm their name, age, and date of birth, and then query them about the purpose of their trip and who is paying for it. If the AI believes the person is lying, it is designed to change its tone of voice to become "more skeptical"…
In November 2018, 112 civil liberties, immigrant rights, child welfare, and privacy groups wrote a letter to the heads of the US Department of Health and Human Services and the Department of Homeland Security demanding an immediate halt to the HHS Office of Refugee Resettlement's practice of using information given to them by detained migrant children to arrest and deport their US-based relatives and other sponsors. The policy began in April 2018, and the result has been that…
In November 2018 the UK's Equality and Human Rights Commission warned that asylum seekers in Scotland and Wales have been deterred from seeking medical help since 2017, when the UK government began forcing the English NHS to charge upfront, and by fears that medical personnel will comply with Home Office orders to forward their data. The commission, along with health charities and the Labour and LibDem political parties, called for the policy to be suspended. The Home Office policy of moving asylum…
The Home Office Christmas 2018 announcement of the post-Brexit registration scheme for EU citizens resident in the UK included the note that the data applicants supplied might be shared with other public and private organisations "in the UK and overseas". Citing Section 31 of the Freedom of Information Act, the Home Office refused The3Million's FOI request for the identity of those organisations. A clause in the Data Protection Act 2018 exempts the Home Office from…
In December 2018, a report, "Access to Cash", written by the former financial ombudsman Natalie Ceeney and independent from but paid for by the cash machine network operator Link, warned that the UK was at risk of sleepwalking into a cashless society and needed to protect an estimated 8 million people (17% of the British population) who would become disadvantaged as a result. Although cash use halved between 2007 and 2017, and debit cards passed cash in share of retail transactions in 2017,…
In October 2018, British home secretary Sajid Javid apologised to more than 400 migrants, including Gurkha soldiers and Afghans who had worked for the British armed forces, who had been forced to provide DNA samples when applying to live and work in the UK. DNA samples are sometimes provided by applicants to prove their relationship to someone already in the UK, but are not supposed to be mandatory. An internal review indicated that more people than the initially estimated 449 had received DNA…
In December 2018, Florida citizen Peter Sean Brown filed a federal lawsuit against the Monroe County Sheriff's Office for arresting and detaining him for three weeks on the claim that he was an illegal alien from Jamaica. Even though Brown offered to show the sheriff his birth certificate, explained he had been wrongfully detained 20 years before, and the jail's own records listed his birthplace as Philadelphia, PA, the sheriff relied on a form sent by Immigration and Customs Enforcement. Brown…
In December 2018, in the wake of the Windrush scandal, the National Police Chiefs' Council, which represents police chiefs across England and Wales, agreed to cease passing information about people suspected of being in the country illegally to deportation authorities. The measures also bar officers from consulting the police national computer solely to check on immigration status. Police said they believed that their too-close relationship with immigration authorities in aid of the government's "…
In a report released in December 2018, the UK's National Audit Office examined the management of information and immigrant casework at the Home Office that led to the refusal of services, detention, and removal of Commonwealth citizens who came to the UK and were granted indefinite leave to remain between 1948 and 1973 - the so-called "Windrush generation" - but were never given documentation to prove their status. The NAO concluded that the Home Office failed to adequately consider its duty of care in…
In a November 2018 report based on a year's study of the use of data scores, Data Justice Lab provided a comprehensive look at the use of data-driven citizen scoring in government, particularly focusing on six case studies drawn from local councils in the UK. The report noted that there was no systematic information about where and how these systems were being deployed, no standard practices or common approaches, and wide variation in local government transparency. While some councils develop…
In November 2018, worried American parents wishing to check out prospective babysitters and dissatisfied with criminal background checks began paying $24.99 for a scan from the online service Predictim, which claimed to use "advanced artificial intelligence" to offer an automated risk rating. Predictim based its scores in part on Facebook, Twitter, and Instagram posts - applicants were required to share broad access to their accounts - and offered no explanation of how it reached its risk…
In November 2018 reports emerged that immigrants heading north from Central America to the US border are more and more often ensuring they are accompanied by children because "family units" are known to be less likely to be deported, at least temporarily, and smugglers charge less than half as much when a minor is in the group because the risk to them is lower. Some parents have given their children - sometimes for cash - to other adults such as a relative, a godparent, or even an unrelated person.…
In November 2018, researchers at Sweden's University of Lund, the US's Worcester Polytechnic Institute, and the UK's Oxford University announced that in August the US State Department had begun using a software program they had designed, known as "Annie MOORE", which uses AI to find the best match for a refugee's needs, including access to jobs, medical facilities, schools, and nearby migrants who speak the same language. Refugees matched by the program were finding jobs within 90 days about a…
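Matching refugees to resettlement locations can be framed as an assignment problem. The sketch below, using SciPy's linear_sum_assignment, illustrates only that core idea; the suitability scores are invented, and Annie MOORE's actual formulation is assumed here to be richer (site capacities, predicted employment, multiple criteria).

```python
# Core idea only: refugee-to-site matching as an assignment problem.
# Scores are invented; the real system's objective and constraints differ.
import numpy as np
from scipy.optimize import linear_sum_assignment

# score[i][j]: predicted suitability of site j for refugee case i,
# e.g. combining job prospects, medical access, and language community.
score = np.array([
    [0.7, 0.2, 0.5],
    [0.3, 0.9, 0.4],
    [0.6, 0.1, 0.8],
])

# Maximise total suitability (the solver minimises, so negate the scores).
rows, cols = linear_sum_assignment(-score)
for case, site in zip(rows, cols):
    print(f"case {case} -> site {site} (score {score[case, site]})")
```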
In November 2018 the UK Information Commissioner's Office issued an enforcement notice against London's Metropolitan Police, finding that there had been multiple and serious breaches of data protection law in the organisation's use of the Gangs Violence Matrix, which it had operated since 2012. The ICO documented failures of oversight and coherent guidance, and an absence of basic data protection practices such as encryption and agreements covering data sharing. Individuals whose details are…
As early as 2008, the Chinese telecommunications giant ZTE began helping Venezuela develop a system similar to the identity system used in China to track social, political, and economic behaviour. By 2018, Venezuela was rolling out its "carnet de la patria", a smart-card "fatherland" ID card that was being increasingly linked to the government-subsidised health, food, and other social programmes most Venezuelans relied on for survival. In 2017, Venezuela hired ZTE to build a comprehensive…
After an 18-month investigation involving interviews with 160 life insurance companies, in January 2019 the New York Department of Financial Services, the state's top financial regulator, announced it would allow life insurers to use data from social media and other non-traditional sources to set premium rates for their customers. Insurers will be required to demonstrate that their use of the information doesn't unfairly discriminate against specific customers. New York is the first state to issue specific…
A study published in February 2019 found that 95% of the potential predictive accuracy for an individual can be achieved solely by analysing their social ties; the person's own data is not needed. Given as few as eight or nine of an individual's contacts, it should be possible to profile the individual even if they don't use the platform. This has profound privacy implications for protesters, elections, and misinformation campaigns.
https://www.nature.com/articles/s41562-018-0510-5.epdf
Writer: James P.…
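A toy illustration of the "shadow profile" implication: inferring something about a person who is not on the platform purely from their contacts' data. The attribute and values below are invented, and the study itself measured information-theoretic predictability from contacts' posts rather than anything this simple.

```python
# Toy "shadow profile" sketch: guess a non-user's attribute from the
# observed attributes of their contacts. All data here is invented.
from collections import Counter

# Observed home cities of 8 known contacts of someone not on the platform.
contacts_city = ["Leeds", "Leeds", "York", "Leeds", "Leeds",
                 "Leeds", "Sheffield", "Leeds"]

# Even a crude aggregate (majority vote) yields a confident guess; the
# study shows contacts' data carries ~95% of the achievable accuracy.
guess, count = Counter(contacts_city).most_common(1)[0]
print(f"Predicted city: {guess} ({count} of {len(contacts_city)} contacts)")
```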
A study published in January 2019 found that a form of facial recognition technology that interprets emotions in facial expressions assigns more negative emotions to black men's faces than to white men's faces. The problem is the latest in a series of ways that facial recognition has failed for non-white subjects. The study used the official photographs of 400 professional NBA basketball players and found two types of bias: black faces were consistently scored as angrier than white faces for every…
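The study's method amounts to a group-level audit of a model's outputs on matched photo sets. A minimal sketch, with invented scores standing in for the emotion API's real outputs:

```python
# Sketch of the kind of audit the study performed: compare a model's mean
# "anger" score across groups on comparable photos. Scores are invented.
import statistics

anger_scores = {   # hypothetical per-photo anger scores from a model
    "black_players": [0.31, 0.27, 0.35, 0.29, 0.33],
    "white_players": [0.18, 0.22, 0.16, 0.20, 0.19],
}

means = {g: statistics.mean(s) for g, s in anger_scores.items()}
gap = means["black_players"] - means["white_players"]
print(f"mean anger by group: {means}; gap = {gap:.2f}")
# A persistent positive gap across smile levels is the bias signature
# the researchers reported.
```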
In 2018, technology companies and medical providers were experimenting with machine learning and AI to mine health records and online posts to identify patterns linked to suicide, hoping to be able to predict, and therefore prevent, suicide attempts. On the government side, a pilot programme conducted by the US Department of Veterans Affairs, REACH VET, attempted to identify veterans at high risk for self-harm. On the commercial side, companies such as Facebook began experimenting with suicide…
In December 2018, Facebook provided an update on the civil rights audit it asked civil rights leader Laura Murphy to undertake in May. Based on advice Murphy culled from 90 civil society organisations, Facebook said it had expanded its policy prohibiting voter suppression, updated its policy to ban misrepresentation about how to vote, begun sending information about voting to third-party fact checkers for review, and was ramping up efforts to encourage voter registration and engagement.
https…