Content Type: Examples
US Immigration and Customs Enforcement is using an obscure administrative subpoena called a “1509”, intended for use only in criminal investigations of illegal imports or unpaid customs duties. Most requests have sought records from telecommunications companies, technology firms, money transfer services, airlines, and others, but in outlier cases ICE has used 1509s to obtain records from a Texas youth soccer league, a Georgia elementary school, boards of elections, two news organisations and a…
Content Type: Advocacy
In January 2024, PI responded to the call for input to the report by the UN High Commissioner for Human Rights on the impact of arms transfers on human rights, with a focus on the role of access to information in preventing, mitigating, and responding to the negative human rights impact of arms transfers - offering our experiences of navigating access to information regimes in the UK and the EU.
Access to information laws and processes are crucial in uncovering details of states' capabilities…
Content Type: Examples
Wisconsin schools use a racially discriminatory Dropout Early Warning System built by the state to identify incoming 9th graders who may be at risk of failing to graduate on time in order to offer them help. The system’s machine learning algorithms make their assessments based on test scores, disciplinary records, lunch price status, and race. In a study of millions of predictions over a decade, The Markup finds that the system may be wrongly and negatively influencing teachers’ impressions of…
Content Type: Advocacy
Privacy International welcomed the opportunity to provide input to the study of the UN Human Rights Council Advisory Committee on the human rights implications of new and emerging technologies in the military domain (NTMD), to be presented to the Human Rights Council at its sixtieth session. In the course of our work, we observe that the line between military and civilian technologies is blurring. Governments are increasingly relying on the very same technologies for military and civilian uses.…
Content Type: People
Angelina is a Technology Advocacy Officer at Privacy International. She works primarily on projects related to: Health, Migration, Data Brokerage and Black Box Management. She’s interested in the life cycle of the digital sphere, particularly how algorithms direct users’ experiences with technology and with each other, as well as the impact of AI and automated decision-making on human behaviour and attitudes. Prior to PI, Angelina worked in the climate, tech and policy sectors. She holds a B.A.…
Content Type: Examples
A new report finds that monitoring software is in wide use in US K-12 schools, and that teachers, parents, and students generally believe the benefits outweigh the risks while still expressing some privacy and equity concerns. The authors recommend transparency, data minimisation, and mitigation of inequitable results stemming from this monitoring. The authors also recommend that schools should retain control of the data and build capacity within the school system and surrounding communities as…
Content Type: Examples
An investigation finds that using search tools provided by the College Board, the organisation that administers SATs and Advanced Placement exams for university-bound students, prompts it to send details of SAT scores, grade point averages, and other data to Facebook, TikTok, and other companies via pixels embedded in its site. The tools help students find colleges that accept students with specific grades or test scores. College Board says the pixels are merely there to measure the…
Content Type: Examples
The spread of edtech has not, as hoped, levelled the playing field but widened the gap in skills between children of affluence and children of poverty, a new study finds. Removing problems of access - for example, by placing computers in public libraries - doesn't solve this because, given access, rich kids and poor kids use technology differently, often because children of affluence have more guidance from adults who help solve frustrating problems and steer them towards educational resources. The net…
Content Type: Examples
A student in Minneapolis was outed when school administrators contacted their parents after surveillance software found LGBTQ keywords in their writing on a school-supplied laptop. The risk of many more such cases is increasing as the use of edtech spreads, fuelled by the pandemic, and as legislation, lawsuits, and pressure campaigns push schools to implement anti-LGBTQ policies. Software such as Gaggle, which surveils school computers and student accounts, constantly monitors students…
Content Type: Examples
Experiments with personalised technology-mediated learning have been successful in the controlled environment of charter schools, but now must prove their worth in traditional district schools with much larger class sizes, more rigid schedules, long-established teaching and learning cultures, and the pressures imposed by standardised tests. In a pilot, the California-based charter school network Summit Public Schools is offering its tools, training, and support for free to help other schools…
Content Type: Examples
The Innovation Academy in Sunrise, Arizona, is experimenting with offering 90 sixth- through eighth-graders self-paced computerised lessons that generate data four teachers can use to monitor their progress, spot students who need help, and develop small-group activities. Key to the programme is "little data" that helps students understand their strengths and weaknesses and develop personalised learning plans. The Dysart district, where Sunrise is located, hopes to expand the programme to all…
Content Type: Examples
Many US schools give students tablets, but the key to their successful use is providing data plans. The US's "homework gap" is the lack of at-home Internet access that keeps many children from being able to benefit from the many investments in edtech that are being made. In a new initiative, Qualcomm is working with other leaders in wireless technology to create the equivalent of a reduced-lunch plan for data. Article: Data plans key to schools' success with tablets Publication:…
Content Type: Examples
Chromebooks, which many schools purchased at the beginning of the pandemic because of their lower cost compared to PCs and Macs, are proving expensive as their prices rise, the cost of repairs bites, and Google's expiration policy means many models are about to become e-waste. A study from US PIRG finds that doubling the Chromebooks' lifespan could save public schools $1.8 billion. Older Macs and PCs, by contrast, can go on being used and have resale value. Article: Chromebooks expire to…
Content Type: Examples
Google is working to extend the lifespan of Chromebooks by providing software updates for up to a decade. The new policy, which will begin in 2024, will ensure that no current Chromebook expires in the next two years. The expiration dates were proving expensive for schools, which were having to spend millions of dollars on replacements because unsupported Chromebooks can't be used for mandatory state testing. Article: Google extends life of Chromebooks Publication: Wall Street Journal Writer…
Content Type: Examples
Months after a District Court judgment that Cleveland State University violated student privacy by requiring the use of an online proctoring service that required a scan of students' rooms, some professors at California colleges were still using such software for remote exams. Privacy rights campaigners argue that the software is invasive and discriminatory and a violation of the Fourth Amendment; e-proctoring companies reply that the data they collect is limited. Article: California colleges use…
Content Type: Examples
New research shows that schools' scramble to adopt new technologies has given for-profit companies a massive opening into the data of young people's everyday lives and created an $85 billion industry that has brought security and privacy risks for all concerned. Schools, meanwhile, lack the resources and knowledge to manage security vulnerabilities. Article: Edtech gives technology companies portal into students' lives Publication: LA School Report Writer: Mark Keierleber…
Content Type: Examples
The New York State Department of Education has prohibited schools in the state from purchasing or using facial recognition technology. Schools can use other types of biometric identifying technology as long as they consider the privacy implications. Article: New York State bans facial recognition in schools Publication: New York State Education Department Writer: NYSED
Content Type: Examples
Human Rights Watch called on the national government of Brazil to amend the country's data protection law to add new safeguards to protect children online following the discovery that seven educational websites directed at Brazilian students, including two created by state education secretariats, used tracking technologies developed for advertising purposes to surveil children, harvested their personal data, and sent it to third-party companies. The websites watched children in their online classrooms and…
Content Type: Examples
Following a report from Human Rights Watch, the Public Ministry of São Paulo began an investigation to find out whether government education platforms and services collected students' personal data and sent it to adtech companies in violation of the General Data Protection Law. Article: São Paulo ministry investigates education websites for dataveillance Publication: G1 Writer: Arthur Stabile
Content Type: Examples
Education experts and publishers in Brazil are warning of the negative consequences of a decision by the São Paulo state government to replace textbooks with ebooks for students over 14 starting in 2024. Many students have no Internet access, and publishers argue the move will irreparably damage the textbook industry. Article: Brazilian educators oppose ebook replacement for textbooks Publication: Phys.org Writer: Phys.org
Content Type: Examples
An in-person auction in São João let Brazilian technology companies bid for a contract to supply facial recognition technology to the public school system. PontoID, which won the $162,000 contract, began secretly rolling out the technology without informing parents or students in advance. The goal was to help track school attendance; parents would receive a text message when students arrived at and left school. In some towns, more than 80% of the population are of African descent; facial…
Content Type: Examples
The Court of Appeals for British Columbia rejected the claim made by whistleblower Ian Linkletter that linking to freely available materials from the remote proctoring company Proctorio was legitimate criticism. The company has a history of attacking those who criticise it and its products. Linkletter, while working as Learning Technology Specialist at the University of British Columbia in 2020, issued a number of criticisms of its approach on Twitter, and linked to "unlisted" (public but not…
Content Type: Examples
An administrative court in Montreuil, France issued a preliminary ruling ordering the Paris-based Distance Learning Institute to suspend its use of the e-proctoring platform TestWe, which uses facial recognition and algorithmic analysis to monitor students. Video and sound analysis track students' eye movements and their surroundings, a practice the court ruled disproportionate. The case was brought by a group of students represented by La Quadrature du Net and casts doubt on the legality of…
Content Type: Examples
An app used by more than 100 Bristol schools has raised concern among criminal justice and anti-racism campaigners that, by giving safeguarding leads easy access to records of pupils' and their families' contacts with police, child protection, and welfare services, it risks increasing discrimination against those from minority ethnic or working-class backgrounds. Staff say use of the app is often kept secret from parents and carers. The council website says the Think Family database, which the app…
Content Type: Examples
UK government ministers are seeking to ensure schools benefit financially from any future use of pupils’ data by large language models such as those behind ChatGPT and Google Bard. Data from the national pupil database is already available to third-party organisations. The BCS head of education recommends that the Department for Education write a clear public benefits statement to ensure that initiatives benefit pupils as well as providing financial returns. https://schoolsweek.co.uk/…
Content Type: Examples
The UK's Behavioural Insights ("Nudge") Unit has trialled machine learning models to help automate some decisions made by regulators such as Ofsted (schools) and the Care Quality Commission (health and social care in England). The resulting algorithm uses data such as the number of children on free school meals, teachers' pay, the number of teachers for each subject, and parents' reviews of schools in order to predict which schools' performance might be suffering. The dataset deliberately…
Content Type: Examples
The UK’s education watchdog, Ofsted, is considering checking pupils’ and parents’ social media pages to ensure that schools maintain their standards. Privacy campaigners oppose the plan as overreach, while representatives of teachers’ unions warned the information would be unreliable. https://inews.co.uk/news/education/ofsted-snoop-parents-pupils-social-media-56319 Publication: i News Writer: Richard Vaughan
Content Type: Advocacy
Privacy International (PI), Big Brother Watch (BBW), StopWatch, CopWatch, Defend Digital Me, Liberty and Statewatch have written to Home Secretary James Cleverly to raise concerns over the danger posed to UK society by Facial Recognition Technology (FRT). In a letter sent on 18 January 2024, the signatories raised concerns over the escalating use of FRT and warned the Home Secretary that "The indiscriminate use of this dystopian biometric technology to identify people in public spaces is a form…
Content Type: Examples
English school head teachers were asked to fill out a census form designed in partnership with the Department for Education and hosted by Capita that included fields asking for pupils’ asylum status, ethnicity, and passport numbers and expiry dates. Families are meant to be advised it’s not mandatory to supply the information, but when they don’t, schools “ascribe” - that is, guess - children’s ethnicity. Privacy campaigners expressed concern that the census data would be used for immigration…