There are three good reasons why security is so hard for NGOs. First, we are afraid to speak about meaningful security. Second, we focus on the wrong areas of security and in turn spend money on and prioritise the wrong things. Third, we struggle to separate the world we want from the worlds we build within our own organisations. At PI we have struggled with, and at times failed at, each of these for over 20 years.
For further information on timeline and case history, read this briefing.
The arguments were based on the written submissions of the parties. The oral statements summarised key points in these submissions.
The European Court of Human Rights will hear a landmark case on surveillance tomorrow (7 November) as part of a challenge to the lawfulness of the UK’s surveillance laws and its intelligence agencies’ mass surveillance practices.
October 31st 2017 will mark the 3rd World Cities Day (we will forgive you if you did not know that), with the general theme “Better City, Better Life.” On this date, PI will be launching its latest report, “Smart Cities: Utopian Vision, Dystopian Reality”. This is an opportunity for us to ask: who exactly are our cities going to become better for?
From unlocking a smartphone to getting through an airport, the use of an iris, fingerprint, or face for identity verification is already widespread, and the market for it is set to rocket. While the technology is not new, its capability and uses are.
The United States Department of Homeland Security (DHS) has contracted one of the world’s largest arms companies to manage a huge expansion of its biometric surveillance programme.
According to a presentation seen by Privacy International, the new system, known as Homeland Advanced Recognition Technology (HART), will scoop up a whopping 180 million new biometric transactions per year by 2022.
The short answer is yes.
I'm sure many of you have seen people with stickers over their webcams and wondered why (probably writing that person off as paranoid). But it's well known in tech circles that a camera in a computer or smartphone can be turned on remotely by an attacker with the resources, time, and motivation.
While welcoming the objective of the Bill, Privacy International has sent a briefing to the House of Lords and a letter to the Minister of State for Digital, Matt Hancock MP, outlining key concerns and recommendations. The Bill's stated aim is “to create a clear and coherent data protection regime”, and to update UK data protection law, including by bringing the EU General Data Protection Regulation (GDPR) and the Data Protection Law Enforcement Directive (DPLED) into the UK domestic system. We've summarised our concerns below.
Privacy International, in partnership with 30+ national human rights organisations, has today written to national intelligence oversight bodies in over 40 countries seeking information on the intelligence sharing activities of their governments.
Cities around the world are collecting increasing amounts of data, and the public is not part of deciding if and how such systems are deployed.
Our connected devices carry and communicate vast amounts of personal information, both visible and invisible.
As society heads toward an ever more connected world, it becomes increasingly difficult for individuals to protect and manage the invisible data that companies and third parties hold about them. This is further complicated by events like data breaches, hacks, and covert information-gathering techniques, which are hard, if not impossible, to consent to. One area where this is most pressing is transportation, and by extension the so-called ‘connected car’.
Political campaigns around the world have turned into sophisticated data operations. In the US, Evangelical Christian candidates reach out to unregistered Christians and use a scoring system to predict how seriously millions of these voters take their faith. As early as 2008, the Obama campaign conducted a data operation which assigned every voter in the US a pair of scores that predicted how likely they were to cast a ballot, and whether or not they supported him.
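To make the two-score idea above concrete, here is a minimal, purely illustrative sketch of how such a model might work: each voter gets a turnout score and a support score from a logistic function over per-voter features. The feature names, weights, and voters are all invented for illustration; real campaign models are far more elaborate and fitted to historical data.

```python
import math

def logistic(x):
    """Squash a raw score into a 0-1 probability-like value."""
    return 1.0 / (1.0 + math.exp(-x))

# Invented per-voter features: (age, past_elections_voted, donated_before)
voters = {
    "voter_001": (34, 2, 0),
    "voter_002": (61, 5, 1),
}

# Invented weights; a real operation would fit these to voter-file data.
TURNOUT_WEIGHTS = (0.02, 0.6, 0.3)
SUPPORT_WEIGHTS = (-0.01, 0.1, 1.2)

def score(features, weights, bias=-1.5):
    """Weighted sum of features passed through a logistic function."""
    z = bias + sum(f * w for f, w in zip(features, weights))
    return round(logistic(z), 3)

for voter_id, feats in voters.items():
    turnout = score(feats, TURNOUT_WEIGHTS)   # likelihood of casting a ballot
    support = score(feats, SUPPORT_WEIGHTS)   # likelihood of supporting the candidate
    print(voter_id, turnout, support)
```

The point of the sketch is only that a handful of data points per person is enough to rank an entire electorate on two axes, which is what makes such operations attractive to campaigns and consequential for privacy.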
Financial services are collecting and exploiting increasing amounts of data about our behaviour, interests, networks, and personalities to make financial judgements about us, like our creditworthiness.
Increasingly, financial services such as insurers, lenders, banks, and financial mobile app startups are collecting and exploiting a broad range of data to make decisions about people. This particularly affects the poorest and most excluded in societies.
For example, the decisions surrounding whether to grant someone a loan can now be dependent upon:
Gig economy jobs that depend on mobile applications allow workers’ movements to be monitored, evaluated, and exploited by their employers.