Practices of the humanitarian sector are leaving aid recipients at risk, PI and ICRC find

Do No Harm

Key Points

  • A new report by Privacy International and the International Committee of the Red Cross finds that the humanitarian sector’s use of digital and mobile technologies could have detrimental effects on people receiving humanitarian aid.
  • This is because these digital systems generate a ‘data trail’ that is accessible and exploitable by third parties for non-humanitarian purposes. This metadata can be used to infer extremely intimate details, such as someone’s travel patterns or even religious beliefs.
  • As metadata tends to have low legal protections, there is often less transparency and accountability around its generation, collection and use by governments and industry.

The humanitarian sector’s increasing reliance on digital and mobile technologies could place the data of millions of people receiving humanitarian aid at risk, according to a joint report by Privacy International and the International Committee of the Red Cross (ICRC).

The report, entitled “The humanitarian metadata problem: ‘Doing no harm’ in the digital era”, reveals how the use of these digital systems generates records – a data trail – that can be accessed and exploited by third parties (and governments) for unknown purposes. It describes how this metadata – the data that describes and gives information about other data, such as location, time and duration – can be used to infer details such as someone’s travel patterns and even their religious beliefs.

The report also exposes the failure of businesses and governments to design and implement privacy-friendly systems (telecom, banking, internet services) that protect people and their data, resulting in detrimental effects for those seeking humanitarian assistance. Metadata often receives weaker legal protection than content data, and there is less transparency and accountability around its generation, collection and use.

To counteract this, the report recommends that humanitarian organisations conduct mapping exercises to understand a service provider’s data protection policy and the legal safeguards that are necessary to protect aid recipients. It also calls on them to use their leverage to demand that providers design and implement these precautions.

Gus Hosein, Executive Director of Privacy International, said:

“Humanitarian organisations like the ICRC have an extraordinary mandate of ensuring humanitarian protection for people across the world. It is essential that their tools, which are often provided by companies, support their efforts rather than undermine them. This way, the very people they are assisting will not be at risk of exploitation and abuse.”


Alexandrine Pirlot de Corbion, Lead of Privacy International’s Global Programme, said:

“We are pleased to collaborate with ICRC on this very important project. It is clear that there is a change needed in the current data ecosystem for humanitarian organisations to truly do no harm. We hope that the findings of this report will inform others in the humanitarian community and put pressure on industry and governments.”


Notes to Editors

  • In 2013, Privacy International published “Aiding Surveillance”, a report that raised concerns about development and humanitarian organisations’ adoption of new technologies and data-intensive systems (e.g. biometric identification schemes). Specifically, the report revealed how these systems and technologies enabled and facilitated the surveillance and repression of people who registered to benefit from development and humanitarian programmes.