UN Special Rapporteur warns that beneficiaries are forced to give up their right to privacy and data protection to receive their right to social security

Press release

Tomorrow, the UN Special Rapporteur on extreme poverty and human rights will present his annual report to the UN General Assembly in New York on digital technology, social protection and human rights. On the same day, Privacy International will be launching its own series on surveillance in the provision of social services. 

The Special Rapporteur warns that specific areas need to be addressed to "avoid stumbling zombie-like into a digital welfare dystopia" and that "values such as dignity, choice, self-respect, autonomy, self-determination, privacy, and a range of other factors are all traded off" supposedly in the name of efficiency, budget-saving, flexibility and fraud detection.

The case-studies and harms presented in the report, supported by the submissions of various stakeholders, illustrate the existing threats resulting from the current understanding and direction of the digitisation of social welfare. Ensuring that new technologies can "transform the welfare state for the better" requires rethinking the current approach to digital welfare, ensuring strong regulation and adopting a human rights approach.

You can access Privacy International's submission to the UN Special Rapporteur here.



  • At every stage of the welfare system, both the UNSR's and PI's analyses found serious concerns regarding the (mass) processing of personal data, its exploitation, and the monitoring and surveillance of beneficiaries, and how these practices cannot be divorced from the surveillance state and the broader data exploitation ecosystem.
  • Whilst the report explores various technologies including biometrics and smart cards, a recurring concern throughout is around automation through the use of algorithms and artificial intelligence at various stages of the welfare system. Similarly to the Special Rapporteur, Privacy International is also concerned by the increasing automation of decision-making given that such processes may lead to unfair, discriminatory, or biased outcomes. People can be misclassified, misidentified, or misjudged and such errors or biases may disproportionately affect certain groups of people. The Special Rapporteur emphasises the need to counter such biases in the design of the digital welfare state, and in doing so calls for policy-making processes to be open, inclusive and participatory and, in particular, to include those affected.
  • Reflecting on the digitisation and automation of eligibility processes, the report calls out the reversal of the burden of accountability, which is now placed on the individual to demonstrate that they are 'deserving'. The Special Rapporteur challenges the rigidity of application processes, the inability of systems to adapt to everyday realities, and how systems are failing to account for those in particularly vulnerable situations. The report also denounces how the integration of technology in the welfare system has removed human interaction, leading to the dehumanisation of applicants, and eliminated non-digital options, creating inequalities of access, for example for older people.
  • Digital welfare systems are already, or have the potential to become, tools of surveillance. As with many digital systems, this raises questions about the role of the private sector in facilitating governments' monitoring of people, but also about the opportunity for companies themselves to exploit this data. Efficiency and fraud prevention are used to legitimise the direct targeting and harassment of people who seek welfare assistance. Today, Privacy International has also released exclusive research documenting the use of a payment card for the management of welfare support to asylum seekers in the UK - a system that has reportedly been used by the UK government to surveil asylum seekers.
  • Human rights have not been taken seriously by governments and industry. Most efforts to regulate have been undermined by dubious arguments that regulation stifles innovation or is unnecessary. The Special Rapporteur's report supports our view that these arguments are unfounded, ill-informed, and merely reflect the reluctance of both governments and companies to regulate their activities and to be accountable for their human rights obligations. Governments have national and international legal obligations to protect the right to privacy and to regulate the processing of personal data effectively. Social protection programmes must be subject to these regulatory mechanisms.
  • Not only does the private sector provide solutions to governments, but through the delivery of its own services and products it also feeds its own and the broader data exploitation ecosystem. At Privacy International we are also pushing for effective regulation of what we refer to as the 'government-industry complex', which designs and manages social protection programmes - in particular to tackle issues of unaccountability and of data exploitation by default and by design. As the report indicates, the industry is currently operating in an "almost human rights free-zone" - this must be rectified urgently.


Jamila Venturini, Regional coordinator at Derechos Digitales said:

"The fact that surveillance mechanisms primarily target marginalized and impoverished populations is not something new in Latin America. However, the emergence of new forms of social control hidden behind promises of more efficiency brought by new technologies seriously risk increasing the distance between rich and poor in the world’s most unequal region. The need to deliver basic services is used as an excuse to develop mandatory biometric identification and predictive systems and subject the most vulnerable groups to full exclusion or invasive public and private surveillance, making the exercise of fundamental rights a privilege of the few who can pay for them."


Nathalie Fragoso, Head of research on Privacy and Surveillance at InternetLab said:

"The right to privacy and informational autonomy must be considered and complied with in welfare systems. In Brazil, for example, the enrollment in PBF (Bolsa FamÍlia Program) involves the massive collection of information, in principle, obtained to identify and address, through this and other federal programs, vulnerabilities experienced by a large part of the Brazilian population. However, shortcomings in the information security policy and the resulting risk of sensitive data exposure remain concerns. Such vulnerability has major repercussions, like the leakage and use of beneficiaries' data in the last electoral campaign by one of the presidential candidates.”


Alexandrine Pirlot de Corbion, Director of Strategy, at Privacy International said:

"Failure to protect people and their rights through the adoption of effective safeguards and due process from the out set, undermines the benefits promised by digital welfare policies and will amplify pre-existing shortcomings and injustices. All human rights are universal, indivisible, interdependent and interrelated. Individuals seeking welfare assistance should not have to trade off their right to privacy to enjoy their right to social protection. This requires effective regulation and oversight of all the actors in the ecoystem, including the drivers, the proponents and the suppliers, to ensure they are held to account and people can live with dignity and autonomy, free from undue surveillance and exploitation. The report of the Special Rapporteur is an opportunity to consider what changes need to be made to effective regulate all the actors in the ecoystem, including the drivers, the proponents and the suppliers, to ensure they are held to account and people can live with dignity and autonomy, free from undue surveillance and exploitation."


Notes to editors

As part of its new strategic programme aimed at safeguarding people's dignity, Privacy International (PI) is researching and documenting the drivers of digital systems for the realisation of economic, social and cultural rights, as well as the role of the private sector as a supplier of technological solutions.

Privacy International is working with partners in Brazil, Chile and India to challenge the role of surveillance and data exploitation in social benefits systems.

We also aim to alert the donor community and other drivers of data-intensive welfare programmes, to ensure that the dignity and rights of individuals, especially those most marginalised, are protected in the programmes they fund and encourage, particularly in countries where these are being introduced as part of development and aid projects.


For more resources on Privacy International's work on data, technology and social rights, please visit: