In 1948, the United Nations General Assembly adopted the Universal Declaration of Human Rights (UDHR), outlining the basic civil, cultural, economic, political and social rights that all human beings should enjoy. In 1966, the International Covenant on Economic, Social and Cultural Rights was adopted by the UN which codified human rights relating to the workplace, social security, family life, participation in cultural life, and access to housing, food, water, health care and education.

The underlying concepts behind economic, social and cultural rights (ESCR) are not new. The abolition of slavery, forced labour and child labour; the development of labour rights and unions; questions of self-determination; and the development of welfare and of states' obligations to individuals all date back centuries.

Some of these rights were already recognised in national laws, and they are also included in other international and regional instruments, such as those which enshrine fundamental rights and freedoms, those on the rights of women, children, persons with disabilities and migrant workers, amongst others, and those on the elimination of discrimination, including discrimination based on gender and race, to name a few.

ESCR are positive rights: this means that a state must take steps “to the maximum of its available resources” to progressively realise them. States have various obligations:

  • to respect ESCR (itself refrain from any violation of ESCR);
  • to protect ESCR (prevent third parties from violating ESCR);
  • to fulfil ESCR (take necessary measures to realise ESCR, including through legislative, administrative, budgetary and other processes); and
  • to seek and provide international assistance and cooperation in the realisation of ESCR.

States must guarantee ESCR without discrimination on the grounds specified in the ICESCR, including race, colour, sex, language, religion, political or other opinion, national or social origin, property, and birth, as well as on additional prohibited grounds, including disability, age, nationality, marital and family status, sexual orientation and gender identity, health status, place of residence, and economic and social situation.

What is the problem?

Just as in many other sectors of governance, governments around the world are embracing innovations in technology and data processing capabilities to develop systems that would enable them to progressively realise economic, social and cultural rights.

For example, when it comes to social security we have seen the digitisation of welfare systems to manage the delivery of public benefits including health and housing, the deployment of biometric systems to access food, the emergence of data-intensive systems in the workplace to ensure transparency, accountability and monitor working conditions, and the increased surveillance of those accessing sexual and reproductive services including by those trying to curtail access to these rights.

However, as these systems have been deployed, some of the key concerns which have emerged include, but are not limited to, the following:

- Unregulated processing of vast amounts of data: With increased digitisation and the adoption of automated models, vast amounts of personal data are being processed. Many regulatory and legal safeguards (where they exist), such as data protection and other privacy regulations, are not effectively enforced in the deployment of digital solutions hastily implemented to allegedly provide access to and delivery of economic, social and cultural services. And in too many countries these processing activities are occurring in a legal void, as there is no comprehensive regulatory mechanism in place.

- Misplaced drivers and incentives: Contextual drivers include rising concerns around austerity, transparency, efficiency and financial management, with many of the technical, data-intensive solutions being put forward as cost-efficient and transparent, although there is little evidence that they are so in practice. The result is systems which are opaque and unexplainable, and which enable the surveillance of those who interact with them.

- Little or no accountability of third parties: Industry not only provides solutions to governments but, through the delivery of its own services, also feeds the broader data exploitation ecosystem. There is often little or no transparency about how these business models operate in practice, such as how their systems and the solutions they provide to governments are designed, or whether their activities in this sector are firewalled from other areas of their business models and interests.

- The most marginalised are harmed: The digitisation of many of these services is negatively impacting individuals and communities who are already in a disadvantaged and precarious position. Some of these risks have already been reported and documented, including stigmatisation, discrimination, and exposure to state and corporate surveillance.

Why it matters

The use of technology and data in the realisation of economic, social and cultural rights raises key concerns, among others, in relation to the protection, respect and promotion of the right to privacy as provided for under Article 17 of the International Covenant on Civil and Political Rights and Article 12 of the Universal Declaration of Human Rights. As the systems being deployed interfere with individuals’ privacy, they need to meet the three overarching principles of legality, necessity and proportionality. Beyond the failure to protect individuals and their data as they interact with the systems put in place, these systems also have implications for non-discrimination and equality.

There is no question that technology can help governments meet their obligations to realise economic, social and cultural rights, and to address some of the key challenges they face in ensuring that individuals and communities live with dignity. Technology provides incredible opportunities to democratise access to information, services, and care, but safeguards and due process guarantees need to be taken into account from the outset in order to identify and mitigate risks, and to provide access to redress.

The use of technology for the realisation of economic, social and cultural rights must be scrutinised and subject to clear, robust safeguards; otherwise the same systems that are intended to facilitate the enjoyment of these fundamental rights will amplify pre-existing shortcomings and injustice, not only in relation to privacy, security and data protection, but also in terms of dignity, autonomy, non-discrimination, and equality.

Find out more about why privacy matters to the enjoyment of economic, social and cultural rights here.

What is PI doing?

This area of work on ESCR is part of Privacy International’s strategic intervention aimed at safeguarding people’s dignity by challenging current power dynamics, and redefining our relationship with governments, companies and our own communities. Innovative solutions can be designed to empower and serve individuals and communities, rather than state and corporate power.

We are taking a gender lens and feminist approach to understand how infringements on the right to privacy reinforce and exacerbate existing inequalities and discrimination, and how the digitisation of access to ESCR raises questions about the protection of people and their data, by:

  • exposing and challenging the technology behind the digital solutions being deployed,
  • advocating for robust safeguards to protect the right to privacy and protect people’s data, and
  • calling for existing mechanisms aimed at preventing discrimination to be enforced and interpreted so as to address the emerging harms from these digitised and automated systems, which enable the surveillance, profiling and tracking of those in the most vulnerable positions.

A strong stand against data exploitation is essential to challenge current power dynamics, to ensure people’s dignity and autonomy, and to prevent further violations of fundamental rights and freedoms.