Kenyan Court Ruling on Huduma Namba Identity System: the Good, the Bad and the Lessons

PI presents its analysis of the Huduma Namba judgment in three parts: the clear wins, the parts that make some small steps forward but could have been better, and the disappointing losses.

Achieved result

PI’s expert witness testimony contributed to the decision of the Kenya High Court to suspend the implementation of Kenya’s National Integrated Identity Management Scheme (NIIMS) until the government addresses all the existing risks (including data breaches, exclusion and discrimination) and ensures an appropriate regulatory framework for the implementation of the system.

Key findings
  • The Kenya High Court accepts the evidence provided by PI on the risks to the right to privacy and to data protection arising from the design and security of NIIMS.
  • The judgment acknowledges the importance of having a data protection framework.
  • The Kenya High Court fails to support the Petitioners' arguments on exclusion, holding that it could not discern a violation of the right to equality and non-discrimination.
Long Read

Background

Kenya’s National Integrated Identity Management Scheme (NIIMS) is a biometric database of the Kenyan population that will eventually be used to give every person in the country a unique “Huduma Namba” for accessing services. The system aims to be the “single point of truth”: a biometric population register of every citizen and resident in the country, which then links to multiple databases across government and, potentially, the private sector.

NIIMS was introduced as an amendment in a Miscellaneous Amendments Act that became law on 31 December 2018. Following a challenge by civil society, and with some concessions (including the important caveat that registration could not be mandatory), Kenya’s High Court ruled in April 2019 that biometric enrolment could begin. Between April and May 2019, the Government claims, the biometric data of 36 million people were collected and stored.

As with Aadhaar in India before it, the case of the Huduma Namba in Kenya raises serious concerns, from privacy through to exclusion. As with Aadhaar, the case ended up in court. One of the petitioners was the Nubian Rights Forum, an NGO that supports members of an historically-marginalised community that struggles to get identity documentation. Privacy International’s Dr Tom Fisher was proud to provide expert witness testimony for the case, offering an international perspective on the key issues emerging in biometric identity systems around the world. Privacy International was able to highlight the privacy and security risks inherent in the types of systems under development in Kenya. The case was heard in September 2019, with the ruling handed down on 30 January 2020. The full judgment is available here.

The judgment has significant implications for Kenya and beyond. It has a direct impact on the roll out of NIIMS and adds to the growing body of cases around the world dealing with the challenges posed by centralised biometric identity systems.

PI's analysis of the judgment is split into three parts: the clear wins, based on the demands of civil society; the parts that make some small steps forward but could have been better; and the disappointing losses.

As we note at the end, the judgment is the subject of appeal.

Welcomed

These are the key points in the judgment that we welcome.

System roll-out halted

The key order from the Court was that the Government could not proceed with the implementation of NIIMS until there is "an appropriate and comprehensive regulatory framework on the implementation of NIIMS". What this framework looks like exactly, for example whether it is primary or secondary legislation, is not spelled out. What is clear, however, is the Court's acknowledgment that "a law that affects a fundamental right or freedom should be clear and unambiguous" (paragraph 921), and that this "applies to any law that seeks to protect or secure personal data, particularly in light of the grave effects of breach of the data already alluded to" (paragraph 922).

Such a framework must be compliant with the applicable constitutional requirements identified in the judgment (as discussed further below). The importance of this order should not be underestimated: it is a clear indication from the Court that the lack of legal safeguards means the Government must take action before taking any further steps in rolling out NIIMS. This is in contrast to the Aadhaar judgment in India, which "impressed upon" the respondents (in that case, the Indian Government) the need to "bring out a robust data protection regime". Yet a year and a half on, India still has no data protection law in place.

However, what this looks like in practice in Kenya depends heavily on (a) the behaviour of the Government and (b) what such safeguards look like. The Kenyan Government's statement following the judgment that it has "commenced the process towards generation and issuance of Huduma Nambas and Huduma Namba electronic identity cards to those who were registered" has therefore led to concern that the Court's order is not being taken seriously and that the Government is steaming ahead with its plans. For this and other reasons, the Petitioners (the Nubian Rights Forum) have appealed and sought an urgent application to stay the implementation aspects of the Huduma Namba.

Collection of DNA & GPS data unconstitutional

The other two specific orders from the Court related to the collection of GPS co-ordinates and DNA under the NIIMS legislative framework (the Registration of Persons Act as modified by the Miscellaneous Amendments Act). The Court's analysis of the inclusion of these data points was scathing. The Court recognised that this data is personal, sensitive and intrusive and requires protection. It noted the lack of justification and evidence provided by the Government on the need to collect this data, the lack of specific safeguards, and the Government's inability even to handle such data, ultimately concluding that:

(i) The collection of DNA and GPS co-ordinates for purposes of identification is intrusive and unnecessary, and to the extent that it is not authorised and specifically anchored in empowering legislation, it is unconstitutional and a violation of Article 31 of the Kenyan Constitution (the Right to Privacy)

(ii) The sections in the Registration of Persons Act requiring such collection conflict with Article 31 and are unconstitutional, null and void.

It was our finding that because of the specificity of the information that DNA may disclose and the harm disclosure may cause not just to the data subject but other family members in terms of both identification and genetic information, DNA information requires and justifies a particular and specific legal protection. Likewise, we found that specific authorization anchored in law is required for the use of GPS coordinates in light of the privacy risks we identified in terms of their possible use to track and identify a person’s location. Accordingly, we found that the provision for collection of DNA and GPS coordinates in the impugned amendments, without specific legislation detailing out the appropriate safeguards and procedures in the said collection, and the manner and extent that the right to privacy will be limited in this regard, is not justifiable. (paragraph 1039)

Accordingly, we find that the provision for collection of DNA and GPS coordinates in the impugned amendments, without specific legislation detailing out the appropriate safeguards and procedures in the collection, and the manner and extent that the right to privacy will be limited in this regard, is not justifiable. The Respondents in this respect conceded that they will not collect DNA and GPS coordinates, and that in any event they have no capacity to do so. However, our position is that as long as the collection of DNA and GPS coordinates remain a provision in the impugned amendments, there is the possibility that they can be abused and misused, and thereby risk violating the rights to privacy without justification. (paragraph 919)

In practical terms, the earlier decision of the court in April 2019 meant that DNA and GPS data were not collected as part of the Huduma Namba data collection exercise in April/May 2019. As a result, choosing to collect this data would require a further extensive data collection exercise. However, the inclusion of a provision for collecting DNA and GPS data within the framework left the door open for such a possibility in the future, and for possible abuse. It is therefore important that the Court highlighted that the collection of such sensitive data requires justification and safeguards and, in the clear absence of these, firmly declared these provisions unconstitutional.

Need for regulation of identity systems

The need for clear safeguards anchored in legislation, and the absence of these in Kenya, is a clear theme throughout the Court's judgment. This is what ultimately led to the Court's orders and stalled the Government's plans for now. Whilst such safeguards will not address more fundamental questions about the need for the identity system in the first place, and can only be judged on their merits and detail, they can be an important mechanism for damage control. They also squarely underline that an identity system must be supported by strong regulation, and that this should be in place before such a system is rolled out.

The range of issues on which the Court stressed the need for further regulation is extensive:

"It is thus our finding that the legislative framework on the protection of children’s biometric data collected in NIIMS is inadequate, and needs to be specifically provided for (paragraph 823)

Biometric data and personal data in NIIMS shall only be processed if there is an appropriate legal framework in which sufficient safeguards are built in to protect fundamental rights ... To this extent we find that the legal framework on the operations of NIIMS is inadequate, and poses a risk to the security of data that will be collected in NIIMS. (paragraphs 884 and 885)

There is thus a need for a clear regulatory framework that addresses the possibility of exclusion in NIIMS. Such a framework will need to regulate the manner in which those without access to identity documents or with poor biometrics will be enrolled in NIIMS. (paragraph 1012)

Each of these frameworks must also be subject to scrutiny to ensure that they do provide safeguards, rather than simply legislating for rights-intrusive practices.

Acknowledgement of the risks that may arise with biometric identity systems

We are pleased that, in its consideration of the risks to the right to privacy and to data protection attendant on the design and security of NIIMS, the Court accepted the evidence of Privacy International's Dr Tom Fisher:

Exclusion:

In this respect we are persuaded by the evidence of Dr. Fisher, the 1st Petitioner’s expert witness as to the risks that may arise with biometric identity systems. He deposed that identity systems can lead to exclusion, with individuals not being able to access goods and services to which they are entitled, thus potentially impacting upon other rights, including social and economic rights. He stated that exclusion as a result of an identification system can come in two forms. Firstly, in cases where individuals who are entitled to but are not able to get an identification card or number that is used for service provision in the public and private spheres. Secondly, that even people enrolled on to biometric systems can suffer exclusion arising from biometric failure in their authentication. (paragraph 876)

Data breaches:

On data breaches, Dr. Fisher averred that breaches associated with identity systems tend to be large in scale, with rectification either being impossible or incurring a significant cost. Further, that the breaches affect individuals in a number of ways, whether identity theft or fraud, financial loss or other damage. His view was that the more data and the more sensitive that data, the higher the risk. With regard to the concern of function creep, Dr. Fisher averred that the mere existence of data in a centralised identification system leads to the temptation to use it for purposes not initially intended, what he referred to as ‘mission or function creep’. (paragraph 877)

Need for safeguards regarding access and retention of data:

The concerns of access to and retention of data was explained by Dr. Fisher as arising from the fact that the introduction of an identity system entails the mass collection, aggregation and retention of people’s personal data which has implications on the right to privacy. It was his averment therefore, that adequate safeguards should be put in place to ensure that such data is relevant and not excessive in relation to the purposes for which it is stored, and that it is preserved in a form which permits identification of the data subjects for no longer than is required. Further, that the law must also afford adequate guarantees that retained personal data is efficiently protected from misuse and abuse. (paragraph 878)

This led the Court to find the following:

Our view as regards the centralized storage of the biometric data of data subjects is that there will be risks of attacks or unauthorized access which exist with any storage of other personal data, but the most important risks are related to the misuse of the biometric data because this is data which are uniquely linked with individuals, which cannot be changed and are universal, and the effects of any abuse or misuse of the data are irreversible. The misuse can result in discrimination, profiling, surveillance of the data subjects and identity theft. In addition, as a result of the central storage of biometric data, in most cases the data subject has no information or control over the use of his or her biometric data. (paragraph 880)

all biometric systems, whether centralised or decentralised, and whether using closed or open source technology, require a strong security policy and detailed procedures on its protection and security which comply with international standards. (paragraph 883)

Concluding that:

the biometric data and personal data in NIIMS shall only be processed if there is an appropriate legal framework in which sufficient safeguards are built in to protect fundamental rights (paragraph 884)

Again, however, it remains to be seen whether the Government will follow the Court's ruling in introducing a legal framework that deals with the serious risks that emerge from the collection and use of biometrics.

Importance of a strong and enforced data protection framework

Another key aspect of the judgment is the acknowledgment of the importance of having a data protection framework. This was a core concern for the petitioners, given that when NIIMS was announced no such framework was in place in Kenya. The Kenyan Data Protection Act was passed in 2019, and the Court sought further submissions on it before issuing the judgment. There are numerous issues with the Act (more on this below). However, important points, relevant for the introduction of any identity system, are:

  • the need for a Data Protection Act to be in place (in contrast to India, as mentioned above); and
  • recognition by the Court that protection does not begin and end with the legal framework, but that "once in force, data protection legislation must also be accompanied by effective implementation and enforcement" (paragraph 1035) and that "adequate protection of the data requires the operationalisation of the said legal framework." (paragraph 1036)

The Court tried, but did not go far enough

As outlined in the section above, in some respects the Court took positive positions on issues previously ignored or misjudged, but in other respects it fell short of making a judgment that goes far enough. This includes the Court's analysis of the safeguards provided by the Kenyan Data Protection Act and its position on the Petitioners' arguments on exclusion.

Kenya's Data Protection Act - still wanting

As noted above, we are pleased that the Court found i) that for an “adequate” legal framework to be in place the mere existence of a law is not sufficient, and that “adequate protection of the data requires the operationalisation of the said legal framework”; and ii) that a law must be “accompanied by effective implementation and enforcement” to be adequate (paragraphs 853 and 1035).

These are two very important points that PI has been advocating for, as effective and adequate data protection is not yet a reality in many countries even where data protection laws are in place. We therefore welcome the acknowledgment by the Court that the mere existence of a data protection law, without effective enforcement and accountability, does not mean that a robust data protection framework exists.

Whilst the adoption of the Data Protection Act was a significant step towards ensuring the protection of people and their personal data, the law adopted in November 2019 falls short of international standards, as we highlighted in our joint analysis with our Kenyan partners. And even for the protections it does afford, the real test of its effectiveness will be in its enforcement.

Some parts of the Data Protection Act which the Court praises and sees as concrete safeguards for the deployment of NIIMS are problematic, as those particular provisions have shortcomings. One area of concern relates to the independence of the office of the Data Commissioner. We and our partners are concerned that the establishment, under the new law, of the office as a body corporate does not grant it the necessary institutional and financial independence to execute its mandate effectively. In order to ensure the necessary independence and effectiveness of the Data Commissioner, a Statutory Commission would be preferable to a State Office.

This point is particularly important given the Court's reliance on, and expectation of, the Data Commissioner to develop and operationalise various regulations, including the circumstances in which the operation of the Act can be exempted, and to issue data-sharing codes on the exchange of personal data between government departments (paragraph 852). If the office of the Data Commissioner is not independent from the executive, then there are concerns that any regulations it develops would fall short of respecting the essence of the Data Protection Act and instead be used to reinforce the interests and the agenda of the Government.

Recognition of the risk of exclusion, but failure to address the issue

The Court concluded:

We were unable to discern violation of the right to equality and non-discrimination from the evidence presented before us. (paragraph 1043)

Having worked closely with the Nubian Rights Forum on this case, and given the challenges that the Nubians face, this was deeply disappointing. This is expressed in the Nubian Rights Forum's statement on their appeal.

On the issue of exclusion, the Court noted:

We note that all the parties are agreed that the use of digital data is the way of the future. The challenge is to ensure, among other things, that no one is excluded from the NIIMS and the attendant services. This may occur due to lack of identity documents, or lack of or poor biometric data, such as fingerprints. In our view, there may be a segment of the population who run the risk of exclusion for the reasons already identified in this judgment. There is thus a need for a clear regulatory framework that addresses the possibility of exclusion in NIIMS. Such a framework will need to regulate the manner in which those without access to identity documents or with poor biometrics will be enrolled in NIIMS. Suffice to say that while we recognize the possibility of this exclusion, we find that it is in itself not a sufficient reason to find NIIMS unconstitutional. (paragraph 1012)

It is important and welcome that the Court acknowledged that people can be excluded from a system like NIIMS and, further, that this can result in exclusion from essential services. Recognising that a system like NIIMS can exclude, rather than include, is an essential step in understanding the impact of these identity systems.

However, the solution presented by the Court is only partial, and fails to deal with the complete issue. We agree that there is a need to ensure that nobody is excluded from NIIMS on the basis of the lack of biometrics or the required documents and we hope that the new framework on this is radical in scope: the issues pertaining to biometric failure and lack of documentation are linked to broader social exclusion and marginalisation. The measures to ensure that people are not excluded by NIIMS must reflect those challenges, and provide genuine - and swift - redress for those impacted.

Yet ultimately, this is never going to be the complete solution. Basing access to essential services on a single system – and thus a single point of failure – is always going to risk exclusion. It leaves people open to failings in the technology, or to bureaucratic delays, that deny them access to their rights. It also leaves open the possibility of political manipulation, and it gives the state a vastly powerful tool to exclude individuals or communities in the future.

The solution is that there must be alternatives to NIIMS, and that the system must be genuinely non-mandatory for accessing services. Other options must be available for those who cannot, or do not want to, enrol in NIIMS; this can only increase the inclusivity of the broader identity ecosystem.

Missed Opportunities

The case before the High Court provided a space to question the Government's policy, process and practice, and yet the High Court missed a number of opportunities, including in relation to questioning the overall purpose and make-up of the system. A lack of public participation has characterised the concerning way in which identity systems are introduced around the world, and the Court's findings in this regard are therefore also disappointing.

Failure to question the purpose of NIIMS

It is key that a data-intensive system, like an identity system, has a clearly stated purpose. As the Charter of Fundamental Rights of the European Union states, “Everyone has the right to the protection of personal data concerning him or her. Such data must be processed fairly for specified purposes”. 

How does the Kenyan court describe the purpose of NIIMS?

the biometric data collected is necessary to the stated purposes of NIIMS as it is clear that the system can only provide trustworthy information about the identity of the person if the characteristics of that person are stored in its database. (paragraph 787)

On a technical level, this is questionable: there are ways of providing trustworthy identity without a centralised biometric database. On a legal level, this is also problematic. As the Indian constitutional expert Gautam Bhatia has pointed out, this is a tautology: the purpose of the biometric identification system is a biometric identification system. Without a clear purpose, it becomes impossible to assess the impact of a system like NIIMS. It leaves us with no real justification of why a system that risks people's rights should be introduced at all. 

Sufficient public participation?    

Identity systems, particularly national identity systems, are often introduced without the firm and rigorous debate that such a major measure deserves. There are countless examples of identity systems pushed through by decree, diktat, or other means that allow less democratic accountability, denying the systems a mandate. We saw this with the Federal Biometric Identification System for Security (SIBIOS) in Argentina, introduced by decree in 2011; with the NADRA biometric database in Pakistan, introduced while the country was under military rule; and with Aadhaar in India, whose establishing law was passed as a money bill, limiting debate.

Before the Court was this exact issue of public participation in the process by which NIIMS, through a Miscellaneous Amendments Act, became law. The Court found that the use of such an act, as well as the process of public participation, was lawful. This issue is part of the appeal by the Nubian Rights Forum.

The important point is that detailed, engaged public participation in the development of an identity system is crucial. These projects are large, complex, and expensive - touching upon every aspect of people's lives. It is essential that there is discussion, consultation and debate, and that engagement with the public, stakeholders and civil society continues beyond the deployment of the system.

Conclusion

The Nubian Rights Forum has filed an appeal, and so this ruling by the High Court is unlikely to be the final word. As we've set out above, the judgment makes some important findings, in particular on the need for strong legal protections. However, it is also disappointing in a number of respects, and the burden of demonstrating the risks, as opposed to demonstrating why NIIMS is an appropriate solution, falls disproportionately on those challenging the system. Nevertheless, it is thanks to their hard work and efforts that safeguards must now be put in place. A lesson not just for the Kenyan Government, but for Governments elsewhere considering implementing digital identity systems.

Privacy International, and our network of partners, will continue to follow this case and analyse - and challenge - these systems around the world.