Whose business is your healthcare? Why digital health tools need careful assessment

Digital Health Technology Assessment is needed to make sure that tools developed by the private sector and relied on by public healthcare providers do not harm people and their rights. 

[Image: Five people and a dog, outlined in orange, all looking intently at their mobile phones and casting shadows made of network diagrams. Jamillah Knowles & Reset.Tech Australia / https://betterimagesofai.org / CC BY 4.0]

Across the world, public healthcare providers are going digital in efforts to improve and modernise the services they offer. Whether it's apps that track symptoms, remote video consultations or cutting-edge AI diagnostics, the digital tools that support healthcare are often produced and supplied by private actors. These tools can have a positive effect on the quality and accessibility of healthcare, but there are also downsides to relying on businesses to deliver digital health services.

PI has teamed up with the Centre for Health, Equity, Law and Policy (C-HELP) in India and Just Treatment in the UK to investigate how carefully public healthcare providers are assessing the impact that digital tools like these have on people's wider rights. 

C-HELP is actively engaged in research and advocacy on data protection, health data management and the governance of digital health technologies, and has been advocating for a comprehensive data protection act in India, with a special focus on health data.

In response to the measures taken during the Covid-19 pandemic, C-HELP analysed the contact tracing tools released by state governments in India. These apps were assessed against a framework that included factors such as state capacity concerns, public engagement, and privacy and ethical concerns.

C-HELP is currently working on an analysis of the privacy policies and third-party data sharing practices of the most popular telemedicine apps in India.

 

Just Treatment mobilise patients in every corner of the UK and in countries across the world to fight for an end to corporate power in our health systems and a fully-funded, publicly owned and operated NHS.

Previously, Just Treatment has made national media headlines, shifted government policy and transformed the lives of thousands of patients. Their work has been recognised by a David & Goliath Award from the Sheila McKechnie Foundation and a Tenacious Award.

People need to be able to trust that apps being recommended to them by their doctors, nurses and other healthcare providers are also keeping their data safe and protecting their privacy. 

Digital Health Technology Assessment

One obvious concern is whether digital health tools properly protect people's data and privacy. Health data can be incredibly intimate and extremely sensitive, especially in contexts where health conditions can be a cause of discrimination or oppression, such as sexual and reproductive health or mental health. It can also reveal information about our lifestyles, our genetic makeup and our families. There are other risks too - digital tools built on biased datasets can be discriminatory, and private businesses may make false promises of cost-saving efficiencies that end up leaving some people without access to quality healthcare services.

That's why it's so important that public health bodies - such as hospitals, care providers, general practitioners (GPs) and government departments - make sure that their reliance on privately developed digital tools does not come at the cost of other rights. Not only is it unacceptable to expect (or even require) people to sacrifice their right to privacy to fulfil their right to health, but doing so may even end up damaging people's health in the long run.

Health Technology Assessment (HTA) is a longstanding practice that is used to assess the effectiveness and safety of technological innovations before they can be used in the diagnosis, treatment, management and prevention of health problems. Today, there is an overwhelming need for clear and specific rules for digital health technology assessment (dHTA) that engages with the specific needs and challenges of new and emerging practices. Together, we will look at how effective dHTA regimes are across different countries (India, UK, Indonesia and Thailand), in particular for mental health or wellness apps that people use on their own devices.

These apps can pose particular challenges because they may not be classified as 'medical devices' and may therefore escape (perhaps justifiably) the stringent rules and safeguards around the use of medical devices. However, that does not mean they can be left entirely unregulated. Here, we set out some of the key issues that demonstrate the need for careful and thorough dHTA before such apps are prescribed, recommended, commissioned or otherwise used by public bodies responsible for people's healthcare.

The risks of using apps as digital health tools

Medical treatment has long depended on technological innovation and breakthroughs. It is normal for specialist devices to be used by doctors and nurses for all sorts of conditions (just think of heart rate monitors, surgical equipment or contraceptive implants). To keep people safe, medical devices are subject to strict regulation before they can be used. While some digital health tools may qualify as 'medical devices', not all do. This is especially true for software and apps, which can be quite different from what might traditionally be thought of as a medical device.

But people - and healthcare professionals - are today using apps for all manner of reasons relating to health, fitness and wellbeing:

  • To help manage and improve mental health
  • To track periods and related information
  • To motivate keeping fit and eating healthily
  • To collect and analyse data from wearables

Apps like these can collect and process large amounts of highly sensitive data. That data might stay securely on your device or in a personal cloud - but it might also be shared with the app developer, a healthcare provider, or even third parties such as advertisers. Insecurely stored data is also vulnerable to hacking and cyberattacks. Safeguards are clearly needed to protect people's rights and to eliminate bad practice.

'Software as a Medical Device'?

Existing regulatory frameworks for HTA tend to focus on medical devices, with one academic study finding that even the countries with more advanced regulatory frameworks for health apps “limit their approvals to health apps meeting criteria for being defined as medical devices”. So while apps can be classified as ‘Software as a Medical Device’ (SaMD), those that are not are left in limbo. The same study “raises the question of how wellness apps — that do not fulfil these criteria but can still create demonstrable value for patients — should be vetted”.

While the full assessment process for medical devices may not be necessary for all apps, something is needed to protect people, their data and their rights.

Of course, it's not acceptable for any app, whether used for healthcare or not, to put its users at risk of data breach or exploitation. But the stakes are especially high for apps that are recommended (or even prescribed) by healthcare professionals. The relationship between doctor and patient relies on trust: there is an inherent power imbalance that must be resolved in the patient's best interests. The right kind of healthcare depends on that relationship not undermining the patient's full range of rights.

All digital health tools - whether a medical device or not - and especially those produced by the private sector, must therefore be assessed before being relied on by public healthcare providers to ensure that they don’t do more harm than good. They need to:

  • Have their data sharing and security practices assured as meeting the highest standards
  • Be developed and designed with affected communities
  • Protect against negative impacts on mental health

As well as dHTA at the national level, there are also apps and other tools being promoted by global institutions like the Global Fund, Digital Square or HIEX (UNAIDS). It is important to ensure that these are subject to the right approval processes too.

What are the trade-offs?

Patient-facing digital health tools like apps and wearables can make healthcare more convenient and personalised. They can give patients more autonomy and control over how they manage their health. But these benefits do not always materialise, do not necessarily apply to all users, and do not necessarily translate into overall improvements in healthcare management and quality of life. Digital-first approaches may also be attractive to governments on the promise of cost savings and efficiency, but such claims must be closely scrutinised and contextualised.

It's also important to bear in mind that the software behind a digital health tool (DHT) is not fixed. Indeed, it is vital that security updates can be rolled out whenever a potential bug is found. But software updates can also change the algorithms that govern how the tool works, or add new features that alter what data is collected or what advice is given.

Key Resources

  • In 2019, we investigated several period-tracking apps and their data-sharing practices. Five years later, we did it again.
  • Companies selling diet programmes are using tests to lure users. Those tests encourage users to share sensitive personal data, including about their physical and mental health. But what happens to the data? We investigated to find out.
  • We talk to Dr David Crepaz-Keay from the Mental Health Foundation to find out what happens to your data when you visit a mental health website, how technology can help people dealing with a mental health issue, and what can happen when things go wrong.

Current practice

There is considerable evidence of the harm that can arise when digital health tools such as apps are designed or deployed poorly. Within our own research, we've seen apps built to help with period tracking, dieting and mental health sharing user data with advertisers. The Centre for Internet & Society in India found similarly widespread bad practice across a range of health apps and websites. And the Digital Health and Rights Project found that young people using digital health tools in Ghana, Kenya and Vietnam expressed concerns about misinformation, anxiety about phone 'addiction', sexual harassment and stalking, extortion and blackmail, and online surveillance as a result of using these technologies.

Just some other examples of how DHTs can do harm as well as good include:

  • Babylon Health allowing patients to view other people’s consultations
  • 61 million records from wearables such as Fitbit being exposed
  • Mental health apps exhibiting a range of poor and exploitative data practices 

These examples are backed up by academic studies that have looked at the privacy practices of DHTs like apps. For example, a 2021 report that analysed over 20,000 health and fitness apps found “serious problems with privacy”, including sharing data with third parties like advertisers and tracking services, and concluded that “clinicians should be aware of these and articulate them to patients”. A research paper produced for the UK’s National Health Service (NHS) also looked at the potential downsides of digital health tools, finding that the following problems could arise:

  • Creating and compounding disadvantages for those already marginalised
  • Making healthcare more transactional by reliance on algorithms
  • Blaming individuals by making systemic issues the individual’s problem to solve
  • Diverting resources away from basic health care provision
  • Mechanistic thinking crowding out human judgment and interaction

Research by Just Treatment amongst their supporters has shown that similar concerns are also being raised by patients in the UK. They found that patients are using apps “because they feel compelled to”, at times having been recommended or prescribed them by a doctor or other healthcare professional. But that doesn’t mean they are entirely comfortable with the situation:

“Patients are worried about the collection and use of data by these apps. 50% of those people in our survey who were using health apps said they had concerns about how the information might be used or shared, and only 17% said the clinician spoke to them about where and how data would be used. Interviewees reported feeling they are being presented with a choice between data privacy and receiving treatment.”

These worries are not purely abstract. Just Treatment identified the following concerns that people have about how their health data could be used by private companies in ways that could see those companies unfairly profiting:

  • to calculate their pension
  • to determine travel insurance premiums
  • to target them with scams and snake-oil treatments
  • to inform hyper-personalised targeted political advertising
  • to reveal health conditions to potential employers

Similarly, C-HELP has highlighted issues with India’s data protection law, one key concern being that the final version of the legislation - the Digital Personal Data Protection Act 2023 - dropped the category of sensitive personal data, and with it the specific protection for health data. While the Act was passed in 2023, the Rules that set out its implementation have not been finalised: the draft Rules were open for public consultation in January 2025, and the final version has yet to be made public. The combination of weakened protection for health data, a lack of clarity around the data protection legislation, and the growing digitalisation of health data and rollout of digital health technologies by both private and public organisations is concerning.

Like India, both Thailand and Indonesia have fairly recent data protection legislation: Thailand’s Personal Data Protection Act was enacted in May 2019, though its implementation was delayed until 2021, and Indonesia’s Personal Data Protection Law (“PDP Law”) was enacted in October 2022.

We know that all kinds of data, including health data, are up for sale by data brokers across the world. That includes location data being bought by police and other law enforcement agencies (which could reveal attendance at an abortion clinic, for example), and other troubling examples of nefarious actors getting hold of data about people’s personal lives, such as a Catholic news outlet obtaining data about a priest from the gay dating app Grindr.

What’s next?

Healthcare is going digital. But the processes that govern digitisation risk developing too slowly and inadequately, leaving people open to harm from tools that are poorly designed or poorly implemented. As the NHS research identified, an excessive focus on new and cutting-edge technologies must not come at the expense of the bread and butter of healthcare: caring for people. That’s especially true when governments, such as in the UK, are keenly focused on the potential for tech-driven economic growth and are being heavily lobbied to prioritise the interests of the private businesses developing these tools over those of patients and the health service when designing oversight mechanisms.

In this collaboration, we will look more closely at how domestic regulatory regimes in the UK, India, Indonesia and Thailand are designed and are working in practice in order to develop some recommendations for how patient-facing health apps should be assessed before being relied on by public healthcare providers.

Without proper dHTA - on paper and in practice - promises of improvements for people, their health and their rights cannot be assured.