Birmingham and Solihull Mental Health NHS Foundation Trust shared medical records with a telecoms company without patient consent. The arrangement became public when documents were released in December under freedom of information laws. Alpha, a division of the Spanish telecoms company Telefonica, was using the patients’ data to develop an app that could predict mental health crises.
Using historical patient information, Alpha developed a machine-learning algorithm to identify patients at risk of a mental health crisis. Once every two weeks, the app automatically flagged the 25 patients most at risk to NHS staff, and healthcare professionals then followed up with the flagged patients by phone or in person. Alpha had access to five years’ worth of medical records for the app. Although the data was anonymised (patients’ names and addresses were removed from the database), the NHS did not seek consent from patients before providing their data to Alpha.
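As described, the workflow amounts to scoring patients with a trained model and flagging the 25 highest-risk cases each cycle for clinician follow-up. A minimal sketch of that flagging step — the record fields, risk scores, and stand-in model here are illustrative assumptions, not Alpha’s actual system:

```python
# Illustrative sketch of the reported flagging workflow: score anonymised
# patient records with a risk model and flag the top 25 for follow-up.
# The record fields and scoring function are assumptions for illustration.

def flag_highest_risk(records, predict_risk, n_flags=25):
    """Return the n_flags records with the highest predicted crisis risk."""
    scored = [(predict_risk(record), record) for record in records]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [record for _, record in scored[:n_flags]]

# Toy data: anonymised records keyed by a pseudonymous ID, with a
# stand-in "model" that simply reads a precomputed risk score.
records = [{"id": i, "risk": i / 100} for i in range(100)]
flagged = flag_highest_risk(records, lambda r: r["risk"])
print(len(flagged))      # 25 patients flagged per cycle
print(flagged[0]["id"])  # highest-risk record comes first → 99
```

In a real system the `predict_risk` call would be a trained model scoring each patient’s history; the fixed flag count means clinician workload per cycle is capped regardless of how many patients are genuinely at risk.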
Responses from the NHS and Alpha
Dr Hilary Grant, executive medical director at Birmingham and Solihull Mental Health NHS Foundation Trust, said:
There is no reason for our patients to be concerned about how their information is being used. Our number one priority remains to protect our current patients and their privacy.
In a statement, Alpha also said:
The phase one results have demonstrated that the algorithm has high predictive power, and that most clinicians valued the extra insights provided by the algorithm to help inform their decision making.
However, the pilot study showed that there is more work to do to improve accuracy, with 7 percent of clinicians disagreeing with predictions.
Phase Two of the app
Based on the success of the pilot stage, there is potential for a continued partnership between the NHS trust and Alpha. Clinicians reportedly found the app useful in about 64 percent of flagged cases, but it also returned many false positives, adding to doctors’ workload.
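The reported figures can be read as a rough precision measure: if clinicians agreed with about 64 percent of flags, the remaining roughly 36 percent were effectively false positives. A small worked example of that arithmetic (the biweekly flag count of 25 comes from the report; treating 64 percent as precision is our simplifying assumption):

```python
# Rough arithmetic on the reported pilot figures: ~64% of flagged cases
# were judged useful by clinicians, so the remainder are effectively
# false positives that add to doctors' workload.
flags_per_cycle = 25   # patients flagged every two weeks (reported)
precision = 0.64       # share of flags clinicians found useful (reported)

useful = flags_per_cycle * precision
false_positives = flags_per_cycle * (1 - precision)
print(round(useful))           # → 16 useful flags per cycle
print(round(false_positives))  # → 9 false positives per cycle
```

So even at the reported accuracy, roughly nine of every 25 flags each fortnight would send clinicians chasing cases that did not need intervention — the workload concern the article raises.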
The second stage of the research would involve accessing patients’ mobile network data to improve the algorithm, according to a Health Service Journal report. For this stage, Alpha would be able to access patients’ call and message records and location details. Although the NHS and Alpha have said they will seek patient consent for the second stage, concerns remain over how that consent will be obtained and how the information will be used, given the much wider access to sensitive personal data.
Privacy concerns and the importance of informed consent
Concerns about tech companies working in the health sector are long-standing. Some practitioners are wary of private companies using research as a pretext to get hold of medical data. At the same time, artificial intelligence (AI) is increasingly being used in health care, and the benefits can be real: Google Health’s AI system, for example, proved better at predicting breast cancer than radiologists. But machine learning requires a massive amount of data during the training phase. If private companies are turning to the NHS for that data, they must do more to protect patients’ privacy and make sure patients are informed.
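The anonymisation in the Alpha case reportedly meant removing names and addresses before sharing. A minimal sketch of that kind of direct-identifier stripping — the field names are illustrative assumptions, and real de-identification requires far more than this, since quasi-identifiers such as dates and postcodes can still allow re-identification:

```python
# Minimal sketch of stripping direct identifiers before sharing records.
# Field names are illustrative. Removing these fields alone does NOT make
# data safely anonymous: combinations of remaining fields (dates,
# postcodes, rare diagnoses) can still re-identify individuals.
DIRECT_IDENTIFIERS = {"name", "address", "nhs_number"}

def strip_identifiers(record):
    """Return a copy of the record without direct-identifier fields."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

record = {"name": "Jane Doe", "address": "1 High St", "diagnosis": "F32"}
print(strip_identifiers(record))  # → {'diagnosis': 'F32'}
```

This is why privacy regulators generally treat data like this as pseudonymised rather than anonymised — and pseudonymised data remains personal data under the GDPR.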
Alpha’s partnership with the NHS involves mental health patients, who are already in a vulnerable position; the NHS Trust should be protecting their rights and privacy, not undermining them. The UK’s Health Research Authority reportedly advised the Trust that patient consent wasn’t needed. Even so, obtaining consent before the app was developed would have strengthened the case for adopting it, given its apparent effectiveness.
Data subjects (patients) should be informed about who is receiving their data and for what purpose to ensure the process is transparent.
Alpha is not alone in using patient data without consent. In 2016, Google’s DeepMind sparked controversy with a kidney illness prediction app it developed using data from the Royal Free Hospital Trust, which had provided the personal data of around 1.6 million patients. As with Alpha, patient consent was not sought before the information was shared. The Information Commissioner’s Office (ICO) ruled that the arrangement broke data protection law, and required the NHS trust and DeepMind to complete a privacy impact assessment outlining the specific steps taken to ensure transparency.
At the time, in 2016, only the data controller was liable for breaches of data protection rules. Under the GDPR, both the controller and the processor can be held jointly liable for breaches.
Health data is sensitive and more needs to be done between the NHS and tech companies working in the health industry to protect the privacy and rights of patients.
Kazient Privacy Experts offers bespoke Data Protection, Privacy and GDPR compliance solutions in a language you understand to UK and international organisations, and has received positive media coverage across Europe. Kazient’s GDPR consultants are fully certified to be your outsourced Data Protection Officer or EU Representative. Get in touch to find out how we can help your business by visiting our website www.kazient.co.uk or calling us on 0330 022 9009.