What does AI plan mean for NHS patient data and is there cause for concern?


UK ministers have committed to creating a National Data Library for building artificial intelligence models, as part of an AI action plan.

The library will comprise state-controlled data, with at least five “high-impact” public datasets being compiled. The prime minister, Keir Starmer, indicated on Monday that patient data from the National Health Service could be part of the library.

Health data is a sensitive issue in an age of criminal hackers, cyber espionage by rogue states and general concerns about the robustness of AI tools. Here we answer some of the questions around the potential use of NHS data.


What does the AI action plan say about health data?

The plan, written by the tech investor Matt Clifford, does not make explicit reference to health data but calls for a National Data Library that can be used by tech startups and researchers to train new models.

However, Starmer was more explicit about using NHS data on Monday, saying there was a “huge opportunity” to improve healthcare. He said: “I don’t think that we should have a defensive stance here that will inhibit the sort of breakthroughs that we need.”

NHS trusts have already used patient data to develop AI models to predict conditions such as high blood pressure and eye diseases.


What are the concerns about using health data in the field of AI?

Personal health data is by its nature highly sensitive and its vulnerability in a digital environment has already been underlined by recent ransomware attacks that have affected NHS trusts.

Andrew Duncan, the director of foundational AI at the UK’s Alan Turing Institute, says even anonymised health data can be traced back to individuals through a process known as “re-identification”, whereby “de-identified” records are matched against other available information until a person can be singled out.

“Once you start to narrow things down you can start to re-identify people easily,” he says. Duncan adds that AI models can be trained in a way that prevents re-identification, although “the caveat is that all of this has to be done very carefully”.
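
To illustrate the kind of linkage attack Duncan is describing, here is a minimal, hypothetical sketch: every name, column and record is invented, and nothing like it appears in the plan. The point is simply that a few quasi-identifiers left in a “de-identified” table can be enough to match records against other available information.

```python
import pandas as pd

# Hypothetical "anonymised" health records: names removed, but
# quasi-identifiers (age, sex, partial postcode) retained.
health = pd.DataFrame({
    "age": [34, 34, 67],
    "sex": ["F", "M", "F"],
    "postcode_prefix": ["SW1A", "M1", "SW1A"],
    "diagnosis": ["type 2 diabetes", "hypertension", "stroke"],
})

# Hypothetical auxiliary data an attacker might already hold
# (for example, details gleaned from public sources), with names attached.
auxiliary = pd.DataFrame({
    "name": ["Alice Example", "Bob Example"],
    "age": [67, 34],
    "sex": ["F", "M"],
    "postcode_prefix": ["SW1A", "M1"],
})

# Joining on the shared quasi-identifiers re-attaches identities
# to records that were nominally de-identified.
reidentified = health.merge(auxiliary, on=["age", "sex", "postcode_prefix"])
print(reidentified[["name", "diagnosis"]])
```

Because only one record in the invented health table belongs to a 67-year-old woman in that postcode area, the join attaches a name to her diagnosis; that is the sort of narrowing down Duncan warns about.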

MedConfidential, which campaigns for confidentiality in healthcare, also wants clarity on whether a health dataset will respect patients who have signed an opt-out that prevents their data being used for research and planning in England. About 6% of NHS patients have signed the opt-out.


Will the data be used for commercial purposes?

The plan states that public and private datasets will enable “innovation by UK startups”, which indicates private companies will be able to access the material. Government officials have not ruled out allowing the data to be used for profit-making purposes.

However, the plan is clear that ministers and officials must take into consideration issues of “public trust, national security, privacy, ethics, and data protection”.

In 2017 a partnership between the NHS and a private AI company fell foul of the UK data regulator, which found that London’s Royal Free hospital had failed to comply with data protection law when it handed over the personal data of 1.6 million patients to Google’s AI unit, DeepMind. The data transfer was part of a trial of a system for diagnosing acute kidney injury.


What could the data be used for?

In his speech on Monday Starmer used the example of AI being deployed in a medical emergency last year to identify the exact location of a blood clot in a stroke victim’s brain. He said patient data could be used, through AI, to “predict and prevent” strokes in the future.

The AI trials being conducted by NHS trusts also indicate a raft of uses, from predicting which patients are most likely to attend A&E frequently to identifying people at risk of type 2 diabetes.

What is the legal position?

Anonymised data is not covered by the General Data Protection Regulation (GDPR), so if the library’s health data falls into that category its use would be less legally problematic.

If the data is not fully anonymised, then GDPR would apply, as would the common law duty of confidentiality, meaning patient consent would be required to use it, although there is a public interest exception.

Kate Brimsted, a partner at the law firm Shoosmiths, said: “True and effective anonymisation means the UK GDPR would not apply. It would also overcome the confidentiality restrictions. However, achieving robust anonymisation is no simple task – it’s far more complex than merely removing names and other obvious identifiers.”


