Sajid Javid, the former Health Secretary, commissioned an independent review earlier this year to advise the Government on issues of equity in medical devices used in the NHS and on potential solutions to these. Medical devices include physical instruments and machines, artificial intelligence (AI) tools and software to assist diagnostic or therapeutic decision making, as well as AI-derived predictive analytics, including those based on genomics data.
The review follows concerns about the potential for ethnic bias in the design and use of some medical devices commonly used in the NHS, and that treatment for some groups may be suffering as a result. One example came during the response to the COVID-19 pandemic, when inaccurate pulse oximeter readings were found to be more common in Black patients than in white patients, with blood oxygen levels overestimated in darker-skinned patients, potentially meaning that dangerously low oxygen levels could be missed. There is concern that many devices were originally developed and tested in narrow populations, such as predominantly white males, resulting in unintended bias.
In another line of enquiry, the review is looking at AI tools that are used in healthcare to see whether their algorithms have in-built biases. For example, advanced clinical prediction models have sometimes underperformed on women, ethnic minorities and poorer groups, partly due to these population groups being under-represented in the data sources for the modelling. This is important because AI is increasingly being used to support decision making in prevention, diagnostics and therapeutics.
The review, chaired by Professor Dame Margaret Whitehead, is looking to establish the extent and impact of potential ethnic and other unfair biases in the design and use of medical devices used in the NHS, and what can be done to remedy them.
It aims to collect and review evidence on whether the way in which some medical devices are designed, developed or used may lead to some parts of the population not benefiting as much as they should. The evidence so far is inconclusive, so the review is looking to establish where and how potential ethnic and other unfair biases may arise across the entire lifecycle of medical devices, and the extent and impact of these, and to make recommendations for more equitable solutions by June 2023.
Call for Evidence of Unfair Biases
The review is looking to receive any data and evidence about existing equity concerns or biases, and any mitigating solutions. This could include:
Who Does the DHSC Want To Hear From?
The DHSC is happy to hear from anyone, but particularly from:
Respondents have until 6 October to reply to the review and can do so anonymously.