Reviewed by Alex Smith, Sep 27 2022
Members of the public are being asked to help eliminate biases against racial minorities and other underserved communities in artificial intelligence (AI) algorithms used in healthcare.
Health researchers are seeking support to ensure that ‘minoritized’ communities, those actively disadvantaged by social constructs, are not excluded from the future benefits of AI in healthcare.
The researchers, led by the University of Birmingham and University Hospitals Birmingham, recently announced in Nature Medicine the launch of a consultation on a set of standards that they expect will reduce the biases present in AI algorithms.
There is growing evidence that some AI algorithms work less well for certain groups of people, mainly those from minoritized racial and ethnic communities. Some of these problems arise from biases in the datasets used to develop the algorithms. As a result, patients from Black and minoritized ethnic communities may receive inaccurate predictions, leading to misdiagnosis and incorrect treatment.
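As a purely hypothetical illustration of the kind of gap described above, the sketch below evaluates a diagnostic model's sensitivity separately for each demographic group in a test set; the dataset, column names (such as "ethnic_group" and "label"), and the model choice are assumptions for illustration and are not part of the STANDING Together work.

```python
# Hypothetical illustration only: stratifying a diagnostic model's sensitivity
# by demographic group to surface the kind of performance gap described above.
# The file name, column names, and model are assumptions, not from the source.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split

# Assumed dataset: numeric features plus a binary "label" and an "ethnic_group" column.
df = pd.read_csv("diagnoses.csv")
features = df.drop(columns=["label", "ethnic_group"])

X_train, X_test, y_train, y_test, g_train, g_test = train_test_split(
    features, df["label"], df["ethnic_group"], test_size=0.3, random_state=0
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Sensitivity (recall) computed per group: large gaps between groups are one
# signal that the training data under-represents some of them.
for group in sorted(g_test.unique()):
    mask = g_test == group
    sensitivity = recall_score(y_test[mask], model.predict(X_test[mask]))
    print(f"{group}: sensitivity = {sensitivity:.2f}")
```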
STANDING Together is a global partnership that will develop best-practice standards for healthcare datasets used in AI, ensuring they are diverse and inclusive and do not leave minoritized or underrepresented groups behind. The project is backed by the NHS AI Lab and The Health Foundation, with funding managed by the National Institute for Health and Care Research, the research partner of the NHS, public health and social care, as part of the NHS AI Lab’s AI Ethics Initiative.
By getting the data foundation right, STANDING Together ensures that ‘no-one is left behind’ as we seek to unlock the benefits of data-driven technologies like AI. We have opened our Delphi study to the public so we can maximize our reach to communities and individuals. This will help us ensure the recommendations made by STANDING Together truly represent what matters to our diverse community.
Dr Xiaoxuan Liu, Institute of Inflammation and Ageing, University of Birmingham
Dr Xiaoxuan Liu and Professor Alastair Denniston, who is also affiliated with the Institute of Inflammation and Ageing at the University of Birmingham, are co-leads of the STANDING Together project.
As a doctor in the NHS, I welcome the arrival of AI technologies that can help us improve the healthcare we offer—diagnosis that is faster and more accurate, treatment that is increasingly personalized, and health interfaces that give greater control to the patient. But we also need to ensure that these technologies are inclusive. We need to make sure that they work effectively and safely for everybody who needs them.
Professor Alastair Denniston, Consultant Ophthalmologist, University Hospitals Birmingham
Jacqui Gath, Patient Partner on the STANDING Together project stated, “This is one of the most rewarding projects I have worked on, because it incorporates not only my great interest in the use of accurate validated data and interest in good documentation to assist discovery, but also the pressing need to involve minority and underserved groups in research that benefits them. In the latter group, of course, are women.”
The STANDING Together project is now open for public consultation as part of a Delphi consensus study. The team is calling on members of the public, AI developers, scientists, medical professionals, policymakers, data scientists, and regulators to help evaluate these standards and ensure they work for everyone.
Journal Reference:
Ganapathi, S., et al. (2022) Tackling bias in AI datasets through the STANDING Together initiative. Nature Medicine. https://doi.org/10.1038/s41591-022-01987-w.