Academics tackle gender bias in healthcare AI

An 18-month project led by the University of Glasgow will gather data from male and female cohorts and compare how this information is treated by AI

A Scottish university is to explore how to prevent gender bias from skewing outcomes in healthcare artificial intelligence.

Over the next 18 months, the University of Glasgow will work to develop a new framework aimed at correcting gender-related imbalances in AI systems currently used to assess data gathered by remote monitoring technologies.

A research team will collect data from 30 male and 30 female study volunteers using radar sensors. Separate models will be trained on the male and female data, comparing performance and highlighting any biases in the function of the AI so they can later be adjusted.
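The cross-cohort comparison described above can be illustrated with a minimal sketch. This is not the project's actual pipeline or radar data: it uses synthetic features and a simple nearest-centroid classifier, with a hypothetical cohort-specific shift standing in for physiological differences. Training one model per cohort and evaluating each model on both cohorts surfaces any performance gap of the kind the researchers are looking for.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_cohort(n, shift):
    # Synthetic stand-in for radar-sensor features: two classes
    # (e.g. "typical" vs. "at-risk" readings), with a cohort-specific
    # shift simulating physiological differences between cohorts.
    X0 = rng.normal(0.0 + shift, 1.0, size=(n // 2, 4))
    X1 = rng.normal(1.5 + shift, 1.0, size=(n // 2, 4))
    X = np.vstack([X0, X1])
    y = np.array([0] * (n // 2) + [1] * (n // 2))
    return X, y

def train_centroids(X, y):
    # Nearest-centroid classifier: one mean vector per class.
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def accuracy(centroids, X, y):
    classes = sorted(centroids)
    dists = np.stack([np.linalg.norm(X - centroids[c], axis=1) for c in classes])
    preds = np.array(classes)[dists.argmin(axis=0)]
    return float((preds == y).mean())

# 30 volunteers per cohort, as in the study design.
X_m, y_m = make_cohort(30, shift=0.0)
X_f, y_f = make_cohort(30, shift=0.8)

model_m = train_centroids(X_m, y_m)
model_f = train_centroids(X_f, y_f)

# Cross-cohort evaluation: a large gap between same-cohort and
# cross-cohort accuracy flags a cohort-specific bias to adjust for.
print(f"male model, male data:     {accuracy(model_m, X_m, y_m):.2f}")
print(f"male model, female data:   {accuracy(model_m, X_f, y_f):.2f}")
print(f"female model, female data: {accuracy(model_f, X_f, y_f):.2f}")
print(f"female model, male data:   {accuracy(model_f, X_m, y_m):.2f}")
```

In this toy setup each model scores well on its own cohort and worse on the other, which is exactly the signal the researchers would use to identify and correct biased behaviour.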

Dr Nour Ghadban, the project’s principal investigator, said: “New sensors linked with artificial intelligence could offer potentially transformational opportunities to improve the way that we monitor patient wellbeing. However, we can only reap those benefits if we can be sure that the AI systems we use to achieve them are up to the task. We know that all kinds of human bias across race, class, gender and more can be unwittingly incorporated into AI decision-making tools if the proper care isn’t taken when they are being trained on real-world data.”


The project is funded by the Women and Science Chair at Université Paris Dauphine-PSL, and its announcement follows recent developments in AI-supported sensing technology. The University of Glasgow is among those developing sensors intended to track heart and lung rhythms without the need for wearable technology or video cameras.

The Scottish institution is developing a £5.5m system named Healthcare QUEST – a remote monitoring technology in which sensors will generate advice and alerts, offering personalised suggestions for lifestyle improvement and rehabilitation programmes to those recovering from illness at home. It is hoped the technology will allow older people to live more independently and provide additional insight into the wellbeing of patients staying in hospital wards.

Concerns about potential gender bias in AI systems have grown in recent years. In 2019 it emerged that chatbot technology used to triage patients of the remote NHS service GP at Hand was liable to diagnose identical symptoms very differently in male and female patients: men were warned of a possible heart attack, while women were advised they were likely experiencing depression or a panic attack.

Sofia Villegas and PublicTechnology staff
