Study reveals why AI models that analyze medical images can be biased

Submitted by Kate Moore on July 2, 2024

Source: https://news.mit.edu/2024/study-reveals-why-ai-analyzed-medical-images-can-be-b…

In 2022, MIT researchers reported that AI models can make accurate predictions about a patient’s race from chest X-rays. That research team has now found that the models most accurate at making demographic predictions also show the biggest “fairness gaps.” The findings suggest that these models may be using “demographic shortcuts” when making their diagnostic predictions, which lead to incorrect results for women, Black people, and other groups, the researchers say.

“It’s well-established that high-capacity machine-learning models are good predictors of human demographics such as self-reported race or sex or age. This paper re-demonstrates that capacity, and then links that capacity to the lack of performance across different groups, which has never been done,”

says Marzyeh Ghassemi, an MIT associate professor of electrical engineering and computer science, a member of MIT’s Institute for Medical Engineering and Science, and the senior author of the study.
