
Medical AI models rely on shortcuts, can cause misdiagnosis


New York, June 1

Artificial Intelligence (AI) models, like humans, have a tendency to look for shortcuts. In the case of AI-assisted disease detection, these shortcuts could lead to diagnostic errors if the models are deployed in clinical settings, researchers warn.

A team from the University of Washington in the US examined multiple models recently put forward as potential tools for accurately detecting Covid-19 from chest radiography, commonly known as chest X-rays.


The findings, published in the journal Nature Machine Intelligence, showed that rather than learning genuine medical pathology, these models rely instead on shortcut learning to draw spurious associations between medically irrelevant factors and disease status.

As a result, the models ignored clinically significant indicators and relied instead on characteristics such as text markers or patient positioning that were specific to each dataset to predict whether someone had Covid-19.
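The failure mode described here can be sketched with a toy example (hypothetical data and classifiers for illustration, not the authors' actual models): a classifier that keys on a dataset-specific text marker scores perfectly on its source hospital's data but falls to chance on data from another hospital, while a classifier using the genuine, noisy pathology signal holds up in both settings.

```python
import random

random.seed(0)

def make_dataset(n, marker_correlation):
    """Each sample: (has_lung_opacity, has_text_marker, covid_label).
    Lung opacity is the genuine but noisy disease signal; the text marker
    is a dataset-specific artifact that may or may not track the label."""
    data = []
    for _ in range(n):
        label = random.random() < 0.5
        # Genuine signal: agrees with the label 90% of the time.
        opacity = label if random.random() < 0.9 else not label
        # Artifact: leaks the label with probability marker_correlation.
        if random.random() < marker_correlation:
            marker = label
        else:
            marker = random.random() < 0.5
        data.append((opacity, marker, label))
    return data

# "Hospital A": every Covid-19 scan happens to carry the text marker.
train = make_dataset(1000, marker_correlation=1.0)
# "Hospital B": the marker is unrelated to disease status.
test = make_dataset(1000, marker_correlation=0.0)

def shortcut_classifier(opacity, marker):
    return marker       # keys on the artifact, ignores pathology

def pathology_classifier(opacity, marker):
    return opacity      # keys on the genuine (noisy) signal

def accuracy(clf, data):
    return sum(clf(o, m) == y for o, m, y in data) / len(data)

print(f"shortcut  A={accuracy(shortcut_classifier, train):.2f} "
      f"B={accuracy(shortcut_classifier, test):.2f}")
print(f"pathology A={accuracy(pathology_classifier, train):.2f} "
      f"B={accuracy(pathology_classifier, test):.2f}")
```

Nothing in the shortcut classifier is "wrong" on Hospital A's data; the spurious cue simply does not travel, which is why the paper's authors stress evaluating models outside their original setting.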


“A physician would generally expect a finding of Covid-19 from an X-ray to be based on specific patterns in the image that reflect disease processes,” said co-lead author Alex DeGrave, from UW’s Medical Scientist Training Program.

“But rather than relying on those patterns, a system using shortcut learning might, for example, judge that someone is elderly and thus infer that they are more likely to have the disease because it is more common in older patients.

“The shortcut is not wrong per se, but the association is unexpected and not transparent. And that could lead to an inappropriate diagnosis,” DeGrave said.


Shortcut learning is less robust than learning genuine medical pathology, and it usually means the model will not generalise well outside of its original setting, the researchers said.

This lack of robustness, combined with the typical opacity of AI decision-making, can make such models prone to a situation known as “worst-case confounding,” owing to the scarcity of training data available for such a new disease.

This scenario increased the likelihood that the models would rely on shortcuts rather than learning the underlying pathology of the disease from the training data, the researchers noted.

–IANS
