
Medical AI Shortcuts May Cause Complications: Reports

NewsGram Desk

Artificial Intelligence (AI) models, like humans, tend to look for shortcuts. In the case of AI-assisted disease detection, these shortcuts could lead to diagnostic errors if the models are deployed in clinical settings, researchers warn.

A team from the University of Washington in the US examined multiple models recently put forward as potential tools for accurately detecting Covid-19 from chest radiography, otherwise known as chest X-rays.


The findings, published in the journal Nature Machine Intelligence, showed that rather than learning genuine medical pathology, these models rely instead on shortcut learning to draw spurious associations between medically irrelevant factors and disease status.


As a result, the models ignored clinically significant indicators and instead relied on characteristics specific to each dataset, such as text markers or patient positioning, to predict whether someone had Covid-19.

"A physician would generally expect a finding of Covid-19 from an X-ray to be based on specific patterns in the image that reflect disease processes," said co-lead author Alex DeGrave, from UW's Medical Scientist Training Program. "But rather than relying on those patterns, a system using shortcut learning might, for example, judge that someone is elderly and thus infer that they are more likely to have the disease because it is more common in older patients.

"The shortcut is not wrong, but the association is unexpected and not transparent. And that could lead to an inappropriate diagnosis," DeGrave said. Shortcut learning is less robust than genuine medical pathology and usually means the model will not generalize well outside of the original setting, the researchers said.

This lack of robustness, combined with the typical opacity of AI decision-making, leaves such models prone to a condition known as "worst-case confounding," which arises from the scarcity of training data available for such a new disease. That scenario increased the likelihood that the models would rely on shortcuts rather than learning the underlying pathology of the disease from the training data, the researchers noted. (IANS/AD)
