
COMMENTARY

Beware of Biased AI

F. Perry Wilson, MD, MSCE


December 19, 2023


This transcript has been edited for clarity.

Okay. You're in the emergency room, evaluating a patient who comes in with acute shortness of breath. It could be pneumonia, it could be a COPD exacerbation, it could be heart failure. You look at the x-ray to help make your diagnosis — let's say COPD — and then, before you start ordering the appropriate treatment, you see a pop-up in the electronic health record, a friendly AI assistant that says something like, "I'm pretty sure this is heart failure."

What do you do?

This scenario is closer than you think. In fact, scenarios like this are already happening in health systems around the country, sometimes in pilot programs, sometimes with more full-fledged integration. Either way, the point remains: At some point, clinicians' diagnoses are going to be "aided" by AI.

What's the problem with AI predictions? Well, people often complain that AI is a "black box": sure, it may tell me it thinks the diagnosis is heart failure, but I don't know why it thinks that. To work well alongside clinicians, the argument goes, AI needs to explain itself.

But a new study suggests that "explainability" of AI predictions doesn't make much difference in how doctors use it.
