
AI produces more mammo false positives on Black patients, older patients

by Gus Iversen, Editor in Chief | May 28, 2024
Patient age and race can significantly influence the accuracy of AI algorithms used in interpreting digital mammograms, according to a study published in Radiology, a journal of the Radiological Society of North America (RSNA).

The study, conducted by researchers at Duke University Medical Center, involved nearly 5,000 digital breast tomosynthesis screening exams. It found that AI algorithms produced false positive results more frequently in certain demographic groups, specifically among Black patients and older individuals aged 71-80, and less frequently among Asian patients and younger individuals aged 41-50.

The study highlighted the lack of demographically diverse databases for AI training, pointing out that the FDA does not mandate diverse data sets for validation. This raises concerns about whether AI software can perform uniformly across different patient populations.

"AI has become a resource for radiologists to improve their efficiency and accuracy in reading screening mammograms while mitigating reader burnout," said Dr. Derek L. Nguyen, assistant professor at Duke University in Durham, North Carolina. "However, the impact of patient characteristics on AI performance has not been well studied."

For the study, researchers identified patients with negative digital breast tomosynthesis screenings performed between 2016 and 2019. From this group, they randomly selected 4,855 patients (median age 54 years), including 1,316 white, 1,261 Black, 1,351 Asian, and 927 Hispanic patients.

The mammograms were evaluated using iCAD's ProFound AI 3.0, which assigned each exam a case score (certainty of malignancy) and a risk score (one-year subsequent malignancy risk). Because all mammograms in the study were negative for cancer, any exam flagged as suspicious by the AI was considered a false positive.
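To illustrate how false-positive rates might be tallied under this setup, below is a minimal sketch in Python. It is not the study's actual code: the column names ("race", "age_group", "case_score"), the example values, and the flagging threshold are assumptions for illustration, not details of ProFound AI or the Duke analysis.

# Minimal sketch: counting AI false-positive rates by demographic group,
# assuming every exam in the cohort is known to be cancer-negative.
# Column names and the threshold below are illustrative assumptions only.
import pandas as pd

# Hypothetical per-exam results: one row per negative screening exam.
exams = pd.DataFrame({
    "race":       ["White", "Black", "Asian", "Black", "Hispanic", "Asian"],
    "age_group":  ["41-50", "71-80", "41-50", "51-60", "61-70", "71-80"],
    "case_score": [12, 78, 8, 55, 30, 20],  # AI "certainty of malignancy"
})

FLAG_THRESHOLD = 50  # assumed cutoff above which the AI marks an exam suspicious

# Because all exams are negatives, any flagged exam is a false positive.
exams["false_positive"] = exams["case_score"] >= FLAG_THRESHOLD

# False-positive rate per demographic group.
print(exams.groupby("race")["false_positive"].mean())
print(exams.groupby("age_group")["false_positive"].mean())

Comparing these per-group rates is the kind of analysis that reveals the disparities the study reports; the actual work used nearly 5,000 exams rather than toy values.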

The study underscores the need for healthcare institutions to consider their patient demographics when purchasing AI software for mammogram interpretation and to inquire about the diversity of the training data used by vendors.

"Having a baseline knowledge of your institution's demographics and asking the vendor about the ethnic and age diversity of their training data will help you understand the limitations you'll face in clinical practice," Dr. Nguyen advised.

Tranzrad

re: AI produces more mammo false positives on Black patients, older patients

May 29, 2024 11:23

Must the AI know the patient's race and age? What if you change the age and race — how does that impact the false positives? It seems like researchers could "re-run" the images for diagnosis with fictitious demographic information to test this theory. This would require HIPAA compliance steps that current rules may disallow. 🤔
