
Stanford AI CheXNeXt can spot 14 conditions on chest X-ray with accuracy, speed

by Thomas Dworetzky, Contributing Reporter | November 27, 2018

In 2017, the Stanford team unveiled its earlier effort, CheXNet, which could “detect pneumonia from chest X-rays at a level exceeding practicing radiologists,” in a paper appearing in the online research archive arXiv.

“The motivation behind this work is to have a deep-learning model to aid in the interpretation task that could overcome the intrinsic limitations of human perception and bias, and reduce errors,” Lungren explained in a Stanford report on his work at the time, adding, “more broadly, we believe that a deep-learning model for this purpose could improve health care delivery across a wide range of settings.”

Their latest algorithm, CheXNeXt, is a neural network trained on ChestX-ray14, a dataset of hundreds of thousands of X-rays released by the National Institutes of Health.

Of that set, roughly 112,000 X-rays were used to teach the algorithm; a separate set of 420 X-rays was then used to test it on the 14 pathologies against experienced human radiologists.
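The study's actual code is not shown in the article, but the held-out evaluation it describes can be sketched in broad strokes: a multi-label model outputs one probability per pathology per X-ray, and thresholded predictions are scored against radiologist labels on the 420-image test set. Everything below (the pathology names, the scoring function, the sample data) is illustrative, not from the study.

```python
# Hypothetical sketch of per-pathology scoring on a held-out test set.
# The study covered 14 pathologies; three are listed here for brevity.
PATHOLOGIES = ["pneumonia", "cardiomegaly", "edema"]

def sensitivity_specificity(preds, labels):
    """Compute sensitivity and specificity for one pathology from
    binary model predictions and radiologist reference labels."""
    tp = sum(1 for p, l in zip(preds, labels) if p and l)
    tn = sum(1 for p, l in zip(preds, labels) if not p and not l)
    fp = sum(1 for p, l in zip(preds, labels) if p and not l)
    fn = sum(1 for p, l in zip(preds, labels) if not p and l)
    sens = tp / (tp + fn) if (tp + fn) else 0.0
    spec = tn / (tn + fp) if (tn + fp) else 0.0
    return sens, spec

# Toy example: 4 test X-rays scored for a single pathology.
model_preds = [True, True, False, False]   # thresholded model output
rad_labels = [True, False, True, False]    # radiologist consensus
print(sensitivity_specificity(model_preds, rad_labels))  # (0.5, 0.5)
```

In the toy data there is one true positive, one false positive, one false negative, and one true negative, so both sensitivity and specificity come out to 0.5.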

“We treated the algorithm like it was a student; the NIH data set was the material we used to teach the student, and the 420 images were like the final exam,” Lungren recounted.

The comparison with the results of human diagnosticians, not just other AI approaches, was also of great importance, he stressed.

“That’s another factor that elevates this research,” he advised. “We weren’t just comparing this against other algorithms out there; we were comparing this model against practicing radiologists.”

The algorithm had another important advantage beyond its accuracy: the radiologists took an average of 240 minutes to read the test X-rays, while the machine took 1.5 minutes.

Where CheXNeXt goes next has yet to be determined.

“I could see this working in a few ways. The algorithm could triage the X-rays, sorting them into prioritized categories for doctors to review, like normal, abnormal or emergent,” Lungren said.
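The triage workflow Lungren describes could be sketched as a simple bucketing step: each study's pathology probabilities are reduced to a top score, which routes the study into an emergent, abnormal, or normal review queue. This is an illustrative sketch, not the study's code; the threshold values and function names are made up.

```python
# Hypothetical triage sketch: route chest X-ray studies into review
# queues by the model's highest pathology probability.
EMERGENT_THRESHOLD = 0.9   # assumed cutoff, not from the study
ABNORMAL_THRESHOLD = 0.5   # assumed cutoff, not from the study

def triage(study_scores):
    """study_scores: dict mapping a study ID to its list of per-pathology
    probabilities. Returns studies bucketed by review priority."""
    queues = {"emergent": [], "abnormal": [], "normal": []}
    for study_id, probs in study_scores.items():
        top = max(probs)
        if top >= EMERGENT_THRESHOLD:
            queues["emergent"].append(study_id)
        elif top >= ABNORMAL_THRESHOLD:
            queues["abnormal"].append(study_id)
        else:
            queues["normal"].append(study_id)
    return queues

# Toy example with three studies and fabricated scores.
print(triage({"study_a": [0.95, 0.10],
              "study_b": [0.60, 0.20],
              "study_c": [0.15, 0.05]}))
# → {'emergent': ['study_a'], 'abnormal': ['study_b'], 'normal': ['study_c']}
```

A real deployment would also need per-pathology thresholds (a faint nodule and a tension pneumothorax at the same probability do not carry the same urgency), which is why the single `max` here is only a starting point.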

It could also end up as a diagnostic helpmate to a primary care physician, alerting practitioners when it is time to call a radiologist.

“We should be building AI algorithms to be as good or better than the gold standard of human, expert physicians. Now, I’m not expecting AI to replace radiologists any time soon, but we are not truly pushing the limits of this technology if we’re just aiming to enhance existing radiologist workflows,” Lungren said. “Instead, we need to be thinking about how far we can push these AI models to improve the lives of patients anywhere in the world.”

Back to HCB News
