The FDA is looking at new rules to govern AI

by Sean Ruck, Contributing Editor | December 02, 2019
From the November 2019 issue of HealthCare Business News magazine

In October’s issue of HCB News, we looked at the history of AI and the fundamentals behind the technology. Rik Primo, the principal of Primo Medical Imaging Informatics Inc., is again with us for part two of our two-part piece to look at the FDA’s efforts to regulate healthcare AI software.

According to Primo, the FDA has a challenge. “The FDA has very clear rules for the approval of medical devices. If you have an X-ray device, for example, you submit the device to the FDA. They will test it; you help them with the testing.” But, Primo points out, the intended use there is straightforward: producing an image using radiation. The term to be familiar with here is “software as a medical device,” or SaMD. SaMD is defined by the International Medical Device Regulators Forum (IMDRF) as “software intended to be used for one or more medical purposes that perform these purposes without being part of a hardware medical device.”

Primo says AI should be treated much like other SaMD where regulation is concerned. According to the IMDRF, he notes, adaptive artificial intelligence and machine learning technologies differ from other SaMD in that “they have the potential to adapt and optimize device performance in real time to continuously improve health care for patients.” The FDA, he says, wants to focus on the risk posed to the user and on patient safety issues that could arise from use of the device.

Altering an X-ray device, Primo says, is not a simple task. “You cannot just get a different type of X-ray tube or a different cassette tray and install it in an X-ray device that wasn’t designed to work with these parts. But with software, you basically can change the software and make the device do different functions which may take it beyond the scope of the initial intended use it was tested for.”

SaMD can be locked, though, meaning significant algorithm changes wouldn’t take effect until the next version is introduced with the accompanying FDA nod. This workaround was created by the FDA and the IMDRF. However, some AI algorithms, once released into the “wild,” have the ability to continuously learn (AI-CL) and evolve based on real-world experience. In that case, by the time a new update is introduced, the software may look very different from the original version that gained FDA approval.

Primo says there may be a mistaken belief that humans are directing the changes in AI algorithms. In fact, AI-CL algorithms can adjust their behavior based on interaction or feedback from data or users, without necessarily requiring explicit directives for those changes from the users.
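To make that distinction concrete, the following is a minimal, purely illustrative sketch in Python. The class names, the scoring threshold, and the feedback rule are all hypothetical and are not drawn from any actual product, vendor code, or FDA submission; the point is only to show how a locked algorithm behaves identically until a new cleared version ships, while a continuously learning one quietly shifts its own decision rule as real-world feedback arrives.

```python
# Illustrative sketch only: hypothetical classes, not any vendor's actual SaMD code.

class LockedClassifier:
    """A 'locked' algorithm: behaves the same until a new, separately cleared version ships."""
    def __init__(self, threshold):
        self.threshold = threshold  # fixed at release; changing it means a new submission

    def predict(self, score):
        return "flag" if score >= self.threshold else "normal"


class ContinuouslyLearningClassifier:
    """An AI-CL algorithm: adjusts its own decision threshold from real-world feedback."""
    def __init__(self, threshold, learning_rate=0.05):
        self.threshold = threshold
        self.learning_rate = learning_rate

    def predict(self, score):
        return "flag" if score >= self.threshold else "normal"

    def feedback(self, score, was_true_finding):
        # No clinician explicitly reprograms anything; behavior drifts on its own.
        if was_true_finding and score < self.threshold:
            self.threshold -= self.learning_rate   # missed finding: become more sensitive
        elif not was_true_finding and score >= self.threshold:
            self.threshold += self.learning_rate   # false alarm: become more conservative


if __name__ == "__main__":
    locked = LockedClassifier(threshold=0.8)
    adaptive = ContinuouslyLearningClassifier(threshold=0.8)

    # After a run of confirmed findings, the adaptive model no longer matches
    # the behavior that was originally reviewed, while the locked model does.
    for score in (0.75, 0.72, 0.78):
        adaptive.feedback(score, was_true_finding=True)

    print(locked.predict(0.7), adaptive.predict(0.7))  # prints: normal flag
```

In this toy example, the same case scored 0.7 is handled differently by the two models after a few rounds of feedback, which is exactly the regulatory question: the adaptive version in the field is no longer the version that was evaluated.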