The Food and Drug Administration (FDA) is rolling out two new guidance documents aimed at clarifying how the agency will oversee artificial intelligence (AI) in consumer-facing health technologies, according to FDA Commissioner Marty Makary.

Makary announced the guidance documents Tuesday at the Consumer Electronics Show (CES), noting that they will help provide clarity for companies building AI-powered wearable devices and decision support tools.

“There’s an AI revolution, and we’re here at the opening day of the Consumer Electronics Show, announcing two new FDA guidances that will promote more innovation with AI and medical devices,” Makary said in a video posted to X.

Wearables

The first guidance, related to wearables, clarifies that “low risk products that promote a healthy lifestyle” will not be subject to FDA regulation. However, products that are advertised as “medical or clinical grade” will be subject to regulation by the agency.

“We want to let companies know very clear guidance, that if their device or software is simply providing information, that they can do that without FDA regulation,” Makary said Tuesday on Fox Business’s “Varney & Co.”

“The only stipulation is that if they make claims of something being medical grade … we don’t want people changing their medicines based on something that’s just a screening tool or an estimate of a physiologic parameter,” he added.

For instance, the guidance explains that a wearable smart watch intended to assess an individual’s sleep quality, heart rate, and blood pressure “relates to general wellness and does not refer to a specific disease or medical condition.”

However, the guidance notes that if the watch company “implied the product’s use in a medical or clinical context,” then it would not be considered a low-risk general wellness product.

Decision support tools

Similarly, the second guidance softens the FDA’s regulatory approach to clinical decision support software, which includes AI-enabled products that assist doctors.

“If something is simply providing information like ChatGPT or Google, we’re not going to outrun that lion,” Makary said on Fox Business. “We’re not going to go in there and say, ‘Well, there’s one result that is inaccurate. Therefore we got to shut this down.’ We have to promote these products, and at the same time, just guard against major safety concerns.”

The guidance outlines several examples of clinical decision support software the FDA will and will not regulate.

For example, the guidance explains that clinical decision support software may fall outside FDA regulation when it is intended to support a healthcare professional and allows the clinician to independently review the basis for a recommendation.

By contrast, software that analyzes certain medical images or provides recommendations without allowing independent review may remain regulated.

“Nothing substitutes the clinical judgment and wisdom of a physician or other clinician that is talking to you. But we can fight bad ideas with more ideas,” Makary said, adding, “We are fully embracing AI, and we want to see it out there quickly and safely.”

Grace Dille is MeriTalk's Assistant Managing Editor covering the intersection of government and technology.