As artificial intelligence becomes increasingly embedded in medical devices, industry stakeholders are urging the US Food and Drug Administration (FDA) to take a measured approach to oversight, relying on existing quality and postmarket controls rather than creating new, AI-specific regulatory frameworks.
The feedback follows a discussion paper issued by FDA’s Center for Devices and Radiological Health (CDRH) seeking input on how best to evaluate the real-world performance of AI-enabled medical devices, including how manufacturers should detect and manage performance drift over time. Stakeholders broadly agreed that continuous performance monitoring is essential but cautioned against regulatory duplication.
Existing Quality Systems as the Foundation
Industry groups emphasized that FDA already has a robust framework in place through its quality management system requirements, soon to transition fully to the Quality Management System Regulation (QMSR), which aligns those requirements with ISO 13485. These controls require manufacturers to monitor device performance, manage complaints, investigate nonconformities, and implement corrective and preventive actions.
Stakeholders argued that these mechanisms are well suited to monitoring AI-enabled devices, including the evaluation of real-world data, device logs, and postmarket feedback. Rather than introducing parallel oversight structures specific to AI, commenters recommended leveraging and modernizing existing processes.
This approach, they noted, would help maintain patient safety while avoiding unnecessary regulatory burden that could slow innovation.
Risk-Based Oversight for AI Technologies
Several organizations reinforced the importance of a risk-based approach to AI regulation. They noted that not all AI-enabled devices pose the same level of risk and that oversight should scale based on factors such as intended use, degree of automation, human involvement, and data transparency.
Under this model, higher-risk AI-enabled devices may warrant enhanced postmarket studies or closer surveillance, while lower-risk applications could be managed through standard quality and postmarket controls. Stakeholders stressed that flexibility will be critical as AI technologies continue to evolve rapidly.
Addressing Performance Drift and Real-World Monitoring
A recurring theme in the comments was the need for ongoing monitoring of AI performance once devices are deployed in clinical settings. Performance drift, whether due to changes in input data, clinical workflows, or software updates, poses unique challenges for AI-enabled products.
Industry groups acknowledged FDA’s focus on real-world performance monitoring as timely and appropriate, while emphasizing that manufacturers already collect and assess relevant data as part of existing surveillance programs. They encouraged FDA to build on these practices rather than redefine them.
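To make input-data drift more concrete, the minimal sketch below compares the distribution of a single model input feature observed in production against the distribution observed during validation. It is an illustrative assumption, not a method described in the CDRH discussion paper: the feature values, the 0.01 significance threshold, and the choice of a two-sample Kolmogorov-Smirnov test are all hypothetical.

```python
# Illustrative sketch only: a simple input-data drift check that compares a
# reference (validation-era) feature distribution against recent production
# data using a two-sample Kolmogorov-Smirnov test. The feature, threshold,
# and test choice are hypothetical assumptions, not an FDA-prescribed method.
import numpy as np
from scipy.stats import ks_2samp


def check_feature_drift(reference: np.ndarray, production: np.ndarray,
                        p_threshold: float = 0.01) -> dict:
    """Flag drift when the production distribution differs significantly
    from the reference distribution for a single model input feature."""
    result = ks_2samp(reference, production)
    return {
        "ks_statistic": float(result.statistic),
        "p_value": float(result.pvalue),
        "drift_flagged": bool(result.pvalue < p_threshold),
    }


if __name__ == "__main__":
    rng = np.random.default_rng(seed=0)
    # Hypothetical lab value whose mean shifts after a site protocol change.
    validation_values = rng.normal(loc=100.0, scale=15.0, size=5000)
    production_values = rng.normal(loc=108.0, scale=15.0, size=1000)
    print(check_feature_drift(validation_values, production_values))
```

A check like this might sit inside a manufacturer's existing postmarket surveillance procedures, with thresholds, review cadence, and escalation paths documented under the quality system rather than in a separate AI-specific process.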
Enhancing Adverse Event Reporting
Some stakeholders also called for updates to FDA’s adverse event reporting systems to better capture AI-specific risks. Current reporting mechanisms primarily categorize events as malfunctions, injuries, or deaths, which may not fully reflect issues such as algorithm bias, hallucinations, or gradual model degradation.
Enhancing reporting tools to capture AI-relevant information could improve FDA’s ability to identify emerging risks without imposing new oversight structures.
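As a purely hypothetical illustration of what "AI-relevant information" could look like in a structured report, the sketch below adds fields such as a model version and an AI-specific signal type alongside the conventional event category. None of these field names come from FDA's actual reporting systems; they are assumptions for discussion only.

```python
# Purely hypothetical illustration of AI-relevant fields an event report
# could carry; this is not FDA's MedWatch/eMDR schema or a proposed format.
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class AIDeviceEventReport:
    device_name: str
    event_date: date
    event_category: str                    # conventional: "malfunction", "injury", "death"
    ai_signal_type: Optional[str] = None   # e.g., "bias", "hallucination", "model_degradation"
    model_version: Optional[str] = None
    affected_subpopulation: Optional[str] = None
    performance_metric_delta: Optional[float] = None
    narrative: str = ""


# Example record for a gradual model-degradation signal (values invented).
report = AIDeviceEventReport(
    device_name="Example CAD algorithm",
    event_date=date(2026, 1, 7),
    event_category="malfunction",
    ai_signal_type="model_degradation",
    model_version="2.3.1",
    performance_metric_delta=-0.04,        # drop in sensitivity vs. baseline
    narrative="Gradual decline in sensitivity after a scanner protocol change.",
)
print(report)
```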
What This Means for AI-Enabled Device Manufacturers
The industry response signals strong support for regulatory continuity as AI becomes more prevalent in medical devices. Manufacturers should expect FDA to continue emphasizing postmarket monitoring, quality system maturity, and risk-based oversight, while remaining cautious about creating AI-specific regulatory silos.
For developers, this reinforces the importance of strong quality systems, well-documented monitoring processes, and proactive planning for performance management across the product lifecycle.
How EMMA International Supports AI-Enabled Device Strategy
At EMMA International, we support medical device and digital health companies developing AI-enabled products by aligning regulatory strategy, quality systems, and postmarket surveillance approaches with FDA expectations. Our teams help clients assess AI risk profiles, design real-world performance monitoring frameworks, prepare for QMSR transition, and engage regulators with confidence.
For more information on how EMMA International can assist, visit www.emmainternational.com or contact us at (248) 987-4497 or info@emmainternational.com.
References:
U.S. Food and Drug Administration, Center for Devices and Radiological Health. Discussion paper on real-world performance monitoring of AI-enabled medical devices. 2025.
Regulatory Affairs Professionals Society. Industry groups urge FDA to use existing controls to monitor AI-assisted devices. January 7, 2026.
International Organization for Standardization. ISO 13485: Medical devices – Quality management systems – Requirements for regulatory purposes.