Regulatory · 9 min read

SaMD + AI: What the FDA Actually Wants to See

Software as a Medical Device meets artificial intelligence. The intersection is where the biggest regulatory opportunities — and risks — live for diagnostics and digital health companies.


GxP Agents

Regulatory Intelligence · 2026-03-01

If you're building diagnostic algorithms, clinical decision support tools, or any software that uses AI/ML to inform medical decisions, you're operating at the intersection of two of the most actively evolving regulatory domains: Software as a Medical Device (SaMD) and AI/ML governance.

The FDA has been remarkably transparent about where this is heading. The question is whether companies are listening — and more importantly, whether they're building for it.

The FDA's Predetermined Change Control Plan

The biggest regulatory innovation in SaMD + AI isn't a new rule. It's the concept of a Predetermined Change Control Plan (PCCP) — a framework that allows manufacturers to describe anticipated modifications to their AI/ML algorithm and get them pre-authorized.

Think about what this means: instead of submitting a new 510(k) every time your model is retrained, you describe the types of changes you'll make, the validation methodology you'll use, and the performance thresholds that trigger regulatory notification. If the change fits within your PCCP, you implement it. If it doesn't, you submit.
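That implement-or-submit decision reduces to a gate you can test. The sketch below is a minimal illustration, assuming hypothetical change categories and performance floors; a real PCCP defines these in the authorized plan itself, not in code:

```python
from dataclasses import dataclass

# Hypothetical PCCP envelope: change types the plan pre-authorizes, plus the
# minimum performance a retrained model must demonstrate before deployment.
@dataclass
class PCCPEnvelope:
    authorized_changes: set      # e.g. {"retrain_same_architecture"}
    min_sensitivity: float
    min_specificity: float

def route_model_change(envelope: PCCPEnvelope, change_type: str,
                       sensitivity: float, specificity: float) -> str:
    """Decide whether a model update fits the pre-authorized plan."""
    if change_type not in envelope.authorized_changes:
        return "new_submission"        # outside the plan: back to the agency
    if sensitivity < envelope.min_sensitivity or specificity < envelope.min_specificity:
        return "new_submission"        # performance floor breached
    return "implement_under_pccp"      # within the envelope: document and deploy

envelope = PCCPEnvelope({"retrain_same_architecture"}, 0.92, 0.90)
print(route_model_change(envelope, "retrain_same_architecture", 0.95, 0.93))
# implement_under_pccp
```

The point of making the gate explicit is auditability: every deployment decision leaves a record of which branch fired and why.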

What This Means Practically

For companies building AI-powered diagnostics or clinical decision support:

Your Algorithm Lifecycle Needs Governance

Every model version, every training data update, every performance metric needs to be tracked, validated, and auditable. This isn't optional anymore — it's the price of entry.
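One minimal shape for that audit trail is an append-only, content-hashed version record. The field names below are illustrative, not a prescribed schema:

```python
import hashlib
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass(frozen=True)
class ModelVersionRecord:
    """One auditable entry per model version (illustrative fields)."""
    version: str
    training_data_hash: str   # fingerprint of the exact training set used
    validation_report: str    # pointer to the validation evidence
    metrics: dict             # e.g. {"sensitivity": 0.95, "specificity": 0.93}
    approved_by: str
    timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

def fingerprint(record: ModelVersionRecord) -> str:
    """Content hash of the record, so later tampering is detectable."""
    return hashlib.sha256(
        json.dumps(asdict(record), sort_keys=True).encode()
    ).hexdigest()

rec = ModelVersionRecord("2.1.0", "sha256:ab12...", "VAL-2026-014",
                         {"sensitivity": 0.95, "specificity": 0.93}, "qa.lead")
print(fingerprint(rec)[:12])   # short audit fingerprint
```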

You Need Real-World Performance Monitoring

Post-market surveillance for SaMD means continuously monitoring how your algorithm performs in the real world. Not just accuracy — bias, drift, edge cases, and failure modes.
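In code terms, the simplest version of that monitoring is a rolling check of a declared performance floor. The window size and sensitivity floor below are illustrative values, not regulatory thresholds:

```python
from collections import deque

class PerformanceMonitor:
    """Rolling post-market check: alert when sensitivity over the most
    recent confirmed cases drops below the declared floor."""
    def __init__(self, window: int = 100, sensitivity_floor: float = 0.90):
        self.floor = sensitivity_floor
        self.recent = deque(maxlen=window)   # (predicted_positive, truly_positive)

    def record(self, predicted_positive: bool, truly_positive: bool) -> None:
        self.recent.append((predicted_positive, truly_positive))

    def alert(self) -> bool:
        positives = [(p, t) for p, t in self.recent if t]
        if not positives:
            return False                     # no confirmed positives yet
        sensitivity = sum(1 for p, _ in positives if p) / len(positives)
        return sensitivity < self.floor

mon = PerformanceMonitor(window=50, sensitivity_floor=0.90)
for _ in range(20):
    mon.record(True, True)    # 20 detected positives
mon.record(False, True)       # one missed case: 20/21, still above the floor
print(mon.alert())
# False
```

Bias, drift, and edge-case tracking layer additional checks on the same pattern: a declared baseline, a live measurement, and a documented alert threshold.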

Your Data Pipeline Is Your Regulatory Submission

The FDA wants to see your training data, your validation methodology, your test datasets, and your performance benchmarks. If your data pipeline isn't governed, your regulatory submission has a gap.
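One way to close that gap is a manifest per dataset version that ties the submission's claims to the exact data used. The schema here is an illustrative sketch, not an FDA-prescribed format:

```python
import hashlib
import json

def dataset_manifest(name: str, version: str, records: list,
                     split: str, source: str) -> dict:
    """Build an auditable manifest for one dataset version (illustrative schema)."""
    payload = json.dumps(records, sort_keys=True).encode()
    return {
        "name": name,
        "version": version,
        "split": split,            # "train" | "validation" | "test"
        "source": source,          # provenance of the data
        "n_records": len(records),
        "content_sha256": hashlib.sha256(payload).hexdigest(),
    }

train = [{"case_id": 1, "label": "positive"}, {"case_id": 2, "label": "negative"}]
m = dataset_manifest("derm-lesions", "1.4.0", train, "train", "site-07 registry export")
print(m["n_records"], m["content_sha256"][:8])
```

Because the hash is computed over the records themselves, a reviewer can verify that the validation results in the submission were produced against exactly this data.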

Human Oversight Must Be Architected

For AI/ML-enabled SaMD, the FDA expects clear documentation of when human review is required, how human override works, and what happens when the AI and the clinician disagree.
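That escalation logic should live in the architecture, not in tribal knowledge. A minimal routing sketch, with hypothetical confidence bands:

```python
from typing import Optional

def route_case(ai_positive: bool, ai_confidence: float,
               clinician_positive: Optional[bool] = None) -> str:
    """Decide who resolves a case. The bands are illustrative; the point is
    that the escalation path is documented and testable, not implicit."""
    if ai_confidence < 0.70:
        return "human_review_required"       # low confidence: AI defers entirely
    if clinician_positive is None:
        return "ai_result_with_human_signoff"
    if clinician_positive != ai_positive:
        return "escalate_disagreement"       # override logged for adjudication
    return "concordant_release"

print(route_case(True, 0.95, clinician_positive=False))
# escalate_disagreement
```

Every branch here corresponds to a question the FDA will ask: when is review required, how does override work, and what happens on disagreement.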

The GxP Agents Approach

We built our regulatory agents specifically for this intersection. The Regulatory Affairs domain includes use cases for:

  • Submission readiness QC that understands AI/ML-specific documentation requirements
  • Regulatory intelligence that tracks evolving SaMD guidance across FDA, EU MDR, and global health authorities
  • Labeling intelligence that maps AI-specific claims and indications across markets

For diagnostics companies navigating this space: the regulatory framework is becoming clearer, not more ambiguous. The companies that build governance into their AI development lifecycle now will have a significant advantage when the final guidance lands.

samd · fda · ai-ml · digital-health · regulatory · diagnostics

See GxP Agents in Action

Discover how AI agents purpose-built for life sciences can transform your regulatory workflows.

Book a Demo