What the Purolea warning letter really means for AI in medtech

April 28, 2026

The FDA's April 2 warning letter to Purolea Cosmetics Lab is making the rounds, mostly under some version of "FDA cracks down on AI in manufacturing."

That's the wrong way to read this. If you build medical devices, treating this as an AI story will cause you to miss the part that actually applies to you.

Purolea wasn't a story about AI gone wrong. It was a story about a company with no functioning quality unit, trying to use AI to fill the gap. The AI didn't fail Purolea. There was nothing for it to plug into.

What actually happened

If you strip the AI angle out, the FDA's findings read like any inspector's worst-day list: insects and filth in the manufacturing area; unapproved homeopathic drugs labeled to treat shingles and genital herpes; no microbiological testing of finished product; no identity or purity testing of incoming components; no process validation before distribution; a quality unit that didn't establish procedures, didn't review batch records, didn't implement production controls…

These are the real problems. Everything in the letter about AI is downstream of them.

Where the FDA does address AI, the language is precise. Purolea told inspectors they'd used AI agents to draft their drug product specifications, procedures, and master production records. When asked why no process validation had been done before distribution (a clear requirement under 21 CFR 211.100), Purolea said the AI agent never told them it was required.

And that’s the line everyone's quoting:

If you use AI as an aid in document creation, you must review the AI generated documents to ensure they were accurate and actually compliant with CGMP. Your failure to do so is a violation of 21 CFR 211.22(c).

Look at the citation. 21 CFR 211.22 isn't an AI regulation. It's the rule that defines the responsibilities of the quality control unit. The FDA didn't write a new rule for AI. They applied the same rule that's governed pharmaceutical and device quality for decades: a qualified human, accountable to a defined quality system, owns the output.

Still think the Quality role is going away?

A counter-narrative has been gaining traction in the AI-for-life-sciences market. It goes like this: the letter is a free design spec from the FDA, and the right response is to build a smarter AI architecture, with electronic signatures and audit trails layered around the model.

Some of that sounds nice. The FDA is, in plain language, telling the industry what compliant AI use looks like: human review, qualified reviewer, documented approval, traceable record. That's pretty useful guidance.

But framing this as a software architecture problem puts the emphasis in the wrong place. It treats the issue as something a better wrapper around the model can solve, and it's aimed mostly at AI/ML teams shipping software-as-a-medical-device. It also skips past the part that actually generalizes: Purolea didn't fail because their AI architecture was wrong. They failed because no quality system existed for the AI's output to enter.

A perfectly engineered AI agent drafting into a quality vacuum produces the same outcome as a sloppy one. There was no QU (quality unit) to review the draft. There were no specifications to check it against. There was no process for turning a draft into a controlled record under a qualified person's signature, and no record of who approved what, when, against which procedure.

The takeaway for medical device companies isn't "build a better AI architecture." It's "make sure you have a real quality system before AI gets anywhere near it." Reverse that order and you're Purolea with better software.

Why this even matters to device companies

Medical device manufacturers run under a parallel regime: 21 CFR Part 820 (now harmonizing with ISO 13485 under the QMSR), design controls under 820.30, risk management under ISO 14971, software lifecycle under IEC 62304, electronic records under Part 11. The Purolea letter was issued under the cGMP regulations for finished pharmaceuticals, but the obligations it cites look almost identical on the device side.

Do a quick translation across the aisle and there are really two questions every QA, RA, or operations leader should be able to answer right now.

First, if AI helped draft a design input, a risk control, a CAPA investigation, or a validation protocol, can your QU produce the full chain of custody from draft to controlled record, with a Part 11 signature tied to the named, qualified reviewer who approved it?

Second, can you show, structurally, that no AI-assisted content reached a controlled state without that review?

If either answer needs hedging, the gap isn't your AI tool. It's your quality system, and the system of record your QU relies on to do the job.
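Those two questions describe a structural gate, not a policy statement. As a minimal sketch of the idea, here's what "no AI-assisted content reaches a controlled state without a named reviewer's signed approval" could look like as a data structure. All names here (ControlledRecord, approve, the field names) are hypothetical illustrations, not any real eQMS or Part 11 API:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ControlledRecord:
    """Hypothetical sketch of a record with a chain of custody:
    draft -> reviewed -> controlled, with each approval captured."""
    content: str
    spec_id: str                 # specification the review is performed against
    procedure_version: str       # version of the governing procedure
    ai_assisted: bool = False
    state: str = "draft"         # "draft" until a qualified reviewer approves
    approvals: list = field(default_factory=list)

    def approve(self, reviewer: str, signature: str) -> None:
        # A named reviewer signs off; the record captures who, when,
        # against which spec and which procedure version.
        self.approvals.append({
            "reviewer": reviewer,
            "signature": signature,
            "spec_id": self.spec_id,
            "procedure_version": self.procedure_version,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })
        self.state = "approved"

    def is_controlled(self) -> bool:
        # Structural gate: content (AI-assisted or not) cannot count as
        # a controlled record without at least one recorded approval.
        return self.state == "approved" and len(self.approvals) > 0

record = ControlledRecord(
    content="Draft risk analysis (AI-assisted first pass)",
    spec_id="SPEC-014",
    procedure_version="QP-007 rev C",
    ai_assisted=True,
)
assert not record.is_controlled()   # still a draft, not a controlled record
record.approve(reviewer="J. Quality", signature="sig-placeholder")
assert record.is_controlled()       # approval recorded: full chain of custody
```

The point of the sketch is the shape of the answer an inspector expects: the approval metadata lives with the record, so "who approved it, against what spec, on what date" is a lookup, not an investigation.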

AI-generated content is the new paper-in-a-drawer

The device industry had a long, expensive argument in the 2010s about paper QMS versus eQMS. Paper was familiar. Paper was cheap. Paper looked low-risk. Companies that held on longest found out the hard way, when an auditor showed up and asked who'd reviewed what, when, against which version. The answers usually weren't there.

Cloud-based, purpose-built eQMS replaced paper not because paper got banned, but because paper couldn't produce the chain of custody regulators were already entitled to see. Signatures, version control, traceability, audit trail. That's what turns a document into a controlled record.

AI-generated content in a chat window, or pasted into a generic project tool, is the 2026 version of paper-in-a-drawer. It looks fast. It looks like progress. It produces no defensible chain of custody when the inspector walks in. And the FDA has now said so on the record. The review has to be real, done by a qualified person, against a defined specification, captured in a controlled system. Anything less is, in the FDA's words, a violation of 211.22(c). On the device side, that's 820.20 and 820.70.

Where Greenlight Guru fits

We've spent more than a decade building an eQMS for one industry: medical devices. That focus matters here, because the controls the FDA cited at Purolea (quality unit oversight, validated procedures, traceable records, qualified review) aren't generic IT problems. They're quality problems with regulatory shape: design linked to risk, risk linked to CAPA, CAPA linked to training, training linked to the named reviewers who sign documents under Part 11.

Inside a system built that way, AI's genuinely useful. Drafting first versions of risk analyses. Summarizing supplier history. Suggesting CAPA root causes. Scaffolding verification protocols. Real productivity gains, and we're investing in them. But every output lands inside an eQMS where the QU's review is the gate, not an afterthought. The speed is real because the controls are real.

Companies that take the wrong lesson from Purolea will spend 2026 bolting AI tools onto broken quality processes. Companies that take the right lesson will spend 2026 making sure the quality system they've got is one AI can safely plug into. Two very different bets, and only one of them holds up the next time an inspector walks in.

One question for your team

If FDA inspectors walked in tomorrow and asked your QU to show, for any AI-assisted document in your QMS, who approved it, against what spec, on what date, with what signature, against what version of the procedure, how long would it take to answer?

That answer is the real measure of whether your quality system is ready for the way medical devices are going to be built from here forward.

Greenlight Guru is the eQMS purpose-built for medical device companies. More than 1,000 device companies use our platform to manage quality, product development, and clinical evidence in a single connected system designed around the regulations and standards that govern this industry. Ready to learn more? Get your free demo today.

 

Etienne Nichols is the Head of Industry Insights & Education at Greenlight Guru. As a Mechanical Engineer and Medical Device Guru, he specializes in simplifying complex ideas, teaching system integration, and connecting industry leaders. While hosting the Global Medical Device Podcast, Etienne has led over 200...
