The "Consolidated Appropriations Act of 2023" (more commonly referred to as the Omnibus Act) was passed and signed into law on December 29th, 2022. This amendment to the Food and Drug Cosmetic Act has expanded the scope of the FDA beyond just "safety and efficacy" to include the cybersecurity of medical devices. This amendment resembles a watered-down version of the PATCH Act, which failed to pass in late 2022.
As a result, on March 29, 2023, the FDA gained the legal authority to define and enforce medical device cybersecurity. So for today’s episode, we got THE leading minds in MedTech cybersecurity together to discuss what we need to do next. Chris Gates, Director of Product Security at Velentium, Chris Reed, Vice President of Product Security at Medtronic, and Ken Hoyme, CEO of Dark Star Consulting, join the podcast today to discuss the new guidelines, what the FDA can and can’t say about it, and what kinds of deficiencies you’ll be seeing in the future because of the new legislation.
Watch the Video:
Listen now:
Like this episode? Subscribe today on iTunes or Spotify.
Some of the highlights of this episode include:
- How the FDA tried to clear a path for routine patches and updates
- The minimum that the omnibus bill is talking about
- No longer needing to make the link between cybersecurity and safety and effectiveness
- When they have the legal authority to enforce cybersecurity
- Why the document took so long to go through
- Security architecture analysis
- Why you should be referencing the April 2022 draft
- Unpatched vulnerabilities at the time of submission
- The effort needed to understand the FDA’s intentions
Links:
Medical Device Cybersecurity in 2023 and Beyond Slides
Memorable quotes from this episode:
“Literally, if you’re not aware of this already, you’re already behind the 8-ball right now and there’s things you’ve got to do.”
“Basically, if you think it might be a cyber device, it is a cyber device.”
“Don’t sit there and try to be pedantic about this and say ‘I don’t need to do this because there’s a comma here.’ It ain’t gonna work for you.”
“A synonym for threat modeling really is security architecture analysis.”
Transcript:
Etienne Nichols: Hey, everyone. I don't know if anyone's out there yet. Sorry, we're running a minute late.
The others should be showing up in just a moment. They're backstage.
And just while we wait for everybody to show up, let's see, just to get a feel for how this tool works: if you want to, put something in the chat.
I'd be curious to hear if anybody's out there, where you're coming from and maybe is this your first Greenlight Guru event? Have you been to one of these before?
All right, thank you. Thank you, George. I'm glad everybody can hear us.
But yeah, let us know where you're coming from.
Looks like we got some people from Canada, Michigan.
All right. Louisiana. California.
Cincinnati, Ohio. All right, my wife's family is all from there. Kentucky.
Cool.
Well, let me. Let me share my screen.
We do have a presentation to share.
Give me just a moment.
All right, so before we jump right into the discussion, I do want to run through one other thing. Looks like people know how to use the chat. That's great. If you have specific questions you want us to get to towards the end, feel free to use the chat. I'll try to monitor that, but there's a Q and A section as well at the top; you'll see Q and A in that ribbon.
That way I can actually see them, and you can vote on the questions you want asked as well. So use that Q and A if you're able to.
And I'll try to monitor both of those ways. But feel free to chat with each other as well and kind of discuss what we have going on.
So, I don't want to take up any more of your time. I know you came to listen to these guys, not me. We're going to be talking about the cybersecurity regulation in 2023 and beyond.
What every manufacturer needs to know. This is a live recording of the Global Medical Device Podcast.
And so, if you're listening after the fact, feel free to check the show notes and you should be able to access this presentation as well. So, we'll do our best to talk through this for those of you who are not able to see the visual, but we also have the visual as well.
For those of you here today.
So, I'm going to let you guys introduce yourselves a little bit first, if that's all right. Maybe we can start with you first. Chris Gates.
Christopher Gates: Christopher Gates. I'm the Director of Product Security for Velentium. I've been developing medical equipment for over 50 years, so clearly, I'm masochistic and the last 18 of which I've been pretty much devoted to embedded cybersecurity of medical devices.
Etienne Nichols: Fantastic. Ken Hoyme is the next on my list. Why don't you go ahead?
Ken Hoyme: So, I see you've still got me listed there; I forgot to catch that. I am Ken, and I retired, badly, last year from Boston Scientific, started a consulting company, and have been on Medcrypt's board of advisors, doing some consulting with them.
I just finished teaching a class at the University of Minnesota on medical device cybersecurity and still keep my fingers in various working groups.
Etienne Nichols: So fantastic. Chris, you want to round us out?
Chris Reed: Sure. Chris Reed. I'm the Vice President of Product Security at Medtronic.
I've been pretty involved in the industry for quite a while, and I have a lot of knowledge on the topic today. Specifically, my last role at Medtronic, I was involved in regulatory policy and had a hand in stewarding this law to fruition.
So, I can shed some light on the intent and impact that we expect this to have.
Etienne Nichols: Fantastic. And I will probably kind of fade to the background to let maybe Christopher Gates guide this discussion. He kind of wrangled this group together.
But feel free to just. Since I'm managing the PowerPoint, just tell me when to go to the next slides.
Christopher Gates: Sounds good. Let's just sit there for a second. Let me. Let me give a brief background as to why we're doing this. Well, first off, Chris and Ken and I, we run into each other all the time. We all work in these working groups. We're constantly in the same locations together.
It is wonderful to have friends out there who are so up to date and so current and so aware of everything that's going on in medical device cybersecurity today. So really great to have them.
One of the things I was running into constantly was a complete ignorance of regulatory changes that have been occurring over the past few months and what's currently happening in the near future and the distant future for regulatory cybersecurity.
At some of the talks I've given, I would ask for a show of hands, like, how many have read the April 2022 guidance? How many are aware of it at the time?
Like back in November, I was asking how many had read the PATCH Act. I was lucky if I got one or two hands up in a large crowd.
So, there's a huge amount of ignorance associated with it. This is compounded by the fact that the FDA is constrained in what they can say. They try to get the word out, but there are things they cannot say.
For instance, they can't reference a draft guidance, only finalized ones. So, if you look at the FAQ on their cybersecurity page, you'll see it references the 2014 guidance, which is no closer to what we're doing today than anything else you could pick; a novel would be closer to what we're doing.
So, it is really out of date, but that's legally all they can talk about.
So, we are all manufacturers, and I thought, you know, we should get us together as some of the best out there because we're manufacturers and we can speak to manufacturers, and we can tell you what things that government agencies cannot tell you.
So today here, what we wanted to do is go through and have a nice chat between the three of us, little background here on the slides, but not much.
And talk about where we currently sit, what we know today, what we believe tomorrow, and what we see for this near future coming, and what all these manufacturers need to know. Because literally, if you're not aware of this already, you're already behind the eight ball right now and there's things you've got to do and they're not small either.
So, a quick retrospective on this slide. This is a bit of the history.
Back in 2014, the FDA, to their credit, was really the first government agency that started weighing in on the cybersecurity of medical devices, in particular OT, operational technology.
And it was not well written, unfortunately. It said they wanted cybersecurity, but it was very nonprescriptive; it did very little to describe the expectations the FDA had. So, a lot of medical device manufacturers ignored it, creating a lot of strife.
Next slide, please.
In 2016, they came out with a post market document, much better document that's still in force today, sort of.
Although some things have occurred in post market that we're now aware of.
But still a very good document, much better written, almost threatening, but it was very nice in the sense that it laid out very prescriptively the kind of stuff they expect you as a manufacturer to do after you've gotten approval and your device is in the field.
Next slide.
2018, a good document came out of the FDA replacing the 2014 one. Sort of…
At least it wasn't finalized, it was a draft, but it laid out what the expectations were in 2018 for your pre-market submission to the FDA. And it did a pretty good job of it.
The document wasn't formatted too well, and they had a tiering system you had to get past, but otherwise it was a much better organized document than 2014. And then, next slide, last year the April 2022 guidance came out. It doubled in size from the 2018 one and really is something they have been trying to enforce since that time on any 510(k) submittal going in, and they're sort of wading into it.
If you haven't read this document, you need to.
Ken Hoyme: Next slide. A couple of quick interjections on that one, I think, on the post market document.
And one of the things they were really trying to make clear is how the FDA was trying to get out of the way for routine patches and updates.
So, there's a lot of information in there related to that, because everyone was claiming that the FDA was the blocker for them being able to issue patches. And they really tried to make a clear path while still retaining the safety and effectiveness aspects, for when a cybersecurity issue actually caused patient harm.
Christopher Gates: Yeah. And they introduced the whole concept of ISAOs.
As far as having this separate public-private partnership with an organization like H-ISAC, which is outstanding for us to use for cybersecurity vulnerability disclosure, for sharing of information, for working with HDOs and MDMs together in one place. So, it was a very good document. You're right, Ken.
Ken Hoyme: So yeah, carry on.
Christopher Gates: We appear to have lost Chris, so.
Etienne Nichols: Hopefully he sent me a message. Hopefully he should be back. Yeah.
Christopher Gates: Okay, cool. So, what has changed? Okay, well, like I said, we had that guidance document from last April and they were already asking for this.
But then toward the end of last year there was a patch act which we won't really go into, but it basically was the intent was to give force of law to the FDA to enforce cybersecurity.
It was dropped from the user fee bill and instead a somewhat watered down version of it was put into the appropriations bill, the omnibus bill, and that was signed into law.
On December 29th. That amends the Food, Drug, and Cosmetic Act to now make the FDA responsible for ensuring the cybersecurity of new devices.
So, this applies to new and modified existing, but not to fielded devices.
That's important. That's what the bill gives it currently is. We don't have to worry about all the legacy stuff sitting out there necessarily today.
But if you take a new product that's brand new or you're spinning an old previous design and going back in that then is applicable to be covered by this.
And a few things like the FDA is to finalize the pre-market guidance by the end of this year and the FDA is to work with CISA to update these guidances annually.
So, this definition that's in this thing, some people have made a lot of it, too much actually. We in the industry tend to wordsmith a little too much about what something might mean and how that's going to play out.
I see Chris laughing about this because he knows this is what we see, and I'm sure he can say the same thing. This has been clarified.
Suzanne Schwartz of the FDA actually clarified this meaning: does it have software in it?
So, it's pretty simple folks.
Basically, if you think it might be a cyber device, it is a cyber device. Next slide.
Ken Hoyme: Quick one.
Chris Reed: Yeah, it's probably worth spending a moment on. Go ahead, Ken.
Ken Hoyme: Yeah, I was just going to note also that you in the April 2022 updated draft they spent a lot of energy linking to the 820, you know, quality system regulation. So, there's a lot of references in there.
So, they were clearly in their evolution trying to argue that because guidances are listed as not binding and they're still pointing you to things that you need to do to meet the quality system regulations.
So, they were I think very intentional. And so therefore I think we've seen Since April of 2022 a lot of questions and things coming back in submissions related to the information that's in there.
Not because they're trying to follow the guidance yet because it's still draft, but it's because it's anchored in the inequality system breaks yeah.
Chris Reed: And just real quick on the last the definition, you know, the thing that the FDA emphasizes is has the ability to connect.
You know, if you've got USB.
In fact, they've even had on slides USB.
So don't get caught up on the Internet part of that. USB, Bluetooth, a network jack: you're going to have a hard time trying to use the Internet part of the definition to wiggle out of these requirements applying, and I've heard a lot of questions around that, which is why I was smiling when Chris brought it up.
But I just want to emphasize that there probably is some legitimate use to applying that filter. I would say you really have to have shown how you've disconnected all that connectivity or have tightly managed it in a way to make it irrelevant.
I think, to Chris's point, if it's got software and inputs and outputs, they're going to interpret that very broadly, and I think that's worth emphasizing.
Etienne Nichols: One of the just quick questions from the audience was about acronyms. Just to make sure we go ahead and define those acronyms as we go. I know there are a lot of them, so it's going to be tough, but at least the first time we use them if possible.
Okay, let's go.
Christopher Gates: MDM is medical device manufacturer. HDO is healthcare delivery organization: hospitals, clinics, doctors' offices. Yeah.
Chris Reed: SBOM, software bill of materials.
Christopher Gates: Yes.
Etienne Nichols: Oh, man.
Christopher Gates: The FDA, Food and Drug Administration. I think we've got a couple more.
Etienne Nichols: I was guessing ISAO was Intelligent Security Operation Platform.
Christopher Gates: ISAO? No. Information Sharing and... Advisory Organization.
Ken Hoyme: Analysis Organization.
Christopher Gates: Analysis Organization.
Chris Reed: That's it.
Etienne Nichols: Okay. Okay.
Christopher Gates: So, something to make a distinction between. And we saw like cyber device on the last screen. And here this obviously is an excerpt from a document. This is all out of the omnibus bill.
So, this was put together by Congress.
It was not put together necessarily by the FDA.
In fact, it wasn't. And so, the wording is somewhat interesting. It's what's in the omnibus bill, but it's not necessarily what the FDA is doing. If you're looking for FDA expectations, that April 2022 premarket guidance is where you want to go; that's really where their expectations are. If you're only trying to work to the omnibus bill, you're not going to hit the mark.
So, this is just to give you an idea what the omnibus bill is talking about.
I think this is an absolute minimum and don't think you're going to get away with that necessarily today.
So, when we talk about that, these are kind of given things that are out there. But again, the 49-page April document is a much better reference point.
Ken, Chris, you agree on that slide?
Ken Hoyme: Yes.
Christopher Gates: One of the more interesting parts of this, and Chris Reed and I have gone around on this a few times, is this rule of construction.
This is all out of the omnibus and talks about basically giving the FDA complete authority to define the aspects of what it means to ensure medical device security.
So, when you see things in the omnibus bill like that last slide, the FDA can override that.
That doesn't mean that's the beginning or the end of what they can do. They can basically do anything they want and then going forward they're going to be advancing the ball here on an annual cadence along with CISA.
And then, and Chris schooled me on this one, the last part of this was "including a device that is approved or cleared prior to the date of enactment" of this act.
That sounds a lot like legacy, but what is it really, Chris? It's not legacy, it's...
Chris Reed: Yeah. So, I think there's a slide here in a moment talking about that. But I think the key here is we've already seen the FDA using their authority to crack down on cybersecurity, but they currently always have to tie it to safety and effectiveness.
So, when you see, for instance, recalls around cybersecurity, or communications, or any actions, or even when they issue deficiencies on a submission, essentially it goes back to "this security issue makes us not confident" or "we have an issue with the safety and effectiveness."
So, this rule of construction basically said, we still have that authority; none of this overrides it.
But now we have new authorities as well.
They essentially can reject submissions that don't do the basic cybersecurity hygiene that's currently in the law Chris is showing, or eventually in the guidance that FDA publishes. So, they no longer have to make that tie to safety and effectiveness.
They can just say, this does not meet basic cybersecurity hygiene. So, the rule of construction was making sure this didn't override the authority they already felt like they had.
Christopher Gates: Excellent.
So, one of the very, very common mistakes here, in fact I even got it from some of the engineers who work for me, is thinking that, as you saw in the media, device cybersecurity enforcement was delayed to October.
And that's not the truth.
The truth is, starting March 29, they have the legal authority to enforce cybersecurity, but they're not yet doing so via the refuse-to-accept checklist.
So, what this looks like is when you put a submission in, this is going to be the friendly period here where from March 29 to October 1, they are going to be receiving these and putting them in through the clerk.
They're going to go through the current refuse to accept checklist, pass it on to a reviewer that's assigned to it and then that reviewer will work with you for the deficiencies that are in that submission.
Those are the deficiencies. Like any deficiency, you'll have 180 days to fix them or else your pre-market submission drops out.
Ken Hoyme: And I am aware of companies that have gotten a refuse to accept prior to March 29th.
That was on cybersecurity grounds. And I think it's rooted in my comment earlier about the April 2022 linkage to the quality system regulations. In one case I know the submission was made by a company that didn't include an SBOM and got bounced back, because even though that's in the guidance and in this new authority, they still expect it as a necessary prerequisite to maintaining safety in the post-market world.
Chris Reed: Yeah, that's a little surprising to hear that detail like just what our experience.
See if I can highlight kind of what's how this is changing.
They if manufacturers didn't notice last year, they updated what you know, essentially deficiency templates around cybersecurity and all those are rooted to existing approved guidances.
The thing I would say is if you look at the draft April 22guidance, you can still see the spirit in their deficiencies but they will never point to that guidance because it's not final but they will point to existing off the shelf guidance or you know, the here's I should probably shouldn't cite these guidances off top of my head, but the there's, there's other guidances that like they have a slide that shows like 10 guidances they'll have and so when you get a deficiency right now, they'll point to those.
When I've seen SBOM brought up before, if you say in your labeling, you're going to give SBOMs, they'll say okay, we want to see it.
So, we were getting deficiencies already on cybersecurity, but it was tied to existing guidance. And again, ultimately, they're trying to prove the safety of the device.
This March 29th date is submissions that came in March 29th of this year or later.
They now have this new authority and so you can expect to see deficiencies, and I've been told it's too early. We haven't gotten deficiencies back on these yet.
But essentially if you don't include an SBOM now they will say you need to give us your SBOM, you're required to. And they'll point to the law if you don't give your vulnerability management process.
So, they have a new thing to point to in their deficiencies and my understanding is they've already updated those templates.
The vehicle was strange, using the refuse-to-accept policy to communicate this, but they essentially were declaring to everyone: hey, look, we have these new authorities, but we're not going to auto-reject, if you use the eSTAR template, until at least October 1st.
And for any of you that know, this is more of a 510(k) process, and on October 1st, I believe, eSTAR is required for all 510(k) submissions.
So, they're trying to get some alignment behind the scenes to enforce the new requirements. They were kind of declaring their new authority while pointing out they're not going to auto-reject until at least October 1st, when that becomes effective.
So, there are some phase-in lines, if you will, happening in terms of how FDA is implementing their authority. But I want to go back to Chris's point: they are absolutely already going to enforce it.
So, I guarantee you we'll be hearing about submissions that were submitted March 29 or after that will get deficiencies related to these new authorities.
Christopher Gates: And I see we're getting a lot of questions about SBOMs and all that. We'll get to those at the end, and let's talk about SBOM references. IMDRF is certainly one of them.
There are better references than that, and we'll talk about that in a second.
Yeah. And one of the things I want to point out is this is not related to cybersecurity. I remember for years there was the RF coexistence guidance that came out of the FDA.
Don Witters of the FDA, great document.
And it was in draft for, oh, a long time, like eight years or something. And we spent millions of dollars on external testing laboratories and everything to meet this guidance.
So, what they can say they're enforcing to and what they enforce to are really two different things.
So, if you want your device to go through approval cleanly and easily and not have a lot of delays and questions asked, a lot of extra work, just meet what their expectations are, and you take those from the latest guidance and the latest input you can get from the FDA.
That's just the way to work with it. Next slide, which should be eSTAR.
It is, as Chris noted, 10-1-ish.
All of the 510(k) submittals will no longer be on paper. They will be through an electronic means.
So, we bring this up because, A, it facilitates the use of the refuse-to-accept checklist. It starts to get automatic at this point.
And then also we now have these different levels of documents that we have to deliver.
There is the 2018 guidance, there is the omnibus bill, there is what eSTAR is saying in its help text, and there's what's in the April 2022 guidance.
So, next page: here's some of the stuff from eSTAR about what they want to see inside of this.
Bear in mind this can change rapidly and I expect it will as we go forward here. But this is certainly as it stands today, if you file in eSTAR what they're telling you.
Chris Reed: Great. Yeah. And just to highlight here, all those hyperlinks here have more text inside them. So just. I know there's questions about SBOM here. FDA does have a FAQ on their website, but also if you click on that SBOM and supporting info, it'll give you links to the NTIA, you know, SBOM implementation information, which is basically them telling you this is what we expect to see in SBOMs.
Same thing. You actually get a little more detail on each one of these, what they expect to see.
And you can imagine their deficiency templates on the other side might mirror the requests that you'll see in this help text. So, I would highly recommend it. I went and scraped all of this and have distributed it internally within our company.
As well, we're updating materials. We were doing most of this already, but it was helpful to see how FDA was communicating it. So, it's a really great resource.
Etienne Nichols: I'll just throw out, for those of you listening, maybe you're not listening to it live, that we're looking at page nine of our presentation, and the resources it talks about are threat modeling, cybersecurity risk assessment, unresolved anomalies, cybersecurity controls, traceability matrix, cybersecurity testing, and SBOM supporting info. So yeah, I'll try to do a better job of communicating a little bit for those who may not be able to see this visually right now. But cool.
Christopher Gates: Or ignoring the elephant in the room, which is somewhere in the FDA, there's a JavaScript programmer who decided to name this window JavaScript window. Okay.
Chris Reed: Yes. Yeah, we will ignore that.
Christopher Gates: Ignoring that. Next slide.
Ken Hoyme: Or perhaps they didn't rename the default.
Christopher Gates: The April pre-market guidance for those who haven't read it yet or while we've been talking.
They're really good. This is an abbreviated form of the table of contents. I've cut out some of the boilerplate that's in there and a good appendix one that's kind of instructional.
Good document to read.
I have gone through it at least five or six times cover to cover because there's a lot in there and a lot of context and it takes a lot to pull some of the things out of it, like all of the artifacts and documents and processes they want to see in place at any given time.
Some of the stuff that's in there even speaks to post market and in fact it has some rather large impacts to post market testing.
And they'll talk about a whole raft of things in there. There are about a dozen different tests: pen testing, fuzz testing, updating your attack surface analysis, threat modeling, all these things done on some cadence.
And it says, e.g., an annual basis. Well, in practice now, since 2022, we've seen this put in place, and in point of fact it's every six months they're asking for this testing.
They want to see that this artifact is being re performed every six months.
So, it's a significant amount of ongoing testing for the life of that device, until you reach end of support, so 10, 20, 30 years, whatever that device's lifetime is.
So, there's a lot to it.
Highly recommend you go read this document. Can't recommend it enough, guys.
Chris Reed: Yeah, I agree. I think the key here is this pre-market guidance.
It's not final.
So, you won't see deficiencies saying directly that you didn't give a global system view of your security architecture.
However, you may see a deficiency that essentially says you need to provide an end-to-end system view for us to understand the risks that you've explained and that you've done a threat model appropriately. So, this guidance can be used to understand directionally where FDA is at, and it should be used.
But it will not be referenced directly in deficiencies until it's final.
Christopher Gates: But the spirit is very important.
Don't sit there and try to be pedantic about this and say, oh, I don't need to do this because there's a comma here, it ain't going to work for you.
It's applied to the spirit of what you're doing. One of the other things to recognize as you read this document is it took just under two years to get this out of the FDA.
So, when this came out, I followed back into the comment portal, it was like 64 different comments, most of which were, why are you referencing old documents?
Well, it turns out they're old documents because they weren't when they wrote it. But it takes that long to get it through.
So, you'll see things in there referencing not the new ISO 13485 alignment that they're doing, the QMSR, but the older regulations and stuff like that.
It's in transition.
So hopefully when they finalize that all that will be brought up to current levels of external document references. I'd like to see that happen.
But for the rest of the stuff, take it pretty close to gospel. This is what they're going to want to see, and if it takes that long to get something out, they're probably not going to make too many changes to it before you see a final version.
Ken Hoyme: So, another observation I'd make is that, as you've seen the evolution of this, becoming longer and more detailed, there's a focus both on threat modeling as an activity, which was starting a while before, and now on security architecture, and an understanding that a synonym for threat modeling really is security architecture analysis. It is the act of iterating over the architecture to ensure that security controls have been implemented.
This version of it, interestingly, overlapped the period of time when Dr. Kevin Fu was serving at the FDA.
And for those who aren't familiar with Kevin, he was a professor at the University of Michigan, recently moved to Northeastern University. In 2008, he and his grad students published kind of the seminal paper that started pointing to the potential security flaws in medical devices, and he has been a strong advocate since.
Some of the more technical aspects that are showing up in here, I think, are evidence of them bringing in more technical expertise to make sure that it's expressed in a clear and appropriate manner.
Christopher Gates: Yeah, there's some excellent stuff that almost reads like a textbook, like Appendix 1 that goes through cybersecurity controls. And I wouldn't be surprised if Kevin had a large hand in that, because of how well written it is. It's just a good educational tool.
So very nice to see. New slide.
So, let's talk about what this means as far as artifacts when you go to submit. If you look at the 2018 guidance that was the best of the pre-market guidances up to last year, pretty straightforward kind of things. We're talking about a management plan. What is it you're doing? A fairly simple risk management plan. It's no longer that way. We're talking about threat modeling of the interfaces that go on between the components in your system.
Security architecture views are like communications and networking, where you would show what those different aspects are and how you're kind of securing them as well as use cases.
What are the different roles? What is their least privilege, what access do they have?
Third-party software components, that's, you know, libraries, frameworks, operating systems, communication stacks. You'd list out any known vulnerabilities and risk assessments for any of that software, a bill of materials as we've mentioned, and we'll talk more about that when we get done with the presentation.
Security risk management report: that's good old TIR57, which Ken had a large hand in, and that's a good document to reference. It's basically summing up the security, the vulnerabilities, and the mitigations that were applied, and then the vulnerability testing that you were supposed to apply: scanning things for known vulnerabilities.
Think Metasploit-type stuff.
Pretty simple back then.
Chris Reed: Yep.
There's a quick question that I think is worth highlighting here. There's a question about the 2022 pre-market guidance superseding the 2018 one, and I think it's important to know, as Chris had on the screen earlier, that actually the only final guidance in the pre-market space is the 2014 guidance.
The 2018 guidance was draft, and it was superseded by the draft April 2022 guidance you hear us talking about.
Again, the nuance we're trying to share here is that that's not final, so you won't get a deficiency pointing to it, but it does again give you a view into the thinking. And just as mentioned earlier, it's still an important document because it likely will get finalized by the end of the year and it probably won't change much because if it changes too much, they'll have to reissue it as a draft.
So, it's directionally correct. There's going to be tweaks to it, but it's going to look like that almost definitively. Right. So, it's definitely a resource. The April 2022 guidance should definitely be a resource you're referencing.
Just know that you'll never get a deficiency that points to it.
Christopher Gates: Absolutely. And yet they work to the 2018 guidance even though it was draft. And that's really the important takeaway. There is use that, read that, understand that's their current expectations. Even if they legally can't say that's what they're working to.
They are giving you a huge signpost as to what's needed. So next slide the eSTAR help that Chris went through and scraped for all of the content that's in there.
Little bit bigger list.
Some of the stuff you see is common from before like cybersecurity management plan that has a lot bigger scope now than it did before.
There's a lot of other activities such as updating and event response and stuff like that that now needs to be addressed in your plan.
It's the total product lifecycle, not just, you know, how am I going to handle this during development. Threat modeling is still there, cybersecurity risk assessments, unresolved anomalies. That's bugs, for you and me: can bugs be weaponized? You have to look at them in conjunction with vulnerabilities and other bugs. Can they be used in a chained environment to affect security? In other words, can I convert a bug into a vulnerability?
What the controls are that you're using, ensuring trusted content, detect, respond and recover. We've all seen those before.
We're now tracing controls to vulnerabilities and tests, which is really out of order. It should be vulnerabilities, controls, and tests, to ensure that the controls you've put on those vulnerabilities are effective.
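To make that tracing concrete, here is a minimal sketch of what a vulnerability-to-control-to-test traceability record could look like; the identifiers are hypothetical and this is only an illustration, not a format the FDA prescribes.

```python
# Hypothetical traceability records: each identified vulnerability maps to the
# control that mitigates it and the tests that verify the control works.
traceability = [
    {
        "vulnerability": "VULN-012: unauthenticated BLE command interface",
        "control": "CTRL-07: mutual authentication during BLE pairing",
        "tests": [
            "TEST-031: pen test confirms unauthenticated commands are rejected",
            "TEST-045: fuzzing of BLE characteristic writes",
        ],
    },
]

for row in traceability:
    # Every control must trace to at least one verifying test.
    assert row["tests"], f"{row['control']} has no verifying test"
    print(f"{row['vulnerability']} -> {row['control']} -> {', '.join(row['tests'])}")
```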
So, you're going to have tracing across this and show that these things were actually performed. Then development frameworks and testing.
In the April guidance they talk about this to a large extent. There are a couple of good things to reference out there: NIST has one, as does ISO/IEC, the 81001-5-1 standard for secure medical device development.
SBOM, and some supporting information to go with the SBOM. Again, reference the April guidance for that supporting information.
Things like expected end of life and what kind of support you can get from each of these components.
Hazard analysis for all off the shelf software and cybersecurity labeling which is, you know, your instructions for use, things to instruct the customer what they can do, how to perform updates, what kind of ports they need to open, what known risks are remaining, all sorts of stuff that you're supposed to be transparent about in there.
So that's today. This isn't a future state; that's what's in eSTAR today.
Ken? Chris?
Chris Reed: Yeah, I mean that's it, I think, you know, right now look at the draft April 2022 guidance. Also look at the help text in the eSTAR because it'll show you the types of documents they're looking for.
If you don't already have a good sense of it, it's a good place. The help text is a great place.
Ken Hoyme: To start, I think.
Christopher Gates: Go ahead, Ken.
Ken Hoyme: Yeah, I think the issue you talked about with unresolved anomalies was an interesting observation on their part is that they have been talking in general about wanting to know what unpatched vulnerabilities, known vulnerabilities are in your device at the time of submission and what their risk is, as well as obviously plans to potentially get them patched after approval before you go through the field with it.
But their recognition that anomalies need to be reported in your final test in general and asking whether or not those anomalies could tie to cybersecurity is an interesting observation on their part and it just kind of expands in the number of things you need to be thinking about as you're closing out and getting ready for submission.
Christopher Gates: That's one of those tests that needs to be ongoing by the way, for the post market. I see Robert asked a question down here. It's an easy one to answer. Robert's quite right.
He points out that the expectations for FDA cyber labeling are quite long and detailed, and you need to spend a large effort to comprehend their expectations.
Exactly. Go read the April guidance. They go into a lot of detail as to what they want, and a little bit of interpretation will be in there as far as what you're supposed to convey.
But this has a pretty big impact for say your IFU, your instructions for use.
You're going to have a chapter in there now with a few headings in it.
So, you need to get ahead of it at the very least.
Ken Hoyme: One of the things in that regard I know I provided this comment to the 2022 guidance. Hopefully they will maybe rethink it. But it was in both 2018 and 2022, so perhaps not.
And that is that the SBOM is considered part of labeling.
And because the SBOM changes every time you issue patches,
I know that when I was at Boston Scientific, the internal processes for updating and modifying labeling were far more heavyweight than, you know, for service manuals or things that were considered lower-level documents.
So, I think there's an interesting challenge, if they maintain that the SBOM is a labeling artifact, that organizations need to think about: how to streamline it.
How do you generate and approve and clear that so it can move out quickly, and it isn't a labeling change that slows down getting a patch to market?
Christopher Gates: There's a good question there that John asked and talked about. Are they recommended frameworks for labeling guidances? Given the 14971 discussion on using the and the differences in IT as a risk measure for EU MDR, I don't think you should look at the IFU as a risk control or vulnerability control mechanism. It is a way to enable and inform your user base if there's something they need to do, something they can do to enable it, or just enabling them with the idea of this is what you can do to ensure the safety and security of this device. And it is not going to be a mitigation.
It is not something that you're going to say or transfer risk to the end user.
Chris Reed: I agree with that, Chris. The one thing I would also clarify though is, and I think again it's a nuance here, it is meant to communicate the risk and ideally how to manage the risk of that device. Right. So, like you threw out examples earlier like firewall port openings, you know, if patching's required, that's done by the user.
Like it should explain that. So, it's, you know, expressing the risk of the device and how it should be operated.
Christopher Gates: Yep, absolutely. New slide.
And that's all the artifacts called out in the 2022 April guidance.
So, I'm not going to walk through each one of these because I want you to read it.
38. We went from 8 artifacts in 2018 to 38.
By the way, you won't find this table in the guidance. On one of my many passes through it, I started writing down the artifacts. That's what all this came from.
Some of these things are really very cool and are advancing forward like security architecture views. I absolutely love it. It actually relates back to model-based systems engineering and I think that is the future of where we're headed for cybersecurity.
So, there are some really great things about it. Things like introducing metrics for your field updates: defect density, time periods for update penetration, time periods from awareness to your updates.
I think those are great. You're going to be keeping averages of those. Literally earlier this morning I was working with a client on that very topic, how we're doing that and setting up these metrics so that we can have all this stuff readily available on a moment's notice.
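Purely as an illustration of one such metric, here is a small sketch of how a running average of time-from-awareness-to-fielded-update could be kept; the records, dates, and field names are invented.

```python
from datetime import date
from statistics import mean

# Invented example records: when the team became aware of a vulnerability and
# when a fielded update addressing it was deployed.
records = [
    {"aware": date(2022, 3, 1), "update_deployed": date(2022, 4, 15)},
    {"aware": date(2022, 7, 10), "update_deployed": date(2022, 8, 2)},
]

days_to_update = [(r["update_deployed"] - r["aware"]).days for r in records]
print(f"Average days from awareness to fielded update: {mean(days_to_update):.1f}")
```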
So, this is stuff that you should be looking at and reading about.
Feel free to read that document several times and understand what's going on. Next slide.
A couple things I wanted to talk about training Ken mentioned he works at University of Minnesota. Ken, would you like to talk about your training for a second?
Ken Hoyme: Sure.
So yeah, we developed it in conjunction with the Center for Medical Device Cybersecurity at the University of Minnesota, which we worked to create, and it reports in under the university's Technological Leadership Institute. The institute provides master's programs geared toward working professionals: management of technology is one of their master's degrees, cybersecurity is another, though that tends to be geared toward enterprise cybersecurity, and medical device innovation is their third.
So, this is under the medical device innovation track.
We just completed it. I'm about to submit final examination results to the students this evening, and it's going to be a core class in the medical device innovation track. So, I expect it will be given annually in the spring semester. So, links...
Chris Reed: ...are there to keep an eye on it.
Christopher Gates: We so need this. This industry needs to be trained. Next slide please.
At Velentium we've also stood up a training program. This is purely video.
It is on a student-paced system that you can work at, anywhere from the masterclass: 60 hours of concentrated cybersecurity for medical devices covering regulation, covering crypto, covering everything you need to know.
And then we've got training for everyone else in your organization. And then, because the senior leadership team is so important, as any one of the three of us can tell you, if you don't have the SLT behind your back, there's no point even trying.
Cybersecurity in medical devices extends across the organization, so you need them.
So, bringing them up to speed on the expectations they should have for their organization, how to achieve it, and how to get reporting back is important too.
So, we're all doing this training to try to elevate people up in this industry to a point where we can do this without experts like ourselves. It just becomes part and parcel of doing the work.
Ken Hoyme: Translating from Chris-speak: SLT is senior leadership team.
Christopher Gates: Thank you, Ken.
Next slide.
I've seen a number of questions, and I'm going to dive into this because we saw so many about SBOMs. Ken's certainly worked on this; Chris has been around it as well.
I started working with SBOM at NTIA some five years ago and the great Alan Friedman, who is an amazing guy, he can herd cats better than anybody I've ever met.
He started this working group, and I had rolled off another group working on firmware updates with him. He says, I'm starting this new one called the software bill of materials.
I said, what's that?
He says, oh, it's basically an ingredients list of what goes into your product, and it's machine readable. And I said, really? Well, that's going to take a week, okay, maybe a month, for us to define.
Yeah, I was an idiot.
Five years later we're still debating the nuances. There are tons of edge conditions. There's lots of challenges that we've overcome.
There's still lots to go. In the crawl, walk, run model of life, we're in the early stages of walking with SBOM, but we can all now create SBOMs. At Velentium we've been creating them for two years. They're machine readable. We happen to use CycloneDX; there are two predominant standards, CycloneDX and SPDX.
They work in XML or JSON formats.
They can be consumed and monitored against vulnerability databases like the NVD and others.
And you can look at, for instance, Dependency-Check: at last count it was something like 300 million component checks monthly.
So, there is a lot of SBOM monitoring going on.
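To give a concrete feel for the machine-readable form, here is a minimal CycloneDX-style SBOM sketched in Python and emitted as JSON. The components and versions are placeholders, and a real SBOM would normally be generated by build tooling rather than written by hand.

```python
import json

# A minimal CycloneDX-style SBOM assembled by hand for illustration only.
# Component names and versions are hypothetical placeholders.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "version": 1,
    "components": [
        {
            "type": "library",
            "name": "log4j-core",
            "version": "2.17.2",
            "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.17.2",
        },
        {
            "type": "operating-system",
            "name": "FreeRTOS",
            "version": "10.4.6",
        },
    ],
}

# Machine-readable JSON like this is what monitoring tools consume.
print(json.dumps(sbom, indent=2))
```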
And I saw VEX and VDRs also mentioned out there; those are useful secondary documents.
Yes, you can include Vex information in CycloneDX format. You certainly don't have to at the moment. No regulatory agency is asking for this.
Will it be? Probably, as this evolves, and VDRs as well. Start to read about them, start to learn what they are, and start getting there. But they're not something you have to be worried about today.
Ken Hoyme: Let's go up and define what a Vex is intended to solve and what is recognized is that let's say for example, you are using Windows in your device back in the day and WannaCry came out which exploited a vulnerability in the Windows SMB protocol version one.
Well, if your device didn't use SMB and you had those ports and services completely turned off because you didn't need them, the attacks that would try to exploit that vulnerability would not work on your device.
And so, the Vulnerability Exploitability eXchange format is a way to communicate that particular vulnerabilities, which might be present based on what your SBOM says, are not exploitable because of the way your device is configured.
And so, it's recognized that rather than necessarily panicking, someone can say: you say you use Windows 10, build whatever, in your device, and these vulnerabilities are listed; what have you done?
In some cases, nothing. Because a good, hardened operating system used in a device should have almost all miscellaneous services and things turned off.
So that's the purpose of VEX.
Chris Reed: Yeah, absolutely.
Christopher Gates: We realized up front that some 80, 80% of vulnerabilities weren't exploitable.
Okay, so it's like we're going to have a huge number of false positives if we didn't come up with some artifact to address that.
Chris Reed: Yeah, and a couple things that I do want to highlight here Deccs I think technically, just recently, that's a very new standard.
I think CISA just published a document on it even just a few weeks ago.
I do want to make it clear that the law does not require, and FDA is not expecting, you to submit VEXes right now. This question was asked and I answered it,
but given the conversation, I wanted to say this.
That being said, they do expect you to give them a list of vulnerabilities and how you disposition them.
Vex is a vehicle to do that and if you gave it to them, they would likely accept that. I just want to make it clear that right now, if you didn't include a Vex in your submission, you still have to include the same information.
It just may not be in that automated, machine-readable format. Just to connect the dots there.
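As a purely illustrative sketch of what that machine-readable disposition information could look like, here is a VEX-style statement loosely following the CycloneDX vulnerability/analysis structure they mentioned. Treat the field names as an approximation rather than a schema-validated document; the product reference and justification are hypothetical, and the CVE is the SMBv1 flaw from Ken's WannaCry example.

```python
import json

# Sketch of a VEX-style statement: the vulnerability is present per the SBOM,
# but declared not exploitable because of how the device is configured.
vex = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "version": 1,
    "vulnerabilities": [
        {
            "id": "CVE-2017-0144",  # the SMBv1 flaw exploited by WannaCry
            "analysis": {
                "state": "not_affected",
                "justification": "code_not_reachable",
                "detail": "SMBv1 is disabled and its ports are blocked in the device configuration.",
            },
            "affects": [{"ref": "urn:example:device-firmware@3.2.1"}],
        }
    ],
}

print(json.dumps(vex, indent=2))
```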
Christopher Gates: So, there's two questions that are lingering out there that are kind of related, talking about transitive dependencies and also about requiring S bonds to be included in labeling.
The FDA wants to see human readable as well as machine-readable S bonds, but they are different. A machine-readable S bond may have tens of thousands of components listed in it because of transitive dependencies.
So, a transitive dependency, let's say log 4J, we'll pick on a popular topic.
Log 4J has 294 transitive dependencies. That's other 294 other projects that it needs for it to perform its job, for it to implement logging in a Java program.
So those transitive dependencies are there. You should do as much of them as you can, all of them, if possible, in your machine-readable version.
In a human readable version, no, you go to that first level, and you say log 4J and the version and where you sourced it from, and that's it.
You have much less requirements. You need to provide both for the FDA.
So, in your labeling, yes, you can put your SBOM in a human-readable form, in your IFU for instance, but not all the transitive dependencies. That's ridiculous, and the reason is simply that it's pointless. Nobody's ever going to look through thousands and thousands of entries of human-readable text to try to figure out if they're susceptible.
It just doesn't scale.
So, you need the machine-readable version to take all those thousands of entries and summarize them for you instantly, as many good tools do now. By the way, there are lots of good tools out there that you can reference.
MedCrypt, Heimdall, Cybellum are out there. These are good. There's a lot of open source as well, like Dependency-Check and all that, that will do this for you, so you don't have to do it by hand.
So, good question about it.
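To illustrate that machine-readable versus human-readable distinction, here is a small sketch that walks a toy dependency tree: the full flattened tree is what the machine-readable SBOM carries, while the human-readable list stops at the first level. The components and versions are made up.

```python
# A toy dependency tree; real trees can run to thousands of transitive entries.
dependency_tree = {
    "log4j-core 2.17.2": {
        "log4j-api 2.17.2": {},
        # ...in reality, many further transitive dependencies hang off of these
    },
    "sqlite 3.39.0": {},
}

def flatten(tree):
    """Walk every level of the tree: this is what the machine-readable SBOM carries."""
    for name, children in tree.items():
        yield name
        yield from flatten(children)

machine_readable = sorted(set(flatten(dependency_tree)))  # every component, all levels
human_readable = sorted(dependency_tree)                  # first level only, e.g. for the IFU

print(f"Machine-readable entries: {len(machine_readable)}")
print("Human-readable list:", ", ".join(human_readable))
```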
Chris Reed: On SBOMs, I see a comment from Sean talking about VEX possibly providing MDMs wiggle room to take credit for, quote, illusionary controls, to declare vulnerabilities that are applicable as not exploitable.
I wanted to use this opportunity, essentially, to summarize what he's saying: VEX is not meant to be a vehicle to just say everything's not exploitable, we're not fixing anything.
In fact, what I wanted to do was pivot a little bit. I don't know, Chris, I should have looked at the slides closer, but there was a very specific part of the law that's cited, and you'll see it in the FDA's RTA guidance.
One of the things in the PATCH Act, or the bill, was essentially that you have to have a reasonably regular update cycle for what they call known unacceptable vulnerabilities.
And then also if you have an uncontrolled vulnerability, you need to update as soon as possible.
And that essentially matches the FDA's post-market guidance on cybersecurity.
What I would say to Sean's comment is, what we're trying to do is get the concept down that we need to be doing maintenance on these devices, and reasonable routine cycles based on the device's design need to be planned in.
We're working on updating the joint security plan to kind of clarify this, but just to highlight a couple extremes.
You know, a pacemaker may have very little third-party software, and it's very specifically designed and very well tested and therefore it may not have a very frequent update cycle because it's been designed to be resilient for 10 years.
However, you may also have a medical device that's running on off the shelf windows where patches, as we all know, come available once a month.
The reasonable cycle may not be once a month, but it's probably not never.
And it's probably not, you know, every three years. I would say it's probably not even yearly, although even if we got there on some of these devices that would be better than possibly where we're at.
So, I just want to emphasize the law here; this is part of the nuance. FDA doesn't have to say, hey, you didn't apply this patch and you're unsafe.
They can just say, hey, you've not patched that device for three years and there's been all these vulnerabilities. That is not good hygiene, therefore we're rejecting your submission. You know, that's the kind of authority they now have.
And I think that's worth calling out. VEX definitely shouldn't be misused that way. It's a vehicle to communicate, maybe, why you're waiting until your next planned cycle to address a vulnerability.
It's not a vehicle to just say we're not applying any of the updates that have happened.
Christopher Gates: There's also the point of Vex, which is a company statement, and if you come out and declare something incorrectly, like you downplay a vulnerability already. We're discussing, as part of the critical infrastructure of the United States, software liability.
Okay, so there are some interesting discussions that's going on not only in the United States, but also in the EU.
Interestingly, it could do things like: oh, you use Log4j and it has a problem;
the authors of Log4j are now liable for it, not necessarily the person who used it in their project.
There's a lot to be rung out here.
But there are legal implications to posting a VEX and saying we're not vulnerable.
VEXes, as well as SBOMs, are versioned. They're made to be updated. So, you may say, we're doing a root cause investigation now and we don't know anything yet. And then you say, oh, we're not affected.
And then later on you find out you are affected; you update that and say, yes, we are affected. And you can qualify that with what the user can do as far as patches and workarounds and settings and configurations.
So, for concerns about manufacturers lying or wiggle room, as was said in the comment, the legal side of this is going to kick in and start taking care of that problem. Just as if they put on their webpage, on the security page, yeah, we're not affected by this particular ransomware or something.
And they were.
There is legal liability to that.
So, that's not something we can solve technically. We can digitally sign the VEXes, we can sign the SBOMs, but we'll let the lawyers take care of the rest.
Chris Reed: Yeah. And I hope even before the lawyers start happening, although I hope, I hope we don't get there sooner than later, I did want to highlight a couple key things about the law that you do realize.
Ken Hoyme: You do realize that Sean's online.
Chris Reed: Yes, Sean is a lawyer. Yes, I'm probably giving lawyers bait for this, right?
Christopher Gates: You tell by how he calls out phrases. I mean, come on.
Chris Reed: Yes.
Ken Hoyme: Yeah.
Chris Reed: So yeah, for those that don't know that are listening, that was who asked the question about Vex being used to basically push off things any.
So great, great question. The one thing I wanted to share, first of all, originally, if you look at the original version of Patchak that was introduced in the House, I think in April or May of 2022, I can't remember exactly what month.
Initially, FDA pushed for what I call legacy authority, meaning all devices on the market had to meet the new requirements. It doesn't matter whether or not they were new. Doesn't matter that they weren't designed with those requirements.
They had language in there around adulteration and misbranding and we did successfully get that language taken out in the final law.
It doesn't mean we don't have a problem there that we're going to work on. But I just want to emphasize the new authorities only apply to the submissions going in after March 29th this year.
That being said, before we get lawyers involved, what I expect, and this is what I've been sharing internally: for those devices that you submit vulnerability plans for, I would expect that in a couple of years, when you have FDA inspections, they're going to pull open those plans and those procedures you provided and say, show me your vulnerabilities and what you've been doing about them.
You'd better bet inspectors are going to get trained on this, and the things you're submitting today, you'd better be able to operationalize, because I do expect that will be looked at.
It's going to have a tail. But that's the implication of this, and I think that's just worth pointing out. This is hard work to get all this operationalized.
And that's the message that we've been sharing internally. You know, we're better at it with some products than others.
But I absolutely expect that they will reach back into products, even without any new authorities; they'll do it based on the inspection process and things like that.
Christopher Gates: And Chris, to your point, I've already seen that in clients. Okay, it's already occurring.
Chris Reed: Yeah, I will say I've seen it occurring. But the thing I will tell you is it happens accidentally.
You have an inspector pulling up a CAPA record or a safety risk analysis, something like that, and it happens to be security related.
Then they get into it.
They're not necessarily coming in and asking specific questions yet. They kind of stumble into security right now. And yes, we've had that.
Totally agree that inspectors are already stumbling and it's just not an intentional effort yet. And I expect that will happen at some point.
Ken Hoyme: So, I had actually jotted a note to bring up later, which you're kind of addressing, which is: FDA can issue recalls for fielded devices based on safety, so will they be able to do recalls based on cybersecurity hygiene?
And certainly, through the audit process, you could see a Form 483 observation, or something written up against not keeping the device as secure as was expected at the time of submission.
Chris Reed: There are a couple of other questions in here that are maybe more general, and they have kind of complicated answers. I've been trying to write answers to the questions that have been submitted.
Etienne Nichols: You've been fine. You've been fantastic. Yeah, go ahead.
Chris Reed: Yeah, trying to.
There are a couple of questions around general topics like IoT devices, the cloud, and reputable OSes.
Here's what I would say generally: the regulation technically applies to the regulated medical device. So, if you have an IoT device that doesn't fall within the regulated medical device, it's not technically in scope.
That being said, all those supporting systems that might be related to your device can get brought in to do this cybersecurity analysis. And so let me give you an example.
One of those was: for a reputable OS like iOS, is it better to allow auto-update or to freeze the version you did your latest testing on?
If the medical device is the actual app software... I'll pick on a commercial product. I'll even pick, generically, on one of my own company's products.
So, we have apps in the Apple App Store, and these would be used to talk to, say, your glucose monitor or things like that.
They are designed to run on a normal user's device and have to keep up with auto-updates. And that's part of the analysis we've submitted to FDA on exactly how we manage that risk.
We have other apps that are medical devices, and they have a Medtronic provided tablet or phone.
And in those cases, we generally freeze the version, because we control the device and that allows us to have a little more control.
But technically the tablet or device isn't part of the medical device. But FDA is absolutely asking questions about how we manage that device.
And you know, we've been increasingly including that scope in our submissions.
So, I share all this to say: beyond your medical device, they'll look at the broader system that supports it, and if it can directly impact safety, they will absolutely ask questions about it. I'll give another example; this is public, so I can share it.
We have AI algorithms that look at heart rhythms to make sure alerts are real, and we filter out false positives, and that software runs in the cloud.
The regulated software is the AI software that processes those.
But then we have to talk about how we manage our cloud securely and things like that. Generally, they don't reject the cloud or any of those systems; they just want to make sure that your medical device can run in a stable, controlled fashion. It's a complex topic, but I'm trying to highlight that they will pull in the supporting system and make sure you're controlling it.
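As a loose illustration of the auto-update question above, an app team that either tracks or freezes the host OS often records the newest release it has been verified against and decides at runtime how to respond when the OS has moved past that. The sketch below is hypothetical, not Medtronic's or Apple's actual mechanism; the version numbers and function names are made up, and the right response (warn, restrict features, or block) would come from the manufacturer's own risk analysis rather than from this snippet.

```python
# Hypothetical sketch: the app records the newest OS release covered by its
# verification testing and surfaces a message when the host OS is newer.
# Version numbers and the messaging are illustrative only.

VALIDATED_MAX_OS = (17, 4)  # highest OS release covered by verification testing

def parse_version(version_string: str) -> tuple:
    """Turn a string like '17.5.1' into (17, 5, 1) for simple comparison."""
    return tuple(int(part) for part in version_string.split("."))

def check_os_compatibility(current_os: str) -> str:
    """Return a user-facing message describing the validation status."""
    if parse_version(current_os)[:2] <= VALIDATED_MAX_OS:
        return "OS version is within the validated range."
    # Whether to warn, restrict, or block here should follow the device's own
    # documented risk analysis, not this sketch.
    return ("This OS version has not yet been verified with the app; "
            "check for an app update before relying on it.")

if __name__ == "__main__":
    print(check_os_compatibility("17.2"))  # within the validated range
    print(check_os_compatibility("18.0"))  # newer than the validated range
```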
Christopher Gates: I wanted to address a question from John here about bring your own device, BYOD; notice I call out what that means.
And he says MDM, Mobile Device Management, but what he really means is a kitted device.
And it's a very complex decision. In any sort of sophisticated ecosystem around a medical device, you'll probably have both. You may have a patient controller that is BYOD, and yet you'll have a clinical implantation tool used in the surgical theater, say a Windows tablet, that is a kitted device.
Both approaches have their pluses and their minuses in how you manage them and in all the extra work you need to do to keep them updated. It's quite a complex choice, and I can't just say use this one or use that one, because it really depends a lot on your business model and the use cases associated with it.
I will tell you, we've had a lot of clients come to us, and we're now doing a lot of post-market support for them: administering their MDMs, managing these kitted devices for them, configuring them, and providing all those services, because it is so involved, so much extra work.
And a lot of this cybersecurity is going to mean extra post-market work that you're not expecting. That's just one area that, in fact, caught us off guard.
We literally had clients coming to us asking us to do it for them.
This wasn't something we offered up front. We didn't anticipate this.
So, it is a big deal and it's a very challenging decision to make how you do that.
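To give a feel for what administering an MDM for a kitted device can involve, the sketch below expresses a hypothetical lockdown policy as plain data and checks a reported device state against it. The field names and the bundle identifier are illustrative only and are not drawn from any specific MDM vendor's API; a real deployment would use the vendor's own policy format and reporting.

```python
# Hypothetical kitted-device policy of the kind an MDM administrator might
# maintain post-market. Field names and values are illustrative only.
KITTED_TABLET_POLICY = {
    "os_updates": "deferred",           # OS version frozen until re-verification
    "allowed_apps": ["com.example.clinical_programmer"],  # placeholder app ID
    "app_store_access": False,          # no user-installed apps
    "disk_encryption_required": True,
    "screen_lock_timeout_minutes": 5,
    "usb_data_transfer": "blocked",
    "remote_wipe_enabled": True,        # for lost or decommissioned units
}

def audit_device(reported_state: dict) -> list:
    """Return the policy keys a managed device does not currently satisfy."""
    return [key for key, expected in KITTED_TABLET_POLICY.items()
            if reported_state.get(key) != expected]

# Example: a tablet that somehow re-enabled App Store access would be flagged.
drifted = audit_device({**KITTED_TABLET_POLICY, "app_store_access": True})
print(drifted)  # ['app_store_access']
```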
Ken Hoyme: So, we have had several questions, I think, in the area of the cloud. I wonder if either of you have thoughts on how this whole area gets handled.
Christopher Gates: I live in Las Vegas. There's no clouds here. Okay, sure.
Chris Reed: Actually, I'm pretty sure there's pretty large data centers that are hosting clouds.
Christopher Gates: Very, very large data centers. Including Google.
Chris Reed: Yeah.
So, yeah, I mean, I tried to touch on that a little bit in one of the examples I was giving earlier.
I'd like to be very clear, like cloud can absolutely be part of your solution and in many cases should be.
The thing I would say is that the key is understanding the boundaries: where your device ends, where the supporting infrastructure begins, and the risk that might present.
The other thing you'll hear FDA talk about a lot is making sure you factor loss of connectivity to the cloud into your safety analysis. Maybe the best example of that is one you'll hear FDA talk about.
And again, I'm not trying to pick on a company, and you can Google this, it's a public case, but I believe Elekta had a ransomware incident in the cloud infrastructure that supported how they configured, I think it was, radiation delivery devices inside cancer treatment clinics.
And because that went down, they couldn't download the configurations for certain patients and basically many clinics all across the US couldn't deliver radiation therapy for a couple weeks.
If you just search that, you can read about it.
Those are the types of things FDA sees and goes, wait a minute, how did they not have a way around that? They just stopped patient care, and you may have actually caused an issue where the delay in therapy allowed the cancer to grow back.
I mean, that's going to be really hard to prove, but you can imagine there are impacts. And if you were designing a device that required cloud connectivity to keep a patient safe, that's probably going to be a pretty tall order. So hopefully that helps with the cloud question.
There are definitely different aspects to consider, but that's part of why FDA talks about threat modeling and really thinking through your safety analysis. What happens when the cloud is not available should definitely be in your risk analysis.
When the cloud is not available, how does that affect my device?
Ken Hoyme: I think it's not appropriate to just say the cloud service handles security. You have to explore in your threat model which security responsibilities sit with the cloud service and which sit with the way you configure your application and deploy it in the cloud. And certainly, in doing a threat model, the availability of the various links should be part of the threats you are considering.
And yeah, I've long advised, having started my career in aviation, you really have to rely on local control.
You can't fly by cloud.
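As a minimal sketch of the "cloud not available" point, assume a device that normally pulls its configuration from a cloud service. One common mitigation is to keep a locally cached, integrity-checked copy of the last known-good configuration so an outage (or a ransomware event in the supporting infrastructure) does not halt therapy. Everything below is hypothetical: the file names, the fetch hook, and the use of a plain hash where a real device would likely require a manufacturer-signed configuration.

```python
import hashlib
import json
import pathlib

# Hypothetical "loss of cloud connectivity" fallback: cache the last
# known-good configuration locally along with a digest, and fall back to the
# verified cache when the cloud cannot be reached.

CACHE_PATH = pathlib.Path("last_good_config.json")
CACHE_HASH_PATH = pathlib.Path("last_good_config.sha256")

def cache_config(config: dict) -> None:
    """Store the latest cloud-delivered configuration and its digest."""
    blob = json.dumps(config, sort_keys=True).encode()
    CACHE_PATH.write_bytes(blob)
    CACHE_HASH_PATH.write_text(hashlib.sha256(blob).hexdigest())

def load_cached_config() -> dict:
    """Return the cached configuration only if its digest still matches."""
    blob = CACHE_PATH.read_bytes()
    if hashlib.sha256(blob).hexdigest() != CACHE_HASH_PATH.read_text().strip():
        raise ValueError("Cached configuration failed its integrity check.")
    return json.loads(blob)

def get_config(fetch_from_cloud) -> dict:
    """Prefer the cloud copy; fall back to the verified local cache."""
    try:
        config = fetch_from_cloud()   # hypothetical network call
        cache_config(config)
        return config
    except OSError:
        # Degraded mode: log and alert per the risk analysis, then continue
        # with the last verified configuration rather than halting therapy.
        return load_cached_config()
```

Whether continuing on a cached configuration is acceptable, and for how long, is exactly the kind of question the threat model and safety risk analysis discussed above are meant to answer.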
Christopher Gates: Well, there are a lot of great questions coming in here, and we could probably do this all day, but I want to thank my two friends Ken and Chris for coming here today. We all thought this was important enough to try to get some plain speech out to the industry about where you are, where you're going, and how this is going to change going forward.
So, I especially want to thank our friends at Greenlight Guru for giving us this forum and venue for doing this.
So, thank you, everyone. Great questions today.
Etienne Nichols: Thank you all for coming.
We'll go ahead and close it down, but a few people have asked about whether or not this will be released later. This is going to be released as a podcast episode in a few weeks and if you want, we can let you know via email when it comes out.
Also, it'll be at www.greenlight.guru/podcast, and we'll let you know when it comes out. Thank you all, I really appreciate this. Any last words?
Christopher Gates: If you enjoyed this, let us know on LinkedIn. I think we're all on LinkedIn, so let us know if this all worked well.
And who knows, maybe you might see the three of us again sometime. Talking about another subject.
Ken Hoyme: I'm just glad my cats left me alone.
Etienne Nichols: Fantastic. It's been a pleasure serving as your intern and we'll see you all next time. Thank you everybody for coming.
Christopher Gates: Thank you everybody.
Ken Hoyme: Bye bye.
Etienne Nichols: Thanks for tuning in to the Global Medical Device Podcast. If you found value in today's conversation, please take a moment to rate, review and subscribe on your favorite podcast platform. If you've got thoughts or questions, we'd love to hear from you.
Email us at podcast@greenlight.guru.
Stay connected for more insights into the future of MedTech innovation. And if you're ready to take your product development to the next level, visit us at www.greenlight.guru. Until next time, keep innovating and improving the quality of life.
About the Global Medical Device Podcast:
The Global Medical Device Podcast powered by Greenlight Guru is where today's brightest minds in the medical device industry go to get their most useful and actionable insider knowledge, direct from some of the world's leading medical device experts and companies.
Etienne Nichols is the Head of Industry Insights & Education at Greenlight Guru. As a Mechanical Engineer and Medical Device Guru, he specializes in simplifying complex ideas, teaching system integration, and connecting industry leaders. While hosting the Global Medical Device Podcast, Etienne has led over 200...