Medical Device Quality, Regulatory and Product Development Blog | Greenlight Guru

Software as a Medical Device: Securing Your Digital Future

Written by Etienne Nichols | September 29, 2025

In this episode, host Etienne Nichols sits down with Jose Bohorquez and Mohamad Foustok from CyberMed to dissect the complex world of Software as a Medical Device (SaMD) and cybersecurity. They emphasize that SaMD is first and foremost a medical device and should be treated as such from the very beginning of the development process. The conversation highlights the most common mistakes companies make, like treating security as an afterthought and jumping straight into coding without a solid architectural plan.

Mohamad Foustok introduces the concept of "zero trust" and the critical importance of designing for security across the entire product lifecycle, from initial concept to post-market surveillance. The discussion clarifies that cybersecurity is not limited to network-connected devices but applies to any medical device with a software function, regardless of its connectivity. They also touch on the historical context of FDA guidance, noting a significant shift in recent years that has raised the regulatory bar and put a greater emphasis on robust cybersecurity documentation.

The guests provide actionable advice for MedTech professionals, stressing the value of a balanced approach that integrates security and functionality from day one. They explain that a well-thought-out process, though seemingly slower at the outset, ultimately saves time and resources by preventing costly and time-consuming redesigns later on. This episode serves as a vital guide for anyone looking to build a secure and compliant medical device in today's evolving regulatory landscape.

Love this episode? Leave a review on iTunes!

Have suggestions or topics you’d like to hear about? Email us at podcast@greenlight.guru.

Key timestamps

  • [01:50] Common pitfalls in developing SaMD, including overlooking regulatory guidance like IEC 62304.
  • [03:20] The critical mistake of treating cybersecurity as an afterthought in product development.
  • [05:00] Who cybersecurity applies to beyond software, including patients, manufacturers, and supply chains.
  • [06:30] The FDA's stance on cybersecurity for any device with a software function, even if not network-connected.
  • [08:00] A discussion on "reasonable assurance of cybersecurity" and what it means for manufacturers.
  • [10:00] The "zero trust" principle and why you should never assume a network is secure.
  • [14:00] How hospitals and other stakeholders are demanding more rigorous cybersecurity standards.
  • [15:40] The ideal process for a "security-first" development lifecycle.
  • [21:00] Why rushing development without a proper architecture can lead to significant delays and cost overruns.
  • [23:00] A brief history of FDA's cybersecurity guidance and the major shift in 2023.

Top takeaways from this episode

  • A "Security-First" Mindset is Essential: Integrate cybersecurity from the initial architectural phase of your project. This proactive approach saves significant time and money by avoiding costly redesigns and delays later in the development process or after an FDA submission.
  • Cybersecurity is for All Software-Driven Devices: Don't assume that only cloud-connected devices need cybersecurity documentation. The FDA requires documentation for any device with a software function, including embedded systems and programmable logic, even if it's not connected to a network.
  • Regulatory Compliance is a Process, Not a Document: The FDA is not just looking for a checklist of documents; they want to see a well-defined and consistently followed process for how you build and secure your software. This includes a "reasonable assurance of cybersecurity" that stands up to scrutiny.
  • Hospitals are Your New Regulators: Beyond FDA compliance, be prepared for hospitals and other buyers to conduct their own rigorous cybersecurity audits. A strong cybersecurity posture is becoming a key differentiator and a prerequisite for market access.

References:

  • IEC 62304: The international standard for medical device software life cycle processes. It is a foundational requirement for developing compliant medical software.
  • FDA Guidance Documents: Specific documents from the U.S. Food and Drug Administration that provide detailed requirements for software as a medical device (SaMD) and cybersecurity.
  • Etienne Nichols' LinkedIn: For more insights and connections in the medical device industry, connect with Etienne Nichols. [https://www.linkedin.com/in/etienne-nichols-105151241/]

MedTech 101

Zero Trust: A cybersecurity principle that means you should never automatically trust anything inside or outside of your network perimeter. Instead, every access request must be verified before granting access. Think of it like a strict security guard who checks everyone's ID, even if they claim to work there. In a hospital setting, this means a medical device should not assume the hospital's Wi-Fi is secure; it should treat every connection as potentially hostile and build in its own protections. This is in contrast to the old model where everything inside the network was trusted by default.
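The "strict security guard" idea above can be sketched in a few lines of code. This is an illustrative toy, not anything from the episode: the `AccessRequest` fields and `authorize` function are invented names. The point is that the request's network of origin is deliberately never consulted, so "inside the hospital network" earns no trust.

```python
# Hypothetical zero-trust access check (illustrative sketch only).
# Every request is verified on its own merits; the network it came
# from grants no implicit trust.

from dataclasses import dataclass


@dataclass
class AccessRequest:
    identity_verified: bool  # e.g. a validated client certificate
    token_valid: bool        # e.g. a signed, unexpired access token
    origin: str              # "internal" or "external" -- deliberately unused


def authorize(req: AccessRequest) -> bool:
    """Grant access only on verified identity plus valid credentials.

    Note that req.origin is never checked: arriving from the "trusted"
    internal network earns nothing under zero trust.
    """
    return req.identity_verified and req.token_valid


# An unauthenticated request from inside the hospital network is refused...
assert authorize(AccessRequest(False, False, origin="internal")) is False
# ...while a fully verified request is allowed, even from outside.
assert authorize(AccessRequest(True, True, origin="external")) is True
```

The contrast with the old perimeter model is exactly the unused `origin` field: a perimeter-based check would return `True` whenever `origin == "internal"`, which is the assumption zero trust removes.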

Memorable quotes from this episode

"Software as a medical device ultimately is a medical device, and so you want to be developing it from the get-go with that mindset." — Jose Bohorquez

"Security can't be an afterthought. You have to consider security at the inception of your approach to a product." — Mohamad Foustok

Feedback Call-to-Action

We love to hear from our listeners! Your feedback helps us create content that is most valuable to you. Please send your comments, topic suggestions, and guest recommendations to podcast@greenlight.guru. We read and respond to every email personally.

Sponsors

This episode of the Global Medical Device Podcast is brought to you by Greenlight Guru. In a world where regulatory requirements for software are constantly changing, having a robust and agile Quality Management System is non-negotiable. Greenlight Guru's Medical Device QMS & EDC solutions are purpose-built to help you navigate these complexities, from initial design through post-market surveillance, ensuring your SaMD and other medical devices are secure, compliant, and ready for market. Visit their website to learn how their solutions can streamline your entire development process.

 

Transcript

Etienne Nichols: Hey everyone. Welcome back to the Global Medical Device Podcast. My name is Etienne Nichols. I'm the host for today's episode. Today I want to talk about software as a medical device, specifically cybersecurity and some of the things that relate to it, some of the pitfalls in the development of a software as a medical device.

 

With me today to talk about this topic are Jose Bohorquez and Mohamad Foustok from CyberMed.

 

Am I saying your name correctly — Bohorquez or Borquez? Either one's fine.

 

I should have practiced this with you. Okay, we'll start with Jose.

 

Jose Bohorquez helps MedTech companies build secure, compliant software. He's the president of CyberMed, where he leads cybersecurity services like penetration testing, fuzz testing, and FDA documentation. He also runs Bold Type, which is a software firm that develops FDA-ready apps, embedded systems and cloud tools under an ISO 13485 quality system.

 

Jose has spent over 20 years at the crossroads of engineering, healthcare and startups. He reviews NIH grant proposals, mentors digital health founders, and has helped shape key software and cybersecurity standards through AAMI.

 

And he holds a PhD from MIT. He's led teams that raised capital, launched products and earned multiple patents. He speaks the language of code, compliance and clinical impact, and he helps others do the same.

 

And I really appreciate you being with us here today to talk, Jose. Mohamad Foustok also builds and protects technology that people's lives depend on. He's the Chief Security Officer at CyberMed, and chief software and cybersecurity architect at Bold Type, where he drives innovation in medical devices, biosensors and connected health.

 

And for more than 20 years, Mohamad has worked at the front lines of software systems and cybersecurity. At Motorola, he helped shape mission critical communications used by first responders. And at Blue Maestro, he co-founded and launched the world's first Bluetooth pacifier thermometer, which I could probably use right now.

 

Not me personally — my six-month-old at home. His work spans hardware, firmware, and apps to cloud platforms. His focus is always the same: building technology that works, scales and stays secure.

 

Where have I messed up or gone sideways? Love to fill in any gaps, but both of you, welcome so much to the show.

 

Jose Bohorquez: Thank you. Thanks for having us.

 

Etienne Nichols: So, if we were to jump into this topic of software as a medical device — what are the pitfalls? What are the biggest mistakes you see teams make when they're building their first software as a medical device product?

 

Jose, I don't know if you want to take the lead here and just.

 

Jose Bohorquez: Sure, yeah. I think, you know, software as a medical device ultimately is a medical device, you know, and so you want to be developing it from the get-go with that mindset.

 

So, ensuring that you understand, you know, IEC 62304 and FDA's guidance documents.

 

I think this is an area where FDA has probably the most guidance documents in kind of everything that FDA covers. I think there's over two dozen, you know, guidance documents related to software.

 

Anything from, you know, software as a medical device to, like, other functions — your medical device software might have multiple functions, some of them medical and some of them non-medical — to, of course, cybersecurity.

 

So really having an awareness of the regulatory landscape that they need to develop under and recognizing that what FDA is looking for is a process. Right. They want to know that you're following a process as you develop the software.

 

So, it's not the kind of thing that you want to try to tack on at the end, whether that be for cybersecurity or just for your standard 62304 type of documentation.

 

Etienne Nichols: I think you hit the nail on the head with your first line especially. I love that you said software as a medical device is a medical device, because I think we often treat it like its own thing.

 

And I think it's interesting too, because there's the other side of that coin when we're talking about, you know, I'm sure Mohamad's going to say a few things about cybersecurity.

 

We assume cybersecurity is only for software.

 

And so, both of those — you know, it's a blending of the two. And I think that's really powerful.

 

I want to ask a question real quick about that.

 

Mohamad Foustok: May I jump in and say what I think is one of the big ones?

 

Etienne Nichols: I do want to hear. Yeah, absolutely.

 

Mohamad Foustok: So, this is one that actually applies to systems in general, not just medical.

 

And it's something the FDA really highlights, but it's true in medical too: security can't be an afterthought.

 

A lot of teams make the mistake of thinking that their focus in medical should be on the functional side or the clinical side, and kind of think, well, we'll deal with security later.

 

But the reality is good security starts at the beginning. You have to consider security at the inception of your approach to a product, whether it's a SaMD or whether it's a general-purpose medical device.

 

And that's the mistake I always see: people come in too late, saying, now we've got to figure out how to do security.

 

Etienne Nichols: Now, I kind of said something assuming everybody already knows this — as far as cybersecurity being applicable to so many more devices than just an app —

 

can you touch on that and talk a little bit about who it is actually applicable to?

 

Mohamad Foustok: Sure. Jose, you want to jump in, or want me to go for it? Okay, yeah. I was going to say, it's interesting because it actually applies to pretty much all the stakeholders that would interact with the medical device.

 

If you're looking at the medical device context, the users themselves — patients, for instance — there are security implications to what they need to do.

 

And it's interesting that things like your labeling on your medical device product needs to include instructions to the users of your product to ensure that they know how to apply good security practices there.

 

And it could be something as simple as the choice of passwords, for instance, if it uses a password, or good security hygiene.

 

Don't write your password down on a piece of paper and put it in your wallet.

 

But it starts with the patient. So, you asked, who does it apply to? It applies to your patients, it applies to your manufacturer, your supply chain, your distribution channels. So, all of those things touch your device.

 

It applies obviously to you as a manufacturer.

 

The updates that you will do to the system have to all be secure.

 

So basically, every aspect of your product is impacted by security concerns.

 

Jose Bohorquez: Yeah. And from a regulatory standpoint, FDA's guidance document is very clear that any medical device that has any software function, whether that be application or embedded software, or even programmable logic, whether or not it's network connected, has to submit all of the cybersecurity documentation.

 

So, when a medical device manufacturer is developing a medical device that has any software and they go to fill out their eSTAR application for their 510(k) or De Novo or PMA, as soon as they click that their system has software, there are about 12 attachments related to cybersecurity that, you know, show up in the eSTAR form, and that's independent of whether it's cloud connected or not.

 

So, sort of the decision tree around "does this apply to me as a manufacturer" is pretty simple: does your device have software?

 

Mohamad Foustok: With regards to those, there's an important phrase that FDA uses, and this has been touched on a few times, which is this concept of reasonable assurance of cybersecurity.

 

And what this phrase is really used for is to basically dictate that, for those attachments Jose mentioned, this is not a matter of just paying lip service — here are some documents, you know, generic documentation, that's good enough.

 

It's you as a manufacturer attesting to FDA that you have taken this seriously — you're providing a reasonable assurance of cybersecurity that they can audit and review.

 

And so, it's not just a matter of throwing some documents together. It's a well-thought-out security approach to the product.

 

Etienne Nichols: And I don't know if you're able to go into some of the history. I know FDA has recently — maybe as recently as the last 18 months — increased its cybersecurity requirements quite a bit. I'm going to relate just a very brief story from last week, when my wife and I went in for her to get a CT scan.

 

And I'm sitting there in the waiting — well, it's not the waiting room; it was almost an admin office, a room where patients are located. And I'm holding the baby, and I think, man, I've got to get some stuff printed.

 

And I look over and there's a printer there, and I look at my phone to see if I can print from my phone. And I'm locked out. I think, I bet there's a USB port on there.

 

So, I just put my USB in, and I printed them. And my wife slapped my wrist when she got back. She does not approve of me doing that sort of thing.

 

But in hospitals, when all these devices are connected in different ways — I'm not a software guy, I don't know what the cybersecurity concerns are in those regards, but that is a concern, right? It's the Internet of Medical Things.

 

What do you say to somebody who says, well, who's going to hack my medical device? Or what's the likelihood they'll actually be able to get into my medical device?

 

Any thoughts there?

 

Mohamad Foustok: Sure.

 

So, it's interesting. So, when you think about networks. So, let's start with that, right? There's this first initial assumption that a hospital network, for instance, is secure. I'm in a hospital.

 

Hospitals take care of their networks. They're secure.

 

The reality is hospitals are open to the public and anyone can bring anything in. And so, you sort of have to begin by removing that assumption and saying, let's assume first of all that we're not in a secure environment. Don't ever trust the environment that you're in. The only environment, in fact, that you can trust is the one you can control, and that's your own environment.

 

For anything beyond that, there's this notion of what they call zero trust, which is difficult to achieve, but it's the idea that everything past the boundary you can control, you don't trust.

 

If you do apply trust to something you don't control, you're essentially opening yourself up to an attack.

 

Now you say, how can they attack?

 

I mean, hackers, this is what they do.

 

Attacking systems is what they specialize in. And if there is a network that has any kind of connectivity to the Internet, odds are that someone somewhere can attack it. And so, if you have a device and your device is connecting to the Wi-Fi network in the hospital, assume that Wi-Fi network is untrustworthy and an attacker can reach the Wi-Fi network itself. And if they're in the Wi-Fi network, you're right there.

 

So, if you're not secure, they'll attack you.

 

Jose Bohorquez: And you brought up a really interesting example, Etienne, which is that, you know, as a medical device manufacturer, you might think, well, my device doesn't have network connectivity, so I'm safe. But you brought up a perfectly good example: somebody can attack it through USB, right? Somebody can attack it through a serial port connection that you have, or, you know, an Ethernet connection, or some sort of test connector you put in.

 

Now, the exploitability of that kind of device that doesn't have inherent connectivity to the cloud is going to be lower, right? So, one of the things that you do in cybersecurity is try to quantify the risk that, you know, your system has.

 

And, you know, part of that is: how easy is it to access the system? Obviously, a device that can be accessed through the cloud is going to have a higher risk profile than a device that requires a human being to plug into a USB port.

 

But if you're talking about something like — imagine an infusion pump at a hospital that has a USB connection there, and, I don't know, somebody of high importance is there — if somebody could just walk up to it, stick in a USB thumb drive, and, you know, change the software, tamper with it, for example. I mean, that's very high risk.

 

So that's where FDA says, you know, it's not just network-connected devices, it's not just wireless devices. If your device has any kind of means of tampering with the software or extracting information, you have to consider this — to your example, Etienne.
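Jose's point about quantifying risk by how a device can be reached can be sketched as a toy scoring exercise. This is not an FDA method or anything from the episode; it's a hypothetical illustration whose weights loosely follow the CVSS v3.1 Attack Vector metric values (Network 0.85, Adjacent 0.62, Local 0.55, Physical 0.20), which encode the same intuition: cloud-reachable interfaces rank above walk-up USB ports, but the USB port never scores zero.

```python
# Illustrative toy only -- not an FDA scoring method. Weights loosely
# follow the CVSS v3.1 Attack Vector metric values, which capture the
# intuition that remotely reachable interfaces are easier to exploit.

ATTACK_VECTOR_WEIGHT = {
    "network": 0.85,   # reachable over the Internet / cloud
    "adjacent": 0.62,  # same Wi-Fi network or local subnet
    "local": 0.55,     # needs a session on the device itself
    "physical": 0.20,  # needs hands on the hardware (USB port, test header)
}


def rank_interfaces(interfaces: dict) -> list:
    """Rank a device's interfaces by a rough exploitability weight,
    highest (most remotely exploitable) first."""
    scored = [(name, ATTACK_VECTOR_WEIGHT[vector])
              for name, vector in interfaces.items()]
    return sorted(scored, key=lambda item: item[1], reverse=True)


# A hypothetical infusion pump: cloud API, hospital Wi-Fi link, USB port.
pump = {"cloud_api": "network", "wifi_link": "adjacent", "usb_port": "physical"}
print(rank_interfaces(pump))
```

The cloud API ranks highest, but the physical USB port still carries a nonzero weight — which is exactly why the guidance discussed here doesn't exempt non-networked devices.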

 

Mohamad Foustok: And just adding one more layer to this: you brought in your cell phone, and your phone is cellular connected, therefore it's connected to the Internet by definition. You took your phone, and you put a USB cable to the printer.

 

Let's assume that the printer itself had no connectivity. Well, it just became connected. That printer just became connected to the Internet through your phone.

 

If the printer itself happened to be also connected by Ethernet connection, for instance, to the rest of the clinic through your phone, you have now allowed the printer and the rest of the network to all be connected to the Internet.

 

You've created a gateway, even though you had no intention to do so.

 

And it was never thought through that way. The reality is, you know, you're connecting these devices together, inadvertently, onto the Internet, which means now it's cloud connected, which means anyone anywhere on the planet, through your phone, can get access to the internal networks.

 

Etienne Nichols: Yeah, I feel very terrible right now. However, I will specify that I did it using my laptop, and then I had the physical device, which I password protect.

 

But yes, yeah, absolutely.

 

Jose Bohorquez: And the hospital does have some responsibility there, because, you know, the FDA-regulated cybersecurity rules don't apply to the printer. Right. And so, you know, most hospitals won't connect that printer to their —

 

Oftentimes they have two networks. They have kind of a more publicly accessible one and a more secure one. But your medical device might be getting connected to the more secure one.

 

And that's one of the things that you have to consider: what impact can your device have on another system, like the hospital network you're being connected to.

 

And by the way, it's not just FDA that cares, it's also the hospitals. I mean, if you're trying to sell a medical device into a hospital, more and more that hospital CISO is going to send you a form that is oftentimes more exhaustive than what FDA is requiring because they want to have reasonable assurance that your device isn't going to function as, you know, as a gateway to enter into their system and create more vulnerabilities for them.

 

Etienne Nichols: That's a really good point. And it's something I want to maybe bring out a little bit more because oftentimes we focus on regulatory submission and hey, once I get submitted, I'm done.

 

But it doesn't mean you're necessarily generating revenue at that point. So, the hospital makes sense. Are there other players that we need to be considering when building out these devices, particularly from a cybersecurity standpoint?

 

Mohamad Foustok: Yeah, I mean, interoperability. Interoperability in general.

 

Etienne Nichols: Right.

 

Mohamad Foustok: And cybersecurity is a part of interoperability. How does your device fit into a larger ecosystem?

 

And nowadays the security aspect of interoperability has become critical, because no one is going to allow your device to be part of that ecosystem if you can't demonstrate that your device is safe to use from a security standpoint.

 

Etienne Nichols: Yeah, yeah, that makes sense. What does good look like in 2025? I know things have changed.

 

If you could just walk through the process. You're designing a medical device — software as a medical device. And I know that's something you both do consistently and have done for several years now.

 

What does that look like, that process.

 

Jose Bohorquez: Of medical device development — like, putting the security piece aside, or really, the security piece should be incorporated?

 

Etienne Nichols: Yeah. Tying to Mohamad's first point, where he said the biggest pitfall is that they don't do that from the beginning. I can just see the assumption some people make: oh, well, these are software developers, they're going to consider cybersecurity — when in reality those are two different disciplines.

 

But thoughts about that?

 

Mohamad Foustok: I mean, I would say, I mean, okay, so you're following a standard flow of initially, you're identifying your user needs.

 

From there, you're creating overall system requirements and you're starting to formulate a solution to your approach or to what problem you're trying to solve. Right.

 

And you begin, obviously the first stage of a solution, other than analysis of the requirements through use cases, is establishing an architecture.

 

Well, that is the time to bring in a security architect.

 

So, you know, generally speaking, medical device architects or software architects are focused on the functional aspects of the system.

 

Security architects are focused on the security aspects. And so, someone with a specialization as a security architect would come in, take a look at the existing — or at least the proposed — architecture, analyze it and say, okay, from a security standpoint, here are the pitfalls we're potentially seeing, here's what we need to do from a security standpoint.

 

And now you start getting tensions, because trade-offs will be required between security and, potentially, functionality. And you're going to have to decide that up front.

 

This is why I said from the beginning, if you make all your assumptions initially that all you're trying to solve is the technical requirements, you know, what do you need to do from a functional standpoint.

 

And you ignore security, at some point down the road you're going to realize you need security.

 

And now that's going to hit right against whatever new decisions you had made up front on an architectural basis of your solution. And you may have to go back and change your architecture, which has massive impacts to accommodate security.

 

So, if you want to do it the right way, you're doing it in lockstep every step of the way, from the beginning.

 

And it's a cycle. Because what's going to happen is: you analyze your architecture and determine what security needs there are. Those will impose new controls, which come back as new requirements, which impact your architecture as well. So, you're going around until you've satisfied both the functional needs and the security needs.

 

You have to satisfy both together in a balanced way before you move forwards. And then when you get down into the next levels, you're looking at design same thing. You're designing for your functional needs and you're designing for your security needs.

 

When you are pulling in third-party components to meet your functional requirements, you analyze those for their security vulnerabilities, and you determine whether they're the right choices for you. You may find something — you're picking up a third-party component and it's great from a functional perspective, it meets all your needs.

 

However, you analyze it from a security standpoint, and it has a lot of vulnerabilities.

 

You probably want to say, you know what, I'll take a pass on this one. I might pick up something else which may not be a 100% functional match but is a much better security match.

 

So those are, like I said, so you're making these trade-offs along the way, always balancing security and functionality.
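That balancing act can be made concrete with a small sketch. Everything here is invented for illustration — the candidate libraries, the per-vulnerability penalty, and the `component_score` function are hypothetical, not from the episode — but it captures the trade-off Mohamad describes: a component is scored on functional fit and security posture together, not on features alone.

```python
# Hypothetical component-selection sketch (illustrative only).
# Each candidate gets a combined score: functional fit (0..1) minus a
# fixed penalty per known vulnerability. Data and weights are invented.

def component_score(functional_fit: float, known_vulns: int,
                    vuln_penalty: float = 0.15) -> float:
    """Combine functional fit with a penalty for each known vulnerability."""
    return functional_fit - vuln_penalty * known_vulns


# Two made-up candidates: a feature-rich library with several known
# vulnerabilities, and a leaner library with a clean security record.
candidates = {
    "lib_feature_rich": {"functional_fit": 1.00, "known_vulns": 4},
    "lib_conservative": {"functional_fit": 0.80, "known_vulns": 0},
}

best = max(candidates, key=lambda name: component_score(**candidates[name]))
print(best)
```

With these made-up numbers, the 100%-functional library loses once its vulnerability count is priced in — the "not a 100% functional match but a much better security match" outcome described above. In practice the vulnerability counts would come from sources like an SBOM review or vulnerability databases rather than hand-entered values.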

 

You descend down — and I'm skipping a lot — but when you get into your verification, same thing: you're doing functional verification, you're doing security verification.

 

So, you're always doing these in parallel as you go. And you'll find security problems that can impact you at the implementation level, so you make changes; you might find functional problems that result in security problems.

 

So, I think, again going back: you cannot treat these things separately.

 

Security and functionality go hand in hand throughout the entire life cycle of your system.

 

Post market, same thing. You release your product; you're going to be doing post market surveillance.

 

Your post-market surveillance covers reported issues of functionality, it covers reported issues of security, and you are required to handle both together. So, it's really for the entire life cycle. Decommissioning — let's even go all the way to end of life.

 

You're taking your product out of commission at the end of its life cycle. Well, you're going to sanitize your product to make sure that you don't have any medical data on it.

 

For instance, leftover residual medical data. You're going to remove anything security-related on there too. You're going to remove any accounts that are on there or anything else that's in there.

 

So, from the beginning to the end, every step of the way, you're looking at functionality and security together. Hopefully that's a good answer.

 

Etienne Nichols: Man, everything from the prenup to life insurance for your medical devices. Yeah, that's pretty intense. Jose, I noticed you came off mute. Did you have something you wanted to add to that?

 

Mohamad Foustok: No.

 

Jose Bohorquez: Just thinking back to your first question of, like, what's one of the mistakes that people make when developing a SaMD — and even going beyond security.

 

I mean, it is sometimes they skip the architecture phase, you know, and they sort of go from requirements to trying to hack something together.

 

You know, in particular, like people like to code and they just want to jump right into it. They're not taking the time to really think about how the system is being architected.

 

That comes back to bite them later. You know, it's one of the reasons why, if you're following 62304, as you should be when developing medical devices — it's a standard that encourages, or really requires, you to develop an architecture document, right? Really think about how the system is architected and how those different components of the architecture meet the requirements of the system.

 

And so, as Mohamad's saying, you know that the importance of looking at it from a security standpoint, even at the architecture level, some companies make the mistake of not even looking at the software, you know, from an architecture point of view and not even thinking about that.

 

So, it comes back and bites you later.

 

So, it's penny wise and pound foolish to really just say, no, I just want to start going — like, let's build the features, let's do all that kind of thing.

 

And where it really hurts you is down the line: you discover a very significant vulnerability and you have to take ten steps backward and start all over again. Or, worst-case scenario, you submit to FDA — and they've gotten pretty savvy about cybersecurity in the last year — and they come back and tell you you've got a serious vulnerability here, and that means you've got to go re-architect your system, redesign the software, redo verification testing, redo pen testing, update all your documentation, and you're six or twelve months behind schedule on launch because you didn't take those precautions early in the process.

 

Etienne Nichols: I love the terminology that software uses — architect, for example — because I think most people can relate to building a house. I recently designed and built a house. When they put the trusses in, I forgot to poka-yoke them, and I had a spot for HVAC to go through, and they were asymmetrical, so I was like, I can't. So anyway, yes, ripping things apart — just tying that back.

 

Mohamad, I know you have something to say, so I'm going to get to it. But I'm curious, because when you talk about all these different architects having to work together — I'll just kind of murder that metaphor — but like, if you had an electrical system, HVAC system, plumbing, all those different things, they have to work together. Same thing with software.

 

I feel like there's this tendency to want to move fast in software.

 

How do you.

 

Do you have any kind of advice to teams who are building that medical device — that software as a medical device — and the differences in building that versus other software out in the world?

 

Mohamad Foustok: Yeah. And again, this isn't specific to medical device software or to SaMD specifically. This is a more general comment that applies to good software as a whole.

 

As Jose said, there's this sort of penny-wise, pound-foolish approach: thinking that if I just rush to the end, I'll get there. The reality is that the evidence, history, shows it won't go that way.

 

History shows that projects where you take the time up front to think it through and come up with a good approach actually save you time down the road.

 

So yes, it may feel a little sluggish at the beginning.

 

You don't yet have something running that you can show somebody, and that's what people want to get to quickly.

 

But spending that critical time up front, thinking through the pros and cons of different approaches before you've committed to anything, making those trade-off decisions from all the different facets, allows you to establish an approach that will be far better in the long run than if you try to rush through.

 

And I think that's oftentimes a mistake. Now, standards such as IEC 62304 enforce this, and other industries, aerospace and others, have similar standards of their own.

 

The reason those standards require you to follow a formalized approach is basically to encode this historical understanding that if you skip these steps, you're going to pay for it later on.

 

And so they sort of force you not to skip the steps.

 

You know, it's for your own good, effectively. They're telling you the collective experience of whoever put together the standard or guidance: do things this way and you will have a better outcome.

 

So don't think that you know better and you can ignore all this, and you can jump to the end and that you'll be able to produce something that's great.

 

I mean, conceptually, it may happen, you may be lucky, you may be the one lucky one that skipped all the steps and ended up with something great.

 

But, you know, I would say it's not worth the attempt, because in most cases you will fail.

 

You will end up with something that costs you a lot more in the long run, and the stakes are much higher, right? Think about it: it's one thing if you're releasing a website and you can keep changing it every day, versus a medical device that people's lives might depend on, or their health certainly depends on. You don't want to get it wrong.

 

Etienne Nichols: One of the pieces of advice that I hear a lot of people giving, and that I think is really valuable, is focusing on the problem, not always the solution.

 

And I think early stage, if you're thinking about these different ways of architecting your software, from cybersecurity through all the different layers you're mentioning here, you may feel like you're going slow, but there are other things you could be doing to really solidify your product in the market as well.

 

Mohamad Foustok: In fact, that's an excellent point you brought up because one of the things that you want to do up front is challenge what the problem is.

 

So, you know, oftentimes engineers take things at face value: here is a problem statement, go solve it.

 

Well, is the problem statement really the right problem you're trying to solve? Again, the better understanding you have of the ultimate goals, not the problem statement itself, but the ultimate goals, the better.

 

This allows you, with experience, to potentially push back and say, wait a second, why is it that you think this is the problem? Is that really the problem, or is this the problem? You can sometimes pivot, and that can lead to much better solutions.

 

So again, taking the time up front to understand what the problem really is, to dig into it, analyze it, and potentially, as I said, push back on whether it's really the right problem to be solving, can pay dividends downstream.

 

Because imagine if you're just taking everything at face value and solving what we think is the problem, and then you get to the end, and somebody suddenly has a eureka moment and says, wait a second, no, no, this isn't actually the problem we have.

 

It's over there. That's the actual problem. And now you've just wasted your entire development on something that isn't even the problem.

 

And that happens a lot, actually, surprisingly. You asked a question earlier, which I don't know if you want to touch on again, about FDA and what they've done recently with regard to cybersecurity.

 

And there is a history here. If you go back over it, it's almost a 20-year history.

 

If you go back 20 years, to the early 2000s, 2004 or 2006, I don't remember which, Jose, FDA put out their initial guidance on cybersecurity. Then there was a huge gap. In 2014 they tried to up-rev it to bring it up to the latest standards, and it was never ratified as an actual guidance.

 

So there was actually a huge gap between the two official points: the early 2000s and September 2023, which is when they came out with their current guidance.

 

And the result of that on the industry is the industry took a lot of cybersecurity for granted during the first two decades.

 

And when 2023 came along, it was a monumental shift for most people, because FDA essentially just reset the bar to where it needed to be.

 

The bar had been kept so low for two decades that it caught a lot of people by surprise. What FDA required was suddenly hugely different from what they had been asking for before.

 

Now I think it's stabilizing. They did update the guidance in June 2025, but it was more of a formatting update; there wasn't a lot of content change.

 

So they've now raised the bar significantly, and they're holding it there. I don't expect it to jump again, but hopefully they'll keep it current.

 

But cybersecurity keeps evolving.

 

Things have changed a lot in 20 years.

 

And so, from that perspective, historically there was a huge 20-year gap where cybersecurity was not applied significantly to medical devices, and they fell far behind other industries.

 

Now FDA has brought them up to current standards.

 

Etienne Nichols: Okay. With that being said, you mentioned that for 20 years the bar was kept lower than it should have been.

 

So for 20 years it was that way, and now the bar has been raised. I'm curious whether there are any practical implications in the market: what does the technology look like, and what are the potential and actual changes in the industry?

 

Any thoughts there?

 

Jose Bohorquez: I mean, I would say, as Mohamad said, some of the practices that FDA is now requiring have been around for a while in other industries.

 

And in fact, even FDA's guidance document and standards like SW96 are really built upon other industries that have been ahead of the game.

 

So in some ways medical is catching up. I think the implication is that for companies, particularly startups that maybe don't have a dedicated person on the team with cybersecurity expertise, it does create a hurdle.

 

So, fortunately, there are companies like ours and others out there who have come in to try to fill that gap, to say, hey, you're a ten-person startup; it probably doesn't make sense for you to hire an expert in cybersecurity, which is a skill that generally requires decades of experience and a lot of learning.

 

So, you can, you can bring somebody on to help you with that.

 

I think it's counterbalanced, though. The bar for technology development effort is sort of lowered these days; with the advent of AI, for example, you're seeing development being facilitated. But the bar in cybersecurity is now higher.

 

So there's kind of a counterbalancing phenomenon happening right now. And the two aren't completely unrelated, either, because right now you've got a lot of people starting to develop software.

 

Even for medical devices, many of them probably don't really know how to develop software, so they're not thinking about architecture and certainly not thinking about security.

 

So, if anything, I think FDA is going to raise the bar even further.

 

I can tell you anecdotally that even in the last year and a half, since the 2023 guidance was released in late 2023, we've had many submissions, and the level of rigor with which FDA reviews the cybersecurity sections of a 510(k) has gone up. Things that would have skated through before are now coming back with more specific, targeted questions about different areas.

 

So it's clear that their expectations have risen too. It can be the same guidance document, but there's always an interpretive aspect to it, and it's clear they're being more strict in how they interpret their own guidance.

 

Mohamad Foustok: Yeah. Earlier, you asked about practical implications for manufacturers. Jose has addressed new products entering the market, but there's also an impact on existing products already in the market.

 

So there's basically no grandfathering allowed. Say you had gotten a device cleared prior to September 2023, and now you're making a resubmission: maybe it's a catch-up submission, maybe it's a new indication, some change you're making to your product. Medical devices oftentimes have fairly long life cycles, so 2023 is not that far away. You could have gotten cleared in 2022, 2020, even 2018, and still be on the market.

 

If you go back into FDA now with a new catch-up submission, they expect you to apply the latest cybersecurity guidance to your product.

 

You don't get a grandfathered-in pass of "I got cleared before, so I'm okay."

 

So that's a hit right there. You can imagine: you have an existing product, security wasn't on your mind when you got cleared, and all of a sudden you're going in with a small change, a new indication, or a catch-up, and FDA will stop you and say, wait a second, you need to cover all your cybersecurity bases as though you were coming in new right now.

 

And that can be a significant impact to people.

 

Etienne Nichols: Yeah, I can see some companies trying to do a letter to file, and I'm curious what you think, with your regulatory expertise. Suppose someone makes a minor change and decides it isn't a big deal.

 

They document it with a letter to file internally, and then FDA comes through with an inspection.

 

I would assume those things are going to be top of mind.

 

Mohamad Foustok: I can't speak to like inspection or not, but I would assume so, yeah.

 

Jose Bohorquez: No, I mean, the guidance specifically tells you that if you're making a change to your system that could impact cybersecurity, even if it's a legacy device, then you do need to prepare the cybersecurity documentation.

 

So, to Mohamad's point, with a legacy device you're only, quote unquote, safe from the regulatory overhead if you're not touching it at all.

 

Mohamad Foustok: Right.

 

Jose Bohorquez: But even then, people make the mistake of thinking of security strictly as a compliance issue, when there are real security implications behind it. If you've got a legacy device, you have to consider that maybe proper security wasn't taken into account, and that your patients, or the hospitals your system connects to, are being put at risk if you don't take the time to evaluate your system and determine whether you've got vulnerabilities that really matter.

 

There's an element of ethics here: having the decency to make sure your product isn't putting people at risk.

 

There's also the liability point, and, by the way, the business reputation.

 

You know, I don't know if you followed the Contec news that came out about nine months to a year ago. CISA decided to evaluate the system and realized it had a backdoor and was leaking data back to an unknown server.

 

This was a patient monitoring system, and it was in the news. FDA made a big thing about it, and the company had to respond.

 

It was in the news everywhere. Google that company's name, and that's the first thing that comes up. So you have to think about your company's reputation as well.

 

Even if you've got a legacy device and you're not planning to make changes to it, security is important.

 

Etienne Nichols: And so, stakeholder involvement. We could talk about regulatory inspections, and I agree the ethical side is really important.

 

I hear people say every now and then that they would fear a litigator much more than a regulator, and that's for sure the case. But there are other people, too, whose needs we're trying to meet.

 

People in the hospital, the actual patients themselves. I love how Mohamad talked about how every person may have a role to play in security, whether that's setting your password or how you interact with the tool.

 

Are there other things they should be thinking about? Maybe teams have considered cybersecurity from a regulatory compliance standpoint; are there any other aspects that are helpful for teams to consider?

 

Mohamad Foustok: So, I know you asked about software as a medical device at the beginning, and I think that has its own challenges when it comes to the environment.

 

I'll step back and say that when you're creating an actual physical medical device, it goes back to how much you can control.

 

You control the boundaries of that device physically. There is a physical entity that is the device, and you're putting some software inside it. You control that physical boundary.

 

You literally have physical security. You could decide not to have exposed ports; you have control over those things.

 

When you're in the SaMD environment, you don't have those controls. You're in an unknown environment.

 

So, you're delivering a software product that's going to be installed in some system that you don't control.

 

So, security boundaries become even more important there.

 

Because you don't have control over the environment you're in, that's something to consider.

 

Are there any security implications from where my software gets installed, with regard to the software functionality it has?

 

What guidance do I need to give to the people installing it, in terms of where it should and shouldn't be installed, and things like that?

 

Now, there are some techniques to provide isolation as well. There are virtualization architectures that allow you to better isolate your functionality from the outside world.

 

But those are things again, you have to consider.
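To make the point above concrete, one way a SaMD application can cope with an environment it does not control is to run a "pre-flight" check at startup and degrade or refuse service when the host falls short. This is a minimal illustrative sketch, not from the episode or any FDA guidance; the required properties (`tls_available`, `debug_mode`, `os_patch_level`) and their thresholds are invented for the example:

```python
# Hypothetical startup check for a SaMD app installed in an
# uncontrolled environment. The property names and thresholds
# below are illustrative assumptions only.

def check_environment(env: dict) -> list[str]:
    """Return findings; an empty list means the host meets this
    (hypothetical) minimum security baseline."""
    findings = []
    if not env.get("tls_available", False):
        findings.append("TLS unavailable: keep network features disabled")
    if env.get("debug_mode", False):
        findings.append("host in debug mode: refuse to handle patient data")
    if env.get("os_patch_level", 0) < 2024:
        findings.append("OS patch level below supported baseline")
    return findings

if __name__ == "__main__":
    hardened = {"tls_available": True, "debug_mode": False, "os_patch_level": 2025}
    lax = {"tls_available": False, "debug_mode": True, "os_patch_level": 2019}
    print(check_environment(hardened))       # no findings
    print(len(check_environment(lax)))       # several findings
```

The design point is the one Mohamad makes: the boundary checks live in the software itself, because the manufacturer cannot rely on the installer's environment.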

 

It was all too easy in the old days to think of it as: I'm producing some software, an application; I'll just give it to some hospital and tell them to go install it and use it.

 

And those days are sort of over.

 

You can't do that anymore.

 

You've got to think about your software as an entity, a medical device sitting in some environment.

 

And I have to consider the security implications, both from the environment to me and from me to the environment that I'm in.

 

And you've got to think of those things not only for reputational reasons, because you don't want to mess up, and for ethical reasons, because you want to do the right thing, but also from a regulatory perspective, because FDA is not going to let you get away with ignoring them.

 

Jose Bohorquez: Also from a financial perspective. You said earlier that some people fear litigators more than regulators.

 

In the last month, we've actually seen the first settlement with the Department of Justice over cybersecurity. A company called Illumina agreed to pay $9.8 million over allegations of misrepresenting compliance with federal cybersecurity requirements.

 

So you're seeing that it's not just FDA seeking compliance; even the Department of Justice is evaluating these things, looking at whether people are making false claims and actually putting people's security at risk.

 

We've now seen the first case of that being applied.

 

Mohamad Foustok: Yeah.

 

And in fact, to that point, people should be aware of this.

 

When you are making your submission to FDA and submitting your documentation, you are making commitments. For instance, you provide a plan for maintaining your cybersecurity posture. That is a commitment: you're saying, I am going to do these things. You will be audited, and in audits they will ask you for those things.

 

Now, I'm not speaking on behalf of FDA or the government, but I would imagine that just like any other commitments you make, there is probably some legal binding to that.

 

And if you make those commitments to FDA and FDA discovers that you are not doing what you claimed or committed to do, there could certainly be legal consequences.

 

Etienne Nichols: I know we're coming up close on time; this has been really great. I appreciate you both sharing your expertise and all your knowledge and experience.

 

If there's one thing the audience could walk away with from this conversation, I'm curious what each of your one things would be. And if you want to take a second to think about it, that's fine, too.

 

The power's in the editing. You know, whoever wants to hit the buzzer first.

 

Mohamad Foustok: It's probably the same one thing, and I hate to say it because it's going to sound self-serving, though it isn't meant to be. My one thing would be: seek someone who knows. Don't assume you know, unless you really are experienced and do know. Don't jump to assumptions.

 

Seek expertise, because these are sometimes tricky things, and you need to find someone who knows what they're doing.

 

Jose Bohorquez: Yeah, yeah, no, for sure. And there's just so much to know.

 

I mean, we've got a list of the documents you should be familiar with for cybersecurity, and it's probably 20 different documents, ranging from the guidance documents to TIR 57, TIR 97, and SW96.

 

There are MITRE documents, NIST documents, FIPS; all kinds of references that somebody on your team or extended team should know well, so they can ensure your device is secure.

 

But my one thing, and this is something FDA actually speaks to, would be to really understand that security has to be designed into your product.

 

It shouldn't be bolted onto your product.

 

And it's not to say that if you've already developed your system, if you're an entrepreneur thinking, oh man, I thought I was going to be submitting to FDA in two months and I realize we didn't do this, that you're completely screwed.

 

There is a process that you can follow to try to fix what you've messed up, if you will, or to try to catch up.

 

It's just more painful than if you would have done it from the get-go. And chances are it's not going to be quite as secure as it would have been otherwise.

 

So if you're at that point, you all the more need to bring in some help to make sure you do things efficiently. But if you're earlier in the process, really try to bring on some cybersecurity expertise fairly early.

 

You know, sometimes we split our engagements into two phases: an architecture phase, to really help a client get the architecture right and do the threat modeling early, after which we might not see them again for six months; and then an implementation phase, where we come in to help with pen testing, reviewing their SBOM, and that kind of thing.

 

That's the ideal way to do it, is to kind of address it early and then over the course of the whole process.
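The SBOM review Jose mentions can be pictured as a simple cross-check of a component inventory against advisory data. This is a toy sketch, not CyberMed's process; the SBOM fragment follows the CycloneDX JSON shape (`components` with `name`/`version`), but the specific components and the advisory map are made up for illustration:

```python
# Toy SBOM review step: flag components whose exact (name, version)
# pair appears in an advisory map. Data below is hypothetical.

def flag_components(sbom: dict, advisories: dict) -> list[str]:
    """Return human-readable findings for flagged components."""
    findings = []
    for comp in sbom.get("components", []):
        key = (comp.get("name"), comp.get("version"))
        if key in advisories:
            findings.append(f"{key[0]} {key[1]}: {advisories[key]}")
    return findings

if __name__ == "__main__":
    sbom = {
        "bomFormat": "CycloneDX",  # field name from the CycloneDX JSON schema
        "components": [
            {"name": "openssl", "version": "1.0.2"},
            {"name": "zlib", "version": "1.3.1"},
        ],
    }
    advisories = {("openssl", "1.0.2"): "end-of-life branch; upgrade"}
    for finding in flag_components(sbom, advisories):
        print(finding)
```

In practice this matching is done against live vulnerability feeds and version ranges rather than an exact-match dictionary, but the workflow is the same: inventory first, then continuous comparison against known issues.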

 

Etienne Nichols: Yeah, I think that's good advice, and I appreciated your advice too, Mohamad. As engineers, we're trained to think critically and intelligently, and you have to have a certain level of confidence.

 

Someone recently pointed out to me that there are, what, eight-some billion people in the world, so to think that what you've decided is the best way is a stretch. And yet, at some point, you just have to make a decision.

 

But yeah, seek out that help; there's nearly always a better way, and that expertise is really helpful. So I really appreciate you both. This has been great.

 

Where do you recommend people find you if they want to reach out?

 

Jose Bohorquez: Yeah, good question. Our website, CyberMed AI, has a lot of great resources. We actually just finished a book that's all about cybersecurity for medical devices, and we're going through a review phase where we're opening it up to the community.

 

They get free access.

 

All we ask is that as you go through it, and we built it into a little web app so you can click on sections and leave comments, you share your feedback. We've already had a ton of people leave us comments and reviews.

 

That's helping make the book better.

 

So our plan is to take all that, expand some sections, address some questions, and then send it to print in about a month.

 

So that's book CyberMed AI; that would be the link. People can get free access for about a month. And then I'm pretty active on LinkedIn, so if you search Jose Bohorquez, there aren't too many of us, and you can follow me. I try to put out content about once a week, something helpful related to cybersecurity.

 

Etienne Nichols: Excellent. Yeah, I'll try to put links in the show notes and try to get this out quickly enough so that people are able to get in and make those comments as well.

 

So very cool. Thank you so much.

 

Really appreciate you sharing all of this information. And those of you who've been listening, thank you for listening to the Global Medical Device Podcast. We'll see you all next time.

 

Take care. Thanks for tuning in to the Global Medical Device Podcast. If you found value in today's conversation, please take a moment to rate, review, and subscribe on your favorite podcast platform. If you've got thoughts or questions, we'd love to hear from you. Email us at podcast@greenlight.guru.

 

Stay connected for more insights into the future of MedTech innovation. And if you're ready to take your product development to the next level, visit us at www.greenlight.guru. Until next time, keep innovating and improving the quality of life.

 

About the Global Medical Device Podcast:

The Global Medical Device Podcast powered by Greenlight Guru is where today's brightest minds in the medical device industry go to get their most useful and actionable insider knowledge, direct from some of the world's leading medical device experts and companies.

Like this episode? Subscribe today on iTunes or Spotify.