Approaching Cybersecurity & Usability as a SaMD Company

May 27, 2022

How do you balance security and usability for software as a medical device (SaMD)? It's not easy, and device companies may need to make trade-offs to give users what they want and need to use the device safely as intended.

In this episode of the Global Medical Device Podcast, Etienne Nichols talks to Abbas Dhilawala, a cybersecurity and SaMD expert with Galen Data, about a new approach to cybersecurity and usability that helps SaMD companies ensure products are both secure and user-friendly.

Abbas has 18 years of experience developing enterprise-grade software for the medical device industry and is well-versed in technology, industry standards, and the privacy of data.

Like this episode? Subscribe today on iTunes or Spotify.

Some highlights of this episode include:

  • Usability and human factors testing standards exist; however, there's no harmonized standard approach for cybersecurity. Abbas's approach is to obtain user feedback as soon as possible so the SaMD can be both secure and user-friendly.

  • Different kinds of users in the healthcare spectrum can be trained to use SaMD, including hospital staff and patients - depending on their level of trust and understanding of technology.

  • Potential pitfalls: Credential layers such as permissions and passwords can shift the security burden onto users, which calls for risk assessment and management of the possible harm.

  • Biometrics: Cutting-edge technology such as fingerprint, iris, and face scanning is not yet as secure, reliable, or consistent as it needs to be, but it's getting better. Always have a backup plan.

  • Key Takeaway: There’s a lot of push on cybersecurity, but don’t take away the convenience or the usability aspect. Find a way to balance both usability and cybersecurity.

Links:

Galen Data (Schedule a Demo)

FDA - Guidances

FDA - Cybersecurity

HIPAA

True Quality 2022

The Greenlight Guru True Quality Virtual Summit

Greenlight Guru YouTube Channel

MedTech True Quality Stories Podcast

Greenlight Guru Academy

Greenlight Guru

Global Medical Device Podcast Email

Memorable quotes from Abbas Dhilawala:

“Ultimately, if you make the product in a way that’s hard to use, you can be secure. If nobody uses it, it doesn’t really matter.”

“There’s lots of standards, just no harmonization.”

“What can you do to minimize stress? Health care is already a stressful environment.”

“The fundamental layer of security is to know who the user is.”

“Having standards is a nice thing because then you can develop tooling around that.”


Transcript:

Etienne Nichols: Hey everyone. Welcome back to the Global Medical Device Podcast. Today we're going to be talking with Abbas Dhilawala. He is a cybersecurity and software as a medical device expert at Galen Data. Abbas has 18 years of experience developing enterprise-grade software for the medical device industry.

 

Well-versed in technology, industry standards, and the privacy of data, his experience lies in programming, cloud, cybersecurity, data storage, and regulated medical device software. He works with Galen Data. Prior to Galen Data, Abbas worked as a software architect at Tietronics Software, where he led technical teams in creating medical device software that was successfully cleared in US and European markets.

 

He also has a master's degree in software engineering from the University of Houston and a bachelor's degree in information technology.

 

All that said, the thing we want to talk about today is usability and cybersecurity, and the intersection of those two in software as a medical device.

 

It's an interesting topic. It's one that we don't hit on very often. So, I hope this is of benefit to you all. Hope you enjoy the episode. Hey everyone, welcome back.

 

It's good to be with you today. We're talking about security and usability of software as a medical device. With us to talk about this subject is Abbas Dhilawala.

 

Hopefully I got that right.

 

Abbas Dhilawala: You got it right.

 

Etienne Nichols: All right. Last time I did an episode, I mispronounced my own name. So, you know, we'll see how this goes. But it's good to be with you today, I guess. Let me start with a question. Balancing security and usability in medical device, kind of the proposed topic of the discussion today.

 

There has been a lot of discussion about security, cybersecurity. How does that play in or how is that even a balance with usability? Can you explain that?

 

Abbas Dhilawala: A little bit?

 

From my position as a chief technology officer, I'm into product development, right? So, I need to develop products that people will use, obviously in healthcare. Cybersecurity in general, but specifically in healthcare, is

 

a huge focus. And so, we have to kind of be aware of it and address all the issues that come with it. The FDA has recently come out with, you know, an even newer version of its cybersecurity guidance and a lot of information, a lot of things.

 

So, we have to do all that stuff. Right. It's important to do cybersecurity. But ultimately, if you make the product in a way that's hard to use, it can be secure, but if nobody uses it, it doesn't really matter.

 

You know, there's a quote from a famous cybersecurity researcher saying the most secure product is one that's hidden in a titanium shell with nuclear weapons around it. And even then, you're not sure it's safe.

 

Right. So, there's always going to be a balance. You obviously have to look at security, but you have to ensure that people can actually use the product, so they're getting out of it what you want them to get out of it.

 

So, it's a balancing act and it's not easy and you have to kind of make trade-offs here and there. But that's kind of what we want to talk about.

 

Etienne Nichols: Okay, that sounds good. So, looking at that, cybersecurity obviously is a requirement. How should we approach these things alongside usability? There are standards around usability as well.

 

What's the approach you typically take?

 

Abbas Dhilawala: So that's an interesting question. There's no standard approach, unfortunately, while there are standards for human factors testing. Right. We have kind of coalesced around a couple of different standards, and FDA has kind of said, okay, these are good standards.

 

There's no such thing around cybersecurity alone. I mean, there's lots of standards, just no harmonization.

 

And there's no harmonization around addressing usability and cybersecurity together. So that's a big gap, which means how to do that is still an evolving field. But what we try to do essentially is get user feedback as soon as possible.

 

Right. And then basically say, okay, what can we do while still being secure? Like what kind of innovation we need, what kind of technologies we need to develop or deploy, to make this something the user can still work with.

 

Given that there are different kinds of users within the healthcare spectrum. Right. So, you have the hospital staff, for example, people that are trained or can be trained more easily, but also people who are under more stress, like the nurses or the doctors.

 

So, you account for the fact that you can train them a certain way, but you also have to account for the fact that maybe there's just too much stress. It's an emergency room, and the doctor needs to know what medication they cannot give to this patient.

 

And I need to log into the EHR to get that. And I have seven different passwords I need to try before getting in.

 

That doesn't work. Right. But you can train them. Then there's the spectrum of patients, who run the gamut from somebody who's very good with technology to someone who just hates technology.

 

And you've got to figure out a solution that, while still secure, makes this stuff work. So, a lot of it is user research: what can we do?

 

And figure out who you're targeting, whether it's an older population versus a younger population.

 

Not saying older populations lack the wherewithal, but there's a different way they work, and there are different kinds of things that may be acceptable there versus somewhere else.

 

Trust matters too. Maybe you're targeting a community that is more trusting of technology versus less trusting of technology. You've got to figure all that out as part of your user research.

 

So you know which technologies, from a security point of view, you can deploy to address that. But then again, just like any human factors work, you've got to test, and whatever you pick may not be the right solution.

 

And the nice thing about software is that iterations are relatively easier compared to a hardware solution, so you have the ability to iterate more times.

 

Ultimately, it's first recognizing that you have to keep the user in mind. When you bring on security consultants and they say you've got to do all this stuff, you always have to take a step back.

 

How does a patient interact with the system? How does a healthcare worker interact with the system? How does your own staff interact with the system, and what can you do?

 

What can you do to minimize stress? Healthcare is already a stressful environment. How can you minimize the stress while still providing the same level of security?

 

Etienne Nichols: Wow, that's a really good answer. So, there's a lot of things to think about. Sounds like we need to think about who's going to be interacting with it, what's the intended use, what's the level of security required?

 

Lots of those things working together in tandem. When you deal with companies who are working on this, working towards a solution that's both secure and user-friendly.

 

What are some things that you typically see? Maybe some common pitfalls that people kind of get tripped up on?

 

Abbas Dhilawala: Yeah, the fundamental layer of security is to know who the user is, right? So, we talk about that sort of technology as user management, or account provisioning, or identity and access management as a general group.

 

So that's generally where a lot of these things happen, because there's a lot of stuff you can do behind the scenes that doesn't interact with the user. Like, you can encrypt your data at rest, you can encrypt it in transit.

 

The user doesn't care, they don't know, they don't interact. But when the user has to start providing information that's generally around the identity management, like, who am I and how do I prove I'm the right person?

 

Right? And so the typical thing you do is, you know, a username and password, right? We use that everywhere. So that's simple, except we tell people, hey, use a unique password that's 20 characters long, something you'll never use somewhere else, but yet you remember it every day.

 

It's like, okay, good from a security perspective. But how does that really work? I mean, realistically, they do that? Probably not. Well, they'll probably use it over and over again, or they'll write it down and put it somewhere digitally that is insecure.

 

So, things like that. How do you take the user's perspective and say, okay, yes, it improves security to some degree to have a complex password that's unique to this one service?

 

But it also puts a big constraint on the user's side. You're essentially putting the burden of security on the user rather than trying to figure out how to take the burden on in the application.

 

So, then you say, okay, what can I do? You know, I still need security, I still need to know who the user is. And I want that user's access not to be spoofed, right?

 

I don't want someone else basically coming in as that user because that is a privileged user. So again, you can start classifying different users. You can say, okay, these are my regular users.

 

They don't have a whole lot of privilege in the system like my patients. They only see the data.

 

These are my administrators that can do a lot of damage, right? So, can I deploy different things for different kinds of user types? Can I make the administrator go through a more rigorous proving process because the potential damage is much higher?

 

And here maybe I can figure out a different way of interacting. Can I start using their other credentials? Can I do single sign-on, right?

 

Can I use their Google or their Facebook, credentials they already use, and stop deploying my own credentialing system?

 

How can I do that? What are the risks there? So again, it all comes back to risk management. Like anything in healthcare or anything medical device, you know, start with the risk and kind of figure out where the risks are.

 

But instead of just saying, okay, everybody needs a username and password, think about whether you can have different layers of that. Depending on the kind of harm that could be done, can I deploy different solutions that are more user friendly for some of the group?
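
To make the layered approach Abbas describes concrete, here is a minimal, hypothetical sketch in Python of mapping user roles to different authentication requirements based on the harm a compromised account could cause. The role names, policy fields, and values are illustrative assumptions, not anything prescribed in the episode.

```python
# Hypothetical sketch: higher-privilege roles carry a heavier authentication
# burden; low-privilege, read-only roles stay convenient. All names/values
# are illustrative only.
from dataclasses import dataclass


@dataclass(frozen=True)
class AuthPolicy:
    methods: tuple            # acceptable first factors, e.g. ("sso", "password")
    require_second_factor: bool
    session_timeout_min: int


POLICIES = {
    "admin":     AuthPolicy(("password",), True, 15),
    "clinician": AuthPolicy(("sso", "badge"), False, 60),
    "patient":   AuthPolicy(("sso", "biometric", "password"), False, 240),
}


def policy_for(role: str) -> AuthPolicy:
    """Return the policy for a role, defaulting to the strictest one."""
    return POLICIES.get(role, POLICIES["admin"])


if __name__ == "__main__":
    for role in ("patient", "clinician", "admin", "unknown"):
        print(role, policy_for(role))
```

The design choice sketched here is simply that the burden scales with potential harm: unknown or administrative accounts default to the strictest policy, while read-only patient accounts keep friction low.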

 

Etienne Nichols: That makes sense. Yeah. I was flying recently, and they let me do a trial run of the CLEAR thing where they scanned my eyes, the biometric.

 

And I'm going to lead into that in a minute to ask about potential future, I don't know, technology that's out there. But I also thought immediately, well, what about the fingerprint scanner on your, maybe your MacBooks?

 

But I also got to thinking, okay, well if the intended user is a surgeon who has to take his gloves off, then you have an issue there. So, it kind of goes back to what you were saying earlier: you have to go back to the user study and see how they are going to use it in the field, how it is going to be used on the ground. That makes a lot of sense. But what do you see as far as cutting-edge technology? You work with a lot of companies.

 

What are you seeing right now?

 

Abbas Dhilawala: So, biometrics is absolutely something that will be used more and more.

 

The problem is either the technology is not as secure currently, or there's just too much burden we put on people to get it working consistently. So again, on my phone, you know, we have the fingerprint scanning, we have the face scanning, the Face ID and stuff.

 

You know, there was an iris scan; I think Samsung delivered it a few years ago. It never really worked. But it's getting better, right? The fingerprint scanning from 10 years ago wasn't very good, and it wasn't very secure.

 

Today it's much more secure, much more reliable. You still need a backup plan though, because I can get a cut, or my fingers can get burned, or in an end-use situation I may be wearing gloves or something else, things like that.

 

So, you still need a backup. I think you will see more and more biometrics being deployed, but from a security perspective you still always have to ask, you know, what is the weakest point?

 

So, if you still require them to fall back on a password or a code, and that code is no good, or the password mechanism is no good, you still have the same vulnerability, right?

 

Hey, somebody can get in. So, unless we can come up with a much better solution in terms of biometrics, I think we'll still have that discussion. As far as biometrics are concerned, I think there are a few different things.

 

There's a palm scanner now that people are experimenting with, which looks at your blood flow. I'm not quite sure how that works. Apparently, there's some unique pattern in how your blood flows in your veins, and you can come up with a signature from it. It's very cutting edge.

 

The thing that makes it useful is that it's touchless and it can go through anything you wear. So, it works if you're wearing a glove, because it uses an IR signal and you don't have to touch it.

 

So that was good, you know, in the COVID situation where people won't touch things.

 

Etienne Nichols: So, it could go through a glove.

 

Abbas Dhilawala: Yes, through a glove, because it uses IR to scan your blood. So, it's useful in healthcare because as a surgeon I can use that. I don't have to take anything off; I don't have to touch anything, because I'm in a sterile field.

 

So, it's something that may be useful, but again, it's a very early technology; there's still a lot of research needed, and reliable components still need to be made that can do that.

 

So those kinds of things. But yeah, biometrics is a good example. So is, on the other side of things, the case where you have administrators that have high-privilege access to the system and can do a lot of damage.

 

Yeah, you can put more burden on them. Right. For example, you can mandate that they use a two-factor authentication system. So not only do they use a username and password, but they have to put in this code, or plug in some key, or scan a card, or something that makes it a little easier.

 

But you don't want to do that for the patients or the healthcare providers every day. Because again, maybe I have a card, but I can't touch things.

 

So, then you've made it a process that's hard to work with.

 

Etienne Nichols: That makes a lot of sense. The thing that, I guess if we go back to cybersecurity and the secure aspect of it, you kind of mentioned a little bit of a balance in the beginning.

 

Well, the title of our discussion here is about balance. When you're thinking about usability, how do you know if you've gone too far, maybe sacrificing cybersecurity for the sake of usability? How do you detect that?

 

Abbas Dhilawala: Yes, that's a good question. So that can happen. Like, you can get a patient advocate that says no passwords, or, you know, a four-digit password, right? Like 1, 2, 3, 4, or hard coded, things like that.

 

You can definitely go too far down that route. So, I think it's still just like we do with other risk management. You generally have a cross-functional team that includes, you know, security-minded people, but also patient- or usability-minded people.

 

And you basically have to still adhere to the risk management principles, like, what is the risk? If I take away these features from a security perspective to improve usability, what is my risk, and is that risk acceptable?
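
Abbas's point about weighing harm against usability can be pictured as a simple severity-by-likelihood score. Below is a minimal, hypothetical Python sketch; the category names, scores, and acceptability threshold are assumptions chosen for illustration, not drawn from the episode or any specific standard.

```python
# Hypothetical sketch of a risk-acceptability check: score each
# security/usability trade-off by severity of harm and likelihood, then
# decide whether the residual risk is acceptable. Thresholds are illustrative.

SEVERITY = {"negligible": 1, "minor": 2, "serious": 3, "critical": 4}
PROBABILITY = {"improbable": 1, "remote": 2, "occasional": 3, "frequent": 4}


def risk_score(severity: str, probability: str) -> int:
    return SEVERITY[severity] * PROBABILITY[probability]


def is_acceptable(severity: str, probability: str, threshold: int = 6) -> bool:
    """Accept the trade-off only if the combined score stays under the threshold."""
    return risk_score(severity, probability) < threshold


# Dropping a second factor for read-only patient accounts
print(is_acceptable("minor", "remote"))         # True  -> may be acceptable
# Dropping a second factor for administrator accounts
print(is_acceptable("critical", "occasional"))  # False -> needs mitigation
```

The point is not the specific numbers but that each usability concession is scored and compared against a documented acceptance criterion, mirroring the cross-functional risk review described above.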

 

So again, it comes down to risk assessment. You have to figure out the ROI of everything you do from a risk perspective. And so that's still possible. You still have to do other things that are required in product development by the FDA and by general cybersecurity guidelines.

 

Right? You do a lot of testing, you do vulnerability testing, pen testing, those kinds of things. And those things will come up like, hey, I can easily get your passwords, and I can do this stuff.

 

And then you have to go back and address it: how do I keep that secure while I address some of the usability aspects? So, it's not like you just take the usability side of things and deal with cybersecurity later.

 

I think it's a balance, and you have to constantly work on it. It's not a one-time thing, because vulnerabilities change. Cybersecurity, unfortunately, requires a cultural shift in medical devices, because in devices we're used to long product development cycles, and things don't change that much or that fast, even the safety risks we have. You know, if something is going to shock you, it's probably going to shock you for a while. The risk profile doesn't change.

 

But a cybersecurity risk is evolving every day, which means you have to make that determination pretty much constantly. And you have to constantly evaluate and figure out, okay, did I go too far in one way or the other and how can I calibrate that aspect of it?

 

Right?

 

And that's part of what you can do from a usability standpoint. Usability doesn't just mean that you have to make it easy to use. It also means making it so that people perceive it as useful.

 

Right? So, if you explain why you are doing things, in a way that's digestible by your stakeholders, I think that changes the perception. Right? For my bank, for example, if my bank doesn't ask for a second factor, I perceive that as a problem.

 

So, to me, the usability actually improves, because I get better peace of mind when they add the security measure. Because I see, okay, I don't want to lose my money.

 

Right? So again, it just depends on how you frame things. It's not always that it has to be a one button, do, do everything kind of thing. It's just more about comfort.

 

How comfortable am I as a user using this technology? Is this adding more burden, or is this making me feel better about it? There's a lot of psychology to it. Usability doesn't mean take away security.

 

It means present it in a different way, or explain it in a different way, so it makes sense. It doesn't make sense if you just say, okay, you have to do it, there's no choice.

 

And we don't know why you have to do it, but you have to do it. It's like, you know, if you call customer service and all they say is, oh, restart your machine.

 

It's like, why?

 

Right, I just did it. Don't just tell me to restart it. That is what changes the usability aspect, because it's a feeling that I get as a user that something's not right for me.

 

It doesn't have to be, again, a compromise on technology. It can be more of an educational aspect as well.

 

Etienne Nichols: That's really interesting. It piques my interest a little bit because you talked earlier about user populations or maybe the different people who are going to be using the product, whether it's in the, you know, professional health field or maybe just a normal layperson.

 

I don't have a great word for whoever that may be, maybe myself. But are there studies that have already been done across different populations for usability for this type of thing?

 

I'll give an example. So, when I was in the field, we might have something that needed to be a right-hand operation. So, we would look at a right hand for a 95th percentile male.

 

And that was kind of your standard; you had to at least accommodate that. Is there something similar for this?

 

Abbas Dhilawala: No, no, unfortunately, no. And part of that is because it evolves, and it's a contextual thing. Right? Again, I talked about the bank experience, and because it's contextual to my risk of losing money,

 

I take that as an okay thing. If the same thing happens somewhere else, where I'm just logging into a site to get some information and I'm not providing any data, but it still makes me go through the hoops, it's the same mechanism, but I feel differently. And so, it's very contextualized. It will depend on a very specific device-to-device or system-to-system basis.

 

And that's why I think user research or a user study upfront can give you a better sense: for this context, for this use, for the population you're targeting, what are some of the things they will feel about using it? Again, you start to balance. Security is important.

 

I want to be very clear: security is super important, because we see it all over the place. When you build a system or a device and you take it to the hospital, it's not just a risk on the device or your system; it could be a broader risk. The hospital could have their systems compromised because you connected your device to their network.

 

And that could lead to a much bigger risk than just the device. So, the attack vector is much bigger. Things like that, you have to be very careful about.

 

Security is important. The thing is, though, there's been a push now to make everything secure. And it's like, okay, yes, great, but let's take a step back and make sure we do it in a way that we are not putting a lot of burden on people who are already burdened within the healthcare system.

 

Etienne Nichols: Yeah, that makes sense. You also have usability in certain aspects of your device, and my mind goes to the SBOM. You know, we think, okay, we've got a requirement, we've got to do an SBOM.

 

But think with the end use in mind. If that hospital has to metabolize that information to search for potential other vulnerabilities, then it needs to be something they can digest.

 

So, there's a user aspect even of that. But I mean, that gets a little bit away, I guess, from the topic.

 

That's just a part.

 

Abbas Dhilawala: But yeah.

 

Etienne Nichols: Thoughts? I don't know.

 

Abbas Dhilawala: No, I mean, that's true. And the SBOM is something that, you know, at least in the next draft guidance, they'll ask for explicitly. And it's a good thing, because we always talk about supply chain, but we never talk about supply chain risk.

 

And there's supply chain risk in terms of manufacturing, in terms of quality.

 

But now you have to talk about supply chain risk from a software perspective, in terms of cybersecurity. So, it's a good thing. There are several standards now, and tooling that comes with those standards, that help to digest some of that stuff.

 

And internally, we use an SBOM, and we use a format called CycloneDX. And there's tooling that we use, because it's the same problem: I don't want my engineers trying to go through a 700-page JSON to figure out what's going on.

 

You know, you need tooling. But having standards is a nice thing because then you can develop tooling around that. And the tooling we use tells us all the vulnerabilities out there without having to go and find each one.

 

It just tabulates all that for us. So, it helps us as engineers do better product development. And similar tooling can help a hospital keep track of 100x that information.
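
As a rough illustration of the kind of SBOM tooling Abbas mentions, here is a small, hypothetical Python sketch that summarizes a CycloneDX JSON file instead of someone reading it by hand. The top-level field names follow the public CycloneDX JSON schema ("components", "vulnerabilities"); the file path and output format are assumptions, and this is not the tooling Galen Data actually uses.

```python
# Hypothetical sketch: summarize a CycloneDX JSON SBOM so nobody has to read
# the raw file manually. Only the Python standard library is used.
import json
from pathlib import Path


def summarize_sbom(path: str) -> None:
    bom = json.loads(Path(path).read_text())
    components = bom.get("components", [])
    vulnerabilities = bom.get("vulnerabilities", [])  # present in CycloneDX 1.4+

    print(f"{len(components)} components, {len(vulnerabilities)} recorded vulnerabilities")
    for comp in components:
        print(f"  {comp.get('name', '?')} {comp.get('version', '?')}")
    for vuln in vulnerabilities:
        print(f"  vulnerability: {vuln.get('id', '?')}")


if __name__ == "__main__":
    summarize_sbom("sbom.cyclonedx.json")  # hypothetical file path
```

In practice the same tabulation idea scales up: a hospital could run a report like this across every connected device's SBOM rather than tracking components by hand.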

 

Because there are so many devices and so many systems, ultimately tooling can be developed. And I think it's a good thing that people are moving towards a more standardized system.

 

And I would hope eventually there is a more harmonized standard around cybersecurity, something that accounts for usability. You know, just a few years ago, when FDA started pushing on usability, the human factors aspect of it, it was a similar thing.

 

It was just a standalone kind of thing; you did it at the end. And now I think people are moving more and more into an integrated approach.

 

So, you take care of this during design and development, and cybersecurity needs to move that way as well. And it is. Eventually, I think you will have that harmonization, with people talking about usability and cybersecurity in the same room around the same design topics.

 

Etienne Nichols: Yeah. So, the standard that comes to mind with usability, 62366, does that handle the software aspects? It's been a while since I've looked at it.

 

Abbas Dhilawala: Some of it. And then there's HE75, I think.

 

75? Yeah. I mean they are more on process, like what you should do in terms of process. It doesn't really matter if it's software or hardware. They just talk about, you know, like formative studies or user studies and coming up with scenarios or task analysis and stuff like that.

 

So again, it's a process for how to do a good usability study. It doesn't necessarily go into specifics; you can apply it to physical things, you can apply it to software, or to something like an app that has both aspects.

 

Etienne Nichols: Do you have any recommendations about maybe what might be specific to software as a medical device as you're going through those studies?

 

Abbas Dhilawala: So again, the thing I look for especially is, you know, when you look at studies, you have to keep the context in mind, right? Because studies can be done over different timeframes.

 

And as things change, the security landscape evolves. You have to contextualize that: okay, something we learned three years ago may not be accurate today. You also have to keep up with technology.

 

Maybe there are different ways of doing the same thing now than there were three years ago. The fingerprint scanner is a good example. You know, in five years they've gotten much better, so you can maybe think about using that now where you couldn't five years ago.

 

So just keep that context of time and the evolving nature of both the threats and the technologies to counter those threats. But there is also a cost dynamic: as a product developer, you have to remember there's no unlimited resource, no unlimited money available.

 

You still have to get to market with a product that's usable. So again, it's a triangle: how do you balance risk with usability, in a way that you can actually take to market and be profitable?

 

So, it's hard to give a specific tip, but it's essentially about trying to balance different technologies. Look at things from identity and access management, asset management like the SBOM, and, from the hospital side of things, how do you do your vulnerability management? How does that impact the design decisions you made from a usability perspective?

 

So, it's a constant analysis of all the vulnerabilities out there. If I'm using an iris scan and the component I use is going to be affected, can I swap out the component, or do I need to create an alternative way to use my system? Things like that: what can I update remotely, what do I have to build into the hardware, because hardware is hard to change, right?

 

So, I can change my algorithms, I can put in new firmware that does different things, but if I have an iris scanner built in, I have an iris scanner built in; I can't do anything about it. Things like that.

 

So, how do you address the ability to patch in the future, when the design decisions you made from a usability aspect now create more security issues than existed when you actually designed the system?

 

Those are the kinds of things to keep in mind.

 

Etienne Nichols: I love what you said earlier too, about reframing, because that was a great point with the bank. I'm the same way; if you didn't ask me certain things, I'd wonder. Have you seen companies leverage that to have a more robust system but still overcome some of the usability issues that system might impose?

 

Abbas Dhilawala: I know a few companies that have tried, and I say tried because I don't think there are a whole lot of success stories, especially on the hospital side. Yeah.

 

Because I've still seen the same complaints, like, oh, why do we need to do this? We actually had a doctor who's an entrepreneur, a customer of ours, and he was very critical of our security measures, like, I don't want to do this, this, and that.

 

And we had to sit down and explain, look, here's why we do it, and here's why you should leverage those things in your system. As we started reframing, he started understanding, but he's still a surgeon, and he sees this issue daily.

 

He's trying to help the patient; the only thing that matters to him is how do I help the patient? And technology gets in his way rather than assisting him. And so, he has this kind of, you know, mental block.

 

His attitude was, I don't want to do any of that. So, I don't know if it was fully successful, but it at least calmed him down and got him to use the system. And we had to tweak some stuff to allow him to be comfortable with that aspect.

 

But I think we still have a long way to go to do that at a broader scale, where we can re-engage with physicians or nurses and even patients to establish the landscape and say, this is important, and this is why it's important.

 

This is why we expect you to take this small bit of inconvenience, because this is useful. Again, with a bank, it's always personal, right? My money could get lost.

 

That's important because it affects the psychology of things. It's like when it's affecting you versus it's affecting someone else. And not saying people don't care about other people, but there's an imbalance between how you care about yourself and other people.

 

And so, I think that reframing hasn't really happened in healthcare; it's seen as affecting somebody else's system rather than affecting me. And I think part of that is the perceived value of the data.

 

Right. Do patients really value their healthcare data? They say yes when you ask them, but do they really?

 

Do I really care?

 

That my vaccination record is safe? In theory, yes, but maybe, maybe not. I don't know.

 

Etienne Nichols: Right, right.

 

Abbas Dhilawala: Yeah. I mean, things like that: is it affecting me personally? And unless that happens, it's kind of hard to make that case broadly.

 

Etienne Nichols: But I guess with HIPAA, as a developer, you don't really have any choice, do you? What are your thoughts there?

 

Abbas Dhilawala: Yeah, no. So, I mean HIPAA does have, you know, certain requirements, right. So, things like, you know, you have to have an identity management. Right. So, there's no way around it.

 

But HIPAA is also very general, and in many cases you can use many different things to satisfy those requirements. So, you can do identity management differently.

 

As you said, we talked about ways to do passwordless systems. If you have a phone, that technically works with HIPAA, because it provides a way to identify a single person.

 

Things like that work even with 21 CFR Part 11 and digital signatures and so on. Those guidelines or regulations are pretty broad, and they allow different technologies to be used. So, you still have many choices for how to implement and comply with HIPAA.
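
One hypothetical way a phone can serve as the identity factor, in the spirit of the passwordless approach Abbas mentions, is a time-based one-time password (TOTP). The sketch below uses the pyotp library; enrollment, secret storage, and any HIPAA- or Part 11-specific controls are out of scope, and the flow is illustrative only, not a compliance recommendation.

```python
# Hypothetical sketch of phone-based verification with a time-based one-time
# password (TOTP) using the pyotp library.
import pyotp

# At enrollment: generate a per-user secret and share it with the phone's
# authenticator app (e.g., via QR code). Kept in memory here for illustration.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# On the phone, the authenticator app shows the current 6-digit code.
code_from_user = totp.now()  # simulated user input

# On the server: verify the submitted code against the shared secret.
if totp.verify(code_from_user):
    print("identity confirmed")
else:
    print("verification failed")
```

The convenience argument is that the user types a short, short-lived code (or approves a prompt) instead of remembering a long unique password, while the server still gets a reasonably strong proof of who is logging in.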

 

Etienne Nichols: Okay, okay, cool. So, I'm going to switch topics just a little bit. So, we have True Quality coming up in three weeks, I guess you'll be there I assume.

 

Abbas Dhilawala: Yes, I'll be there.

 

Etienne Nichols: I'm super excited to meet you in person.

 

And for those of you who are listening right now, you're obviously not with us, but hopefully you're listening to this prior to the event and are able to come and hear him.

 

Abbas, you agreed to speak there. I don't have the topic in front of me that you're going to be speaking on. Why did you agree to speak, if you don't mind me asking.

 

Abbas Dhilawala: So, one is, you know, we've known Greenlight for like five years now. In fact, a short story: when we started our company, the first thing we said was, okay, how do we get our name out?

 

Right? We are a software company that's doing a product, a software product in healthcare.

 

And what do we do? We had no idea. We were first-time entrepreneurs. We came from a healthcare background and had done medical device software for years, but we had never run a company, never done any of that other stuff. We looked around, and I saw Greenlight Guru everywhere. No matter what I searched, I saw Greenlight Guru.

 

So obviously they've done well at marketing. So, we found Nick's phone number and gave him a call, a very cold call, and he picked up and talked with us for an hour. He told us everything they did, everything Greenlight did when Greenlight was essentially at a similar stage.

 

And we've been friends since. It was very useful, and we talk, and we've been working with Greenlight. So that was one aspect of it, but there's also another part.

 

We've done webinars with Greenlight; we've done other things. I haven't done a podcast, but I think a couple of my co-founders have, and we always find that to be very useful from a content point of view, like discussions, you know, informal discussions, but also the other Greenlight Guru podcasts that you have.

 

I've listened to several of John's episodes. I think he interviewed our CEO last year, I believe, and other things like that. Always very informative, even for me, and I've been in the healthcare industry for almost 20 years now.

 

I still find them to be super informative. And so that's the real aspect of it: I like the format, I like the quality that you guys put into these events, and we just have a good relationship with Greenlight. It's always been very good.

 

So, all that added up to me agreeing to come, and to our company being there as well.

 

Yeah.

 

Etienne Nichols: Well, that's great. I can't wait to hear your talk; I'm sure it's going to be great. I love the combination of usability and cybersecurity, because we have been focusing a lot on cybersecurity, which, you know, rightfully so, but we can't take our eyes off the prize: the intended use, the quality of care for the patient, the improvement to the quality of life for the end user. So, any other thoughts or things you'd like to add? We covered a lot of ground.

 

But I don't know. So, what other things am I missing?

 

Abbas Dhilawala: I think we covered pretty much everything. I mean, ultimately, I think it's still an evolving thing. Right. So, it's a lot to learn, a lot to fail at and learn and find good solutions.

 

And it's something that will never end. Like even in 20 years, we'll be talking about it because it's essentially a balance and the balance has to be kept constantly and there's always going to be imbalances in certain times, and you just have to keep working at it.

 

Well, I think the point I was trying to make today with this talk, and hopefully I made it, is that there's a lot of push on cybersecurity, as I said, rightfully so. But let's not take the convenience or the usability aspect completely out of the picture.

 

So, let's find a way to kind of balance and address both usability and cybersecurity. And from that some innovations will come, right? So, we'll come up with new ways of doing things or newer technologies can be improved upon or implemented that will address that.

 

But if you pretend that problem away, then there's nobody working on it. So, keep focusing on the fact that there is a usability problem when you put too many cybersecurity controls in place, especially user-facing controls.

 

And let's figure out a way to kind of have the best of both worlds.

 

Etienne Nichols: That's great. And you covered a lot of ground. Like you said, there are lots of different ways to have that marriage of usability and cybersecurity. So, I love all the different tips you gave.

 

It's great. I think this is going to be a very valuable session, especially at True Quality. If those of you listening are able to come, definitely drop in and say hi to Abbas.

 

Tell him you know him personally now because you've met him on the podcast. So, any last words before we wrap this up?

 

Abbas Dhilawala: Yes, keep at it.

 

Etienne Nichols: Yeah, yeah. Don't give up on cybersecurity. Don't give up on usability. Thank you all for listening. You've been listening to the Global Medical Device Podcast. If you're interested in learning more about why we do the podcast, and all the different things we do, even the things that Abbas mentioned, head over to www.greenlight.guru and check out the software we've put together, the only medical device success platform built for medical device professionals by medical device professionals.

 

We will see you all next time. Thank you.

 

Thanks for tuning in to the Global Medical Device Podcast. If you found value in today's conversation, please take a moment to rate, review and subscribe on your favorite podcast platform. If you've got thoughts or questions, we'd love to hear from you.

 

Email us at podcast@greenlight.guru.

 

Stay connected for more insights into the future of MedTech innovation. And if you're ready to take your product development to the next level, visit us at www.greenlight.guru. Until next time, keep innovating and improving the quality of life.

 

ABOUT THE GLOBAL MEDICAL DEVICE PODCAST:


The Global Medical Device Podcast powered by Greenlight Guru is where today's brightest minds in the medical device industry go to get their most useful and actionable insider knowledge, direct from some of the world's leading medical device experts and companies.


