Quality Myths & Lessons Learned Part II

February 22, 2024

GMDP_356

In this episode of the Global Medical Device Podcast, host Etienne Nichols delves into the world of medical device quality and reliability with expert guest Kevin Becker.

They explore the nuanced challenges of ethical decision-making in the MedTech industry, the complexities of accelerated testing, and the continuous quest for quality improvement. Becker, author of "Quality Myths and Lessons Learned," shares his insights from the second edition of his book, emphasizing the importance of ethics, the intricacies of statistical models, and the practical aspects of quality management in medical devices.

Interested in sponsoring an episode? Click here to learn more!

Listen now:

Love this episode? Leave a review on iTunes!

Have suggestions or topics you’d like to hear about? Email us at podcast@greenlight.guru.

Takeaways:

  1. Latest MedTech Trends: The episode underscores the critical role of ethics in the rapidly evolving MedTech industry, where technological advancements and moral responsibilities intersect.
  2. Practical Tips: Listeners gain practical insights into the importance of rigorous testing, continuous learning, and ethical decision-making in ensuring the quality and reliability of medical devices.
  3. Future Predictions: The discussion hints at the increasing significance of statistical models and accelerated testing in predicting and enhancing the longevity and efficacy of medical devices.

Key timestamps:

  • [00:05:20] Discussion on the new chapter about ethics in Becker's book
  • [00:10:35] Insights into accelerated testing and its application in medical devices
  • [00:15:50] Kevin Becker's five levels of knowledge and its relevance to MedTech professionals
  • [00:20:45] The significance of standing up for what's right in quality and regulatory matters
  • [00:25:30] Real-life examples of complex problem-solving in medical device engineering
  • [00:30:55] Final thoughts and advice from Kevin Becker for MedTech professionals

Links:

Memorable quotes:

  • "The first level of knowledge is you don't have a clue... The third level is you know enough to be effective, which is where we all want to be." - Kevin Becker
  • "All models are wrong; some models are useful." - Quoted by Kevin Becker, highlighting the pragmatic approach in engineering and quality assurance.
  • "Do something, do anything. If it's wrong, we'll learn from it. Just do something." - Kevin Becker's advice to overcome analysis paralysis in product development.


Sponsor:

This episode is brought to you by Greenlight Guru, the only quality management software designed specifically for the medical device industry. Streamline your process and foster innovation with Greenlight Guru’s intuitive platform!

 

Transcript

Kevin Becker: The first level of knowledge is you don't have a clue. You don't even know the technique exists, so you can't use it because you don't even know it's there. Second level is you know enough to be dangerous, which means you know it's there and you can use it, but you're as likely to use it wrong and come up with a wrong but believable answer as you are, maybe to use it correctly. Third level is you know enough to be effective, which means you know it's there, you're going to use it. You're probably going to use it right and you're probably going to draw a correct conclusion.

That's where we all want to be, right? And when I teach classes, I say I can only get you to between two and three, to know enough to be dangerous. I can't get you all the way to effective, because that takes practice.

Etienne Nichols: Time is usually of the essence, but here's the problem. Traditional product development processes are usually as slow as molasses. They cause delays and they're headaches. For companies like yours, Greenlight Guru is the ultimate solution for MedTech's biggest challenge.

You may be facing lengthy development cycles that drain your resources and hinder progress, but we streamline the entire product development journey. We make it faster. We make it more efficient and less prone to hiccups.

By centralizing your data management, automating your workflows, and allowing real-time collaboration, it's all here. It's designed to propel your projects forward. And guess what? Regulatory compliance is built right in.

It reduces the risk of costly revisions and ensures you stay on track. With Greenlight Guru, you're not just developing products, you're accelerating progress, making a difference when it matters most.

Don't let inefficiency hold you back. Embrace innovation with Greenlight Guru. Go to www.Greenlight.Guru to learn more.

Hey everyone. Welcome back to the Global Medical Device Podcast. My name is Etienne Nichols. I'm the host of today's episode. With me today is Kevin Becker. Kevin is the author of Quality Myths and Lessons Learned.

I heard him present on this topic. I really was impressed.

We did an episode on it, but now he has a second edition, and I'm excited to see the things that have changed. He mentioned it doesn't sound like I've read the book, and I haven't read this one.

I read the first edition. I have not read the second. So, I'm excited to get my hands on that. But first, let me introduce Kevin. Kevin has a bachelor's degree in mechanical engineering from the University of Minnesota and a master's degree in reliability engineering from the University of Maryland.

Kevin is an ASQ certified quality engineer, reliability engineer, and Six Sigma black belt, and has experience as a quality reliability engineer, quality manager, director of engineering, and director of quality in the medical device, computer, disk drive, measurement equipment, and machining industries. He's trained engineers, technicians, executives, managers, and supervisors in quality and reliability methods, statistical techniques, and risk management methods. Kevin, as I mentioned, has authored this book and co-authored published papers in the areas of reliability, probabilistic risk assessment, and measurement correlation, and has been on a few podcasts, including this one. So, I'm excited to be with you today. Kevin, how are you doing today?

Kevin Becker: I'm doing fine. Getting over a little bit of a cold, so if my voice is scratchy, you'll have to excuse that.

Etienne Nichols: Okay, no worries. No problem at all. Glad you're with us today. So, tell us a little bit about what prompted it. I'm always curious, what is it that changed in your mind, or what did you feel like was missing, that you felt you needed to do a second edition?

Kevin Becker: Well, there are a couple of things. I have a day job, as you know, I work full time, and then I have consulting on the side and wrote the book as part of the consulting job.

And with the first edition, there was a time constraint that I was under, and I knew that I couldn't include everything in it. It was a compromise, right, between time and material, and there was a time constraint that I wanted to meet, so I did. But I knew I left things out, right? And then I kind of learned the hard way with the first edition that color is really expensive. The first edition, if I'm being open, was more costly than I thought it should be.

So, the second edition, I decided to make it less costly, and I went with black and white. So, the second edition has 50% more content for half the price, is what it ends up being. It was a big difference between color and black and white.

Etienne Nichols: That's really impressive. So, tell me a little bit about the content. I remember when we talked about.

I've drawn a lot of different things from it, and I still quote that book. One of my favorites is, what is the most important part of a quality management system? And I still like that as a trick question, like, well, this is what Kevin Becker says, and I'll just leave that hanging for just a moment.

But what are some things that you think have changed? And I wish I had it in front of me; I apologize for that. But what are some of the things that we can look for and really draw from?

Kevin Becker: Well, last time I talked about an embarrassing episode, it was regarding ethics. Excuse me.

I put a whole chapter in here on ethics this time. I wanted to put it in the first edition, but I didn't, primarily because I was afraid that it would maybe reflect on certain people that it shouldn't reflect on. Right. I mean, readers might interpret that I'm talking about my current employer, my current situation, whatever the case might be.

So, then I left it out because I didn't want to have the wrong interpretation. I decided over time that, you know what? Just be straightforward about it. And none of the examples in that chapter are from my current employer, unless otherwise specified. There's one, but it ended up being a positive example. But the story that I told last time was, early in my career, living paycheck to paycheck, young son and wife at home, I was told to fabricate some data, and I did it.

And I lost sleep for two or three nights. And I went back to my boss and said, don't ever ask me to do that again, thinking that I might end up losing my job over it. It turned out I didn't lose my job. He said, okay. And we actually got along great for decades after that.

It turned out really well. But there are other instances as well. I was told at one time, if sales lie to a customer, you lie to a customer. And I was pretty livid. I had my badge in my hand at that time. I went to my manager and said, I'm not doing this. And he kind of talked me down off the ledge and said, I'll take care of it. And he did. And I never heard anything about it again. So that one also turned out well.

I had one time when an engineering manager approached me in the cafeteria, which, that's off the record, right? If it's in the cafeteria, it always is. And he wanted me to make a certain decision. The wording, I believe, was something along the lines of, there's a pile of money out there and we should grab our share.

He was talking about the yearly bonus, right? Because we could increase short-term profits through certain actions. But I was certain that the long-term effect on the company would be detrimental.

So, I declined to do that. And there were other instances. And it's not only me. I've talked to other people in the quality profession as well, and you run into things like that. So, the reason I thought it was important to include in the book, especially for young people getting into the industry: you just have to think about that, right? You know that it might happen, so think about it in advance, so you're not taken by surprise like I was the first time. And in there, I also talk about the fact that it's not always clear cut either, though, right?

I mean, go back to the discussion with the engineering manager. I was certain that long term, it would be bad for the company. But what if I wasn't certain? What if there was a 20% chance that long term it would have been good for the company? What if that was 40%? What if it was 60%? What if it was an 80% chance it would be both good short term and long term? At some point, the ethical equation flips, right? That runs through the whole chapter about ethics. In this one, I think it's probably the most controversial chapter.

Etienne Nichols: But I remember you telling me there would be a chapter on ethics, and I'm glad you put that in there. And I remember the story you gave, and it's interesting. I just recently saw.

I don't know if you saw the recall. That was it. Let's see. Who was it that did this? Was it Medtronic?

Essentially, it doesn't really matter who it was, but there's recently a recall where parts had been stolen from the scrap or NC disposition and were being sold on Facebook marketplace.

Kevin Becker: Oh, that one I haven't seen.

Etienne Nichols: Yeah, I'll have to pull it up. I'll put a link for it.

Someone posted that on LinkedIn, and I basically said, well, if anybody wonders how this happens, they've probably never seen the scrap bin in a manufacturing facility. And it's not that this is the way it should be, but I can remember in my past just talking about the ethical dilemmas or the things that you may face that nobody tells you about.

I can remember having an assembly that retailed for $96,000, and it was on my desk, and someone dropped by and said, man, we could sell that on eBay. I'm like, I never thought about that. And then I thought, wow, that is a temptation, you feel it. We didn't do anything with that, but I thought, man, that's something I never would have thought of. And you go by the scrap bin, and there are pieces of this just thrown in there, and so forth. Anyway, it's a very real thing that, you're right,

I never was quite prepared for, so that's interesting.

Yeah.

Kevin Becker: And I ended up putting a little more math in this one than the previous one. It's obviously not all math. There's a chapter on ethics, but there's some more math in here.

There's a paper that I wrote. I actually presented at an international conference a while back, and then I adapted it to a broader topic. That paper was really specific, but that one is in here.

It's designing time accelerated tests. And the reason I bring that up is because when I was developing that method, when I was writing it, it actually was kind of ridiculed by other engineers within the same company. These were kind of smart guys, but they didn't understand the science and math behind it.

Long story short, it was peer reviewed, published, presented at an international conference. The reason I bring it up is, when you know you're right, sometimes you have to be willing to stand your ground, even when other engineers might not see it the same way or understand it at the same level.

That same topic, accelerated testing, actually came up at a different job. One of the senior managers was questioning the validity of accelerated testing in general, and again, I had to justify it using physics and math. And I think I ended up saying, if you trust the physics and you trust the math, you trust the method.

Where it can fall apart is if you have the wrong understanding of the physics, or if you made a math error or something like that. But if you trust it, it works, kind of along those lines. There's an example in here also about how statistics work even when it's not obvious. I remember very clearly a project where we had two different product lines, very similar, same customer, but one product line had 0.5% more contamination than the other product line.

0.5% difference.

It was statistically significant. Clearly statistically significant. It wasn't borderline. There was a difference. We had a project team, and we worked for months, couldn't find the answer, disbanded the project team, and then I got a call from the lead one day, a few weeks later, and she said, hey, I think I found the problem. Well, these are small parts. Twelve on a strip, stainless steel parts.

And on the one product line, when we found one of the parts was bad, we had a little tool. We'd clip it off and throw it away. On the other product line, we didn't have a tool, so we would grab the strip, grab the part with our fingers, and bend it back and forth until it came off the strip.

Even though we were wearing finger cots, gloves. Finger cots are short, but even though we had protection on our hands, we were transferring contamination to the adjacent parts when we bent the part back and forth to remove it.

The point is, statistics told me they were different. We couldn't find the difference for months, but that difference was there. We just hadn't looked in the right place yet.
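Kevin's contamination story is essentially a two-proportion comparison. As a rough sketch of that kind of test (the counts below are hypothetical; the episode only mentions the 0.5% gap), a two-proportion z-test in Python shows how a small difference can still be clearly significant:

```python
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test: do two lines differ in defect rate?"""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)  # pooled rate under H0: p1 == p2
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: line A at 1.0% contamination, line B at 1.5%
z, p = two_proportion_z(100, 10_000, 150, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # clearly significant despite a 0.5% gap
```

With samples this large, the test flags the difference long before anyone can explain it, which is exactly the position Kevin's team was in for months.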

Etienne Nichols: Wow, that's a really good story.

Kevin Becker: Yeah.

Etienne Nichols: That is too cool.

So, you mentioned young engineers, and maybe we could talk about the target audience or who this book would really be good for. Young, old alike, it sounds like. Especially when you talk about the physics and the science, even if you've been in the career for a while, we all have something to learn.

Who do you really target?

Kevin Becker: Yeah. In the back of the book, there's actually a short synopsis of each chapter, and it lists who that chapter is targeted at. But the target audience is fairly broad, because I would say senior managers need to read that chapter on ethics, because if we as managers are putting young employees in that situation, shame on us. Right? Remember in the first story I said I had a young son at home, a young wife, paycheck to paycheck.

Understand that it can be difficult for someone to take an ethical stand in that situation. And as a manager, we should not be putting them in that situation, plain and simple. So senior managers, I think, could benefit from the book, as well as young engineers.

Different chapters, different people.

Etienne Nichols: Yeah, absolutely. And I like how that is laid out. You can almost read what applies in the moment and learn from it as you go. That's really good. You mentioned the time constraint the first time; this time, maybe it took a little bit longer, but you were able to get something out a little bit better in your mind.

Was there anything that you're really proud of in this version compared to the previous edition?

Kevin Becker: Well, besides the two that we've talked about already, there's also the second-to-last chapter, which is called Short Bites. It's just a bunch of short little topics. Some of them are more than a page and maybe could have been a chapter on their own, but a lot of them are maybe only a couple paragraphs.

There's one that I call thermodynamics in the workplace. When I took thermodynamics at the University of Minnesota, the instructor had a great example of entropy, which is kind of an esoteric concept. He was holding a ceramic coffee mug.

He said, if I drop this mug, it's going to break into a thousand pieces. Those thousand pieces will never spontaneously form into a mug. That's entropy. Essentially, the universe tends toward chaos. And I found in the workplace, it's the exact same thing. Like, take a quality management system.

If we don't constantly work to maintain that system and to maintain order, it will eventually devolve into chaos. Right. Everybody starts doing their own thing, and pretty soon you don't know what people are doing, how the parts are being made. So, thermodynamics in the workplace, it's a physical concept, but it applies to human behavior.

Etienne Nichols: I like that. That's really cool.

Yeah, man. Bringing back some good and bad memories with thermo. Thermo one was better than thermo two for me, but anyway, so that's really cool.

I like how you really apply the physics and the science and things. Sometimes, when we think of quality assurance versus maybe some of the harder science, people hesitate, I guess, to apply some of that, or to think about how it's something that we should keep staying sharp on. But there are probably going to be people who don't read the book. Maybe the majority of the people who listen to this might not get the book.

Any other stories? I love stories, and I wish I had the specific ones to dive into more detail with you on. Actually, maybe I'll ask for a specific piece of advice for those in quality or regulatory or whatever role they may be in. I know you still study math and science and things like that. Do you have a way or a suggestion for staying sharp on that, or a reason behind it?

I don't know if that question is making any sense.

Kevin Becker: Yeah, I have a natural interest, honestly. I have a calculus book next to the rocker where I sit and watch TV sometimes, so if I get bored during the TV show... I think I may have mentioned that last time. It's still something that most people will find really odd.

I have a natural interest in it, though. But along those lines, when I was working in a previous job, there were three or four of us who were in a reliability engineering group, and none of us were reliability engineering professionals yet at that time. And the one person who was, our mentor, left the company.

So, then we were kind of sitting there, you know, we can do this, but how will we know if we're going in the wrong direction? How will we know if we're heading off the rails, if you will? And what we did that worked really well is we found that Minnesota had a Society of Reliability Engineers, a professional society. They had monthly meetings, and we started attending those monthly meetings, and we started talking to people who were professionals in this field and learning from them, and we ended up presenting at a couple of those meetings.

But just the networking and the learning. We could ask them questions like, okay, this is what we think we should be doing. Is that the right direction, or are we heading off in the weeds somewhere?

And they'd tell us. I mean, they're really nice people, right? Everybody's there to connect and help each other. So that's one way. There are professional societies out there, like ASQ; this one was a subchapter of IEEE. There are people out there who are more than willing to help if you do find yourself on an island. So that's one thing I would do.

I mean, obviously, you can take college courses, you can do self-study, podcasts, books on tape, all that kind of thing. But maybe the one that I would point out is that there are a lot of people out there willing to help.

Etienne Nichols: Okay.

The thing that I mentioned from the first book that I go back to a lot is, what's your favorite thing about a quality management system? And I hope I'm relaying this correctly, but management responsibility. Are there things like that, or anything else in the book?

I'll have to go back, and I'll tell you what I find. But anything like that, any tidbits that might be contrarian takes, or something that maybe not everybody agrees with?

I love that little controversial area. It's always interesting to find those sections.

Kevin Becker: Yeah, there's one in there about measurement error or measurement capability. Right.

There's a rule that when you're doing a measurement error study, you need to have five distinct categories. In other words, for the parts being used for the measurement error study, the measurement system has to be able to distinguish them into five distinct categories.

It's a good rule in a certain context, but people apply it when it shouldn't really apply. We've had measurement capability down to less than 10% of the tolerance. Okay. So, our distribution is really tight.

Say our Ppk is above five. Right? Or Cpk is above five.

And the parts that we obtained for the measurement study are almost identical. So, of course, the measurement equipment cannot distinguish five different categories, because the parts are almost identical. And we've had customers refuse to approve the measurement system because we didn't meet that five-distinct-categories threshold.

It would probably be seven figures to buy measurement equipment that could do that.

So, let's put it in context again. Right. We have a highly capable process, parts per billion nonconforming, if even that. And you want me to spend a million dollars on a new piece of measurement equipment just because we didn't meet those five distinct categories? That's ridiculous. If you can sell that, you need to go into sales, because you'll retire a millionaire in the first five years.

There's no senior management that I can think of that would think that's a good decision. But people get caught up in it. You read the rule in a book, and it's an easy rule to understand, but you don't understand why it's there.

And then you're not able to recognize when it doesn't make any sense. The real reason for that rule is that if you're trying to improve your process, you have to be able to tell one part from another, to see the difference among the parts.

We have a highly capable process. We're not interested in improving this one. We have much more problematic areas where we need to focus our attention, and we're not going to buy new measuring equipment for something like this. So, there's one where I completely disagree with the way people are doing it.
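Kevin's objection can be made concrete with the common MSA rule of thumb for number of distinct categories, ndc = 1.41 × (part-to-part SD / measurement-system SD). The numbers below are hypothetical, chosen only to mirror the situation he describes: a gauge well under 10% of the tolerance, attached to a process so capable that the ndc rule still "fails":

```python
def ndc(sigma_parts, sigma_gauge):
    """Number of distinct categories, per the common MSA rule of thumb."""
    return 1.41 * sigma_parts / sigma_gauge

def cpk(mean, sigma, lsl, usl):
    """Process capability index against a two-sided tolerance."""
    return min(usl - mean, mean - lsl) / (3 * sigma)

# Hypothetical: tolerance 9.5 to 10.5, process SD 0.03 (extremely capable),
# gauge SD 0.015 (well under 10% of the 1.0 tolerance width)
sigma_parts, sigma_gauge = 0.03, 0.015
print(f"Cpk = {cpk(10.0, sigma_parts, 9.5, 10.5):.2f}")  # above 5: essentially no nonconforming parts
print(f"ndc = {ndc(sigma_parts, sigma_gauge):.2f}")      # below 5: 'fails' the rule anyway
```

The gauge is fine in absolute terms; ndc collapses only because the parts are nearly identical, which is Kevin's point about applying the rule outside its context.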

Etienne Nichols: I don't know how it tickled my brain this far back, but that makes me think of one of my social studies classes in college, which talked about the EPA's water act when it was enacted across the United States.

The way they enacted it is everybody had to improve their water system by x percent. Well, those guys in Alaska were dealing with some pure water. So, they said, well, the only way we can adhere to this law is to dump some bad stuff in the water so we can improve it.

So, they did that. And so, they realized they were going to have to tweak this law. But it made me think that some people who are real good rule followers have some unique ways of following the rules. That's a really good point.

And when you talk about that, you mentioned another conversation where you have to be able to stand up to your peers.

You mentioned the physics and science. I don't remember the exact specific example, but the other engineers around didn't necessarily agree with it. And in this situation, maybe the customer didn't agree with it, and in other, ethical situations, management.

I'll get to the point, and I'll stop this TED talk, but it's a matter of being willing and able, and competent enough, to stand up to the customer, your peers, and management.

And I know the book kind of deals with how to do that in those different situations, or gives examples. But do you have any advice for people on how to have the confidence and the ability to do that in each one of those circumstances?

Kevin Becker: The only way that I can have confidence, that's the word you used, is if I've studied the material and I understand it. It almost gets to the point where I need to be able to derive it.

I mean, there's a chapter in here where I heard a rule of thumb in my reliability engineering class. They said, if you want 95% confidence of a certain proportion defective, nonconforming, whatever, you just divide three by the proportion defective, and that's your sample size. That gives you 95% confidence. Hey, that's pretty cool. Why does it work? When doesn't it work? Because it's a rule of thumb.

A rule of thumb only works over a certain range, and it's important to know when it doesn't work, so you understand when you might be making a mistake. So, I went and derived it.

It wasn't that difficult; the derivation's in the book. But that's how I gain confidence. I will convince myself that I understand it to a level where I can extrapolate a little bit without being wrong. I hate being wrong.

I absolutely hate being wrong. Makes me a little bit conservative. It still happens, but I hate it.
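The rule of thumb Kevin derives is often called the rule of three: for a zero-failure test, n ≈ 3/p gives roughly 95% confidence that the nonconforming proportion is below p. A quick sketch of the exact zero-failure calculation, requiring (1 − p)^n ≤ 0.05, shows where the shortcut holds and where it drifts:

```python
from math import ceil, log

def exact_n(p, confidence=0.95):
    """Smallest zero-failure sample size n with (1 - p)**n <= 1 - confidence."""
    return ceil(log(1 - confidence) / log(1 - p))

def rule_of_three_n(p):
    """The 'divide three by the proportion defective' shortcut."""
    return ceil(3 / p)

# The shortcut tracks the exact answer closely for small p,
# and drifts (relatively) as p grows toward the edge of its range.
for p in (0.001, 0.01, 0.1, 0.5):
    print(p, rule_of_three_n(p), exact_n(p))
```

This matches his point: the approximation comes from −ln(0.05) ≈ 3, so it is only trustworthy where that small-p approximation holds.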

Etienne Nichols: No, I respect that a lot. That makes a lot of sense. We talk about confidence in this society a lot, but we don't always talk as much about competence, and I think that's really how you should derive your confidence.

That makes sense.

That's a really good point. I started thinking of maybe, like, a polynomial line.

A lot of times, we have the information between two points, and I can remember how they taught us in school. Anybody can extrapolate, but engineers interpolate. We stay between the bounds. And you said it's a matter of knowing the bounds.

Kevin Becker: Unless you're a reliability engineer.

No, seriously. Then you have to extrapolate, because product life is 20 years. You can't test a product for 20 years before you put it on the market. The market's already changed.

Etienne Nichols: Right.

Kevin Becker: You have to test in the lab, you have to accelerate, and you have to extrapolate at that point. That's the only way to stay in business. There are tricks, though, to extrapolating.

It goes back to science. When you're doing that accelerated testing, you have to understand the physics and the chemistry of what's causing the failure, because if you accelerate too far, you'll cause a failure that won't ever happen in the field.

I mean, the easiest example is higher temperature is often used to accelerate failure. Well, if you heat it enough, you're going to melt it. Is it ever going to melt in the field?

No, it's never going to melt in the field. But there are a lot of less obvious, more subtle ways that you can go wrong with that. So, one of the tricks is to understand the physics, and then make sure you do failure analysis, and see, okay, is this what we expected to happen?

Is that what happened? If yes, then we're doing what we intended. If no, we better look closer because maybe we've just learned something that we should have known all along, and it's going to be very important to us.

Another trick with accelerated testing is that the standard deviation has to stay the same when you test at different acceleration levels. Right. Standard deviation has to stay the same from one level to the next.

If it doesn't, you've changed the failure mechanism. Well, almost certainly changed the failure mechanism. You're not accelerating what you want to accelerate. So, yeah, I'm actually glad you brought up the extrapolation.

I understand. Never extrapolate. That's what we're taught in college. But when you have a 20-year product life and you want to figure out if it's going to fail, you can't test for 20 years.
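For temperature-driven failure mechanisms, the extrapolation Kevin describes is commonly done with the Arrhenius model. The activation energy and temperatures below are illustrative assumptions, not values from the episode:

```python
from math import exp

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV per kelvin

def arrhenius_af(t_use_c, t_stress_c, ea_ev):
    """Arrhenius acceleration factor between use and stress temperatures,
    for a failure mechanism with activation energy ea_ev (in eV)."""
    t_use, t_stress = t_use_c + 273.15, t_stress_c + 273.15
    return exp((ea_ev / K_BOLTZMANN_EV) * (1 / t_use - 1 / t_stress))

# Hypothetical: 0.7 eV mechanism, 25 C use condition, 85 C test chamber
af = arrhenius_af(25, 85, 0.7)
print(f"AF = {af:.0f}")  # each stress hour stands in for roughly AF use hours
```

The model is only valid while the same mechanism dominates at both temperatures; push the stress past a material transition and you are accelerating a failure that will never happen in the field, which is exactly the trap Kevin warns about.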

Etienne Nichols: No, that's really good. I love these conversations that take me down a road I wasn't expecting, because when I think about that interpolation, it is something we're taught in college, pretty heavily, at least mechanical, where I was.

But you're right, at some point you have to know what's a reasonable extrapolation. And that's kind of the additional trick. We don't want to just get caught up in our original fundamental teaching on it.

Kevin Becker: In the pharma industry, for drug life, they'll do accelerated testing at higher temperature, but at the same time, they put product on the shelf for real-time aging tests. Right. And they'll just leave it on the shelf, and then they'll test it every so often.

So, if it was a three-year life, the accelerated testing might give you that answer in a matter of weeks or months, but then there will actually be product on the shelf for three, four, five years.

And every so often they'll pull some off and test it just to make sure. So, it's accelerate, get the quick answer, get the drugs out there so people can benefit from them, but then follow it up with real time aging.

Etienne Nichols: So, I think about this really being applicable to packaging. I wonder if you have some other examples you might give on this accelerated testing.

I remember working with a silicone product where we accelerated the vulcanization through heat, but that's just the tippy toe of my knowledge. I'm curious if you have a specific example, because you've got me really curious about some of this.

Kevin Becker: Yeah, we have one. It was a medical device, and it had glass fibers. Okay.

It would read light signal through the glass fibers. And early on, we conducted an FMEA because that's what you're supposed to do. And we identified that glass fibers breaking were a potential failure mode that concerned us.

We started a test where the fiber, or the cable, was suspended between pulleys, and one pulley would move up and down. There was a weight on the end. So, we were fatiguing it, right. We were bending it, simulating repeated bending in the field. And we got the results, and we found out that, darn, these things were breaking faster than we could allow for the product life.

So, it's interesting, we had a debate among the engineering team. And I'm going to say I lost the battle but won the war, if you want to use that analogy. The debate was: do we hold the fibers tighter, or do we hold them looser? Which is worse?

My hypothesis was, you've got to hold them looser, because if you squeeze those things, as soon as the first one breaks, you now have a sharp edge that you're forcing to scrape against the adjacent fibers because you're holding them too tightly.

And the opposite argument was, no, holding them more tightly will prevent them from breaking in the first place. So, we decided to hold them more tightly, put them on test, and they broke in half or a third of the time. So, then we said, okay, that didn't work; we'll hold them more loosely. And that did extend the life to the point where it was successful. We didn't really have to worry about it in the field anymore.
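A fixture comparison like the one Kevin describes ultimately comes down to comparing cycles-to-failure between the two clamping conditions. Here is a minimal sketch of that comparison; the cycle counts below are invented for illustration and are not the team's actual data.

```python
import statistics

# Hypothetical bend-cycle test results (cycles to first fiber break)
tight_clamp_cycles = [12_000, 15_500, 9_800, 14_200, 11_000]
loose_clamp_cycles = [31_000, 28_500, 35_200, 40_100, 33_400]

def mean_life(name: str, cycles: list[int]) -> float:
    """Summarize with the geometric mean, a common central measure
    for life data, which tends to be right-skewed."""
    gm = statistics.geometric_mean(cycles)
    print(f"{name}: geometric mean life = {gm:,.0f} cycles")
    return gm

ratio = mean_life("loose", loose_clamp_cycles) / mean_life("tight", tight_clamp_cycles)
print(f"Loose clamping lasted about {ratio:.1f}x longer in this sketch")
```

In practice a team would also fit a life distribution (Weibull is typical for fatigue) and compare it against the required product life, rather than relying on a point summary alone.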

Etienne Nichols: It's so crazy that you can have a very intelligent conversation and truly have people on both sides of that fence a lot of the time.

That's just really interesting. I love those conversations. I can remember those whiteboard conversations. Really fun. I'm curious what didn't make it into the book, and what your criteria were. Was there anything where you thought, okay, that's going to stay on the cutting room floor?

Kevin Becker: Part of it was time.

I was very busy in my day job again, and I had put a timeline on myself of this year. And I'm kind of surprised that I met it, actually, because I was so busy doing other things. A little bit surprised I met it.

So, part of it was just time, and then maybe I didn't remember everything to put in there. Right. There's one thing that I wish I would have had in there. It's a little bit tongue in cheek, but QC Training Services has actually used it recently, with my permission. It's Becker's five levels of knowledge. Because I read things like, in ASQ, you get a certification.

They have Bloom's taxonomy. And I read those definitions, and even after I read them, I don't fully understand exactly what they mean. So, I said, okay, I'm going to make it easier.

So, I made up my own. And this is a little bit tongue in cheek; hopefully nobody gets offended by it. The first level of knowledge is: you don't have a clue.

You don't even know the technique exists, so you can't use it, because you don't even know it's there. The second level is: you know enough to be dangerous, which means you know it's there and you can use it, but you're as likely to use it wrong, and come up with a wrong but believable answer, as you are to use it correctly.

The third level is: you know enough to be effective, which means you know it's there, you're going to use it, you're probably going to use it right, and you're probably going to draw a correct conclusion.

That's where we all want to be, right? And when I teach classes, I say, I can only get you between two and three, knowing enough to be dangerous. I can't get you all the way to effective, because that takes practice.

You've got to do the thing. You can't just sit in a classroom and absorb it. The fourth level is: you know enough to teach it. I used to think teaching was easy, until I started teaching. Then I figured out that you teach smart people, and they ask smart questions that make you think about a subject in a way you maybe haven't thought about it before.

So, you're learning at the same time as you're teaching, and you have to think on your feet. And that's not as easy as I thought it was. It makes it fun, though. And then the fifth level is: you know enough to make it up, which essentially means you take basic principles and create new ideas, new concepts, new methods from those basic principles.

Etienne Nichols: Yeah.

Kevin Becker: So that didn't make it in there, but it would have if I had thought of it. But I was busy, and I had other things on my mind, and I didn't think of it.

Etienne Nichols: Yeah, that's cool. It made me think of knowing enough to be effective, and then the fourth level, teaching it. You almost feel like you have to know something well enough to derive it to feel confident defending it to your customer or your manager. I think if we taught things more often, we would have more confidence, because that's what you're doing, really, when you're standing up to somebody. You're.

Kevin Becker: Teaching them. Along those same lines, there is a story, and this one made it into the book. I remember a time when our customer had rejected over a million dollars' worth of product over a span of months, right?

And they'd send parts back, and we'd measure them and say, no, these are good. So, they just had all this million dollars' worth of product sitting on their shelves, and they weren't real happy with us.

And I remember one day my boss called me into his office, looked me in the eye and said, are we right or are we wrong?

And the question took me a little bit by surprise, just the timing. I didn't know it was going to happen, right? So, I stopped to think about it a while, and I went through it in my head, and we had been very careful about how we were controlling our measurement equipment. So, I could look him back in the eye and say, no, we're right.

And it went on maybe a couple more months. And then the customer was visiting, and he said, hey, I want to talk to you outside for a little bit. So, we went outside, and he said, by the way, we found the problem with our measurement equipment. You guys were right all along, and we're going to accept all this product. I never heard an official "you were right." I heard it unofficially, outside. But it goes back to a lot of what we've been talking about.

To have confidence, I had to have the data. I knew the data was there, though. It was all there. I had been monitoring it; it was part of my job to monitor this stuff. So, I could confidently say, we're right.

Turned out we were.

Etienne Nichols: Kudos to him for even asking the question. I don't know if a lot of people think to ask that. That's something I'll have to take away: am I right, or am I wrong?

The other thing you mentioned there, because that's the second time you've mentioned it, is that he took you outside. And the other time there was the cafeteria conversation. Those are technically off the record.

I don't know. You probably handle this sort of thing in your ethics chapter, maybe subtly. But what are your thoughts on those conversations? They're important, but they can also be dangerous. Would you agree?

Kevin Becker: Oh, definitely.

Etienne Nichols: Yeah.

Kevin Becker: They can be the best conversations. They can also be the worst conversations.

Etienne Nichols: Yeah.

Kevin Becker: This isn't in the book, but I remember a similar conversation. A colleague and I wanted to have a cafeteria conversation. And at that time, we were both frustrated that we'd ask questions of management and they didn't give us what we considered to be good direction.

We talked it out for a while and said, we're just going to do what we think is best, and they'll tell us if they don't like it. So, we quit asking them questions. We just did what we thought we should, and they'd tell us if we were wrong.

They never told us we were wrong. So that's a case where that off-the-record conversation had a very good outcome. Right.

Etienne Nichols: It's kind of like the engineering notebook, I guess. Maybe I'm equating it to documented conversations, or documented anything, and maybe that's not really the way to look at it. But, yeah, it's something that's not necessarily going into your DHF. That's really interesting. It's something I don't know that we put a lot of cognitive thought into, or at least I didn't when I was on the manufacturing floor or in product development. I would just have conversations anywhere.

The more I think about it now, I look back and realize there was a point in my life, with one particular person, where I realized I can't talk to this person unless there's someone else around, because it always goes sideways. But there are always those human interactions. That's interesting.

Kevin Becker: Yeah, I guess there is one other thing that did not make it into the book, since you asked. Probably the most interesting project I ever got to work on ended up being a trade secret. So, I can't go into details, but I can describe it at a high level.

Right. It was a statistical model for the life of a product. And we had about a year and a half's worth of testing on the product to build the model from. I was looking through the literature to find models and developing the method, and I sent it out for review.

And the answer was, yeah, we don't know. So, we were working with a statistician, a recent Minnesota PhD, and we said, okay, we'll have him review it instead. He came to a meeting and said, this is probably one of the hardest problems in statistics, and what you did seems right.

That's probably the most rewarding project I've ever had. To be upfront, I didn't solve it in closed form. I used a combination of closed-form methods, Monte Carlo, curve fitting, all kinds of stuff.

But that was really interesting. That was fun. If I could do five more projects like that, I'd consider it a good career. But I'm over 60, so I'm in a little different phase of my career. I would still like to do projects like that again, though.
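Since Kevin's actual model is a trade secret, the following is only a generic, hypothetical sketch of the pattern he names, curve fitting combined with Monte Carlo: fit a degradation rate to limited test data, then propagate the parameter uncertainty into a distribution of predicted life. Every number, the linear-degradation form, and the failure threshold below are invented assumptions.

```python
import random
import statistics

# Pretend 18 months of test data: (month, measured degradation in %)
data = [(3, 2.1), (6, 4.3), (9, 6.0), (12, 8.4), (15, 10.2), (18, 12.5)]
FAILURE_THRESHOLD = 30.0  # device "fails" at 30% degradation (assumed)

# Least-squares fit of a line through the origin: degradation = rate * month
rate = sum(m * d for m, d in data) / sum(m * m for m, _ in data)
resid_sd = statistics.stdev(d - rate * m for m, d in data)

# Monte Carlo: perturb the fitted rate by a crude uncertainty estimate
# and collect the resulting months-to-threshold predictions.
random.seed(42)
lives = sorted(
    FAILURE_THRESHOLD / (rate + random.gauss(0, resid_sd / 18))
    for _ in range(10_000)
)

print(f"median life = {lives[len(lives) // 2]:.0f} months, "
      f"5th percentile = {lives[len(lives) // 20]:.0f} months")
```

A real analysis of this kind would use a physically motivated degradation model and a proper treatment of parameter covariance; the point here is only the shape of the workflow, not its specifics.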

Etienne Nichols: You get faster as you get older, though, because you've already done it. I'm sure you've got more in you. Have you thought about... this is something I've been thinking about a little lately, so I need somebody to tell me if I'm right or wrong: complicated problems versus complex problems. Some come with "this is the answer"; others come with "this is the best answer possible."

Have you thought much about that, or do you have a comment or opinion?

Yeah.

Kevin Becker: The example I just mentioned was one of those where this is the best answer possible. It was not the perfect answer, but it was the best one we could do.

When I'm teaching, a lot of times I'll tell people, you know what? We're engineers, not mathematicians. And I'm not disparaging anybody, but in my head, the way I view it is: mathematicians look for the right answer, the perfect answer. Engineers look for the best answer to help the company be profitable.

Right? So, it might not be the perfect answer, but it might be close enough. There's a quote in the book. I actually have six quotes on my board at work, and they're all listed in the book, and this one's from George E. P. Box:

"All models are wrong; some models are useful." And as an engineer, we want a useful model. We're not going to spend forever on it, you know. I could have spent an extra five years on that project. I said it was a year and a half's worth of testing; that's because it was physical testing, and we couldn't accelerate it any faster than that, within reason.

If I had spent another five years, could I have made it better? Probably. But would it have been a good use of the company's money? No way. Because it was already useful.

Etienne Nichols: Yeah.

Kevin Becker: Does that answer your question?

Etienne Nichols: Yeah, totally. And I think that's something we could stand to learn. We just have to get hit over the head with it multiple times early in our careers. If you've got a problem that's just a mathematical problem, sure, get the answer. But if it's a problem that's very systemic, with lots of different threads...

It's more of a complex problem. You're just going to have to find what works for this situation. I love that. Yeah.

Kevin Becker: You just reminded me of a young engineer who was reporting to me and he got himself all tied up. Analysis paralysis. He just kept digging further without implementing anything. And I had a really good relationship with him.

So, we're in a one-on-one meeting at one point, and I said, do something. Do anything. If it's wrong, we'll learn from it. Just do something.

Etienne Nichols: Yeah, I was there at one point. I know I was that guy.

What other thoughts, pieces of advice, or recommendations from the book do you have? I highly recommend people get it. For those of you listening, we'll put the link in the show notes. But in the last few minutes, do you have any last piece of advice or thoughts?

Kevin Becker: Be true to yourself.

It goes into ethics. It also goes into being competent. Decide who you want to be and then do what it takes to become that person.

Etienne Nichols: Well, Kevin, thank you so much for being on the show. I'm going to get my hands on the book, and I'll let you know once I read it, too, because I'm excited. You've whet my appetite and I'm interested in getting more into this, especially the accelerated testing.

But all of it's going to be interesting. So, I'm excited and thank you for putting the work in to help the industry and help those who are early in their career, senior management, everybody who could benefit from this, really appreciate it.

Kevin Becker: Yeah, thanks for having me again.

Etienne Nichols: All right, we'll let you get back to it. Everybody, take care.

Thank you so much for listening. If you enjoyed this episode, can I ask a special favor of you? Can you leave us a review on iTunes? I know most of us have never done that before, but if you're listening on your phone, open the iTunes app and scroll down to the bottom where it says "leave a review." It's actually really easy. Same thing on a computer; just look for that "leave a review" button. This helps others find us, and it lets us know how we're doing. Also, I'd personally love to hear from you on LinkedIn. Reach out to me. I read and respond to every message, because hearing your feedback is the only way I'm going to get better. Thanks again for listening, and we'll see you next time.

 


About the Global Medical Device Podcast:


The Global Medical Device Podcast powered by Greenlight Guru is where today's brightest minds in the medical device industry go to get their most useful and actionable insider knowledge, direct from some of the world's leading medical device experts and companies.

Like this episode? Subscribe today on iTunes or Spotify.

Etienne Nichols is the Head of Industry Insights & Education at Greenlight Guru. As a Mechanical Engineer and Medical Device Guru, he specializes in simplifying complex ideas, teaching system integration, and connecting industry leaders. While hosting the Global Medical Device Podcast, Etienne has led over 200...
