Quality Myths and Lessons Learned

January 18, 2023


What words cause the most problems in MedTech, and what situations should you be ready to handle when you work in Quality? Today’s guest wrote about these issues in his book and will be talking more about them in today’s interview.

Kevin Becker has a bachelor’s degree in Mechanical Engineering from the University of Minnesota and a Master's degree in Reliability Engineering from the University of Maryland. Kevin is an ASQ Certified Quality Engineer, Reliability Engineer, and Six Sigma Black Belt with experience as a Quality/Reliability Engineer, Quality Manager, Director of Engineering, and Director of Quality in the medical device field. Kevin has authored and co-authored published papers in the areas of reliability, probabilistic risk assessment, and measurement correlation and has written a book titled Quality Myths and Lessons Learned.

Listen to the episode to hear what Kevin has to say about ethical considerations in Quality, Quality’s PR problem, and why having a principle-based decision-making process matters.



Like this episode? Subscribe today on iTunes or Spotify.

Some of the highlights of this episode include:

  • What prompted Kevin to start a consulting business

  • Examples of things few engineers realize

  • How you grow the muscle of realizing what you’re incentivizing

  • What a quality engineer might specifically be interested in with regard to ethics

  • Gray areas in ethics

  • How to use flow charts

  • Having a principle-based decision-making process

  • How a competitive culture can lead to pushing the rules

  • Overcoming peer pressure in the industry

  • The most important part of a quality management system

Links:

Kevin Becker's LinkedIn

Quality Myths & Lessons Learned Book

Etienne Nichols LinkedIn

MedTech Excellence Community

Greenlight Guru Academy

Greenlight Guru

Memorable quotes from Kevin Becker:

“Communication is another issue that is really difficult for engineers. They should be good at it, but they’re not.”

“The worst possible answer is wrong but believable.”

“I’ve seen some flow charts that have a lot of circular loops, and I don’t think they help make things clearer.”

“The goal of any company should be: recognize (ethical deterioration) long before it gets to an ethical or, even worse, legal consideration, and then take action to correct it in a timely fashion.”

 

Transcript:

Etienne Nichols: Hey everyone, it's good to be back with you today. With me is Kevin Becker, author of Quality Myths and Lessons Learned. Kevin, it's so great to have you on the show.

 

I've really been looking forward to this conversation. How have you been doing?

 

Kevin Becker: I'm doing good. Glad to be here. Thanks for having me. Yeah.

 

Etienne Nichols: I got to hear you at the Minnesota ASQ Quality Conference. I was a little confused because it was like five conferences in one, but I really enjoyed your presentation and have since gotten your book.

 

Um, yeah. So, you want to tell us just a little bit? I know you had a disclaimer you wanted to give, so I'll let you do that at this point too.

 

Kevin Becker: Yeah. I have what I refer to as my day job. I'm a director of quality at a med device component manufacturer. But I also have a consulting business on the side, and this is part of the consulting business.

 

I'm not representing my employer, and the disclaimer is that none of the specific examples come from my current employer, or my day job as I refer to it; they all come from other sources.

 

Etienne Nichols: Yeah. Okay, well, I appreciate that. And I can totally understand; sometimes when I'm going through my background or giving a story, I'm thinking, okay, what specific details should I leave out? And I know that for quality-minded people, that can be a struggle.

 

So, I appreciate and respect that 100%.

 

So, one thing that I'm curious about. I've read the book, and I loved it. I actually have a lot of pages dog-eared and some things highlighted.

 

But before I get into that, I'm curious what actually prompted you to take the time to write all of these things down.

 

Kevin Becker: Actually, I'll start with what prompted me to start the consulting business.

 

I worked at Hutchinson Technology for 28 years. They did a great job of training us. Okay, so.

 

And I started there fresh out of school when I was really young. I thought everybody knew this stuff.

 

Okay. HTI made suspension assemblies for rigid disk drives in computers. You don't see them around very much anymore. It all went to flash.

 

So, I ended up leaving HTI. And after I left, I kind of realized that nobody knows this stuff. That's an exaggeration, but I was surprised at how little of it was known out there.

 

And then, I'm an engineer by training and at heart, and I was in a management role involving more of the soft skills, and I kind of felt my technical abilities slip away, if you will.

 

And I decided no, that's not going to happen.

 

So, I started a consulting business focusing on a lot of technical training. And then the book was an offshoot of all of what I just mentioned. The fact that I get asked the same questions over and over again.

 

I end up solving the same problems over and over again. And that's true in the consulting business when I work with clients as well; a lot of the same questions arise.

 

Etienne Nichols: So, when you said a lot of people don't know this or maybe nobody knows this, I know it's an exaggeration, but if nobody knows this, can you give us an example of one of the things that really stood out to you as a shocker that nobody knows?

 

Kevin Becker: Well, there are a bunch of them.

 

Etienne Nichols: There's probably 34 based on the ones in your book.

 

Kevin Becker: 34. There's more than 34, because I'm working on a second edition of the book.

 

Etienne Nichols: I'm excited for it.

 

Kevin Becker: You know, one example is that a Fortune 500 company had us using R-squared linear regression for measurement correlation between two pieces of measurement equipment. It turned out that our process was so capable that the capability of the process made the measurement system look bad,

 

when the measurement system was actually performing great. The Cpk was on the order of 12.

 

And a measurement had to be taken for every part that went into the Cpk.
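
As a rough sketch of what a Cpk that high means (the numbers below are invented for illustration, not figures from the episode):

```python
import statistics

# Invented example: spec limits and measurements from a very capable process.
lsl, usl = 9.7, 10.3
data = [10.001, 9.999, 10.002, 10.000, 9.998, 10.001, 10.000, 9.999]

mu = statistics.mean(data)
sigma = statistics.stdev(data)

# Cpk: distance from the mean to the nearer spec limit, in units of 3 sigma.
cpk = min(usl - mu, mu - lsl) / (3 * sigma)
print(f"Cpk = {cpk:.0f}")
```

A Cpk in double digits means the process spread is tiny compared to the tolerance, which is exactly the situation that can make two good gauges look poorly correlated.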

 

I've had that question arise multiple times after that. We ended up working with that Fortune 500 company and came to an agreement, but it's come up two or three times since.

 

Things like statistical control. I've had customers tell me that if R-squared is low, your measurement process is out of control. Well, look at the math; they have nothing to do with one another.

 

Absolutely nothing.

 

People still get statistical control mixed up with parts being in or out of tolerance. They have nothing to do with one another. The tolerance shows up nowhere in the control limit or control chart formulas that are used to determine whether or not you're in control.
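
That separation is visible in the arithmetic: control limits come only from the data. A minimal individuals-chart-style sketch with made-up readings (a textbook chart would use the moving range and the d2 constant rather than the plain standard deviation):

```python
import statistics

# Made-up readings; note that no tolerance or spec limit appears anywhere below.
readings = [10.02, 9.98, 10.01, 9.97, 10.03, 10.00, 9.99, 10.02, 9.96, 10.01]

center = statistics.mean(readings)
sigma = statistics.stdev(readings)

# Control limits are the center line plus/minus three standard deviations,
# derived entirely from the process's own behavior.
ucl = center + 3 * sigma
lcl = center - 3 * sigma
print(f"LCL={lcl:.3f}  CL={center:.3f}  UCL={ucl:.3f}")
```

A part can be inside the control limits but out of tolerance, or vice versa; the two questions use different inputs.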

 

Things like that. There's still confusion about what the job of a quality inspector or a quality auditor is. I've had some people tell me the job is to find defects.

 

Well, that leads people to over-inspect. If you're talking about auditing a QMS, it leads the auditor to maybe make up a requirement that isn't really part of the standard, and that leads to inefficiencies in the system. So, I say finding defects is not the job of a quality inspector or quality auditor.

 

Some people will say it's to get the parts out the door. Well, I think the downside of that is obvious, right? Then maybe we ship defective product. So, I simply define it as: find and report the truth.

 

If we're doing great, tell me we're doing great, and I'll go fix a real problem.

 

If we're having problems, tell me exactly which problems we're having so I can fix them.

 

Yeah, things like that.

 

Etienne Nichols: So, I actually had a conversation with somebody yesterday about this, about the way you incentivize good or bad behavior, sometimes without even realizing it. Like the examples you just gave: if the QC inspector thinks their job is to find defects,

 

and you incentivize that, say a reward for so many defects found,

 

you've incentivized really scrutinizing and looking for issues, rather than what you said is the true job, which is searching for the truth and reporting the truth.

 

And I noticed when I read your book, you have lots of different examples where you show that if you do this, you're potentially incentivizing bad behavior.

 

How do you exercise or grow that muscle of realizing what you're incentivizing? Can you expound a little bit on that?

 

Kevin Becker: I think it's similar to the FMEA concept. You try and figure out what could possibly go wrong. How could somebody possibly misinterpret this? Communication is another issue that is really difficult for engineers.

 

They should be good at it, but they're not. But here are a couple of examples of incentivizing the wrong behavior. We had competition from one shift to the next over how many parts you get out.

 

Hopefully all good parts. But that led to a behavior where somebody might run a tool past the point where it should have been repaired and then the next shift would have to deal with it.

 

But the problem is that turned a 20-minute repair into maybe a six hour repair.

 

Bad for the company, right? Good for the shift based on how the incentives were set up, but bad for the company.

 

I had an example where I was a department manager, and we had a budget to control, too. And the rating scale at the end of the year for performance was set up so that the only way you could get the highest rating was to come in under budget.

 

You can't get a high rating by being right or being accurate. And I sat down with the plant manager, and I said, okay, you're giving me a choice between being foolish or dishonest.

 

There's no way to honestly win this game. Because

 

if I'm not going to be foolish, I'm going to pad the budget, knowing that I can come in under budget. But what good does that do the company?

 

And if I'm going to be honest, then I'm locking myself out of a good performance appraisal.

 

So, in that case, I was able to negotiate with the plant manager to get a rating that focused more on being right as opposed to being under budget.

 

But my goal was to maybe change the culture of the company. That didn't happen. I just got my own performance rating adjusted.

 

Etienne Nichols: Well, I guess that's a small win, maybe not winning the battle.

 

And it kind of segues a little bit because you mentioned something dishonest.

 

If you pad the budget, maybe that's dishonest. And I know you said you're working on the second edition of the book. And one of the chapters I think we mentioned was on ethics.

 

And I wondered if you could tell us a little bit, maybe give us a sneak preview of what might be included in that, or what a quality engineer specifically might be interested in when it comes to ethics.

 

Kevin Becker: Yeah, I'll probably start with a story that's still embarrassing to me 40 years after the fact. Okay. But it's going to be in the book, so I have to get used to talking about it.

 

Anyway.

 

So, I was a young engineer and had a young wife and toddler at home living paycheck to paycheck.

 

And we messed up an experiment that I needed to report to the customer. I went to my manager and said, hey, we messed this up, we're going to have to start over.

 

And he looked at me and said, no, we didn't mess it up. Part number one measured 28 and part number two measured 32. You know, he was just making up numbers.

 

The implication was obvious.

 

Make up data to give to the customer, which isn't real. It's falsifying data, essentially.

 

The embarrassing part is I did it.

 

And the reason is what I mentioned.

 

Young wife and toddler at home, paycheck to paycheck. Well, I lost sleep for two or three nights, finally decided that the job isn't worth the principle, and I went in and essentially told the manager, don't ever ask me to do that again.

 

Now, thinking that I might be out of a job five minutes after that fact.

 

It turned out actually quite well in the sense that I worked closely with that manager for many years after that. We never talked about it again. He never asked me to do anything like that again.

 

But I did it the one time. And I think the moral for that one is if you're a manager, understand that people might accept bad direction because of their life situation.

 

If you're a new engineer, I would say, you know, realize that you might be put in that situation, and it's probably better to think about it a little bit in advance as opposed to in the moment not knowing what to do.

 

Etienne Nichols: Yeah. That's a really good point, because I remember a young engineer coming to me at one point. It's not quite the same situation,

 

but, you know, I was young at the time, too. I like to think maybe I'm still young. But he came to me, and it was during an FDA audit.

 

It was a guided inspection.

 

They were looking at certain pieces of data, certain inspection records that we had done. And one of the items he had printed off in color, so if

 

the inspection was out, it was in red. And so, he came to me and said, the director of quality wants me to print this in black and white before we take it in to the FDA inspector, so that

 

maybe he'll miss the fact that we passed something that was out of tolerance. What should I do? I'm like, well, that's weird, and I don't know; maybe I'll just present that to you. The true data is still going to be presented, but we've tried to hide it by printing it in black and white. I don't know. I'm curious what your thoughts are, because it felt like a gray area to me, too.

 

Kevin Becker: I don't think it's a good idea to try and hide it, because we had some training not too long ago, and the trainer used to be an FDA inspector. He said that if the FDA designates a company as sneaky (and that sounds pretty sneaky to me), they might be back every six months instead of every two years, and then they're probably going to look harder. To me, that doesn't sound like a good idea.

 

Etienne Nichols: No. Interesting. Well, I totally agree, and that's

 

what he and I sort of decided together, we being lowly manufacturing engineers second-guessing a director presenting data. Looking back on it, that's kind of interesting to think about.

 

But the fact that you said the designation of sneaky, that basically is saying one employee could impact an entire organization and its FDA inspection schedule.

 

Kevin Becker: Yep.

 

I actually have another example on the subject of ethics, which I think is worth bringing up. It was years ago, and a manufacturing manager asked me to have a conversation in the cafeteria.

 

Well, cafeteria conversations are off the record, right? Yeah.

 

He wanted me to make a decision.

 

And his point, I think his exact words were there's a pile of money out there and we should grab our share while it's still there. And the decision he wanted me to make would have been good in the short term.

 

Think year-end bonus. So, it would have been good for us in the short term. But I was 90-plus percent confident that in the long term it'd be bad for the company.

 

So, I declined to make that decision.

 

But the gray area is, what if I was only 80% confident it was going to be a bad long-term decision? What if it was 70?

 

What if it was 50/50, knowing that it would be good in the short term for me personally? What if there's a 30% chance that it would be bad long term?

 

So, the point being, there can be a lot of gray area even in ethics. When I was 90% certain it was bad for the long-term health of the company,

 

the ethics were clear: no, I can't make that decision.

 

But think about what if it wasn't 90? What if it was less than 90? At some point we all have to draw a line. And I'm even open to the idea that people can draw lines in different places than I do.

 

And that doesn't mean somebody is right or wrong. But yeah, it's not always quite black and white.

 

Etienne Nichols: I respect that you say that. It's like a risk-based approach.

 

So, if you make that decision the same way we decide a lot of things when designing and developing our products, then you should also be willing to be audited on it later. You know, somebody could say, oh, you made this decision.

 

Kevin Becker: Yeah, yeah.

 

Etienne Nichols: Interesting.

 

Well, I'm just going to open up one more. Let's see. There was something I wanted to ask about: the worst possible answers.

 

Well, I thought it was a good chapter.

 

Kevin Becker: In your book, the chapter on the worst possible answer.

 

Etienne Nichols: Right, the worst possible answer. Yeah, sorry, I was looking at the chapter page. The worst possible answer.

 

Um, I thought, you know, when I first looked through them, I thought, oh, the wrong answer makes sense.

 

An answer that costs a lot of money. Answers that create a lot of work; we don't like those. But I wonder if you could expound, or maybe tell some stories about what you think the worst possible answer to a question is.

 

Kevin Becker: Yeah, for me it's a two-part answer, and the worst possible is wrong but believable. Because if it's believable, we're likely to act on it. And if it's wrong, we're going to take the wrong action on it.

 

You know, my first example on that one is actually not work related and I do have my wife's permission to share this, but a number of years back she had a positive marker for ovarian cancer.

 

Okay.

 

I looked on the Internet and found out that with that positive test and certain symptoms, there's greater than a 90% likelihood that it's true. And the symptoms were there as well.

 

Without going into a lot of detail, six weeks and a major surgery later, we found out it was a false positive.

 

You can kind of imagine probably what those six weeks were like. And then don't forget, there was a major surgery thrown in there that ended up being false positive. Anyway, we bring it into the work realm.

 

I was an engineer, and I got a call on a weekend saying all of our measurement equipment was failing. We had to use real parts to monitor our measurement equipment, because there was no NIST standard that we could use in this particular case.

 

And the parts were fragile; they could be damaged. So, the question was, do we shut down our entire operation? It was a 24/7 operation; we were making millions of parts a week, literally millions a week.

 

So it was a seven-figure decision.

 

So, on the one hand, the parts said the measurement equipment was bad. But we had multiple pieces of measurement equipment, and they all agreed with one another. Well, we knew the parts were fragile, so we had a backup.

 

Right. We had a backup set of parts. The backup parts also said the measurement equipment was bad by about the same amount in about the same direction. So, there's two extremely unlikely events.

 

Number one unlikely event is all the measurement equipment went bad at the same time.

 

The second extremely unlikely event is both sets of parts that we use to monitor the equipment went bad in the same direction at the same time.

 

I ended up making the decision that we would keep running. And it wasn't because I figured out one was right and the other was wrong. It was because we were at a point where we couldn't make enough parts to satisfy demand.

 

And I knew that if I made the wrong decision, we could fix it Monday; the cost would be high, and bad parts would be produced.

 

But if I shut everything down, that capacity was never recoverable; it was gone forever. So it was expensive in either direction, but one was more recoverable than the other.

 

Turns out we had a sister company four hours away.

 

We got their backup set of parts, brought them in, and found out the measurement equipment was right. For some reason our parts had gone bad; we never figured out why or what happened there.

 

But both of them were believable. One was wrong, but both were believable.

 

Etienne Nichols: Yeah, that makes a lot of sense.

 

And so, going back: if we tie that together with what you were saying about Cpk and Ppk, that could be wrong but believable, potentially, depending on your experience with those tools.

 

I also want to tie that back, tying three things together, so work with me here. You mentioned the people who don't know some of this stuff when they're coming out of school and working in the field.

 

How does someone see that wrong but believable answer? Maybe they see these tools but don't have the experience to use them. How do they know when they don't know? You know what I mean?

 

How do they get to the point where they recognize, you know, maybe I don't know something here.

 

Kevin Becker: Yeah. And it's an interesting point you brought up, because part of the reason I even went down that road of wrong but believable is that it can happen so easily in statistics.

 

You know, in the book there's an example where we had to calculate Cpk to accept a job of product. And there were 50 numbers, and I had to approve it.

 

And I looked down the 50 numbers, and I told the QC inspector, I said, I think you punched the wrong button on your calculator. Would you please re-enter it?

 

And she looked at me like I was crazy, because you can't calculate standard deviation in your head with 50 numbers. And she was right. Hopefully not that I'm crazy, but that I can't calculate standard deviation with 50 numbers.

 

But what I can do is find the smallest and largest and do the range in my head. And I know that mathematically one standard deviation cannot be larger than the range.

 

How does that happen? You forget a decimal point. That's how it happens.
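
Kevin's mental check can be written down directly: for any data set, one standard deviation is always smaller than the range, so a reported standard deviation bigger than the range signals a data-entry error. A sketch with invented numbers:

```python
import statistics

# Invented inspection readings.
readings = [10.1, 10.3, 9.9, 10.2, 10.0, 10.4, 9.8, 10.1]

s = statistics.stdev(readings)
rng = max(readings) - min(readings)

# Sanity check: a standard deviation larger than the range is mathematically
# impossible, and usually means a value was mis-keyed (e.g. a lost decimal point).
assert s <= rng, "impossible stdev: check for a mis-entered value"
print(f"stdev={s:.3f}, range={rng:.3f}")
```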

 

So that doesn't answer your question of

 

how one knows what one doesn't know. But that's an example of, again, wrong but believable. That one was believable because occasionally our process would produce results that failed in that manner. But the way to get there is to study the subject matter.

 

Right. I mean, you have to go to school, get certifications. I have a number of certifications.

 

Learn the science behind what we're doing and make sure that you're competent at it.

 

Etienne Nichols: Yeah. Never stop hammering on your craft.

 

That's really good.

 

You know, another thing I was thinking about: I had a conversation with several people yesterday, and some of these things are bubbling up in my head here. Continuing down that road of hammering on your craft, becoming proficient in your field: QEs, and I actually almost lump human factors in with quality engineers, for some reason get a little bit of a bad rap.

 

They've gotten bad PR over the years, or maybe this is getting better, I don't know. But when I was in manufacturing, just kind of watching product development and quality engineering go back and forth, you know, I tried to stay out of it, but sometimes you get sucked in.

 

Some product development engineers seemed to have a little bit of an attitude:

 

they didn't really want QE getting involved in their stuff. Maybe they didn't want human factors getting involved in their stuff. And I've been thinking about this lately: why is that?

 

And I think inherently a product development engineer thinks, and maybe they're right to a certain degree: we tried to include human factors in the design of our product.

 

We tried to include quality measures in the development of our product.

 

What do you have to offer me?

 

And I, I'm just, I may be taking the devil's advocate side here, but I wonder what your answer would be to that.

 

I have my own answers or thoughts on it, but I'm really curious what your thoughts are.

 

Kevin Becker: I think there's another reason. I've been around a few decades, okay? Yeah, I think this part is getting better; hopefully it is. But if you go back far enough, the quality department used to be where some companies would dump the engineers that didn't quite cut it in other engineering departments.

 

So, there were a fair number of people who really didn't study the science and really didn't understand what was going on. So, I think that's part of it.

 

I think part of it is that the quality department has to deliver bad news a lot of the time, right? I mean, we're the ones digging through the data to find what's not working.

 

And we're often focusing on what's not working so we can fix it. Right. We don't spend a lot of time focusing on what is working.

 

So, I think that for maintaining credibility, the first one is you have to be competent. We've talked about that a bit already.

 

I think a second one is you have to be willing to help solve the problem. You can't be the person that just comes in and points the finger and then walks away and kind of, you know, brushes their hands.

 

You have to actually be willing to help fix the problem. If you find out that we didn't follow a procedure, well, maybe the procedure wasn't clear.

 

So maybe I should be the one volunteering to fix the procedure, to clarify it instead of just pointing out that the procedure is bad and somebody needs to fix it.

 

That'll go a long way towards, I think, correcting the perception, if you will.

 

Another thing is to be reasonable as well, right? I mean, I've seen instances where somebody will reject a document for a typo, where everybody knows the word is supposed to be 'the', not 'th' without the e.

 

And that rejection could cause a couple weeks of rework, depending on, you know, what the approval process is.

 

And I usually take the approach that if the intent is crystal clear, I'm not going to reject for something like that because it creates resentment among others. And the long game is a working relationship because we all have to work together, right?

 

I'm not willing to give up the long game for the short game. So, I will. I will not, for example, reject a document based on a small technicality that has no practical impact for the business.

 

Etienne Nichols: I really respected that when you wrote it, because when I was reading something similar in your book, I thought it was funny how you said it's actually difficult for me to write this article because of the perfectionist in me.

 

And I respect that.

 

But that's a really good point. You know, I've argued sometimes, if someone wants to correct your grammar,

 

instead of saying, and I'm trying to think of an example here, 'the place I want to go to' versus

 

'the place to which I want to go': the point of communication is to convey your thought. And so, if you've done that, you've accomplished your goal.

 

Obviously, there may be better ways to do it, but I really respect that.

 

I want to use that thought you mentioned about sequential approval and how long it can take, and really understanding the process, because I've experienced that: a change order sits on a desk for a while, and finally they get around to rejecting it.

 

And it's already been through, you know, regulatory and doc control and so forth, and now it's been rejected by product development and has to go all the way through that again.

 

One of the things you talked about in your book was flowcharts and understanding the flow of how those things should work and how to make them a little bit more efficient.

 

I'm getting to a question here, so thank you for your patience. But the question I have is: I've heard people say that if you put flowcharts in your SOPs, you can actually set yourself up for a trap.

 

Now, there are two things here: flowcharts in general for your process, and then flowcharts in your SOPs. Specifically for your SOPs, do you agree with that? Do you disagree?

 

How does that work with updating those things and making sure that having it in writing as well as in a flowchart doesn't trip you up?

 

Kevin Becker: Yeah, I've been caught by that. Auditors, right? An auditor will look at you and say, well, your procedure says something different than your flowchart does. And to jump right to the answer: whenever I write a procedure like that, I put in a statement along the lines of, in case of any discrepancy, the text takes precedence over the flowchart.

 

And then it's really tough for an auditor to write a finding. They can point it out and say, hey, you should really fix this. But it's really tough for them to write a finding.

 

So, yes, it is an audit trap, but there's an easy way around it. Hmm.

 

Etienne Nichols: Okay. Well, I think you found option C for me; I'd set up

 

a false dichotomy. Okay, there you go. Well, what about flowcharts in general? Can you give some examples of how they work, now that I have a way to use them?

 

Kevin Becker: Yeah, I like flowcharts because they can take very confusing text and make it easy to follow. And they also point out gaps in the process. Right, like the one you're referring to: a serial approval process that could be made parallel and essentially cut off 80% of the approval time.

 

And it's very easy to present to management the difference between the serial and the parallel, especially if you have data on the times taken for the serial process.

 

The longest single approval is the total time if you do it in parallel, as opposed to adding them all together.
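
The serial-versus-parallel arithmetic is easy to show with made-up approval times (the approver names and durations below are hypothetical):

```python
# Hypothetical review times, in days, for one change order's approvers.
review_days = {"quality": 3, "regulatory": 5, "product_dev": 4, "doc_control": 2}

serial_total = sum(review_days.values())    # approvers sign one after another
parallel_total = max(review_days.values())  # everyone reviews at the same time

print(f"serial: {serial_total} days, parallel: {parallel_total} days")
print(f"time saved: {1 - parallel_total / serial_total:.0%}")
```

The parallel total is just the longest single review, so the more approvers there are, the bigger the savings.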

 

Etienne Nichols: Yeah, that makes sense. Another issue I've seen with flowcharts: maybe it's in the appendix, wherever it is, but it's the flowchart you have to print off on massive engineering paper, because they've mapped out the entire process and you have to get your magnifying glass out to look at it. Do you have specific guidelines for setting up those flowcharts, like size, that make them actually a tool rather than a hindrance?

 

Kevin Becker: Well, first off, I like linear flowcharts. I've seen some flowcharts that have a lot of circular loops, and I don't think they help make things clearer. I think they actually tend to confuse the issue.

 

If they get large, I'll just split them across multiple pages, with connectors that go from one page to the next. I hear what you're saying; I've seen that, and I don't find it to be very helpful.

 

I should say it's helpful for the people putting it together. It's not helpful for communicating to others. So, if I'm communicating to others, I would generally try and break it up into smaller chunks and then have connectors in strategic places.

 

Etienne Nichols: Yeah, that makes sense.

 

I guess the point I'm drawing from what you're saying is: remember the why. Remember the reason you're putting it together.

 

Not to be pretty, not to be fancy.

 

Kevin Becker: Yes, to communicate. Always start with the objective in mind. Right. Or start with the end in mind.

 

Etienne Nichols: Exactly.

 

So, what are some other quality myths and lessons learned?

 

I saw that you had 34. I don't know how many you're adding to the second edition.

 

What are some of your top favorites?

 

Kevin Becker: Actually, the second edition is going to expand the scope a little bit beyond myths and lessons learned, maybe. But one of them is having a principle-based business or decision-making process.

 

And that came about over the years.

 

You know, in quality you get asked to do a lot of things that might not be considered ethical. And to be fair, a lot of them are just plain out of ignorance.

 

Right. Somebody will say, well, you probably found the only defect in the whole job. Can't we just sample it again?

 

And that's really an ignorance question. They believe what they're saying.

 

But in the quality field, that's known as unethical practice, because it's well known that even if a job is 10% defective, if you sample it often enough, eventually you'll find one sample that passes. And that's not considered appropriate practice. But you get put in a lot of situations.
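The arithmetic behind "sample a bad lot often enough and eventually one sample passes" is easy to sketch. The numbers here are invented for illustration: a lot that is 10% defective and an accept-on-zero-defects sample of 20 units, with draws approximated as independent.

```python
# Probability that one random sample of n units from a lot with defect
# rate p contains zero defects (independence approximation).
def p_sample_passes(p: float, n: int) -> float:
    return (1 - p) ** n

# Probability that at least one of k repeated samples passes.
def p_eventually_passes(p: float, n: int, k: int) -> float:
    return 1 - (1 - p_sample_passes(p, n)) ** k

p, n = 0.10, 20
print(f"one sample passes: {p_sample_passes(p, n):.1%}")   # ~12.2%
for k in (1, 5, 10, 20):
    print(f"{k:2d} tries: {p_eventually_passes(p, n, k):.1%}")
```

Even though any single sample clears a 10%-defective lot only about 12% of the time, twenty resamples pass it with better than 90% probability, which is exactly why resampling until you get a clean result is considered unethical.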

 

So, I started to go off Stephen Covey's book, The 7 Habits of Highly Effective People. He talks about living your life according to a set of principles, and I thought, well, why wouldn't that work for business?

 

It's even maybe more important for business.

 

There's an example in the book where I say, you know, start with four principles. The first one is we're going to obey the regulations.

 

The second one is we're going to comply with the quality agreements that we have with customers.

 

Third one, we're going to follow internal procedures, subject to 1 and 2.

 

And then the fourth one is we're going to do things as efficiently as possible, subject to 1, 2 and 3. And I found that that really helps me to have that in the background because, you know, I can be involved in discussions.
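The four principles form a strict precedence order: an action is only weighed for efficiency after it has cleared regulations, quality agreements, and internal procedures. A minimal sketch, where the check functions and the action dictionary are placeholders I've invented, not anything from Kevin's book:

```python
# Principles in priority order. Each check returns True if the proposed
# action is acceptable under that principle; these checks are placeholders.
PRINCIPLES = [
    ("obey regulations",           lambda action: action.get("regulatory_ok", False)),
    ("honor quality agreements",   lambda action: action.get("qa_ok", False)),
    ("follow internal procedures", lambda action: action.get("procedure_ok", False)),
    ("be efficient",               lambda action: True),  # only a tiebreaker
]

def evaluate(action: dict) -> str:
    """Test an action against each principle in priority order."""
    for name, check in PRINCIPLES:
        if not check(action):
            return f"blocked by: {name}"
    return "allowed"

# E.g., staying silent about a known problem fails at principle 2, so
# "do we tell the customer?" stops being a question worth debating.
stay_silent = {"regulatory_ok": True, "qa_ok": False, "procedure_ok": True}
print(evaluate(stay_silent))  # blocked by: honor quality agreements
```

The point of the ordering is that a lower principle can never override a higher one: no efficiency argument gets evaluated until everything above it passes.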

 

Well, we found a problem. Do we tell the customer or not? To me, that's the wrong question, because it has a really easy answer.

 

What does a quality agreement say? What do the regulations say? If they say we tell the customer, we tell the customer. It's that simple. The question then is, how do we tell a customer?

 

Because it doesn't do anybody any good to present false information or to present even the wrong impression of what's really going on. Are we even sure of our data? I mean, are we positive that we have a problem?

 

Sending a customer on a wild goose chase, as they say, does not help anybody. But it changes the whole discussion when the question is not "do we?" but rather "how do we?"

 

And it really helps move things forward, because there can be a lot of pressure, you know, to avoid something that might be embarrassing or expensive or a lot of work, if you will. Yeah.

 

So, I found that having that basis of principles helps, and there are more; that's just a brief example. I think another example in there is the DHR, the device history record, in the medical device industry.

 

That is the story of how the parts were built, and it has to tell the complete story. Well, what if you run into a case where the paperwork is messy and needs to be rewritten to present to the customer? And the customer even wants it neat. Right.

 

Can you rewrite it, or can you not? I think if you understand the purpose of the DHR, the answer is yes, you can rewrite it, if you keep the original and clearly annotate somewhere, maybe on both the original and the copy, that it was recopied for clarity.

 

Right. So, you keep both records. You're not hiding anything at that point. You're doing what the customer asked for, still complying with the intent of a DHR.

 

Etienne Nichols: That is a really great point, because I can remember actually running into that issue, where only one person could read another person's writing. That's just a really eloquent way to handle that situation.

 

The hierarchy of principles, too, is something I'd like to emphasize, because when I read that, or saw it in your presentation in Minnesota, I thought that was really powerful.

 

Regulations first, then quality agreements, then internal procedures. Right. Just understanding what's going to supersede what. I thought that was really fantastic.

 

Kevin Becker: Yeah. And I think part of that chapter, too, is normalization of deviance. I think that ties in here as well. I first found out about it in an article in the Star Tribune, the Minneapolis newspaper.

 

It was a person from the financial industry who just got out of jail, and he described a culture where it was very competitive.

 

So, to get your bonus, you had to start pushing the rules a little bit. And then everybody was pushing the rules. And to get the larger bonus, you had to actually break the rules and start pushing ethics till everybody did that.

 

Then to get the biggest bonus, you had to become unethical. And it eventually led to a point where you had to break the law to get the biggest bonus. Well, he did that.

 

He got caught. He went to jail. He came out, got out of jail, told the story.

 

If you do a search on Google, you can find a bunch of other examples. The space shuttle with the foam that broke off and damaged heat tiles.

 

That had happened in previous flights and originally caused a lot of consternation, but it became accepted as the norm until the one shuttle catastrophe.

 

And going back to those principles also helps us recognize when we might be starting down the path of normalization of deviance. It can happen to any business. And the insidious thing is it happens slowly over a period of months or years, and it happens in small steps.

 

So, it's not always easy to recognize. But if you have that firm foundation in principles, I think it makes it much easier. And the goal of any company should be to recognize it long before it gets to the point of an ethical, or even worse, legal problem, and then take action to correct it in a timely fashion.

 

Etienne Nichols: Yeah. I appreciate you putting the links to the articles in your book as well, because even though I had to type them out, I actually went to some of those and read about Challenger and the foam and so forth.

 

That was really interesting.

 

You've probably already written the second book, but I highly recommend including those again, because that was helpful. It was really cool.

 

Do you read a lot of other books, just out of curiosity?

 

Kevin Becker: Yeah, I've told this story too often, but I actually bought a calculus book. I have it on the TV stand next to the chair in the living room.

 

When the commercials are boring, I might read calculus. I also got a free book at the Minnesota Quality Conference because I spoke up; they were just trying to encourage participation.

 

And I thought that one might be a waste of time, but I'm finding it to actually be very interesting and useful.

 

Etienne Nichols: Which one is that?

 

Kevin Becker: I can't remember the name of it offhand; I wasn't planning on talking about it. It's about Harris Semiconductor as a company and how they really turned around some problems in one of their facilities.

 

And it actually touches on some of the things that we've been talking about today.

 

Etienne Nichols: Yeah, there's another book I was actually thinking of, and I wasn't planning to talk about this either. Honestly, I have books on my desk because I'm pretty much reading all the time.

 

Howard Root's Cardiac Arrest. I don't know if you're familiar with that one. I think it was a Minnesota company. It's Cardiac Arrest: Five Years as a CEO on the Feds' Hit List.

 

It's about an issue they got into with advertising and promotion, which might be interesting. It's a little downstream from manufacturing, but anyway. The reason I brought it up is that when you were talking about normalization of deviance, I glanced down; I just finished this book.

 

I actually don't know if I recommend it; it's heavy. It's called Ordinary Men: Reserve Police Battalion 101 and the Final Solution in Poland, by Christopher Browning.

 

Kevin Becker: No, I'm not familiar with that one.

 

Etienne Nichols: Basically, the entire book is about the subject you're talking about: normalization of deviance. And I wanted to read the last part, not to push back on what you're saying about it happening in steps, because I think that is the way it happens in business.

 

I totally agree.

 

But he talks about how instantaneously it can happen. I'm sorry, I hope you forgive me this tangent, but this is basically a group of 500 people from Hamburg sent to Poland, civilians, mostly with non-military backgrounds, commanded to do horrible things.

 

And his conclusion, after looking at a lot of different studies, he says, I fear that we live in a world in which war and racism are ubiquitous, in which the powers of government mobilization and legitimization are powerful and increasing, and in which a sense of personal responsibility is increasingly attenuated by specialization and bureaucratization, and in which the peer group exerts tremendous pressure on behavior and sets moral norms. And this is the phrase that I wanted you to hear.

 

In such a world, I fear modern governments that wish to commit mass murder will seldom fail in their efforts for being unable to induce ordinary men to become their willing executioners.

 

Now, that's a very extreme and heavy example, but it plays into exactly what you're saying.

 

Kevin Becker: I believe it can happen quickly. But in business, I think it's more common to have the slow and insidious kind, where we slowly migrate away. Another example would be: well, the procedure says this, but we don't really do it that way; we really do it this way. And then that attitude, that culture, progresses through the organization to the point where pretty soon we're not following anything our procedures say, because "we just don't do that."

 

Well, that's another potential example of normalization of deviance. It also ties back into that embarrassing story I told: people sometimes react in ways they normally wouldn't, due to direction from a superior, a manager, whoever. Peer pressure is a big part of it as well.

 

And I've seen peer pressure in discussions in this industry and in businesses. It goes back to the principle-based thing again, because that's the best way I've found to overcome peer pressure.

 

It's like, no, I can't do that.

 

Etienne Nichols: It's an important point, because young engineers may think: this is my only opportunity, this is my job; what other options do I have?

 

Like, if I get fired, how do I get another job? I don't know if you have any examples of this or thoughts. I'm sure you've hired different people.

 

What are your thoughts if someone gave a story like that during an interview? Or do you have any interview stories you might share?

 

Kevin Becker: It's actually a tough one, because you're only hearing one side in an interview. Right. If I got the sense that the person was being straightforward and honest, I would view it as a positive rather than a negative.

 

But you have to walk that line where you can't be seen as bashing your former employer in an interview. You have to tell the truth; you have to answer the question.

 

It's kind of a fine line. But as long as I thought that the person was being honest, I would view it as a positive, not a negative.

 

As far as the general concept of where you'd find your next job, I think people get more nervous about that than they need to. If you've got a solid background, education, experience, or, as you said, maybe not much experience but you've got the education,

 

I think it's not the end of the world that some people think it might be.

 

Yeah.

 

Etienne Nichols: Especially if you have that calculus book sitting on your table that you read when you're bored.

 

Kevin Becker: Yeah. I'm not sure I should have told that story. People think I'm weird already.

 

Etienne Nichols: Well, I think as an engineer you're supposed to have some of those nuances.

 

Kevin Becker: So that's good.

 

Etienne Nichols: So, this has been really great. I really appreciate you taking the time. Do you have any other thoughts you'd like to leave with the listeners today that you think would be beneficial?

 

Kevin Becker: Yeah.

 

The most important part of a quality management system. That's another one I think people don't always understand. I hear a lot of answers. Some people might say the CAPA system is the most important part.

 

Some people might say document control, or measurement: if you don't measure it, you can't improve it, whatever. But I think I can rebut any of those with a couple of simple questions.

 

You know, let's start with the CAPA one. Say the CAPA system is most important. Well, what if the CAPA owner's manager says, I need you to work on these other things, not the CAPA that's been assigned to you?

 

Or if you say document control is the most important, the rebuttal would be: what if the CEO is the person putting Post-it notes on the equipment, telling people to do something different from what the procedure says? Or, if CEO is too extreme for you, a middle manager, a lead supervisor, whatever. What if management is the one telling people not to follow the document control system? Measurement? What if you need a CMM and you have a ruler? Well, who authorizes the purchase of a CMM?

 

Somebody challenged me on it just recently and said that, well, quality is everybody's responsibility. You're giving a free ride to those people by saying it's management. And my response to that was who sets the culture?

 

Who sets the expectation that quality is everyone's responsibility?

 

I think my answer is clear.

 

Management responsibility is the most important part of a QMS. Without management support, we can achieve local victories, small ones, but we're never going to achieve the big cultural victory. So, I think that is the most important part of the QMS, and it's one I wanted to bring up today.

 

Etienne Nichols: That's a great point. I appreciate you bringing that up.

 

Just to add one question onto the back of that.

 

Are there things that you see management doing better than others in that management responsibility to promote that culture of quality?

 

Kevin Becker: I think whoever is the top management at the site has to be out front promoting it, talking about it.

 

I often see places where people are asking, maybe 10 times a day, are we getting the parts out? And then once a quarter, in an all-employee meeting, you talk about quality.

 

Well, talk to psychologists; I'm not one, but I think it boils down to this: what you ask about the most is what comes across as most important to people.

 

And if you ask about output 10 times a day and quality once a quarter, I think it's clear to most people what the priority is.

 

Etienne Nichols: Great point; that's very good. All right. Where can people go to find your second book, and when's it coming out, by the way? Do you know?

 

Kevin Becker: Well, that's my side gig, right?

 

I'm hoping by the end of '23. It's not that close; I actually just committed myself to doing it in the last month or two.

 

But it is my side gig, so I can't tell you for sure when it's coming out. Sorry.

 

Etienne Nichols: That's fair. No worries.

 

We'll put how to find your first book in the show notes, because it's fantastic. I'll put links there. Where can people find you?

 

Are you active in any way in social media?

 

It's okay if not.

 

Kevin Becker: I do have a LinkedIn account. I'm not that active on it; I check it when I get notified that something changed. But that's really it for social media.

 

Yeah, the book is on Amazon. The second edition will probably be black and white, because it'll be a heck of a lot cheaper.

 

The color makes it a little more spendy.

 

Etienne Nichols: Sure. Cool.

 

All right, well, we'll put links to your LinkedIn and so forth. So, if you as the audience have a question for Kevin or want to reach out and talk more, feel free to do that.

 

This is great. Thank you so much, Kevin. I really appreciate you taking the time to visit with us today and we'll let you get back to it.

 

Kevin Becker: Yeah, thank you.

 

Etienne Nichols: All right, thanks for tuning in to the Global Medical Device Podcast. If you found value in today's conversation, please take a moment to rate, review and subscribe on your favorite podcast platform. If you've got thoughts or questions, we'd love to hear from you.

 

Email us at podcast@greenlight.guru.

 

Stay connected for more insights into the future of MedTech innovation. And if you're ready to take your product development to the next level, visit us at www.greenlight.guru. Until next time, keep innovating and improving the quality of life.

 

 


About the Global Medical Device Podcast:


The Global Medical Device Podcast powered by Greenlight Guru is where today's brightest minds in the medical device industry go to get their most useful and actionable insider knowledge, direct from some of the world's leading medical device experts and companies.

Like this episode? Subscribe today on iTunes or Spotify.

Etienne Nichols is the Head of Industry Insights & Education at Greenlight Guru. As a Mechanical Engineer and Medical Device Guru, he specializes in simplifying complex ideas, teaching system integration, and connecting industry leaders. While hosting the Global Medical Device Podcast, Etienne has led over 200...
