Blaming and shaming nurses, doctors, pharmacists, and other healthcare workers for medical mistakes doesn’t improve safety or transparency.

We need accountability and reliability in our processes, but humans make mistakes, and those mistakes are almost always exacerbated by (if not primarily caused by) system problems. There is a better way to improve safety, prevent “second victim” syndrome, and encourage reporting of errors. It’s called Just Culture.

Check out this articulate and useful analysis of the RaDonda Vaught Vanderbilt nurse case from a Just Culture perspective.

Here is a useful tool from the National Patient Safety Foundation to help you implement Just Culture in your practice!

Check out the original Facebook video here and leave your thoughts and comments. If you’re running short on bandwidth and phone battery life, you can listen to the optimized audio-only podcast version here on iTunes and Soundcloud. Please subscribe and leave a review, it helps us a lot! The podcast is now available on Spotify as well, spread the word! FULL TRANSCRIPT BELOW.


– What’s up everyone, it’s Dr. Z. Check it out. Today I wanna talk about something really important in healthcare and beyond healthcare, actually. And that is getting blame out of the system.

In other words, in healthcare errors happen all the time. It’s a really complex system. And people have said since the Institute of Medicine Report in 1999 that we’re basically crashing a jumbo jetliner full of people every single day just from medical errors. And the question is how can we make this better?

Is the answer punishing people? Is the answer blaming people? And we recently talked about this Vanderbilt nurse who was arrested for making a medical error that led to a death. Is this the right way to do it?

And I’m gonna argue absolutely not, it is not. There is a better way. In the show notes for today’s show I’m gonna link to a National Patient Safety Foundation flyer that outlines what we call “Just Culture”. Just Culture is a fancy way of saying: instead of blaming people, let’s actually improve the system in a nonjudgmental way, hold people accountable where it’s necessary, but also hold the system accountable so we can improve it and make sure we make fewer mistakes.

Now there are two big myths that we operate on in the world that damage our ability to actually reduce errors. One of them is that if people just try harder, they’ll make fewer mistakes. This is simply not true. This is not how humans work. People who go into medicine and healthcare are really, really smart, dedicated, passionate people who care about other people. They don’t make mistakes because they’re lazy. They make mistakes because they’re human.

The second big myth is that if you punish people for mistakes, they will make fewer mistakes. These are mistakes; we’re human. Now, there’s a role for discipline when it comes to reckless errors and intentional errors, yes. You can deter those with discipline. But not regular old mistakes. So with that framing in mind, let’s go through a way we can apply a Just Culture model to mistakes that might happen in healthcare.

Okay, think of it this way: there’s a green light, a yellow light, and a red light. The green light is the first level of looking at mistakes, and it asks: did the mistake involve a violation of the standard of care? In other words, did you deviate from what most people would do in this case, from normal practice? If the answer is no, but a bad outcome still happened, what do you do? You console the person this happened to. And that includes the patient, but also the caregiver. A good example: a pediatrician gives amoxicillin to a child for an infection, and the child develops anaphylaxis and dies. No one knew the child was allergic; they’d never been allergic to anything in the past. This was standard practice, but a bad outcome happened.

Now, the part about consoling the healthcare provider is actually crucial, because this is a horrible thing to go through, and people get what’s called the second victim effect, where the caregiver takes on the emotional burden of having made the decision or error that injured another person. They take it home, they beat themselves up, and sometimes they quit their jobs, sometimes they hurt themselves, sometimes they die by suicide. That’s the second victim effect. So consolation, support, and counseling are a crucial part of this. So that’s a green light: you didn’t violate the standard of care, but there was still a bad outcome. Those are hard to prevent. They require bigger systems changes and a better understanding of the science.

The second class of errors is in the yellow light category. This involves an error where the standard of care wasn’t followed. In other words, there was a deviation. And when you think about what happened in the Vanderbilt case, where an incorrect drug was given to a patient who wasn’t monitored in a radiology suite, that’s a good example of a violation of the standard of care.

Now in this case, you need to ask two questions. The first is: was this intentional? Not intentionally making the error, but did they intentionally violate the standard of care? If the answer is yes, that’s one row on a grid. If the answer is no, that’s another.

The second question you have to ask, to make this grid with four squares in it, is something called the substitution question: could a similarly trained, competent individual make the same mistake in the same situation? In other words, could one of your colleagues, with the same level of training, possibly make the same mistake? If the answer is yes, that’s one column; if the answer is no, that’s another.

So let’s go through this. What happens if the answer is yes, another person could have made this mistake, and they didn’t intentionally violate the standard of care? It was unintentional. Well, in that case, that’s a straight human error. You counsel and console that person. The mistake could have happened to anyone. We look at the systems improvements that might reduce the chances of this mistake happening again, and we train the person up. Okay? No blame. We hold the system and the person accountable to improve. Simple as that.

Now what happens if that person violated the standard of care intentionally, but another person might have done the same thing? So the substitution question is yes, another person could have done it, but it was intentional. Say the nurse said, you know what, I’m gonna mix up drugs for multiple different patients at the same time, and my colleagues are doing the same thing. Well, in that case, that is a risky behavior, and the person needs to be trained, but you also need to look at the system and ask: why is this behavior so common? Why would other people make this mistake? There’s some hole in the system, or in our training, where this is allowed.

It’s an interesting thing. There’s something called “normalization of deviance”. And it sounds like some weird sexual thing, but it’s not. In healthcare, normalization of deviance means that, as a tribe, we start to deviate from best practices. We shift, and we go: you know what, I’m gonna turn that alarm off, I’m not gonna listen to that alarm. Or: you know what, we all kind of override the Pyxis to get the med we need when the order’s not there. Everybody does it; it’s just a thing. And that becomes normal behavior. Then a mistake happens. So that involves looking at the system, looking at the individual, and training the whole cohort to improve. And we just gave an example of that.

Now, what happens if there was no intention to violate the standard of care, but another person would never have made that error? In other words, the substitution test fails: your colleagues, trained the same way, wouldn’t have done that. You didn’t know you were making an error, and a mistake happened. That means the person simply isn’t well trained. Either they’re not competent to do what they’re doing, or they’re poorly trained, something along those lines, assuming there was no intent. Well, in that case you need to train that person, and make sure they’re in the right role, doing the right thing. And if the training isn’t successful, then they probably need to be in a different position.

Now what happens in the last square of the grid, where another person wouldn’t have made the mistake, and the person violated the standard of care intentionally? Well, in this case, that’s simply reckless. That’s reckless behavior, it’s risky, and that’s where there’s a role for discipline. So in that case, the person needs to be pulled aside and told: listen, what you’ve done here is not okay, and there are gonna be some consequences, to deter reckless behavior in the future. And we’re gonna look at the system to make sure this kind of behavior is caught quickly and is less likely to occur. So that’s how you look at the yellow light framework. A good example might be a surgical resident who flagrantly violates protocol and puts in a line using an unclean technique, because he or she is in a hurry, knows they’re doing it, and just doesn’t care about the consequences. Well, that requires discipline, all right? So that’s the yellow area.

What about the red zone? The keyword here is impairment. Here’s a good example: substance abuse. Let’s say there’s a surgeon who’s out drinking with his friends while he’s on call. Then he gets called in to do a gallbladder emergently. He goes in drunk, does the surgery, and there’s a bad outcome. That is an impaired clinician, and that requires immediate escalation, immediate action, whether it’s rehab, discipline, systems changes, or all of the above. Red zone: do something right away, okay?

The second version of that is a clinician or healthcare provider who actually has a health problem that affects their ability to provide care, or who’s on a medication that does. A good example would be a surgeon with Parkinson’s disease that’s undiagnosed, or that he’s in denial about, who has a tremor and has a bad outcome. And actually, my father suffered at the hands of a surgeon who had this condition and had a bad outcome. The question is, how do you manage that? Is it a culture of blame, or is it a culture of, now we have to take this person, treat them, make sure they’re out of the system, et cetera? And again, the point of Just Culture is you do it nonjudgmentally, without a lot of emotion, in a way that improves safety.

Now the final version of impaired in the red light zone is someone who’s just a malicious piece of shit. When that happens, this is a case of being a dick in a no-dick zone: an evil person doing bad things, meaning they’re consciously trying to hurt people. And we’ve seen examples of this, like a nurse in a nursing home raping a woman in a vegetative state. That is simple criminal behavior. Do not pass go: discipline, the courts, all of that.
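To make the walkthrough above concrete, here’s a minimal sketch of that green/yellow/red logic as a tiny decision function. The function name, parameters, and response labels are illustrative shorthand for the framework as described in this episode, not an official Just Culture algorithm, and as noted at the end of the show, real cases have gray areas that no flowchart captures.

```python
# A minimal sketch of the green/yellow/red framework described above.
# All names and labels here are shorthand for this episode's walkthrough,
# not official Just Culture tooling.

def just_culture_response(impaired_or_malicious: bool,
                          deviated_from_standard: bool,
                          intentional_violation: bool,
                          peer_could_err: bool) -> str:
    # Red light: impairment (substances, untreated illness) or malice.
    # Escalate immediately: treatment, removal, discipline, or the courts.
    if impaired_or_malicious:
        return "red: escalate immediately"

    # Green light: no deviation from the standard of care, but a bad
    # outcome anyway. Console the patient AND the caregiver (to prevent
    # second victim harm), and look for bigger systems changes.
    if not deviated_from_standard:
        return "green: console patient and caregiver; pursue systems fixes"

    # Yellow light: a deviation happened. Two questions make the 2x2 grid:
    # was the violation intentional, and could a similarly trained peer
    # have made the same mistake (the substitution question)?
    if not intentional_violation and peer_could_err:
        return "yellow: human error; console, retrain, improve the system"
    if intentional_violation and peer_could_err:
        return "yellow: risky (normalized) behavior; coach and fix the system"
    if not intentional_violation and not peer_could_err:
        return "yellow: training gap; retrain or move to a different role"
    return "yellow: reckless behavior; discipline plus system safeguards"


# Example: the amoxicillin anaphylaxis case from above lands in green.
print(just_culture_response(False, False, False, True))
```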

So now let’s bring it all back together. In a system with an incredibly complex number of moving parts, where people’s lives are at risk every single day, and where we have human beings who are competent, who care very much, but who make mistakes, how do we solve the safety issue? Do we blame people? Do we put people in jail for making mistakes? No. We apply a Just Culture. We improve the system. And people will say, well, it’s impossible, because medicine is a special case. Look at the airline industry. They’re the shining example of how Just Culture can improve safety. The safety record of the airline industry has consistently gotten better through a no-blame culture of constant improvement.

We owe it to our patients, and we owe it to each other, to do this together in healthcare. It’s not a bunch of corporate speak, it’s not administrators trying to control us; it is a better way of keeping our patients safe. So here’s the call to action. Share this video. Tell people about Just Culture. In the show notes will be the algorithm that you can look at. There are gray areas. We don’t apply algorithms like computers in medicine; we look at the gray, and we treat people and situations as humans would. Hit like, hit share, become a supporter, and you can get CME for future episodes. And we out, peace.