Listen:
Check out all episodes on the My Favorite Mistake main page.
My guest for Episode #277 of the My Favorite Mistake podcast is Derek Leiner, MD, FACP.
Dr. Leiner trained in internal medicine at VCU Health and the Richmond VA Medical Center. In 2018, he completed a year as Chief Resident for Quality and Safety, a national VA QI and safety training program, and then began as a staff physician at the Richmond VA Medical Center.
His career has included teaching, education leadership as an Associate Program Director for a medicine training program, and safety culture leadership. Derek currently works as an academic hospitalist and is the physician champion for high reliability at the Richmond VA Medical Center. He has a passion for humanism, collaborative care, and just culture.
In this episode, we discuss a medical error involving a lumbar puncture procedure and the subsequent emotional impact on the healthcare professional involved. We explore the distinction between near misses and patient harm, highlighting the importance of learning from both. The concept of Just Culture is introduced, emphasizing a fair and supportive approach to addressing errors. We delve into High Reliability Organizations and their focus on creating a culture of safety. The significance of open communication with patients and the potential for positive outcomes is also addressed.
Additionally, we examine the “second victim” phenomenon, where healthcare professionals experience emotional distress following errors. The episode concludes with practical strategies for recovering from mistakes and fostering a culture of continuous learning and improvement in healthcare.
Questions and Topics:
- What is your favorite mistake?
- Derek's LinkedIn article about a mistake
- Is this a near miss or patient harm?
- Did I understand your definition of a near miss correctly?
- What was the reaction and response to the incident?
- Can you explain the concept of high reliability organizations and its relevance to healthcare?
- What was the patient's reaction to the disclosure?
- What is your reaction to Dr. Mayer's story?
- What are your thoughts on Just Culture?
- How do you coach others to recover from mistakes and combat negative self-talk?
Key topics discussed:
- Lumbar puncture incident & its emotional impact
- Near miss vs. patient harm
- Just Culture & its implementation
- High reliability organizations (HROs)
- Patient disclosure and reactions
- Second victim phenomenon & support
- Recovering from mistakes & learning
- Systemic factors, human error & normalization of deviance
- Importance of continuous learning & improvement
Scroll down to find:
- Video version of the episode
- How to subscribe
- Quotes
- Full transcript
Find Dr. Leiner on social media:
Video of the Episode:
Quotes:
Subscribe, Follow, Support, Rate, and Review!
Please follow, rate, and review via Apple Podcasts, Podchaser, or your favorite app — that helps others find this content, and you'll be sure to get future episodes as they are released weekly. You can also financially support the show through Spotify.
You can now sign up to get new episodes via email, so you don't miss one.
This podcast is part of the Lean Communicators network.
Other Ways to Subscribe or Follow — Apps & Email
Automated Transcript (May Contain Mistakes)
Mark Graban:
Welcome to My Favorite Mistake. I'm Mark Graban. Our guest today is Dr. Derek Leiner. He trained in internal medicine at VCU Health and the Richmond VA Medical Center. In 2018, Derek completed a year as Chief Resident for Quality and Safety, a national VA QI and safety training program, then began as a staff physician at the Richmond VA Medical Center.
Mark Graban:
So before I tell you a little more, Derek, welcome to the podcast. How are you?
Dr. Derek Leiner:
Thank you very much for having me. I'm doing great. Really excited to be here.
Mark Graban:
Well, great. It's good to have you here. I'm sure there's no shortage of jokes about the double meaning of Richmond VA: the Richmond VA in Richmond, Virginia.
Dr. Derek Leiner:
Yeah. Yeah. All the time.
Mark Graban:
I knew that wasn't, like, a uniquely clever thing for me to point out. I couldn't help it. But we have more serious topics to discuss here today. A little bit more about Derek: his career has included teaching, education leadership as an associate program director for a medicine training program, and safety culture leadership. He currently works as an academic hospitalist and is the physician champion for high reliability at the Richmond VA Medical Center.
Mark Graban:
And we'll be able to talk about some of those topics and more later in the episode. So I look forward to that discussion. But as we always do here, Derek, as you know, the key question I like to ask and start off with, what would you say is your favorite mistake?
Dr. Derek Leiner:
Well, thanks again for having me. And, you know, I wish I could say there are only a few, but we are human, and we do make mistakes. I wanted to share a story from a time on clinical medicine. I was just coming on to a rotation as an attending. The team also had new interns starting that same day.
Dr. Derek Leiner:
So a lot of new faces on that case. We were starting that week with a very heavy patient load, a lot of patients. We were actually over what is usually a cap for our team, and that was just sort of the way the admitting structure worked. So there were a lot of things going on: a lot of complex illness, a lot of challenging situations, with a couple of hard conversations to have with families. I'm just painting the picture of a chaotic day.
Dr. Derek Leiner:
There was a young gentleman who was on that service. He had a worsening neurologic injury, and we didn't really know why that was happening. Part of the workup was going to be doing something called a lumbar puncture, where we go into that spinal space, remove some of the fluid, and send it off for testing. It got to be the afternoon when we were planning to do that procedure. The way this worked at the hospital where I was at the time was that the inpatient team would need to try first before a more experienced operator would step in and help. And so we were going to do the procedure at bedside.
Dr. Derek Leiner:
As we were preparing our materials, something came up. The resident I was with said, hey, you know what? I'm going to go take care of this discharge situation. Is it okay if you do the lumbar puncture with the interns? I thought that was completely fine.
Dr. Derek Leiner:
So we went to bedside. We do something called a timeout, which is just a checklist of things to make sure the patient is going to be safe for the procedure: the patient knows what's going on, they've consented for the procedure, they know the risks. And we look at making sure that no one's on a blood thinner, because doing a procedure on a blood thinner, especially going into the spinal space, can be dangerous.
Dr. Derek Leiner:
So I looked down at this handoff tool that we use, looked at the patient's medications. The blood thinners were not on that list. And so we proceeded to do the procedure. This patient was a larger gentleman. He also, because of his neurologic injury, had some spasticity.
Dr. Derek Leiner:
He was in a lot of pain trying to lie down, and so it made the procedure just technically difficult. And unfortunately, we were unsuccessful in getting into the spinal space. So we were going to back out, bring the needle out, and talk with our experts for some help. And as the intern removed the needle, all of a sudden, a lot of blood is coming out of this gentleman's back. And so I jump in right away and hold pressure.
Dr. Derek Leiner:
I'm trying to rapidly think of what could have just happened; I was reviewing the anatomy again in my brain, just trying to think: what is happening? What did we hit? And fortunately, the bleeding was very quickly controlled. The vital signs were completely fine. The patient had no additional pain.
Dr. Derek Leiner:
Everything ultimately was okay. But there was that moment there that was very worrisome. Ultimately, about an hour or two later, I get a text message from the resident: you know, I need to talk to you. And I knew as soon as I got that message from her what the problem was going to be. And we found out that the patient actually did receive a blood thinner that morning.
Dr. Derek Leiner:
Enoxaparin is a very typical medication that we use for patients who are hospitalized, to prevent the blood clots that people can get when they're stuck in a hospital bed for several days. There is a warning with that medication that it should not be given around procedures. And it's controversial for spinal procedures, but really, my practice is always to make sure no one's gotten it before we do a procedure going into the back. And unfortunately, that was something that we had missed. So we went back, disclosed to the patient, let him know what was going on, and told him what the steps were going to be to try to move forward. But that's something that still weighs on me: that we had done that procedure when he had received that blood thinner.
Mark Graban:
Yeah. When you say there was a miss, was it that it hadn't been charted, or was it missed in reviewing the chart before the procedure?
Dr. Derek Leiner:
So what I had used at bedside was that written sign-out list of medications. It is a tool that we use, and there is some potential for error with that tool, because it requires someone to be putting that information in every day. When we looked back and tried to investigate what happened with this event, we realized that the team over the weekend had thought the patient was off the medication. And the patient was off the medication. But as someone was finishing up some work late Sunday evening, before our new team took over, it was like eight or nine o'clock.
Dr. Derek Leiner:
They went ahead and threw that enoxaparin back in on the patient, because it's a quality measure that we are tracked on: we want to make sure that people who are hospitalized do have that prophylaxis, those blood thinners. And so this person threw that in there, and it was just unbeknownst to the team. So it wasn't added to the medication list before rounds. And so the tool that we were using, the list that we had, was wrong for that reason.
Mark Graban:
Right.
Dr. Derek Leiner:
Right.
Mark Graban:
So would you consider this a quote-unquote near miss or a patient harm? I mean, you said you stopped the bleeding, everything was okay. Either way, it's a learning opportunity. So maybe, I don't know, does that designation even matter, even informally, near miss versus harm?
Dr. Derek Leiner:
Well, I think the designation matters, and there are, unfortunately, a couple of different definitions. But the ones that I like are: if this was an unsafe situation, or there was the potential for a patient being harmed, and something stopped that event from happening, that would be a near miss. Or if dumb luck prevented the patient from getting hurt, that's still a near miss. If someone actually does have harm, and it can be that something intended was missed or something was done when it shouldn't have been done, I consider those to be harm events, and those have different levels of severity as well.
Dr. Derek Leiner:
Yeah, but I'm sorry, the only thing I was just going to say was that my hope when we investigate events is that we take them seriously no matter what. I think we sometimes falsely believe, oh, the person didn't get hurt, I am so glad. Actually, if we do the work to understand why even a near miss or a close call happened, it can be a canary in the coal mine. Sometimes it can be a clue to something much deeper that is a larger threat to an organization or to a team.
Dr. Derek Leiner:
And so I think the work of investigating that is needed no matter the severity.
Mark Graban:
So do I have it right, by what you shared about a near miss? If you were about to do the procedure and somebody came running over at the very last minute and said, Dr. Leiner, wait, they're on a blood thinner, that might be more in that near miss category.
Dr. Derek Leiner:
You know, when we talk with teams, or when I teach residents about safety culture and our duty to report and what that means for making our healthcare system safer, either one should have been reported and investigated. So even if someone runs into that room and says, hey, this is something we need to stop, we still need to explore why it was that we were using a tool at bedside that was inaccurate.
Mark Graban:
Right. Right. So what do you remember about the reaction and the response to that? I'll use the word incident. I don't mean that in a judgmental or blaming way, just as another word for the event.
Mark Graban:
What do you remember about the response? What I'm hoping is, you know, that it was more focused on understanding and learning than a sort of punitive blame game. What do you recall in terms of what happened afterwards?
Dr. Derek Leiner:
It's weird to say it this way, but one of the reasons that this is a favorite mistake is because of the opportunity to help the team afterward. Personally, my response was one of intense guilt. I mean, my stomach sank, I felt lightheaded, I felt nauseous. I remember going back to the room to disclose to the patient and how shaky I felt, you know, how embarrassing that was, when we are trying so hard to provide a transformative healing experience, to take great care, safe care, of patients.
Dr. Derek Leiner:
When something like this happens, I feel like I just didn't live up to my goal and my duty as a physician. And then, as we were debriefing the event in the team room, kind of talking about what had happened, I could see it.
Dr. Derek Leiner:
The body language was clear that everyone felt the exact same way. In part of my education work, I had developed some curricula around how you work through error, and I've had practice myself in working through error. And so that was an opportunity for me to actually use some of those tools with that team. Some of the members of the team had just become senior leaders or senior residents, and so it was a great chance to be able to teach how to work through that.
Dr. Derek Leiner:
That's kind of the immediate response for the team involved. And then, of course, we want to make sure that we report what happened so that we can investigate it, and then there is an organizational response in that investigation. And I'm happy to say that felt non-punitive. All of the conversations were: what could have happened? We know this was not something you meant to do.
Dr. Derek Leiner:
What were the system pieces that played a role? And that led to those much more fruitful conversations of, okay, here are the things that we can change so that when humans interact with systems, we aren't relying on a system that is inaccurate. For example, the conversations around how we build a handoff tool that maybe has more accurate information on it.
Mark Graban:
Right. And the one thing that stood out to me, correct me if I'm wrong: you talked about a handwritten, or a written, or was it a printed, list of medications? I'm trying to think, like, how did that not get updated? That would be worth digging into. Right?
Mark Graban:
Not who didn't update it, but.
Dr. Derek Leiner:
Right, right. Absolutely. It's a Word document. And so it's printed each morning.
Dr. Derek Leiner:
It's usually updated at the end of the shift every day, before we hand off to a night set of physicians. And so when a medication had changed in the middle of the night prior, that wouldn't have been captured until the next day. And that's exactly the system piece we've been so passionate about fixing since then. We need to figure out a way, maybe some sort of software that pulls that information from the chart and doesn't rely on a human to remember to update it.
Mark Graban:
Right. Because when you say relying on a human to remember it: we're human, we forget, we get distracted, we're under time pressures. There are systemic factors. And when you say, well, it usually gets updated.
Mark Graban:
Like, to me, one outcome of having a high reliability organization would be that the "usually" has become an "always." So there's jargon and terminology that the listener might not know. Could you talk briefly about this concept of high reliability organizations and what it means in healthcare, in terms of preventing and learning from mistakes?
Dr. Derek Leiner:
Yeah, absolutely. I think a lot of the conversation around safety in healthcare started in the late nineties with a report that came from the Institute of Medicine called To Err Is Human. They estimated in that report that up to 100,000 people were dying in American hospitals because of human error. And it's been studied over and over again since then, and it's hard to say what the exact number may be.
Dr. Derek Leiner:
And we don't need to get into some of the controversy around those numbers. But I think we can't argue with the fact that error happens, mistakes are happening, and it has patient consequences. And so we rally around this; we embrace this idea that we must make healthcare safer. And I think we've embraced that.
Dr. Derek Leiner:
Well, the progress toward that aim has been slow. It's been 25 years, and we're still not seeing the minuscule accident rates that we see in aviation or naval operations or nuclear power. We aren't matching that, and we have to figure out why. And I think that trying to understand that why led to investigation of how those other, safer systems work, and that led to this idea of high reliability. And you can find several different definitions of this also.
Dr. Derek Leiner:
The one that I like is that high reliability organizations, or HROs, are those complex organizations that have fewer than their fair share of accidents, despite their complexity. And healthcare certainly is that. It's a system of microsystems, and we have to move a patient through all of those different microsystems to provide complete care. And so, how do we do that without hurting people?
Dr. Derek Leiner:
It's been a big question.
Mark Graban:
Yeah.
Dr. Derek Leiner:
I think when you hear that definition, sometimes you think of HRO as an entity. And what I like to teach and remind folks is that HRO is, for me, a mindset. It's a combination of the collective behaviors and collective beliefs and the culture of an organization that drives our goal to have zero harm. So there's a series of behaviors and values and beliefs that we try to teach and adopt so that we are always thinking about where error may be: recognizing where threats may be coming from, situational awareness, feeling safe with each other to speak up when things don't feel right, trusting each other when someone says something isn't right, celebrating speaking-up behaviors. All of these things tie into a shared mental model of what threats are coming at a team, or toward a patient, so we can stop them before they happen.
Mark Graban:
Yeah. Yeah. And in this case, one other question I wanted to ask was, what do you remember about the patient's reaction to this being disclosed?
Dr. Derek Leiner:
I was so glad that he gave us so much grace, you know? I remember, when we were learning about disclosing bad news or disclosing error through medical school or residency training, that a lot of the teachers or educators would talk about how, a lot of times, the patients understand. And what we really need to do, when we're taking care, one human taking care of another, is be honest and be transparent, not trying to hide it and certainly not taking it lightly. And so we tried to be very honest: this is something that we take very seriously.
Dr. Derek Leiner:
We want you to have the best care. This should not have happened, and we feel so terribly about it, and we feel so sorry. And I always like to remind them that it will be reported; I promised him that, and I followed through on that promise. And we are going to figure out the system pieces that played a role here, and we're going to look for why this happened. Outside of the human mistake, what's the system piece that played a role?
Dr. Derek Leiner:
And so we told that to him. We told him, this is what happened. And he was so kind. He was like, hey, I get it. I have been here before.
Dr. Derek Leiner:
This is the fourth time I've been admitted. You guys have been helping me. So I know mistakes happen, and it's no problem. You know, thank you for reporting it. And interestingly, that made it feel worse.
Dr. Derek Leiner:
I don't know why, you know, his reaction... I mean, you try so hard, and then they're so understanding. It's just like, I'm so glad to be taking care of you. And I don't know. I don't know why it made it feel worse. It just did.
Dr. Derek Leiner:
You know, it's almost like an unexpected response. I feel like I would be angry if that happened to me. And to see someone give us so much grace, it just felt so good. That was really nice.
Mark Graban:
And your story and what happened there is kind of an interesting contrast. I forget, did I point you back, when we talked before, to Episode 70 of the podcast with Dr. David Mayer? So, what had happened there, quick synopsis, and David was kind enough to follow up with me and flesh out some other details.
Mark Graban:
That story is also in my book. What Dr. Mayer was describing: he was a resident anesthesiologist, and this was, I guess, a couple of decades earlier, when there were kind of multiple pecking orders, resident to attending anesthesiologist to surgeon. To me, as a patient, I want everyone to be a team. I don't want there to be a pecking order. But there was a wrong-side incision made, and the story Dr. Mayer recounted was basically the attending surgeon lying to the patient and saying, you got two procedures for the price of one.
Mark Graban:
And the patient was happy and thankful, and Dr. Mayer regretted not speaking up. But even in the moment, in the episode, I tried to console him a bit: like, well, please don't beat yourself up, because there's this culture factor. He predicted, probably correctly, that he would have gotten in trouble for speaking up and for not going along with the lie. And his medical education did not address what you're touching on, how to share something with a patient, as uncomfortable as it might be. So, you know, I think there are some different standards or expectations over time.
Mark Graban:
I'm glad you had a situation where there could be that candor and that more constructive follow up.
Dr. Derek Leiner:
Yeah, I remember that story. I found it striking, and it made me reflect on my experiences, thinking about how hard it is to disclose something to a patient. And I think there's always the concern that they've put their life in your hands, and if you've made a mistake, that there's going to be litigation or a tort or things like that. And I find that that may lead people to want to hide things. I think that the report from the IOM in 1999, and others, and the conversation around healthcare becoming safer in our country, have led to more of those curricula focusing on honesty, transparency, humanism, collaboration, and disclosing error.
Dr. Derek Leiner:
And so I was glad to benefit from that in my education, to have some of those tools in my toolbox to talk with that patient.
Mark Graban:
Mm hmm. Yeah. I mean, it's said, and I think there are studies, though this is not my expertise, but what I've heard colloquially is that patients sue less often when there's a disclosure and a sincere apology.
Mark Graban:
And I know some people, who I have run across in patient safety circles, whose child was harmed or even killed by medical error. And, you know, there's grief, there's sadness, there's all kinds of emotions. But then there's a level of anger when they feel like they're being stonewalled, like they're not getting answers. And then sometimes it seems like legal action is as much about getting answers as it is about wanting to punish anybody.
Dr. Derek Leiner:
Yeah, I don't know. I've heard the same thing, but I don't know the data behind it either. Whether I'm working with the patient directly, when I'm on a non-teaching service, or working with house staff trainees and trying to model physicianship for them and teach them physicianship, ultimately, my hope is that I can really help the patient understand that I see them as a human, and not only as someone who's sick, or someone who's only got a disease or a primary problem to manage inside the hospital. And so I've been trying, and this is a more recent development, to be honest, to be much more brave and much more gracious when someone is asking to talk with my supervisor or with a patient advocate, or if my care gets reviewed by a peer under something called a peer review process, if they're worried that I made a mistake or did something that's not standard. And those things are rare, fortunately.
Dr. Derek Leiner:
But I'm trying to practice seeing the value in all those things. And so if someone does want to move toward litigation because they don't feel like they're getting the answers, I try my best to be honest, and I try to support them. I do want them to have the best experience, and I want to hear the feedback, too, so we grow. Because they're experts in their care, they're experts in their body; there are things that they know that we need to learn so that we can take the best care of them. And so if going through a lawyer, or talking with patient advocacy or the patient experience center, is the way to do that, I really want to try to support that, even though it can feel really hard for what that might mean for me in terms of punishment or discipline or things down the road.
Mark Graban:
Yeah. Because I like to think, if I were in that situation, if I were harmed by a mistake, my focus would be on wanting the event, the incident, to be reported, not the person. We hear that word reported, and different people might jump to different conclusions. But I think of even completely harmless and benign customer service mistakes. I hesitate sometimes to point one out or complain to a company, because I'm worried that somebody might get blamed or punished for something that's really a systemic problem. I probably have a higher sensitivity to that than most.
Mark Graban:
I'm not saying I never complain, but there are times where I really do think that, sadly, sometimes the default is a response of, and I learned this phrase in healthcare, naming, blaming, and shaming. Sometimes people will say, you know, it's not worth bringing it up, because so-and-so was doing their best, and it was human error, probably some systemic cause, and they'll say, I'm gonna just let it go.
Mark Graban:
It comes back to organizational culture, just human dynamics, things that are pretty common.
Dr. Derek Leiner:
Yeah, yeah, I agree. And it reminds me of that concept of just culture, or the just culture response to error. And learning about this has actually changed the way that I watch processes.
Dr. Derek Leiner:
Like, I'm going through a drive-thru or something, and they get orders mixed up. It's like, they're doing their best. You're right, like you're saying.
Dr. Derek Leiner:
And it was the system that failed that person, even though they were trying their best. And we certainly teach that, and that's how I try to operate when I am involved in safety event investigations. But I don't want to minimize how hard that can be. I think a lot of times it's very easy for us, part of just human nature, to say: this person made the mistake.
Dr. Derek Leiner:
They've got to try harder. But I think when you look at some of the literature and the opinions of why healthcare hasn't changed quickly in 25 years since that landmark study, it's that we are so willing to say that people need to try harder. Human vigilance is the answer.
Mark Graban:
Yeah, be more careful.
Dr. Derek Leiner:
Be more careful. You've got to reread this policy. Make sure that you get this retraining. We'll put you through some coaching. And those things can be important for personal growth.
Dr. Derek Leiner:
Humans should grow in their roles over time, but it misses those system pieces, like you were just talking about, that also need to be fixed.
Mark Graban:
Yeah, I've got some colleagues in healthcare. They share my view, if I made some sort of noise when you talked about retraining. I kind of try to question: well, if the training the first time didn't prevent this situation, I'm not convinced retraining would have anything more than a temporary effect. I mean, this is crude math here, but two times zero equals zero.
Dr. Derek Leiner:
Yeah.
Mark Graban:
And I'm not calling anyone involved a zero. But, you know, training is, to me, a systemic factor. A quick example, again, of something completely unimportant compared to medical error: I remember one time I was at a Starbucks. I ordered a nitro cold brew, which is kind of my favorite beverage, and the barista behind the counter dumped a bunch of ice into it. Now, I did kind of gently point out, like, well, you know, that beverage isn't supposed to have ice. Could you remake that?
Mark Graban:
And then I quickly realized that, oh, she was kind of being shadowed. A more experienced barista or somebody was then pointing out, oh, well, see, no... it was after the fact. It was training through reacting to mistakes. And, I don't know, it seemed like that would put the barista, the new employee, through unnecessary stress. Because, like you said about your lumbar puncture mistake, it wasn't intentional.
Mark Graban:
She was, you know... and I'm like, oh, gosh, take your time and train people. There's probably hundreds of different mistakes a barista could make.
Mark Graban:
I'm jumping to an assumption that the training was completely based on reacting to mistakes. But, yeah, I hope she didn't get in trouble, because it was easy enough to remake a drink; it was just some extra, unnecessary cost to Starbucks.
Dr. Derek Leiner:
Right? Yeah, absolutely. I mean, no matter what industry you're in, humans are humans, and we're beautifully imperfect, and we have to recognize that. And I worry... again, I've only ever worked in one industry, so I don't know how Starbucks or Publix or what have you works, or where their cultures are.
Dr. Derek Leiner:
But I just worry that, certainly in healthcare, we want everyone to be perfect.
Mark Graban:
Yeah.
Dr. Derek Leiner:
And I think that's changing, and I think it's changing for the better, and it allows us to be more mindful about some of those system pieces, and I think that can make more meaningful improvements to how we deliver care.
Mark Graban:
Yeah. I'm glad you brought up just culture, because that was something I wanted to talk about. And people can go to justculture.org; it's a great website to learn more about this approach. And as you were sharing in your story, it wasn't intentional. Not to get too sidetracked.
Mark Graban:
But when I was doing some searches on stories and journal articles in healthcare, this word unintended pops up, where they're describing unintended or unintentional medical error. And I'm like, well, mistakes are, I think, by default, unintentional. It seems like an unnecessary modifier. If it was intentional, it's not a mistake.
Mark Graban:
It's assault or harm or something criminal. So I'll get off my soapbox there. But back to just culture: I think it's helpful that there's this framework where one of the filters is, was somebody intending to cause harm or not, in deciding the fair and just response to something that's gone wrong. What are your thoughts or experiences around that?
Dr. Derek Leiner:
I think part of creating just culture... and you're right, there are a lot of great websites where people can learn more about it. I like the videos and the website and, really, the book of Sidney Dekker; I think he's at Griffith University, if I remember correctly. But Dr. Dekker writes a lot about how you build that idea of justice into your response to error. And the vision, the graphic that I like to think of when creating just culture, is that we don't want a society that is completely blame-free, because humans do need to learn and grow. It's part of what makes us feel fulfilled.
Dr. Derek Leiner:
We want to keep getting better. And so if you make everything blame-free, we won't grow. If you make everything about blame, the system doesn't change, and people will start to hide mistakes or leave your industry. And so you try to land that pendulum right in the middle of those two, where you can help others grow, organizations grow, and people grow at the same time. And so I like to think of it as not so much being outcome-based in our response.
Dr. Derek Leiner:
So, you know, sometimes I've ordered the wrong medication for somebody, and it was caught, and no one got hurt. And so we didn't do anything about it, because it was caught and it was fixed. And that prevents us from actually exploring why the wrong order went in on the wrong patient in the first place. So we have missed opportunities. Same thing with rule-based responses.
Dr. Derek Leiner:
You know, a lot of times, if we say, oh, well, you broke a rule, so you need to be reprimanded: that one's difficult, because it can be sort of controversial. It's hard to say, you know, yes, break rules; but sometimes rules and policies don't always match what we're doing right at the bedside, and so there needs to be a way to work through those as well. So I like the risk-based approach, where you look at what behavior someone exhibited and why they chose that behavior: what risk were they choosing to take?
Dr. Derek Leiner:
And yes, there's some guidance, some algorithms, and some writings on malicious action or someone being impaired. Those individuals need a special sort of approach, and I think that is kind of outside of what I think of as just culture. Once you get past that, though, you're looking at: was this a simple human mistake, where they didn't see the risk and didn't choose the action? That's part of being human.
Dr. Derek Leiner:
Was it something where they didn't understand the risk they were taking, but it was a behavioral choice, like cutting a corner or doing a workaround? Or was it something that they knew was going to be risky, and they chose to do it anyway? That's more of that reckless behavior. And each one of those has a different response. And really, what we should be doing, and this is going to be Derek Leiner's approach:
Dr. Derek Leiner:
No matter what you see, I do think people need support and psychological first aid. So a lot of times, the writings on just culture will say that if you see reckless behavior, that's where discipline and punishment can show up. And I don't disagree, but I worry about labeling people. And I think there are going to be times when I'm trying to be my best self and yet I still choose a reckless behavior.
Dr. Derek Leiner:
And I don't want to be labeled as a reckless person; I just, in that moment, because of the surrounding context, made a reckless choice. And so I think if we can remove that label and realize that it was a reckless choice, you still support that person. You can say, hey, because it was reckless, we still need to look at, you know, demerits or reprimands or something.
Dr. Derek Leiner:
But we want to support you, too, because we know you don't try to hurt anybody. So, you know, do you need EAP, do you need a coach, those sorts of things, instead of only making it about punishing that person. But I'll get off that soapbox, because I can go on forever about just culture.
Mark Graban:
But, yeah, I mean, there's this point where certainly you punish intentional acts. And, to me, these are extreme outlier cases; again, rare cases where a nurse is intentionally injecting patients who are dying to put them out of their misery. That's, by most laws in most places, considered an intentional criminal act. Then there are the kind of more complicated cases, like that series, and I had trouble listening to it, the podcast and then a television series.
Dr. Derek Leiner:
Doctor Death. I've not heard it, but I have heard of it.
Mark Graban:
And my understanding of it is that it was a doctor who was known to be performing inappropriate surgeries that were having really bad outcomes. And it seemed like one of those situations where maybe, systemically, the organization or organizations kind of looked the other way, which, to me, becomes a different form of systemic problem. Or the cases where the organization suspects that somebody is working impaired and doesn't really do anything about it, and then there's a patient harm or death. Well, the organization had an opportunity, systemically, to address that through coaching, addiction treatment, or what have you. I think cases like that are the outliers, as opposed to good people working in a bad system. And I think one other thing that's really informative with the just culture framework is the one part of that algorithm that asks, essentially: would another similar professional in that same situation have been pressured to make the same quote-unquote bad choice?
Mark Graban:
Because I've seen circumstances where people cut the corner and get away with it, and those people don't get punished, or they may get celebrated for things they're accomplishing because of, or in spite of, cutting the corner. But then the one time it bites somebody in the ass, for lack of a more professional way of saying it, that person gets punished.
Dr. Derek Leiner:
Right.
Mark Graban:
And we need to learn from those opportunities that didn't create harm, learn early, and prevent the big outcome. To me, that's part of being fair and just to the people involved: don't put them in a situation where they are being pressured to cut corners. I'm sorry, I'm climbing up on a soapbox now. But the last point I'll make on this is, you know, I've been in hospitals where you quite literally go and do an analysis of, here are all the tasks a nurse in an inpatient unit is being expected to complete in an hour, and it's literally 80 minutes' worth of work. And so now you're putting the nurse in, I think, a very unfair position of choosing: what can I get away with not doing?
Mark Graban:
And then, if it ends up being the quote-unquote wrong decision, they might be blamed or punished for what I would say screams systemic problem. We've got to reduce waste; part of that 80 minutes of work is walking up and down the hallway searching for things. Improve the process to reduce that 80 minutes down to something manageable, or have the correct staffing level, so that people aren't being put in that position. So that's my engineer view of things.
Mark Graban:
What do you say?
Dr. Derek Leiner:
I think you're absolutely right, and I love that a lot of the models or decision support tools for just culture include what you're referring to: the substitution test. Because the investigator is really pushed to think about, is there a cultural norm that has led to this happening? And was this just the one time out of 100 when it was caught, or rather when something bad happened, and 99 times before, nothing bad happened? So it really forces the investigator to look through that and try to figure out if there's a system piece that needs to be addressed. Like you said, whether it's the structure, whether it's waste, whether it's movement waste, whether things need to be redesigned, whether it's staffing models. Everybody is short-staffed these days, so it's hard to really work around a staffing model problem.
Dr. Derek Leiner:
But we can at least explore that idea. The challenge, I think, becomes who the investigator is; I think one of the hardest parts about just culture is who is doing the investigating. There are a lot of systems that have leaders as investigators, and I think they are in a really great position to know what threats are coming at the team, what the team is going through, what the stresses are on the team, what the staffing model was that day. They also carry a psychological size.
Dr. Derek Leiner:
And so when they're doing a substitution test, there is the risk that they're going to be asking their team members, those they lead: would you do this? And because it's their supervisor or their leader, they'll say, no, I wouldn't do that. So that maybe introduces a little bit of bias and makes it hard to be fully just in that moment.
Mark Graban:
Yeah.
Dr. Derek Leiner:
The other thing, too, is if the investigator is the leader who's also responsible for performance evaluations and performance appraisals and the reputation of the service, where is their pull going to be? Are they going to be compelled to look more at one person, to say it was one person and have them take the fall for the whole team? Or are they going to be willing to share more broadly, like, yeah, this is a whole-service problem? I think there may be people who struggle with that conflict of interest. So I do think there's value in having peers be the investigators of events, so that you reduce the psychological size and increase the feeling of safety for those who are being, not investigated, but interviewed. And I think the substitution test works better when it's a peer.
Mark Graban:
Yeah. And you talk about something that's done 99 times with no bad outcome, and then that 100th time, it's bad. There's a phrase, and I encourage people to Google it because it leads to a lot of interesting reading: the normalization of deviance.
Dr. Derek Leiner:
I was just thinking that.
Mark Graban:
Yeah, that's an interesting phrase. We're not saying the people are deviants. But it could happen in a factory, for example: you're supposed to wear your safety glasses, and let's say people get lax about wearing them. The supervisors don't want to rile people up, and they start looking the other way. And now you're normalizing that deviance from standard safety practice. And it'll seem okay, because people don't lose an eye every single day, but you don't know what day it's going to be.
Mark Graban:
And that's why you wear your safety glasses or you're supposed to. So, you know, that's an example that could occur in other settings.
Dr. Derek Leiner:
Yeah, I've heard it referred to as drift behavior, also the spiral of deviance or the circle of deviance. The graphic that I like is that, you know, you've got error in the middle, and you've got the safe behaviors on the outside. And every time someone cuts a corner, you get closer and closer to that error. And every time someone cuts a corner and nothing bad happens, everyone adopts that new, quote-unquote, simpler behavior.
Dr. Derek Leiner:
But of course, at some point, it is going to be risky. And then, of course, when something does happen, it's: why was no one wearing their goggles? How could you do this? Right. But absolutely, that's why it's so important, with the substitution test, to explore what the external drivers of behavior were.
Mark Graban:
To lighten the mood for a minute, maybe inappropriately: if I were starting a heavy metal band, Circle of Deviance might be a great name. Or I would debate Spiral of Deviance. I don't know, though.
Dr. Derek Leiner:
Yeah, I love that. Mine's more of a chemistry nerd approach, but my band name would be The Ketones.
Mark Graban:
That's a good one, too. So, one other question, and I could talk about this all day, but I wanted to come back to when you talked about the feelings and the emotion and the embarrassment. In Dr. Mayer's story, he witnessed it; the resident surgeon was the one who made the incision on the wrong side. And that resident surgeon had to go into the corner and sit down because they were distraught. That very human, oh, my gosh, what have I done, reaction.
Mark Graban:
And I'll link to this in the show notes. I really appreciated something you wrote on LinkedIn about your reaction to a mistake. That was the thing that prompted me to reach out and say, let's do this episode, and I'll encourage people to go read that. But I'm curious, maybe as a final question: what have you learned, or how do you coach others, about recovering from a mistake and fighting some of that negative self-talk? That is, I think, very much human nature, especially among people who care so much about what they do.
Dr. Derek Leiner:
I really appreciate that question. I'll take it back just a little bit to try to link just culture into it, because that's how I've sort of developed my approach. We talked a little bit about the different types of human behavior, and I like Sidney Dekker's writing, where he really pushes people to think about how you restore people and prevent them from leaving, try to avoid people being punished, and really focus on people's growth. One of the first questions we always ask when we see someone's made a mistake is: who's hurt?
Dr. Derek Leiner:
And, you know, in patient care, patients are our primary victims. We want to make sure they're taken care of. But the secondary victims can be the people who made the mistake, because we're working in high-stakes, high-stress environments. We want to make sure we're taking care of the human who's here under our care, too. And when something bad happens, we can carry a lot of guilt, and we can carry a lot of stress.
Dr. Derek Leiner:
We can start to have a lot of negative inward thoughts; it's something that's actually called the second victim phenomenon. And it's a real thing. It certainly is linked with higher rates of burnout, lower resiliency, leaving medicine, suicidality, and making errors in the future. So it is a problem that we have to address. And when you think about how you help someone work through an event when something's happened, one of the just culture questions is: well, what do the hurt people need?
Dr. Derek Leiner:
And for those second victims, there is internal work that I think they need to do, and there's external work that their leaders or their peers are obliged to do to help them through it. That internal work is what I've been thinking about for a while: how do you take a step back, grow through that, and then get back into the game? And so I have three questions, or three truths, that I like to remind myself of when I've made a mistake. Number one: you're human. You know, take a step back for a second.
Dr. Derek Leiner:
Go into a quiet space. All right, number one, I am human. Humans are always going to be imperfect. I can try my best. I am imperfect.
Dr. Derek Leiner:
I can't get around that. Number two is: okay, I am taking this really hard because I care very much about taking good care of people. And so you need to ground yourself in remembering that we take this job really seriously, and that is something that should be celebrated. Having the second victim phenomenon just means that you care deeply about the people that you're taking care of. Remind yourself, like, okay, I am meant for this.
Dr. Derek Leiner:
Right? And then number three is that you are not your feelings. Your imposter syndrome is going to be telling you, over and over again: How could you not look at that medication list? Why did you not pull up a computer?
Dr. Derek Leiner:
It takes 30 seconds. How are you so stupid? You know, why did you not double check things? And those are feelings. Those are not who you are.
Dr. Derek Leiner:
And it's very easy to confuse the two. So, if you have a moment, just ground yourself in those three truths. And I like to sort of reflect and retell my story. Some people like to write it down. Some people like to meditate.
Dr. Derek Leiner:
Some people like to have moments of silence. But if you can reframe the story in the third person and just kind of walk through it, I think grounding yourself in the three truths and then retelling your story does help you see what the system pieces were. And truly, I think one of the most important parts of the internal work is to turn it into action. So, in my situation, it was describing my story, telling my story to my trainees, telling my story to my peers, so I could hear them repeat it back to me. And I'm so grateful to have been with a group.
Dr. Derek Leiner:
I still am with a group that's amazing. But even at that time, it was a group that was so supportive, who were happy to stop what they were doing and listen to a story when something bad happened.
Mark Graban:
Yeah.
Dr. Derek Leiner:
Tell that story through the event reporting system; put it into the system and look for system pieces. Talk with peers about the system pieces. Share with safety event investigators what the system pieces were, what the external drivers of the event were, so that we can make changes.
Dr. Derek Leiner:
I think once we've done the internal work and repaired and gotten back into the game, if we don't also report it and talk about where the system can get stronger, that's a huge missed opportunity.
Mark Graban:
Well, I think that is great advice that I can use when I make a mistake, or that a software developer who creates some sort of bug could use, or a company that puts out an update that knocks so many computers offline. I hope you weren't having to deal with that very recently, that CrowdStrike problem; a lot of hospitals were. But, yeah, I think those three truths, that framework, is really worth highlighting. It's a very practical, helpful takeaway. As we wrap up here, I'm just thinking that there are two sides of this coin.
Mark Graban:
I once interviewed somebody who wrote a book called, and he knew it was controversial, Patients Come Second, because he was talking about the connections: you need to take care of the staff and medical professionals so they can take care of the patients. And this idea of a second victim, you talked about all of those things. It's an attention-getting title. But you know what I wish for, going from soapbox to magic wand?
Mark Graban:
Of course, none of us wants to see any patient getting hurt, whether or not the numbers, the surveys, studies, estimates, and extrapolations, are exactly correct. I don't want to see anybody being harmed, and I don't want to see any healthcare professional being harmed, emotionally or professionally, because they've been put in a situation where an error or harm is possible. And hopefully we're focusing on, and I know you and I agree on this, systems, processes, communication, and culture.
Mark Graban:
You know, as you were saying: not trying harder, not caring more, not being more careful. That's hopefully what we will see someday. It's a tough journey.
Dr. Derek Leiner:
It's a tough journey, but it's an exciting journey. I know we're learning more every single day about how we can make that more of a reality. Because you're absolutely right: we always do want to take care of patients, and certainly patients first is sort of a mantra, that your patient always comes first, no matter what's coming up.
Dr. Derek Leiner:
But I don't ever want to forget about our human needs, too, and what we need to make sure that we are taking great care of patients. People who feel cared for provide the best care.
Mark Graban:
Absolutely. Yeah.
Dr. Derek Leiner:
We have to do the work to make our organization and our models for healthcare delivery stronger, I think.
Mark Graban:
Well, Derek, thank you so much for being a guest, for sharing your story. Thank you for the work that you're doing out there to improve things in healthcare for your patients, for your colleagues, and really appreciate you being able to have the conversation here today. Thanks so much.
Dr. Derek Leiner:
Well, thank you so much. Really enjoyed the conversation.
Episode Summary and More
High Reliability in Healthcare: Lessons from Close Calls
In the ever-evolving field of healthcare, ensuring patient safety is a paramount concern. As an academic hospitalist and physician champion for high reliability at the Richmond VA Medical Center, Dr. Derek Leiner provides an in-depth look at the complexities involved in maintaining a culture of safety. This article delves into his experience, the lessons learned from a close call, and the principles that define high-reliability organizations (HROs) in healthcare.
The Complexity of Clinical Medicine
Healthcare is inherently complex, with numerous variables at play during patient care. Dr. Leiner's story about a chaotic day on service highlights the dynamic challenges faced by healthcare providers. On a rotation where new interns joined a team already burdened with a heavy patient load, the atmosphere was ripe for errors. One such patient requiring a lumbar puncture to investigate a neurological injury became the focus of the day's challenges.
The procedure, complicated by the patient's physical condition and the need for meticulous execution, culminated in an unforeseen complication: unexpected bleeding because the patient had received a blood thinner. This incident underscores the importance of thorough preparation and the relentless pursuit of safety, even under pressure.
Importance of Adopting Preemptive Measures
The incident serves as a stark reminder of the need for preemptive measures. Dr. Leiner's recounting of the preparation process, including a “timeout” checklist to ensure safety, exemplifies best practices in patient care. However, the event also revealed gaps in the system. The blood thinner enoxaparin had been administered unbeknownst to the team due to a lapse in updating the patient's medication list.
The significance of using reliable tools and accurate information can't be overstated. Dr. Leiner highlights the limitations of relying on a manually updated Word document, which can be prone to human error. This particular case catalyzed discussions on implementing more dependable solutions, such as software that automatically pulls the latest data from patient charts.
Learning from Near Misses and Harm Events
In healthcare, the distinctions between near misses and harm events are critical. Both necessitate thorough investigation and reporting to foster a culture of safety. Dr. Leiner emphasizes that even close calls, where adverse outcomes are narrowly avoided, should be treated seriously. They serve as canaries in the coal mine, providing critical insights into potential system vulnerabilities.
Dr. Leiner's reflection on the emotional and psychological toll of the incident reveals the human side of medicine. His sense of guilt and subsequent efforts to teach his team how to cope with errors highlight the importance of emotional and educational support. This approach not only aids in personal recovery but also strengthens team cohesion and learning.
Embracing High Reliability Organizations (HRO) Principles
The concept of high reliability organizations (HROs) has gained traction in healthcare, inspired by industries such as aviation and nuclear power, where safety is imperative. Dr. Leiner elucidates that HROs are characterized by their ability to operate safely in complex, high-risk environments. In healthcare, this translates to a system where patient safety is ingrained in the culture through constant vigilance and systematic improvements.
HRO principles revolve around collective mindfulness and behaviors aimed at minimizing errors. This includes situational awareness, open communication, and a culture that encourages speaking up about potential safety concerns. It's about instilling a shared commitment to zero harm, where every team member actively participates in identifying and mitigating risks.
Building a Culture of Safety through Systems and Beliefs
Transforming a healthcare facility into a high reliability organization requires more than just procedural changes; it necessitates a cultural shift. Dr. Leiner points out that HRO is not merely an entity but a mindset—a combination of collective behaviors and beliefs that prioritize safety. This includes celebrating speaking-up behaviors, trusting team members, and fostering an environment where everyone feels responsible for patient safety.
The journey from human error to high reliability is ongoing. Through continuous learning and system improvements, healthcare can move closer to achieving its goal of zero harm. Dr. Leiner's experiences and insights provide valuable lessons for healthcare providers aiming to enhance safety and reliability in their practice.
The Role of Transparency in Patient Care
Transparency is one of the cornerstones of high reliability organizations (HROs). Healthcare providers, like Dr. Derek Leiner, emphasize the importance of being open and honest with patients, especially when an error occurs. Dr. Leiner’s experience underscores how transparency fosters trust and, in many cases, helps in mitigating the negative emotional impact on both patients and healthcare providers.
The Impact of Honest Communication
Patients often understand the complexities involved in medical procedures. Dr. Leiner recounted a situation where a patient reacted with grace when informed about an error. The patient's kind response highlights a vital lesson: patients are often more forgiving when they feel they are being communicated with honestly and transparently.
Learning how to disclose bad news is a crucial aspect of medical training. Dr. Leiner remembers being taught to be transparent and honest without downplaying the situation. This genuine communication can significantly affect a patient’s reaction and foster a stronger patient-provider relationship.
The Emotional Toll and Coping Mechanisms
Despite a patient's forgiving nature, the emotional burden on healthcare providers can be heavy. Paradoxically, a patient's graciousness can make that burden heavier: the sense of having let someone down, met with kindness, can amplify feelings of guilt.
Dr. Leiner’s reflections highlight the importance of emotional support systems within healthcare organizations. Peer support and counseling services can help providers process these emotions constructively, ensuring they continue to practice with compassion and confidence.
Cultural Shifts and Historical Perspectives
The landscape of healthcare transparency and error disclosure has evolved. Historical anecdotes, like the one shared by Dr. Mayer about a resident anesthesiologist experiencing a culture of hierarchy and dishonesty, contrast sharply with modern practices.
From Specialized Training to Modern Expectations
The educational focus has shifted considerably since the era of Dr. Mayer’s residency. Today’s training emphasizes attributes like honesty, transparency, humanism, and collaboration. This shift aims to equip new healthcare professionals with the tools and mindset necessary for high reliability and patient-centered care.
The fear of litigation remains a concern, but studies suggest that sincere apologies and transparent communication can reduce the likelihood of lawsuits. Rather than seeking retribution, many patients and families pursue legal action to get answers and feel heard. This understanding has driven the adoption of openness as a standard response to medical errors.
Building Systems for Mutual Growth
Effective communication and transparency are foundational, but systemic solutions are also necessary to create a high reliability culture. Dr. Leiner stresses the value of a Just Culture—a balanced approach where human errors are treated as opportunities for systemic improvement rather than solely as individual failings.
Avoiding the Blame Game
In a Just Culture, the focus shifts from blaming individuals to understanding why an error occurred and how the system allowed it to happen. This approach encourages healthcare providers to report their mistakes without fear of punitive action, leading to more comprehensive safety improvements.
Practical Implementation of Just Culture
Implementing a Just Culture involves:
- Risk-Based Approaches: Evaluating decisions based on the risks they entail rather than the outcomes alone. This helps identify and mitigate hazardous behaviors before they lead to harm.
- Balancing Accountability with Growth: Ensuring that providers learn from mistakes in a way that fosters professional growth without instilling fear.
- Systemic Solutions: Integrating systemic changes based on identified risks to ensure that the same error does not recur.
Training and Retraining
While initial training is essential, ongoing education that adapts to new challenges and integrates lessons from past errors is equally crucial. Dr. Leiner underscores that retraining should be meaningful and address the root causes of errors rather than simply reiterating existing policies.
Real-Life Lessons for Improved Practices
Dr. Leiner’s lumbar puncture incident, where a patient suffered from uncontrollable bleeding due to enoxaparin not being properly documented, illustrates the importance of reliable documentation and communication systems. Transitioning from manual processes to automated solutions can help bridge gaps and reduce the likelihood of such errors.
Additionally, fostering a collaborative environment where team members feel empowered to speak up and contribute to safety measures can significantly enhance overall care quality. By focusing on systemic improvements and fostering a culture of transparency and continuous learning, healthcare organizations can move closer to the goal of zero harm.
Addressing Reckless Behavior and Systemic Issues
While Just Culture emphasizes understanding and systemic improvements for most errors, it delineates a different approach for reckless behaviors. Distinguishing between simple human mistakes, behavioral choices, and reckless behavior is essential.
Framework for Different Types of Errors
- Simple Human Mistake: When a healthcare provider makes an error due to not seeing the risk or unintentionally taking an incorrect action.
- Behavioral Choice: When the provider recognizes a risk but opts for what they perceive as a lesser one, for example by cutting corners or using workarounds.
- Reckless Behavior: When the provider fully understands the substantial risks but decides to proceed anyway.
Responses to Different Types of Errors
- Human Mistake: Usually calls for training and systemic changes rather than punishment.
- Behavioral Choice: Often requires coaching and highlighting the importance of following protocols.
- Reckless Behavior: May involve disciplinary action, but also calls for psychological support and an examination of the context that led to the reckless decision (a brief decision-table sketch follows this list).
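Because this framework is essentially a mapping from classification to first-line response, it can be summarized as a simple decision table. The sketch below is illustrative only; the category names and responses paraphrase the lists above and are not an official Just Culture algorithm:

```python
# Illustrative decision table for Just Culture responses.
from enum import Enum, auto

class ErrorType(Enum):
    HUMAN_MISTAKE = auto()      # risk not seen, or an unintended slip
    BEHAVIORAL_CHOICE = auto()  # at-risk shortcut or workaround
    RECKLESS = auto()           # substantial risk understood, ignored anyway

RESPONSES = {
    ErrorType.HUMAN_MISTAKE:
        "Console the individual; pursue training and systemic fixes.",
    ErrorType.BEHAVIORAL_CHOICE:
        "Coach on the risk; remove the incentives to cut corners.",
    ErrorType.RECKLESS:
        "Consider discipline; still examine context and offer support.",
}

def respond(error_type: ErrorType) -> str:
    """Map an error classification to the first-line organizational response."""
    return RESPONSES[error_type]

print(respond(ErrorType.BEHAVIORAL_CHOICE))
```

The hard work, of course, is the classification step itself, which is where the substitution test described below comes in.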
Ensuring Fair Investigation Processes
The principle of Just Culture also incorporates the substitution test, which asks whether another professional in similar circumstances might have made the same mistake. It challenges investigators to look beyond individual actions to systemic contributors, such as chronic understaffing or workflow inefficiencies.
Normalization of deviance occurs when repeated deviations from protocol go unchecked until they come to feel routine. This happens in both everyday and high-stakes environments. For example, if safety glasses are regularly skipped in a factory, the perceived risk gradually fades until an accident occurs.
The Role of Peer Investigators
Leaders investigating incidents carry inherent biases that can affect the outcome. Investigations led by peers can reduce these biases and improve the accuracy of incident analysis. Peer investigation also fosters a safer environment for honest dialogue, which is crucial to identifying systemic issues.
Developing Emotional Resilience Among Healthcare Providers
Handling errors with transparency and a systemic approach is critical, but so is supporting the provider’s emotional well-being. The second victim phenomenon recognizes that healthcare providers who make errors also suffer emotionally, with potential impacts including burnout, lower resilience, and even leaving the profession.
Supporting Second Victims
When addressing the second victim phenomenon:
- Internal Work: Providers need to accept their humanity, recognize their commitment to care, and differentiate between their feelings and their identities.
- External Support: Leaders and peers must offer support through listening, counseling, and creating an environment of understanding rather than blame.
Practical Steps to Emotional Recovery
- Acknowledge Human Imperfection: Understanding that humans are inherently imperfect can mitigate some of the guilt associated with making mistakes.
- Reflect on Professional Commitment: Recognizing that emotional reactions stem from a deep care for patient well-being is crucial.
- Separate Feelings from Identity: Differentiating between guilt-induced feelings and one’s professional identity can help in overcoming the emotional aftermath.
Actionable Reflection and Systemic Contribution
Encouraging providers to recount their experiences and share learnings with peers can transform personal mistakes into collective learning opportunities. This not only aids in personal recovery but also contributes to systemic safety enhancements.
Prioritizing Staff Well-being to Improve Patient Care
Balancing patient care with healthcare worker well-being creates a sustainable and effective healthcare environment. The notion of “patients come second” can be provocative, yet it underscores an important truth: healthcare providers who feel supported and valued are better equipped to deliver high-quality patient care.
Ensuring Emotional and Professional Support
Dr. Leiner's reflections dovetail with the experiences articulated by other healthcare professionals who stress the importance of taking care of medical staff to prevent emotional and professional harm. Creating an environment where healthcare providers feel supported mitigates the risk of errors and the emotional toll associated with high-stress roles.
- Professional Development: Continuous education and retraining, focused on both emotional resilience and technical skills.
- Peer Support Programs: Establishing peer support groups to offer communal support in emotionally taxing times.
- Counseling Services: Integrating psychological support as a part of routine healthcare practices for staff.
Systemic Processes and Communication
The foundation of high reliability in healthcare leans heavily on robust systems, effective processes, and seamless communication. Rather than relying solely on individual vigilance or effort, healthcare organizations must cement processes that naturally minimize risks and errors.
- Improved Documentation Systems: Transitioning from manual to automated systems to ensure accuracy.
- Structured Communication Protocols: Implementing standardized communication protocols to reduce misunderstandings and miscommunications (a minimal sketch of one such format follows this list).
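One widely used structured-communication format in healthcare is SBAR (Situation, Background, Assessment, Recommendation). As a hedged illustration, the sketch below shows how a handoff tool might enforce that structure; the class, its fields, and the example message are hypothetical, not part of any system discussed in the episode:

```python
# Hypothetical sketch of a structured SBAR handoff message.
from dataclasses import dataclass

@dataclass
class SBARHandoff:
    situation: str       # what is happening right now
    background: str      # relevant history, e.g., current anticoagulation
    assessment: str      # the clinician's read of the problem
    recommendation: str  # what the sender needs from the receiver

    def render(self) -> str:
        """Format the handoff in the standard S/B/A/R order."""
        return (
            f"S: {self.situation}\n"
            f"B: {self.background}\n"
            f"A: {self.assessment}\n"
            f"R: {self.recommendation}"
        )

msg = SBARHandoff(
    situation="Post-lumbar-puncture patient with oozing at the site",
    background="Received enoxaparin this morning; dose absent from team list",
    assessment="Bleeding likely related to anticoagulation",
    recommendation="Hold the next enoxaparin dose and reassess within the hour",
)
print(msg.render())
```

The value of a fixed format is that nothing critical, such as an anticoagulant dose, can be silently omitted from the conversation.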
Fostering a Culture of Continuous Improvement
The journey towards an organization-wide culture of transparency and continuous improvement is challenging but immensely rewarding. By continuously learning and adapting, healthcare providers and institutions can make incremental changes that lead to significant long-term improvements.
- Data-Driven Decision Making: Leveraging data analytics to identify areas needing improvement and track the effectiveness of implemented changes.
- Regular Feedback Loops: Instituting regular feedback mechanisms where both patients and staff can contribute insights into what’s working and what isn’t.
- Cultivating a Learning Environment: Encouraging an organizational mindset where learning from mistakes is ingrained in the culture.
The Role of Leadership in Promoting Well-being
Leadership plays a crucial role in fostering an environment that prioritizes the well-being of both patients and healthcare providers. Leaders must be proactive in advocating for systems that support their teams and investing in the professional development and emotional health of their staff.
- Visible and Responsive Leadership: Leaders should be approachable and responsive to the needs and concerns of their teams.
- Investment in Infrastructure: Allocating resources towards infrastructure that supports both patient and staff needs.
The Intersection of Human Needs and Organizational Goals
Ultimately, aligning human needs with organizational goals creates a synergistic effect where both patients and providers thrive. This holistic approach is fundamental to transforming healthcare delivery into a more humane and effective system.
- Holistic Care Models: Adopting care models that equally prioritize patient outcomes and provider well-being.
- Integrative Strategies: Implementing strategies that synchronize individual needs with organizational objectives, ensuring that neither is compromised at the expense of the other.
By nurturing an environment where both patients and healthcare professionals are cared for, organizations can achieve higher levels of trust, better patient outcomes, and a more resilient healthcare system. Through systemic enhancements, transparent communication, and unwavering support for healthcare providers, the vision of zero harm and high reliability in healthcare becomes increasingly attainable.