
Transcript

TEDxCreativeCoast - Marty Scott - Is It Safe? Designing a Culture of Patient Safety

So, thank you. I want to thank the committee for taking a little different approach and allowing me to be here. I don't think I'll be as entertaining as Drew; that's a hard act to follow. I should have asked to go first. But I'll tell you about patient safety: what we're doing at Memorial, in Savannah, and around the nation. I'll give you some ideas about what goes on with patient safety and how we're designing processes to keep people safe. In asking that question, "Is it safe?", I like to reflect back to that Marathon Man clip. Some of you have probably seen it. You know what I'm talking about: the dental examination scene where Sir Laurence Olivier is examining Dustin Hoffman. It's a really famous scene. He's trying to find out information. Just before that, he comes in, they have Dustin Hoffman tied down in the chair, and he asks, "Is it safe?" And Dustin Hoffman has no idea what he's talking about, because he's not involved in any of this plot. His brother is dead. But he says, "Is it safe?" And of course, Dustin Hoffman does what anybody would do. He says, "Well, I don't know what you're talking about. What's the definition of 'is'?" So he asks again: "Is it safe?" And once again, he still doesn't know what he's talking about: "I still don't know what you're talking about. I'm sorry. I don't know what you're talking about." Again he asks, "Is it safe?" And finally, he realizes he needs to give him an answer. He says, "Of course it's safe. It's very safe. It's so safe you wouldn't believe it." And then Sir Laurence Olivier, as only he can do, master actor that he is, fixes him with this steely gaze. His face hardens, and he says, "Is it safe?" And Dustin Hoffman realizes, "Oh, I probably gave him the wrong answer." So then he switches and says, "No, it's not safe.
It's very dangerous. You should stay away." And I think we in healthcare are very much like that. Because if you ask me, "Is the hospital safe?", I'm going to say, "Of course it's safe. We have great doctors. They're well trained. We have dedicated nurses and pharmacists. Everybody arrives to work every day to do the best they can. Of course it's safe." But then if you come back and say, "No, no, let me ask that again. My daughter's having surgery in the morning. Is it safe?", that's a very different question. Because then I'm going to fall into the doctor stuff: well, you know, there's always a risk. Anytime you put somebody to sleep, there's a risk. There are human beings taking care of other human beings. Sometimes the human beings we're taking care of haven't read the textbooks; they don't respond the way we've been trained to believe they respond. And of course, everybody makes mistakes. The average human being makes 25 to 30 mistakes in a day. It's pretty scary. Most of those don't matter, though. You forget to pick up your car keys when you're walking out the door; you just go back in and pick them up. Sometimes, though, some of those mistakes start lining up. We're going to talk about that. The Institute of Medicine, 10 years ago, said 100,000 people die in American hospitals from medical errors. Let me give you some context on that. 100,000 deaths from medical error is more deaths than from breast cancer, more deaths than from AIDS, and more deaths than from trauma. 100,000 people die from medical errors. Another way to think about that: 100,000 people dying from medical errors is the equivalent of an airliner crashing every other day. Every other day. Now, I suspect that if an airliner crashed every other day, after about the second or third crash, the FAA would probably shut down the airline industry to figure out what was going wrong.
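The speaker's airliner comparison can be checked with quick arithmetic. This is a sketch of my own; the per-plane passenger count it implies is an inference, not a figure from the talk.

```python
# Rough check of the "airliner crashing every other day" comparison.
deaths_per_year = 100_000
crashes_per_year = 365 / 2                       # one crash every other day
deaths_per_crash = deaths_per_year / crashes_per_year

# Works out to roughly a fully loaded jumbo jet per crash.
print(round(deaths_per_crash))                   # ~548 people per crash
```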
The difference in healthcare is that we kill patients one at a time. So most of the time you don't hear about it, unless it's Dennis Quaid's twins, where you have a famous person who's able to have a public forum. His twins, fortunately, did fine; they had a few very scary days, but fortunately did fine. Unless it's a public person like that, or something very egregious, like you took the wrong leg off of somebody, you most of the time don't hear about the mistakes that happen in healthcare. I'd like to say that it's getting better, but unfortunately a recent report from HealthGrades looked at what's happened in the last 10 years, and we're still killing about 100,000 people every year in healthcare from medical mistakes. So, we have to acknowledge that healthcare is a high-risk environment and that all people make errors. But if we can recognize that there are error-likely situations, if we can recognize when we're in these high-risk situations, and if we can provide some tools, some training, some behavior modification, some technology, some processes, we can capture those errors and prevent them from hitting the patient. Let me take you through a real error. A few months ago, one of our pediatric surgeons took a patient to the operating room. He came out of the operating room at about 4:00 a.m. He wanted to give the patient prophylactic antibiotics. He pulled out an order sheet and wrote the antibiotic: 6 milligrams per kilogram DIV Q6 hours. Now, I'm a pediatric intensive care doctor, so I've been doing that all my life. That order is very clear to me; not very clear to everybody else. If I asked any of you: Q6 means every six hours, but DIV? Maybe that means divided? What does that mean? That order goes down. Now, that physician hadn't been trained in our computerized physician order entry, our electronic order entry system.
If he had been, all he had to do was select the antibiotic that he wanted. The computer would have offered up the weight-based dose and the interval correctly, and he would have just clicked and accepted that. Unfortunately, he hadn't been trained yet. That order goes down to the pharmacist at 4:00 a.m. The pharmacist goes in, and the weight from the registration system hadn't migrated over to the pharmacy system; those two systems hadn't spoken to each other yet. His experience has been that when he goes in and enters the weight, if it's off by even 0.1 kilograms, he has to go back and redo all that work. So his workaround has been to calculate the dose manually, and then once the weight does migrate over, fix it up on the back end. So he manually calculates 6 milligrams per kilogram based on this child's weight, forgets to divide it into four doses, and sends that dose up to the floor. Now I'm going to show you this. This is the Swiss cheese model, from James Reason, an organizational psychologist. The idea is that there are latent defects in processes, and sometimes those defects, the holes in the processes, line up; all those holes in the Swiss cheese line up, and the defect hits the patient. So this is James Reason's model, and that's what I'm taking you through. So that order goes up to the floor, and the patient receives the antibiotic up on the floor from a nurse who's on orientation. The policy is that if an order like that comes up between 7:00 p.m. and 7:00 a.m., not through the computer, the nurse is to do an independent verification of that dose using an outside source, and we provide those reference sources. This was a new nurse, and she didn't follow that policy.
She did hang the medicine on what we call a smart pump. A smart pump has a library of drugs that it can match against, so it can look at how she programmed that pump to give that medication and see if that matches the library. She did that correctly, and the pump alarmed correctly. Unfortunately, this patient, who didn't have an infection and was getting prophylactic antibiotics, was admitted to a unit where we take care of hematology patients. The problem that causes is that on that floor, patients get chemotherapy, and there are literally thousands of different choices of chemotherapy; the pharmacists hadn't completed the drug library, those guardrails, that reference library, for all the options for chemotherapy. So the nurses were used to these pumps alarming regularly, and their response was, guess what: override it. So the pump alarmed. The nurse went to her mentor, the one who was orienting her, and told her the pump had alarmed. The mentor said, "Oh, that happens all the time. I'll show you how to override it." And indeed they did. In fact, when they handed off to the next shift, to the next group of nurses, to their colleagues, they told them this was happening and here's how you override it. And they did that four more times. So that patient got an overdose of antibiotics, four times the dose, every dose, for 24 hours. Now, fortunately, this was an antibiotic that didn't cause harm. The patient did have to be monitored on telemetry for 24 hours, because there was a risk she'd have some heart problems, but she had no harm and was able to go home and be back with her family. Now, not always are we that lucky. So we've got to ask ourselves: what do we look at, and how do we use all of our resources? How do we bring people, processes, and technology to bear, and design that to limit those errors, to prevent those errors from causing those kinds of struggles?
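The arithmetic of that error, and the guardrail that caught it, can be sketched in a few lines of code. This is purely illustrative, not any real pharmacy or pump software; the patient's weight, the drug-library limit, and every name here are assumptions of mine, not details from the case.

```python
# Sketch of the dosing error and the smart-pump guardrail.
# The weight (20 kg) and the library limit (40 mg) are assumed values.

WEIGHT_KG = 20.0                # assumed patient weight
DAILY_MG_PER_KG = 6.0           # "6 mg/kg DIV Q6 hours" = daily total, divided
DOSES_PER_DAY = 4               # Q6h: one dose every six hours

# Correct interpretation: divide the daily total into four doses.
correct_dose = DAILY_MG_PER_KG * WEIGHT_KG / DOSES_PER_DAY   # 30.0 mg

# The error: the full daily amount calculated as a single dose.
erroneous_dose = DAILY_MG_PER_KG * WEIGHT_KG                 # 120.0 mg

# A smart pump compares each programmed dose against its drug library.
PUMP_LIBRARY_MAX_DOSE_MG = 40.0  # assumed per-dose guardrail for this drug

def pump_check(dose_mg: float) -> str:
    """Return the pump's response to a programmed dose."""
    if dose_mg > PUMP_LIBRARY_MAX_DOSE_MG:
        return "ALARM"           # nurse must verify -- or override
    return "OK"

print(pump_check(correct_dose))    # OK
print(pump_check(erroneous_dose))  # ALARM
```

The guardrail fires exactly as the real pump did. But as the story shows, a safeguard that alarms constantly gets overridden by habit; the technology is only as strong as the culture around it.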
So, we know, as I already said, that the best healthcare providers make mistakes. We work in high-risk situations. We're stressed; there are always new patients coming in, people are dying, people are sick. Any day at work is a high-risk situation at some point in time. But if we know that, and we have low-risk behaviors, we can minimize it. So really we're talking about changing culture. You have to establish what you want people to do, establish the expectations, go out and teach that culture change, and then you have to go back and monitor it and build accountability: are you doing what we ask you to do? Are we doing what we know can be done to reduce errors? And that's what we did at Memorial. We went out and worked with people to create behavior-based modification tools to reduce errors. One of them is a tool called STAR. It reduces slips and lapses, those things where you forget your car keys. You can use it in about six seconds, anytime you want to, and it reduces your chance of slips and lapses by about 1,000%. I'll give you an example of STAR working with my daughter. She's a rising eighth grader. Since kindergarten, we've been trying to get her book bag, her lunchbox, and her into the classroom all at the same time. So we started on that in kindergarten, and we would STAR at the dining table, because that's where her book bag and her lunchbox were and where she worked. We'd have everything, we'd head out the door, and we were only about 20% reliable; only about one out of five times would we actually get everything out of the car and into school. So what I discovered was I had to STAR multiple times. We would STAR at the dining table. We would STAR at the front door. We would STAR when she stopped to rub the cat on the sidewalk. We would STAR when we got to the car. We would STAR again when we got out of the car. And now she's, like I said, a rising eighth grader.
We're up to about 90% reliability. There's only about once a month that I or my wife have to take her book bag or her lunch over to her at school. I'm hoping that by high school or college, we're going to really move up to that next level. So, how do you get people to change? How do you get people to embrace that safety is important? People want to do that. Nobody went to medical school or nursing school to cause harm. People went because they want to do the right thing. So you start from the ground up, from the sharp end, where people touch patients. You've got to build safety as a core value. You have to give those tools and instill those tools. You have to give people some way to find and fix those problems. So you have to build a culture where reporting errors and reporting these incidents is not punitive. That's a big problem, because if you tell me that you made a mistake and gave the patient the wrong medicine, the first thing you're going to think, if you haven't had any experience with this, is that your job's going to be threatened, that you're going to get some kind of performance evaluation that's not good. You've got to take that away from people. And that's what we've done at Memorial: tried to build that culture. This is ongoing. This is like pushing rocks up a hill; you never quite get there. And you've got to build accountability all the way up to the board level. From the middle managers to senior leadership all the way up to the board, they have to live, breathe, and demonstrate that commitment to patient safety 100% of every day that we're there. Now think about technology. A lot of people think, well, the computer system will save me. The smart pump will save me. I'll just rely on the technology. Bad idea. Technology can help us, but it's still a tool.
It's still technology. It can't prevent some of these errors, just as I showed you with the smart pump that didn't prevent the error; although the pump did exactly what it was supposed to do, we still defeated it. We can have automated delivery systems. We can barcode drugs, so that when a drug comes from the pharmacy it has a barcode that's unique for that patient, and we can double-check that against the patient's armband. There are the smart pumps I've described, and the computerized order entry system, which also has real-time decision support. So if I go in and enter an order, it will tell me that Drew is allergic to that antibiotic, and do I really want to do that? And I'll say, "Oh, I didn't want to do that." Or that Drew is taking another medication that interacts with that drug, and do I really want to do that? I can say, "No, I don't." And now, even more, there are online point-of-care resources, smartphones, iPads, those sorts of things, that can pull information immediately to you, or actually push information to you, right at the point of care, to help you with that decision process. But the other thing we've got to think about is design: the design of the environment, how we shape the environment, and how that fits in. Can we shape the environment to improve teamwork? Because medicine is a team sport, and it relies on the team functioning in a highly reliable way and communicating well. We also need to involve the families more. We need to get the patients and the families more involved in their care, and partner with them so that they can help us, and then promote education and accountability. Like I said, James Reason teaches us that there are these latent conditions. So that's what we've got to do: look at how we can minimize or eliminate some of these latent conditions.
Lack of standard processes, poor visibility, high noise levels, excessive patient movement. Sometimes patients spend only a few hours in the room: they're going to the CT scanner, they're going over here for a colonoscopy, they're going back there. All those movements and handoffs contribute to the opportunity for errors. So some of that work is being done. There are ways we can reduce patient movement. We can reduce noise levels. In our neonatal ICU, we have a stoplight. If it's green, the noise level is very good, because we know that too much noise for these preemies can be detrimental to their growth. If it gets up to yellow, there's a little too much noise. If it hits red, there's way too much noise. It's in all of our care pods, so the staff can watch it. Standardization, more standardization. Many rooms now are designed so that when the patient gets out of the bed, they can automatically grab a bed rail on the wall, and that rail will lead them right to the bathroom, because a lot of times people are confused when they get up in the middle of the night to go to the bathroom, and they fall and injure themselves. Now, we do some other things with monitors and bed alarms to try to minimize that, but that's another way we can help reduce that error. Decentralized nurses' stations improve visibility, so that the nurses can be out on the floor; those decentralized stations also improve communication, allow people to communicate more. So these are the goals. This is from Paul Barach, one of my counterparts, who was in Miami and is now in Australia, and who talks about how to do these kinds of design things. So that's one of my challenges for you folks: how can you help us? What kind of technology can you design? How can you help shape our environment? How can you partner with us to shape our environment?
Because there's a lot of creative talent that we need to be meeting halfway, to help us look at what the next generation of healthcare looks like. At Memorial Health, we've had great success. From 2007 to 2009, we reduced our incidence of errors causing harm by 90%. In fact, at one point we went 17 months with only one episode of patient harm. 17 months, with only one episode of patient harm. Sounds like a great success. Let me tell you this story. This is the Ridley Barron family, and I've shared the podium with Reverend Barron many times, so I have permission to tell his story. That's his wife there, and his little son Josh. They were driving home from Hilton Head after a vacation, the Saturday before Easter a few years ago, and they were t-boned in an accident; a driver ran a stop sign 20 miles from their home. His wife died instantly. Josh, his 17-month-old, was flown to Memorial with a brain injury. He was comatose. He was on life support. I admitted him to the intensive care unit that Saturday, and that's how he was: on life support and comatose. The next day he started showing a little bit of improvement. He started taking some breaths on his own. If I pinched his toes, he would pull back. Those were all good signs. I signed out to my partner and told him I thought he was improving, that he might be able to be extubated, taken off the ventilator, off of life support, fairly soon. And indeed, on Monday he was. In fact, his father, who had a broken shoulder, was able to come to the hospital and hold him and visit with him. On Tuesday, Josh had a seizure. Now, what we know about seizures that far out from a traumatic brain injury like that is that Josh was going to have some ongoing neurologic problems, but probably he was going to survive. They order Dilantin, which is an appropriate drug.
They order an appropriate dose. They send that down to the pharmacist, who reaches up on the shelf, pulls down the wrong dilution, mixes it in the syringe, and labels it as ordered, so to the nurses it appears to be the appropriate dose. It goes back to the pediatric intensive care unit, and Josh Barron receives a hundredfold overdose of Dilantin. His heart stopped, and he died. And that was our one patient harm in 17 months. His dad likes to remind me, and all of us, that his son is not a sentinel event. And in fact, those hundred thousand people that die this year, those people that die this week, are all somebody's son, somebody's mom, somebody's dad. And that's why this is so important. That's why we need healthcare people, doctors, nurses, patients, families, everybody, partnering on how we can get better. How can we do this without harm? Because at the end of the day, no matter how hard we try, no matter how well intended we are, we're still humans, and we're still going to make mistakes, and we still need better processes, and the way to improve those processes is to learn from them. So that's my challenge for you. That's my challenge for me: how do we keep getting better? How do we keep chasing zero? How do we get to perfect while dealing with the fallibility of humans? So, thank you.