TEDxCreativeCoast - Marty Scott - Is It Safe? Designing a Culture of Patient Safety
## Speaker Context
- Speaker role: Presenter discussing patient safety processes.
- Audience: Committee members (implied professional setting).
- Setting: A presentation/talk before a committee.
- Framing: The speaker is presenting on patient safety improvements at Memorial and in Savannah, aiming to provide ideas on process design for safety.
## People
- Laurence Olivier + actor + subject in a famous scene with Dustin Hoffman.
- Dustin Hoffman + actor + subject in a famous scene with Laurence Olivier.
- Josh Baron + patient + son of Ridley Baron, who was involved in a patient harm incident.
- Ridley Baron + father + subject of the narrative about his son, Josh.
## Organizations
- Memorial + hospital system + where the speaker discusses improvements and successes.
- HealthGrades + organization + published a recent report on medical mistakes over the last 10 years.
- Institute of Medicine + group/committee + stated 10 years ago that 100,000 people die in American hospitals from medical errors.
- FAA + governmental body + would shut down the airline industry if an airliner crashed every other day.
## Places
- Memorial + hospital + location of work and reported success.
- Savannah + location + where the speaker discusses patient safety work.
- Hilton Head + location + where the Baron family was on vacation when the accident occurred.
## Tools, Tech & Products
- Smart pump + device + checks programmed medication against a drug library and alarms when they do not match.
- Barcode system + technology + drugs leave the pharmacy with a barcode unique to the patient, which is double-checked against the patient's armband at the bedside.
- Computerized physician order entry system + technology + can suggest the correct weight-based dose and interval when staff are trained to use it.
- Electronic order entry system + technology + system that requires training to be used properly.
- Point of care resources + technology + includes smartphones and iPads, used to pull or push information immediately at the point of care.
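The barcode double-check described above reduces to a simple matching rule: the pharmacy label and the patient's armband must agree before administration. A minimal sketch, with made-up identifiers and record types (real systems use standardized healthcare barcodes, not this toy model):

```python
# Minimal sketch of a bedside barcode double-check.
# DrugLabel and the "PT-..." identifiers are illustrative, not a real system.
from dataclasses import dataclass

@dataclass
class DrugLabel:
    patient_id: str   # barcode printed by the pharmacy, unique to one patient
    drug_name: str

def verify_at_bedside(label: DrugLabel, armband_id: str) -> bool:
    """Return True only when the pharmacy label matches the patient's armband."""
    return label.patient_id == armband_id

label = DrugLabel(patient_id="PT-0042", drug_name="cefazolin")
assert verify_at_bedside(label, "PT-0042")       # match: safe to proceed
assert not verify_at_bedside(label, "PT-0099")   # mismatch: stop and recheck
```

The point of the design is that the check is forced at the last step before the drug reaches the patient, independent of every earlier handoff.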
## Concepts & Definitions
- Patient safety + overall topic + designing processes to keep people safe.
- Latent defects + concept from James Reason + defects lying dormant in processes.
- Swiss cheese model + model from James Reason + illustrates how aligned holes in processes can let a defect reach the patient.
- Sentinel event + concept implied by the discussion of the Baron incident + a major adverse event or injury.
- High-risk environment + description of healthcare + where people are making mistakes.
## Numbers & Data
- 100,000 people + count + estimated number of deaths from medical errors in American hospitals (stated 10 years ago).
- More deaths than breast cancer + comparison count + what 100,000 deaths from medical errors exceeds.
- More deaths than AIDS + comparison count + what 100,000 deaths from medical errors exceeds.
- More deaths than trauma + comparison count + what 100,000 deaths from medical errors exceeds.
- Every other day + frequency + rate compared to an airliner crashing.
- 25 to 30 + range + average number of mistakes a human being makes in a day.
- Last 10 years + duration + period covered by a HealthGrades report.
- 6 milligrams per kilogram DIV Q6 hours + specific dosage/timing written on an order sheet.
- 0.1 kilograms + weight discrepancy + difference that prompted the pharmacist's manual-calculation workaround.
- 7:00 p.m. and 7 a.m. + time window + during which the nurse was supposed to perform an independent verification of the dose.
- 24 hours + duration + time for which the patient received an overdose of antibiotics.
- 90% + percentage reduction + reduction in the incidence of errors of harm at Memorial from 2007 to 2009.
- 2007 to 2009 + period + time frame for error reduction success at Memorial.
- 17 months + duration + time frame at Memorial with only one episode of patient harm.
- 17-month-old + age + of Josh Baron when he was flown to Memorial.
- 20 miles + distance + from home where the driver ran a stop sign.
## Claims & Theses
- It is important to ask, "Is it safe?" in healthcare.
- People are human beings and everybody makes mistakes.
- The Institute of Medicine stated 10 years ago that 100,000 people die in American hospitals from medical errors.
- 100,000 deaths from medical errors is more deaths than from breast cancer, more deaths than from AIDS, and more deaths than from trauma.
- If an airliner crashed every other day, the FAA would probably shut down the airline industry after the second or third crash.
- The difference in healthcare is that we kill patients one at a time, so most people don't hear about the mistakes.
- A recent report from HealthGrades showed that we are still killing about 100,000 people every year in healthcare from medical mistakes.
- We must acknowledge that all people make errors and that health care is a high-risk environment.
- If we recognize error-likely situations and provide tools, training, behavior modification, technology, and processes, we can capture and prevent errors from hitting the patient.
- The nurse who was on orientation did not follow the policy of independent verification between 7:00 p.m. and 7 a.m.
- The pharmacist's workaround involved manually calculating the dose when the weight hadn't migrated over.
- The latent defects in processes can line up, and the defect hits the patient.
- The policy for nurses was to do an independent verification of the dose using an outside source between 7:00 p.m. and 7 a.m.
- The pharmacist hadn't completed the drug library for all chemotherapy options.
- The nurses' response to the pump alarming was to override the alarm because it happened regularly.
- The best healthcare providers make mistakes.
- Any day at work is a high-risk situation at some point in time.
- If we know low-risk behaviors, we can minimize risk.
- Establishing expectations and teaching culture change is necessary for safety.
- A culture where reporting errors and incidents is not punitive must be built.
- Technology, while helpful, cannot prevent all errors.
- Designing the environment and improving teamwork is crucial for improving care.
- Family involvement in care is necessary.
- We must build accountability all the way up to the board level.
- The challenge is how to keep getting better and chasing the zero while dealing with the fallibility of humans.
## Mechanisms & Processes
- The process of questioning safety: Asking "Is it safe?" in multiple ways to get a clear answer.
- Workflow for medication administration: Doctor writes order sheet $\rightarrow$ Pharmacist receives order $\rightarrow$ Nurse administers drug via pump.
- Manual calculation workarounds: Pharmacist manually calculates dose when weight hasn't migrated over between systems.
- Swiss cheese model: Latent defects in processes must align for a defect to hit the patient.
- Policy for night shift verification: Nurse must perform an independent verification of the dose using an outside source between 7:00 p.m. and 7 a.m.
- Drug library completion: Pharmacist must complete the drug library to provide guardrails for all chemotherapy options.
- Overriding alarms: Nurses overriding the smart pump alarm because they were used to it happening regularly.
- Behavior-based modification tools: Tools implemented to reduce errors.
- Star process: A process used to improve reliability by performing discrete actions at set points (e.g., at dining table, front door, getting in car).
- Standardization of room design: Implementing wall-mounted bed rails to guide patients to the bathroom, preventing falls.
- Decentralized nurses stations: Improving communication by allowing nurses to be out of their fixed posts.
- Real-time decision support: Technology that alerts the user about drug allergies or drug-drug interactions upon order entry.
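The smart-pump "guardrail" mechanism above (a drug library that alarms when a programmed dose falls outside safe bounds) can be sketched as a range check on the per-kilogram dose. The drug name and limits below are invented for illustration, not clinical values:

```python
# Hypothetical drug library: per-drug safe dose range in mg/kg.
# Limits are made up for illustration only.
DRUG_LIBRARY = {"example-antibiotic": (2.0, 8.0)}

def check_dose(drug: str, dose_mg: float, weight_kg: float) -> str:
    """Alarm when the programmed dose falls outside the library's guardrails."""
    if drug not in DRUG_LIBRARY:
        # An incomplete library (as in the talk's chemotherapy example)
        # means no guardrail exists for this drug.
        return "alarm: drug not in library"
    low, high = DRUG_LIBRARY[drug]
    per_kg = dose_mg / weight_kg
    if not (low <= per_kg <= high):
        return f"alarm: {per_kg:.1f} mg/kg outside {low}-{high} mg/kg"
    return "ok"

# A hundredfold programming error trips the alarm instead of reaching the patient,
# but only if staff respond to the alarm rather than overriding it.
assert check_dose("example-antibiotic", 60.0, 10.0) == "ok"            # 6 mg/kg
assert check_dose("example-antibiotic", 6000.0, 10.0).startswith("alarm")
```

Note that the guardrail is only as good as two human processes: completing the library for every drug, and treating the alarm as a stop signal rather than a nuisance to override.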
## Timeline & Events
- Kindergarten: Started the process of improving reliability for getting items out of the car.
- Childhood to present: Ongoing effort to improve the routine of getting the book bag, lunchbox, and student into the classroom.
- Saturday before Easter, a few years ago: The Baron family's vehicle was t-boned near Hilton Head while they were on vacation.
- That Saturday: Josh Baron was flown to Memorial with a brain injury and placed on life support in the ICU.
- The next day: Josh Baron began showing signs of improvement, like pulling back when his toes were pinched.
- Monday: Josh Baron was able to be extubated, indicating improvement.
- Tuesday: Josh Baron had a seizure.
- Prior to the overdose: The order for Dilantin was sent to the pharmacist.
- At an unspecified time before the overdose: The wrong dilution was pulled and labeled by the pharmacist.
- At the time of the overdose: Josh Baron received a hundredfold overdose of Dilantin.
- 2007 to 2009: Period when Memorial reduced its incidence of errors of harm by 90%.
## Examples & Cases
- Laurence Olivier examining Dustin Hoffman: Demonstrates the necessity of precise questioning ("Is it safe?").
- Pediatric surgeon's order: Writing "6 milligrams per kilogram DIV Q6 hours" on an order sheet.
- Smart pump alarming: A correctly functioning pump alarming when administered medication did not match the drug library.
- Overdose of antibiotics: Patient received an overdose for 24 hours because the smart pump alarm was overridden four times.
- Improving reliability with the Star process: Reliability improved from about 20% to about 90% for getting school items ready.
- The Ridley Baron case: Son Josh Baron received a hundredfold overdose of Dilantin because the wrong dilution was pulled by the pharmacist.
## Trade-offs & Alternatives
- Potential approach: Saying, "Of course it's safe" (initial, superficial answer).
- Alternative approach: Addressing the question with specificity ("My daughter's having surgery in the morning. Is it safe?").
- Technology reliance: Relying solely on technology (e.g., the smart pump) is insufficient, as human error can bypass it.
- Human intervention: Requiring manual oversight (e.g., pharmacist manually calculating dose) as a workaround when systems fail to communicate.
## Counterarguments & Caveats
- The initial statement "Of course it's safe" is inadequate when context (like a daughter's surgery) is provided.
- People are still making mistakes even with best intentions.
- While technology can help, it is still just a tool.
- The smart pump's alarm, while signaling an error, was bypassed by human behavior (overriding the alarm).
- It is hard to build a culture where reporting errors is not punitive, as job security is a concern.
## Methodology
- Reviewing past incident reports (e.g., the 100,000 deaths cited).
- Analyzing system workflows (e.g., order entry $\rightarrow$ pharmacy $\rightarrow$ nursing).
- Implementing procedural changes: Establishing new policies (e.g., independent verification) and tools (e.g., Star).
- Environmental design changes: Implementing physical changes like standardized bed rails and dedicated monitoring equipment (e.g., stop light system in NICU).
- Auditing and monitoring: Regularly assessing adherence to new protocols and expected behaviors.
## References Cited
- James Reason + organizational psychologist + originator of the Swiss cheese model.
- Paul Barach + counterpart + spoke about design considerations in Australia.
## Conclusions & Recommendations
- We must acknowledge that all people make errors, and health care is a high-risk environment.
- We must capture and prevent errors by providing tools, training, behavior modification, technology, and processes.
- We must build a culture where reporting errors and incidents is not punitive.
- We must build accountability from the middle managers up to the board level to demonstrate commitment to patient safety.
- Future work should focus on how to help shape the environment and what kind of technology can be designed for the next generation of healthcare.
## Implications & Consequences
- Ignoring latent defects in processes can lead to fatal patient errors (the Swiss cheese model).
- Failures in communication or adherence to updated procedures can result in severe patient harm (Josh Baron's overdose).
- A lack of institutional focus on reporting errors being non-punitive will inhibit necessary learning.
- Poorly designed environments or processes contribute to error opportunities (e.g., excessive patient movement).
## Open Questions
- How can we help design technology solutions for the next generation of healthcare?
- How can we partner with external groups to help shape the environment?
- How do we keep getting better and chasing the zero while dealing with the fallibility of humans?
## Verbatim Moments
- "I don't know what you're talking about. What's the definition of is?"
- "Of course it's safe. It's very safe. It's so safe you wouldn't believe it."
- "The average human being makes 25 to 30 mistakes in a day."
- "100,000 deaths from medical error is more deaths than from breast cancer, more deaths than from AIDS, and more deaths than from trauma."
- "This is the Swiss cheese model. This is James Rezins who's an organizational psychologist and talks about the Swiss cheese model."
- "Oh, that's that's how that happens all the time. I'll show you how to override it."
- "I'm hoping that high school or college, we're going to really move up to that next level."
- "You got to take that away from people."
- "This is like pushing, you know, rocks up a hill that you never you never get you never can get there."
- "We can have automated delivery systems. We can barcode drugs so that when it comes from the pharmacy, it has a a barcode that's unique for that patient and we can double check that against the patient's armband."
- "My son's not a sentinel event."
- "How do we keep getting better? How do we keep chasing the zero? How do we get perfect and dealing with the infallibility of humans?"