TEDxCreativeCoast - Marty Scott - Is It Safe? Designing a Culture of Patient Safety
Asking "Is it safe?" requires specificity, moving beyond generalized reassurances, because even highly competent systems can fail through process defects or human error. Laurence Olivier's famous interrogation of Dustin Hoffman dramatizes the need to question safety deeply: a generalized "Of course it's safe" is insufficient once context (such as a patient's specific procedure) is added. The speaker argues that tools, culture change, and improved environmental design are all essential to catch errors before they harm patients.
## Speakers & Context
- **Speaker:** Presenter detailing patient safety processes and improvements.
- **Audience:** TEDxCreativeCoast attendees.
- **Setting:** Presentation on patient safety initiatives at Memorial in Savannah.
## Theses & Positions
- Asking "Is it safe?" requires contextual detail, moving beyond superficial reassurances.
- Healthcare inherently involves risk because human beings are responsible for the care of other human beings, and all people make mistakes.
- Systemic failures are compounded when latent defects in processes align, potentially leading to patient harm (the Swiss cheese model).
- Improving safety requires acknowledging human fallibility while building a culture where reporting errors is non-punitive.
- Technology alone cannot prevent errors; designing the environment, improving teamwork, and involving families are equally critical components.
## Concepts & Definitions
- **Patient safety:** Overall objective of designing processes to keep people safe in healthcare.
- **Latent defects:** Defects embedded within the processes themselves.
- **Swiss cheese model:** Model illustrating how multiple, aligned defects (holes) in processes can lead to a defect hitting the patient.
- **Sentinel event:** A major adverse event or injury (the speaker notes Josh Baron's case was *not* a sentinel event).
- **High-risk environment:** Healthcare setting where making mistakes is possible.
- **Stars process:** A method used to improve reliability by performing discrete actions at specific, sequential points (e.g., at the dining table, front door).
## Mechanisms & Processes
- **Medication workflow:** Doctor writes order $\rightarrow$ Pharmacist processes order $\rightarrow$ Nurse administers drug via pump.
- **Independent verification:** Policy requiring a nurse to verify a dose using an external source between 7:00 p.m. and 7 a.m.
- **Smart pump function:** Device matching programmed medication against an internal drug library and alarming on mismatches.
- **System workaround:** Pharmacist manually calculating a dose when the weight data had not migrated between the registration and pharmacy systems.
- **Behavior-based modification:** Implementing tools to change established routines and reduce slips and lapses.
- **Environmental design:** Implementing physical changes like standardized wall-mounted bed rails to guide patients to bathrooms, preventing falls.
- **Communication improvement:** Establishing decentralized nursing stations to enhance team communication.
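The weight-based dosing and smart-pump mechanisms above can be sketched as a simple range check: compute a dose from the ordered mg/kg and the patient's weight, then compare the programmed dose against per-drug limits in a library and alarm on a mismatch. This is a minimal illustrative sketch, not a real pump or formulary; the drug name and limits are hypothetical.

```python
# Hypothetical "smart pump" style check, as described in the talk: the pump
# matches a programmed dose against a drug library and alarms on a mismatch.
# Drug name and per-kg limits below are illustrative, not clinical values.

DRUG_LIBRARY = {
    # drug: (min mg/kg per dose, max mg/kg per dose)
    "example-antibiotic": (2.0, 8.0),
}

def weight_based_dose(mg_per_kg: float, weight_kg: float) -> float:
    """Compute a single weight-based dose in milligrams."""
    return mg_per_kg * weight_kg

def pump_check(drug: str, programmed_dose_mg: float, weight_kg: float) -> str:
    """Compare the programmed dose to the drug library limits.

    Returns "ok" when the dose per kilogram falls inside the library range,
    otherwise "ALARM" (the point at which, in the talk, staff overrode it).
    """
    lo, hi = DRUG_LIBRARY[drug]
    per_kg = programmed_dose_mg / weight_kg
    return "ok" if lo <= per_kg <= hi else "ALARM"

# A correct order: 6 mg/kg for a 10 kg toddler -> 60 mg, inside the range.
correct = weight_based_dose(6, 10)
print(pump_check("example-antibiotic", correct, 10))        # ok

# A hundredfold programming error -> 6000 mg trips the library check.
print(pump_check("example-antibiotic", correct * 100, 10))  # ALARM
```

As the talk stresses, the alarm is only a tool: the check fires, but a culture that routinely overrides it still lets the error through.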
## Timeline & Sequence
- **Childhood to Present:** Ongoing effort to improve the routine of getting a student ready for school (e.g., book bag, lunchbox, and student ready at the door).
- **Saturday before Easter (a few years ago):** The Baron family's car was T-boned near Hilton Head; Josh Baron (17 months old) was flown to Memorial with a brain injury and placed on life support in the ICU.
- **Day after accident:** Josh Baron showed improvement, such as pulling back when his toes were pinched.
- **Monday:** Josh Baron was able to be extubated.
- **Tuesday:** Josh Baron had a seizure.
- **Before overdose:** Order for Dilantin was sent to the pharmacist.
- **At time of overdose:** Josh Baron received a hundredfold overdose of Dilantin.
- **2007 to 2009:** Period when Memorial reduced its incidence of errors of harm by 90%.
## Named Entities
- **Laurence Olivier:** Actor whose character examines Dustin Hoffman in the famous "Is it safe?" scene.
- **Dustin Hoffman:** Actor opposite Laurence Olivier in that scene.
- **Josh Baron:** Subject of the narrative; a 17-month-old with a brain injury after a T-bone accident.
- **Ridley Baron:** Father of Josh Baron; family was on vacation at Hilton Head.
- **Memorial:** Hospital system where the speaker is presenting safety improvements and successes.
- **Savannah:** Location where the speaker discusses patient safety work.
- **Hilton Head:** Location where the Baron family was vacationing when the accident occurred.
- **Paul Barach:** Counterpart in Australia who speaks on design principles.
## Numbers & Data
- **100,000 people:** Estimated number of deaths from medical errors in American hospitals, stated by the Institute of Medicine 10 years prior.
- **Comparative toll:** 100,000 deaths exceed deaths from breast cancer, AIDS, and trauma.
- **25 to 30:** Average number of mistakes a human being makes in a day.
- **Last 10 years:** Duration covered by the HealthGrades report on medical mistakes.
- **6 milligrams per kilogram IV Q6 hours:** Specific antibiotic order written on an order sheet by the pediatric surgeon.
- **0.1 kilograms:** Weight discrepancy between the registration and pharmacy systems that prompted the pharmacist's manual recalculation.
- **7:00 p.m. and 7 a.m.:** Time window during which nurses were supposed to perform an independent verification of the dose.
- **24 hours:** Duration for which Josh Baron received an overdose of antibiotics.
- **90%:** Percentage reduction in the incidence of errors of harm at Memorial between 2007 and 2009.
- **2007 to 2009:** Period for the 90% error reduction success at Memorial.
- **17 months:** Duration at Memorial with only one episode of patient harm.
- **17-month-old:** Age of Josh Baron when flown to Memorial.
- **20 miles:** Distance from home where the driver ran a stop sign during the accident.
## Tools, Tech & Products
- **Smart pump:** Device capable of matching a drug library against programmed medication and issuing alarms.
- **Barcode system:** Technology allowing drugs to have a unique barcode for a patient, enabling double-checking against a patient's armband.
- **Computerized physician order entry system:** System that can suggest weight-based doses and intervals if the user is trained.
- **Electronic order entry system:** General term for systems requiring proper training to be used.
- **Point of care resources:** Tools like smartphones and iPads used to immediately pull or push information at the point of care.
- **Stop light system:** Used in the neonatal ICU to monitor noise levels, with green indicating good levels, yellow indicating too much, and red indicating excessive noise.
## References Cited
- **James Reason:** Organizational psychologist who developed the Swiss cheese model.
- **HealthGrades:** Organization that published a report on medical mistakes over the last 10 years.
- **Institute of Medicine:** Group/committee that stated 10 years ago that 100,000 people die in American hospitals from medical errors.
## Trade-offs & Alternatives
- **Initial superficial reassurance:** Stating, "Of course it's safe," which is inadequate without specific context.
- **Specific inquiry:** Asking, "My daughter's having surgery in the morning. Is it safe?" provides necessary actionable context.
- **Technology reliance:** Relying solely on technology (e.g., smart pump) is insufficient because human behavior can bypass alarms.
- **Manual oversight:** Pharmacists using manual calculations as a workaround when digital systems failed to migrate weight data.
## Counterarguments & Caveats
- The initial assumption of safety ("Of course it's safe") fails when context is added.
- People remain fallible, even with best intentions.
- Technology is only a tool and cannot prevent all errors, as demonstrated when staff overrode the smart pump alarm.
- The process of improving safety is an ongoing effort, likened to pushing rocks up a hill whose top is never reached.
## Methodology
- **Questioning Safety:** Iteratively asking "Is it safe?" using specific scenarios to elicit true risk assessment.
- **Workflow Analysis:** Tracing the sequence from doctor's order $\rightarrow$ pharmacy $\rightarrow$ nursing administration.
- **Process Improvement:** Implementing procedural changes, such as the mandatory independent dose verification during night hours.
- **Environmental Redesign:** Implementing physical changes (like wall-mounted rails, stop light systems) to guide behavior and reduce fall risks.
- **Culture Building:** Establishing expectations and actively teaching a culture where reporting errors is not punitive.
## Conclusions & Recommendations
- All stakeholders (doctors, nurses, patients, families) must partner in improving safety.
- Safety must be established as a core value, requiring accountability from middle managers up to the board level.
- Implementing tools, training, behavior modification, technology, and processes is necessary to reduce errors to the lowest possible rate.
- Future efforts must focus on designing technology and shaping environments to minimize latent conditions and human fallibility.
## Implications & Consequences
- Unaddressed latent defects in processes create systemic failure points that can lead to death.
- Errors in handoffs or communication can result in severe patient harm (Josh Baron's overdose).
- Failing to build a non-punitive reporting culture prevents the necessary learning required for continuous improvement.
- Poorly designed environments or processes increase the opportunities for errors during patient movement.
## Open Questions
- How can industry partners help shape the environment to improve teamwork?
- What kind of technology can be designed for the next generation of healthcare that accounts for human fallibility?
- How do practitioners keep chasing the goal of 'zero harm' while managing the known fallibility of humans?
## Verbatim Moments
- *"I don't know what you're talking about. What's the definition of is?"* (Dustin Hoffman)
- *"Of course it's safe. It's very safe. It's so safe you wouldn't believe it."*
- *"The average human being makes 25 to 30 mistakes in a day."*
- *"100,000 deaths from medical error is more deaths than from breast cancer, more deaths than from AIDS, and more deaths than from trauma."*
- *"This is the Swiss cheese model. This is James Reason, who's an organizational psychologist and talks about the Swiss cheese model."*
- *"Oh, that's that's how that happens all the time. I'll show you how to override it."*
- *"The first thing you're going to think, if you haven't had any experience, is that your job's going to be threatened."*
- *"How do we keep getting better? How do we keep chasing the zero? How do we get perfect dealing with the fallibility of humans?"*