ECRI’s Top 10 Health Technology Hazards

William Storage – Nov 30, 2016
VP, LiveSky, Inc.
Visiting Scholar, UC Berkeley Center for Science, Technology, Medicine & Society

ECRI recently published their list of top ten health technology hazards for 2017. ECRI has released such a list each year since at least 2008.

ECRI’s top ten for 2017 (requires registration) as they label them:

  1. Infusion Errors Can Be Deadly If Simple Safety Steps Are Overlooked
  2. Inadequate Cleaning of Complex Reusable Instruments Can Lead to Infections
  3. Missed Ventilator Alarms Can Lead to Patient Harm
  4. Undetected Opioid-Induced Respiratory Depression
  5. Infection Risks with Heater-Cooler Devices Used in Cardiothoracic Surgery
  6. Software Management Gaps Put Patients, and Patient Data, at Risk
  7. Occupational Radiation Hazards in Hybrid ORs
  8. Automated Dispensing Cabinet Setup and Use Errors May Cause Medication Mishaps
  9. Surgical Stapler Misuse and Malfunctions
  10. Device Failures Caused by Cleaning Products and Practices

ECRI is no doubt aiming their publication at a broad audience. The wording of several of these, from the standpoint of hazard assessment, could be refined a bit to better inform mitigation plans. For example, the first item in the list (infusion errors) doesn’t really name an actual hazard (unwanted outcome). I take a crack at it below, along with a few comments on some of the other hazards.

ECRI lists their criteria for inclusion in the list. In system-safety terminology, these include severity, frequency, scope (ECRI’s “breadth”), detectability (ECRI’s “insidiousness”), profile, and preventability.

That seems a good set of criteria, though profile points more to an opportunity for public education than to a basis for ranking risks. We’d hope that subject matter experts would heavily discount public concern for imaginary hazards.
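
To make those criteria concrete, here’s a minimal sketch of how a hazard list might be scored against them. The 1-to-5 scales, the equal weighting, and the example scores are my own assumptions for illustration; they are not ECRI’s methodology.

    # Illustrative only: ranking hazards against criteria similar to ECRI's.
    # Scales, weights, and example scores are assumptions, not ECRI's method.
    from dataclasses import dataclass

    @dataclass
    class Hazard:
        name: str
        severity: int        # 1 (negligible) .. 5 (catastrophic)
        frequency: int       # 1 (rare) .. 5 (frequent)
        scope: int           # breadth: how widely patients/facilities are affected
        detectability: int   # 5 = insidious, i.e., hard to detect before harm
        preventability: int  # 5 = readily preventable, which argues for inclusion

        def score(self) -> int:
            # Simple additive score; a real assessment would weight the criteria
            # and treat "profile" (public attention) separately, per the text.
            return (self.severity + self.frequency + self.scope
                    + self.detectability + self.preventability)

    hazards = [
        Hazard("Infusion errors", 5, 4, 5, 3, 5),
        Hazard("Missed ventilator alarms", 5, 3, 4, 4, 4),
    ]
    for h in sorted(hazards, key=lambda h: h.score(), reverse=True):
        print(f"{h.score():2d}  {h.name}")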

Infusion failures/errors resulting in wrong dose, rate, duration or contamination

I’m guessing there’s a long list of possible failures and errors that could lead to or contribute to this hazard. Some that come to mind (a sketch of a simple dose-limit check follows the list):

  • Software bugs
  • Human-computer interaction (HCI) errors (wrong value entered due to extra keystroke)
    • Unit-of-measure confusion
    • Unclear instructions and cues
    • Alert fatigue
    • Unclear warnings
    • Unheard warnings (speaker volume low)
  • Monitor failures (false positive, failure to alert when alert condition is met)
  • Undetected physical damage (material fatigue cracks allowing water penetration)
  • Unannunciated battery failure
  • Electrical power failure
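
Several of the entry-related failure modes above (extra keystroke, unit-of-measure confusion) are exactly what soft/hard dose-limit checks in smart infusion pumps are intended to catch. Below is a minimal sketch of that idea; the drug names and limits are invented for illustration and are not a real drug library.

    # Sketch of a soft/hard infusion-rate limit check ("guardrails"). The
    # drugs and limits are invented examples, not clinical values.
    DRUG_LIMITS_ML_PER_HR = {
        # drug: (soft_max, hard_max) in mL/hr -- hypothetical
        "example_drug_a": (50.0, 100.0),
        "example_drug_b": (10.0, 20.0),
    }

    def check_rate(drug: str, rate_ml_per_hr: float) -> str:
        """Return 'ok', 'soft_limit' (confirmable), or 'hard_limit' (blocked)."""
        soft, hard = DRUG_LIMITS_ML_PER_HR[drug]
        if rate_ml_per_hr > hard:
            return "hard_limit"  # e.g., 500 entered instead of 50 (extra keystroke)
        if rate_ml_per_hr > soft:
            return "soft_limit"  # unusual; require explicit confirmation
        return "ok"

    print(check_rate("example_drug_a", 500.0))  # -> hard_limit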

Ventilator alarms

This issue includes two unrelated problems, one simple and infrequent, the other common and often called “preventable human error.” Human error may be the immediate cause, but systems that produce a large number of critical, preventable errors are flawed systems, meaning some combination of flawed hardware design and flawed operating procedures. The first problem, latent failure of a ventilator alarm resulting in an undetected breathing problem, has caused several deaths in the last ten years. The second, failure of caregivers to respond to an alarm reporting a critical breathing condition, is much more serious and has been near the top of ECRI’s list for the past five years.
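
For the second problem, one common mitigation is alarm escalation: an unacknowledged critical alarm is re-announced to a widening set of caregivers. A rough sketch follows; the intervals and the escalation chain are assumptions, not a recommended policy.

    # Rough sketch of alarm escalation for unacknowledged critical alarms.
    # Intervals and recipients are illustrative assumptions only.
    import time

    ESCALATION_CHAIN = [
        (0,  "bedside nurse"),
        (30, "charge nurse"),
        (60, "respiratory therapist"),
    ]

    def escalate(alarm_id, acknowledged, poll_s=1.0):
        """Notify each recipient once their delay elapses, until acknowledged."""
        start = time.monotonic()
        notified = set()
        while not acknowledged(alarm_id):
            elapsed = time.monotonic() - start
            for delay, recipient in ESCALATION_CHAIN:
                if elapsed >= delay and recipient not in notified:
                    print(f"[{elapsed:5.1f}s] notify {recipient}: alarm {alarm_id}")
                    notified.add(recipient)
            time.sleep(poll_s)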

Undetected Opioid-Induced Respiratory Depression

In 2006 an Anesthesia Patient Safety Foundation conference set a vision that “no patient shall be harmed by opioid-induced respiratory depression” and considered various changes to patient monitoring. In 2011, lack of progress toward that goal led to another conference that looked at details of patient monitoring during anesthesia. Alert fatigue was again a major factor. Inclusion in ECRI’s 2017 list suggests HCI issues related to oximetry and ventilation monitoring still warrant attention.
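
One HCI-level mitigation discussed in the alarm-fatigue literature is requiring a low-SpO2 condition to persist before annunciating, so that brief transient dips don’t add to the nuisance-alarm load. A sketch of that idea follows; the 90% threshold and 15-reading window are assumptions, not clinical recommendations.

    # Sketch of a persistence filter for SpO2 alarms: alarm only when every
    # reading in the window is below threshold. Threshold and window size
    # are illustrative assumptions, not clinical guidance.
    from collections import deque

    class Spo2Alarm:
        def __init__(self, threshold=90.0, window=15):
            self.threshold = threshold
            self.readings = deque(maxlen=window)  # ~one reading per second

        def update(self, spo2):
            self.readings.append(spo2)
            return (len(self.readings) == self.readings.maxlen
                    and all(r < self.threshold for r in self.readings))

    alarm = Spo2Alarm()
    for value in [94, 88, 93] + [86] * 15:  # brief dip, then sustained low SpO2
        if alarm.update(value):
            print("ALARM: sustained low SpO2")
            break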

Occupational Radiation Hazards in Hybrid ORs

Wouldn’t traditional radiation badges for the staff in hybrid facilities be a cheap solution?

Software Management Gaps

Yes, Judy, EHR vendors’ versioning practices from the 80s do impact patient care. So do sluggish IT departments. ECRI cites delayed implementation of software updates with safety ramifications and data inaccessibility as consequences.
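
A first step that doesn’t require vendor cooperation is a simple version-currency audit: flag deployed systems running behind a release the vendor has marked as safety-relevant. A trivial sketch; the inventory structure and field names are invented for illustration.

    # Trivial sketch of a version-currency audit. Inventory contents and
    # field names are invented for illustration.
    inventory = [
        {"system": "EHR module A", "deployed": (3, 2, 0), "latest": (3, 4, 1),
         "safety_release": True},
        {"system": "Lab interface", "deployed": (1, 9, 0), "latest": (1, 9, 0),
         "safety_release": False},
    ]

    for item in inventory:
        if item["deployed"] < item["latest"] and item["safety_release"]:
            print(f"OVERDUE: {item['system']} at {item['deployed']}; "
                  f"safety release {item['latest']} not deployed")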

Mitigations

Isn’t there some pretty low-hanging fruit for mitigation on this hazard list? Radiation badges, exhaust-fan filters on heater-cooler systems to catch aerosolized contaminants, and formal procedures for equipment cleaning that specify exactly what cleaning agents to use would seem to knock three items from the list with acceptable cost.

Correcting issues with software deployment and version management may take years, given the inertia of vendors and IT organizations, and will require culture changes involving hospital C-suites.

Despite decades of psychology studies showing that frequent and repetitive alarms (and excessive communication channels) degrade our ability to recall “known” information, cause us to forget which process step we’re performing, and cause us to randomly shed tasks from a mental list, computer and hardware interface design still struggles with information chaos. Fixing this requires the sort of multidisciplinary/interdisciplinary analysis for which current educational and organizational silos aren’t prepared. We have work to do.

ECRI deserves praise not only for researching and publishing this list, but for focusing primarily on hazards and secondarily on risk. From the perspective of system safety, risk management must start with hazard assessment. This point, obvious to those with a system safety background, is missed in many analyses and frameworks.

  – – –


In the San Francisco Bay area?

If you are, consider joining us in a newly formed Risk Management meetup group.

Risk assessment, risk analysis, and risk management have evolved nearly independently in a number of industries. This group aims to cross-pollinate, compare and contrast the methods and concepts of diverse areas of risk including enterprise risk (ERM), project risk, safety, product reliability, aerospace and nuclear, financial and credit risk, market, data and reputation risk, and so on.

This meetup aims to build community among risk professionals – internal auditors and practitioners, external consultants, job seekers, and students – by providing forums and events that showcase leading-edge trends, case studies, and best practices in our profession, with a focus on practical application and advancing the state of the art.

If you’re in the bay area, please join us, and let us know your preferences for meeting times.

https://www.meetup.com/San-Francisco-Risk-Managers/
