This is a guest post by J. Paul Curry, M.D.
My work was inspired 15 years ago, when I lost my best friend to a common medical-error phenomenon: the failure to adequately monitor patients in the hospital.
Losing Mark altered my entire career in medicine and started me on a long journey toward understanding how this particular problem happens. The journey has been eye-opening for many reasons, most importantly in teaching me how the human brain can deceive itself into believing that thoughtful, rational, goal-directed tactics are always the way to solve highly complex enigmas.
In reality, the blockbusting solutions that change the course of our culture, how we do things, are most often totally unpredictable, discovered by accident by disruptive innovators such as Dr. Larry Lynn of the Sleep and Breathing Research Institute, people willing to tinker on their own, against the grain of thousands of smart people who dismiss this kind of outlier work as fantasy. To appreciate just how often this happens and why, I'd invite those unfamiliar with Nassim Nicholas Taleb's work to read "The Black Swan: The Impact of the Highly Improbable" and his other books. This is what we're up against today.
I recently underwent significant multi-level back surgery at one of the outstanding university spine programs in the country, supported by one of the elite anesthesia programs. The resident told me I'd be going to the general care floor following my surgery, where I'd be checked on regularly; this was a given because I'm a fitness fanatic. But the resident wasn't prepared for my follow-up questions. As I probed for more detail, it became apparent that no one in the organization had any inkling that nursing checks occurring only every four or eight hours, on a patient fresh from surgery with patient-controlled narcotics, fell below the standard of care.
I told them I have mild sleep apnea and wanted pulse oximetry at a minimum; I had to be upgraded to telemetry to get it. More telling still, there was so little understanding of this problem that they put me on pulse oximetry in a room where the only one who could watch it was me, the patient.
The point? Medical errors are considered preventable because knowledge exists that predetermines the right thing to do, yet for whatever reasons it doesn't get done. In my case, the knowledge wasn't there to begin with. Even the progressive monitoring industry is making only incremental strides in trying to advance and change the culture; its work today is based on what it thinks it knows, not on what it doesn't know for sure, which leaves it considerably behind the thinking in Dr. Lynn's message. My frustration is rooted not in harm caused by carelessness, but in the continuing harm that occurs because we lack a critical mass of people willing to confront the unknown and accept what people like Dr. Lynn have been saying: that postoperative care must be driven toward a totally transformed entity, a "Black Swan."
There has been a great deal of work on human behavior and how best to motivate these kinds of mega-changes. Appreciative inquiry (AI) has been very successful. It is based on evidence that while people can learn from mistakes (if we're able to unearth them at all, since many assume bad consequences must be the result of carelessness), we do much better by examining and learning from our successes. That's why I've made these reading recommendations, and why I hope to continue, through candor, to add value to our work toward better patient safety in postoperative care.
J. Paul Curry, M.D. is the past Chief of Staff at Hoag Memorial Hospital Presbyterian and a current Clinical Professor in the Department of Anesthesia and Perioperative Medicine at Ronald Reagan UCLA Medical Center.
Dr. Curry researches clinical safety and co-authored the February 11, 2011 article "Patterns of Unexpected In-Hospital Deaths: A Root Cause Analysis" in the journal Patient Safety in Surgery.