December 10, 2019 – In 1999, the Institute of Medicine issued To Err Is Human, a 300-page declaration of a crisis in patient safety. The report made headlines with its estimate that as many as 98,000 Americans were dying each year from medical mishaps. Congress and the medical industry responded with alarm and promises of reform.
Medical errors, however, remain as vexing a problem as they were 20 years ago, according to Bloomberg Distinguished Professor Kathleen Sutcliffe of Johns Hopkins University and her new book from Oxford University Press, Still Not Safe: Patient Safety and the Middle-Managing of American Medicine, co-authored with Robert Wears.
In the Q&A below, Sutcliffe, an organization theory expert with faculty appointments at the Johns Hopkins schools of business, medicine, public health, and nursing, talks about the still-current threat to patient safety.
Q: As your book title states, many patients today still aren’t safe.
SUTCLIFFE: The problem is still there. It’s still big. We still have roughly the same rates of harm. The World Health Organization issued a study this past September that said 40 percent of patients in primary and outpatient care are harmed.
How do you define patient safety?
It’s the idea of preventing and avoiding harm to patients. Do no harm, as the expression goes. Harm doesn’t occur just at the hospital. It could be a misdiagnosis during a regular visit to your physician or the improper prescribing of medication. That same WHO report said millions of people are harmed each year by diagnostic and medication errors, and those mistakes cost billions of dollars.
In your book, you say that the health care industry has largely avoided taking a hard, honest look at itself and making systemic reforms, preferring instead to blame front-line workers such as nurses for errors.
To admit that these mishaps are not just the result of human error means that health care administrators would have to change their systems in a major way – and trying to change a large system is difficult. Over the past decade, there have been efforts to change this and create just cultures. That means trying to create an atmosphere of trust in which people are encouraged, and even rewarded, for providing safety-related information — but in which they are also clear about where the line must be drawn between acceptable and unacceptable behavior.
How do you think administrators should be responding?
The preferable response for an organization facing a crisis is to try to understand the context. Health care industry leaders focus on what has gone wrong when they should also focus on how things go right and what they might learn from that. What's problematic is that safety is really a non-event. What I mean is that when the system works as it should and nothing untoward happens, nothing attracts attention or initiates concern. When something bad happens, when an "event" occurs, the tendency is to want to point a finger at someone or something. And often that means blaming people lower down in the organization.
Yes, there needs to be a preoccupation with failure, but not in a blaming-shaming way. It means trying to understand in advance how everything you plan to do could go wrong. Anticipate accidents in the making. And when something does go wrong, you use that opportunity to get a sense of the state of the system as a whole. For example, if you’ve learned that every day on the second shift there’s going to be a lack of a certain resource at a particular place within the hospital, then you take the necessary steps to correct that.
Your book also recommends a multidisciplinary approach to addressing these problems.
Patient safety efforts should involve psychologists, sociologists, organizational behaviorists, and engineers, not just health professionals. Use the knowledge from multiple disciplines. Engineers helped bring about advances in anesthesiology. Health care is a huge industry, responsible for about a fifth of our economy, and the huge problems it faces could be better addressed by people from a variety of fields pooling their expertise.