Hospital CEOs Play Key Role In Reducing Preventable Harm

By Frank Mazza, Chief Medical Officer, Quantros
Our system of healthcare is inherently unsafe, with an estimated 400,000 people dying in American acute care hospitals each year as a result of medical errors. Considered in real terms, this figure is truly staggering: by sheer numbers alone, medical errors represent the third leading cause of death, after heart disease and cancer. By way of comparison, it is the equivalent of seven to eight fully loaded Boeing 737 jets crashing every day with the loss of all life on board. If this statistic were true of airline safety, few people would elect to travel by air. Nevertheless, every year, millions of Americans entrust their lives and personal safety to hospital personnel.
This acknowledged harm to patients does not result from carelessness or indifference on the part of healthcare workers. In fact, most healthcare workers are highly dedicated to their profession. Rather, the errors occur because systems of care are imperfectly designed and because the individuals within those systems function imperfectly. Hospitals operate within an environment of enormous complexity. An array of hospital units, each forming its own microsystem (ICU, ED, OR, radiology, etc.), combines into a larger system in which the need for tight coupling all but guarantees errors of both omission and commission.
Patient-safety luminaries — such as Don Berwick, Paul Batalden, and others — have repeatedly claimed that every system is perfectly designed to achieve the results that it gets. In saying this, they are pointing out the inconvenient truth that harm to patients in healthcare represents an inherent system property. In other words, our healthcare system has been designed in such a way that harm is unavoidable. Another contributor is the culture of healthcare and, in particular, the culture of medicine.
How do we legitimately make care safer for our patients? Reliable application and better tactical execution of best practices is one way — and the favored approach of the Centers for Medicare & Medicaid Services (CMS) and other quality regulators. Central line infections can be reduced, if not eliminated, by inserting lines under standard infectious barrier precautions, maintaining them under sterile conditions, and removing them when they are no longer needed. Similarly, injury from falls can be minimized through the appropriate use of bed alarms, by paying attention to the pain and sanitary needs of patients, and by limiting the use of drugs that can induce delirium. This tactical approach, however, is not enough. There is an additional need for “behavioral accountability” on the part of healthcare workers as a means to reach “high reliability” in care. Higher reliability can be achieved by recognizing and managing cognitive biases.
One particularly troublesome cognitive bias in healthcare and other industries is the human tendency toward normalization bias — more specifically, the so-called normalization of deviance (NOD), a term coined by Columbia University sociologist Diane Vaughan in her seminal study of the 1986 Space Shuttle Challenger disaster. By definition, NOD denotes “the gradual process by which unacceptable practice or standards become acceptable.”
The Challenger disaster is instructive in this regard and illustrative of how such deviance occurs. To prevent the leakage of hot, potentially dangerous combustion gases from the solid rocket boosters that helped carry the space shuttles into orbit, NASA engineers sealed the joints between booster segments with rubber O-rings. Early in the program, engineers inspecting the boosters after each voyage noted that the O-rings sustained significant damage during launch, particularly in cold weather, and concerns were voiced about the possibility of an explosion resulting from compromised O-ring integrity. After 24 successful shuttle missions, however, some NASA and industry scientists and engineers began to question their own judgment, to feel that their concerns about risk were perhaps overestimated; expectation of success in the design became the norm. Although the temperature forecast for the January 1986 launch of Challenger was the coldest in the history of the program, supervisors overruled the concerns raised by a group of engineers the night before, preferring to believe that the launch could proceed safely. The next morning, just 73 seconds after liftoff, Challenger broke apart, killing all seven astronauts on board.
Unfortunately, history is littered with other examples of how NOD/normalization bias has led to tragedy, including the Columbia shuttle disaster of 2003, the BP Deepwater Horizon well blowout of 2010 in the Gulf of Mexico, Toyota's stuck accelerator pedal problem, and the Bhopal Union Carbide chemical disaster of 1984, in which thousands of people perished as a result of exposure to methyl isocyanate gas.
Sadly, analysis of the evidence has demonstrated that none of the above-mentioned events was a one-time catastrophic failure that could not have been predicted in advance. Rather, each occurred within the context of a pattern of small failures that ultimately culminated in disaster. The common denominator in every case was the repeated overlooking of small failures, which were eventually normalized on the assumption that risk was negligible when it was not. The lesson for healthcare workers and their leaders is that these same tendencies underlie many of the serious safety events that occur in care settings today.
Remarkably, the behaviors that lead to NOD bear a strong resemblance to those that lead to corrupt business practices, with the exception that healthcare workers do not exhibit them with malicious intent — they frequently adopt them in the interest of relieving the pain and suffering of their patients. The path typically begins with socializing newly hired workers into the deviant ways that care is delivered in a given setting, often involving flagrant disregard of rules and policies in the interest of getting work done efficiently. This is followed by a collective rationalization that the deviant behavior is necessary, even desirable, and by pressure to “go along to get along.” Policies that require two-person patient identification, or the scrubbing of a central line port for 10-15 seconds to assure sterility, give way to workarounds in the name of “we have never had a problem with that here.”
Further, the tendency toward NOD may be hard-wired as part of the human condition. Numerous studies have shown that, when given hypothetical scenarios involving risk of injury or death, people tend to choose alternatives based upon previous experience rather than the realistic odds of harm. Healthcare professionals who initially assess the risk and choose convenience over safety in such scenarios will often switch to the safer alternative once told that another individual has been harmed. This tendency helps to explain why providers disregard policies and procedures designed to maintain safe practice in the name of expediting workflow. It also speaks to the potential for reducing, or even eliminating, such deviant behaviors when stories of failures that led to patient harm are actively shared.
If NOD is a predictable behavioral response by busy clinicians who must juggle ever sicker patients, ever increasing tasks, and a demanding workload, how can we truly assure safe care when some degree of harm can be anticipated? Fortunately, prevention and mitigation of deviant behaviors is possible. It is important to make clear that unsafe practices will be subject to immediate corrective action within the context of a Just Culture. Encouraging healthcare professionals to reflect on the cost of patient harm that resulted from well-meaning workarounds and other deviations can be a powerful tool. But without question, the most powerful mitigating factor against NOD is leadership — especially the CEO or practice/business owner making it clear that safety is the first priority. This is a trickier proposition than it may seem. The CEO who professes “safety first,” but then pushes to reduce operating room turnover times, will immediately lose all credibility with providers by tolerating the early removal of monitoring devices in the name of “negligible risk.” Likewise, nurses who are warned about billing for overtime will actively seek workarounds to get their work done faster. Providers are intelligent and observant, and they will expose any senior leader who invokes patient safety without modeling it as an “emperor with no clothes.”
The bottom line is that healthcare systems need redesign, and best practices in care delivery must be reliably surfaced and practiced. Deviant behaviors and NOD must be addressed to truly advance the safety culture in healthcare settings and achieve higher reliability. The role of leadership is crucial. Does your CEO pass the test?
About the Author
Frank Mazza, M.D., chief medical officer, Quantros, is a physician by training (pulmonary, critical care, and sleep disorders) and still practices medicine part-time. Prior to joining Quantros, he held several executive positions within the Seton Healthcare Family in Austin, TX, including system-level chief patient safety officer and associate chief medical officer, as well as vice president of medical affairs at Seton Medical Center, Austin.