Industry Voices—A 737 MAX rush to judgment: Lessons for healthcare leaders

Recently, the CEO of Boeing acknowledged that the flight control system of the 737 MAX may be to blame for the March 10 crash of Ethiopian Airlines Flight 302 as well as the October 2018 crash of Lion Air Flight 610.

“We’re deeply saddened and are sorry for the pain these accidents have caused worldwide. It’s our responsibility to eliminate this risk. We own it and we know how to do it,” said Dennis Muilenburg in a video statement released to the press.

While I appreciate Boeing's taking ownership of the system failures that (may have) led to the loss of 346 lives in these two accidents, I find the acknowledgment of culpability in stark contrast to the initial responses from Boeing, the Federal Aviation Administration, much of the American aviation industry and even many pilot groups in the U.S.

In fact, according to reports in the April 2 issue of The Wall Street Journal, after the Lion Air crash, top Boeing officials told U.S. pilot groups that American pilots would not be at risk of a similar event because they were better trained than pilots at foreign-flagged airlines. Consequently, many pundits pinned the cause of the Ethiopian Airlines crash on an inexperienced crew: the captain was the youngest captain at the airline (although he had more than 8,000 flight hours), and the copilot was a recent graduate of the airline's academy with only 350 hours of flight time.

This rush to judgment was one reason the U.S. airline industry and the FAA were so slow to ground the 737 MAX, even though nearly every other country around the world immediately grounded the plane after the second crash. 

Here’s my question, and how it relates to our work leading safety and reliability in healthcare: Why are we so prone to rush to judgment and attribute every human error to a failure of the person—the “bad apple” theory—instead of treating it as a symptom of larger system causes? I am guilty of this as much as the next person. I was convinced—given the relative inexperience of the crew and my unconscious bias regarding a perceived lack of training at non-U.S. airlines—that the Ethiopian Airlines crash had to be the fault of a couple of bad pilots who didn’t follow proper operating procedures or simply weren’t good enough.

I was wrong.

In fact, if you read the April 4 account of the final 50 seconds of the flight, these two pilots fought bravely to overcome a serious technical failure that caused the nose of the aircraft to violently pitch downward—one of the most difficult situations any pilot can imagine confronting. 

What are some lessons for healthcare leaders? 

  1. Leaders need to ensure they are clearly positioning safety as a core value that will never be compromised. It’s not enough just to state it. They have to live it, each and every day. In its statement, Boeing stressed that “safety is a core value for everyone” and that the safety of its airplanes, passengers, and crews is always the top priority. But were Boeing’s actions fully aligned with those stated values? Why weren’t the pilots notified about how the Maneuvering Characteristics Augmentation System (MCAS) worked before the Lion Air crash?

    Why didn’t Boeing ground the airplanes immediately after the Ethiopian Airlines crash? Many of the answers probably come down to the eternal struggle between safety and finances. Better procedural guidance on the MCAS would have required additional resources to train aircrews on its use. Grounding the 737 MAX fleet after the Lion Air crash would have caused a tremendous financial hit, as the eventual grounding did.
     
  2. When a patient or worker safety event occurs in an organization, leaders need to resist the instinct to blame the person and instead treat the human error as a symptom that points to system causes. James Reason, whom many consider the father of human error theory, stated that 85% of human error is system-induced. Organizations must commit to thorough cause analysis programs led by senior leaders who calmly state, for all to hear, that we will not rush to judgment. The first obligation is to ensure the safety of patients and to care for the caregivers; the next is to conduct a thorough and credible root cause analysis so that corrective actions are identified, implemented and sustained and similar events never happen again.

When tough decisions must be made, leaders must always err on the side that is safest for patients, their families and caregivers, without exception. Sometimes system factors are more obvious, like a technical failure. Sometimes they are less obvious, like a cultural failure in which stated priorities do not align with actions.

In all situations, leaders must avoid the urge to blame an event on human error until all system contributors have been fully investigated, including one of the strongest system factors: culture.

Steve Kreiser, a partner with Press Ganey Transformational Services, served as a U.S. naval officer and F/A-18 pilot, retiring as a commander after 21 years of leadership and management experience.