Boeing’s Culture Crisis: In Recommending ASAP, the Expert Panel Missed the Mark

January 5, 2024, a door plug blows out mid-flight. February 26, the Federal Aviation Administration’s (FAA) Expert Panel issues its report. Boeing is given 90 days to present a plan to the FAA. And the world is watching.

One recommendation from the Expert Panel was to implement the FAA’s Aviation Safety Action Program (ASAP) at all Boeing facilities. To do so, however, would be a mistake. Here’s why.

“Report only what you cannot hide,” declared Harvard’s Dr. Lucian Leape, testifying before Congress in 1997 about the state of safety reporting in healthcare. Healthcare, an industry whose death toll from error was the equivalent of a 747 crashing every day, was deeply punitive. Dr. Leape told Congress that the single greatest impediment to error prevention was punishing people for making mistakes.

In the mid-1970s, the FAA started a landmark program called the Aviation Safety Reporting System (ASRS). The FAA knew that it had limited access to what actually happened on the flight decks of commercial aircraft and in the cockpits of private aircraft. “What happens in the cockpit, stays in the cockpit,” was the code. For a pilot, making a mistake meant exposure to civil penalty, certificate suspension, and felony criminal prosecution. Consequently, expecting pilots to report errors to the FAA was about as realistic as expecting drivers to wave down a police officer to report running a red light.

Taking a page from your favorite crime show, the FAA offered pilots the opportunity to report their violations to a third party (in this case, NASA researchers) and receive limited immunity in return.[1] The FAA wasn’t really interested in the violators (i.e., the pilots); it wanted to know the causes of the violations so that they could be fixed. If the FAA deemed a violation “inadvertent and not deliberate,” the pilot could use the incentive as a “get out of jail free” card.

While the data in the ASRS helped the FAA learn about the national airspace system, it was not very helpful for airlines trying to manage risk internally. So, in the 1990s, the FAA created the Aviation Safety Action Program (ASAP). Event review teams, comprising management, labor, and the FAA, were set up to receive voluntary reports of inadvertent violations.

So how well has it worked to change aviation’s punitive culture? We recently queried employees across 13 U.S. industries about their organizations’ approach to human error. The study found that commercial aviation was the most punitive of the 13.[2] ASAP did not fix the U.S. aviation industry’s punitive culture; it reinforced it. International airlines and other industries are making progress: UK aviation scored 25 percentage points higher than U.S. aviation, and U.S. healthcare, to its credit, is now 16 percentage points ahead of U.S. aviation (see table).

Paradoxically, the FAA’s Expert Panel also recommends that Boeing adopt the principles of Just Culture, in which an organization’s policies and practices, as well as those of its regulator, support an accountable learning culture, whether the issues are service-related, safety-related, or tied to financial stewardship. In practice, ASAP and Just Culture are more in opposition than in alignment. ASAP takes inescapably fallible pilots and technicians, brands them violators when they make mistakes, and then, if they are not caught first, offers them limited immunity for their reports, provided they hide any intentional violations. What about that arrangement seems just?

The extremely narrow scope of ASAP’s enforcement-related incentive will be a detriment to Boeing’s safety culture work. All organizations have safety issues that extend well beyond the “inadvertent violations” covered by the ASAP rules.[3] Consider: a factory technician wants to report an ongoing, systemic workaround of a workplace rule (workarounds occur every day in aviation). An employee recovering from surgery wants to self-report his growing addiction to oxycodone. A female engineer wants to report being bullied by a misogynistic co-worker. A manager wants to report financial and schedule pressures. Are these safety-related? Yes. Do we want them reported? Yes. Unfortunately, ASAP provides no safe haven for reporting any of these issues.

The FAA’s Expert Panel expressed concern about employees raising safety issues with their managers. To raise an issue, the panel said, was to open the employee to retaliation. Its position was that only the unbiased ASAP team could be trusted to handle such reports. So, what of the examples listed above? Are these employees now compelled to conceal problems they might previously have discussed with their managers? ASAP in the Boeing factory and engineering offices will have every employee wondering: What can I safely report to the ASAP team, what can I safely report to my untrustworthy manager, and what should I hide? ASAP will inevitably perpetuate a culture of concealment of the very problems that need to be surfaced.

To its credit, the Expert Panel admitted that employees would prefer to talk with their managers, and for good reason. As a former Boeing engineer, I can say that talking to my manager was overwhelmingly my first choice for nearly everything workplace related. Had ASAP been my preferred process for resolving safety issues, I would have been communicating weekly with the ASAP team, which likely would have had little expertise to help; my manager, in contrast, was an international expert on the subject.

Building a team where issues can be raised internally, as a first course of action, is the optimal course, and the one more aligned with a Just Culture and a safety culture. Reporting all safety-related issues to a separate team, no matter how theoretically trustworthy that team may be, erodes rather than builds safety culture. Yes, there are times when an employee must report confidentially. They must occasionally become a whistleblower, internally within the organization or even externally to the FAA, OSHA, EEOC, or local law enforcement. But these should be rare events. Safety-related issues come up every day in engineering offices and on the manufacturing floor. The working premise of ASAP, that employees should, as a first option, bypass an inherently untrustworthy manager in favor of an external event review team, is fundamentally flawed and inconsistent with any principle of Just Culture.

The more we learn about Boeing’s current issues, the more they appear far too complex, entrenched, and intentional to be resolved by an ASAP Event Review Team. I was once lectured by an airline maintenance manager: “We paint a black and white picture for the leaders of this company and the FAA,” he said, “but to get an aircraft out of these hangar doors, it’s a gray world.” Prescribing ASAP as a cure for the secretive and punitive culture of aviation is to invite more of the same. It is time for Boeing and the FAA to have a moment of honesty about how the aviation system really works, to stop looking for workarounds, and to embrace the considerable work of changing culture, within both Boeing and the FAA.

Cite as: Marx, D. (2024). Boeing’s Culture Crisis: In Recommending ASAP, the Expert Panel Missed the Mark. www.justculture.com

David Marx is a former Boeing design engineer and maintenance engineer. David assembled Boeing’s first maintenance human factors group, which produced the Maintenance Error Decision Aid (MEDA), an award-winning investigative process used by airlines around the world. David also served as a human factors manager at a large U.S. airline and on the FAA’s Human Factors Research Advisory Committee.

[1] The FAA prefers the more socially acceptable terminology of “enforcement-related incentive.”

[2] See Marx, D., & Huntsman, D. (2024). Re: Human Error, We Live in a Punitive World. The Data Series, (1), 1–2. https://www.justculture.com/re-human-error-we-live-in-a-punitive-world/

[3] See Marx, D. (2024). Psychological Safety? I am a willful violator and so are you. A Just Culture Commentary, (1), 1–2.  https://www.justculture.com/psychological-safety-i-am-a-willful-violator-and-so-are-you/
