What your company can learn from NASA’s tragedies

BYU business professor Peter Madsen has been researching NASA’s safety climate ever since the Columbia shuttle broke apart upon re-entering Earth’s atmosphere on Feb. 1, 2003.

Specifically, Madsen has been studying and quantifying how the organization recognizes “near-misses”: events in which failure was narrowly averted and the outcome was still a success.

As reported in a news release, a new study of NASA’s safety climate coauthored by Madsen finds that recognition of those near-misses goes up when the significance of a project is emphasized and when organizational leaders prioritize safety relative to other goals, such as efficiency.

In other words, if you want to avert disasters, your employees need to feel like their work has greater significance, and they need to know that their leaders value safety.

“It is challenging for people to see something that didn’t have an overtly bad outcome as a near-miss,” Madsen said. “It’s part of human nature: We tend to over-weigh what happened instead of what could have happened. But that can be changed by effective leadership.”

Using a database of in-flight anomalies spanning two decades (1989-2010) of unmanned NASA missions, the researchers found that when NASA leadership emphasized the significance of projects and the importance of safety, the organization recognized near-misses for what they were instead of passing them off as successes.

The findings, which appear in the Journal of Management, can be applied by leaders in a number of industries where safety is paramount, including transportation, power generation, extraction, and healthcare.

“If you’re in an industry where safety is important and you really want your employees to pay attention to it, it takes not just talking about it, but backing it up,” he said. “Employees are very good at picking up the signals that managers are giving about what they really value.”

The same has held true for NASA over the years, Madsen found. When leaders have carried out those two steps, strengthening the safety climate and emphasizing the significance of projects, near-misses have been better catalogued and used to improve operations.

Unfortunately, Columbia launched during an era of low near-miss reporting at NASA.

An investigation into the accident revealed that the failure that ultimately doomed Columbia (foam debris striking the orbiter) had occurred on at least seven prior launches. On each of those, good fortune intervened. They were near-misses that passed as successes.

NASA’s own Columbia Accident Investigation Board identified NASA’s safety climate as a primary cause of its inability to see foam loss as a near-miss, stating, “NASA had conflicting goals of cost, schedule, and safety. Safety lost out.”

“A lot of safety improvements have happened after a disaster and they shine light on the deficiencies in the system,” Madsen said. “If you can pick up on those deficiencies before something happens, that’s the gold standard.”

Madsen’s connections to NASA go back to the time of Columbia’s loss, when he was in graduate school at U.C. Berkeley. His dissertation adviser was a well-known organizational safety expert, which led to Madsen and other Ph.D. students being assigned to work with NASA to research safety procedures.

He has maintained his NASA contacts ever since, and Edward W. Rogers, chief knowledge officer at NASA’s Goddard Space Flight Center, is a coauthor on the study. Robin Dillon of Georgetown University’s McDonough School of Business served as lead author.

