
Ten years later, Columbia's tragic loss serves as a warning to NASA

A NASA video focuses on a piece of debris falling from the external tank, then striking the left wing of the space shuttle Columbia during its launch on Jan. 16, 2003. Investigators say the damage led to the shuttle's destruction 16 days later during atmospheric re-entry. (NASA / Getty Images file)

HOUSTON — Ten years ago, the Columbia tragedy showed that not everyone at NASA had learned the most important safety lesson from the shuttle Challenger disaster, 17 years earlier. Will the new teams now stepping forward into the American spaceflight arena have to relearn the same bitter lesson?

Beyond the tragic loss of life, the greatest tragedy of the space shuttle Columbia was that NASA should have known better. As an organization and as a team, the agency learned nothing new from the 2003 disaster. Rather, the disaster was a harsh reminder of what NASA had forgotten. Or, as the German philosopher Friedrich Hegel wrote, "The only thing we learn from history is that we learn nothing from history."

After the Columbia and its crew of seven astronauts were lost, an independent investigation board delved deeply into the immediate causes of the disaster. But the board's chairman, retired U.S. Navy Adm. Harold Gehman, set his team an even more profound task. He wanted them to find out why, just 17 years after operational errors and bad engineering decisions doomed the space shuttle Challenger and its seven astronauts, the same types of management flaws had reinfected NASA's culture and struck again with equally hideous results.

The fundamental safety rule had been to base no belief purely on hope. Safety was a quality that had to be explicitly verified. To assume that all was well unless there were visible hazards was imprudent and irresponsible. Convenient, unverified assumptions of goodness had led to the loss of Challenger and its crew — and Gehman wanted to find out if the same kind of lapse had led to Columbia's loss.

Over the ensuing months, as investigators developed these deeper insights through extensive interviews and document reviews, they regularly conducted news briefings to answer questions about what they were discovering. I attended those briefings as a newly hired space analyst for NBC News, and I had a tough question on my mind.

"How much of your 'NASA safety culture' assessment," I asked, "could have been written before the accident?"

Gehman paused, thought deeply, and then sighed. "Maybe three-quarters of it," he acknowledged.

This was a dramatic moment: NASA itself could have done the diagnosis and come up with the get-well prescription without the cost of seven lives.

No accident

As it turned out, neither the 2003 Columbia disaster, nor the 1986 Challenger disaster, nor the robotic Mars mission failures of 1999, nor the cascade of near-death experiences of American astronauts aboard the Russian Mir space station in 1996 and 1997 was an "accident" in any traditional sense of the word. They weren't out-of-the-blue surprises, striking without warning. They didn't happen because "space is hard," as NASA apologists repeatedly proclaimed. It wasn't because we were pushing a fearsome frontier and just had to expect, and accept, such losses.

These bad things happened mostly because attitudes toward safety got soft. And as complacent carelessness and time-saving shortcuts crept into the culture, many people had noticed, had given warnings, and had been ignored.

There were many members of NASA's space team who continued to keep faith with the rigorous standards that had gotten America to the moon in the Cold War space race. Those space workers would later come to feel they were betrayed by their colleagues who had dropped the ball, and broken the chain, and made conscious choices that had lethal consequences.

It wasn't simply a matter of hindsight. People inside and outside the agency had been noticing the shift and raising objections to increasingly careless management choices, as front-line workers were overruled for schedule and budget reasons.

As a senior worker at the Johnson Space Center in the mid-1990s, I had been assigned more and more safety-related duties in addition to my primary specialization, orbital design work. As I learned more of the principles of flight safety, I saw more and more disconnects with the way it was being practiced, especially with regard to the diplomatically motivated "shuttle-Mir" program.

Following a series of near-fatal crises while American astronauts were aboard Mir in 1997, NASA managers prepared arguments for continuing the project — a continuation directed by White House officials. One manager wrote, "Despite concerns, there is no hard evidence that Mir is currently unsafe." Another asserted, "The experts that we had asked, the majority of them, determined that there were no technical or safety reasons to discontinue the program."

I had already learned enough about NASA's "best practices" to recognize that these officials had it completely backward. In the real world, you don't assume safety and seek evidence of danger — especially when working with the Russians, who regularly covered up flight hazards. You must decide positively to continue only after a thorough hazard review, and without it, you do not continue. If a vocal minority, or even a single engineer, objects, you address those concerns head-on.

Looking back, looking forward

After leaving the NASA program in 1997, I was able to write more candidly about these safety concerns, first regarding the mismanagement of a fleet of Mars robots in 1999, and later in a 2002 book chapter on Mir safety.

I concluded an article written for Scientific American with a warning: "NASA will have to address its systemic weaknesses if it is to avoid a new string of expensive, embarrassing and perhaps in some cases life-threatening foul-ups."

Quoting retired colleagues whose judgment I had learned to respect, I noted in a 2000 report for New Scientist that critics were accusing NASA of "repeating the errors that led to the Challenger disaster."

"The consequences of a future accident could, also, be fatal," I wrote, three years before the Columbia disaster. "So far, no more human lives have been lost, but the question NASA must answer is whether this will continue."

So when Admiral Gehman acknowledged that an accurate diagnosis of the systemic flaws leading to the disaster could have been made before seven astronauts died, I knew he spoke the truth. And I knew that the next generation of "safety hawks" and pain-in-the-ass picky whiners in the space effort would need to be more effective than I was.

Those chapters are yet to be written.


NBC News space analyst James Oberg spent 22 years at NASA's Johnson Space Center as a Mission Control operator and an orbital designer. He is the author of several books on space history and space policy.