Warnings Unheeded

We don’t know what the late Roger Boisjoly, who died last month, thought about climate change. But we do know what he thought about expert warnings that remained unheeded.

Via the LA Times:

Roger Boisjoly was an engineer at solid rocket booster manufacturer Morton Thiokol and had begun warning as early as 1985 that the joints in the boosters could fail in cold weather, leading to a catastrophic failure of the casing. Then on the eve of the Jan. 28, 1986, launch, Boisjoly and four other space shuttle engineers argued late into the night against the launch.

In cold temperatures, o-rings in the joints might not seal, they said, and could allow flames to reach the rocket’s metal casing. Their pleas and technical theories were rejected by senior managers at the company and NASA, who told them they had failed to prove their case and that the shuttle would be launched in freezing temperatures the next morning. It was among the great engineering miscalculations in history.

A little more than a minute after launch, flames shot out of the booster joint, melted through the nearby hydrogen fuel tank and ignited a fireball that was watched by the astronauts’ families and much of the nation on television. Boisjoly could not watch the launch, so certain was he that the shuttle would blow up. In the months and years that followed, the disaster changed his career and permanently poisoned his view that NASA could be trusted to make the right decisions when matters came to life and death.

A more extensive, contemporaneous account of Boisjoly’s involvement in the Challenger disaster is online at the LA Times site.

Sometime on the night of Jan. 27, 1986, Roger Boisjoly’s feeling of frustration turned to anger.

He and a team of engineers unanimously recommended postponing the shuttle launching because of the frigid conditions in Florida. But four Thiokol vice presidents, under pressure from NASA in a late-night telephone conference, overruled their top technical experts.

Boisjoly saw it coming. After listening to NASA executives challenge the reliability of their scientific data and complain that they were “appalled” at the recommendation for delay, Thiokol’s general manager, Jerry Mason, told his fellow vice presidents: “It’s time to take off your engineering hats and put on your management hats.”

“I knew they were going to change,” Boisjoly said. “I couldn’t believe it, but I knew it.”

Trying to head off the shift, Boisjoly raised his voice. Later, some would say that Boisjoly’s emotional approach actually worked against him. That he had made too many dire warnings and “waved his arms and rolled his eyes” too many times before. That he was like the boy who cried “Wolf!”

Boisjoly shrugs. “When you see a guy about to walk off a cliff you don’t whisper. You yell at him to move back.”

Slashdot summarizes:

Boisjoly, Allan J. McDonald and three others argued through the night of 27 January 1986 to stop the following day’s Challenger launch, but Joseph Kilminster, their boss at Morton Thiokol, overruled them. NASA managers didn’t listen to the engineers. Both Boisjoly and McDonald were blackballed for speaking out. NASA’s mismanagement ‘is not going to stop until somebody gets sent to hard rock hotel,’ Boisjoly said after the 2003 Columbia disaster. ‘I don’t care how many commissions you have. These guys have a way of numbing their brains. They have destroyed $5 billion worth of hardware and 14 lives because of their nonsense.’

(emphasis added)

This is not a unique instance of the phenomenon of ignoring inconvenient warnings. The disaster that hit New Orleans in the aftermath of Hurricane Katrina was not only predicted, but in fact had a precedent.

Certainly it’s easy to see people ignoring sound advice in daily life, but it is more striking when billions of dollars and human lives are at stake. Are there other examples at this scale in engineered systems?

They may shed light on our difficulties in convincing the world about climate and other sustainability issues.

Comments:

  1. I am given to understand that the Russians were warned of the potential positive-feedback dangers of the type of nuclear reactor used at Chernobyl. [But I must also point out that 11 reactors of that type are still running.]

    • Yes, I had seen Tufte's similar argument. It's an object lesson in how you can fool yourself, in particular with sample bias (the pre-launch temperature analysis considered only the flights that had shown O-ring damage, leaving out the ones that hadn't).

  2. I'm going to risk attracting phony skeptic rebuttals like a dog with fleas, and enjoy my penchant for mixed metaphors by mounting a hobby horse here.

    Feynman, in order to show how simple and obvious the science here is, did a demonstration with an O-ring and a glass of ice water. I didn't have a TV at the time, so I have it at second hand, but it seems the discussion was about how some technical people wanted to claim it wasn't obvious when it was, and to pass the buck.

    Feynman is frequently quoted in support of phony skeptic talking points, and sometimes the full text of what he said is provided. As a skilled debating technique this is hard to beat. The reader may not know enough to realize that the points Feynman makes do not support the arguments of those who would deny climate science (and who insist they not be called on their denial, for fear of being accused of accusing deniers of being deniers because of association with Nazis -- how convoluted can we get?). "Cargo cult science", as far as I can see, applies exactly to denial, but it is being claimed wholesale for the wrong side. (I get into particular trouble here because he was part of a group of friends who met via an art center at MIT during his tenure at Thinking Machines; you'd think I could make it clear that he would make short work of them, but it has the opposite effect. My father (PW) also has a good idea of his thinking, but that too doesn't get anywhere. The power and vitality of the big lie is unfathomable; the truth doesn't have much chance in virtual but unvirtuous discussions.)

    I am troubled by the word "side" here as well, as it denotes an equivalence that does not exist. By acknowledging an argument, we give it a substance that it should lack.

    Continuing my foot-in-mouth, angels-fear-to-tread behavior, I am going to call this evil. It is time that calling evil "evil" stopped being something we are not allowed to do for fear of reprisal.

    • Whoa... Which reminds me of an article by PW, "Brainwashed by Feynman?" (which is actually about Feynman diagrams, not Feynman himself). Is he still that crisp? What does he think of AGW? I'm sooo fed up with elderly guys like Freeman Dyson turning from science hero into science antihero. Maybe it's time for someone of PW's calibre to write a WSJ op-ed...

    • Whoa yourself. He doesn't want to pronounce outside his field, he's 88, doesn't want to get down in the thickets, and is busy with his own stuff (HTC I think). He figured out the physics in the mid-70s. I call myself a firebrand (former black sheep) & he has to conserve energy. My physics ain't so gud. The reason I'm coming out is that I provided the cover art for his recent book, "More and Different: Notes from a thoughtful curmudgeon", which is an eclectic collection of reviews and stuff, a bit pricey. If by crisp you mean he doesn't like to waste time, yes.

  3. The people multiplying the evil, it is important to note, are largely not aware they are promoting evil; but the promulgation of the big lie they have bought into is evil nonetheless. The idea that politics trumps reality is dangerous, and a lot of hot air is obscuring the dangerous truth that we are rapidly approaching the volumetric trashing of our only home in every way possible.

    • For people like Dyson (this applies to non-scientists as well) it's not so much a matter of buying a Big Lie as of having a sense of reality based on physical intuition developed over a lifetime, which, note, has served them well in some other circumstances, although maintaining their views does require them to choose not to examine the contradictions carefully. The resistance to and pushback against evolution and relativity among scientists and laypeople alike is exactly on point here, and I'm afraid Planck was quite right that such people will die before they consider changing.

      Further to my point about the Big Lie, go have a look at what WUWT has turned into. It used to be that Watts would have to defend many of his posts at length, but now things are moving so fast (~6 posts per day) that most substantive criticisms raised in the comments can just be ignored. After all, it's not like anyone is keeping track.

      The full-on Gish Gallop there is no accident, as it gives its readers (and such material gets distributed widely) a quick gloss without time to really consider the material, then it's on to the next thing. So it's more like filling the communications space with a thousand Small Lies, which allows the audience to have continuing confidence in the Big Lie even as some of the small ones fall by the wayside. This works fine for scientists, too. Heck, it seems that some can even administer it to themselves, as with RP Sr.

  4. For every warning that was ignored and ultimately caused catastrophe, there are orders of magnitude more warnings that went unheeded and resulted in nothing. Y2K is one example, and we all know there is a well-worn list of these passed around on a regular basis.

    The space shuttle has over a million parts, and the list of possible things that can go catastrophically wrong is enormous. Humans try to account for all of these with testing, backup systems, monitoring, analysis of previous flight data, etc. If we waited for every possible risk to be avoided and for every engineer to withdraw every objection, the shuttle would never leave the ground.

    Risk analysis is a tricky business, and clearly they got this one wrong. Determining what failed and why after a catastrophe is substantially easier than predicting a future failure.

    If you go back to the Apollo days, you will see that those guys were real cowboys, taking crazy chances in light of how we handle things today. Getting to the moon in the 1960s was an epic risk, and it was successful. Boisjoly would never have signed off on an Apollo launch.

    As far as expert predictions related to climate change go, this is apples and oranges. Risk assessment is as much about values as it is about data and uncertainty.

    Ah, but a man's reach should exceed his grasp, Or what's a heaven for?

    • Y2K did not go unheeded. This points to another problem you raise: inadequate reward for risks avoided, because in most cases the people who were unconvinced of the risk remain so.

      In climate, as things continue to turn out more or less exactly as predicted, people still can't bring themselves to believe it. What happens when the system goes so far off the rails that nobody can predict it? People will say we're refuted. Anything to keep those money pumps going, just like in China.

      As for Boisjoly "not signing off on an Apollo mission", that is a pretty damned lame excuse for the ignorant triumph of management over engineering.

      Finally, regarding your "risk assessment is as much about values as it is about data and uncertainty": well, that's certainly an issue. Some people want to write off the future to save the present. Let them come out and say so, though, instead of lying about what the evidence actually says.

    • Tom, I find it puzzling that you are adamant about avoiding "economic sacrifice" at all costs, while you have no problem with the sacrifice of human life.

      -- frank

  5. I'd say that the cardinal law for dealing with human stupidity is to assume that humans were stupid, are stupid, and will continue to be stupid.

    And I think the best way to deal with stupidity is to have a 'last line of defence' allowing us to mitigate the inevitable impact of whatever stupid things the stupid people are doing. Lawsuits, political manoeuvres, 800-pound gorillas -- whatever it is, you need to have something to fall back on in case you can't convince the other guy not to do the stupid thing.

    So we need to ask this: what is the 'last line of defence' of climatology? Does it even have one? Is it thinking of having one? The climate inactivists have shown that they're willing to employ fake 'expertise', sock puppets, vexatious FOIs, computer hacking, and even rape threats -- all even as the Earth's climate continues to edge closer and closer to the runaway precipice.

    Now, the Climate Science Legal Defense Fund is a good start, but clearly much more is needed in order to build a reliable 'last line'.

    -- frank

  6. It's not stupidity - it's the mental frame of reference. Most managers have no idea about physical limits or the obduracy of materials. Their education and environment tell them that compromises are always possible, that finding the point where the different opinions balance best is the important thing, and that technical opinions are just that - opinions. The techies are seen to "just not get it". And it's true for them most of the time. And it's true that the techies often overstate, and the sales people overstate routinely. It's a world of shades of gray, with no real limits. Lived there, did not like it.

    • Peter T, I refuse to believe that one's not guilty of grave stupidity when one indiscriminately applies the 'it's all about opinions and there's no underlying truth' mantra to every situation.

      -- frank

      • Frank, please don't be so grumpy.

        Peter was obviously not defending that position. It's just that a manager has to balance credibility. Engineers do tend to overstate their case. I have felt the temptation to do so when I have had my engineer's hat on, and I understand the difficulty that a manager has in weighing that opinion. It is exactly that difficulty that the deniers are leveraging. When engineers say that nuclear power plants can be made absolutely safe, what is your reaction?

        The difference in the present case, though, was that Boisjoly was in no way expressing typical engineering overconfidence. He was expressing underconfidence. Nobody likes to hear that, especially in a commercial engineering setting.

        But sometimes doubts are important. Boisjoly was written off not as an overconfident engineer, but as a tedious worry-wart.

        Which brings to mind another comparable example: "Bin Ladin Determined To Strike in US". Management wrote that one off as too inconvenient as well.

        It's also important to understand that errors, while to be avoided diligently, cannot be eliminated. One part of every voter's high school education should be about Type I and Type II errors, false positives and false negatives, and a bit on sample bias, and risk. The number of people who understand these issues is tragically small.
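        To make that concrete, here is a toy sketch in Python of how Type I and Type II errors play out for a warning system. The 2% base rate, 10% false-alarm rate, and 20% miss rate below are invented purely for illustration, not taken from any real data: the point is that when real danger is rare, even a fairly reliable warning system produces mostly false alarms, which is exactly what makes the boy-who-cried-wolf dismissal so tempting and so dangerous.

        ```python
        # Toy simulation of Type I and Type II errors for a warning system.
        # All three rates below are invented for illustration only.
        import random

        random.seed(0)

        BASE_RATE = 0.02         # P(real danger) on any given occasion (assumed)
        FALSE_ALARM_RATE = 0.10  # P(warning | no danger): Type I error rate (assumed)
        MISS_RATE = 0.20         # P(no warning | danger): Type II error rate (assumed)
        TRIALS = 100_000

        false_alarms = misses = true_alarms = 0
        for _ in range(TRIALS):
            danger = random.random() < BASE_RATE
            warned = random.random() < ((1 - MISS_RATE) if danger else FALSE_ALARM_RATE)
            if warned and not danger:
                false_alarms += 1    # cried wolf
            elif danger and not warned:
                misses += 1          # the disaster nobody warned about
            elif danger and warned:
                true_alarms += 1     # a warning worth heeding

        print(f"fraction of warnings that were false alarms: "
              f"{false_alarms / (false_alarms + true_alarms):.2f}")
        print(f"fraction of real dangers that went unwarned: "
              f"{misses / (misses + true_alarms):.2f}")
        ```

        With these made-up numbers, roughly six out of seven warnings are false alarms even though the system misses only one real danger in five; judging the warners by their false-alarm count alone is precisely the sample-bias trap.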

    • MT, what I'll say is, there's a difference between (1) a mistake made because the truth was wrongly approximated, (2) a mistake made because there was too little information to reliably approximate the truth, and (3) a mistake made because one wasn't thinking about approximating the truth at all. The third kind is pure stupidity, and in my mind was the kind that caused the Challenger disaster.

      (9/11 was probably of the second type; the intelligence report you cited was just too short on information to allow anything useful to be done to counter the threat.)

      Anyway, my main question remains: what do you do when you see someone doing (or about to do) something totally stupid -- a mistake of the third kind?

      -- frank

  7. re Frank's comment

    OK, it's stupidity - but it's a kind of stupidity that most people have. The manager's framing tells him/her that absolute truths are very rare, and that the central point of the opinions around the table is the least risky option. This is a poor way to deal with the physical universe, but dealing with the physical universe is not what managers do.

    The reverse is where people who are used to seeking physical truths try to manage social issues. They often try to treat social conventions as absolutes, or attempt to derive complex multi-causal behaviours from a few simple rules and treat them as a deductively certain truth. The results are either amusing or disastrous depending on how much influence these people have. In other words, it's another kind of stupidity. The trick is in checking what you are dealing with, and not assuming that the way you think applies everywhere.

    • Peter T, I don't recall coming across any genuine instances of the latter type of fallacy. (Ayn Rand doesn't count -- she was never much of a scientist or truth-seeker.) Meanwhile, the idea that 'there's no underlying objective truth and even if there is it doesn't matter' is steadily gaining ground.

      -- frank

  8. Frank

    For the latest instance of the second kind of stupidity, check current EU economic policy, particularly as regards Greece. For an earlier manifestation, check Milton Friedman and Chile. For a merely amusing rather than disastrous example, people like Steven Pinker come to mind. Or read about Galton, Haldane and the eugenics movement.

    • The first is a poor example, as it's more a matter of politicians using economics (not any sort of science in any case) as cover for what they want to do anyway. Within the profession, the austerians are in full retreat, yet the concept remains central to EU policy, as well as to policy in the US, although recently reality seems to be setting in a little more there.

      The second may be a better example, although I'm not familiar enough with the historical details to comment on it. I would note that it involves medical science rather than a hard science, and in its origins dates back to before the introduction of rigorous research practices.

      An interesting further case involving medical science is the anti-smoking movement, which was very much motivated by research results and made progress despite resistance from government motivated by tobacco dollars (plus of course the population of addicts, many of whom wanted to quit but still didn't want prices going up). Of course it wasn't just a matter of science, as anyone with their eyes open could see the problem.

      It's an interesting topic, but in general we have to watch out here for policies truly being pushed from the science end vs. opportunistic politicians looking for excuses for stuff they'd like to do anyway.

  9. Good blogpost here on the different viewpoints of climate policy analysts (or some of them at least) and climate advocates:

    "...The logic of political action and movement building is different from the logic of policy efficiency..."

    http://greenpolicyprof.org/wordpress/?p=790

  10. Dr. Smith Dharmasaroja was abused and threatened during the ten years he spent trying to warn of a possible tsunami in the Indian Ocean. We all know what happened on Boxing Day 2004.

    “You’d really have to go digging into very old historical records and the scientific literature and extrapolate from what’s there to find that yes, there could be effects (leading to tsunamis) in Thailand,” says Phil Cummins, a seismologist who studies the region at Australia’s national geological agency. “But he was correct.”

    Perilocity: Tsunami Smith.

