There’s an excellent and thoughtful piece on Medium by David Wolman on the prosecution of geophysicists in the aftermath of the disastrous L’Aquila earthquake in Italy.
I can’t help noticing that whether they are issuing real warnings or refuting fake ones, scientists fare badly in public nowadays. Critics of climate science go on about “post-normalcy”, wherein they claim that science behaves oddly once it has policy implications. I think this model is completely wrong.
Science behaves according to codes that are utterly predictable to its participants. These may not be perfect, but transgressions are very rare. But when the table is set by nonscientists, the law, the press, and the public are ill-equipped to distinguish the charlatans from the actual members of the scientific community.
In the case of L’Aquila, irresponsible earthquake warnings were being issued by a nonscientist. The scientists never claimed that an earthquake was not imminent. They simply asserted, correctly, that there was no evidence one way or the other. But this message was garbled and misunderstood. When the earthquake actually came, people were perhaps more complacent than they ought to have been. This may indeed have contributed to mortality. We can’t actually run the earthquake twice to be sure.
Expressing probabilities correctly does not mean predicting things accurately. It is a bad idea to draw to an inside straight, even if you once won a big pot doing so on a hunch. Your hunch was wrong; your behavior was unreasonable; the good outcome does not prove you were right. In the earthquake case, the bad outcome does not prove that the charlatan was right or the scientists wrong. It seems most people cannot understand this; people like that are what make it possible to earn a living playing poker.
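The poker point can be made with a little arithmetic. A toy sketch, not poker strategy: assume a standard inside (gutshot) straight draw with one card to come, so 4 helpful cards among 47 unseen, and a simplified payoff where you win the pot if you hit and lose your call if you miss. The function name and stakes are illustrative, not from the original.

```python
from fractions import Fraction

def call_ev(p_hit, pot, bet):
    """Expected value of calling `bet` to chase a draw into `pot`.

    Toy model: with probability p_hit you hit and win the pot;
    otherwise you lose the bet. Real poker has more moving parts.
    """
    return p_hit * pot - (1 - p_hit) * bet

# Inside straight: 4 helpful cards among 47 unseen, about 8.5%.
p = Fraction(4, 47)

# Calling a half-pot bet (50 into 100) is a losing play on average,
# even though roughly one time in twelve it pays off handsomely.
ev = call_ev(p, pot=100, bet=50)  # negative expected value
```

The occasional big pot is exactly the "good outcome" that does not prove the decision was right: the expected value of the call is negative regardless of how any single hand turns out.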
But when the stakes are high, says Fischhoff, as when communicating seismic risk, “we owe it to people to understand what the specific barriers are and how we can best get past them.”
This is where the scientists and engineers of the Serious Risks Commission went wrong, even if they didn’t realize it. They had no sense of how their words would land. They were used to closed-door meetings, and the commission’s mandate was to advise the Civil Protection Department, not the public. But once microphones and cameras were added into the mix, everything changed: they were now risk communicators, and whether they knew it, or how they felt about it, was irrelevant. (Unfortunately, says Fischhoff, another robust result in social science is that “people tend to exaggerate how well they communicate.”)
Yet they had to speak up. Someone had to. If they didn’t, there would have been no counter-message to the false information that was infecting the community, thanks to a one-man panic driver.
My point here is that the trouble did not begin with science. It began with anti-science. Likewise, climate science was conducted in a perfectly normal and sound way until McIntyre started lobbing bombs at it. Science is no utopia, and academia has numerous flaws, but it has norms that are pretty broadly respected. Post-normalcy does arise from politics, but in the cases I know about, the bizarreness enters from outside the scientific community and is injected into scientists’ lives, entertaining the press at the expense of making serious people’s lives miserable.