An article by Josh Fischman at the Chronicle of Higher Education discusses the recent strange case of the Italian court decision against 4 scientists, 2 engineers, and a government official who, as part of a government earthquake commission, had provided a reassuring assessment of seismic risk a week before a major earthquake struck the town of L’Aquila. About 300 people died, and the men have been sentenced to 6 years each in prison for manslaughter. As Fischman reports, scientific societies around the world have reacted with alarm, worried in particular that this precedent will scare scientists away from sharing important information with the public.
But the case isn’t as simple as one might think from some of the reporting (and alarm). As Fischman notes,
… court testimony indicated that the scientists had felt pressured by public officials to calm the worried citizens.
So what actually happened here is that, in the face of significant uncertainty, those who spoke to the public underplayed the risks. Whether or not that is grounds for jail time, the issue was not “scientists sharing important information” but scientists and the public officials they worked with sharing information that was incomplete and inaccurately reassuring. Uncertainty is not your friend, as we at Planet3.0 have been saying for quite some time now.
And if you want to play it safe, you are more likely to overemphasize the risks of harm than to downplay them. Nate Silver has talked about this with regard to weather forecasting:
In what may be the worst-kept secret in the business, numerous commercial weather forecasts are also biased toward forecasting more precipitation than will actually occur. (In the business, this is known as the wet bias.) For years, when the Weather Channel said there was a 20 percent chance of rain, it actually rained only about 5 percent of the time.
People don’t mind when a forecaster predicts rain and it turns out to be a nice day. But if it rains when it isn’t supposed to, they curse the weatherman for ruining their picnic. “If the forecast was objective, if it has zero bias in precipitation,” Bruce Rose, a former vice president for the Weather Channel, said, “we’d probably be in trouble.”
While I don’t condone distorting what forecasters actually know, that knowledge is uncertain, and risk-weighting forecasts is rational. Since bad weather causes people more trouble than good weather does, a forecast with at least a slight bias toward the bad-weather side is obviously useful to them.
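The logic can be made concrete with a toy decision-theory sketch (the cost numbers below are hypothetical, chosen only for illustration, not anything from Silver's account): when a miss costs much more than a false alarm, taking the precaution is the cheaper bet even at quite low probabilities of rain.

```python
def act_threshold(cost_false_alarm, cost_miss):
    """Probability of bad weather above which taking the precaution
    (say, moving the picnic indoors) minimizes expected cost.

    Acting costs cost_false_alarm regardless of weather; not acting
    costs cost_miss with probability p. Acting wins when
    cost_false_alarm < p * cost_miss.
    """
    return cost_false_alarm / (cost_false_alarm + cost_miss)

# Suppose a needless precaution costs 1 unit and a ruined picnic costs 10.
t = act_threshold(1, 10)
print(round(t, 3))  # -> 0.091
```

Under these assumed costs, the rational listener should act on anything above a ~9% chance of rain, so a forecast nudged upward pushes people toward the cheap precaution rather than the expensive regret. That asymmetry, not dishonesty alone, is what makes the “wet bias” commercially stable.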
And yet somehow Fischman moves on to quote a couple of people (how did they even get inserted into this?) who have entirely missed the irony with respect to their own pronouncements on climate:
Mr. Pielke [Roger Pielke Jr.] said scientists advising a public-safety agency need to stick to the science, and leave recommendations for action up to public officials. “When a policy maker asks a scientist for an estimate of an earthquake, or a nuclear-power-plant meltdown, the scientist can answer with numbers,” he said. “Or he can give a range of options for action. But he has to clarify the difference between options and actual decision making.”
Ms. Curry [Judith Curry], who researches hurricane intensity and climate at Georgia Tech, said that scientists “have to be more sensible about giving uncertainty statements. There is a problem with researchers’ trying to be overspecific about courses of action.” And actions, she said, are for emergency managers.
Those familiar with Pielke and Curry’s arguments that a changing climate is nothing to worry about must be scratching their heads.