I’ve been invited to participate in a new Bray & von Storch survey of climate scientists.
The questions seem structured to de-authenticate climate models, and they will read very differently inside the field than outside it.
When asked, for instance, “How well do you think atmospheric models can deal with the influence of clouds?” on a scale from 1 = “very inadequate” to 7 = “very adequate”, my honest answer for the modeling community is far more severe than my honest answer for the public, because the context in which each audience will perceive the answer is dramatically different. Many respondents are unlikely to notice this bait and switch.
There’s also this question:
What can I say to that? “No answer” is not my answer, nor are any of the choices.
The lack of a >100% option shows that the survey is being conducted by people who are not paying attention.
I’m not sure I understand the purpose of such surveys anyway. If you want the consensus of a field, you ask the research leaders of that field.
I’m afraid I don’t qualify. I guess I don’t want to be a member of a club that would have me as a member.
I’m inclined not to complete the survey. Or I could just be ornery and check the most alarmist-friendly box on every question; unreal answers, skewed opposite to the survey’s intent, seem fitting for an unreal survey.