Science: Climate Change Linked to Large Scale Severe Events (UPDATED)

Environmental Research Web is among several outlets running with a Potsdam Institute for Climate Impact Research (PIK) press release about a forthcoming dynamical analysis by Petoukhov et al., supporting the suspicion many have had that recent persistent extremes are connected to climate change. Rahmstorf and Schellnhuber are on it, and it’s going into PNAS, so while it’s not settled science it has an impressive pedigree.

It’s Petoukhov, V., Rahmstorf, S., Petri, S., Schellnhuber, H. J. (2013): Quasi-resonant amplification of planetary waves and recent Northern Hemisphere weather extremes. Proceedings of the National Academy of Sciences (Early Edition) [doi:10.1073/pnas.1222000110]

Weblink to the article (once it is published): www.pnas.org/cgi/doi/10.1073/pnas.1222000110

Here’s the release:

02/25/2013 – The world has suffered from severe regional weather extremes in recent years, such as the heat wave in the United States in 2011 or the one in Russia 2010 coinciding with the unprecedented Pakistan flood. Behind these devastating individual events there is a common physical cause, propose scientists of the Potsdam Institute for Climate Impact Research (PIK). The study will be published this week in the US Proceedings of the National Academy of Sciences and suggests that man-made climate change repeatedly disturbs the patterns of atmospheric flow around the globe’s Northern hemisphere through a subtle resonance mechanism.

[Figure: Meridional wind field over four different time spans.]

“An important part of the global air motion in the mid-latitudes of the Earth normally takes the form of waves wandering around the planet, oscillating between the tropical and the Arctic regions. So when they swing up, these waves suck warm air from the tropics to Europe, Russia, or the US, and when they swing down, they do the same thing with cold air from the Arctic,” explains lead author Vladimir Petoukhov.

“What we found is that during several recent extreme weather events these planetary waves almost freeze in their tracks for weeks. So instead of bringing in cool air after having brought warm air in before, the heat just stays. In fact, we observe a strong amplification of the usually weak, slowly moving component of these waves,” says Petoukhov. Time is critical here: two or three days of 30 degrees Celsius are no problem, but twenty or more days lead to extreme heat stress. Since many ecosystems and cities are not adapted to this, prolonged hot periods can result in a high death toll, forest fires, and dramatic harvest losses.

Anomalous surface temperatures are disturbing the air flows

Climate change caused by greenhouse-gas emissions from fossil-fuel burning does not mean uniform global warming – in the Arctic, the relative increase of temperatures, amplified by the loss of snow and ice, is higher than on average. This in turn reduces the temperature difference between the Arctic and, for example, Europe, yet temperature differences are a main driver of air flow. Additionally, continents generally warm and cool more readily than the oceans. “These two factors are crucial for the mechanism we detected,” says Petoukhov. “They result in an unnatural pattern of the mid-latitude air flow, so that for extended periods the slow synoptic waves get trapped.”

The authors of the study developed equations that describe the wave motions in the extra-tropical atmosphere and show under what conditions those waves can grind to a halt and get amplified. They tested their assumptions using standard daily weather data from the US National Centers for Environmental Prediction (NCEP). During recent periods in which several major weather extremes occurred, the trapping and strong amplification of particular waves – like “wave seven” (which has seven troughs and crests spanning the globe) – was indeed observed. The data show an increase in the occurrence of these specific atmospheric patterns, which is statistically significant at the 90 percent confidence level.

The probability of extremes increases – but other factors come in as well

“Our dynamical analysis helps to explain the increasing number of novel weather extremes. It complements previous research that already linked such phenomena to climate change, but did not yet identify a mechanism behind it,” says Hans Joachim Schellnhuber, director of PIK and co-author of the study. “This is quite a breakthrough, even though things are not at all simple – the suggested physical process increases the probability of weather extremes, but additional factors certainly play a role as well, including natural variability.” Also, the 32-year period studied in the project provides a good indication of the mechanism involved, yet is too short for definite conclusions.

Nevertheless, the study significantly advances the understanding of the relation between weather extremes and man-made climate change. Scientists were surprised by how far outside past experience some of the recent extremes have been. The new data show that the emergence of extraordinary weather is not just a linear response to the mean warming trend, and the proposed mechanism could explain that.

Emphasis added. Given the team involved, it’s unlikely that there are obvious serious flaws with this paper. This could be a major result.
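
For the curious, the analysis the release describes boils down to tracking the amplitude of individual zonal harmonics of the mid-latitude flow over time. Below is a minimal illustrative sketch of that decomposition (mine, not the authors' actual machinery): it assumes a meridional wind field along a single latitude circle of the sort one could extract from NCEP reanalysis, and uses synthetic data in place of the real thing.

```python
# A minimal sketch (not Petoukhov et al.'s method): estimate the amplitude
# of zonal wavenumber 7 in the mid-latitude meridional wind by Fourier
# decomposition along a latitude circle. `v` stands in for daily 500 hPa
# meridional wind at ~45N, shape (time, lon), as one might pull from NCEP.
import numpy as np

def wave_amplitude(v, wavenumber=7):
    """Amplitude (m/s) of one zonal harmonic at each time step."""
    nlon = v.shape[-1]
    spectrum = np.fft.rfft(v, axis=-1)  # decompose into zonal harmonics
    # The factor 2/nlon converts a one-sided FFT coefficient into the
    # amplitude of the corresponding sinusoid.
    return 2.0 * np.abs(spectrum[:, wavenumber]) / nlon

# Synthetic demonstration: a wave-7 pattern of 8 m/s amplitude plus noise.
rng = np.random.default_rng(0)
lons = np.linspace(0.0, 2.0 * np.pi, 144, endpoint=False)
v = 8.0 * np.cos(7 * lons) + rng.normal(0.0, 2.0, size=(365, 144))
print(wave_amplitude(v).mean())  # ~8: the prescribed amplitude is recovered
```

The paper's interest is in episodes where such an amplitude stays anomalously high for weeks, so in practice one would compute this day by day and look for persistent excursions.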

UPDATE: The egregious Pat Michaels claims in Forbes that

[Petoukhov et al] generated a lot of news traffic.

At precisely the same time, two University of Melbourne scientists published a paper in Geophysical Research letters, studying virtually the same data and finding little significant change. Further, they found that any changes in these patterns, known as atmospheric “blocking”, under which weather tends to stagnate, were small compared to natural year-to-year variability. In what is always a bad sign for solid science, they found that any connections between blocking frequency and global warming are highly dependent upon the methodology they used. Bottom line: they couldn’t find much of a signal, and even if they did, they weren’t sure what it all meant.

News traffic? Zilch.

The difference is that death and destruction sell ad copy, while, as the story goes, “plane lands on time” doesn’t. But, in climate change, there’s a remarkable disconnect between what people read and what they think.

Well, there is another difference: a result is generally more interesting than a null result, except to certain sections of the commentariat. A certain Roger Pielke Jr., for instance, has a long string of null results to his name and flogs them endlessly.

Of course another common trick is to misrepresent the contents of a study. So it would be helpful if Michaels deigned to, you know, tell us the author or title of the paper.

Maybe he even has a point, like the stopped clock telling the right time. Who knows? Does anybody know what paper he is referring to?

Anyway, the publication and celebration of null results is a peculiar feature of climate science. Outsiders seem quick to confuse “no proven link” with “proven no link”, which I suppose is the point.

Comments:

  1. Xtreme Weather is slow suicide for your position. Look at this--how many times have you criticized 'science by press release'? And this is reporting 'findings' that don't even reach a 95% level of confidence.

    Even if the press release was exactly accurate, placing your communications strategy in the hands of the weatherman is folly. It is far too early in the trajectory of climate change for AGW to have influenced the weather. The IPCC and many established scientists have said this, more or less plainly.

    You do your cause no service by advocating change due to weather. Weather can change. And you remove from your arsenal an argument that will be needed in the future.

    Folly.

    • This isn't a drug trial. 90% confidence does constitute support.

      You are exhibiting a very common misunderstanding of statistics.

      It doesn't mean there's a 10% chance that the theory is wrong. It means that if the theory were wrong, there is a 10% chance you'd see this anyway. (The toy simulation at the end of this comment illustrates the distinction.)

      If we reached 95% confidence that would mean that if the theory were wrong there is still a 5% chance you'd see it.

      And even so, a whole lot of model assumptions go into these numbers.

      (Phil Jones was talking nonsense when he said that the temperature rise over ten years was "not statistically significant, but only just". What he meant was that using the particular null hypothesis model he had that similar rises happen more than one time out of twenty. Yawn. Not only was it bad public communication, it wasn't even a rigorous way to say it.)

      What the group are saying is that "what we extracted from theory looks a whole lot like what we are actually seeing".

      Even the common shorthand "there is a 10% chance that the theory is wrong based on the data" is quite wrong. I actually think statistical significance is not a sensible statistical approach to this class of problem. Rather, you should be doing an assessment of which of a set of models your data best matches.

      Anyway, I did in fact clearly say that it isn't settled, so I fail to see your gripe.
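
      Since this trips people up constantly, here is a toy simulation of my own (nothing to do with the paper) showing the difference between "the chance of seeing this if the theory were wrong" and "the chance the theory is wrong given that we saw this". The prior and effect size below are arbitrary assumptions, there only to make the point that the second number depends on them while the p-value does not.

      ```python
      # Toy simulation: P(significant | null true) vs P(null true | significant).
      # The prior (0.5) and effect size (1.0) are arbitrary assumptions.
      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000
      prior_true = 0.5   # assumed: hypothesis true half the time a priori
      effect = 1.0       # assumed effect size when the hypothesis is true

      truth = rng.random(n) < prior_true
      x = rng.normal(loc=np.where(truth, effect, 0.0), scale=1.0)
      significant = x > 1.2816   # one-sided 90% threshold, standard normal

      print(significant[~truth].mean())    # ~0.10: false-alarm rate by design
      print((~truth[significant]).mean())  # ~0.20 here: depends on the prior
      ```

      The first number is pinned at 10% by construction; the second moves around with the prior and the effect size, which is exactly why "90% significant" is not "90% likely true".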

      • That said, it isn't a purely theoretical result. It's certainly the case that people are looking for reasons for a more meridional, more blocking-prone pattern, because that looks like what we are seeing observationally. So a match to observations is somewhat devalued.

        Similarly, I would expect tropical storm people are looking for a dynamical explanation of why tropical storms are getting larger and less tightly wound, reducing wind damage but enhancing storm surge damage. If they then look back at the data for confirmation, that needs to be discounted a bit, because they were looking for an explanation of what they are already seeing. There's a risk of constructing a post hoc explanation.

        If nature turns around and displays another pattern as climate change continues to take us into new regimes, it may be the case that we will have trouble distinguishing between actually forced climatic changes and stochastic variability.

        But I think we need some sort of weirdness index. Sandy, for instance, was not an ordinary event, or even a typical extraordinary event. Statistical techniques are pretty much useless on severe outliers. The dynamical explanation for Sandy unambiguously involves a large meridional excursion of the jet, along with extraordinarily high sea surface temperatures and a circulation of unusually large extent.

        There is no way to attribute this individual event to a statistical trend - it only happened once. But clearly it combines some marked recent trends.

        If we had some way of characterizing how much of an outlier any particular event of any sort is, we might have some way of testing the widely held impression that the weather is getting weirder. I believe a realistic understanding of climate change would expect that to indeed be true, but that would be hard to extract from either formal theory or statistics.
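
        To make the "weirdness index" idea concrete, here is one crude, purely hypothetical way it could be done: score an event by how many climatological standard deviations each of its characteristics sits from the local mean, combined into a single number. Every variable and value below is invented for illustration.

        ```python
        # A hypothetical "weirdness index": RMS of z-scores of an event's
        # characteristics against local climatology. All numbers invented.
        import numpy as np

        def weirdness(event, clim_mean, clim_std):
            """Root-mean-square z-score across an event's characteristics."""
            z = (np.asarray(event) - np.asarray(clim_mean)) / np.asarray(clim_std)
            return np.sqrt(np.mean(z ** 2))

        # Toy characteristics: [SST anomaly (C), jet meridional excursion
        # (deg lat), circulation extent (10^6 km^2)] for a Sandy-like event.
        clim_mean = [0.0, 5.0, 1.0]
        clim_std = [0.5, 3.0, 0.4]
        event = [3.0, 15.0, 2.5]
        print(weirdness(event, clim_mean, clim_std))  # ~4.5: far outside experience
        ```

        A real index would have to handle correlated variables and heavy tails, but even something this crude would let us ask whether the population of high-weirdness events is growing.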

      • Neven, I am indeed concerned about Arctic ice levels. In fact, as Judith Curry pointed out, the low levels of ice reached this summer may have contributed to the blocking pattern that steered Sandy onto the Jersey shores.

        But peer-reviewed literature has already both discussed and dismissed any connection between Pakistani floods of 2010 and 2011, drought in Texas, Egyptian famine and Muscovian heatwaves to climate change.

        Your team has lashed themselves to the mast of Xtreme Weather. I don't believe it's a good idea--for your team. I hope you won't mind if I occasionally remind you of that.

      • "But peer-reviewed literature has already both discussed and dismissed any connection between Pakistani floods of 2010 and 2011, drought in Texas, Egyptian famine and Muscovian heatwaves to climate change."

        The Egyptian famine (more like a food shortage, IIRC) aside, since in that case climate change is only one piece of a complex puzzle, I would suggest that you look again. There's by no means a consensus as yet, but I would suggest that that's more a matter of science being not very good at turning on a dime, per Planck's dictum.

        BTW, if you're looking for a clearly climate change-related drought leading to an out-and-out famine, East Africa 2011 is the event for you.

        Michael, you might want to make a note up top that Petoukhov et al. (2013) is now out, conveniently open-access.

      • Lots of interesting stuff happening on this site and elsewhere on the web right now and I have a pile-up of other obligations; hard to keep up. Some very specifically science-related points where I'm probably a good person to pipe up. So I will be terser than I'd like for the moment.

        Tom Fuller's idea that once something is disputed in the literature it has been "dismissed" is peculiar to say the least.

        In particular Hoerling's much-blogged approach to individual event attribution is not even in the (peer-reviewed) literature as far as I know, far from being canonical.

        In short I dispute Tom Fuller's claim “But peer-reviewed literature has already both discussed and dismissed any connection between Pakistani floods of 2010 and 2011, drought in Texas, Egyptian famine and Muscovian heatwaves to climate change.” Please provide evidence for this very strident claim.

      • By all means remind me, Tom. And I'll remind you of all the great work your team has accomplished every time we get yet another can't-be-attributed-to-AGW-once-in-a-thousand-year-event.

      • Here's a topical article from Oz. I must say it sounds very consensusy with regard to the new regime. Excerpt:

        The tumbling of records had prompted conversations in the scientific community to turn a corner, he said. Previously, "weather is not climate" was the mantra, but now the additional boost from greenhouse gases was influencing every event.

        It might even be the case that the mantra chanted after every catastrophic weather event - that it can't be said to be caused by climate change, but it shows what climate change will do - has become a thing of the past.

        Progress!

      • "Neven, how frequently on this planet do you think we should see once in a thousand year events?"

        It depends mostly on the event scale.

        A thousand-year flood over an area comparable to a single US county is something we'd expect several times a year on average. (Anomalous years are not independent even in a stable climate, but over the long run we'd expect roughly one such event per thousand county-years; that is, about N/1000 events per year, where N is the number of county-sized areas. See the back-of-envelope sketch at the end of this comment.)

        On a continental scale, as Steve points out, they should be far more rare. We have only one Arctic, so a thousand year event in the Arctic should happen in a stable climate once per thousand years.

        Tom, despite your adamant attachment to the idea that anomalies are not already emerging from the data, Hansen, Sato, and Ruedy last year demonstrated that extreme warm temperature anomalies are happening much too often to be attributed to chance, while extreme cold anomalies are not.

        http://www.giss.nasa.gov/research/briefs/hansen_17/
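
        The scale arithmetic above, as a back-of-envelope sketch (the counts are rough assumptions, and independence between regions is of course not exactly true):

        ```python
        # Expected frequency of "1-in-1000-year" events at two spatial scales,
        # assuming a stable climate and independent regions (both rough).
        n_counties = 3000      # roughly the number of US county-sized areas
        return_period = 1000   # years

        print(n_counties / return_period)  # ~3 county-scale events per year, US alone
        print(1 * 1000 / return_period)    # 1 Arctic-scale event per millennium
        ```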

  2. This sounds like it relates to Jennifer Francis's work on the effect of sea ice loss on the jet stream (or is it proposing a different mechanism for the same phenomenon of 'stuck' weather patterns?)
    Wish the damn thing would show up on PNAS - the whole thing is a bit of a tease without the paper!

  3. The last highlighted sentence is interesting. What do you interpret from it?

    I am asking because it sounds rather scary. On video, Dr Francis said she had tried a 4x CO2 simulation and found an even stronger north-south component of the wind in that simulation. I took that as a fairly reassuring message that the change is somewhat linear.

    This Petoukhov et al approach sounds more sophisticated than Dr Francis's approach.

    If we have different groups: Petoukhov et al, Dr Francis, and Chris Reynolds each independently noticing and working on the same thing, does that add much confidence to the result? I assume not (if the theory is wrong there is still up to a 10% chance of seeing this anyway) but I am mentioning it because I think it does tend to show that the effect is pretty important.
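
    For what it's worth, the textbook way to ask whether several modest results add up is Fisher's method for combining independent p-values, sketched below with made-up numbers. The catch, which supports my "I assume not": the method requires genuine independence, and groups studying the same atmosphere with overlapping data don't have it.

    ```python
    # Fisher's method: combining p-values from *independent* analyses.
    # The three p-values are hypothetical; real groups sharing data would
    # violate the independence assumption this relies on.
    from scipy import stats

    stat, p_combined = stats.combine_pvalues([0.10, 0.10, 0.10], method="fisher")
    print(p_combined)  # ~0.03: jointly stronger than any single 0.10 result
    ```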

    • Strictly speaking, all "nonlinear" means is that the response to twice a given forcing is not double the response to that forcing. If one of the effects is a rearrangement of the jet, it can't possibly be linear, because the planet is finite.

      But I think the sentence is squirming about something I've been squirming about. I would call it the Hoerling fallacy. (I am pleased that Nielsen-Gammon has backed off it a bit, now taking this flavor of attribution correctly as a rough lower bound.)

      The fallacy is to conclude that if an event is not part of an observed trend that matches the forcing, then that event cannot be considered to have been forced.

      You can reasonably say that it was AT LEAST that much attributable. But it could be more, because the whole environment could be exhibiting new behaviors.

      And of course, it will eventually exhibit new behaviors, and thus certain phenomena could cut in suddenly, not as part of a trend at all.

      It's not going to be easy to get a rigorous handle on this, but that doesn't mean it is or isn't happening already.
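
      One way to see why extremes need not track the mean is a toy Gaussian calculation (mine, not from the paper): a modest shift of the mean multiplies the frequency of fixed-threshold extremes several-fold, before any change of regime even enters the picture.

      ```python
      # Toy illustration: a 0.5-sigma shift of a Gaussian multiplies the
      # frequency of 3-sigma heat extremes roughly 4-5x. Numbers are
      # illustrative, not tuned to any dataset.
      from scipy import stats

      base = stats.norm.sf(3.0)           # P(exceed 3 sigma), unshifted
      shifted = stats.norm.sf(3.0 - 0.5)  # same threshold after the shift
      print(base, shifted, shifted / base)  # ~1.3e-3, ~6.2e-3, ~4.6x
      ```

      And that's only the well-behaved, trend-following part of the story; the Hoerling-style lower bound misses whatever comes on top of it.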

  4. If you had a 9-out-of-10 probability of having a heart attack or a fire, you wouldn't stand around saying "I don't want to hear it, so I'm going to find every reason to emphasize that 1-in-10 probability." This emphasis on uncertainty is largely a rhetorical trick. Phil Jones' "trick" wasn't in it with this fake-skeptic talking point.

    I find the inability to deal with levels of uncertainty baffling. I would not be surprised to find it correlates with a lack of curiosity, which is another problem that plagues science deniers IMNHSO.

    As for observation and measurement, I'm tempted to say, we're human, get used to it. Observation has far outstripped science's ability to pin down the phenomena. I don't find that reason not to be worried: quite the reverse.

    • As usual, if I slowed down I could fit some of these afterthoughts in the original. First, my use of the word "probability" probably violates specific scientific terminology, but I think the reader knows quite well what I mean, unless they're determined to prove there's something wrong with a very clear idea.

      Also, I was a little overdoing it with "fake skeptic talking point" if I wanted to make myself clear. I believe I am not alone in being exhausted with the continuous quibbling and extreme emphasis on the small subject of the meaning of small statistical indicators of likelihood, and wish the larger and life-changing meaning of the overall ideas were not buried under this mishigass. The varied barrage of tricks and techniques to create a fog of doubt is not useful in advancing our understanding of the real world.

      Walking on eggshells might be easier than dealing with all the prickly and politically motivated parties bent on preventing honest conversation.

  5. Thomas Fuller --- This seems relevant:
    http://theconversation.edu.au/hot-summer-yes-the-hottest-12505

    As for how many 1-in-1000 year events, divide the continents into appropriately large sections for blocking high events. North America (other than Alaska) is probably about 5 sections while Europe, including European Russia, might be only 3 or 4. Assume each such section is independent with regard to highly unusual, but persistent, weather events. Assuming climate stability, North America ought to have about 5 such events every millennium. However, North America has had 2 such events already in the 21st century.

    That this is not just bad luck is a matter of fundamental meteorology, it seems. In a warmer world the Rossby waves which move the highs and lows along are slower, and so there will be more blocking highs, on average. (A back-of-envelope check of these odds follows below.)
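
    A rough check of how surprising two such events are, assuming (questionably) a stationary climate and independent sections:

    ```python
    # P(at least 2 "1-in-1000-year" events in ~13 years) given 5 independent
    # North American sections. Stationarity and independence are assumptions.
    from scipy import stats

    lam = (5 / 1000) * 13                 # Poisson mean: 0.065 events expected
    print(1 - stats.poisson.cdf(1, lam))  # ~0.002: quite unlikely by chance
    ```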

  6. Pingback: Another Week of GW News, March 3, 2012 – A Few Things Ill Considered

  7. I suspect the paper Michaels refers to is "Exploring Links between Arctic Amplification and Mid-Latitude Weather" by James A. Screen and Ian Simmonds [who is from UM], published online in GRL at precisely NOT the same time: 19 January 2013.

  8. The Screen and Simmonds (2013) abstract:

    This study examines observed changes (1979–2011) in atmospheric planetary-wave amplitude over northern mid-latitudes, which have been proposed as a possible mechanism linking Arctic Amplification (AA) and mid-latitude weather extremes. We use two distinct but equally-valid definitions of planetary-wave amplitude, termed meridional amplitude, a measure of north-south meandering, and zonal amplitude, a measure of the intensity of atmospheric ridges and troughs at 45°N. Statistically significant changes in either metric are limited to few seasons, wavelengths and longitudinal sectors. However in summer, we identify significant increases in meridional amplitude over Europe, but significant decreases in zonal amplitude hemispherically, and also individually over Europe and Asia. Therefore, we argue that possible connections between AA and planetary waves, and implications of these, are sensitive to how waves are conceptualised. The contrasting meridional and zonal amplitude trends have different and complex possible implications for mid-latitude weather, and we encourage further work to better understand these.

    Michaels says:

    In what is always a bad sign for solid science, they found that any connections between blocking frequency and global warming are highly dependent upon the methodology they used.

    (Emphases added.)

    Wait a minute. Paper discusses amplitude and Michaels makes a claim about frequency? Um, bullshit. Of the standard Michaels variety.

    I've requested a copy of the paper and will read it since I'm very interested in this topic anyway, but I have zero expectation of finding anything in it that supports Michaels' claims about frequency and blocking events (which, note, are not the same thing, although they are related).

    Re blocking events, folks might want to have a look at this post by Chris Reynolds, including a number of striking graphs (e.g. <a href="http://farm9.staticflickr.com/8074/8351175402_09869c5389_o.jpg">the annualized NH scatterplot</a>) generated from data maintained by Anthony Lupo (ironically a sometime colleague of Michaels, but from everything I can see an honest scientist). The recent trend in blocking events is amazingly blatant. Chris opines that the change in trend is too recent to be stat sig, but I wonder.

    BTW, here's another version of the Michaels piece, this one with more on the papers and no political rant at the end. It makes it very clear that the misstatement about frequency is no misunderstanding.

    Also BTW, I notice that just last month Screen moved to UExeter to head a new team focusing on this research area, more evidence that scientific concern is ramping up in a big hurry.

  9. Describing Screen and Simmonds as "plane lands on time" is another bit of considered misdirection. It's an odd life.

  10. Pingback: The Boy Who Cried Werewolf | The Lukewarmer's Way

