The Long Fat Tale of the Long Fat Tail

A long and fat tale goes with it.

Keith Kloor throws a softball my way. He offers up an argument which he attributes to William Pentland at Forbes, quoting Pentland thus:

Uncertainty is intrinsic to complex systems like Earth’s climate, but in the context of catastrophic climate change, this uncertainty is so severe that it is difficult to draw basic conclusions about how fat the fat tail is. According to [Harvard University’s Martin] Weitzman, it “is difficult to infer (or even to model accurately) the probabilities of events far outside the usual range of experience.”

Indeed, “[r]ather than justifying a lack of response to climate change, the emphasis on uncertainty enlarges the risk and reinforces the responsibility for pursuing successful long-term mitigation policy,” according to a 2010 analysis by researchers at Sandia National Laboratory.

All things considered, alarmism seems like common sense to me.

Right up my alley. See, I have been saying this exact thing for a long time.

The Long Fat Tail

The question amounts to a version of the “long tail”, which leads us into the fraught territory of probabilistic thinking. Let me not stray too far into arrogance when I assert that I frequently see people who should know better and claim to know better blundering hopelessly in this territory. I won’t name names. And surely you, fair reader, are not included among the overconfident…

The point is that there is something peculiar about the probability density of an experiment that can never be repeated. Really, we are not talking about a probability at all but about a Bayesian prior. It resembles a frequentist probability in that it gives us a rational foundation for risk assessment, and the calculations are identical, which is reassuring. But what we are talking about is not really a probability; it is a statement of how well constrained our beliefs are.

So here's the really funky thing. People who are forever arguing in the public sphere that climate science is worthless and contemptible are basically arguing for the longest, fattest tails. These people, not believing in the science one little bit, cannot constrain the sensitivity of the system whatsoever. (Judith Curry at least had the decency to admit this – as I recall she once notoriously put the sensitivity between 0 C and 10 C per doubling.)

What does such a long tail mean for risk assessment? Of course, it matters what your distribution looks like. Is it centered in the IPCC range, at about 2.5 C per doubling? Perhaps you might argue for the zero-feedback 1.2 C, which apparently means your faith in science doesn't even extend to established facts about how water evaporates. But if your personal prior doesn't put much stock in climate science, then you are pretty much hosed, because your long tail extends into the catastrophic.
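To put toy numbers on that, here is a minimal Python sketch. The flat 0 to 10 C prior and the cubic damage curve are invented purely for illustration; neither is an estimate of anything real.

import numpy as np

# Invented numbers only: a flat "I don't trust any of it" prior on sensitivity
# and a steeply convex damage curve, just to show the arithmetic of the tail.
sens = np.linspace(0.0, 10.0, 100001)      # sensitivity grid, C per doubling
prior = np.ones_like(sens) / sens.size     # unconstrained (flat) prior
damage = sens**3                           # assumed convex damage, arbitrary units

expected = np.sum(prior * damage)
tail_share = np.sum((prior * damage)[sens > 5]) / expected
print(f"expected damage: {expected:.0f}  share from sensitivities above 5 C: {tail_share:.0%}")

With these made-up numbers, most of the expected damage comes from sensitivities above 5 C, exactly the region a well-constrained prior would largely rule out.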

How It Works

Suppose you roll a pair of conventional dice. Your median roll, as is well known, is a seven. Now let's play a game. We make a bet wherein you pay me the dollar amount of two raised to the power of your roll: if you roll a two, you pay me two squared dollars; a three, two cubed; and so on. How much would I have to pay you up front to make this worth your while? You are as likely to roll below a seven as above it, so I propose 2^7, which is $128. You are unimpressed by that offer and hold out for $256 instead.

You will be ahead 21 times out of 36, behind 10 times, and break even 5 times. A great bet, right? So what is your expected winning? I summon my magic snake and determine:

>>> # average payout over all 36 equally likely rolls, versus the $256 paid up front
>>> 256 - sum(2**(i + j + 2) for i in range(6) for j in range(6)) // 36
-185

Your expected winning is negative $185, because the cost goes up rapidly in the tail, which is fat enough to matter. So, even though you win more than twice as often as you lose, it is a bad bet.

How Not to Distrust Climate Science

Now, we know that costs go up nonlinearly with global temperature. Clearly, a tenth of a degree is noise, while ten degrees means a massive redistribution of populations and ecosystems at best, which is far more than a hundred times worse. So the less you know, the more scared you should be. A couple of prominent people do take that position: Lovelock for one, and, more influentially within science, Broecker. These guys believe that we don't have a good handle on things at all, and so we should be much more scared than the IPCC suggests.
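A deliberately artificial sketch of "the less you know, the more scared you should be": hold the central estimate fixed and just widen the prior. The lognormal prior with median 2.5 C and the quadratic cost below are assumptions chosen only to make the arithmetic visible, not a damage model.

import numpy as np

rng = np.random.default_rng(42)
cost = lambda t: t**2                      # assumed convex cost, arbitrary units

# same median sensitivity (2.5 C) in every case; only the spread changes
for sigma in (0.2, 0.5, 1.0):
    draws = rng.lognormal(mean=np.log(2.5), sigma=sigma, size=200000)
    print(f"sigma={sigma}: median {np.median(draws):.2f} C, expected cost {cost(draws).mean():.1f}")

The widest prior has the same best estimate, but its expected cost is several times larger. That is the Jensen's inequality flavor of the fat tail argument.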

Somehow, Dr. Curry is not in that camp. Although she sets herself up as an expert on uncertainty, it is unclear how she manages to be less than enormously alarmed with a 10 C sensitivity in her prior. But in this she is aligned with a lot of people who aren't thinking rationally at all. Their "gut check" says it's just a bunch of hooey, much as my own gut is pretty much unconvinceable on homeopathy. Indeed, the whole thing has a homeopathic smell to them: how can so little CO2 relative to the whole atmosphere make a difference?

But it is nothing like homeopathy. The amount of extra CO2 is small relative to the volume of the atmosphere, but it is large compared to the volume of infrared-opaque gases already in the atmosphere.

How much ink does it take to change the color of a tank of water, relative to the volume of the tank? Not very much, it turns out. It is exactly the same question here: CO2 is infrared-colored ink. And how does colored water behave in the presence of light? Differently than pure water, which is the point. So if there are still any sincere skeptics out there, most of them are operating on a gut check about how much invisible ink there is in the tank, a quantity which is not visible to the eye but is easily measured! Others are willing to throw away all of astrophysics along with climate science, and suggest that nothing about radiative transfer is known. How fat should their tails be? There's no telling.
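If you want the ink analogy as arithmetic, here is a toy Beer-Lambert calculation. The absorption coefficient and tank depth are invented numbers, not properties of CO2 or of any real dye; the point is only the shape of the result.

import math

absorption_coeff = 500.0    # per meter at full strength (invented, arbitrary)
tank_depth = 10.0           # meters (invented)

for ppm in (0, 100, 400, 800):             # "ink" concentration, parts per million
    fraction = ppm * 1e-6
    transmitted = math.exp(-absorption_coeff * fraction * tank_depth)
    print(f"{ppm:4d} ppm of ink: {transmitted:.0%} of the light gets through")

What matters to the light is the amount of absorber along its path, not the absorber's share of the total volume.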

So let’s presume that the basics are right and we are in “lukewarmer” territory. “Yes it will warm”, they say. “So what? What are you guys panicking about?”

The Long Tale of the Real World

There are a number of components to the answer, but the long, fat tail is part of it. And the tail is suddenly getting fatter on two counts. First is the astonishing behavior of the Chinese, which appears flatly unsustainable on a very short time frame for purely economic reasons. And while I wish no harm to the Chinese, we'd best hope that they do not keep it up. Because if they do, the emissions picture suddenly looks like this:

Why are there three curves for China and only one for everybody else? Because the curves for China have been revised sharply upward over the past decade, while the other major energy consumers remained on track. So what does this mean? If it doesn't mean that either China is in for a hard landing or the rest of us are, I'd like to know what it does mean.

And second is the increasingly palpable collapse of decision-making systems in most other countries, especially in North America and Europe, and the increasingly confused ownership pattern of corporations, which are mostly owned by other corporations. In other words, the SkyNet scenario may be upon us already. We are ruled by monstrous artifacts with nothing resembling a conscience. This means that even if we constrain the outcome perfectly given human behavior, human behavior looks increasingly likely to be out of human control.

The Biggest Uncertainty

Eventually the fatness of the tail is the fatness of our heads. What is your prior on the sanity of collective human behavior? I mean, given the Copenhagen fiasco and everything.

The biggest uncertainty by far has nothing at all to do with science. It has to do with the desperate acceleration of unsustainable behavior just as the limits to that behavior are coming into view.

We are behaving like the Easter Islanders in the Jared Diamond scenario, whether that scenario turns out to be true or not. (More on that another day.)

We have gone from vaguely seeing trouble on the horizon to doubling down on the statues.

The politically preferred answer to the unsustainability we perceive is to hurry up and break everything, as seems to have happened on Easter Island. That makes for some pretty fat tails. And that makes for a drastically losing game. Can we change? How? How?

Images: Alligator via Flickr by Cimexus is in the Creative Commons (CC BY-ND 2.0). Cartoon courtesy Marc Roberts. Graph is a photo by the author of a slide presented recently by Trevor Houser at the first SXSWEco conference in Austin, TX, at the session "Texas and China: Non-Obvious Energy and Environmental Bedfellows".

Comments:

  1. Judith has a post on this today also. I have difficulty sifting out the differences between the two viewpoints because they are so wildly different, even if they're based on the same uncertain science.

    Her comment:
    2. Avoidance of future risk should drive the decision making process (JC opinion: loss avoidance is more important than risk avoidance).

    I have no idea what that's supposed to mean. It's almost like you have to make up a whole language just to convince yourself of that logic. If it just means that we should be unwilling to sacrifice now for later, I'm afraid we are in a bit of trouble.

  2. I believe that adaptation is the way forward, simply because it is too difficult to extract the signal from the noise of what is happening to global temperatures, over a useful timescale. Even if the world stopped emitting CO2 today, we would not know the impact until 30 years had passed - if my understanding is wrong, please correct me. So let's try to adapt. But then you need to know what you are adapting to. As I understand it, the UK is headed for cooler and wetter weather, depending on what models you look at - which means that our response is going to be very different from the places that are going to get a couple of degrees warmer. The question then shifts into how much faith can legitimately be placed in the GCMs about local climate change.

  3. Yes, I have a hard time distinguishing what she is saying from "let's hurry up and break everything".

    I do NOT think a fat tail argument is equivalent to the "precautionary principle", which in many renditions is unworkable and IS the equivalent of Pascal's wager.

    Quite the contrary, the fat tail argument is intended to support a quantitative and nuanced assessment of precaution to replace what is arguably an unworkably absolutist "precautionary principle" version.

    So she's off to a bad start. I hate to be curmudgeonly. She's sent her traffic over this way to have a look-see, which is great.

    Regardless, I am not just affirming the long tail argument here; I am readdressing the issue of human self-determination. As far as I am concerned this is a key to the future. Do we have any say over what "happens to us", given that what happens to us is mostly our doing?

    Somehow, we have influential people who find myriad ways of saying we don't. My main point is that these are the people who fatten the risk tail.

    If it's really the case that "loss avoidance is more important than risk avoidance" the prospect of disaster is practically certain.

    Thorium or solar thermal or whatever can't save us as long as coal is cheap. And coal will be cheap as long as coal is subsidized. And coal will be subsidized as long as modern life is about "jobs".

    And see that cliff over there? We seem to be marching over it. What fun! Let's place our bets as to when we will die in the crush and see who is smartest.

  4. This is off topic for the present thread but is a good launching point for further discussion. I will say that among those of us who think about it most, the ones who seem to me to be talking sense claim that there is no reasonable adaptation strategy without mitigation, i.e., without drastically reduced emissions.

    In the present article and the linked one at Eli's, I am claiming that the more uncertainty there is, the stronger the argument in favor of reduced emissions.

  5. I'd like to have an example of the rendition which makes the precautionary principle equivalent to Pascal's wager.

    Make that two, since it's supposed to be "many renditions".

    Some find Pascal's wager quite valid, incidentally:

    http://plato.stanford.edu/entries/pascal-wager/

  6. My primary objection to relying solely or largely on adaptation is based on the perversity of the timing involved. In human terms, warming is happening at a slow pace, and the worldly conditions we experience day to day, season to season, year to year are dominated by noise and not by this continual movement of the underlying signal. As a result, we're likely to get some really bad years (like 2010 in Russia and Pakistan, and 2011 in the US), interspersed with much lower-impact, "normal" years. As the mix of bad and good years gets worse, with individual events likely getting worse as well as more numerous, it will become ever tougher for us to ignore that rise in the underlying signal, until at last we're spurred to take significant action in the form of mitigation.

    The nasty detail is that by the time we hit that mental tipping point, not only will we have incurred immense costs in lives and money, but we'll have waited so long that we've almost certainly locked ourselves in to centuries of additional impacts. As I've said numerous times online in various venues, the single least understood aspect of CO2 among a mainstream audience is its long atmospheric lifetime. The overwhelming majority of people implicitly assume that CO2 is like other pollutants, meaning we can "get serious" about reducing our emissions and the amount in the air, and therefore the impacts from it, will dutifully drop in merely a few months or a couple of years. (I'm convinced that this is what drives the "we need jobs now, we can fix the CO2 thing later" attitude I hear so often.) Sadly, that's not the case, hence all the "warming in the pipeline" we see referenced so often.

    All of which is to say that this question of fat tails and probability distributions and Bayesian priors is anything but idle eco-wonkery; it lies at the heart of our climate challenge. If we err in estimating the shape of that curve, or the public and policymakers refuse to listen to an assessment from the science community because it is, how shall I put this, inconvenient, then we're at risk of locking ourselves into a horrific scenario.

  7. The important factor in Pascal's Wager is that the costs of action (saying your prayers) are low relative to a discrete, if uncertain, outcome (going to hell). So, the answer is easy, providing you place no value on your own moral integrity and assume that the deity won't be able to identify cynical expediency when He sees it. The Precautionary Principle, in its simplest, trivial form--better safe than sorry--is also easy to accept. The hard question is how much better is safe than sorry.

    The problem, I think, is not in getting people to accept small costs to avoid unlikely bad outcomes (even skeptics buy insurance); it's that in climate change the bad outcomes will disproportionately happen to other people and mostly after we are dead. But the costs of mitigation are to be paid by us, here and now.

    When an unsophisticated skeptic raises the question of uncertainty they are fundamentally using it as a moral justification for inaction: we don't know, so why should we act? The fat tails argument has a fat chance of changing any of their minds. The more sophisticated inactivist is already talking about discount rates, thinning out distant fat tails with compound interest calculations.

    As a pragmatic person with a background in business and science, I would prefer arguing for something on the basis of reliable knowledge, statistics and self interest. It's becoming increasingly clear to me, however, that these are the wrong tools for the job. Action on climate change hinges on ethics and values, not probability density functions.

  8. If the cartoon above is supposed to be a metaphor for the present condition, I'd just point out that seven months before its indicated date (10/10/06), a paper was published on Science's online site disputing the accepted paradigm about Easter Island, which included these statements:
    "The researchers also dispute the claim that Easter Island's human inhabitants were responsible for their own demise. Instead, they think the culprits may have been Europeans, who brought disease and took islanders away as slaves, and rats, which quickly multiplied after arriving with the first Polynesian settlers."
    Knowing what's chasing you before you decide whether or which way to run would seem to be a good strategy to avoid breaking your neck by tripping over the inevitable unintended consequences littering every path you might choose.

  9. Again, the actual facts of Easter Island, and the bizarre nature of the ensuing controversy are a good topic for a later conversation.

    There are aspects of the dispute that are what one might call postnormal.

    Note that I specifically, explicitly allowed for Diamond's version being untrue in the above. Nevertheless people feel it necessary to keep kicking at it. Odd. Downright unfriendly, I'd say, given the stipulation.

  10. "Action on climate change hinges on ethics and values, not probability density functions."

    For me at least, it depends on both values and probability density functions.

    If it looks like AGW has a reasonably probable result being the extinction of humanity without mitigation, I would be very motivated to support mitigation.

    However, if the most significant results were the extinction of polar bears and some sea level increase, and mitigation is at the cost of keeping the people of China and India in poverty, I will not support mitigation.

  11. If there is a plausible risk, more uncertainty makes the case for action stronger. In a snowstorm it's prudent to reduce speed.

    The need for action is further increased by the large inertias in the system: the carbon cycle responds only very slowly to changes in emissions, and the climate system responds only very slowly to changes in concentrations. The ‘stop’ button has a delay of multiple decades, which means we have to act based on foresight, or what comes closest to it (e.g. projections based on science), if we want to prevent the worst consequences. Waiting for more certainty, or waiting until the consequences hit us in the face, comes with considerable risk, because at that point even worse consequences will be locked in (due to the system's inertia).

    The better our foresight, the lower the risk. The more uncertainty there is about what the future has in store, the higher the risk, due in part to the fat-tailedness that mt describes, and due in part to the large inertia and thus relative irreversibility on human timescales.

    http://ourchangingclimate.wordpress.com/2010/07/21/the-risk-of-postponing-corrective-action/

  12. From an email:

    "After reading your fat tale piece, I couldn't resist going over to Climate Etc. and seeing the contrast. Nearly an hour and a half later, I am sadder and no wiser (and much more tired). Just a reminder that 416 useless comments is worse than none."

    Gratifying, of course.

    Still and all, there won't be many good comments if there aren't many comments. By all means let us know what you are thinking, especially if it is you that is doing the thinking.

  13. "And while I wish no harm to the Chinese, we’d best hope that they not keep it up."

    They probably will while the West keeps outsourcing so much of its manufacturing there. To be honest, I can't really consider a hefty chunk of those emissions (~30%) to be Chinese at all, which is something US Commerce Secretary Gary Locke was thinking, too.

  14. IIRC the demand before that was that they ramp up their non-carbon energy production. Well, they did it with solar and apparently are in the process of doing it with wind, and yet nobody is lining up to thank them. They entirely neglected the implied part about "and do it in a way that doesn't compete with us," the ingrates.

