Tuesday, October 20, 2009

Climate Sensitivity and Expected Temperature Increase

My response to C S Norman's questions about yesterday's post was getting so long that I decided to turn it into a blog post.

The climate sensitivity is the expected temperature increase for the equivalent of a doubling of carbon dioxide in the atmosphere from the pre-industrial level of 280ppm to 560ppm. My understanding is that the confidence interval (the range within which we can be, say, 90% or 95% confident that the actual temperature increase will fall) is still very wide and that research hasn't managed to narrow it significantly. The mean and mode of the expected temperature change have edged up in recent research as more historical data is examined and more feedbacks are incorporated into modeling.

This situation explains a couple of things:

1. The attention given to Martin Weitzman's research, which focuses on the potentially catastrophic impacts of the upper tail of the distribution of temperature changes. The deterministic cost-benefit approach of Nordhaus and other economists who hated the Stern Review doesn't really address the correct problem at all. See also my post on who should win the Nobel Prize for environmental economics.*

2. The new emphasis on lower concentration targets like 350ppm and 450ppm of carbon dioxide equivalent rather than the formerly popular 550ppm. Even if the central estimate of climate sensitivity were 3C, there would still be roughly a 50:50 chance that temperature would rise by more than that, likely resulting in the melting of at least the Greenland Ice Sheet, among other impacts. The sketch after this list illustrates the role of the skewed distribution.
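To see why the upper tail matters so much, here is a minimal sketch in Python. It assumes, purely for illustration, that climate sensitivity follows a right-skewed lognormal distribution with a median of 3C; the lognormal form and its parameters are my own assumptions, not estimates from the literature.

```python
import numpy as np

# Hypothetical right-skewed distribution of climate sensitivity
# (C of warming per doubling of CO2). The lognormal form, the 3C
# median, and sigma = 0.4 are assumptions for illustration only.
rng = np.random.default_rng(42)
median, sigma = 3.0, 0.4
sensitivity = rng.lognormal(mean=np.log(median), sigma=sigma, size=1_000_000)

# With a skewed distribution the mean sits above the median, and the
# upper tail still carries non-trivial probability of severe warming.
print(f"mean sensitivity:      {sensitivity.mean():.2f}C")          # ~3.25C
print(f"P(sensitivity > 3C):   {(sensitivity > 3.0).mean():.2f}")   # ~0.50
print(f"P(sensitivity > 4.5C): {(sensitivity > 4.5).mean():.3f}")   # ~0.15
print(f"P(sensitivity > 6C):   {(sensitivity > 6.0).mean():.3f}")   # ~0.04
```

Weitzman's point is that expected damages can be dominated by that upper tail, which a deterministic cost-benefit calculation built around a central estimate effectively ignores.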

How can we relate climate sensitivities to expected temperature changes at any given point in time? You can't just halve the climate sensitivity to get the warming from a 50% increase in CO2, because:

1. Radiative forcing - the direct effect of carbon dioxide on the heat balance of the planet - is logarithmic in the CO2 concentration. So a 50% increase in CO2 has more than half the effect of a doubling, while a quadrupling has exactly double the effect of a doubling, and so on.

2. Temperature changes very slowly in response to changes in radiative forcing. This is mainly due to the need to heat up the oceans, which can store far, far more heat than the atmosphere. In my model, the equilibrium climate sensitivity was 8C but, in an experiment that increased the CO2 concentration until it doubled, temperature at the point of doubling had risen by less than 2C (this is known as the transient climate response). This is one of the factors that makes estimating the climate sensitivity from historical data very difficult. The sketch after this list illustrates both points.
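To make both points concrete, here is a minimal sketch in Python. It uses the standard simplified logarithmic forcing expression (roughly 5.35 ln(C/C0) W/m^2) and a one-box energy balance model; the 3.5C equilibrium sensitivity, the 300m effective ocean depth, and the 1%-per-year CO2 ramp are illustrative assumptions on my part, not parameters from any particular model.

```python
import numpy as np

# --- Point 1: radiative forcing is logarithmic in the CO2 concentration ---
def forcing(ratio):
    """Simplified forcing (W/m^2) for a given CO2 concentration ratio."""
    return 5.35 * np.log(ratio)

f2x = forcing(2.0)  # forcing from a doubling, ~3.7 W/m^2
print(f"50% increase: {forcing(1.5) / f2x:.2f} of a doubling's forcing")  # ~0.58
print(f"quadrupling:  {forcing(4.0) / f2x:.2f} of a doubling's forcing")  # 2.00

# --- Point 2: the ocean delays the response to that forcing ---
# One-box energy balance model: heat_cap * dT/dt = F(t) - lam * T,
# where lam is pinned down by the assumed equilibrium sensitivity.
sensitivity = 3.5                 # C per doubling of CO2 (assumed)
lam = f2x / sensitivity           # feedback parameter, W/m^2 per C
depth = 300.0                     # effective ocean depth in metres (assumed)
heat_cap = 1025 * 3990 * depth    # J/m^2 per C (density * specific heat * depth)
sec_per_yr = 3.156e7

# CO2 rises 1% per year, doubling at about year 70, and is then held
# constant so temperature can relax toward the equilibrium response.
T = 0.0
for yr in range(1, 501):
    ratio = min(1.01 ** yr, 2.0)
    T += (forcing(ratio) - lam * T) * sec_per_yr / heat_cap
    if yr == 70:
        print(f"warming at doubling (year {yr}): {T:.2f}C")  # ~2C, not 3.5C
print(f"warming after 500 years: {T:.2f}C (approaching {sensitivity}C)")
```

With these assumed numbers, the box warms by only about 2C at the point of doubling and then creeps toward 3.5C, behaving much like the shallow-ocean (green) curve in the chart below; adding heat uptake by the deep ocean, as in the red curve, slows the approach to equilibrium even further.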

Here is a chart from an IPCC report that illustrates this point:

[Chart: simulated warming after a doubling of CO2, from an IPCC report - red: GCM with a three-layer ocean; green: GCM with a shallow ocean only and no deep-ocean heat transport]

For the doubling of CO2 scenario under a climate sensitivity of 3.5C, temperature had increased by 2C at the point of doubling. Even after 500 years, the equilibrium increase hadn't been attained in the three-layer GCM (general circulation model) simulation (red). The green curve is for a GCM with just a shallow ocean component and no transport of heat to the deep ocean.

This slow approach to equilibrium does have a positive implication - we can probably overshoot our desired long-term concentration without too much damage if we can then bring concentrations back down again. For example, we could hit 450 or 550ppm in 2050 and then bring concentrations down to 350 or 450ppm by 2100 (a rough illustration follows). But this would probably require geoengineering of some sort.
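The same toy energy balance model can illustrate the overshoot argument. The concentration pathway below - 387ppm today, rising linearly to 550ppm by 2050, then falling back to 450ppm by 2100 - is hypothetical, and the sketch measures warming relative to today, ignoring warming already in the pipeline from past emissions.

```python
import numpy as np

# Same one-box model and assumed parameters as the previous sketch.
f2x = 5.35 * np.log(2.0)
lam = f2x / 3.5                       # assumed 3.5C equilibrium sensitivity
heat_cap = 1025 * 3990 * 300.0        # assumed 300 m effective ocean depth
sec_per_yr = 3.156e7

def co2(year):
    """Hypothetical overshoot pathway in ppm."""
    if year <= 2050:
        return 387 + (550 - 387) * (year - 2009) / 41
    if year <= 2100:
        return 550 - (550 - 450) * (year - 2050) / 50
    return 450

T, peak_T, peak_yr = 0.0, 0.0, 2009
for year in range(2010, 2201):
    F = 5.35 * np.log(co2(year) / 387.0)  # forcing relative to today
    T += (F - lam * T) * sec_per_yr / heat_cap
    if T > peak_T:
        peak_T, peak_yr = T, year
print(f"peak warming from today: {peak_T:.2f}C, around {peak_yr}")
# Equilibrium warming if we instead stabilised at 550ppm:
print(f"equilibrium warming at 550ppm: {3.5 * np.log(550/387) / np.log(2):.2f}C")
```

Because the ocean response is so slow, peak warming under the overshoot stays well below the equilibrium warming that stabilizing at 550ppm would eventually deliver, which is the sense in which a temporary overshoot may do limited extra damage.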

* After Ostrom won the prize, I wouldn't expect another one for environmental economics for a few years...

1 comment:

  1. Thanks! This helps me with something I've been working on. Turns out climate science is hard, who knew?
