Assessing impacts of future anthropogenic carbon emissions is currently impeded by uncertainties in our knowledge of equilibrium climate sensitivity to atmospheric carbon dioxide doubling. Previous studies suggest 3 K as best estimate, 2 to 4.5 K as the 66% probability range, and nonzero probabilities for much higher values, the latter implying a small but significant chance of high-impact climate changes that would be difficult to avoid. Here, combining extensive sea and land surface temperature reconstructions from the Last Glacial Maximum with climate model simulations, we estimate a lower median (2.3 K) and reduced uncertainty (1.7 to 2.6 K 66% probability). Assuming paleoclimatic constraints apply to the future as predicted by our model, these results imply lower probability of imminent extreme climatic change than previously thought.
The main theme of the paper is estimating climate sensitivity more accurately. Several blogs have already interpreted the finding (see links below), with the common revelation, or concern, being the lack of a fat tail in the sensitivity distribution.
The lack of fat-tails is visible when we plot Schmittner’s climate sensitivity PDFs on a semi-logarithmic scale, as shown in the figure below. Note the steep fall-off on all the PDF curves apart from the land-only curve.
Fat-tailed distributions are quite common in natural phenomena and usually derive from the propagation of uncertainty when ratios of parameters factor into the model of an observable. Some well-known examples come under the heading of Ratio Distributions. Applying parametric ratios leads to the asymmetric, right-skewed climate sensitivities commonly modeled in many studies (as shown below, taken from the Roe and Baker 2007 Science article).
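To see how a ratio of uncertain parameters generates the skew, here is a minimal Monte Carlo sketch in the spirit of the Roe and Baker feedback relation S = S0/(1 − f). The Gaussian parameters for the feedback factor f are illustrative assumptions, not values from their paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# Feedback relation in the style of Roe and Baker (2007): S = S0 / (1 - f).
# The numbers below are illustrative, not taken from their paper.
S0 = 1.2                         # K, reference (no-feedback) sensitivity
f = rng.normal(0.65, 0.13, 1_000_000)
f = f[f < 0.95]                  # discard unstable draws (f -> 1 diverges)

S = S0 / (1.0 - f)               # a ratio distribution: right-skewed, fat-tailed

# Symmetric uncertainty in f produces asymmetric uncertainty in S:
# the long right tail drags the mean above the median.
print("median S:", np.median(S), "mean S:", np.mean(S))
print("P(S > 4.5 K):", np.mean(S > 4.5))
```

Even though the feedback uncertainty is perfectly symmetric, the sensitivity it implies is not, and a substantial probability mass ends up beyond 4.5 K. That is the fat tail missing from Schmittner's curves.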
Even something as well known as the Planck's-law distribution, when graphed as a function of wavelength instead of frequency, will skew heavily to the right. This is just another example of a ratio distribution, as the wavelength is (up to a factor of c) the reciprocal of the frequency of the photon.
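The reciprocal transform alone is enough to generate the skew. A quick sketch using a hypothetical symmetric frequency spread, rather than actual Planck-law quantities:

```python
import numpy as np

rng = np.random.default_rng(0)

def skewness(x):
    """Sample skewness: third standardized central moment."""
    d = x - x.mean()
    return (d**3).mean() / (d**2).mean()**1.5

# A symmetric (Gaussian) spread in a frequency-like variable...
f = rng.normal(10.0, 1.5, 1_000_000)
f = f[f > 1.0]                   # guard against the singularity at f = 0

# ...becomes right-skewed once re-expressed as its reciprocal,
# just as a spectrum replotted against wavelength does.
lam = 1.0 / f

print("skew in f:  ", skewness(f))
print("skew in 1/f:", skewness(lam))
```

The skew in f is essentially zero; the skew in 1/f is strongly positive, purely from the change of variables.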
The observation as it relates to the Schmittner paper is that they have somehow propagated their uncertainties such that all the fat tails have been truncated. Whether this is realistic, or whether they had prior information on the probabilities, one can't tell from the paper.
The real concern is whether the truncated probabilities no longer modulate the land-only PDF. With the little information they give, I can infer that the Markov chain algorithm they apply (perhaps an HMM or something similar) mixes the probabilities together according to Bayesian rules. If a probability down at the 10^-16 level mixes with a larger probability as a multiplication, it will clearly truncate the larger probability. They do admit that a "mis-specification of the statistical model" might have occurred (see line 170, p. 8). It is also possible that the simulation sampling (Monte Carlo presumably, likely not importance sampling) was insufficient to accumulate enough statistics to populate the empty tails. These are questions I would ask as part of the peer-review process.
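The truncation mechanism itself is easy to demonstrate with a toy Bayesian update. The lognormal prior and Gaussian likelihood below are my own illustrative choices, not distributions from the paper:

```python
import numpy as np

# Grid of climate sensitivities (K); all distributions here are illustrative.
S = np.linspace(0.1, 10.0, 2000)
dS = S[1] - S[0]

# A fat-tailed "prior" for sensitivity (lognormal shape, median 3 K)
prior = np.exp(-(np.log(S) - np.log(3.0))**2 / (2 * 0.5**2)) / S
prior /= prior.sum() * dS

# A sharply peaked Gaussian "likelihood" centered at 2.3 K
likelihood = np.exp(-(S - 2.3)**2 / (2 * 0.4**2))

# Bayesian update: pointwise multiplication, then renormalize
posterior = prior * likelihood
posterior /= posterior.sum() * dS

# Tail mass above 4.5 K collapses after the multiplication
tail_prior = prior[S > 4.5].sum() * dS
tail_post = posterior[S > 4.5].sum() * dS
print("prior tail:", tail_prior, "posterior tail:", tail_post)
```

A prior with substantial mass above 4.5 K loses essentially all of it once multiplied by a likelihood that is vanishingly small out there, which is exactly the kind of truncation the Schmittner curves exhibit.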
Related to the sensitivity is the fact that Schmittner relies on the Last Glacial Maximum (LGM) for the model analysis. Coming from a statistical physics background, I think of temporal climate change as a random walk in a shallow energy well, as shown in the figure below. From ice core data that I have looked at, the random walk appears to be constrained by an Ornstein-Uhlenbeck or semi-Markov process, whereby it tends to revert to the mean set by the minimum of the well.
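The mean-reverting behavior of an Ornstein-Uhlenbeck process is straightforward to simulate. This is a generic Euler-Maruyama sketch with made-up parameter values, not a fit to any ice core series:

```python
import numpy as np

rng = np.random.default_rng(1)

# Euler-Maruyama discretization of the Ornstein-Uhlenbeck SDE:
#   dT = -theta * (T - T_mean) dt + sigma dW
# Parameter values are illustrative, not fitted to any paleoclimate record.
theta, T_mean, sigma = 0.05, 0.0, 1.0
dt, n = 1.0, 200_000

T = np.empty(n)
T[0] = 5.0                        # start displaced from the well minimum
for i in range(1, n):
    T[i] = T[i-1] - theta * (T[i-1] - T_mean) * dt \
           + sigma * np.sqrt(dt) * rng.standard_normal()

# Mean reversion: the spread saturates near sigma / sqrt(2 * theta),
# whereas a plain random walk's spread would grow without bound.
print("sample std:", T.std(), "  theory:", sigma / np.sqrt(2 * theta))
```

The restoring term pulls excursions back toward the well minimum, which is the qualitative signature one looks for in the ice core series.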
In reality these energy barriers are feedback boundaries, as shown in the figure below. The upper-temperature side is governed by the negative feedback provided by the Stefan-Boltzmann law, which has a logarithmic sensitivity to a CO2 forcing function. On the low-temperature side, we have a latent barrier resulting from the huge heat of fusion required to freeze large amounts of sea ice. During this process the temperature remains more or less constant, providing an escape hatch should the process reverse. The random walk wanders between these barriers during interglacial periods, with subtle forcings and noise accompanying the rather stable solar insolation.
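The logarithmic CO2 dependence can be made concrete with the standard simplified forcing expression of Myhre et al. (1998); this is textbook background, not a calculation from the Schmittner paper:

```python
import numpy as np

# Standard simplified expression for CO2 radiative forcing
# (Myhre et al. 1998):  dF = 5.35 * ln(C / C0)  W/m^2
def co2_forcing(C, C0=280.0):
    return 5.35 * np.log(C / C0)

# One doubling adds about 3.7 W/m^2; a second doubling adds the same
# increment again -- the forcing grows logarithmically, not linearly, in CO2.
print("280 -> 560 ppm: ", co2_forcing(560.0), "W/m^2")
print("280 -> 1120 ppm:", co2_forcing(1120.0), "W/m^2")
```

Each doubling contributes a fixed forcing increment, which is why the warm side of the well steepens only gently compared to the hard latent barrier on the cold side.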
The potential issue is that Schmittner's study occupies the maximum-glaciation time frame, which sits in the low-temperature left corner of the energy well. This is where the climate sensitivity gets tested via simulation and compared against the paleoclimate data. They don't have data for the higher-temperature regime where we reside today and where the climate sensitivity may differ. During the LGM the random walk was alternately bumping up against the cold-temperature latent barrier while making excursions to warmer temperatures. In other words, it wasn't occupying the shallow warm part of the potential energy well.
This skewing of the potential well may make a difference in the results, just as the skewing in the propagation of uncertainties in climate sensitivity can make a difference.
JC note: The origin of this post was suggestions by commenters that we discuss the Schmittner et al. paper. Based upon his comments on the thread and also his work on sensitivity, I invited WHT to do a guest post. I would like to thank WHT for preparing this at my request. This guest post does not imply any endorsement by me of either the paper or the content of WHT’s comments. BTW I am too busy this week to dig deeply into this.