by Judith Curry
On the time scale of a few decades ahead, regional variations in weather patterns and climate will be strongly influenced by natural internal variability. The potential applications of high resolution decadal climate change predictions are described in this CLIVAR doc. Based upon my own interaction with decision makers, I see a need on these time scales that is primarily associated with infrastructure decisions. Sectors that seem particularly interested in predictions on this timescale are city and regional planners, the military, and the financial sector.
It is the combination of this natural variability and forced anthropogenic climate change that is of particular interest; natural variability dominates regional climate change in many locations. Further, decision makers need to know whether the climate is varying because of natural variability, and hence can be expected to reverse at some point, or whether it is changing as the result of irreversible anthropogenic forcing.
So how should we approach this problem? Is there predictability in the climate system on these timescales? If so, how can this predictability be realized and converted into useful predictions?
The targets of interest on the timescale of the next two decades are:
- the evolution of the global average temperature anomaly, for understanding the relative roles of anthropogenic versus natural climate variability/change
- the evolution of regional climate variability to support regional decision making: average temperature and precipitation; extreme events (heat/cold waves, floods, droughts, hurricanes, wildfires, etc.).
Possible strategies for making decadal climate predictions on both global and regional scales include:
- global climate model simulations, including dynamical downscaling methods using regional climate models forced by the global climate model simulations.
- statistical forecast methods that combine projections of forcings with known modes of natural internal variability
- statistical/dynamical methods that combine elements of both of the previous approaches
Climate model projections from CMIP3
While the CMIP3 20th century simulations used in the AR4 show some average skill on subcontinental scales (e.g. the U.S.), they show little skill on regional scales, and none in many regions (notably the southeastern U.S., a location that I have investigated closely). One strategy that has been used for future projections of regional climate change is to take the projections from the CMIP3 21st century simulations and use these fields to force higher resolution regional climate models (referred to as dynamical downscaling). The idea of dynamical downscaling is to nest a regional climate model within the global simulation, with successively higher resolution grids stepping down from the continental scale (say, the U.S.) to the regional and local scales of interest. An ambitious statistical downscaling effort for the U.S. climate based on the CMIP3 simulations is described here.
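To make the statistical variant of downscaling concrete, here is a minimal sketch of a regression-based statistical downscaling step. The data, variable names, and coefficients are all illustrative stand-ins of my own, not actual CMIP3 output or any published scheme:

```python
import numpy as np

# Minimal sketch of regression-based statistical downscaling.
# x_large: large-scale predictor from a GCM grid cell (e.g., seasonal-mean
# temperature anomaly); y_local: co-located station observations.
# Both arrays are synthetic placeholders, not real CMIP3 output.

rng = np.random.default_rng(0)
x_large = rng.normal(size=50)                              # 50 years of GCM anomalies
y_local = 0.8 * x_large + rng.normal(scale=0.3, size=50)   # synthetic "station" record

# Fit the transfer function y = a*x + b on the historical overlap period.
a, b = np.polyfit(x_large, y_local, deg=1)

# Apply the fitted relationship to a future projection from the same grid cell.
x_future = rng.normal(loc=0.5, size=20)                    # placeholder 21st-century anomalies
y_downscaled = a * x_future + b
print(y_downscaled.mean())
```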
The challenge of decadal scale predictability
The IPCC AR4 projected a near term global average temperature increase of 0.2C per decade for the next few decades, and showed an insensitivity of global average surface temperature to emission scenarios prior to about 2060. For projections on the time scale 2010-2030, the strategy described above is only marginally useful, owing to problems that the global models have with the modes of natural internal variability, which dominate the global warming signal on this time scale.
The challenge for prediction on the time scale 2010-2030 is that it is both a boundary value problem and an initial value problem. Hurrell et al. (2009) describe the challenge:
Efforts to predict the evolution of climate over the next several decades that take into account both forced climate change and natural decadal-scale climate variability are in their infancy. Many formidable challenges exist. For example, climate system predictions on the decadal time scale will require initialization of coupled general circulation models with the best estimates of the current observed state of the atmosphere, oceans, cryosphere, and land surface – a state influenced both by the current phases of modes of natural variability and by the accumulated impacts to date of anthropogenic radiative forcing. However, given imperfect observations and systematic errors in models, the best method of initialization has not yet been established, and it is not known what effect initialization has on climate predictions. It is also not clear what predictions should be attempted or how they will be verified. The brevity of most instrumental records furthermore means that even the basic characteristics and mechanisms of decadal variations in climate are relatively poorly documented and understood. As a consequence, the representation of natural variability arising from the slowly-varying components of the climate system differs considerably among models, so the inherent predictability of the climate system on the decadal time scale is also not well established. Demands will therefore be made on observations, particularly ocean observations, not only to describe the state of the climate system and improve knowledge of the mechanisms that give rise to decadal fluctuations in climate, but also to provide the optimal observations for decadal climate predictions and their verification.
Basis for decadal scale predictability
Excerpts from Hurrell et al.:
a) External forcing
A significant source of predictability on the decadal time scale is associated with radiative forcing changes. Past emissions of greenhouse gases have committed the climate system to future warming as the ocean comes into equilibrium with the altered radiative forcing (Hansen et al. 2005; IPCC 2007). Changes in solar irradiance and volcanic activity in the recent past also can provide some level of decadal predictability as the climate system responds to these forcing changes on decadal scales.
The best possible estimates of future radiative forcing changes are also needed for predictions. Estimates of future emissions of radiatively important pollutants are needed for making predictions, as well as modeling capabilities to accurately simulate both how these pollutants affect the global energy, carbon and sulfur cycles, and how the climate system subsequently responds to that altered forcing. In this regard, future external forcing from greenhouse gases is likely to provide significant regional decadal predictability (e.g., Lee et al. 2006), since the increase of concentrations over the next 30 years is about the same no matter what emission scenario is followed (Hibbard et al. 2007). While man-made aerosols can be washed out of the atmosphere by rain in just a few days, they tend to be concentrated near their sources such as industrial regions, and can affect climate with a very strong regional pattern. Future changes in anthropogenic aerosols, therefore, could have very significant regional climatic impacts on decadal scales. Unpredictable volcanic eruptions can be a significant “wild card” to decadal climate predictions, although techniques to handle this aspect are under consideration. Similarly, only very general features of the 11-year solar cycle can be projected, but could provide some decadal scale predictability. The influence of the stratosphere, by transmitting external forcing signals to the troposphere, might also be a significant source of predictability.
b) Natural internal variability
Regionally, the predictability of SST can be higher than for the global field (not shown), with the highest levels on decadal time scales over the middle to high latitude ocean areas of both hemispheres, especially in regions where the surface layer makes contact with the deeper ocean beneath (Boer and Lambert 2008). A fundamental precept in predictability is the notion that long-lived variations, such as those associated with the PDO or changes in the strength of the Atlantic MOC, can be predicted for a significant fraction of their lifetimes. This is simply a reflection of the fact that the persistence of the variation implies a stable balance that permits the variation to have an extended lifetime. Thus, there is some confidence that naturally occurring climate variations with decadal time scales can, at times, be predictable given an accurate initial state. These times are likely to be when a significant amplitude variation exists. At other times, particularly the nascent phase of variation growth, the predictability of variations is likely to be quite delicate and require a very accurate depiction of the current state of the climate system if there is to be any hope of accurate prediction.
Finally, it should be noted that many climate and biogeochemical variables exhibit long-term persistence that could be exploited using statistical forecasting schemes. Physical damping of high-frequency variability increases the decadal signal to noise ratio and hence the potential predictability on decadal timescales. Simple linear multivariate decadal prediction schemes that exploit the long-term damped persistence of certain physical processes may, in fact, be quite successful (e.g., Lean and Rind 2009), but they rely heavily on long-term data sets to accurately estimate the covariance matrix. With their potential for long records, paleoclimate reconstructions may be of use in estimating such statistical relationships and developing predictive models. For example, Enfield and Cid-Serrano (2006) developed a statistical model for predicting regime shifts in the AMO; similar methodology could be applied for other climatic shifts.
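To illustrate the kind of damped-persistence scheme the quoted passage alludes to, here is a minimal AR(1) sketch. The index series is synthetic, and this is a toy stand-in for schemes like Lean and Rind (2009), not their actual model:

```python
import numpy as np

# Minimal sketch of a damped-persistence (AR(1)) forecast of a decadal index.
# 'index' stands in for something like a smoothed AMO or SST anomaly series;
# the data here are synthetic, not an actual reconstruction.

rng = np.random.default_rng(1)
n = 120
index = np.zeros(n)
for t in range(1, n):                       # generate a persistent red-noise series
    index[t] = 0.9 * index[t - 1] + rng.normal(scale=0.1)

# Estimate the lag-1 autocorrelation phi from the historical record.
phi = np.corrcoef(index[:-1], index[1:])[0, 1]

# Damped-persistence forecast: the anomaly decays toward the mean as phi**k.
last = index[-1]
forecast = [last * phi**k for k in range(1, 11)]   # 10-step-ahead forecast
print(np.round(forecast, 3))
```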
In recognition of this need for improved decadal scale predictions, CMIP5 is coordinating decadal hindcast and prediction experiments, which will be used in the IPCC AR5. The following information on the CMIP5 decadal simulations is drawn from the presentation made by James Hurrell at the Fall AGU meeting (a schematic summary follows the list):
- 10 year integrations with initial dates from 1960-2005
- 1960, 1980, and 2005 integrations extended 30 years
- Ensemble predictions (minimum 3 members)
- Ocean initial conditions should be in some way representative of the observed anomalies or full fields for the start date
- Land, sea-ice and atmosphere initial conditions left to the discretion of individual modeling groups
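As a schematic summary, the experiment design can be encoded as a simple configuration. The field names and the five-year spacing of start dates are my own illustrative reading of the protocol, not official CMIP5 metadata:

```python
# Illustrative encoding of the CMIP5 decadal experiment design; field names
# are mine, and the 5-year start-date spacing is an assumption for the sketch.
cmip5_decadal = {
    "start_dates": list(range(1960, 2006, 5)),   # initial dates spanning 1960-2005
    "length_years": 10,                          # standard 10-year integrations
    "extended_start_dates": [1960, 1980, 2005],  # these runs extend to 30 years
    "extended_length_years": 30,
    "min_ensemble_members": 3,
    "ocean_init": "observed anomalies or full fields at start date",
    "atmos_land_seaice_init": "modeling group's discretion",
}
```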
The CMIP5 simulations are not yet publicly available. Some results from decadal scale simulations can be obtained from
Hoerling et al. (2010, submitted) have conducted an interesting set of experiments that makes predictions for North American climate for 2011-2020:
North American mean surface air temperature and precipitation are predicted for the upcoming 2011-2020 decade. Multiple climate models forced by various plausible scenarios for the 2011-2020 change in ocean surface boundary conditions are first employed in order to estimate the forced response, and its uncertainty, to expected changes in anthropogenic forcing. A full probabilistic decadal forecast is then generated by commingling the statistics of the forced response with those arising from internal decadal sea surface temperature (SST) and sea ice variability. The latter are estimated from a multi-model suite of 20th Century atmospheric climate simulations driven by the observed time history of SST and sea ice variations.
The prediction is characterized by surface warming over the entire continent and precipitation decreases (increases) over the contiguous United States (Canada) relative to 1971-2000 conditions. The signs of these signals are robust across the scenarios and the models employed, though their amplitudes are not. An assessment of the sources of forecast uncertainty reveals comparable sensitivity to the various scenarios of forced SST change, model dependency, internal atmospheric noise, and internal decadal SST variability. Taking these sources of forecast uncertainty into account, predictions for the 2011-2020 decade indicate a 94% and 98% probability for warmer than normal conditions over the U.S. and Canada, respectively, a 99% probability of wet conditions over Canada, and a 75% probability of dry conditions over the U.S.
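As a back-of-envelope illustration of how probabilities like these arise from a forced signal plus noise, here is a minimal sketch assuming a Gaussian forecast distribution. The shift and spread values are invented for illustration, not Hoerling et al.'s numbers:

```python
from math import erf, sqrt

# Minimal sketch: probability of "warmer than normal" when the forecast is a
# forced mean shift plus Gaussian internal variability. Numbers illustrative,
# not those of Hoerling et al. (2010).

forced_shift = 0.6   # decadal-mean warming signal, degC (assumed)
sigma = 0.4          # combined spread: scenario + model + internal noise (assumed)

# "Normal" is the 1971-2000 mean, i.e., an anomaly of zero. With a Gaussian
# forecast distribution N(forced_shift, sigma^2):
p_warm = 0.5 * (1 + erf(forced_shift / (sigma * sqrt(2))))
print(f"P(warmer than normal) = {p_warm:.2f}")   # ~0.93 for these numbers
```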
The key element in these simulations is selecting scenarios for the 2011-2020 sea surface temperatures and sea ice extent:
Three scenarios for the 2011-2020 SST change (relative to 1971-2000) due to anthropogenic GHG emissions are generated (Fig. 1). One uses the 22 model average SST anomalies computed from the CMIP3 simulations (Meehl et al. 2007) subjected to the SRESA1B emissions scenario. The other two are derived from the temporal optimal detection method (Ribes et al. 2010) in which the temporal pattern of the forced response over 1901-2020 is derived from the CMIP3 simulations, but the spatial pattern of change is derived from observations. In particular, the temporal pattern is generated by averaging global mean surface air temperature over the 22 separate CMIP3 models for each year, and further imposing a smoothing constraint as in Ribes et al. (2010). Its structure resembles a linear increasing function during the first half of the 20th century and an exponentially increasing function in later decades. A spatial pattern is computed by regressing observed SST upon the temporal pattern for 1901-2009, and the 2011-2020 anomaly amplitudes are then derived from the temporal optimal for 2011-2020. Given the observational uncertainty in estimating the spatial pattern of the centennial SST trend (e.g., Deser et al. 2010), two datasets, the NOAA Extended Reconstruction version 3b analysis (Smith et al. 2008) and the Hurrell analysis (Hurrell et al. 2008), are used. For Arctic sea ice, we use a single scenario that involves persisting the recent (2007-2009) pattern of observed monthly sea ice concentration. This was a period of record low Arctic sea ice extent and concentration (see Fig. 1 in Kumar et al. 2010).
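The regression step in this passage can be sketched compactly: regress observed SST at each grid point onto a single forced temporal pattern, then scale the resulting spatial pattern by the temporal pattern's forecast amplitude. Everything below is synthetic and illustrative, not the Ribes et al. (2010) implementation:

```python
import numpy as np

# Minimal sketch of the regression step described above. A spatial pattern
# is obtained by regressing "observed" SST at each grid point onto a single
# smoothed temporal pattern; the 2011-2020 anomaly is that pattern scaled
# by the temporal pattern's forecast amplitude. All data here are synthetic.

rng = np.random.default_rng(2)
years = np.arange(1901, 2010)
temporal = 0.005 * (years - 1901) ** 1.5 / 30.0    # stand-in forced temporal pattern

# Synthetic "observed" SST anomalies at a few grid points: pattern + noise.
true_pattern = np.array([0.8, 1.2, 0.5])           # degC per unit temporal pattern
sst = np.outer(temporal, true_pattern) + rng.normal(scale=0.1, size=(len(years), 3))

# Least-squares regression (no intercept, for brevity) of each grid point's
# SST on the temporal pattern gives the spatial regression pattern.
beta = sst.T @ temporal / (temporal @ temporal)

# Scale by a placeholder amplitude for the 2011-2020 forecast decade.
amp_2011_2020 = temporal[-1] * 1.1
sst_scenario = beta * amp_2011_2020
print(np.round(sst_scenario, 3))
```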
JC’s assessment of climate model decadal predictions
The simulations that seem most relevant to the problem at hand are the CMIP5 simulations initialized in 2005 and run for 30 years (out to 2035). Based upon my experience with subseasonal and seasonal predictability, it seems that predictability is greatest when the model is initialized in a well-established regime. For example, for subseasonal forecasts this depends on where in the cycle of the Madden-Julian oscillation the model is initialized; for seasonal forecasts, it depends on where in the ENSO cycle the model is initialized. The well-known spring (April) ENSO prediction barrier arises because this is the season of transition. If this same general principle holds for the decadal forecasts, then 2005 is a good year to initialize in terms of the AMO/NAO/AMOC, since 2005 was a peak (if not the peak) of the current warm phase of the AMO. This means that regional climate features that are sensitive to the AMO should be well represented (e.g. Sahel drought, North Atlantic hurricanes). In terms of the Pacific, 2005 was a transition year for the PDO, so I would hypothesize that the models will have more difficulty simulating the correct evolution of the PDO.
The experimental design of the Hoerling et al. paper is probably to be preferred out to one decade, since you can take your knowledge of the ocean oscillations, project sea surface temperature and sea ice characteristics forward, and then see how the atmosphere and land surface respond. How far into the future such an experimental design remains useful depends on when you initialize it. If the model is initialized in 2010, with the warm AMO and cool PDO well established, this strategy should work for another 15 and possibly 20 years, depending on how long the AMO can be expected to remain in its warm phase.
The key challenge of multi-decadal climate forecasting is predicting the change points (transitions) of the major ocean oscillations. Again, based on my experience with probabilistic seasonal forecasting, the only way I see to do this with any potential skill is to select the models that do a relatively good job of simulating the key features in hindcast mode, and then select the ensemble members from these models that compare best with observations for the first year or two of the simulation. The rationale for such a selection is that ensemble members that get off to a good start are more likely to be on a good trajectory going forward. I look forward to getting my hands on the CMIP5 simulations.
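A minimal sketch of this member-screening idea, with synthetic placeholder data standing in for ensemble output and observations:

```python
import numpy as np

# Minimal sketch of member screening: rank ensemble members by their fit to
# observations over the first two years of the run and keep the best ones.
# Arrays are synthetic placeholders, not actual CMIP5 output.

rng = np.random.default_rng(3)
n_members, n_months = 10, 360                  # 10 members, 30-year monthly runs
ensemble = rng.normal(size=(n_members, n_months)).cumsum(axis=1) * 0.02
obs_first2y = ensemble[0, :24] + rng.normal(scale=0.05, size=24)  # stand-in obs

# RMSE of each member against observations over the first 24 months.
rmse = np.sqrt(((ensemble[:, :24] - obs_first2y) ** 2).mean(axis=1))

# Keep the 3 members that got off to the best start.
best = np.argsort(rmse)[:3]
print("selected members:", best, "RMSE:", np.round(rmse[best], 3))
```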
Roger Pielke Sr. asks whether it is worth the expenditure to improve multi-decadal climate forecast models. The answer he comes up with is “no.”
The subject of Part II will be statistical modeling approaches, including issues related to solar variations, the ocean oscillations, and the interesting new ideas that border on ignorance related to magnetic fields, extraterrestrial influences, and some of the other issues raised by participants here.