by Judith Curry
Two key questions in the climate debate are:
- How much of the recent warming can be attributed to natural variability versus anthropogenic forcing?
- Is the rate of warming in the latter half of the 20th century unusual or unprecedented?
A new paper published in Climate Dynamics tackles both of these questions:
On the time-varying trend in global-mean surface temperature
Zhaohua Wu • Norden E. Huang • John M. Wallace • Brian V. Smoliak • Xianyao Chen
Abstract. The Earth has warmed at an unprecedented pace in the decades of the 1980s and 1990s. In Wu et al. (2007) we showed that the rapidity of the warming in the late twentieth century was a result of concurrence of a secular warming trend and the warming phase of a multidecadal (~65-year period) oscillatory variation and we estimated the contribution of the former to be about 0.08°C per decade since ~1980. Here we demonstrate the robustness of those results and discuss their physical links, considering in particular the shape of the secular trend and the spatial patterns associated with the secular trend and the multidecadal variability. The shape of the secular trend and rather globally-uniform spatial pattern associated with it are both suggestive of a response to the buildup of well-mixed greenhouse gases. In contrast, the multidecadal variability tends to be concentrated over the extratropical Northern Hemisphere and particularly over the North Atlantic, suggestive of a possible link to low frequency variations in the strength of the thermohaline circulation. Depending upon the assumed importance of the contributions of ocean dynamics and the time-varying aerosol emissions to the observed trends in global-mean surface temperature, we estimate that up to one third of the late twentieth century warming could have been a consequence of natural variability.
Climate Dynamics, published online 07 July 2011, DOI 10.1007/s00382-011-1128-8 [link]. The full paper is behind a paywall.
The authors have a presentation with the same title that can be viewed here, dated 24 January 2011. The abstract reads:
The Earth has warmed at an unprecedented pace in recent decades. In assessing how much of this warming is natural and how much of it is human-induced it is useful to partition the global-mean surface temperature into the secular trend and the (oscillatory) multidecadal variability. Previously, we showed that the rapidity of the warming in recent decades was a result of concurrence of a secular warming trend and the warming phase of a multidecadal (~65-year period) oscillation and we estimated the contribution of the former to be about 0.08°C per decade since ~1980. Here we demonstrate the robustness of those results and focus on their physical interpretation, considering in particular the shape of the secular trend and the spatial patterns associated with the secular trend and the multidecadal variability. The shape of the secular trend and rather globally-uniform spatial pattern associated with it are both suggestive of a response to the buildup of well-mixed greenhouse gases. In contrast, the multidecadal variability tends to be concentrated over the extratropical Northern Hemisphere and particularly over the North Atlantic, suggestive of a possible link to low frequency variations in the strength of the thermohaline circulation. Depending upon the assumed importance of the contributions of ocean dynamics and the time-varying aerosol emissions to the observed trends in global-mean surface temperature, we estimate that up to half the late 20th century warming could have been a consequence of natural variability. In addition, we show that the long term global warming associated with the secular trend is not accelerating in recent decades.
The authors’ previous PNAS paper is available online:
On the trend, detrending, and variability of nonlinear and nonstationary time series
Zhaohua Wu, Norden Huang, Steven Long and Chung-Kang Peng
Abstract. Determining trend and implementing detrending operations are important steps in data analysis. Yet there is no precise definition of “trend” nor any logical algorithm for extracting it. As a result, various ad hoc extrinsic methods have been used to determine trend and to facilitate a detrending operation. In this article, a simple and logical definition of trend is given for any nonlinear and nonstationary time series as an intrinsically determined monotonic function within a certain temporal span (most often that of the data span), or a function in which there can be at most one extremum within that temporal span. Being intrinsic, the method to derive the trend has to be adaptive. This definition of trend also presumes the existence of a natural time scale. All these requirements suggest the Empirical Mode Decomposition (EMD) method as the logical choice of algorithm for extracting various trends from a data set. Once the trend is determined, the corresponding detrending operation can be implemented. With this definition of trend, the variability of the data on various time scales also can be derived naturally. Climate data are used to illustrate the determination of the intrinsic trend and natural variability.
Ensemble empirical mode decomposition: a noise-assisted data analysis method
Zhaohua Wu and Norden Huang
Abstract. A new Ensemble Empirical Mode Decomposition (EEMD) is presented. This new approach consists of sifting an ensemble of white noise-added signals (data) and treating the mean as the final true result. Finite, not infinitesimal, amplitude white noise is necessary to force the ensemble to exhaust all possible solutions in the sifting process, thus making the different scale signals collate in the proper intrinsic mode functions (IMF) dictated by the dyadic filter banks. As EEMD is a time–space analysis method, the added white noise is averaged out with a sufficient number of trials; the only persistent part that survives the averaging process is the component of the signal (original data), which is then treated as the true and more physically meaningful answer. The effect of the added white noise is to provide a uniform reference frame in the time–frequency space; therefore, the added noise collates the portion of the signal of comparable scale in one IMF. With this ensemble mean, one can separate scales naturally without any a priori subjective criterion selection as in the intermittence test for the original EMD algorithm. This new approach takes full advantage of the statistical characteristics of white noise to perturb the signal in its true solution neighborhood, and to cancel itself out after serving its purpose; therefore, it represents a substantial improvement over the original EMD and is a truly noise-assisted data analysis (NADA) method.
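To make the method concrete, here is a heavily simplified, pure-Python sketch of EMD/EEMD (my own toy code, not the authors'): linear envelopes stand in for the cubic splines used in practice, and a fixed sift count replaces the usual stopping criterion. The residue of the decomposition plays the role of the adaptively determined secular trend.

```python
import math
import random

def local_extrema(x):
    """Return indices of interior local maxima and minima of a sequence."""
    maxima, minima = [], []
    for i in range(1, len(x) - 1):
        if x[i] > x[i - 1] and x[i] >= x[i + 1]:
            maxima.append(i)
        elif x[i] < x[i - 1] and x[i] <= x[i + 1]:
            minima.append(i)
    return maxima, minima

def linear_envelope(idx, x):
    """Piecewise-linear 'envelope' through the extrema, held flat at the ends.
    (Practical EMD uses cubic splines; linear segments keep the sketch short.)"""
    n = len(x)
    pts = [0] + idx + [n - 1]
    vals = [x[idx[0]]] + [x[i] for i in idx] + [x[idx[-1]]]
    env = [0.0] * n
    for k in range(len(pts) - 1):
        i0, i1 = pts[k], pts[k + 1]
        for j in range(i0, i1 + 1):
            t = (j - i0) / (i1 - i0) if i1 > i0 else 0.0
            env[j] = vals[k] + t * (vals[k + 1] - vals[k])
    return env

def sift(x, n_sifts=10):
    """Extract one IMF with a fixed number of sifting passes."""
    h = list(x)
    for _ in range(n_sifts):
        maxima, minima = local_extrema(h)
        if len(maxima) < 2 or len(minima) < 2:
            return None  # too few oscillations left: the input is the residue
        upper = linear_envelope(maxima, h)
        lower = linear_envelope(minima, h)
        h = [v - 0.5 * (u + l) for v, u, l in zip(h, upper, lower)]
    return h

def emd(x, max_imfs=6):
    """Decompose x into IMFs plus a slowly varying residue (the 'trend')."""
    imfs, residue = [], list(x)
    for _ in range(max_imfs):
        imf = sift(residue)
        if imf is None:
            break
        imfs.append(imf)
        residue = [r - c for r, c in zip(residue, imf)]
    return imfs, residue

def eemd(x, n_ensemble=50, noise_std=0.2, max_imfs=6, seed=0):
    """Ensemble EMD: decompose many noise-perturbed copies of x and
    average the components, so the added white noise cancels out."""
    rng = random.Random(seed)
    n = len(x)
    mean_imfs = [[0.0] * n for _ in range(max_imfs)]
    mean_res = [0.0] * n
    for _ in range(n_ensemble):
        noisy = [v + rng.gauss(0.0, noise_std) for v in x]
        imfs, res = emd(noisy, max_imfs)
        for k, imf in enumerate(imfs):
            for j in range(n):
                mean_imfs[k][j] += imf[j] / n_ensemble
        for j in range(n):
            mean_res[j] += res[j] / n_ensemble
    return mean_imfs, mean_res

# Demo: a linear "secular trend" plus two oscillations of different scales.
n = 256
signal = [0.01 * i + math.sin(2 * math.pi * i / 32)
          + 0.5 * math.sin(2 * math.pi * i / 8) for i in range(n)]
imfs, residue = emd(signal)
mean_imfs, mean_res = eemd(signal)
```

By construction the IMFs plus the residue sum back to the input exactly; in the ensemble version the added noise averages toward zero across trials, which is the point of the EEMD procedure.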
Excerpts from the Climate Dynamics article
From the Introduction:
Of particular interest is the estimation and attribution of the secular trend (ST).
Figure TS.6 of the Technical Summary of the Fourth Assessment Report (AR4) of the Intergovernmental Panel on Climate Change (IPCC 2007) presents four estimates of linear trends in observation-based GST, fitted over different timescales and ranging from a warming trend of 0.045 ± 0.012°C/decade for the past 150 years to 0.177 ± 0.052°C/decade for the most recent 25 years (both periods ending in 2003). These estimates are shown in Fig. 1, together with a time series of the 25-year running linear trend. It is apparent from that figure that global warming has proceeded in a stepwise fashion, with relatively rapid rates of temperature increase from 1915 to 1935 and from 1980 to 1998 alternating with periods of much weaker and sometimes even negative trends centered around 1900 and 1950. These statistics serve to illustrate the sensitivity of such linear trend estimates to the choice of start and end points upon which they are based. Short-term linear trends are an amalgamation of the ST and fluctuations with timescales too long to be resolved by conventional time series analysis techniques. The interpretation of the multidecadal variability (MDV) is particularly problematic in this respect.
The most widely used method of determining the trend in a data set is to draw the least-squares best-fit straight line within prescribed intervals, as was done in IPCC AR4. In reality, the rate of increase of GST in response to the cumulative buildup of long-lived greenhouse gases and the changing rates of emission of aerosols is time dependent. Representing secular trends in GST in terms of linear trends is often not physically realistic. A more informative representation is an intrinsically-determined monotonic curve, having at most one extremum within a given time span (Huang et al. 1998; Wu et al. 2007).
If the cycles and secular trend extracted from the data do reflect the physical processes operating at a given time, then they should be temporally local quantities, and the corresponding physical interpretation within specified time intervals should not change with the addition of new data, for the subsequent evolution of a physical system cannot alter the reality that has already happened. Indeed, temporal locality should be the first principle guiding all time series analysis. This requirement reflects the evolution of time series analysis from the Fourier transform, to the windowed Fourier transform (Gabor 1946), and on to wavelet analysis (Daubechies 1992). It can be verified that the linear trends as fitted in AR4 (IPCC 2007) do not satisfy this locality principle, while the adaptive trend defined in Wu et al. (2007) and extracted using the ensemble empirical mode decomposition (EEMD) method (Huang and Wu 2008; Wu and Huang 2009) satisfies it at least qualitatively (as will be shown later); hence, the ST determined adaptively by the data has a better chance of reflecting the underlying physics and resolving the ambiguity between the trend and the fluctuations superimposed upon it.
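The endpoint sensitivity the authors describe is easy to reproduce with a toy calculation (pure Python; the numbers are illustrative, not the paper's): a constant secular trend of 0.06 °C/decade plus a 65-year oscillation yields very different fitted linear trends depending on the fitting window.

```python
import math

# Synthetic annual "GST" anomaly, 1880-2008: a constant secular trend of
# 0.06 °C/decade plus a 65-year oscillation of amplitude 0.12 °C.
# All numbers are illustrative, not taken from the paper.
years = list(range(1880, 2009))
gst = [0.006 * (y - 1880) + 0.12 * math.sin(2 * math.pi * (y - 1880) / 65.0)
       for y in years]

def ols_slope(x, y):
    """Least-squares slope of y against x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

def trend_per_decade(start, end):
    """Linear trend (°C/decade) fitted over the window [start, end]."""
    i0, i1 = years.index(start), years.index(end) + 1
    return 10.0 * ols_slope(years[i0:i1], gst[i0:i1])

# The fitted "trend" depends strongly on where the window sits relative to
# the phase of the oscillation, even though the secular trend never changes.
for start in (1880, 1950, 1984):
    print(f"{start}-2008: {trend_per_decade(start, 2008):+.3f} °C/decade")
```

Windows that begin in the warming phase of the oscillation inflate the fitted trend well above the underlying 0.06 °C/decade, which is exactly the ambiguity an adaptive trend extraction is meant to resolve.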
From the Summary and Discussion:
In the previous sections, we have presented the results of EEMD analysis, which indicate that the secular warming trend during the 1980s and 1990s was not as large as the linear trends of the observation-based GST estimated in AR4 (IPCC 2007); and that the unprecedented rate of warming in the late twentieth century was a consequence of the concurrence of the upward swing of the multidecadal variability, quite possibly caused at least in part by an increase in the strength of the thermohaline circulation, and a secular warming trend due to the buildup of greenhouse gases. We estimate that as much as one third of the warming of the past few decades as reported in Fig. TS.6 of the Technical Summary of AR4 (IPCC 2007) may have been due to the speeding up of the thermohaline circulation.
Other researchers have reached a similar conclusion: Keenlyside et al. (2008), Semenov et al. (2010) and DelSole et al. (2011) on the basis of numerical experiments with climate models capable of representing the variability of the Atlantic meridional overturning circulation; Wild et al. (2007) on the basis of long term trends in the character of the diurnal temperature cycle at the Earth’s surface; and Swanson et al. (2009) based on an analysis of the partitioning of the GST trends using linear discriminant analysis. Furthermore, by analyzing the temporal derivatives of ST, we have demonstrated that the secular warming trend in GST has not accelerated sharply in the past few decades.
These caveats notwithstanding, the results presented here further substantiate the reality of human-induced global warming, as evidenced by the similarity between the secular trend curve recovered from EEMD of GST and the buildup of atmospheric greenhouse gas concentrations and by the near-global extent of the temperature increases associated with the secular trend. Our results also serve to highlight the importance of Atlantic multidecadal variability in mediating the rate of global warming, and they suggest that these variations deserve more explicit consideration in twentieth century climate simulations and in attribution studies based on recent observations of the rate of change of GST.
JC comments: I think this paper is an important contribution to our understanding of the climate variability of the 20th century. The paper highlights significant inadequacies in the IPCC AR4 analysis.
Note from Mike Wallace and Zhaohua Wu added 7/19:
You portray our article that appeared recently in Climate Dynamics as arguing that up to a third of the warming in the latter half of the 20th century can be attributed to the Atlantic Multidecadal Oscillation (AMO). The referent for this statement is the fourth column of Table 2 in our article, which presents our attribution of the trends for the past 25 years, ending in 2008. In the third column of that same Table, we attribute only about 15% of the 50-year trend to the AMO. Our intent in presenting these statistics is not to contest the IPCC’s attribution of the late 20th century (i.e., the 50-year) trend, but, rather, to question whether the acceleration in the rate of greenhouse warming has been as pronounced as implied by the graph presented in Figure TS.6 in the Technical Summary of the IPCC’s Fourth Assessment Report (AR-4), which shows linear trends for the most recent 100, 50, and 25 years ending in 2005. The papers by DelSole and Shukla and by Semenov et al., referenced in our paper, make the same point, but based on different kinds of evidence.