by Judith Curry
The recent articles in the Daily Mail and the Guardian are generating heated reactions – more heat than light. Let's break down the arguments on both sides and assess them systematically.
The big picture
i) Warming of the climate system is unequivocal, as is now evident from observations of increases in global average air and ocean temperatures, widespread melting of snow and ice, and rising global average sea level. JC comment: It is only the surface temperature record that has sufficiently long observational time series on a global scale for credible detection and attribution studies.
ii) Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations. JC comment: the evidence that carries the greatest weight in this assessment is global climate model simulations, conducted with and without anthropogenic forcing.
iii) For the next two decades, a warming of about 0.2°C per decade is projected for a range of SRES emission scenarios. JC comment: confidence in this statement comes from the following: “Since IPCC’s first report in 1990, assessed projections have suggested global average temperature increases between about 0.15°C and 0.3°C per decade for 1990 to 2005. This can now be compared with observed values of about 0.2°C per decade, strengthening confidence in near-term projections.”
The implications of the 16-year plateau are these:
a) the IPCC detection arguments rely on a clear separation between the signals from forced climate change and natural internal variability. Numerous climate model analyses find that it is very unlikely that a plateau or period of cooling extends beyond 15-17 years in the presence of anthropogenic global warming.
b) failure of the climate models to predict a >17-year plateau raises questions about the suitability of the climate models for detection and attribution analyses, particularly in terms of accounting adequately for multidecadal modes of climate variability.
c) comparison of the observed temperature trend with the IPCC projection of a 0.2°C per decade increase in the early 21st century raises issues about the models’ reliability in terms of sensitivity to external forcing and their ability to deal with natural internal variability.
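Point (c) is, at bottom, an arithmetic comparison: fit a trend line to the recent anomaly record and set the slope against the projected 0.2°C/decade. A minimal sketch of that comparison is below; the anomaly series is synthetic and purely illustrative (it is not HadCRUT or any other actual dataset), and the helper function is my own, not anything from the IPCC analyses.

```python
# Hedged sketch: comparing an observed surface-temperature trend against
# the ~0.2 C/decade near-term IPCC projection. Data below are synthetic.

def decadal_trend(years, anomalies):
    """Ordinary least-squares slope, returned in degrees C per decade."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(anomalies) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, anomalies))
    var = sum((x - mean_x) ** 2 for x in years)
    return 10.0 * cov / var  # slope per year -> slope per decade

# Synthetic 16-year "plateau": a small trend buried in year-to-year noise
years = list(range(1997, 2013))
anomalies = [0.40, 0.45, 0.33, 0.36, 0.45, 0.51, 0.47, 0.49,
             0.54, 0.46, 0.48, 0.42, 0.50, 0.55, 0.47, 0.48]

trend = decadal_trend(years, anomalies)
print(f"fitted trend: {trend:+.2f} C/decade vs projected +0.20 C/decade")
```

Whether such a fitted slope is statistically distinguishable from zero, or from 0.2°C/decade, depends on the noise model assumed for internal variability, which is exactly where the detection argument in (a) comes in.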
My criticisms of the IPCC’s detection and attribution argument can be found in these two recently published papers:
What I personally think is going on with the climate system is summarized in my post
Mail and Guardian articles
As I see it, the main issues of contention in these two articles are semantic and related to data quality. Nuccitelli trusts the climate models, whereas in the Mail article, both Jones and Curry agree that climate models are imperfect and incomplete and did not predict such a long pause (Jones worries once the pause exceeds 15 years).
In the headline of the Mail article, and in the first statement, the following words are used: “Global warming stopped 16 years ago.” In the context of the rest of the article, this apparently refers to the 16-year plateau (or hiatus) in global average surface temperature anomalies. Critics of the Mail article seem to think this statement implies that the anthropogenic forcing of the climate has stopped; the later context of the article makes it clear that natural variability has been dominating the anthropogenic signal. That said, an arguably preferable title would have been “16-year plateau in global surface temperatures puzzles climate scientists”. However, such an article should have been written by the climate scientists; they should have owned this issue. In the absence of that, we get the inflammatory “Global warming stopped 16 years ago.”
The Guardian article brings in additional data: the Arctic sea ice minimum and ocean heat content.
Observations of global warming
In the IPCC SPM statement cited above, they include evidence of surface temperature, atmospheric temperature, ocean heat content, snow and ice melt, and sea level rise.
In assessing this evidence, we need to consider the quality of each of these data sets in terms of their maturity as climate data records and length of the records, so that we can appropriately interpret the recent variations. Further, for the purpose at hand (detecting an anthropogenic signal in recent climate change), we need to include confounding factors in assessing quality for purpose.
What do I mean by ‘quality’ and ‘maturity’ of climate data records?
Elements of the climate data maturity matrix (John Bates, NCDC):
- Software readiness: are algorithms under configuration management and how mature?
- Metadata: how full and complete are the metadata and quality assessment?
- Documentation: Is the Operational Algorithm Description full, complete, and peer reviewed?
- Product validation: How complete is the validation?
- Public Access: Are the data, algorithms and software open and available to the public?
- Utility: How extensive is the peer reviewed literature and how varied are the applications?
Quality indicators include (following Funtowicz and Ravetz):
- well established theory and method
- best available practice; large sample; direct measure
- auditability: well documented trace to method
- calibration: good fit to data
- validation: independent measurement of same variable
- objectivity: no discernible bias
Let’s assess the individual data sets according to these criteria:
- Surface temperature: Meets the maturity criteria. There are several independent data sets (although they are mostly based on the same raw data). New methods are still being developed, and past data are being revised. Post climategate, these data sets are arguably all auditable. Concerns remain regarding bias in some of the data sets, associated with adjustments and homogenization.
- Atmospheric heat content: medium maturity, two independent data sets, extremely complex algorithms that are not easily audited.
- Ocean heat content: The ARGO data scores low in terms of maturity and auditability.
- Sea level rise: The altimetry-based methods score medium in terms of maturity; the algorithms continue to be revised.
- Sea ice extent: The satellite-derived data sets are mature, and there are multiple independent datasets of sea ice extent. These datasets are auditable and widely used.
- Ice sheet and glacier mass balance data: low maturity.
For the purpose at hand (global climate of the last 16 years), all but surface temperatures and atmospheric heat content are associated with confounding factors:
- ocean heat content: given the long time scales in the ocean, it is difficult to interpret relatively short variations on the scale of 1-2 decades
- sea ice extent: this is a regional (not global) measure that is heavily influenced by natural internal variability, as well as by the long ocean time scales described above
- glaciers and ice sheets: local to regional (not global), with strong regional influences of natural internal variability. Snowfall is a confounding factor.
- sea level rise: strong element of natural internal variability, confounding factors associated with coastal land use and geologic processes.
Further, all datasets except for surface temperature degrade substantially in quality prior to 1980, making it difficult to interpret the natural background variability.
Based on this analysis, it’s difficult to get away from the idea that the best (most mature, highest quality) data set for inferring recent climate change is the surface temperature data record.
Italian Flag analysis
To sort through the claims made by both the Daily Mail and Guardian articles, let’s adopt the three-valued logic approach of the Italian Flag analysis. The basics are:
The Italian flag (IF) is a representation of three-valued logic in which evidence for a proposition is represented as green, evidence against is represented as red, and residual uncertainty is represented as white. The white area reflects uncommitted belief, which can be associated with uncertainty in evidence or unknowns.
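The bookkeeping behind the Italian Flag is simple enough to sketch in a few lines: the three bands are non-negative and sum to one, with the white band absorbing whatever belief is uncommitted. The small class and the example numbers below are illustrative assumptions of mine, not values taken from any actual IF analysis.

```python
# Hedged sketch of the Italian Flag (three-valued logic) representation:
# green = evidence for, red = evidence against, white = residual uncertainty.
# The class and the example numbers are illustrative, not from the post.

class ItalianFlag:
    def __init__(self, green, red):
        # The three bands must be non-negative and sum to 1
        assert 0.0 <= green and 0.0 <= red and green + red <= 1.0
        self.green = green                   # evidence supporting the proposition
        self.red = red                       # evidence against the proposition
        self.white = 1.0 - green - red       # uncommitted belief (unknowns)

    def __repr__(self):
        return (f"green={self.green:.2f} "
                f"white={self.white:.2f} "
                f"red={self.red:.2f}")

# Example: modest support, some counter-evidence, large uncommitted remainder
flag = ItalianFlag(green=0.30, red=0.20)
print(flag)
```

The useful feature of this representation, relative to a single probability, is that a wide white band makes explicit how much of the assessment rests on evidence that is simply absent.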