by Judith Curry
Climate Science and the Uncertainty Monster
This paper is a very long one, so I elected not to discuss it all in one post but to present it in sections. Further, most of the material is not new; I have developed these arguments on previous uncertainty threads. Once the paper is in press, I will post the entire thing for discussion.
1. Introduction (draws mainly from my original post on the uncertainty monster)
Sidebar: Uncertainty lexicon (with material from the original uncertainty monster post)
2. Uncertainty of Climate Models (draws heavily from the post “What can we learn from climate models?”)
3. Uncertainty and the IPCC (see below)
3. Uncertainty and the IPCC
“You are so convinced that you believe only what you believe that you believe, that you remain utterly blind to what you really believe without believing you believe it.” Orson Scott Card, Shadow of the Hegemon
How to reason about uncertainties in the complex climate system and its computer simulations is neither simple nor obvious. Scientific debates involve controversies over the value and importance of particular classes of evidence, as well as disagreement about the appropriate logical framework for linking and assessing the evidence. The IPCC faces a daunting challenge in characterizing and reasoning about uncertainty: assessing the quality of evidence, linking the evidence into arguments, identifying areas of ignorance, and assessing confidence levels.
3.1 Characterizing uncertainty
“A long time ago a bunch of people reached a general consensus as to what’s real and what’s not and most of us have been going along with it ever since.” Charles de Lint
Over the course of four Assessment Reports, the IPCC has given increasing attention to reporting uncertainties (e.g. Swart et al. 2009). The “Guidance Paper” by Moss and Schneider (2000) recommended steps for assessing uncertainty in the IPCC Assessment Reports and a common vocabulary to express quantitative levels of confidence based on the amount of evidence (number of sources of information) and the degree of agreement (consensus) among experts.
The IPCC guidance for characterizing uncertainty in the AR4 describes three approaches for indicating confidence in a particular result and/or the likelihood that a particular conclusion is correct:
1. A qualitative level-of-understanding scale describes the level of scientific understanding in terms of the amount of evidence available and the degree of agreement among experts. There can be limited, medium, or much evidence, and agreement can be low, medium, or high.
2. A quantitative confidence scale estimates the level of confidence in a scientific finding, ranging from ‘very high confidence’ (at least a 9 in 10 chance of being correct) to ‘very low confidence’ (less than a 1 in 10 chance).
3. A quantitative likelihood scale represents ‘a probabilistic assessment of some well-defined outcome having occurred or occurring in the future.’ The scale ranges from ‘virtually certain’ (greater than 99% probability) to ‘exceptionally unlikely’ (less than 1% probability).
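To make the likelihood scale concrete, here is a minimal sketch (in Python; the code framing is mine, not the IPCC’s) of the scale as a calibrated-language lookup. Only the two endpoint terms are quoted above; the intermediate terms and thresholds follow the published AR4 uncertainty guidance.

```python
# A sketch of the AR4 quantitative likelihood scale as a lookup table.
# The section above quotes only the two endpoints; the intermediate terms
# and thresholds here follow the published AR4 uncertainty guidance.

AR4_LIKELIHOOD_SCALE = [
    (0.99, "virtually certain"),       # > 99% probability
    (0.95, "extremely likely"),        # > 95%
    (0.90, "very likely"),             # > 90%
    (0.66, "likely"),                  # > 66%
    (0.50, "more likely than not"),    # > 50%
    (0.33, "about as likely as not"),  # 33% to 66%
    (0.10, "unlikely"),                # < 33%
    (0.05, "very unlikely"),           # < 10%
    (0.01, "extremely unlikely"),      # < 5%
    (0.00, "exceptionally unlikely"),  # < 1%
]

def likelihood_term(p: float) -> str:
    """Return the AR4 calibrated language for a probability p in [0, 1]."""
    for threshold, term in AR4_LIKELIHOOD_SCALE:
        if p > threshold:
            return term
    return "exceptionally unlikely"

print(likelihood_term(0.97))  # extremely likely
print(likelihood_term(0.40))  # about as likely as not
```

Note that the categories are nested rather than mutually exclusive (a 97% probability is both ‘extremely likely’ and ‘very likely’), so the lookup proceeds from the most certain term downward and reports the strongest applicable statement.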
Oppenheimer et al. (2007), Webster (2009), Petersen (2006), and Kandlikar et al. (2005) argue that future IPCC efforts need to be more thorough about describing sources and types of uncertainty, making the uncertainty analysis as transparent as possible. The InterAcademy Council (IAC) reviewed the IPCC’s performance on characterizing uncertainty. In response to concerns raised in the review, the IAC made the following recommendations regarding the IPCC’s treatment of uncertainty:
- “Each Working Group should use the qualitative level-of-understanding scale in its Summary for Policymakers and Technical Summary, as suggested in IPCC’s uncertainty guidance for the Fourth Assessment.” This is a key element of uncertainty monster detection.
- “Chapter Lead Authors should provide a traceable account of how they arrived at their ratings for level of scientific understanding and likelihood that an outcome will occur.” Failure to provide a traceable account is a symptom of uncertainty monster hiding.
- “Quantitative probabilities (as in the likelihood scale) should be used to describe the probability of well-defined outcomes only when there is sufficient evidence. Authors should indicate the basis for assigning a probability to an outcome or event (e.g., based on measurement, expert judgment, and/or model runs).” Using quantitative probabilities when there is insufficient evidence is uncertainty monster simplification.
The recommendations made by the IAC concerning the IPCC’s characterization of uncertainty are steps in the right direction for dealing with the uncertainty monster. Curry (2011a) argued that a concerted effort by the IPCC is needed to identify better ways of framing the climate change problem, explore and characterize uncertainty, reason about uncertainty in the context of evidence-based logical hierarchies, and eliminate bias from the consensus building process itself.
3.2 Reasoning about uncertainty
“It is not so much that people hate uncertainty, but rather that they hate losing.” Amos Tversky
Many of the key conclusions from the IPCC AR4 WGI Report are quantitative assessments of likelihood. The IPCC characterization of likelihood is based upon a consensus building process that is an exercise in collective judgment in areas of uncertain knowledge. The general reasoning underlying the IPCC’s arguments for anthropogenic climate change combines a compilation of evidence with subjective Bayesian reasoning. This process is described by Oreskes (2007) as presenting a ‘consilience of evidence’ argument, which consists of independent lines of evidence that are explained by the same theoretical account.
Given the complexity of the climate problem, expert judgments about uncertainty and confidence levels are made by the IPCC on issues that are dominated by unquantifiable uncertainties. Curry (2011a) argues that because of the complexity of the issues, individual experts use different mental models for evaluating the interconnected evidence. Biases can abound when reasoning and making judgments about such a complex problem: bias can occur through excessive reliance on a particular piece of evidence, the presence of cognitive biases in heuristics, failure to account for indeterminacy and ignorance, and logical fallacies and errors, including circular reasoning. The IAC states that “Studies suggest that informal elicitation measures, especially those designed to reach consensus, lead to different assessments of probabilities than formal measures. Informal procedures often result in probability distributions that place less weight in the tails of the distribution than formal elicitation methods, possibly understating the uncertainty associated with a given outcome.”
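The IAC’s point about tail weight can be illustrated with a toy calculation (all numbers below are hypothetical, invented purely for illustration). Suppose three experts each give a normal distribution for some uncertain quantity. A formal elicitation might retain all three as an equal-weight mixture (a linear opinion pool), while an informal consensus collapses them into a single averaged distribution:

```python
# Illustrative sketch (all numbers hypothetical) of the IAC's point that
# consensus-style aggregation can understate tail probabilities relative
# to formal elicitation, which retains each expert's full distribution.
from math import erf, sqrt

def normal_tail(x, mean, sd):
    """P(X > x) for a normal distribution with the given mean and sd."""
    return 0.5 * (1 - erf((x - mean) / (sd * sqrt(2))))

# Three hypothetical experts' distributions: (mean, sd) pairs.
experts = [(2.0, 0.5), (3.0, 1.0), (4.5, 2.0)]

threshold = 6.0  # hypothetical "extreme outcome" threshold

# Formal approach: equal-weight mixture of the expert distributions
# (a linear opinion pool), which preserves the spread across experts.
pooled_tail = sum(normal_tail(threshold, m, s) for m, s in experts) / len(experts)

# Informal consensus: experts settle on a single averaged distribution,
# collapsing their disagreement into one mean and one standard deviation.
consensus_mean = sum(m for m, _ in experts) / len(experts)
consensus_sd = sum(s for _, s in experts) / len(experts)
consensus_tail = normal_tail(threshold, consensus_mean, consensus_sd)

print(f"P(X > {threshold}) opinion pool: {pooled_tail:.4f}")   # ~0.076
print(f"P(X > {threshold}) consensus:    {consensus_tail:.4f}")  # ~0.008
```

With these invented numbers, the consensus distribution assigns roughly a tenth as much probability to the extreme outcome as the opinion pool does, understating the uncertainty in exactly the way the IAC describes.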
Oreskes (2007) draws an analogy for the consilience of evidence approach with what happens in a legal case. Continuing with the legal analogy, Johnston (2010) characterized the IPCC’s arguments as a legal brief, designed to persuade, in contrast to a legal memo that is intended to objectively assess both sides. Along the lines of a legal memo, Curry (2011a) argues that the consilience of evidence argument is not convincing unless it includes parallel evidence-based analyses for competing hypotheses; such parallel analysis is hence a critical element in uncertainty monster detection. Any evidence-based argument that is more inclined to admit one type of evidence or argument than another tends to be biased. Parallel evidence-based analysis of competing hypotheses provides a framework whereby scientists with a plurality of viewpoints (including skeptics) participate in an assessment. In a Bayesian analysis with multiple lines of evidence, it is conceivable that there are multiple lines of evidence that produce a high confidence level for each of two opposing arguments, which is referred to as the ambiguity of competing certainties. If uncertainty and ignorance are acknowledged adequately, then the competing certainties disappear; disagreement then becomes the basis for focusing research in a certain area, and so moves the science forward.
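As a hypothetical sketch of how competing certainties can arise (the likelihood ratios below are invented purely for illustration), consider subjective Bayesian updating in which each camp admits only the lines of evidence favoring its own hypothesis:

```python
# Hypothetical sketch of "competing certainties": if each side admits only
# the lines of evidence favoring its hypothesis, subjective Bayesian
# updating yields high confidence for BOTH opposing arguments.
# All likelihood ratios below are invented for illustration.

def posterior(prior, likelihood_ratios):
    """Update the probability of H given likelihood ratios P(E|H)/P(E|~H)."""
    odds = prior / (1 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)

evidence_for_h1 = [3.0, 2.5, 4.0]  # lines of evidence favoring H1
evidence_for_h2 = [3.5, 2.0, 3.0]  # lines of evidence favoring H2 (against H1)

# Camp 1 admits only its own evidence; camp 2 admits only its own
# (expressed as reciprocal likelihood ratios for H1).
p_h1_camp1 = posterior(0.5, evidence_for_h1)
p_h1_camp2 = posterior(0.5, [1 / lr for lr in evidence_for_h2])

print(f"Camp 1's confidence in H1: {p_h1_camp1:.2f}")      # ~0.97
print(f"Camp 2's confidence in H2: {1 - p_h1_camp2:.2f}")  # ~0.95

# Admitting ALL the evidence restores the acknowledged uncertainty:
all_evidence = evidence_for_h1 + [1 / lr for lr in evidence_for_h2]
print(f"Confidence in H1 given all evidence: {posterior(0.5, all_evidence):.2f}")  # ~0.59
```

With these invented numbers, each camp emerges highly confident (roughly 0.95 to 0.97) in opposing conclusions; only when all the evidence is admitted does the posterior fall back to something close to even odds, and the competing certainties dissolve into an honest acknowledgment of uncertainty.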
JC note: Curry (2011a) is part of a special issue on framing and communicating uncertainty for the IPCC, in Climatic Change. The electronic version of this issue should be published later this summer, with the print version in September. This should be a very interesting issue.