by Judith Curry
Meteorologists have taken the lead in explaining uncertainties in forecasts to the public (JC note: see previous post The weatherman is not a moron). In the Powell formulation, “what you know” is that a storm is coming; “what you don’t know” is its exact track and thus how much snow will fall where, illustrated by the comparison of the varying model predictions; “what you think” is that snow accumulation is likely; and “which is which and why” are the models’ uncertainties and their limitations, due, in part, to sparse data.
Conversely, warnings or forecasts that do not communicate uncertainties can have embarrassing and sometimes counterproductive results. In 2008, as Hurricane Ike approached the Texas coast, the National Weather Service warned that people who did not evacuate coastal communities faced “certain death.” In fact, fewer than 50 of the 40,000 who stayed on Galveston Island were killed. The predicted 100% probability of death—stated with no indication of uncertainty—fortunately proved significantly too high.
The trade-off is that worst-case warnings may save lives, but repeated overpredictions that do not acknowledge uncertainty can cause the public to ignore warnings. Hence, it is desirable to issue more nuanced warnings that explain the potential danger while acknowledging the uncertainty.
One major challenge is that real uncertainties often turn out to have been underestimated. In many applications, 20%–45% of results are surprises, falling outside the previously assumed 98% confidence limits. A famous example is measurements of the speed of light, in which new and more precise measurements fell outside the estimated error bars of the older ones much more frequently than expected. This effect arises in predicting river floods and earthquake ground motion and may arise for the IPCC uncertainty estimates [Curry, 2011].
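The arithmetic behind this surprise rate can be illustrated with a minimal simulation. The sketch below is hypothetical, not drawn from the paper: it assumes the reported error bars understate the true standard deviation by a factor k = 2, and then counts how often results fall outside the nominal 98% limits computed from the reported (too-small) error bars.

```python
# Hypothetical sketch: if reported error bars understate the true scatter by
# a factor k, the "surprise rate" outside nominal 98% limits far exceeds 2%.
# The factor k = 2 is an illustrative assumption, not an empirical estimate.
import random

random.seed(0)
k = 2.0            # assumed underestimation factor: true sigma = k * reported sigma
z98 = 2.326        # two-sided 98% limit, in units of the reported sigma
trials = 100_000

outside = sum(1 for _ in range(trials)
              if abs(random.gauss(0.0, k)) > z98)

print(f"surprise rate: {outside / trials:.1%}")   # well above the nominal 2%
```

With k = 2 the surprise rate lands near 24%, inside the 20%–45% range cited above; larger underestimation factors push it higher.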
Another tough challenge, for which scientists do not yet have a good approach, involves extreme events that are so rare that their probabilities are hard to estimate. The 2011 Tohoku earthquake was much larger than considered in the Japanese government’s hazard map and so caused a tsunami that overtopped seawalls, causing more than 18,000 deaths and $210 billion in damage. An immediate question is if and how coastal defenses that fared badly should be rebuilt, because building them to withstand tsunamis as large as 2011’s is too expensive. In one commentator’s words [Harner, 2012], “The question—to be asked in the current case— is whether sometimes the bureaucratic impulse [to] avoid any risk of future criticism by presenting the ‘worst case scenario’ is really helpful…What can (or should be) done? Thirty meter seawalls do not seem to be the answer.”
Formulating effective mitigation strategies is both an economic and political challenge. In both spheres, explaining the uncertainties involved in hazard forecasts is crucial, even though they cannot be precisely estimated. From an economic viewpoint, they can be factored into analyses of the optimum mitigation level, i.e., that which minimizes the total cost to society, which is the sum of the cost of mitigation and the expected losses. Presenting the uncertainties is equally important for the public discussion needed to formulate policies. Sarewitz et al. argue that predictions must be as transparent as possible; that assumptions, model limitations, and weaknesses should be forthrightly discussed; and that uncertainties must be clearly articulated.
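The optimization described above can be sketched in a few lines. The cost functions below are illustrative assumptions (quadratic construction cost, exponentially decaying exceedance probability), not figures from any real hazard study; the point is only to show how the optimum level balances rising mitigation cost against falling expected losses.

```python
# Hypothetical sketch of the optimum mitigation level: the level n (e.g., a
# seawall height) that minimizes total cost = mitigation cost + expected loss.
# Both cost functions are illustrative assumptions, not real-world data.
import math

def mitigation_cost(n):
    """Assumed construction cost, growing quadratically with level n."""
    return 10.0 * n ** 2

def expected_loss(n):
    """Assumed expected loss: fixed damage D times an assumed probability
    exp(-n/5) that the hazard exceeds the mitigation level n."""
    D = 10_000.0
    return D * math.exp(-n / 5.0)

def total_cost(n):
    return mitigation_cost(n) + expected_loss(n)

# Grid search over candidate levels 0.0 .. 30.0.
levels = [i / 10.0 for i in range(301)]
best = min(levels, key=total_cost)
print(f"optimum level: {best:.1f}, total cost: {total_cost(best):.0f}")
```

The optimum sits strictly between "do nothing" and "overbuild": below it, a small increase in the level buys a larger drop in expected losses than it costs; above it, the reverse holds. This is the formal version of the point that thirty-meter seawalls need not be the answer.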
A similar view of the need for explaining uncertainties comes from considering technological accidents, which are like natural disasters in that the risks are hard to assess but can be large. Richard Feynman, dissenting from the official report after the loss of the space shuttle Challenger, showed that the risks had been greatly underestimated and stated that “NASA owes it to the citizens from whom it asks support to be frank, honest, and informative, so these citizens can make the wisest decisions for the use of their limited resources” [Feynman, 1988]. Scientists working on natural hazard forecasting should consider Feynman’s advice.
JC comment: Since I am currently developing an interdisciplinary proposal about natural hazards, I am in the midst of pondering these issues. The points raised in this paper seem to steer the discussion surrounding extreme events in a much more useful direction than attempting to attribute those events to global warming. Following the advice of Powell and Feynman here seems to cover the bases in terms of how we should communicate the risk of natural hazards. One issue I would like to see discussed is the (potential and actual) role of social media in addressing these issues.