Decision making under climate uncertainty: Part I

by Judith Curry

Based upon the precautionary principle, the United Nations Framework Convention on Climate Change (UNFCCC) established a qualitative long-term climate goal: stabilization of the concentrations of greenhouse gases in the atmosphere. The view of climate change held by the UNFCCC regards both the problem and the solution as irreducibly global. This view of the problem has framed the IPCC’s assessment and national funding priorities on the subject of climate science.

The UNFCCC’s policy solution places us all between the proverbial rock and a hard place, as evidenced by the international and national debates being conducted on the topic.  The dilemma is aptly  described by Obersteiner et al. (2001):

The key issue is whether “betting big today” with a comprehensive global climate policy targeted at stabilization “will fundamentally reshape our common future on a global scale to our advantage or quickly produce losses that can throw mankind into economic, social, and environmental bankruptcy”.

Weitzman (2009) characterizes the decision making environment surrounding climate change in the following way:

“Much more unsettling for an application of expected utility analysis is deep structural uncertainty in the science of global warming coupled with an economic inability to place a meaningful upper bound on catastrophic losses from disastrous temperature changes. The climate science seems to be saying that the probability of a system-wide disastrous collapse is non-negligible even while this tiny probability is not known precisely and necessarily involves subjective judgments.”

The question needs to be asked as to whether the early articulation of a preferred policy option by the UNFCCC has stimulated a positive feedback loop between politics, science, and science funding that has accelerated the science (and its assessment by the IPCC) towards the policy option (CO2 stabilization) that was codified by the UNFCCC. This feedback loop marginalizes research on natural climate variability (forced and unforced) on regional and global scales, focuses research on model development rather than observations (particularly paleoclimate), and values model agreement over a full exploration of model uncertainty (including model structure). The net result of such a feedback loop is an overconfident assessment of the importance of greenhouse gases in future climate change, which has brought us to our current position between a rock and a hard place, where we lack the kinds of information we need to understand climate change more broadly and to develop and evaluate a broad range of policy options.

My particular interest in this situation is to understand the dynamics of uncertainty at the climate science-policy interface.  I am questioning whether these dynamics are operating in a manner that is healthy for the science and for the policy making process.  The IPCC’s efforts to consider uncertainty focus on communicating uncertainty (apparently motivated by building the political will to act), rather than on characterizing uncertainty in a way that would be useful for risk managers and resource managers, not to mention the scientists and the institutions that fund science.

While I am a novice (in academic terms) at considering these issues, I would like to raise them here at Climate Etc. for discussion. We need additional perspectives and more discussion of these issues, and I look forward to your input and ideas.

Overall framework of the series

I am envisioning this series in three parts, organized around three different decision making strategies:

I.  The decision making strategy associated with the UNFCCC global emissions stabilization targets

II.  Robust decision making and the “fat tail” issue raised by Weitzman

III.   Regional adaptation strategies that focus on reducing vulnerability to extreme events and resource management.

Background on decision making under uncertainty

The most useful overviews that I have found on decision making under uncertainty in the context of the climate problem are Obersteiner et al. (2001), Morgan et al. (2009) and van der Sluijs et al. (2010). Note to the plagiarism police: none of these are my original ideas and I am claiming no academic credit for them; I have done my best to synthesize my knowledge into clear statements and attribute specific ideas to their sources.

Decision making involves identifying and choosing among alternatives based on the values and preferences of the decision maker. Circumstances of relatively low uncertainty and low stakes are a comfortable domain for applied science and engineering, where the application of cost-benefit analysis and expected utility is straightforward. However, if the uncertainty is high and/or the stakes are high, the decision making environment is much more volatile.

Classical decision analysis identifies an optimal choice among actions based upon the probability of occurrence of possible outcomes and the decision maker’s utility functions. Uncertainty in the input parameters is propagated through a model to generate the expected utility of the different options. Decision rules are then applied (e.g. maximum expected utility). While probability theory has been the foundation of classical decision analysis, other alternatives have been employed, including possibility theory and Dempster-Shafer theory. An example of a nonprobabilistic decision rule is minimax, which selects the action that minimizes the maximum possible loss.
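
To make the two rules concrete, here is a minimal sketch in Python with entirely hypothetical utilities and outcome probabilities; nothing below comes from a real climate decision model, and with these made-up numbers the two rules recommend different actions:

```python
# Toy illustration of two classical decision rules. All numbers are
# hypothetical: rows are candidate actions, columns are three possible
# outcomes (benign, moderate, severe).

utilities = {
    "aggressive mitigation": [-1.0, -0.5,  2.0],
    "moderate mitigation":   [-1.5,  1.0,  1.0],
    "business as usual":     [ 2.0,  0.5, -6.0],
}
p_outcomes = [0.3, 0.5, 0.2]  # assumed outcome probabilities

def expected_utility(payoffs):
    """Probability-weighted average utility of an action."""
    return sum(p * u for p, u in zip(p_outcomes, payoffs))

# Rule 1: maximum expected utility (uses the probabilities).
best_eu = max(utilities, key=lambda a: expected_utility(utilities[a]))

# Rule 2: minimax -- ignore the probabilities and pick the action whose
# worst-case utility is least bad (i.e. minimize the maximum loss).
best_minimax = max(utilities, key=lambda a: min(utilities[a]))

print("maximum expected utility picks:", best_eu)   # moderate mitigation
print("minimax picks:", best_minimax)               # aggressive mitigation
```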

The classical linear technocratic model of decision making assumes that more scientific research leads to more reliable knowledge and less uncertainty, and that the scientific knowledge forms the basis for a political consensus leading to meaningful action (van der Sluijs et al.). When uncertainty is well characterized and there is confidence in the model structure, classical decision analysis can provide statistically optimal strategies for decision makers.

Classical decision making theory involves reducing the uncertainties before acting.  In the face of irreducible uncertainties and substantial ignorance, reducing the uncertainty isn’t viable, but not acting could be associated with catastrophic impacts.  While a higher level of confidence can make decision makers more willing to act,  overestimating the confidence can result in discounting the value of information in the decision making process if the confidence later proves to be unwarranted.

Under conditions of deep uncertainty, optimal decisions based upon a consensus can carry considerable risk. Obersteiner et al. describe the uncertainty surrounding climate change science as a two-edged sword that cuts both ways: what is considered to be a serious problem could turn out to be less of a threat, whereas unanticipated and unforeseen surprises could be catastrophic. Obersteiner et al. argue that the strategy of assuming that climate models can predict the future of climate change accurately enough to choose a clear strategic direction might be at best marginally helpful and at worst downright dangerous: underestimating uncertainty can lead to strategies that do not defend the world against unexpected and sometimes even catastrophic threats. Obersteiner et al. note that another danger lies on the other side of the sword: if uncertainties are deemed too large, analytic planning processes may be abandoned altogether.

Resilient and adaptive decision making strategies are used in the face of high uncertainty.  Resilient strategies seek to identify approaches that will work reasonably well across the range of circumstances that might arise.  Adaptive strategies can be modified to achieve better performance as more information becomes available.   Adaptive strategies work best in situations where large nonlinearities are not present and in which the decision time scale is well matched to the actual changes. (Morgan et al.)

Robustness is a strategy that formally considers uncertainty, whereby decision makers seek to reduce the range of possible scenarios over which the strategy performs poorly. As an example, info-gap decision theory sacrifices a small amount of optimal performance to reduce sensitivity to what may turn out to be incorrect assumptions. A robustness criterion helps decision makers distinguish reasonable from unreasonable choices. Robustness suggests decision options that lie between an optimal and a minimax solution. Relative to optimal strategies that focus on the best estimate, robustness considers unlikely but not impossible scenarios without letting them completely dominate the decision.
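
A minimal sketch of the info-gap idea, under assumed (made-up) performance curves: an “optimal” option that is best under nominal assumptions but degrades quickly, versus a “robust” option that gives up some nominal performance for a wider margin of error:

```python
# Info-gap-style robustness sketch with illustrative performance curves.
# Nothing here comes from a real decision model; the point is only the
# shape of the criterion: maximize the deviation from nominal
# assumptions that an option can absorb while still performing
# acceptably.

REQUIRED = 5.0  # minimum acceptable performance (arbitrary units)

def performance(option, deviation):
    """Hypothetical performance as the true state of the world deviates
    from the nominal estimate by `deviation`."""
    if option == "optimal":
        return 10.0 - 4.0 * deviation  # best at nominal, but fragile
    return 8.0 - 1.0 * deviation       # "robust": worse at nominal, flatter

def robustness(option, step=0.01, max_dev=10.0):
    """Largest deviation the option tolerates while meeting REQUIRED
    (the info-gap 'robustness horizon')."""
    alpha = 0.0
    while alpha + step <= max_dev and performance(option, alpha + step) >= REQUIRED:
        alpha += step
    return alpha

for option in ("optimal", "robust"):
    print(option, "tolerates deviation up to ~", round(robustness(option), 2))
# "optimal" tolerates ~1.25, "robust" ~3.0: the robustness criterion
# prefers the option with lower nominal performance but a larger
# safety margin.
```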

The precautionary principle is a decision strategy often proposed for use in the face of high uncertainty, consisting of precaution practiced in the context of uncertainty. While there are many different notions of what the precautionary principle does and does not entail, it has a very clear meaning in the context of climate change. Principle #15 of the Rio Declaration from the 1992 Earth Summit states (UNEP, 1992):

“In order to protect the environment, the precautionary approach shall be widely applied by States according to their capabilities. Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation.”

This statement clearly explains the idea that scientific uncertainty should not preclude preventative measures to protect the environment. However, the precautionary principle implies the need for a minimal threshold of scientific certainty or plausibility before undertaking precautions.

Obersteiner et al. argue that stabilization of greenhouse gas concentrations at a target level is a non-robust strategy in an environment that is extremely uncertain and most likely nonlinear. The UNFCCC strategy has dominated the framing of the IPCC assessment by focusing its efforts on the documentation of dangerous climate change in the context of specific levels of warming associated with varying amounts of atmospheric carbon dioxide. Embellishing the IPCC’s climate paradigm has come to dominate national and international climate programs and their funding.

Using climate models to optimize stabilization targets

Given the inadequacies of current climate models, how should we interpret the IPCC’s multi-model ensemble simulations of the 21st century climate? This ensemble of opportunity comprises models with similar structures but different parameter choices and calibration histories, running simulations under different emissions scenarios, with a few models conducting multiple simulations for individual scenarios.

There are several different viewpoints regarding the creation of meaningful probability density functions (PDFs) from climate model simulations. Stainforth et al. (2007) argue that model inadequacy and an inadequate number of simulations in the ensemble preclude the production of meaningful PDFs from the frequency of model outcomes of future climate. However, Stainforth et al. emphasize that models can provide useful insights without providing probabilities, by supplying a lower bound on the maximum range of uncertainty and a range of possibilities to be considered. Knutti et al. (2008) argue that the real challenge lies more in how to interpret the PDFs than in whether they should be constructed in the first place. There is a growing emphasis on trying to assign probabilities to the distribution of simulations from the ensemble of opportunity. Stainforth et al.’s position is consistent with Betz (2007), who views climate model simulations as modal statements of possibilities.
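
As a toy illustration of the Stainforth et al. point (with invented numbers standing in for ensemble output): treating model frequency as probability yields precise-looking statistics, while the more defensible statement is a lower bound on the range of possibilities:

```python
# Hypothetical end-of-century warming projections (deg C) from a small
# ensemble of opportunity -- illustrative numbers, not real model output.
import statistics

ensemble = [1.8, 2.1, 2.4, 2.6, 2.8, 3.0, 3.1, 3.3,
            3.5, 3.8, 4.0, 4.2, 4.5, 4.9, 5.4]

# Interpreting model frequency as probability yields precise-looking
# statistics whose meaning depends entirely on the ensemble having
# sampled the right space of model structures:
print("ensemble mean :", round(statistics.mean(ensemble), 2))
print("ensemble stdev:", round(statistics.stdev(ensemble), 2))

# The weaker but more defensible claim is a lower bound on the maximum
# range of uncertainty -- the true range can only be wider, since shared
# structural errors are not sampled at all:
print("range (a lower bound):", min(ensemble), "to", max(ensemble))
```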

Inferring an actual prediction/projection/forecast of 21st century climate from these simulations assumes that the natural forcing (e.g. solar, volcanoes) will be essentially the same as for the 20th century. Apart from the uncertainty in forcings, the models have substantially varying sensitivities to CO2 and marginal capabilities for simulating multidecadal natural internal variability, and there is also the issue of scenario uncertainty.

If we assume that CO2 sensitivity dominates any conceivable combination of natural (forced and unforced) variability, what do the simulations actually say about 21st century climate? Well, the sensitivity range for the IPCC calculations is essentially the same range (1.5-4.5C) that was estimated in the 1979 Charney report. And the calculations show that the warming proceeds until about 2060 in a manner that is independent of the emissions scenario.
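
A back-of-envelope sketch of what that sensitivity range implies, using the standard logarithmic forcing approximation (equilibrium warming ≈ S × log2(C/C0), with S the sensitivity per CO2 doubling); the scenario concentrations are hypothetical placeholders, and equilibrium warming overstates the transient response, but it illustrates how the spread across the sensitivity range swamps the difference between scenarios:

```python
# Equilibrium warming under the logarithmic forcing approximation:
# delta_T ~= S * log2(C / C0). The scenario concentrations below are
# hypothetical placeholders, not actual scenario values.
from math import log2

C0 = 280.0  # pre-industrial CO2 (ppm)
scenarios = {"lower emissions": 500.0, "higher emissions": 560.0}  # mid-century ppm (assumed)

for S in (1.5, 3.0, 4.5):  # the Charney/IPCC sensitivity range (C per doubling)
    row = ", ".join(f"{name}: {S * log2(C / C0):.1f} C"
                    for name, C in scenarios.items())
    print(f"S = {S}: {row}")
# The threefold spread across S (1.3 C to 3.8 C for the lower scenario)
# dominates the 0.25-0.75 C gap between the two scenarios at fixed S.
```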

So exactly what have we learnt about possible 21st century climate from the AR4 relative to the TAR (and even relative to the 1979 Charney report) that refines our ability to set an optimal emissions target?  I suspect that we are probably at the point of diminishing returns from learning much more in the next few years (e.g. AR5) from additional simulations by the large climate models of the current structural form.

Where is all this heading?

There are three overall thrusts in climate model development that I am aware of, which are driven by the needs of decision makers.

The big new push in the climate modeling enterprise is for Earth Systems Models. These models are beginning to include biogeochemistry (including a carbon cycle) and ecosystem interactions. Some are proposing to incorporate human dimensions, including economics models and energy consumption models. Such models could in principle generate their own scenarios of CO2, and so reduce the scenario uncertainty that is believed to become significant towards the end of the 21st century.

There is also a push for higher resolution global models to provide regional information, particularly on water resources.  There is currently no evidence that global models can provide useful simulations on regional scales, particularly of precipitation.

Another push is for credible predictions on a time scale of decades (out to 20 years in advance). This necessitates getting the natural variability right: both the external forcing and the decadal scale ocean oscillations. I don’t expect the models to do much of use in this regard in the short term, but a focus on the natural variability component is certainly needed.

So it seems like we are gearing up for much more model development in terms of higher resolution and adding additional complexity. Yes, we will learn more about the climate models and possibly something new about how the climate system works.  But  it is not clear that any of this will provide useful information for decision makers on a time scale of less than 10 years to support decision making on stabilization targets, beyond the information presented in the AR4.

Conclusions

The current decision making framework based on the UNFCCC/IPCC has led to a situation where we are between a rock and a hard place in terms of decision making. The strategy (primarily model based) has provided some increased understanding and a scenario with about 3C sensitivity that is unlikely to budge much with the current modeling framework. A great deal of uncertainty exists, and emissions target policies based on such uncertain model simulations are not robust policies.

It seems that we have reached the point of diminishing returns for the science/decision making strategy reflected by the UNFCCC/IPCC. It’s time to consider some new decision making frameworks and new scientific ideas. In Part II, we will explore some robust decision making strategies that consider Weitzman’s “fat tails” and ponder the kinds of scientific information needed to support such strategies.

203 responses to “Decision making under climate uncertainty: Part I”

  1. I don’t see how different “decision making strategies” can alter whether climate sensitivity is thought to be around 3C or not.

    Also:

    “The strategy (primarily model based) has provided some increased understanding and a scenario with about 3C sensitivity that is unlikely to budge much with the current modeling framework. “

    This is a pretty incredible accusation and suggests that current conclusions on climate sensitivity are an artifact of modelling frameworks and that different frameworks might somehow produce a different number.

    Either climate models correctly incorporate what’s known about climate or they do not. Either they’re missing key components of how climate works or they’re not. The precautionary principle, the IPCC and decision making are all irrelevant to this question.

    • Stay tuned for Parts II and III. The issue of model uncertainty was addressed extensively in the thread What can we learn from climate models? and also in Part III of the detection and attribution series.

  2. While a higher level of confidence can make decision makers more willing to act, overestimating the confidence can result in discounting the value of information in the decision making process if the confidence later proves to be unwarranted.

    Jim Hansen has already eroded the confidence of the public by predicting, in a magazine article back in the late Eighties, that parts of Manhattan would be under water and that the bird species would be different. It does not matter that he didn’t say it in a peer-reviewed article. He is a high-profile climate scientist and he said it; that is more than enough. This gives us every reason to doubt the current dire prognostications. What will they be saying in 10 years? That’s one reason I am for waiting another 10 years, to give us a better, more “classical” chance to evaluate the uncertainty associated with the science and observe the climate. We can compromise and build small and conventional nuclear reactors for a 24/7 base of energy that does not produce CO2. In this way, mitigation will be under way without throwing the economy under the bus.

  3. Alexander Harvey

    Judith,

    “And the calculations show that the warming proceeds until about 2060 in a manner that is independent of the emissions scenario.”

    This really needs flagging up, so I have. It is a political nightmare: even if you have the optimal answer, there is no confirmation feedback.

    You face IMC (instrument meteorological conditions), or blind flight. I guess that somewhere, someone is addressing the appropriate global instrumentation issues; we have much good stuff, but do we have enough?
    Is it temps, or precipitation, or nocturnal warming, or … , that will give us the best feedback? If the models all have a strong tell-tale, something that does diverge strongly between sensitivities or scenarios in the 10-25 year outlook, that would be handy.

    If this be a war, and we must fight it, wouldn’t it be a good idea to know if we are winning, and when we have won???

    Sorry, a bit off topic.

    Alex

    I am starting to labour my “flight test” theme, but it was well received on another thread, (which is appreciated :) ).

    Alex

    • “we are no where close to knowing where energy is going or whether clouds are changing to make the planet brighter. We are not close to balancing the energy budget. The fact that we can not account for what is happening in the climate system makes any consideration of geoengineering quite hopeless as we will never be able to tell if it is successful or not! ”
      -Kevin Trenberth- Oct 2009

      Well, he should know better than anyone.

      I suspect the “missing energy” never arrived in Earth’s climate system in the first place. This would be because of the difficulty in calibrating the sensors which measure TSI, and the problems with the sensor Frohlich is having to use to generate the values for his PMOD model. The cooling of the oceans since 2003 (according to NOAA’s latest re-analysis) happens to coincide with the point in time when the Sun’s activity levels dropped below the long-term average in terms of sunspot numbers. My research suggests that value coincides with the ocean equilibrium value. That would mean there is no ‘heat in the pipeline’ from elevated CO2 levels. Not that downwelling atmospheric radiation can heat the bulk of the ocean anyway.

  4. The UK government agrees with Jim and has just given the go-ahead for eight new reactors at existing nuclear sites. Pragmatism in the face of uncertainty seems to be the order of the day.

    • AnyColourYouLike

      Perhaps O/T, but this is an interesting point… have nuclear reactors suddenly become much safer? There seems to be a new meme (hate that word!) in the air that nuclear is the new “clean”, safe option. Has the technology evolved to the point where they are now seen as impregnable to accident? An accident could still be catastrophic, right? Is this the first back-door concession to the low-carbon lobby? And just how “economic” is nuclear?

      Genuinely wondering, as nuclear used to be the green bogey-man – is it now the green saviour?

      • We live in a dangerous world; it is unrealistic to think we will never make mistakes and accidents will never happen. We do our best to prevent what we can. But that does not mean we never apply a technology for fear of some later “catastrophe” that may never materialize. The uncertainty topic of this thread applies to virtually all human endeavors and advancements. Risk taking is the price of advancement.

        Nuclear problems are very few today. If we want cheap reliable base load power, nuke is the best option. But that’s just base load. There has to be cyclical generation to keep up with the daily and seasonal demand cycles. And that is mostly from fossil fuels like coal and natural gas. There’s no way around that unless someone can invent a nuke reactor that can be turned on and off like a light switch.

      • Also, if you haven’t already, read “The Power To Save The World.” It was written by Gwyneth Cravens, a converted nuclear skeptic. She makes a case for even the large nuclear plants.

        Read the book review.

      • Sorry for the OT posts.

      • Turn on and off? One of the proponents of salt-cooled thorium was talking about this. (Can’t give the link, sorry, I just overheard part of something my husband linked to the other day.)

      • LFTR (thorium molten salt reactor)

        10000 times less toxic waste stream
        safer, cannot explode or melt
        can load follow
        much cheaper to build

        http://energyfromthorium.com/2010/07/01/welcome-american-scientist-readers/

      • Looks very promising. Just think where LFTR would be today if the US had spent $80B on this instead of AGW research: making cheap, reliable power. An excellent example of money grossly wasted on AGW that is now never available for other research. Such is the case when politics and political world agendas take control of science.

      • Yes, I have heard about that too. A test thorium reactor would be nice to see. I hear no plans for one anywhere.

      • Thorium reactors were built and tested successfully in the ’60s. Funding was cut because, among other reasons, they didn’t produce enough plutonium (which the military wanted).

      • Leonard Weinstein

        You can use nuclear reactors with enough capacity to satisfy the maximum (reasonable) load rather than the base load. Excess power available during lower loads can be used intermittently to make ammonia (electrolyze water for hydrogen, purify N2 from liquid air, and combine them in the Haber process). The ammonia can be transported (by pipe or truck) for later catalytic conversion in fuel-cell autos or home power, or used to make fertilizer. This allows continuous use of the nuclear reactors and yields a useful storable form of energy as a byproduct.

      • Small nuclear reactors “might” (I hate using that word after all the dire global warming catastrophes that “might” happen :) ) be a safe alternative to the large ones. They could also cut assembly time since they could be built on an assembly line, with the final power plant assembled like a pre-fab house. There are many designs in existence now.

        WaPo Small Nuke Article

        Small reactor for Alaska

        Small reactor summary

      • AnyColourYouLike

        Jim

        Thanks for the links. I’m still a bit troubled by the idea that just one accident could devastate an enormous land-area though. The smaller units do look safer than the old designs, but still seem to have vulnerabilities.

        It looks like the road we’re going down and I’d concede that wind-power is probably like a blind alley….but I can’t help worrying about nuclear. New paradigm, but is the safety really there? Still not convinced…but I’m no expert.

      • I doubt the environmentalists who joined the AGW bandwagon are pleased. The law of unintended consequences in operation. The exclusion zone around Chernobyl is about the size of Britain’s second-largest county.

      • “I’d concede that wind-power is probably like a blind alley”

        Probably? It is. Check this out: http://ontariowindperformance.wordpress.com/

      • I understand that Japan is working on small local reactors. If they can be safely used in subs I see no reason why local nuke reactors wouldn’t be viable, safe, and cost effective.

      • Alexander Harvey

        AnyColourYouLike:

        “meme (hate that word!) ”

        Then don’t use it!

        Meme is for certain a meme; it started off with a meaning, then mutated and was preferentially selected for by people who insert it instead of a more precise word. I hate the meme meme too, but it is a super meme, and eventually it may come to replace countless words rendered less flexible and adaptive due to their insistence on maintaining their meaning.

        Alex

      • AnyColourYouLike

        Point taken Alex. I couldn’t think of a better word at the time!

      • Alexander Harvey

        Sorry about that, you took it well.

        Re UK Nuclear:

        I suspect a rat, there is no money for them and I know of no policy in place for making them profitable.

        It could be a smoke grenade, lobbed in to make people chase their tails for a bit until they unveil a slightly less unpalatable option, or just to waste some time.

        I don’t have a huge problem with nuclear, but I also wonder how I would feel if each consumer was divvied up their share of their vitrified waste each year. :)

        Alex

      • Nullius in Verba

        “I also wonder how I would feel if each consumer was divvied up their share of their vitrified waste each year.”

        It’s been suggested.
        Julian Simon calculated that a family of four would accumulate about 2 kg over a lifetime, which would be about the size of an orange. You would need to build a 3-foot thick stone box around it in your back yard, if you planned to spend a lot of time out there. It would take a few hundred years to decay to a level matching natural radioactive ores. He describes it as a ‘whimsical’ example, but it would be safe.

        But it’s a bad idea, since once we get fast breeders running, we’ll have to go round and buy it all back again so we can burn it for fuel.

      • You could put a grate on top of the box and use it for a Bar-B.

      • Or even better, keep all the waste together and use it as a 1,000 year heat source. I wonder if one could utilize the heat for a (one or more) Stirling engine to generate power?

      • Your entire energy needs for a lifetime can be supplied by a lump of thorium the size of a golf ball. And the radioactivity levels of thorium are low enough that you could safely hold that golf ball in the palm of your hand.

        India seems to be in the forefront of commercial development of thorium reactors, partly because they have lots of it.

        As for nuclear waste, you get more radiation from living near a coal power station. Coal is about one part per million uranium. Most of it probably ends up in the fly ash.

      • Coal ash contains uranium and thorium. We should probably be extracting the stuff for use later.

  5. > “And the calculations show that the warming proceeds
    > until about 2060 in a manner that is independent
    > of the emissions scenario.”

    This is unclear.

    Dr. Curry, the warming after 2060 depends on — what?
    Emissions during which decades?
    Decades during the late 20th and early 21st centuries?

    Can you clarify?

  6. The Precautionary Principle is often grossly abused. http://www.scribd.com/doc/19575080/Be-Very-Cautious-of-the-Precautionary-Principle

    If the PP is to be applied to AGW then it equally should be applied to, for example, populated areas around active subduction zones. If the same mindset in AGW were applied to earthquake-prone areas, trillions should be spent on moving all those cities out of harm’s way. It’s predicted the next “big one” in California could kill hundreds of thousands. 300 thousand were killed in Haiti. Should we not spend money to relocate that entire country to safety? We don’t, and we won’t, because we play the odds. Every day we play the odds.

    Yet for some reason, and again climate science is unique in this regard, we are told to spend whatever we have to and curb CO2 emissions to whatever level we have to, throttling the economy regardless, because of climate models that can’t predict anything, in the name of “saving the planet”.

    You also missed another possibility: spending billions, even trillions, crippling an already fragile economy, on mitigations that do absolutely nothing in regard to “stabilizing” the climate.

    • Funny, I was thinking the same thing today. Earthquakes are an obvious problem as are hurricanes. I am a conservative, so my favored solution would be to let people absorb the consequences of their decisions. If you choose to live in Cali, pay for your own insurance, if you can get it, and build a proper house. If you live on the coast, the government shouldn’t be subsidizing flood insurance. If you want to live there, figure out a way to do it safely or don’t do it. In the long run, the situation would adjust itself.

  7. Decision making under climate uncertainty: Part I

    Remember
    China’s auto industry in September continued the brisk market performance started in August. The country’s auto sales in the first three quarters totaled more than 13 million units, approaching the annual sales for the whole of 2009, when China overtook the United States to become the world’s largest auto maker and auto market, with production and sales hitting 13.79 million and 13.64 million units respectively, the China Association of Automobile Manufacturers (CAAM) said at a recent press conference. In the first nine months, auto sales leapt 35.97 percent year on year to 13.1384 million units, while output soared 36.10 percent to 13.0827 million. Auto sales increased 17.73 percent month on month in September and 16.89 percent year on year to 1,556,700 units. Output in the month grew 24.69 percent month on month and 16.94 percent year on year to 1,592,900 units.

    No doubt the Indian Highway system is eagerly anticipating massive development.

    The poor of this world cannot be denied their birthright to emit the same amount of GG as everyone else, if not additional amounts to make up for unfair lost advantage.

    A person’s right to industrialize, emit, procreate and abstain from birth control as a cornerstone of personal choice and religious belief is sacred and cannot be criticized. The IPCC doesn’t care. They roll over and recognize the legitimacy of doing such.

    If China, India or the Pope are adamant about being inflexible, then the IPCC shrugs and moves on to a more lucrative target of coercive opportunity for propaganda-based advocacy.

    The IPCC certainly doesn’t mind giving away EVERYTHING. They will happily do whatever it takes to protect and promote the IPCC’s interests, expanding mandate and importance.

    Go back to your ‘Climate change’ professional buddies Prof. Curry. You are all dupes for the globalist NGO pork barrel raiders. The hypocrisy sickens me beyond words.

    Happy motoring.

    • That’s why the IPCC and its attempts to sway Western governments have nothing to do with science or the climate, and everything to do with destroying Western civilization and imposing a one-world, unelected government. We in the West have had it too good for too long at the expense of the rest of the world. Time to move aside and let the Third World have their time. That’s the UN’s ultimate goal.

      • The UN may give the Third World and the Pope a pass now, but what would happen if they had a good deal more power? I don’t think it would be good, given the events in Germany during the National Socialist Party’s reign, and what happened when China and the Soviet Union became Communist.

      • More power?

        That the IPCC tacitly lends credence to the undeveloped and developing worlds’ inalienable right to emit in excess of the evil West is way, way more pontifical and much more lucrative than the *now defunct* peddling of Papal Indulgences.

      • The IPCC has neither the desire, nor the responsibility to assess the uncertainty and forcing of ‘Climate Change’ brought on by the IPCC’s own investigative process.

        The big uncertainty picture is ‘holier’ than the proverbial ‘Swiss cheese’.

  8. David L. Hagen

    Re Rio: “Where there are threats of serious or irreversible damage”
    What constitutes “irreversible damage” in light of geological evidence of plants thriving with CO2 being >> 5000 ppm?
    Since CO2 absorption is logarithmic, its climate impact is self-limiting.
    Consequently, water vapor absorption, if tied to it, is also self-limiting.
    Earth has gone from tropical to glacial and back multiple times. Can either be “irreversible”?

  9. David L. Hagen

    Judith. Re the assumption:

    Inferring an actual prediction/projection/forecast of 21st century climate from these simulations assumes that the natural forcing (e.g. solar, volcanoes) will be essentially the same as for the 20th century.

    The transition from solar cycle 23 to cycle 24 is very different from the seven transitions among solar cycles 16 to 23 in the last century.
    Paul Stanko observed:

    Out of the numbered solar cycles, #24 is now in 7th place. Only 5, 6, and 7 of the Dalton Minimum and cycles 12, 14, and 15 of the Baby Grand Minimum had more spotless days. Since we’ve now beaten cycle #13, we are clearly now competitive with the Baby Grand minimum.

    Paul Stanko reports that 2008 and 2009 had the 2nd and 3rd most spotless days in the last 100 years. They had the 4th and 5th most spotless days in the 161 years since 1849.
    In June 2010 Stanko said

    Solar Cycle 24 now has accumulated 810 spotless days. 820, which would require only 10 more spotless days, would mean that Cycle 24 was one standard deviation above the mean excluding the Dalton and Maunder Grand Minima.

    See also Spotless days.

    Miyahara et al., find:

    . . . the number of sunspots was lowest in 2009, and the start of Solar Cycle 24 was delayed by about 2 years. Consequently, the length of the solar cycle has increased, as occurred during past grand solar minima. In addition, the intensity of solar wind observed by the Ulysses spacecraft in 2008 was the lowest of the past ~50 years (McComas et al., 2008), which is probably related to weakening of the solar polar field (Svalgaard et al., 2005). Likewise, the flux of cosmic rays in 2009 reached its highest level over the past ~50 years (see neutron monitoring data). The above features indicate that solar activity is now at its lowest level of the satellite-based observational era of several decades, possibly even of the past century. . . . if the solar activity were to become reduced to levels comparable to the Maunder Minimum, our current episode of global warming could be followed by another Little Ice Age.

    Miyahara et al. Is the Sun Heading for Another Maunder Minimum? J. Cosmology, 2010, Vol. 8, 1970-1982.

    Le Mouel et al analyzed long temperature series from Prague, Bologna and Uccle.

    Differences between average annual values corresponding to high vs low activity periods are also (about) 1 °C. Solar activity may account for these long-term temperature variations.

    These factors indicate that assuming 21st-century natural forcings will match those of the 20th century is unwise and contrary to accumulating solar evidence. Serious studies are needed to compare the unusual features of solar cycle 24 with those of the last 70 years of the 20th century, with its “incontrovertible” “anthropogenic warming”.

  10. Dr Curry,

    Warmers look at the data and worry about “dangerous climate change”. Skeptics look at the data and the proposed actions (cap and trade, carbon tax, wind, solar …) and conclude the data is too poor to justify risking the economy. The grand compromise is to find energy sources that are cheaper than coal, available 24/7, and don’t emit CO2.

    LFTR (liquid fluoride thorium reactor) is cheaper than coal, emits no CO2, and is much “greener” than conventional nuclear. The waste stream is 10000 times less toxic, and there are no fuel rods to melt or radioactive coolant under high pressure to leak, so it is even safer.

    It has been endorsed by Dr. James Hansen and is acceptable to AGW skeptics.

    http://energyfromthorium.com/2010/07/01/welcome-american-scientist-readers/

  11. David L. Hagen

    See: Jean-Louis Le Mouël, Vladimir Kossobokov, and Vincent Courtillot, A solar pattern in the longest temperature series from three stations in Europe, Journal of Atmospheric and Solar-Terrestrial Physics, Volume 72, Issue 1, January 2010, Pages 62-76

    Differences between average annual values corresponding to high vs low activity periods are also (about) 1 °C. Solar activity may account for these long-term temperature variations. . . We discuss possible physical mechanisms by which solar variation could force climate changes (e.g. through solar activity itself, the EUV part of the solar flux, cosmic rays, the downward ionosphere-earth current density, etc.).

  12. Dear Prof. Curry,
    I don’t mean any disrespect. But you seem to measure the quality of the decision making process with respect to an intended outcome. However, I might have misunderstood you.
    The decision of the people of this world, in my opinion, has been to not do that much, since the case was not convincing. This is a decision.
    I have the impression that the IPCC framed the decision one-sidedly, with an intended outcome.
    The IPCC evaluates the science and recommends a solution at the same time, but no clear alternative, like Lomborg’s suggestion, is offered to the people.
    Any decision maker who is faced with such poor preparation will send back such a lop-sided proposal. So what we see is poor performance on the UN/IPCC side.
    If I make a decision under uncertainty, I will have different teams or people evaluate the technical, logistical and financial aspects independently of each other. The IPCC process violates this rule, since it mixes up problem description, different technical solutions, financial solutions and implementation, as well as control of whether the intended outcome is reached.
    We need truly independent groups that prepare the decision rather than a single mixed up group.
    Best regards
    Guenter

    • Exactly correct. This is how Stephen Schneider’s “ethical double bind” should be overcome.

      “working to reduce the risk of potentially disastrous climate change” is the job of policy makers, if they decide it is the best course.
      Being “ethically bound to the scientific method” is the job of scientists.

      The IPCC lead authors need to stop trying to be planet-saving superheroes and start being careful, sober scientists with realistic assessments of uncertainty.

      • Never going to happen; the IPCC is politics first, science last. Nothing in the recent shake-up has done anything to convince me otherwise. I remain hopeful that the new report will ACTUALLY be a scientific document, but I’m not going to hold my breath.

      • I say again, the UN has no place in climate science at this stage of the game. Form a Climate Science Society for goodness sake! Get out of the political arena. It is causing you more problems than it is helping you.

    • Guenter, these issues are addressed in Parts II and III, stay tuned.

  13. Can anyone here compare the uncertainty analysis related to climate system collapse with that for other potential civilization-threatening events, such as a major asteroid strike?

    • When you compare two unknown uncertainties, asteroid risk versus climate, you don’t get meaningful numbers.

      The problem with both risks is that they have been poorly assessed up to this point, meaning that there is a great deal more that we don’t know than we do know for both risks, so it’s very difficult to quantify the uncertainty. I’d say that in the case of asteroid risk, adaptation is our only policy choice at this point.

      • I think the asteroid risk is comparatively easier to quantify, since we have some indication from the paleorecord, geology, craters on the moon, etc. of the rough frequency of Earth impacts and their size (there are also a number of asteroid mitigation strategies under development).
        Quantifying climate change risk is much more difficult, because we can’t easily find analogous situations in the historic record, and the causality is not as clearly understood.

      • While it may be useful to define the past frequency of impacts using craters on Earth and maybe even the moon, the past frequency of such impacts is hardly a predictive parameter for determining the risk of future events. In order to predict impacts, we have to identify which asteroids in orbit around the sun cross the Earth’s orbit, which are large enough to cause damage, and when such a crossing might occur. There is large uncertainty in all of these types of information at this point.

        Given the proper resource allocation, it may be ‘easier’ to quantify our risk from asteroid impact than from climate, due to the complexity of the climate system, but we are nowhere near that point at this juncture.

      • Maxwell, Zajko: Thanks for your helpful replies. I thought such comparison might be useful to people (like myself) who are not very familiar with risk assessment.

  14. “In order to protect the environment, the precautionary approach shall be widely applied by States according to their capabilities. Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation.”

    As a definition this is pretty circular and empty, but two words of it are very important, and usually neglected, ignored or forgotten – the words ‘cost-effective’.
    Few of the proposed solutions seem to meet this simple test, since they require vast money transfers, changes and adaptations, all to meet and supposedly solve uncertainly defined supposed problems. Nice work if you can get it.

  15. This whole situation is built upon a self-supporting assumption with very little hard evidence behind it. This in itself is not an issue; few theories are fully formed at inception. However, we are now making economically catastrophic decisions based on this incomplete picture (wind farms, and the recent revelations that the UK off-shore farms have foundations that are ALREADY beginning to fail, not to mention their abject uselessness at creating energy).

    Dr Curry, I am interested in the tangent you are going off on here; it could turn out to be very interesting. However, I think again you’ve missed a step.

    Is the IPCC truly impartial, and can we trust them to represent the science (whatever its state) in an impartial and honest manner?
    Everything we’ve seen from them to date (including Pachauri’s deliberate lie, and that’s what it was, over the Himalayan data) says no.

    The original charter of the IPCC (long since deleted, but it’s still searchable) states that the IPCC was set up specifically to support the cAGW theory, NOT test it. There is nothing in their recent behaviour to suggest that they are not following this initial mission.

    Which is a real shame; climate science is complicated as hell, and to have a moving summary of the recent science would be not only invaluable, but bloody interesting! Yet no matter what the IPCC produces in its next report, it will be met with suspicion, which is again a shame, but one of their own making.

    Finally, the precautionary principle (PP from here on).
    Again, I’m interested to see where you take this analysis, Dr Curry, but I confess to being confused… it’s a purely political device… why are we scientists discussing it as a viable option?

  16. Judith,

    Many times politicians make promises based on technology not even developed yet. The auto industry is being forced to change even though the technological advancements, such as electric cars or hybrids, are not that good.
    But is it really about the environment or about economic prosperity?

  17. Judith,

    The media hinders scientific advancement due to one-sided reporting. Much of the media depends on government availability to speak with them and in many cases is being subsidized. This gives governments the okay to push policies and the excuse that the science is behind them. Who gives out government grants? Who needs these government grants? It certainly gives an excuse to raise taxes based on what scientists say.
    Private industry and private investors are looking for a return on their investments unless it is a donation. A 2-3% return used to suffice for investors, but seeing others get 10-30% returns has made industry in the developed world non-competitive.

  18. Judith,

    Religion itself has a vested interest in keeping science suppressed.

    And that is all I have to say about that.

  19. Roger Caiazza

    This is a very useful discussion albeit I think its primary value is to reduce the influence of climate science. As I see it the real problem is to convince society that we need to address the dilemma described by Obersteiner et al. (2001):
    The key issue is whether “betting big today” with a comprehensive global climate policy targeted at stabilization “will fundamentally reshape our common future on a global scale to our advantage or quickly produce losses that can throw mankind into economic, social, and environmental bankruptcy”.

    The uncertainty related issue is that one can argue that climate science will never be able to markedly reduce the uncertainty of model predictions because climate models cannot be fully verified. Therefore, society’s debate should focus on an energy policy that won’t produce bankruptcy and still reduce CO2 emissions.

    For example, if the concern is the energy system in 2030, we really need to be planning the infrastructure today. Something that gets lost in many discussions of the future is the fact that many US power plants are old and cannot be expected to be available in 20 years, so we have to plan not only for load growth but for significant load replacement. Is it reasonable to expect that renewables will be available for all the power required? Can the intermittency issues associated with solar and wind be addressed in this time frame, and if not, what has to be done? If renewables are not the complete answer, should we build new coal-fired power plants with significantly higher efficiencies, or must new coal capacity be built with carbon capture and sequestration? If no to coal, then are you going to go all in for gas-fired power plants, with the implication that we are going to need to drill more than we are doing today? How should we add nuclear to the mix? Finally, worldwide there are over a billion people without electricity; how are they going to get it?

    All these are significant issues, and Obersteiner et al. have nailed the point that poor decisions on these issues can throw mankind into economic, social, and environmental bankruptcy.

    • Roger Caiazza

      I apologize because I should have credited Dr. Curry for recognizing the value of the Obersteiner et al. (2001) quote in my comment.

  20. I consider that Guenter Hesse and Labmunkey correctly identify the problems in IPCC CAGW risk identification and management, which is policy built on shifting sands.

  21. Paradoxically, research need not reduce uncertainty in the short run. On the contrary, new instruments and new data can spawn a proliferation of conflicting hypotheses. This is precisely what has happened to climate science over the last two decades, especially with regard to natural variability. Indirect solar forcing and internal ocean dynamics are two prime examples where hypotheses are proliferating, but there are others.

    Unfortunately the models have not kept up, in the sense that they are not experimenting with the new speculations. There seems to be a conflict between (1) using the models as basic research tools and (2) developing them as applied forecasting tools. In basic research we would expect to see all the various hypotheses being played with. This is inconsistent with selling the models as a decision making tool, which is what we are seeing. It appears that the models have lost touch with the basic science program.

    Perhaps we need a new generation of more speculative models, as a way to reduce the uncertainty in the long run.

  22. “The question needs to be asked as to whether the early articulation of a preferred policy option by the UNFCCC has stimulated a positive feedback loop between politics, science, and science funding that has accelerated the science (and its assessment by the IPCC) towards the policy option (CO2 stabilization) that was codified by the UNFCCC”

    Does the question really need asking? Won’t a little common sense suffice? Academic navel gazing is fine and all, but a duck is a duck is a duck. This doesn’t prove or disprove CAGW, just a hat tip to reality.

  23. “There is currently no evidence that global models can provide useful simulations on regional scales, particularly of precipitation”

    So, the things that actually impact humanity are floods, droughts, storms, sea level rise. Floods and droughts are out. Can they do storms? Sea level rise? Exactly what real impact can models predict with useful (call it actionable) specificity?

    • Computer models suffer from a Catch-22. I have been writing business software for 25 years, some of it very complex. It has to model what the business does. However much you think you have the code correctly modeling the business, you only find out whether your output values are correct when an actual case emerges and shows you where it is wrong. Then you recode it and retest it for that case. That is, you test code against reality.

      Computer models which predict future events cannot do this until the future becomes the past. Then the modelers backcast the models to match reality. This is why computer model predictions are given as a probability range, and why different models make different range predictions. The climate is extremely complex, computers have restrictive components, and coding is based only on what we think we understand. Hence the continued cry for more money to make bigger computers to make the models “more accurate”, which I would argue is an oxymoron in the case of climate science.

      Basically, climate models are untested expensive “what if” computer games with no basis in reality when predictions cannot be tested against reality until the future becomes the past.

      Especially when climate scientists haven’t even figured out the heat balance of the planet, how can they possibly be modeling reality?

  24. Judith Curry:

    While a higher level of confidence can make decision makers more willing to act, overestimating the confidence can result in discounting the value of information in the decision making process if the confidence later proves to be unwarranted.

    Leonard Smith’s position seems to be that the climate modeling enterprise is in the weird position of proving that its own previous confidence* was unwarranted as new and better models get developed; Tennekes made a similar point.

    *at least as presented by advocates for public consumption; getting into the lit gives a different story

    David Wojick:

    Unfortunately the models have not kept up…

    The reasons you attribute for this might be part of it, but “the models” are complex enough software that they “turn” rather slowly.

  25. As long as climate change efforts are obsessed with CO2 at the expense of other climate forcings, as well as ignoring adaptation, there will be no effective policies. As long as the climate establishment is in effect pushing a CO2 or nothing policy, nothing will happen.

  26. Judith,

    Teachers also have an influence on the students they teach.
    Depending on how passionate they are about their own beliefs, those beliefs get passed down, even if the science is incorrect. This then generates a whole new line of such teachers later.

    The biggest mistake in science is the advancement of incorrect science through manipulation instead of following where the science leads.

  27. A major issue here is the proliferation of scary regional and local climate change “impact assessments” which all claim to be scientific. There is a major conference on these coming up: https://www.iaia.org/iaia-climate-symposium-dc/ The US State Dept is funding these kinds of regional scares worldwide. Arguably they are helping to give climate science a bad name, since the models clearly can’t do this kind of forecasting.

  28. Craig Goodrich

    Dr. C,

    Yes, indeed, the scientific uncertainty is a crucial consideration, and I look forward to the rest of this series. Another crucial consideration, though, is that the mitigations being touted — mostly wind and solar, since even the modern generation of nukes seems to be politically unacceptable — simply do not work: they neither produce actually useful power nor reduce CO2 emissions. Admittedly this is more an engineering matter than a scientific one, but I hope the issue will eventually be addressed in your series.

  29. NY Times writer Thomas Friedman told a farm science convention that “If there’s a one percent chance greenhouse gases could cause a problem, we need to act decisively.”

    http://mobile.ohiofarmer.com/main.aspx?ascxid=cmsNewsStory&rmid=0&rascxid=&args=&rargs=8&dt=634241724936092000&cmsSid=43394&cmsScid=8

    Friedman has no uncertainty.

    • I would say Friedman has no judgment, and is in fact a fool.

      • The developed nations of the world could, if they chose, afford to use some of their wealth to insure against the possibility of catastrophic climate change by raising the cost of energy from fossil fuels by whatever it takes (2X, 4X) to make alternatives economically practical for all but the most demanding applications (air travel, for example). But it is unlikely that others will cooperate, even if they are bribed with carbon offset payments. And the developed world is unlikely to be willing to make its industries even less competitive with those in the less developed world by adding cheap energy to cheap labor and the other forces that are sending jobs overseas. If a 1% possibility of catastrophic climate change is not enough to spur decisive action, is there any % possibility that will? Can climate science ever define the risk? Will the public ever believe the scientists bearing this message? (Not when they are elitist academics who are enamored with central control, rather than the free market, and who have forgotten about telling “the truth, the whole truth, and nothing but — which means that we must include all the doubts, the caveats, the ifs, ands, and buts”.)

  30. The thing to ask about the PP is whether it is valid. If so, it’s valid for all kinds of areas of knowledge and uncertainty, not just climate and environment. To see that it is not valid, look at it as a form of Pascal’s wager. The argument is basically that when costs rise over a certain level, probability ceases to matter. So, reasoned Pascal, if the cost of disbelief may be eternal damnation, then no matter how low the probability is, we should believe.

    Similarly, if the risk is the extermination of the human race due to overheating, then, no matter how low the probability, we should take action to eliminate the causes.

    The problem with this form of reasoning is that it will justify having the whole population stand on its head for five minutes every morning. If there is only the smallest chance that this will, for instance, wipe out cholera, surely we should do it?

    At this point in the argument you will find people indignantly saying that there is no proven plausible relationship between standing on our heads and cholera, whereas there is a plausible relationship between CO2 levels and warming.

    Ha, at that point we are back where we started: considering how plausible the case is for a link between the alleged cause and the feared effect, and the cost effectiveness and likely success rate of our proposed action.

    The fact is, you cannot get out of it. There is no way of bypassing making the case and putting together the evidence, and if you do not have it, you don’t have it, and you should not act until you do have it. Otherwise you waste enormous amounts of time and energy doing things that are ineffective, and which prevent you from doing things that could be very effective.

    So, for instance, there is no way to avoid discussing the pros and cons of adaptation to warming versus prevention of warming: which is more cost effective, which more likely to succeed. The so-called precautionary principle does not avoid it. It’s just a sloppy way of saying ‘I want to do it but I have no evidence, so please do it because I want to do it’.

    Another way to see it is to consider the argument: if there is only the smallest chance that emitting more CO2 would save the planet, obviously we should do it now. But of course, then you will get the argument that this is a crazy point of view. Yes, maybe it is, but that is where we came in: trying to avoid having an argument about evidence and proceeding directly to action.

    Never cite the precautionary principle to a logician, you will be mincemeat in seconds.

  31. To solve the big problems we all face we’ll need big answers. Computer scientists have developed super-computers designed to solve large data-crunching problems. The super-computers run control programs to solve problems like the enormous number of calculations necessary to predict worldwide weather patterns accurately moment by moment, for instance by dividing up the workload into packets and assigning each packet to a different processor; each separate processor then does its share of the overall job and outputs its results back to the overall control program, which compiles those many individual results into a coherent whole.
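
    As a minimal sketch of that divide-and-compile pattern (illustrative Python only; the workload, the process_packet function and the 8-way split are made up for the example, not any actual weather code):

        from multiprocessing import Pool

        def process_packet(packet):
            # each processor does its share of the overall job
            return sum(packet)

        if __name__ == "__main__":
            workload = list(range(1_000_000))
            # the control program divides the workload into 8 packets
            packets = [workload[i::8] for i in range(8)]
            with Pool(processes=8) as pool:
                partial_results = pool.map(process_packet, packets)
            # ...and compiles the individual results into a coherent whole
            print(sum(partial_results))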

    The big problems we humans are facing are far more complex than any super-computer invented yet can understand, let alone solve. The big problems we face can’t be modeled by numbers and manipulated by silicon gates connected by circuits, because each is framed by the emotional content of each independent viewer; they are more abstract questions that we, the beholders, are emotional about. Humans, unlike computers, each see the incoming data set through our own worldview. Each human therefore sees the world and the problem at hand uniquely, and each is manipulating different information about the problem. Each might then consult outside specialists or authorities to become better informed, but since each sees the problem differently, each would choose a different source or sources in order to become well informed.

    At that point, unlike the output of a computer, the many minds of the beholders will be outputting a huge range of solutions. At first glance it appears that these jumbled outputs would be useless in solving any problems, even the least complex. But it ain’t so. Because humans have the capacity to learn, because, instead of welded wiring harnesses, we have brains that can re-wire themselves on the fly, because we each have this re-wireable, general-usage processor between our ears, we are capable of turning the emotive filtering that seems so limiting to our advantage: combining a massive set of jumbled outputs into big solutions.

    The secret? Communication. We, unlike computers, can communicate our individual emotionally driven outputs through our senses at various stages of developing those outputs, and use this interactive recombining of our conclusions with those of other individuals, both those whose outputs are like ours and those whose aren’t. Then we can re-start our processing, but in this new iteration we have new information to manipulate. Each individual’s solution to even the biggest problems can and does change, because we can communicate with each other to change both the incoming data, updated and modified by the solutions of others, and the emotive sieve we use to sift and weight our view. Once our many minds have communicated, once we’ve modified our inputs and outputs by learning, we can solve the big problems with big solutions. Our big answers are as close as the person next to us, if we communicate.

    Don’t believe me? Here’s proof: check out Judith Curry’s website Climate Etc., where every day hundreds of interested bloggers, laypeople and scientists from all perspectives surrounding the climate change issue are interacting, arguing, disagreeing, learning and communicating. As each day, each iteration, passes, people and ideas come closer. Big solutions to big problems begin with communication.

  32. The precautionary principle is a source of leverage for people who have extreme views to argue that their extreme views should prevail. Mothers use it to argue that their children should not rough-house. Schools use it to prevent children from playing cowboys and indians at recess. Environmentalists try to use it to eliminate all pesticides since obviously we are all being poisoned, and now for climate change.
    I have noticed that some posters here have said sceptics are unreasonable for not accepting “the science”, but the political movement is not based on “the science” (even the IPCC science) but on 20 ft sea level rise and catastrophic hurricanes and megadroughts and extinction of 80% of all species (and implied even worse than these)–which SURELY must require action!! Yet even a 20-foot sea level rise would not affect more than 1% of the world’s coastlines (probably much less). It would surely be cheaper to move those living on coral atolls than to make gas $12/gal and restrict air travel and all the rest.

    • Dr. Loehle, I am quite surprised that you made this comment, and also surprised that Dr. Curry has chosen not to comment. If sea levels rise by 20 feet, that suggests a warming that will have massive, global negative impacts. $12/gal. gasoline will be the very least of our worries.

      For those readers here who think that Dr. Loehle is making a valid point, I suggest visiting the links below:
      http://profmandia.wordpress.com/2010/06/21/global-warming-a-sea-change/
      http://profmandia.wordpress.com/2010/06/28/o-water-water-wherefore-art-thou-water/
      http://profmandia.wordpress.com/2010/08/22/climate-change-impact-on-oceans-shallow-seas/

      A few snippets from those links:
      * Currently, about 160 million people live in locations that are 1 meter above sea level or lower.
      * Ericson et al. (2006) estimated that nearly 300 million people inhabit a sample of 40 deltas globally, including all the large megadeltas. Average population density is 500 people/km2 with the largest population in the Ganges-Brahmaputra delta, and the highest density in the Nile delta. The authors estimate that more than 1 million people will be directly affected by 2050 in three megadeltas: the Ganges-Brahmaputra delta in Bangladesh, the Mekong delta in Vietnam and the Nile delta in Egypt. More than 50,000 people are likely to be directly impacted in each of a further 9 deltas, and more than 5,000 in each of a further 12 deltas.
      * Rising sea levels will flood the world’s largest ports. According to The Tipping Points Report (2009), commissioned jointly by Allianz, a leading global financial service provider, and WWF, a leading global environmental NGO, a rise in sea level of 0.5 meters by 2050 could put at risk more than $28 trillion worth of assets in the world’s largest coastal cities. The value of infrastructure exposed in port mega-cities is just $3 trillion at present. A hurricane in New York, which could cost $1 trillion now, would mean a $5 trillion insurance bill by the middle of the century. Insurance rates will rise and taxes will increase to pay for the recovery and to move ports inland.
      * One-sixth of the Earth’s population rely on melt water from glaciers and seasonal snow packs for their water supply. Dr. Loehle’s warming shuts off water supply to these people.
      * As reported in The Sunday Times story (June 2010), “War clouds gather as nations demand a piece of the Nile,” countries along the Nile River have signed an agreement to possess more of the water from that river. Egypt is already sabre-rattling and the Foreign Minister has described the Nile waters as a matter of national security and a “red line” not to be crossed. Some Egyptian newspapers even discussed tactics that would prove effective if war erupts. Boutros Boutros Ghali, Egypt’s former Foreign Minister who later became the UN Secretary-General, warned: “The next war in our region will be over water, not politics.”

      Iraq, Syria, and Turkey may fight over Turkey’s control of the headwaters of the Tigris and Euphrates Rivers, further destabilizing the fragile Middle East. Arab countries may increase their nuclear capabilities to desalinate water and, in doing so, proliferate nuclear weapons to protect their dwindling resources (Dyer, 2008). Rivers fed by glaciers in the Tibetan Plateau (Indus, Ganges, Brahmaputra, Salween, Mekong, Yangtze, and Yellow) will initially flood due to rapid glacial melt but will eventually dwindle, causing water shortages for billions of people during summer when water is needed most. This will lead to food shortages and cross-border conflicts between NUCLEAR nations such as China, India, and Pakistan (Ibid). Will India redirect water away from Pakistan to feed its own people? Will Pakistan use nukes to wrest this resource back?
      * Climate-change is likely to alter river flows which in turn will impact hydropower generation. Hydropower impacts for Europe have been estimated using a macro-scale hydrological model. The results indicate that by the 2070s the electricity production potential of hydropower plants existing at the end of the 20th century will increase (assuming IS92a emissions) by 15–30% in Scandinavia and northern Russia, where currently between 19% (Finland) and almost 100% (Norway) of electricity is produced by hydropower. Decreases of 20–50% and more are found for Portugal, Spain, Ukraine and Bulgaria, where currently between 10% (Ukraine, Bulgaria) and 39% of the electricity is produced by hydropower. For the whole of Europe (with a 20% hydropower fraction), hydropower potential is projected to decrease by 7–12% by the 2070s (Ibid).
      * Dr. Loehle’s warmer world happens with CO2 levels that acidify our oceans. In his world coral reefs are dead and dissolving from lower pH after many years of bleaching due to warmer temperatures. What price is the ocean worth?

      • This kind of topic is addressed in Part II, which should be out next Sunday. Part III tackles water supply.

      • The answer is to invest in mobile home manufacturers.

      • The ocean pH is still 8.1; that is basic, not acidic. Also, while some species will suffer with more dissolved CO2, others will benefit. What is it you don’t understand about cost-BENEFIT analysis?

      • Scott, we need a moon-shot effort to develop cheap, safe nuclear power. Cheap energy is the answer to the water shortage problem. We need water desalination plants.

  33. The “precautionary principle” is a sophistic rationale to justify major policy changes needed to implement an unproven hypothesis.

  34. Embedded in the Precautionary Principle is an opportunity-cost analysis. If money and resources are invested in a wrong theory or strategy, then that money and those resources are no longer available for, at least in the climate scenario, mitigation of and/or adaptation to “naturally” driven climate change. A recent example is the former Soviet Union: a failed ideology bankrupted a nation state, from which recovery will only occur after the incurred debt is paid, sometime in the future. To assess whether the Precautionary Principle is warranted for climate change, the costs of being wrong need a fuller evaluation and consideration. The question of cost/benefit then includes the costs of higher energy added to the costs of no longer having the money for alternative priorities. My suspicion is that the “do nothing” strategy would end up outperforming any other cost/benefit determination. I do not advocate doing nothing, just that cost/benefit analysis should be performed so that an enlightened electorate can make a choice. As with conservation begun more than a century ago, the public spent tax dollars to purchase land to make National Parks. Energy choices are really no different. The problem is the public is not certain of what the full costs are, and they should be.

    • PP uncertainty – I’m afraid it’s worse than that – I’ve never seen reference to cost/benefit analysis in terms of lost opportunity or any other scenario. In fact, there’s at least one book on the PP that decries this lack. It also claims, as I believe, that any such c/b analysis must put a price on a human life, including the values of different lives in different circumstances. Eugenics, anyone? Sigh.

      • John F. Pittman

        One also needs to be careful that the analysis does not assume, implicitly or explicitly, that a future human life is worth more than a current life. This can take several forms. One such form is not correctly estimating the energy use of the future third world. Another is using estimates of currency value rather than an estimate of purchasing power parity. There was a complaint about AR4 that the use of currency value rather than parity skewed the results. IIRC, one result was that it increased the time during which China and India would NOT be expected to do anything about CO2.

  35. AnyColourYouLike

    Instead of the precautionary principle, what about the “Is it likely?” principle, espoused at The Unbearable Nakedness of Climate Change?

    http://omniclimate.wordpress.com/why-agw-is-logically-impossible/

    Some of the items in the list look slightly facetious, but I like some of these coincidences, which are all a necessary part of where we are right now…

    “Belief in AGW implies belief in a highly-improbable series of lucky discoveries and developments to happen just at the right time…

    Relatively widespread availability of computer power just enough strong to simulate the right climate projections on a multi-decadal scale.

    Climate science developed just beyond the minimal level needed to understand how to simulate the right climate projections on a decadal scale.

    Invention of satellites capable of photographing the poles, just at the moment they start to melt.

    Novel statistical approaches devised just in time, and correct from the get-go, for Mann’s Hockey Stick to emerge from the jumble of dendro- and other proxy data.

    Governmental willingness to co-operate together all over the world (after the end of the Cold War) just in time for a worldwide problem like AGW to happen.

    AGW recognized as an issue just as heavily-populated places such as India and China start getting their living standards on track to reach the Western world’s.”

    With thanks to Maurizio Morabito’s blog.

  36. However, if one were to apply the Precautionary Principle to the climate change movement itself, one could well ask why the Climategate Gang still have their jobs and prominence as climate scientists since they have done great damage to climate change’s credibility and therefore to the movement’s ability to protect the planet.

    Removing the Climategate Gang would be seen as a positive step for climate scientists to restore their credibility. Surely the burden of proof lies with climate scientists to demonstrate keeping the Gang will not continue the damage.

    Fire the Climategate Gang! The Precautionary Principle demands it.

  37. I hope this is not too off topic, Dr. Curry, but I was reading Keith Kloor’s blog last night and all the fuss over your Italian flag metaphor (if it’s more appropriate somewhere else please put it there if you can). Anyway, here is some of what I wrote (in media res):

    “….whilst Shollenberger was patiently explaining to Lazar the pretty obvious meaning of IFA: a thought experiment, a metaphor (even the percentages are metaphors – note to Lazar et al – not everything needs to be quantified, or have the potential to be quantified, to be rational; use your intelligence, for God’s sake!), a way of introducing and eliciting a discussion.

    Think of it this way: Judith Curry, at heart, is a teacher. Think of her in a lecture room, introducing a talk on uncertainties in climate science; she puts up a picture of the Italian flag, her audience immediately sees the incongruence and starts thinking “What’s the Italian flag got to do with uncertainty?” and she’s grabbed their attention. It’s a common teaching technique; it isn’t essential to the substance and is not to be taken too seriously in its own right. It isn’t a new mathematical or logical theorem Judy is proposing, just a means of framing the beginning of a discussion.

    But then we have a weird, slightly crazy student stand up and say ‘That flag theorem doesn’t fit into my truth table!’ It’s idiotic, it’s adolescent, and then for that same student to run around telling people this lecturer is stupid and doesn’t deserve her standing and respect as a teacher, well, with Tom, I myself would get mad if my present equanimity didn’t preclude it.”

    Anyway, trying to keep up but it’s difficult, time wise but keep on keep on!

    • Lewis, yes I spotted this, thank you very much. I will get back to the Italian flag issue; it now looks like Friday for the IF post (and here I was hoping to stay on track with the decision making under uncertainty series).

      • I’m hoping this response doesn’t mean that the IF was “not to be taken too seriously in its own right.”

      • Nope, the IF topic is too important to dash off before I leave on travel to Purdue (to participate in a panel with Revkin and Pielke Jr), and I need to respond to something else, that helps set the stage for the significance of the IF kerfuffle.

      • Look forward to it. As an addendum, and not to lead this thread astray, I wrote this:

        “Lewis Says:
        November 1st, 2010 at 5:05 pm
        Please, Lazar, you’re taking it too seriously – it’s a thought experiment, a way of illustrating uncertainty that is both attractive (I like the Italian flag!) and thought-inducing (it got you going!). It isn’t a (non-)peer-reviewed paper in PNAS; it’s a blog exercise, hopefully conducive to productive discussion. I.e., it is inclusive, non-technical and open. That is the basis of free discussion and also good teaching.

        Remember why Judy is doing this. Not for the fame (or infamy!) but as a kind of outreach to all sides of the debate. I know some (with nasty minds – joke!) mutter darkly about her but, until I see the contrary, I take everyone by their words and their actions. It may be she is being naive, that, for want of enthusiasm or, even, want of skill, her project fails, but until then we should support the effort!

        And how? Via constructive criticism. I’ll tell you what, Lazar – Judith is going to do a IF redux probably on Friday. Why don’t you go over there – she would welcome you – and suggest a better metaphor for uncertainty? Remember it has to be articulate, non technical and open. Please try.

        For, I think, Lazar, you and PDA have been sincere in your efforts so far (I initially came here via a roundabout route, finally jumping from PDA’s temp blog (too hot!) to here). So, I am sure that in the future such efforts are welcome. Let’s get something done here.

        NB Many of what Tobis would like to think of as starry-eyed Curryphiles (!) are actually, did they but know it, slowly being dragged, or should I say honeyfied, into the 21st century. They’re listening and learning and melting! Watch this space!”

  38. To determine the veracity of any decision making regarding climate change, follow the money. See who will be enriched and who will be beggared by the decision and you will begin to understand whether or not the plan is legitimately connected to climate science or is merely a front for economic interests.

  39. Dr. Curry,
    A question: What is the problem you are trying to solve with your decisions? I don’t mean, how do you make a decision in spite of uncertainty. I mean, what are we trying to fix? I am not seeing that identified in this discussion.

    Now, though I am what is described as a skeptic, for this particular comment I will stipulate that the science presented by the AGW groups is correct. I am not arguing the science. (well, not in this comment)

    Given that, what problem are we trying to solve? CO2 level increasing? Nope, with no attendant temperature rise, a doubling of CO2 concentration would likely provide more positive results than negative. It would certainly be difficult to find a species of plant or animal that would be driven to extinction by that increase, which would be well below historical high levels. A CO2 increase might be described as a causative factor for other effects, some of which might be negative, but it is not the problem we are concerned about.

    Is the problem a temperature increase? Let’s assume 3 to 5 degrees C is correct. Monthly temperature averages for many mid-latitude land areas vary by 75 degrees C or more over the course of a few years. Year to year average temperature variations of several degrees are common. (I’ve downloaded the USHCN daily data files for my area and crunched through them.) 3 to 5 degrees here would be lost in the noise. (Trend would be 12 dB below the noise level in engineering terms!) Some northern areas, such as Siberia and Canada, might benefit greatly from that warming. I hope nobody would claim that arctic tundra provides a richer diversity in life forms than equatorial rain forests. Nope, warming is not an overall global problem.
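
    As a rough check on that “12 dB” figure, here is one way it could be reproduced; the 4 °C and 16 °C below are my own assumed signal and noise amplitudes, not numbers taken from the comment:

        import math

        trend, noise = 4.0, 16.0                 # assumed trend and noise amplitudes, deg C
        print(20 * math.log10(trend / noise))    # about -12 dB (amplitude convention)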

    What is the problem then? We are actually assuming a set of negative impacts from warming on very specific areas. The problem, or more correctly the problems, you are trying to solve is minimizing those impacts. Each is a separate problem with a different degree of impact. Each should be evaluated for potential solutions, in terms of prevention, mitigation, and adaptation.

    This should all be obvious. Enumerate the problems you are dealing with and examine them individually. Perhaps a common measure can be devised. Simply using warming or CO2 concentration as the problem definition introduces huge amounts of uncertainty into the discussion. Dealing with smaller chunks allows much greater latitude in planning responses and opportunity for developing backup plans.

    As for my personal views, if I were someone who agreed with the AGW model, I would still be writing to reject the attempt to control temperature by reducing CO2 production. It simply has been a losing battle. We are trying to move the river instead of building a bridge.

    • Gary, the issue on the table is the risk of catastrophic climate change. The ideas you put forward will be discussed in the context of Parts II and III, which are now put off until next week.

      • Judith,

        You say the issue on the table is “the risk of catastrophic climate change.” I might have missed it, but I didn’t quite get that from your post for this thread. If that is your intent, I suggest that you re-frame and clarify your objective. IMO, that is a very different perspective than many commenters have been addressing.

      • This will become clearer in part II, sorry I got distracted from a quick follow on, should be coming Sun nite.

  40. Steven Mosher

    Thanks Judith,

    My sense of things, as we wrote in the book, was that the solution (global control of CO2) was baked into the problem from the very outset, baked into the very structures that came to govern the science and the politics. On the science side of things, we didn’t see anything that amounted to corruption, but we noted that external forces (money, power, politics) do not shape science by funding bogus answers. They change the questions we ask:

    Here we found this most enlightening, by the author of Doubt Is Their Product. Folks can read this and see if the shoe fits.

    By David Michaels
    Special to The Washington Post
    Tuesday, July 15, 2008
    Wal-Mart and Toys R Us announced this spring that they will stop selling plastic baby bottles, food containers and other products that contain a chemical [BPA] that can leach into foods and beverages. … Congress is considering measures to ban the chemical. But is there enough evidence of harmful health effects on humans? One of the eyebrow-raising statistics about the BPA studies is the stark divergence in results, depending on who funded them. More than 90 percent of the 100-plus government-funded studies performed by independent scientists found health effects from low doses of BPA, while none of the fewer than two dozen chemical-industry-funded studies did. This striking difference in studies isn’t unique to BPA. When a scientist is hired by a firm with a financial interest in the outcome, the likelihood that the result of that study will be favorable to that firm is dramatically increased. This close correlation between the results desired by a study’s funders and those reported by the researchers is known in the scientific literature as the “funding effect.” Having a financial stake in the outcome changes the way even the most respected scientists approach their research.

    Within the scientific community, there is little debate about the existence of the funding effect, but the mechanism through which it plays out has been a surprise. At first, it was widely assumed that the misleading results in manufacturer-sponsored studies of the efficacy and safety of pharmaceutical products came from shoddy studies done by researchers who manipulated methods and data. Such scientific malpractice does happen, but close examination of the manufacturers’ studies showed that their quality was usually at least as good as, and often better than, studies that were not funded by drug companies. This discovery puzzled the editors of the medical journals, who generally have strong scientific backgrounds.
    Richard Smith … has written that he required “almost a quarter of a century editing . . . to wake up to what was happening.” Noting that it would be far too crude, and possibly detectable, for companies to fiddle directly with results, he suggested that it was far more important to ask the “right” question. … Smith, Bero and others have catalogued these “tricks of the trade,” which include … publishing the results of a single trial many times in different forms to make it appear that multiple studies reached the same conclusions; and publishing only those studies, or even parts of studies, that are favorable to your drug, and burying the rest.

    The problem is equally apparent in review articles and meta-analyses, in which an author selects a group of papers and synthesizes an overall message or pattern. Decisions about which articles to include in a meta-analysis and how heavily to weight them have an enormous impact on the conclusions. … It has become clear to medical editors that the problem is in the funding itself. As long as sponsors of a study have a stake in the conclusions, these conclusions are inevitably suspect, no matter how distinguished the scientist.

    • …it would be far too crude, and possibly detectable, for companies to fiddle directly with results, he suggested that it was far more important to ask the “right” question

      I noticed a similar thing looking for papers documenting climate model predictive skill; it would be far too crude to fiddle directly with the numbers, but these considerations can weigh on whether you go out on a limb and make the “prediction” or not. For example, this paper makes a decadal prediction, but the paper is written when the decade it is predicting is more than half over; it is interesting to compare the average anomaly at the time of the prediction, 6 years in, with the predicted and actual anomalies. You think this sort of quick calculation wasn’t made by the authors before they decided to put that “prediction” in writing?
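
      A toy version of that quick calculation, with hypothetical stand-in numbers (not values from the linked paper):

          # six hypothetical annual anomalies for the elapsed part of the decade
          anomalies_so_far = [0.28, 0.35, 0.41, 0.40, 0.38, 0.45]
          predicted_decadal_mean = 0.42   # a hypothetical published "prediction"
          elapsed_mean = sum(anomalies_so_far) / len(anomalies_so_far)
          print(round(elapsed_mean, 2))
          # if the elapsed mean is already close to the predicted mean,
          # the "prediction" carried little risk for its authors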

    • To simplify: the man who pays the fiddler calls the tune.

      …the hardest part is knowing that you knew the answer all along. (Reckless Kelly)

  41. Although the attention of climate modelers has turned away from fundamental issues to trying to scare us with regional impact studies, I think you may be too pessimistic about the evolution of climate models before AR5. The discrepancy appears to be growing between observations and predictions for the alleged tropical hotspot. Ocean heat content and mean global temperature are not rising as fast as predicted. The statistical likelihood of the current decade-long “pause” in warming is becoming embarrassingly small. And many models continue to make conflicting regional projections (such as whether rainfall in the Amazon basin will increase or decrease). Yes, climate modelers do have other agendas, but they may soon recognize they won’t get very far with their current level of credibility.

    Then there is the ocean, which could become the new frontier with systematic and reliable data coming from the Argo buoys. When analyzing the range of possible futures by varying parameters of climate models, Stainforth et al never varied the parameters that describe thermal diffusion through the ocean. We know much less about how well climate models describe energy flow through the ocean than through the air, especially when one considers heat capacity and total energy flow. Long-term cycles like the AMO and PDO, and perhaps phenomena like the LIA/MWP, seem more likely to arise from changes in ocean currents than from phenomena that occur in the air.

  42. Charles Higley

    The Precautionary Principle fails when it is easily shown that the “climate science” behind the supposed global warming by man is fabrication, lies, and false assumptions.

    When we are not warming, when CO2 is shown NOT to be a climate driver, and when other factors are shown to be the real drivers (solar cycles, ocean cycles, water vapor heat engine, etc.), under what conditions are they NOT going to invoke the Precautionary Principle?

    Precaution is gone when we know and can clearly see no linkage between the climate and CO2, regardless of whether it is emitted by us or not. ‘Just not happening.

  43. I never fully understood the attractiveness of the precautionary principle. This ideal could be applied to any study or area of industry if so desired. The problem that would come up would be that you would have an organization so intent on not taking any risks that nothing would advance. A better principle is to allow for healthy risks with proper contingency planning and organizing. NASA during Apollo was big, but through good planning and nimble units they were able to recover from Apollo 1 and 13. NASA of the shuttle era, on the other hand, took years to recover from two shuttle losses, which were themselves caused by institutional blindness. Not a perfect metaphor, but it seemed appropriate to me.

  44. Willis Eschenbach

    Dear Judith;

    First, thank you for another in a most fascinating series of posts. The number of variables, each with their own uncertainty, is large.

    Next, there is another source of uncertainty that you haven’t mentioned in your discussion of the precautionary principle. That is the odds that a slight warming will either make little difference, or will be beneficial overall.

    It is often said that a two degree C temperature rise over the next hundred years will lead to calamitous problems, numbers of climate refugees, and the like.

    It is also said that the problem with climate is that we can’t do laboratory experiments. Fortunately, we have a natural experiment in climate. This is the ~ 2°C of global warming since the Little Ice Age.

    Now, certainly there have been a number of climate-related disasters during that time. In common with the rest of the planet’s history, there were storms and droughts and floods and hurricanes and the like.

    However, I have never seen any evidence that any of that was caused by the historical ~ 2° global temperature rise. In general, that rise is seen as a good thing by those concerned — longer growing season, fewer animals and plants lost to frost, that kind of thing.

    So there are two further fundamental uncertainties I’d like to raise here:

    1. Is increasing temperature bad for humans? The historical evidence shows that the range of answers must include the answer “No, it is generally good for humans.” I point to this as a source of uncertainty.

    2. The last 2°C temperature rise since the Little Ice Age seems to not have produced climate refugees. Nor has it led to any known increase in extreme or catastrophic events. This means that “no catastrophe” has to be our default position. It is what happened last time, so in uncertainty, it is our best guess for the future.

    Now, what does that mean for the precautionary principle? I highlight the three important issues:

    “Where there are
    threats of serious or irreversible damage,

    lack of full scientific certainty
    shall not be used as a reason for postponing
    cost-effective measures
    to prevent environmental degradation.”

    The spread of answers to “is there a threat of serious etc. from 2° of warming” must include the historical result of “no threat, general benefit”. Given that, the first issue of the precautionary principle may not be satisfied.

    And of course, issue number three, a “cost-effective solution” … now there’s a real rip-snorter. The US EPA recently announced that their regulation of carbon (estimated to cost billions) will cause around three hundredths of a degree of cooling by 2030.

    Anyone who thinks that is a cost-effective solution can leave the discussion now.

    Next, it’s not that we “lack full scientific certainty” about the future evolution of the climate to 2050. It is that we don’t know what the temperature trend from here to 2050 will be. Any honest scientist will say that it could be either warmer or cooler in 2050.

    So with no cost-effective solution, with painfully small scientific certainty, and with historical evidence not supporting impending catastrophe, this means we have to include another uncertainty.

    This is the uncertainty as to whether there is a case for the application of the precautionary principle. I say no.

    Again, thank you for your work on this blog. Very important.

    • Thanks Willis, stay tuned for Parts II and III. I agree that we need to lose the precautionary principle with regards to climate change, but that doesn’t mean that we shouldn’t consider the risks of climate change; we just need a better framework for doing so.

      • I think Willis’ point is that we also need to consider the benefits of global warming.

      • Alexander Harvey

        The PP seems to be built in to the UNFCCC, so that would need amending, which may be tricky to achieve (3/4 majority).

        Article 1 Para 3:

        “The Parties should take precautionary measures to anticipate, prevent or minimize the causes of climate change and mitigate its adverse effects. Where there are threats of serious or irreversible damage, lack of full scientific certainty should not be used as a reason for postponing such measures, …”

        What might be a good idea and what may be achievable are perhaps different things. The UNFCCC seems a remarkable agreement, not so much road-map as road block.

        Alex

      • Alexander Harvey

        Oops make that:

        Article 3 Para 3 !

      • Willis Eschenbach

        I await II and III with infinite patience. Living in the South Pacific for so many years, patience is a requirement.

        I have explained the circumstances in which I do and don’t invoke the precautionary principle in Climate, Caution, and Precaution. Might be worth a read if you haven’t seen it.

        Best regards,

        w.

    • “Anyone who thinks that is a cost-effective solution can leave the discussion now.”

      Another example. The claim is a 2 meter rise in sea level by 2100. Thus a three-meter-high concrete dike wall would need to be built around the shores of Manhattan Island. The task is so large and expensive it would have to start now. Besides being an eyesore blocking the view, it would serve as a huge monument to stupidity in 100 years, just like those thousands of idle wind turbines will be by then.

      • But the question is, is it more expensive than the proposed solutions to global warming? Probably it isn’t. Also, why do you believe it would take 100 years to build a dike? Other countries have done it in much less time.

      • Solutions to global warming? You mean cutting emissions by 80%, killing the economy completely and putting us back to the stone age?

        You would like a three meter high wall along the entire coast of the populated US? And yes, it would take 100 years and how many trillions?

        You missed the point of the comment. Sea level isn’t going to rise 2 meters by 2100; at the observed 1.74 mm/year it would be only about 6 inches. Building a 2-meter wall as a “precaution” that has no basis in fact is a colossal waste of money, energy and other resources.
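
        For the record, the arithmetic behind that figure (taking 2010 as the start year, which is my assumption):

            years = 2100 - 2010
            rise_mm = 1.74 * years            # 1.74 mm/year, the rate cited above
            print(rise_mm, rise_mm / 25.4)    # about 157 mm, i.e. roughly 6 inches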

      • I misunderstood. I agree with you :) I’m definitely for keeping the economy alive!

    • Richard S Courtney

      Willis:

      You make a good point concerning a possible future rise of 2 deg.C in mean global temperature. I write to support it.

      Mean global temperature rises by 3.8 deg.C from June to January each year and falls by 3.8 deg.C from June to January each year.

      So, I pose the following question.

      On what basis is it claimed that a rise of 2 deg.C in mean global temperature would be potentially catastrophic when mean global temperature rises (and falls) by nearly double that each year with no known harmful effects?

      Please note that this question is NOT rhetorical. In my opinion, a serious answer to this question is required if political decisions affecting economic and energy policies are to be decided on the basis that mean global temperature rise of 2 deg.C must be avoided.

      Richard

      • Richard S Courtney

        Ooops!:

        I wrote:
        “Mean global temperature rises by 3.8 deg.C from June to January each year and falls by 3.8 deg.C from June to January each year.”

        Of course, I intended to write
        “Mean global temperature rises by 3.8 deg.C from June to January each year and falls by 3.8 deg.C from January to June each year.”

        Sorry.

        Richard

      • Mean global temperatures vary by 3.8 C every six months?
        Shouldn’t that be mean hemisphere temperatures? –
        (and with North and South moving in opposite directions)

  45. Judith,

    The big question to climate science is:
    What is the balance of this planet?
    No one knows due to the complexity of the WHOLE system.
    Yet we have a mad fixation on temperature data alone, to the exclusion of ALL other factors that this planet has.
    Do we know what triggers an Ice Age, or what its function is?
    What is the cause of the ocean salt changes that are occurring only in the first couple of inches of the ocean?
    Experimentally, this is impossible to occur unless ALL of the ocean’s salt changes.
    The ONLY factor NOT considered is atmospheric pressure changes.
    What is the function of changing the salt levels on the surface?
    It would hinder the sunlight being absorbed in the oceans, hence deflecting more sunlight away.
    How do we know pressure has built up? Is our equipment sensitive enough?
    The growth up mountainsides shows that pressure has been pushing the atmosphere back. Pressure can only exert so much on a solid/liquid planet surface before exerting outward on the atmosphere.

    Being focused on only one area of science has left us vulnerable to a surprise we are not expecting.

  46. For me the tell that this is a non-productive enterprise is that the ones who claim to have discovered the climate apocalypse are the ones pushing the solution.
    From Hansen to the RC gang, they have not only claimed to have discovered the silver bullet of climate death, but to have the solution as well.
    That is a very rare alignment of talent.

    • Nice point – it’s like the drug companies that discover a new ‘disorder’ or medical condition and at the same time discover a ‘cure’ for it.

  47. The claim of 3K warming for doubling [CO2] isn’t justified; it’s far less and could even be zero. In Figure 2.4 of AR4, the 1.6 W/m^2 AGW figure reduces to 0.4 W/m^2 of data once the -1.2 W/m^2 aerosol cooling offset is included. The combined range of the latter is -2.7 W/m^2 to -0.4 W/m^2; the datum is buried in noise.

    But there’s worse. The -0.7 W/m^2 ‘cloud albedo effect’ cooling isn’t proved by experiment; it’s theoretical, from an equation adapted in 1974 by Hansen and Lacis from Sagan and Pollack’s work on Venusian clouds. Go further back in time and their ‘lumped parameterisation’ assumes a constant ‘Mie asymmetry factor’, g; they also neglect boundary effects.

    Mie assumed a plane wave. That only applies to the first scattering event. You get insight into what really happens from experiment. Measured albedos can be angular-dependent yet a diffuse emitter obeys Lambert’s cosine law. So, the reality is probably geometrical backscattering as the initial energy concentration from the first scattering [10^7 for 15 micron droplets] is dissipated, superimposed on the diffuse background.

    The former process, much lower g than for a plane wave, is likely to be strongly dependent on droplet size, in effect a shielding of the interior of the cloud, switched off by pollution. So, instead of ‘cloud albedo effect’ cooling, above a critical optical depth it’s heating, another form of AGW.

    Implicit in low CO2-AGW is the reduction in water vapour concentration predicted and apparently observed by Miskolczi. That raises the question: “Did ‘cloud albedo effect’ heating cause all AGW?” If so, that might explain why, according to ocean heat content, global warming stopped in 2003: switch off the shielding and [hemispherical] albedo asymptotes at 0.5.

    • Do you have links to papers (or articles) for this?

      • It’s my own research. Having worked on global warming related issues for 20 years, I had expected the science to have been done properly. It hasn’t, so I set out last February to find out why.

        I hadn’t expected to destroy the CO2 monopoly and to throw doubt on Venusian thermal runaway, but that’s the logical deduction.

        NASA’s claim of ‘enhanced reflection from greater water surface area in polluted clouds’ [ http://geo.arc.nasa.gov/sgg/singh/winners4.html ] is incorrect. Elsewhere NASA claims up to 90% reflection from this process. This is a physics fairy tale.

      • OK, do you have a web site? Are you going to contribute to Dr. Curry’s skeptic articles effort?

      • No web site: I intend to publish.

        I also have a professional interest in energy policy.

      • Good stuff Alistair. I reached the same conclusion about the switch-off in 2003. That’s when the sun dropped below the long-term average sunspot number, which also seems to be the ocean equilibrium value, below which the oceans start losing energy, as shown by the ARGO network.

      • Did you do an eleven year smoothing of sunspots, or were you looking at the peak levels? Do you have a link to that graph?

  48. With two Kamchatka volcanoes (Kljuchevskaya Sopka and Sheveluch), Merapi in Indonesia, and possibly Iceland, a drop in global temperatures is almost certain, giving the decision makers some (volcanic ash and aerosol) breading space.
    This would have a beneficial effect all around: the CAGW lobby can claim that the volcanoes interrupted the temperature rise, sceptics would claim it was about to happen anyway, and the pseudo-science-astrology minority will claim that solar minima always relate to an increase in volcanic activity. I can see a happy new 2011 for all concerned.

    • (volcanic ash and aerosol) breading space.

      Like a geothermal oven? Let them eat cake!

    • I was sorry to see the volcanoes erupt. Now the picture will be foggy due to the aerosols. I was hoping to see what effect the solar minimum would have without confounding factors.

      • The volume of ash ejected from Eyjafjallajökull is several orders of magnitude smaller than what Pinatubo released (100 million cubic metres versus 25 billion cubic metres), and the amounts coming from the Kamchatka and Indonesian volcanoes – at least to date – are far, far less. So the effect is far from “certain”: as of this writing it can be assessed as “imperceptible.”

      • It is early days. Kljuchevskaya Sopka is possibly the largest active volcano in the Northern Hemisphere, Merapi is the largest volcano in Indonesia, and in Iceland Grimsvotn (Laki) is brewing up; its 1783-85 gas eruptions covered most of Europe with a blue haze lasting for months.

      • The Icelandic volcanoes frequently have complex eruption episodes that can last for years. I would not write them off just yet.

      • I said “to date,” and stand by that. Yes, more volcanoes may erupt, the existing eruptions may massively intensify… and if my aunt had testicles she’d be my uncle.

        I was responding to Mr. Vukcevic’s “certainty,” which was explicitly based on the eruptions going on right now.

      • almost certain < certain

      • imperceptible << almost certain

      • CO2 effect < imperceptible << almost certain < certain

      • Good point, and good luck with your Aunt.

    • Cooling! Because of a volcano. Damn, I was looking forward to mild winters; time to break out the deep-freeze winter coats and the expensive heating bills.

  49. One of the main problems is that by arbitrarily demanding that only CO2 be considered as a significant climate driver variable, the UNFCCC has set things up so that very little can be done.

  50. Dr. Curry,
    You might want to check what Dr. N-G has posted over at his blog on this. There is apparently no formal set of definitions regarding catastrophic climate change. After reading ‘The Climate Fix’, it is clear that at least part of the problem has been inconsistent labeling of the issue: ‘framing’.
    This is astonishing to me, since we have been studying/debating/promoting/caucusing the issue of CO2 in the atmosphere at a cost of something in excess of $50 billion to date.

  51. Judith, you ask (rhetorically I presume):

    “So exactly what have we learnt about possible 21st century climate from the AR4 relative to the TAR (and even relative to the 1979 Charney report) that refines our ability to set an optimal emissions target?”

    Not much.

    Indeed, we have known what we know for a while and we sat around and did (almost) nothing to change our course.

    “A great deal of uncertainty exists, and emissions target policies based on such uncertain model simulations are not robust policies.”

    1) Estimates of climate sensitivity are based on both observations and models (and they converge to more or less the same ballpark figure as already laid out in the Charney report)

    2) If current policies are not even sufficient to deal with the lower boundary of climate sensitivity, then the uncertainty in climate sensitivity doesn’t matter to the policy making process.

    I.e. we know enough to know that we’re not doing enough even if the uncertainties mean things are less bad than expected. If things are worse than expected, it’s even more blindingly obvious. Scientific knowledge, or lack thereof, or uncertainty therein, is *not* what is holding us back; that is not where the shoe pinches.

    • “If things are worse than expected, ”

      No chance at all that a “warmer” climate is good, eh?

      • The abuse of the term ‘worse than expected’ is one of the other great tells of the AGW movement.
        We have had, according to our AGW promotion industry, ~150 years of temperature increases that have been ‘historically unprecedented’.
        What has happened in that ~150 years or so?
        Increased crop yields, increased lifespans, increased populations, increased prosperity. Nothing in world weather systems indicates we are facing anything out of the ordinary.
        Yet fear mongers like Paul Ehrlich and his disciples, who got every aspect of this flat-out wrong, are the ones chosen to advise our science policies.
        How odd, to say the least.

    • Bart, the issue is that changing our course in the ways specified by the various treaties may not help, and even if we passed the laws, implementation may not work. The problem is the lack of robustness of the emissions targets policies. There are a lot of reasons for energy policy that moves in the direction of clean and green. At this point, the climate wars seem to be getting in the way. Like I said over at KK: non-robust policies and bad politics.

      • Judith,

        I wasn’t referring to specific policies or treaties, just to the need to drastically reduce emissions. Do you agree with such a need?

        If yes, then we can discuss how to best enact such reductions, which is what your reply is concerned with. But you skipped the point that I brought up.

      • Bart, in my best confusionist tradition, it isn’t as simple as that. Parts II and III will frame how I think we should be approaching this.

      • Part II (how to do it) isn’t simple by any means.

        Part I (that we need to do it) seems quite straightforward to me.

        (“it” meaning reducing our emissions.)

      • Why this fixation on reducing emissions? Do you want this society to collapse? That’s what a forced emissions reduction will do.

      • Curious. Why are you mixing emissions with energy? So-called “clean” energy is bankrupting countries. Six million in the UK are now in energy poverty because of “green” policies. Spain has had to cut and cancel its FIT program because of the costs, and here in Ontario we are erecting wind turbines which export power to the US while Ontarians pay for it in tripled bills. We have people losing their homes here because they can’t pay their utility bills.

        Each “green” energy job costs 2.2 private-sector jobs. It’s killing the economy at a time when the economy is fragile enough as it is. As soon as the Conservatives get into power in Ontario next fall, our Green Energy Act will be kaput!

    • Richard S Courtney

      Bart Verheggen :

      You assert:
      “1) Estimates of climate sensitivity are based on both observations and models (and they converge to more or less the same ballpark figure as already laid out in the Charney report)”.

      Sorry, but, no. You are wrong.

      Kiehl reports that estimates of climate sensitivity obtained from theory (and used in climate models) vary from 1.5 deg.C to 4.5 deg.C for a doubling of atmospheric CO2.
      (ref. Kiehl JT, “Twentieth century climate model response and climate sensitivity,” GRL vol. 34, L22710, doi:10.1029/2007GL031383, 2007).

      This is a range of a factor of 3 which is not consistent with “they converge to more or less the same ballpark figure.”

      But, and very importantly, several empirical studies indicate that climate sensitivity is much lower and is ~0.4 deg.C for a doubling of atmospheric CO2; e.g.

      Sherwood B. Idso, “CO2-Induced Global Warming: A Skeptic’s View of Potential Climate Change,” Climate Research 10 (1998): 69-82.

      Stephen E. Schwartz, “Heat capacity, time constant, and sensitivity of Earth’s climate system”, Journal of Geophysical Research, Volume 112, Issue D24, November 2007

      David H. Douglass, John R. Christy, “Limits on CO2 Climate Forcing from Recent Temperature Data of Earth”, Energy & Environment, Volume 20, Numbers 1-2, pp. 177-189, January 2009

      Richard S. Lindzen & Yong-Sang Choi, “On the determination of climate feedbacks from ERBE data”, Geophysical Research Letters, Volume 36, Issue 16, August 2009

      Richard

      • Richard,

        Sounds like estimates of climate sensitivity converge to the same (admittedly rather wide) range.

        Combining several constraints together in a Bayesian framework further narrows this range. (see e.g. http://julesandjames.blogspot.com/2006/03/climate-sensitivity-is-3c.html )

        A sensitivity of 0.4 is wholly incompatible with the paleo-record of large changes in global climate. Some major flaws in the studies you cite have been found (in the literature, at the same blog as just mentioned, at RC, and elsewhere).
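
        To illustrate the Bayesian point above, here is a minimal sketch of how multiplying independent constraints narrows a range; the three Gaussian “constraints” are illustrative stand-ins, not the published likelihoods:

            import numpy as np

            S = np.linspace(0.5, 10.0, 2000)   # sensitivity grid, K per doubling
            dS = S[1] - S[0]
            post = np.ones_like(S)             # flat prior
            # hypothetical independent constraints (mean, sigma), illustrative only
            for mu, sigma in [(3.0, 1.5), (3.0, 2.0), (2.7, 1.7)]:
                post *= np.exp(-0.5 * ((S - mu) / sigma) ** 2)
            post /= post.sum() * dS            # normalize to a posterior density

            cdf = np.cumsum(post) * dS
            lo = S[np.searchsorted(cdf, 0.05)]
            hi = S[np.searchsorted(cdf, 0.95)]
            # the combined 90% interval is narrower than any single constraint's
            print(f"90% credible interval: {lo:.1f} to {hi:.1f} K")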

      • Alex Heyworth

        Bart, leaving aside your contention of major flaws in the studies cited by Richard, could you just focus for a moment on the recent temperature record?

        At the moment, we are about 40% of the way to a doubling of CO2 from pre-industrial levels. Given the non-linear response of temperature to CO2 increases, this should mean that we already have more than 40% of the increase that would result from a doubling. If the sensitivity is really 3K, as you say, why have we only had approx 0.7 degrees increase in temperature since 1850 (not all of which is attributable to CO2 anyway, according to James Hansen)?

        I assume you would like to respond that the increase is “in the pipeline”. Perhaps you could describe this pipeline and exactly how it is storing the missing heat.

        Or perhaps you are going to use the “solar minimum” argument. Unfortunately, that doesn’t square with previous alarmist contentions that variations in solar forcing are minimal in their climate impact.

        Or perhaps you will trot out the aerosol argument.

        Anything rather than admit that you really don’t know.
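
        For what it’s worth, the logarithmic arithmetic behind the “40% of the way to a doubling” point above (the 390 ppm present-day value is my assumption):

            import math

            C0, C = 280.0, 390.0        # pre-industrial and assumed present CO2, ppm
            frac = math.log(C / C0) / math.log(2.0)
            print(round(frac, 2))       # about 0.48 of a doubling's forcing
            S = 3.0                     # the contested sensitivity, K per doubling
            print(round(frac * S, 1))   # about 1.4 K implied at equilibrium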

      • Alex,

        Following Ramanathan and Feng (2009):

        Global average surface temperatures have increased by about 0.75 degrees Celsius since the beginning of the industrial revolution, of which ~0.6 °C is attributable to human activities. The total radiative forcing by greenhouse gases is around 3 W/m2, with which we have ‘committed’ the planet to warm up by 2.4 °C (1.6-3.6 °C), according to a climate sensitivity of 3 °C (2-4.5 °C) for a doubling of CO2. The observed amount of warming thus far has been less than this, because part of the excess energy is stored in the oceans (amounting to ~0.5 °C), and the remainder (~1.3 °C) has been masked by the cooling effect of anthropogenic aerosols.

        These numbers (esp on sensitivity and aerosol forcing) have high uncertainty.

        The warming over the past 100 years is consistent with the range of sensitivity as given by the IPCC and Charney reports; not with a sensitivity of 0.4.

        Don’t confuse uncertainty with knowing nothing. That’s the number one logical mistake you keep making.
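
        As an arithmetic check on the bookkeeping above (the 3.7 W/m2 forcing per CO2 doubling is my added assumption; the other numbers are from the comment):

            F_ghg, F_2x, S = 3.0, 3.7, 3.0              # W/m2, W/m2 per doubling, K per doubling
            print(round(F_ghg / F_2x * S, 1))           # about 2.4 K committed warming
            realized, in_ocean, masked = 0.6, 0.5, 1.3  # K, as itemized above
            print(round(realized + in_ocean + masked, 1))  # 2.4 K, consistent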

      • Alex Heyworth

        I see you agree (still) that the numbers on sensitivity and aerosol forcing have high uncertainty.

        Just how uncertain are you saying the aerosol figure is? For example, is there any reason that it shouldn’t be 0.3 degrees rather than 1.3? How exactly was that figure arrived at?

        Also, I observe that the idea that “excess” energy is stored in the oceans is essentially nonsensical. Excess to what? Why shouldn’t it just stay in the ocean?

      • Alex,

        I have never denied that uncertainties are large and I think you’re quite well aware of that. Just leave those digs aside.

        IPCC AR4 Ch2:
        “The total direct aerosol RF as derived from models and observations is estimated to be –0.5 [±0.4] W m–2. The RF due to the cloud albedo effect (also referred to as first indirect or Twomey effect), in the context of liquid water clouds, is estimated to be –0.7 [–1.1, +0.4] W m–2.”

        See also IPCC AR4 wg1 fig 2.20 (lower panel, blue curve)
        http://www.ipcc.ch/graphics/ar4-wg1/jpg/fig-2-20.jpg

        Ram and Feng base their ocean heat on Barnett et al., 2001: 0.6 (+/-0.2) W m-2 of the 3 W m-2 is still stored in the ocean. It’s about closing the radiative balance. Ram and Feng (2008 and 2009) are really good sources for the details on how aerosols and GHG interact.

      • Alex Heyworth

        Bart, my apologies if you were upset about that. I don’t think it was meant in quite the same way you seemed to take it. I guess I am surprised by the certainty you seem to have in the face of “high uncertainty”!

        One question I would like to raise with you: when the IPCC and climate scientists quote such uncertainty ranges, are they assuming a Gaussian normal distribution? If so, is there any basis for this? Is there any reason the distribution could not be, for example, rectangular, or bimodal?
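
        To illustrate why the question matters, here is a minimal sketch; the numbers (a 3 °C mean, with the 2–4.5 °C range read as roughly a 1-sigma band) are my assumptions for illustration, not anything the IPCC states.

        ```python
        import math

        def gaussian_tail(x, mu, sigma):
            """P(X > x) for a normal distribution, via the error function."""
            return 0.5 * (1 - math.erf((x - mu) / (sigma * math.sqrt(2))))

        mu, sigma = 3.0, 0.75   # assumed: read 2.25-3.75 C as a 1-sigma band
        threshold = 4.5         # C, upper end of the quoted range

        print(f"Gaussian tail    P(S > {threshold}): {gaussian_tail(threshold, mu, sigma):.3f}")
        # A rectangular distribution on 2-4.5 C puts zero mass above 4.5 C, and a
        # bimodal or fat-tailed choice could put far more there: the same quoted
        # "range" implies very different tail risk depending on the shape assumed.
        print(f"Rectangular tail P(S > {threshold}): 0.000")
        ```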

      • Richard S Courtney

        Bart Verheggen:

        You asserted:
        “1) Estimates of climate sensitivity are based on both observations and models (and they converge to more or less the same ballpark figure as already laid out in the Charney report)”.

        And I pointed out that your assertion is plain wrong: measurement and estimates differ by more than an order of magnitude.

        Your response is to say:
        “Sound estimates of climate sensitivity converge to the same (admittedly rather wide) range.”

        Say what!?
        All estimates of anything “converge to the same (admittedly rather wide) range”. Your reply is completely meaningless.

        The important point is that you were plain wrong when you asserted that
        “Estimates of climate sensitivity are based on both observations and models (and they converge to more or less the same ballpark figure)”
        unless your “ballpark” is so large that it can contain an infinite number of balls.

        The papers you cite in support of a climate sensitivity above 0.6 deg.C for a doubling of CO2 provide theoretical derivations. If they indicate a climate sensitivity greater than the obtained empirical results, then that proves the theories they use are wrong.

        It is a basic principle of science that indications of a theory which fail to match empirical data disprove the theory.
        And this disproof remains true unless and until an error in the empirical data is determined.

        In the case of the paleo studies (derived from ice cores) the erroneous theory is an assumption that changes in CO2 drove the changes in mean global temperature. But this assumption cannot be true because the changes to atmospheric CO2 concentrations follow the changes to mean global temperature: at most the CO2 acted as a feedback to enhance warming from other cause(s).

        And it is pure pseudoscience to accept indications of a theory as having any validity when those indications disagree with empirical data.

        Importantly, nobody has found any fault in the paper by Idso snr but you assert:
        “Some major flaws in the studies you cite have been found (in the literature, at the same blog as just mentioned, at RC, and elsewhere).”

        You must be really desperate if you have to fall back on the RC propaganda blog for support. That attempted refutation by “Gavin” fails to falsify Idso, even according to his own words. “Gavin” says:
        “As an aside, there have been a few claims (notably from Steve Milloy or Sherwood Idso) that you can estimate climate sensitivity by dividing the change in temperature due to the greenhouse effect by the downwelling longwave radiation. This is not even close, as you can see by working it through here.”

        This is typical misrepresentation by “Gavin”. The paper by Idso reports 8 different “natural experiments”, each of which assesses changes in surface temperature resulting from changes in radiative forcing. Idso’s work does not consider “downwelling longwave radiation”, which most of his “natural experiments” are unable to measure.

        And “Gavin” says this about radiative forcing:

        “Point 2: Radiative forcing – whether from the sun or from greenhouse gases – has pretty much the same effect regardless of how it comes about.”

        So, according to “Gavin” (and his simplistic model), measuring the temperature change per change in radiative forcing (as Idso did in 8 different ways) does provide
        (a) the climate sensitivity
        and
        (b) the time constant involved (which Idso determines to be ~90 days).

        “Gavin” then provides a simplistic model which he claims indicates a different result than Idso’s 8 different experiments each provides. And – on the basis of that – “Gavin” claims he has disproved Idso’s measurements.

        This, of course, ignores the fact that when the indications of a model fail to match empirical data, that is a disproof of the model (n.b. it is NOT a disproof of the measurements). Simply, “Gavin” does not refute Idso but provides a demonstration of pure pseudoscience.

        In addition, Idso’s results are consistent with the temperature effects of changes in radiative forcing (~7 W/m^2, i.e. 3x the CO2 forcing change over the last century) resulting from albedo changes induced by variations in cloud cover over the last 25 years, whereas the high sensitivity “Gavin” suggests would have resulted in wild temperature swings.

        Good records of cloud cover are very short because cloud cover is measured by satellites that were not launched until the mid 1980s. But it appears that cloudiness decreased markedly between the mid 1980s and late 1990s. Over that period, the Earth’s reflectivity decreased to the extent that, if solar irradiance were constant, the reduced cloudiness provided an extra surface warming of 5 to 10 W/sq metre. This is a lot of warming: between two and four times the entire warming estimated to have been caused by the build-up of human-caused greenhouse gases in the atmosphere since the industrial revolution. (The UN’s Intergovernmental Panel on Climate Change, IPCC, also adopts high values of climate sensitivity and says that, since the industrial revolution, the build-up of human-caused greenhouse gases in the atmosphere has had a warming effect of only 2.4 W/sq metre.)

        Richard

      • Richard,

        Did you read the link I provided?

        Can you quantitatively and based on established physical principles explain the magnitude of temperature changes in the deep past?

  52. Karl Hallowell

    “In order to protect the environment, the precautionary approach shall be widely applied by States according to their capabilities. Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation.”

    This statement clearly explains the idea that scientific uncertainty should not preclude preventative measures to protect the environment. However, the precautionary principle implies the need for a minimal threshold of scientific certainty or plausibility before undertaking precautions.

    Another implication is that costs have to be taken into account as well. Sounds to me like we should just do proper risk analysis instead. My view is that the Precautionary Principle is not self-consistent: if one applies it as a meta-rule to itself, one has to accept that it produces decisions that are to some degree suboptimal. This is environmentally relevant because excess effort expended on a fix with a lower benefit-to-cost ratio means fewer resources for more effective environmental efforts.

    One of the key problems is that the Precautionary Principle doesn’t give you a way to rank fixes or to understand the opportunity costs of a choice. If you think you have a problem and the benefits are great enough and the cost low enough (to whom? the subjectivity of the decision is another problem, which I’ll ignore here), then you have to attempt the fix, even if that effort could have been better put to some other problem.

    Given that human activity does at times cause serious, perhaps even irreparable harm; that there is uncertainty; that mitigation or cure strategies can be very expensive; and that one has limited resources to throw at these problems, one gains little by simplifying the decision-making process, and possibly incurs serious, irreparable environmental harm through poor risk management and bad prioritization of environmental strategy. There’s also the risk of blowback (political or populist obstruction to further environmental efforts) from unpopular attempts to fix environmental harm that turn out to have greater costs than first expected.

    Hence, the Precautionary Principle cannot meet its own standard.

  53. Alexander Harvey

    I have read some of the Weitzmann (2009) paper and found it interesting.

    I am glad he mentioned learning whilst doing, which is not core to his argument.

    I wonder about doing to learn: basically, constructing an experiment by cleaning up emissions that should have a big impact, or whose impact is the least well quantified. Aerosols are an obvious choice, as they are either having little effect or a large masking effect. Also, they have short lifetimes. Not cheap to do, but perhaps not that expensive, and good for the lungs.

    Alex

  54. Dear Dr Curry,

    I have been following your blog since its inception. I understand your purpose to be an attempt to build bridges between disparate points of view regarding the science of climate change and to clarify the importance of this science to policy decisions. As I understand it, you are focusing specifically on the uncertainty regarding the science and questioning whether it is wise to make policy decisions in the face of such uncertainty. The specific policy decision in question being the aim to reduce CO2 emissions by moving away from petrocarbon usage.

    The issue is often framed as a choice between reduction of CO2 emissions or “doing nothing”. But we are not “doing nothing”. That is a misrepresentation of what is actually happening, which is that we are permitting CO2 emissions to rise. This in itself is a policy decision. So my question is: how much uncertainty is there behind this decision?
    Is there a lesser degree of uncertainty in deciding to build more coal-fired power stations, put more cars on the roads and continue with BAU than there is in deciding to stabilize or reduce emissions?

    By your own reasoning we should have, as I understand it, unambiguous grounds on which to base decisions. Is there clear evidence to show that rising CO2 emissions will be beneficial to humankind, and will not endanger us by disrupting our climate? As “doing nothing” is not a logical option (we either attempt to stabilize emissions or we don’t), I can only assume that the decision to keep emissions rising through continued fossil fuel usage is based on a fairly low risk assessment, a high degree of certainty in the hypothesis that the changes we are experiencing are due to natural variability, and a rejection of the hypothesis that increased atmospheric CO2 concentrations lead to an increase in mean global temperature. How much confidence is there in the science underlying the decision to continue business as usual?

    It would appear that the ‘uncertainty monster’ cuts both ways. If the answer is that we really do not know ‘if’ or ‘how’ or ‘by how much’ increasing atmospheric CO2 concentrations will affect global temperature, then it appears that we are taking a huge risk by allowing them to rise unimpeded. When faced with such uncertainty, surely the wisest decision is to take steps to stabilize our emissions until the science becomes clearer and we know with greater precision what the consequences of our actions will be.

    • Sarah, this is the issue addressed in Part II. I am not saying at all that we need unambiguous grounds on which to base decisions. What I’m saying is that there are robust decision-making frameworks where uncertainty is information that is incorporated into the decision-making process. The precautionary-principle-based stabilization targets don’t qualify in this regard, which is why the decision is framed as an either/or, between a rock and a hard place. Part II should be ready Sunday night.

  55. One thing that occurs to me occasionally. We might talk about climate uncertainty, what about economic uncertainties?

    The presumption underlying many commentators’ views is that economic catastrophe is absolutely guaranteed under various/any/all mitigation proposals. How certain of that are we, really? Do we have a range of uncertainties to work with here, or is it a given?

    I’m not overly impressed with many economic theories and I’m even less impressed with economic forecasting of the last 100 years. Does anyone have any good stuff to back up these catastrophe claims? Who successfully forecast the GFC?

    afaik, the very few who did predict the disastrous outcome relied on economic history rather than economic theory or modelling, let alone econometrics. Do we have any good work done on this?

  56. Sorry, this is non-helpful, but this approach to uncertainty gives me the heebie-jeebies. An old professor of mine gave me some sound advice many years ago:

    “DO THE F&@%!/G SCIENCE!!!”

  57. I’m in total agreement with your assessments of climate modeling’s impact on policy going forward. Indeed, I think your ten year time frame is optimistic.

    While I don’t entirely agree with the rest of this piece, it doesn’t appear to me to be anywhere near as problematic as your recent forays into the mathematics of uncertainty. I suggest you abandon the “Italian Flag” business entirely and build on the basis of this article and, if you get quantitative, to begin with more conventional approaches to propagating uncertainty.

    I elaborate somewhat on my blog.

  58. Judith Curry, in “Conclusions” above : “.. a scenario with about 3C sensitivity that is unlikely to budge much with the current modeling framework ..” …
    Richard S Courtney, citing 4 papers : “.. several empirical studies indicate that climate sensitivity is much lower and is ~0.4 deg.C for a doubling of atmospheric CO2” …
    … and of course there have been many other statements and opinions on the value of “climate sensitivity”.

    Now, as we are talking about uncertainty here : what if there is no such number?

    In the IPCC handling of climate sensitivity (defined as “the equilibrium change in the annual mean global surface temperature following a doubling of the atmospheric equivalent carbon dioxide concentration.”), “feedbacks” are added to the effect of CO2 itself (see IPCC report AR4 8.6.2.3 page 633: “Using feedback parameters from Figure 8.14, it can be estimated that in the presence of water vapour, lapse rate and surface albedo feedbacks, but in the absence of cloud feedbacks, current GCMs would predict a climate sensitivity (±1 standard deviation) of roughly 1.9°C ± 0.15°C (ignoring spread from radiative forcing differences). The mean and standard deviation of climate sensitivity estimates derived from current GCMs are larger (3.2°C ± 0.7°C) essentially because the GCMs all predict a positive cloud feedback (Figure 8.14) but strongly disagree on its magnitude.”)

    The “feedbacks” which the IPCC report adds in are water vapour and clouds. They make it perfectly plain that in the case of clouds the mechanism for this “feedback” is not known (e.g. see IPCC report AR4 Box TS.8: “parametrizations are still used to represent unresolved physical processes such as the formation of clouds and precipitation, ..”, and 1.5.2: “They [Senior and Mitchell (1993)] produced global average surface temperature changes (due to doubled atmospheric CO2 concentration) ranging from 1.9°C to 5.4°C, simply by altering the way that cloud radiative properties were treated in the model. It is somewhat unsettling that the results of a complex climate model can be so drastically altered by substituting one reasonable cloud parametrization for another, thereby approximately replicating the overall intermodel range of sensitivities.”)

    Climate sensitivity is defined as having a logarithmic relationship to atmospheric CO2 concentration. But what if the “feedbacks” do not have a linear relationship to atmospheric CO2 concentration? Because we don’t know what the relationship is, we can’t know whether it is linear. And if it isn’t linear, then climate sensitivity doesn’t exist as a single number. If the “feedback” is chaotic, or highly dependent on unrelated factors, then climate sensitivity (as per the IPCC) is essentially not quantifiable at all.
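
    A minimal sketch of one way the single-number picture can mislead, using the standard linear-feedback form S = S0/(1-f); the no-feedback response S0 = 1.2 °C is a commonly cited value that I am assuming here, and the f values are illustrative only.

    ```python
    S0 = 1.2                        # C per doubling, no-feedback response (assumed)

    for f in (0.4, 0.5, 0.6, 0.7):  # equal steps in the total feedback factor...
        print(f"f = {f:.1f}  ->  S = {S0 / (1 - f):.1f} C")
    # ...give ever larger steps in S (2.0, 2.4, 3.0, 4.0 C). If f is itself
    # uncertain, state-dependent or non-linear, as argued above for clouds,
    # then no single S summarizes the system.
    ```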

    Given that
    (a) the major “feedback” cited by the IPCC is clouds, and
    (b) very little is known about cloud formation, and
    (c) the passages I quote from the IPCC report above, and
    (d) the multiple statements of uncertainty about clouds in the IPCC report, and
    (e) we don’t even know whether clouds react to AGW at all, and
    (f) we don’t know what (if any) the other “feedbacks” are,
    then I would suggest that it is highly likely that the “feedbacks”, if they exist at all, are non-linear, chaotic and/or highly dependent on unrelated factors. [rationale: if the relationship were linear, then the IPCC probably wouldn’t have found such a big range, and it would probably be quantified by now.]

    If this is in fact the case, then it will prove impossible to quantify climate sensitivity as a number.

    How’s that for a nice bit of uncertainty?

    • We could use more thoughtful analysis like this.

      Of course, then people will start to go out and try to think of ways to respond to it meaningfully.

    • Here, Mike exposes a foundational error of modern climatology. In the following remarks, I seek to describe this error in mathematical and logical terms.

      Though climatologists commonly assume the existence of a climate sensitivity in nature, this sensitivity is not an observable feature of the real world. It is, instead, an observable feature of a kind of model. To confuse this feature of a model with a feature of the real world is the error. The confusion of the two has the effect of making this feature of the model sound as though it were true of nature.

      The spatially and temporally averaged temperature at Earth’s surface is an observable feature of the real world. The spatially and temporally averaged atmospheric CO2 concentration is an observable feature of the real world. There is, in mathematical terms, a relation from the CO2 concentration to the temperature but this relation is not necessarily a functional relation. To assume the existence of a climate sensitivity in nature is to make the unjustified assumption that this relation is functional.

    • Richard S Courtney

      Mike Jonas :

      Concerning climate sensitivity you cite a post I made and you suggest:
      “Now, as we are talking about uncertainty here : what if there is no such number?”

      I, too, have considered that possibility, and from a more fundamental basis. I stated my views on this uncertainty on another thread of this blog at
      http://judithcurry.com/2010/10/19/overconfidence-in-ipccs-detection-and-attribution-part-ii/
      in a comment posted at
      October 23, 2010 at 5:35 am

      To save you needing to find it, I copy it here.

      Richard

      Dr Curry:

      Sincere thanks for your two articles on ‘Overconfidence in IPCC’s detection and attribution’.
      I notice you say you have a Part 3 prepared and perhaps the comment I now provide would be more appropriate when that appears. But I now provide my comment in case it assists your Part 3.

      I write to address the underlying assumption in all the detection and attribution studies. None of these studies can have any confidence until their main underlying assumption is validated.

      The climate models are based on several assumptions that may not be correct. The basic assumption used in the models is that change to climate is driven by change to radiative forcing. And it is very important to recognise that this assumption has not been demonstrated to be correct.

      Indeed, it is quite possible that there is no force or process causing climate to vary. I explain this as follows.

      The climate system is seeking an equilibrium that it never achieves. The Earth obtains radiant energy from the Sun and radiates that energy back to space. The energy input to the system (from the Sun) may be constant (although some doubt that), but the rotation of the Earth and its orbit around the Sun ensure that the energy input/output is never in perfect equilibrium.

      The climate system is an intermediary in the process of returning (most of) the energy to space (some energy is radiated from the Earth’s surface back to space). And the Northern and Southern hemispheres have different coverage by oceans. Therefore, as the year progresses the modulation of the energy input/output of the system varies. Hence, the system is always seeking equilibrium but never achieves it.

      Such a varying system could be expected to exhibit oscillatory behaviour. And, it does. Mean global temperature rises by 3.8 deg.C from June to January each year and falls by 3.8 deg.C each year.

      Importantly, the length of some oscillations could be harmonic effects which, therefore, have periodicity of several years. Of course, such harmonic oscillation would be a process that – at least in principle – is capable of evaluation.

      However, there may be no process because the climate is a chaotic system. Therefore, the observed oscillations (ENSO, NAO, etc.) may not be harmonic effects but could be observation of the system seeking its chaotic attractor(s) in response to its seeking equilibrium in a changing situation.

      Very importantly, there is an apparent ~900 year oscillation that caused the Roman Warm Period (RWP), then the Dark Age Cool Period (DACP), then the Medieval Warm Period (MWP), then the Little Ice Age (LIA), and the present warm period (PWP).

      All the observed rise of global temperature in the twentieth century could be recovery from the LIA that is similar to the recovery from the DACP to the MWP.

      And the ~900 year oscillation could also be the chaotic climate system seeking its attractor(s). If so, then all global climate models and ‘attribution studies’ utilized by IPCC and CCSP are based on the false premise that there is a force or process causing climate to change when no such force or process exists.

      So, the assumption that climate change is driven by variations in radiative forcing needs to be substantiated for any confidence to be placed in the detection and attribution studies of the causes of climate change.

      Richard

  59. Re Bart and climate sensitivity: the IPCC values for climate sensitivity assume 1) that aerosols were blocking the “true” response, 2) that most of the warming post-1978 was AGW, and 3) that water vapor feedback is positive. But what if 1) aerosols did not have much effect (as evidenced by the models all using different aerosol forcings), 2) there is a natural 60 yr cycle accounting for much of the post-1978 warming, and 3) water vapor feedback (via clouds) is negative?

    • Then you have to figure out how to falsify all the other pieces of evidence gathered from observation of different time periods, all showing sensitivity in the same range.

      • In a previous post I responded to, you complained about being compared to dogma. Well, you are coming very close. That link you provided above is not evidence; it was based on climate modeling. Models are not evidence. Unless, that is, climate models are the Bible of the AGW faith. That makes it dogma.

      • Richard, your conceptual block here is a bit amusing as you go on about dogma. Had you read the study, you’d have seen that it includes “observed surface and ocean warming over the twentieth century,” “Satellite data for the radiation budget,” and “palaeoclimatic evidence” from both the Maunder Minimum period and from the remote past.

        The fact is, you didn’t read the study. Your ideological opposition to climate models blinds you to the fact that the understanding of climate change is not solely dependent on models. Your inability to tolerate any evidence to the contrary indicates a rigid, fixed mindset that can perhaps only be categorized as dogma.

      • Richard S Courtney

        PDA:

        You have stated two falsehoods when you say:
        “Richard, your conceptual block here is a bit amusing as you go on about dogma.”

        Firstly, I do not have a “conceptual block”.

        Secondly, I have said nothing about “dogma”. Cite it or apologise for the blatantly untrue assertion.

        I have only posted clear scientific information pertinent to the issues under discussion. It seems that your beliefs are preventing you from considering the science and you are projecting your anti-science attitudes on me.

        Richard

      • What I saw in the study is normal cyclic variation. I see no evidence that the cause of these changes is our emissions of CO2. Interesting that you are now resorting to claiming that I’m incapable of understanding. That is a common tactic of the faithful, who dogmatically must hold on to a position at all costs, unwilling to consider alternative explanations.

        Note to RC, he was commenting to me, my name is also Richard.

      • Are you now willing to acknowledge that your previous statement “it was based on climate modeling” is false?

      • Not to mention you’d have a very hard time explaining the amplitude of past climate changes.

      • How so? What does amplitude have anything to do with the cause being from CO2? You have evidence to back up that CO2 is causing any amplitude to be “abnormal”? How do you know the current amplitude is “abnormal”?

  60. Alexander Harvey

    There is a good guest post by Chris Colose on RealClimate, well worth a look.

    Perhaps more important is to read one of the underlying papers (Zaliapin and Ghil, 2010). It is a treatment of the dos and don’ts of handling feedback factors in non-linear systems.

    This has implications when considering the first level fat tail in the Weitzmann (2009) paper.

    Alex

    • Alexander Harvey

      On reflection it seems that correct handling of the non-linearity problem leads to an impossibility conjecture.

      That is: even if one could determine the linear parts of the various feedbacks dF/dT, one still could not calculate, or extrapolate from experiment, the stabilisation temperature in advance of a close approach to it. For that requires knowledge of the non-linear components (higher-order derivatives), which cannot confidently be determined experimentally from a noisy (stochastic) signal before the stabilisation temperature has been closely approached.
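
      A minimal sketch of that conjecture, with all numbers invented for illustration: fit the response over a small temperature excursion, then extrapolate; the unobserved quadratic term moves the stabilisation point well away from the linear prediction.

      ```python
      import random

      a, b = 1.0, -0.08    # assumed "true" response: dF = a*dT + b*dT**2
      random.seed(1)

      # "Observe" the system only over a small excursion (0 to 2 K), with noise.
      obs = [(dT, a * dT + b * dT**2 + random.gauss(0, 0.05))
             for dT in [i * 0.2 for i in range(11)]]

      # Best linear feedback estimate: least-squares slope through the origin.
      slope = sum(dT * dF for dT, dF in obs) / sum(dT**2 for dT, _ in obs)

      dF_applied = 3.0     # W/m2 forcing step, say
      print(f"linear extrapolation: dT = {dF_applied / slope:.1f} K")   # ~3.4 K
      # The true response dT - 0.08*dT**2 = 3.0 solves to dT = 5.0 K, and
      # nothing in the 0-2 K data constrains the quadratic coefficient b.
      ```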

      Alex

    • One thing that I didn’t see clearly mentioned was where the linear model for feedback breaks down. Lubos Motl had a post on this: at 192 K, the quadratic term in Stefan’s Law becomes equal to the linear term. Therefore, the main impact of the Zaliapin/Ghil paper, in my opinion, is that there is no infinite temperature increase of the form 1/(1-f) as predicted by a linear feedback; however, when the temperature rise exceeds 20 K, you can no longer use the linear approximation. If you burn all the fossil fuel available, the most temperature is expected to rise is about 5-6 °C (without counting the methane release etc.), if I remember Richard Alley’s talk well. Therefore, the Zaliapin/Ghil result seems to be mainly academic, in my opinion.

      • Alexander Harvey

        Yes, it is academic. I would not rely on their approximations, but it is important to realise that the non-linear terms, not only in the main term but also in the feedbacks, decouple the climate sensitivity (in terms of dT/dF) from the stabilisation temperature. Even if one could determine the climate sensitivity (dT/dF) at current temperatures, it does not inform us as to the stabilisation temperature. These two are often coupled as in deltaT = deltaF*(dT/dF), which obviously does not hold in a non-linear environment. A number of papers do try to estimate (dT/dF) at current temperatures and use that to imply a stabilisation temperature, and that has to be a questionable step.

        Alex

      • I’m not sure what approximations you were referring to, but I may not have been clear that the 20 K is my estimate (i.e., a 10% error value), based on a 192 K increase from the current mean temperature, for where the linear approximation yields results that do not have significant error. Yes, it is nonlinear, but the question is what the impact of using a linear analysis is. For a term of the form (1+a)^4, the quadratic term equals the linear term at a = 2/3. Based on a current mean temperature of 288 K, the equivalent result is 192 K. What you think is a questionable step does not appear to be so, practically speaking.
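
        A minimal sketch of that expansion (the loop over climate-sized excursions is my addition, to show the size of the error in practice):

        ```python
        T0 = 288.0                         # K, assumed current mean temperature

        print(f"quadratic = linear at dT = {(4 / 6) * T0:.0f} K")   # 192 K

        for dT in (2, 5, 10, 20):          # K, excursions of practical interest
            a = dT / T0
            exact = (1 + a) ** 4 - 1       # true fractional change in T^4
            linear = 4 * a                 # linearized change
            print(f"dT = {dT:2d} K: linear approx off by {(exact - linear) / exact:.1%}")
        ```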

      • Alexander Harvey

        We do not know the second derivatives of the feedback terms. If any of them could be positive or negative, then we cannot determine the stabilisation temperature without that knowledge; just knowing Stefan’s Law and its non-linearity is not enough.

        They do supply some quantifications for both albedo and the greyness factor, but these are simplifications; other terms, such as clouds and land surface, are not, I think, included. Without knowledge of both the linear part and the higher-order derivatives, one cannot determine the stabilisation temperature.

        It is not his values that are of interest so much as the principle that these additional terms are important. Most important is that they are not susceptible to determination from observations without letting the temperature vary by a sensible proportion of the distance to the stabilisation temperature. That being the case, the only way to verify the stabilisation temperature is to approach it. This would apply to model projections as well: without knowing the non-linearities, short-run observations would be a poor guide to long-term behaviour.

        Alex

      • Alexander Harvey

        Also, my post was with respect to the Weitzmann (2009) paper, which deals with temperature rises of 10 °C and 20 °C due to the “fat tail”. The non-linearity issue does affect whether an assumed prior Gaussian distribution in the feedback terms gives rise to a fat tail.

        Alex

      • If Weitzmann based it on the Roe/Baker paper, note that even conventional climatology thinks that paper was flawed. James Annan, who if I understand correctly constrains the upper end of sensitivity estimates more than Judy Curry does, has posted quite a bit on why he thinks Weitzmann’s analysis is flawed.

        http://julesandjames.blogspot.com/search?q=weitzman

  61. Dr. Curry,

    I like your approach and I also like the precautionary principle, although I think that it has a much simpler interpretation than what you are suggesting.

    You apply the precautionary principle to the effects of AGW, while logically you should apply it to the world economy, and specifically to the US economy.

    The most obvious uncertainty is about the existence of AGW itself! Once there supposedly was a consensus, and if there is a consensus there is no uncertainty. But no longer: now there is uncertainty. This is the natural condition in science; there is always some kind of uncertainty, awaiting the next Einstein to discover a new theory and prove it scientifically.

    Many people, scientists and lay-people alike, have examined the claims of AGW and reached the conclusion that they are completely unfounded; however, there are others who claim the opposite, so there is uncertainty.

    Quoting from the precautionary principle: “The precautionary principle states that if an action or policy has a suspected risk of causing harm to the public or to the environment, in the absence of scientific consensus that the action or policy is harmful, the burden of proof that it is not harmful falls on those taking the action”.

    So the IPCC and the EPA have the burden of proof, and so far they have miserably failed to produce any credible scientific proof (I don’t count models, which fail each and every test).

    So I agree with you that we should do only what will not harm us or the economy, which includes research into other sources of energy and the use of nuclear energy, while using fossil fuels as much as needed to prosper and bring relief to people in need.

  62. Richard S Courtney

    Bart Verheggen:

    Yes, I did read the link you provided. What point do you want to make that I have not demolished in my responses to you?

    And, no, I cannot “explain the magnitude of temperature changes in the deep past”. Nobody can do that (although some make demonstrably spurious claims that they can).

    But I can demonstrate, and I have demonstrated in my posts here, that the measured values of climate sensitivity (obtained by several workers, each using different methods) in the actual climate that now exists on Earth are ~0.4 deg.C for a doubling of atmospheric CO2 concentration.

    Why do you want to change the subject from climate sensitivity to paleo studies?
    You raised the issue of the magnitude of climate sensitivity. Now, when I have demonstrated that your assertion is plain wrong, you want to change the subject.

    Richard

  63. Speaking of credibility:
    When scientists (within the IPCC or outside it) promote the goal of containing or reducing CO2 emissions, they venture not only into a political or policy debate but also into an engineering debate. Is it possible, at all, to significantly reduce CO2 emissions?
    When we assess the “solutions” promoted (wind and solar) they are risibly, blatantly fantastic: unrealistic, impossible, un-physical.
    So, even if the abstract goal of diversifying energy sources to non-carbon sources is laudable, it is, at this moment, under the proposed time frames, utterly unrealistic.
    That scientists (some of them) have such a poor grasp of physical, engineering, and quantitative issues doesn’t inspire confidence.

  64. I’d like to visit the topic of Climate Disruption Decision Making as a Game Theory problem.

    Whatever else happens in the climate, the social problem of how human societies behave is integral to the outcome, barring the trivial case of a climate completely insensitive to anthropogenic forcings.

    We’ve seen some uses of Game Theory in the Precautionary Principle discussions, but there seems to be so much overloading of the term outside of its ordinary Game Theory sense as to convince me that a thorough review of how Game Theory applies to Climate Disruption is needed.

    Moreover, stopping at the Precautionary Principle in discussions of Decision Making is a bit like stopping at learning how to apply the brakes in driving lessons. (Not that many parents of teenagers don’t feel that urge.)

    Further, there are not two, but at least five, candidate scenarios of Climate Disruption that should be discussed.

    0. Nondisruption. An insensitive atmosphere will not express important disruption due to anthropogenic influences before these influences reach some limit and end on their own;
    1. Thermomechanical Disruption. The proposition that the Greenhouse Effect of GHGs in the atmosphere will cause thermomechanical energy rise to express important climate disruption.
    2. Acidosis. The proposition that an acidified biosphere will, independent or combined with other effects, express important pH changes.
    3. Botanical. The proposition that increased CO2 levels will shift plant activity due to respiration to the extent of important disruption.
    4. Zoological. The proposition that increased CO2 levels will shift animal biological activity due to respiration to the extent of important disruption.

    For “Case 0” to be true, each of the other cases must be false.

    I’m unaware of sufficiently detailed work to demonstrate which of cases 1-4 might be dominant or expected to be earliest to arrive.

    “Case 1” is interesting in that it is the most complex and difficult to prove, involves the most chaotic considerations and the most variables, and has the most significant feedback components, and yet for all these ‘administrative concerns’ it is still the favorite topic of debate.

    What ‘important disruption’ means is variable, but would seem to depend on the cost of avoidance/mitigation or adaption.

    The interesting CO2 range of Case 0 is open-ended, or immaterial.

    I’ve seen suggestions for Case 1 of a return to the “X baseline year” level of CO2, so back-of-the-envelope the most interesting range is 280-350 ppm, or thereabouts, with the implication that each doubling of CO2 concentration (do we mean 560 ppm, 1120 ppm, 2240 ppm?) increments the Thermomechanical effect (+2 W/m^2 temperature rise globally, and +2 W/m^2 mechanical effects globally; +4/+4, +6/+6). At roughly 2 ppm per 3 years, we’re 240 years from the first doubling (give or take a century)?
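
    A minimal sketch of that timetable; the 390 ppm starting point (roughly today’s value) and the alternative, faster growth rate are my assumptions for comparison.

    ```python
    c_now = 390.0                             # ppm, assumed starting concentration

    for target in (560.0, 1120.0, 2240.0):    # the candidate "doublings" above
        for rate in (2 / 3, 2.0):             # ppm/yr: the comment's rate vs a faster one
            years = (target - c_now) / rate
            print(f"to {target:.0f} ppm at {rate:.2f} ppm/yr: ~{years:.0f} years")
    ```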

    Case 2 may be a far faster scenario, depending on ocean and soil sensitivity/transference of pH level. It appears we don’t need to turn the entire ocean to carbonic acid to have important effects; just the top fathom or so could be enough, depending on how slowly the solution dissipates and whether there’s a thermocline-like effect for pH. Acid soil is another kettle of fish. But at least this stuff is much easier to measure and get agreement on than temperature and wind speed. It could give us a parameter for our discussion with a higher degree of certainty than warming.

    Likewise for cases 3 & 4. Plants and animals are pretty easy to experiment on, and to anticipate which species will tend to force out which other species. Be nice to know. Shouldn’t be too costly to figure out. No hockey stick necessary. No need to worry about whether it’s ever happened before in history, either, because we mostly know what plants and animals have what value to us.

    What about it folks, anyone have some actual litmus paper and a real greenhouse? Empirical experimentation, anyone?

    • Alex Heyworth

      The issue with plants and animals is that they are already adapted to a large range of temperature, due to seasonal and diurnal changes. Similar considerations apply to rainfall levels. Further, most species have a range of potential or actual habitats. Climate change, far from causing widespread species extinction, is likely to impact only species with very limited, very specific habitats that will be significantly impacted by climate change. These species are usually vulnerable to extinction from causes other than AGW, so their long-term prospects are in any case poor.

      A more likely result is changes in species populations. Some species will thrive and increase in population, others will not do so well.

      • Alex Heyworth

        I like your points, and would like to expand on my view of them one by one, but I’d also like to clarify: some effects on plants and animals would be covered by Case 1 (thermomechanics), and some by other cases independent of Case 1. This makes a significant difference to the statistical probabilities that are assigned to the problem, and to the way Uncertainty must be handled.

        p(Case 0) = (1 – p(Case 1)) * (1 – p(Case 2)) * (1 – p(Case 3)) * (1 – p(Case 4)), for at least four independent cases.

        If each of p1..p4 were only 0.80, p0 would be no higher than 0.2^4 = 0.0016!
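
        A minimal sketch of that calculation (independence of the cases is the assumption carried over from above):

        ```python
        def p_nondisruption(probs):
            """Probability that none of the disruption cases occurs, assuming independence."""
            p0 = 1.0
            for p in probs:
                p0 *= 1.0 - p
            return p0

        print(p_nondisruption([0.8, 0.8, 0.8, 0.8]))   # 0.0016
        ```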

        We also are interested in net present value, as a cost incurred 240 years in the future is very different from one incurred today.

        A cost incurred as a liability compounded daily over the next 240 years, different too.

        And Uncertainty? If a game with one Precautionary Principle is hard to cope with, imagine what happens as we realize four Precautionary Principles in competition.

        “The issue with plants and animals is that they are already adapted to a large range of temperature, due to seasonal and diurnal changes.”

        Which would be true of weather, and of plant and animal gross survival, if we were constrained to only those topics.

        However, we’re also interested in climate and habitat destruction, or rather at least in the cost of habitat destruction.

        If you change nothing about a habitat except any one key aspect of climate, then over the long term I expect plant and animal populations will react exactly as if stressed by any other type of habitat destruction.

        Whether this would amount to ‘important’ habitat destruction is yet to be demonstrated, but surely must be easier to demonstrate than if every weather station on the planet uses the right kind of paint.

        Life is adapted to a range of climate, but is most stable where conditions stay within the range the life adapted to.

        If the climate is unstable, the population is unstable.

        “Similar considerations apply to rainfall levels.”

        And to CO2 levels, yes.

        “Further, most species have a range of potential or actual habitats.”

        In botany, this is particularly true of what we laymen call ‘weeds’; in zoology, ‘vermin’.

        “Climate change, far from causing widespread species extinction, is likely to impact only species with very limited, very specific habitats that will be significantly impacted by climate change.”

        Species with very limited, very specific habitats, such as many of the domestic plants and animals at the heart of agriculture, forestry, animal husbandry, fisheries, and links in the food chain?

        “These species are usually vulnerable to extinction from causes other than AGW, so their long-term prospects are in any case poor.”

        Or, in the alternative, aggressively invasive species will find unstable climate to their advantage and tend to force out productive or desirable species.

        This would be a very different kind of cost than habitat destruction.

        “A more likely result is changes in species populations. Some species will thrive and increase in population, others will not do so well.”

        Right.

        Which is where we’re at in our knowledge now.

        What we do not yet have and need to know is what the cost of this shift in these areas will be, to support decisions one way or the other.

      • Alex Heyworth

        Yes, I agree with what you’re saying. Unfortunately, modeling in many of the areas you are concerned about is in an even worse state (and has a much worse reputation) than is the case with climate science. Have a look at Pilkey and Pilkey-Jarvis, “Useless Arithmetic: Why Environmental Scientists Can’t Predict the Future”. It makes depressing reading.

      • Alex Heyworth

        You are right again, as you so often are.

        My argument is that the areas of Case 2, 3 and 4 are much more amenable to improvement than Case 1.

        Empirical experimentation is said to be superior to experimentation by models (which is a fine bit of sophistry in itself; what else is any experiment but a model?), and it’s far easier to make specific statements that will scale globally through empirical experiments about peak botanical or peak zoological effects, or especially about simple pH, than about thermomechanical ones.

        At the very least, Cases 2 through 4 would dismantle the (already invalid but commonly cited) objection to climate models as a reliable form of experimental study, since litmus paper is not a computer; and, for example, a greenhouse full of tomato plants and kudzu at 500 ppm CO2, day and night, can in a single season show more about benefits and costs than all the history of hurricane studies yet has.

        Also, we very much do not know the relative rates of dominance of each of Cases 1 through 4, or whether there is some more dominant unlisted Case 5 that would be influential sooner still. We certainly have the means to discover this for Cases 2 through 4, possibly making questions about Case 1 moot.

        Is the Pacific at the edge of some significant pH precipice before 240 years pass, or 480 years, or never?

        Are larger swaths of the evergreen forests of the Northern Hemisphere due to be reduced to so much more sawdust from CO2 stress by the same dates? Sooner? Later?

        Is it remotely possible that the projected figure of 426 ppm CO2 as the global baseline might be significantly adverse to susceptible humans? And if not humans, then perhaps the poultry that so much of the animal protein of the western diet comprises. Or the mice, voles and like small mammals critical to the food chain in the wild?

        These are questions readily susceptible to experimental investigation, and which bypass many of the objections to case 1 studies.

        Also, there is already a very large body of reliable work to draw on in most of these areas, derived for other purposes, and so immune to allegations of bias or a stacked deck.
