Smith and Stern on uncertainty in science and its role in climate policy

by Judith Curry

Risk assessment requires grappling with probability and ambiguity (uncertainty in the Knightian sense) and assessing the ethical, logical, philosophical and economic underpinnings of whether a target of ‘50 per cent chance of remaining under +2°C’ is either ‘right’ or ‘safe’. How do we better stimulate advances in the difficult analytical and philosophical questions while maintaining foundational scientific work that advances our understanding of the phenomena, and provide immediate help with decisions that must be made now?

The paper by Lenny Smith and Nicholas Stern, ‘Uncertainty in science and its role in climate policy’, is published in the Phil. Trans. Roy. Soc. special issue on handling uncertainty in science. The complete manuscript is available here [link]. The whole paper is full of insights and well worth reading.

I reproduce here some of the summary text presented in the Concluding remarks.

Sound policy-making embraces the causal chain connecting actions by people to impacts on people. Many varieties of uncertainty are encountered along this chain, including: imprecision, ambiguity, intractability and indeterminacy. Science regularly handles the first with probability theory; ambiguity and intractability are more often used by scientists to guide the advancement of science rather than being handled within science explicitly. A better understanding by scientists of the roles of uncertainty within policy-making may improve the support science offers policy-making. In particular, an improved understanding of which scientific uncertainties pose the greatest challenges to policy-making when projected along the entire causal chain considered by policy, and informed scientific speculation on the likelihood of reducing those specific uncertainties in the near future, would be of immediate value. Some of these roles have been illustrated in the context of a particular example: selecting a stabilization target for greenhouse gas concentration.

Handling ambiguity in science, and the communication of insights from science, has been discussed. The value of scientific insight to policy-making, particularly in cases where state-of-the-art models are not empirically adequate, is stressed. Specifying the robustness of insights, and ideally quantifying how quickly model simulations are likely to become mis-informative as one moves further into the future, are each of significant value to sound policy-making. No scientific extrapolation is complete without a quantitative estimate of the chance of its own irrelevance. Communicating to policy-makers the level of confidence scientists have that their model-based probabilities are not mis-informative is at least as important as communicating the model-based probabilities themselves. Engagement of scientists in the policy-making process, not merely by presenting the outputs of models but by explaining the insights from science, can significantly improve the formation of policy. This is especially true in climate policy, where the scale of the risk is great even if we cannot provide precise probabilities of specific events, and where many plausible changes are effectively irreversible should they occur. Scientists who merely communicate results within the comfortable area of reliable theory abandon the decision stage to those who often have little engagement with the science. Sound policy-making is then hindered by the lack of sound scientific speculation on high-impact events, which we cannot currently model but may plausibly experience. Failing to engage with the question ‘What might a 6°C warmer world look like, if it were to occur?’ leaves only naive answers on the table for policy-makers to work with.

Complementary to the need for scientific engagement with the policy process is the need for more transparent communication of the limits of current models when presenting model output. Policy-makers are often told that the models ‘have improved’ and that representations of more phenomena ‘have been introduced’. Clear statements of the spatial and temporal scales at which model output is ‘likely’ to be mis-informative, and how these change between 2020, 2050, 2090 and so on, would be of great value in interpreting when the model output is useful for a particular policy purpose. Honesty here enhances credibility and thus effectiveness. Even when technically coherent, failing to lay the limits of today’s insights in plain view, as with the presentation of ‘temperature anomalies’ in summaries for policy-makers, hinders communication of large systematic model errors in today’s models, and hence the relevant level of ambiguity. The eventual realization that such figures show weaker evidence than originally thought can be blown dangerously out of proportion by the anti-science lobby, making the use of science in support of policy-making more difficult than it need be. Again, greater engagement of scientists in the policy process, openly explaining the insights of today’s science and limitations of today’s models, is a significant benefit. This may prove especially true in situations where decisions are based upon feelings as much as upon numbers.

The expected utility approach is difficult to apply when one is unable to translate possible outcomes into impacts on people. There is both imprecision and significant ambiguity in predictions of the Earth’s global mean temperature, yet even a precise value of that temperature cannot be translated into precise impacts on people. And where we have impacts on people, there remain deep ethical challenges in attaching values to outcomes. This approach also struggles with low-probability events; the vanishingly small probabilities that mathematical modelling may suggest are not actually zero should not distract policy-makers from action either. 

In this paper, it has been suggested that the communication of science to policy-makers could be aided by:

— scientific speculation on policy-relevant aspects of plausible, high-impact, scenarios even where we can neither model them realistically nor provide a precise estimate of their probability;

— specifying the spatial and temporal scales at which today’s climate models are likely to be mis-informative, and how those scales change as we look farther into the future;

— identifying weaknesses in the science that are likely to reduce the robustness of policy options;

— clarifying where adaptation to current climate is currently lacking;

— identifying observations which, in a few decades, we will wish we had taken today;

— distinguishing the types of uncertainty relevant to a given question, and providing some indication of the extent to which uncertainty will be reduced in the next few years; and

— designing model experiments to meet the needs of policy-making.

Similarly, policy-makers could encourage the engagement of scientists by:

— accepting that the current state of the science may not be able to answer questions as originally posed;

— working with scientists to determine how current knowledge with its uncertainties can best aid policy-making; and

— discrediting the view among some scientists that policy-makers are only interested in ‘one number’ which must be easy to understand, unchangeable and easily explained in less than 15 min.

The advance of science itself may be delayed by the widespread occurrence of Whitehead’s ‘fallacy of misplaced concreteness’. In areas of science, far removed from climate science, an insistence on extracting probabilities relevant in the world from the diversity of our model simulations exemplifies misplaced concreteness. Computer simulation both advances and retards science, as did the astonishing successes of the Newtonian model, Whitehead’s original target. In any event, better communication of uncertainty in today’s science, improved science education in the use of simulation modelling that values scientific understanding of the entire system, and the communication of all (known) varieties of uncertainty will both improve how science handles uncertainty in the future and improve the use of science in support of sound policy-making today. How science handles uncertainty matters.
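The paper’s point about expected utility and low-probability events is easy to make concrete. Here is a toy calculation, with numbers invented purely for illustration (they are not from Smith and Stern), showing how the ranking of two policies hinges entirely on the utility assigned to a low-probability catastrophe:

```python
# Toy expected-utility comparison; all probabilities and utilities
# below are invented for illustration, not taken from the paper.
def expected_utility(outcomes):
    """outcomes: iterable of (probability, utility) pairs summing to 1."""
    return sum(p * u for p, u in outcomes)

# Hypothetical policy A: cheap, but leaves a 1% chance of catastrophe.
policy_a = [(0.94, 100.0), (0.05, 60.0), (0.01, -10_000.0)]
# Hypothetical policy B: costly mitigation, catastrophe assumed avoided.
policy_b = [(0.99, 80.0), (0.01, 40.0)]

print(expected_utility(policy_a))  # 94.0 + 3.0 - 100.0 = -3.0
print(expected_utility(policy_b))  # 79.2 + 0.4 = 79.6
```

Change the assumed utility of the 1% outcome from -10,000 to -1,000 and policy A comes out ahead; the whole decision lives in that single, hard-to-justify number, which is exactly the ambiguity the authors describe.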

JC comment: this paper provides a wealth of provocative insights and ideas for dealing with uncertainty at the climate-policy interface. I will definitely adopt “Whitehead’s ‘fallacy of misplaced concreteness’”, instead of the vaguer ‘overconfidence’ that I have been using.


249 responses to “Smith and Stern on uncertainty in science and its role in climate policy”

  1. To understand what a 6 °C world would look like, one needs to understand what makes our world have an average global temperature of about 15 °C.

    • Just so, gbaikie. We have evidence that during interglacials (such as now) temperature peaks at about that level or a little higher. But do we know why it doesn’t go higher still? Does some negative feedback or buffer come regularly into play? Does the earth’s climate ‘mitigate’ itself? Until we have a better understanding of the natural controls of climate oscillations on a wide range of time-scales (not just glacial-interglacial), planning how we might chip in with some ‘mitigation’ of our own seems to me, well, close to futile.

      • Earth’s Climate does Mitigate itself. When the oceans are warm and the Arctic Sea Ice is melted, it snows like crazy and cools the earth and brings about the next cooling. The temperature does not go higher because it is snowing like crazy and the albedo of the earth is increasing. Earth has a stable temperature cycle. When it is warm and the Arctic Sea Ice is melted, it snows and Earth cools. When it is cool and the Arctic Ocean is frozen, there is no source of moisture, so it doesn’t snow much, and ice melts and retreats and albedo decreases and Earth warms.

      • Earth’s climate does mitigate itself. Climate changes are reversible thermodynamic transformations in which sensible heat is exchanged between the upper atmosphere and the surface. Presently, the upper atmosphere is cooling and the surface is warming. Sensible heat is presently transferred from the upper atmosphere to the surface of the earth, and surface temperature and sea level rise as a consequence. The upper atmosphere cannot continue to cool indefinitely, and there will come a time in which carbon dioxide will deposit in the upper atmosphere. When this happens, global warming ceases. This has always happened following interglacial periods, and that is why surface temperature has always had an upper limit of not more than 16.5 degrees centigrade. For more, please see Article-12, Earth’s Magic, book PDF, and other related articles posted on http://www.global-heat.net.

      • 17.5 degrees centigrade instead of 16.5

      • What will happen when you people wake up? Putting in writing that atmospheric temperature can go up by 16.5 degrees…?
        If the overall temperature warms up by ONE degree, the troposphere would expand a minimum of 300 m UPWARDS, intercept the extra appropriate coldness in 3.5 seconds, fall down and equalize in a jiffy. That extra volume of air produced by warming, if it didn’t shrink for a whole day, would redirect enough extra coldness to freeze all the tropical rivers.

        Unless one part of the atmosphere gets colder – to accommodate the extra volume of air from the part that is getting warmer – the part with the warming air will have to expand UPWARDS and instantly intercept extra coldness to cancel the extra heat. The only reason Australia was warmer for the last 3 Januaries is that there was extra coldness in Europe / the USA. The big ice age was similar, only on a much larger scale. P.S. If Europe / the USA gets colder by 8 degrees, Oceania needs to get warmer by only 0.8 degrees to equalize, because Oceania is 10 times larger. Overall temperature in the atmosphere doesn’t go up and down like a yo-yo!

        The precursor of all evil started long before the IPCC. When some failed geologist in mining discovers some proof that some small part of the planet was warmer / colder than today, and then declares that the WHOLE PLANET’S ATMOSPHERE was warmer… without taking the laws of physics into consideration, the end result is the nuclear winter, GLOBAL warming, the mother of all lies. You cannot avoid my proofs forever. The Earth promised to Heaven: there is no such thing as a permanent lie on the earth. My grandma said many times: all lies have shallow roots – they get exposed to the daylight sooner than you think… think about that, manipulators. If the WHOLE troposphere cannot warm up by one degree, how can it warm up by 3-4-5-15 degrees?! Please explain, or leave in shame!

  2. I’m not sure I follow the entire argument. But this caught my eye…

    ‘Sound policy-making embraces the causal chain connecting actions by people to impacts on people.’

    Of course policies have impacts on people. But those impacts are typically mediated in a democracy by engagement with the political process.

    I’m also wondering, who are ‘the anti-science lobby’? Are they people who may object to the post-democratic tendencies of technocrats?

    ‘…the vanishingly small probabilities that mathematical modelling may suggest are not actually zero should not distract policy-makers from action either.’

    Frankly, stuff the discussion about the clear communication of ‘science’ to ‘policy-makers’; there needs to be more effective communication about the role of the ‘politics of fear’ in the climate ‘debate’ — not that this paper’s authors seem to acknowledge that there is, or should be, a debate, except on the terms they dictate.

    • In my opinion, the greatest weakness in the Stern and Smith paper is failure to grasp the damage done to the credibility of world leaders and leaders of the scientific community by exaggerated scare stories of global climate change.

      I suspect that this has contributed in no small way to the worldwide social unease and mistrust of politicians and scientists.

      • My suspicions are that the low credibility of world leaders is what moved them to beg scientists for authority. Shame on both their houses.

    • Ben
      Excellent observation that

      “there needs to be more effective communication about the role of the ‘politics of fear’”

      How about scientists addressing the politics of hope?

      Smith et al. note

      “— scientific speculation on policy-relevant aspects of plausible, high-impact, scenarios even where we can neither model them realistically nor provide a precise estimate of their probability;”

      Fig. 2 lists

      marginal cost of impacts (including benefits from adaptation) as a function of the stabilization target

      Whatever happened to seriously describing the potentially major “BENEFITS of global warming”, with higher crop yields from higher rainfall and higher CO2?
      The world now has 7,000,000,000 people to feed, going on 9,000,000,000.
      We collectively need to address how to strongly improve agriculture. More water and more CO2 to increase crop growth will go a long way.

      The abundance of fossil fuels in the earth did not come from a scarcity of CO2, but from its abundance.
      Geologists summarize:

      To form the thick layer of plant debris required to produce a coal seam the rate of plant debris accumulation must be greater than the rate of decay. Once a thick layer of plant debris is formed it must be buried by sediments such as mud or sand. These are typically washed into the swamp by a flooding river. The weight of these materials compacts the plant debris and aids in its transformation into coal. About ten feet of plant debris will compact into just one foot of coal.

      Note coal in the Powder River Basin:

      The Wyodak and Big George coal seams in Wyoming are up to 35 m and 61 m thick, respectively. The Lake DeSmet coal bed has a maximum thickness of 75 m, the thickest in the United States and second thickest in the world. The Big George seam extends about 100 km north-south, parallel to the basin axis, and about 25 km east-west. The total amount of coal in the Powder River Basin is estimated to be about 1,200 trillion kg.

      i.e., 75 m of coal ≈ 246 ft of coal – that is equivalent to about 2,460 ft of biomass!
      At a time of much higher CO2, that massive amount of biomass and the consequent coal certainly did not come from a scarcity of biomass due to hot, dry conditions!
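      The compaction arithmetic quoted above is easy to check; a quick sketch (the 10:1 ratio and the 75 m thickness are as quoted, the metre-to-foot conversion is standard):

```python
# Quick check of the compaction arithmetic quoted above
# (about ten feet of plant debris compacts into one foot of coal).
M_TO_FT = 3.28084
coal_m = 75                   # Lake DeSmet bed thickness, per the quote
coal_ft = coal_m * M_TO_FT    # ~246 ft of coal
debris_ft = coal_ft * 10      # 10:1 compaction ratio
print(f"{coal_ft:.0f} ft of coal from ~{debris_ft:,.0f} ft of plant debris")
```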

      Some global warming models have their projections backwards, predicting drought instead of increased precipitation, e.g., ‘Critique of Drought Models in the Australian Drought Exceptional Circumstances Report (DECR)’.

      • The abundance of fossil fuels in the earth did not come from a scarcity of CO2, but from its abundance.

        True, and all that CO2 we have emitted is now wandering around in the active-layer carbon cycle with the excess having difficulty finding a permanent sequestering location. This is both fascinating and potentially disturbing in its implications.

    • I seem to be picking up the message that rather than lawyers filling the time-honored position of political representatives for the people in each of the three branches of government, it is now better for climatologists skilled in probability theory to take the lead. And, in a more general vein, that it is now absolutely essential for all scientists, doctors, lawyers, Indian chiefs, generals, admirals, and chief executives of every walk of life to likewise be highly skilled in this new Art of Probability Guesstimation for the sake of everyone on the planet, lest all be lost and civilization as we know it collapse under its own weight. Maybe I missed something?

  3. I asked:

    “who are ‘the anti-science lobby’? Are they people who may object to the post-democratic tendencies of technocrats?”

    I was right, they are:

    Smith & Stern: “Policy-making is usually about risk management. Thus, the handling of uncertainty in science is central to its support of sound policy-making.”

  4. “The eventual realization that such figures show weaker evidence than originally thought can be blown dangerously out of proportion by the anti-science lobby, making the use of science in support of policy-making more difficult than it need be. Again, greater engagement of scientists in the policy process, openly explaining the insights of today’s science and limitations of today’s models, is a significant benefit. ”

    One trusts that, now that Stern has advocated what auditors have been asking for, we won’t be labelled as skeptics, or denialists, or delayers.

    “Don’t over-egg the pudding.” What Briffa said behind closed doors can now be discussed openly.

    It’s a good day.

    • No it’s not. Notice the homage to politics in that statement.

      • There are two dreams. The first dream is that science is carried on in isolation from politics. It’s not. The second dream is that one can return to a paradigm where science can be carried on in isolation from politics. You can’t.

      • Suggestions for disincentivising the corruption politics brings to science? More transparency? Less gov’t funding? Other?

      • You’ll note that those all involve politics. But yes: more transparency, more accountability to the public; basically any “systems” we use to keep corporations and governments in check have to be applied to big science. Big science spends big money. It is political. It cannot help but be political. Any attempt to depoliticize it is political. That is the vice our heads are being squished in.

      • Maybe you missed my point. Anybody who would choose to use the phrase “anti-science lobby” is already damaged goods. He’s already rooting for the Team.

      • The third dream is that science (specifically climate science) can be conducted again without being corrupted by a politically motivated “consensus process”.

        As MLK said, “I have a dream”.

        (And there are several folks working on realizing this dream – including our host and maybe you?)

        Max

      • “The second dream is that one can return to a paradigm where science can be carried on in isolation from politics. you can’t .”

        Of course you can. It’s just that nobody in the wonderful world of climate science wants to.

        That horrible, awful, evil thing everybody refers to as “politics” is how public policy is implemented.

        No one is stopping Michael Mann from going into a computer lab somewhere and crunching any numbers he wants, any way he wants. Now, if he wants to use other people’s money to do his “science,” and he wants other people to also drastically alter their lifestyles (and pay even much more of their money to government), he has the right to do that too. But that part isn’t science.

      • Naive. The point is that once science has been politicized you cannot merely exhort people to depoliticize their science. Why? Because they don’t see the politics in their science. So, paradoxically, once science has been politicized there is no way out that doesn’t pass through a political door.

      • But that doesn’t mean that the public is required to support him. And this is the part of the “contract” that people like him are endangering.

      • Don’t let logic get in the way of a chance to appear condescending.

        “…once science has been politicized…”

        This from the guy who just wrote “The first dream is that science is carried on in isolation from politics. It’s not.”

        Science hasn’t “been politicized.” When science is implicated in questions of policy, the science becomes part of the political debate. And that is not a bad thing.

        It is not the politicization of science that is the problem. It is the dishonesty of the scientists. And scientists choose to be honest, or dishonest. The politics is there whether they want it or not.

        “Science” is a method, period, not some golden icon that was pure in the past and has been corrupted. There is science, and there is politics. They are commingled by choice. There are honest scientists and dishonest scientists, and that is a matter of choice as well.

        If you weren’t so busy trying to be clever, you might have noted that I agree with your basic premise, but disagree that continued dishonesty (not politicization) is inevitable.

        Someone around here is indeed naive. And it ain’t me babe.

      • How can science be “de-politicised” ? Simple. The consensus has to shift from “yes continued human CO2 emissions are potentially a serious problem” to “no they aren’t”.

  5. “Risk assessment requires grappling with probability and ambiguity (uncertainty in the Knightian sense) and assessing the ethical, logical, philosophical and economic underpinnings of whether a target of ‘50 per cent chance of remaining under +2°C’ is either ‘right’ or ‘safe’.”

    It’s wrong, and it depends on what is meant by safe.
    It’s politically safe. But governments trying to do anything haven’t been safe for the people they are supposed to represent.
    Let’s take the example of the Nuclear Non-Proliferation Treaty. I am almost certain that many people regard that treaty as right and safe.
    I would say the best result of the Nuclear Non-Proliferation Treaty was that it gave America and the Soviet Union a good excuse to talk about something. And this eventually led to the Soviet Union collapsing, which I admit was a really good thing. But any good excuse for a conversation would have had similar results.
    But if you imagine that the Nuclear Non-Proliferation Treaty is stopping the proliferation of nuclear weapons, or that its other aim, encouraging the use of nuclear energy by non-nuclear-weapon states, has succeeded, then one should realize that it has been quite a failure and hasn’t been safe or right.

  6. fallacy of misplaced concreteness

    The first time I heard that term it was in relation to people who overdose on fibre because they think their crap is too hard.
    But this is about climate science, wait…what?

  7. “The eventual realization that such figures show weaker evidence than originally thought can be blown dangerously out of proportion by the anti-science lobby, making the use of science in support of policy-making more difficult than it need be.”

    The arrogant self-serving “expert” attitude defended even while being generally rational. We see it everywhere, ALL-THE-TIME. We should worry less about the imagined and vilified “anti-science lobby” and more about anti-science practices and industries like AGW climate alarmism.

  8. Fermi’s paradox illustrates the problem with using “best estimates”. If we had followed his estimates, we’d look awfully silly sitting around waiting for the flying saucers to arrive.

  9. Dr. Curry: I came across this paper that might stimulate a good discussion. It was linked at Prof. Isaac Held’s blog.

    http://romps.org/2009/warming/09warming.pdf

    If replicated, it might increase uncertainty about CO2 induced global warming.

    • Thanks for the link, but I’m not sure why this would increase uncertainty about CO2-induced global warming.

      ‘This paper presents the first results of a CO2-doubling experiment in a high-resolution cloud-resolving model. When the preindustrial concentration of CO2 is doubled, the sea surface temperature increases by 2.9 K, the global precipitation flux increases by 10% (3% K⁻¹), and the local precipitation flux in the heaviest precipitation events increases by about 20% (7% K⁻¹), in rough agreement with Clausius–Clapeyron scaling.’

      They find one result which contradicts a few low-resolution GCM studies but is in accordance with the prevailing Clausius–Clapeyron theory:

      ‘As predicted by the scaling theory, increases in CO2 lead to more vigorous convection, composed of taller and faster plumes.

      The finding that higher CO2 leads to higher convective velocities contradicts the inferences from several GCM studies.’

      • Which is interesting, seeing as how Clausius-Clapeyron says nothing about fluxes.

      • Clausius-Clapeyron scaling is the specific theory in question:

        ‘The prevailing theory, referred to as Clausius–Clapeyron (CC) scaling, is that the local precipitation flux will increase in proportion to changes in the cloud-base saturation mixing ratio.’
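        For what it’s worth, the ~7% K⁻¹ figure comes out of the Clausius–Clapeyron relation itself, which gives the fractional change in saturation vapour pressure per kelvin as d(ln e_s)/dT = L/(R_v·T²). A minimal sketch with standard constants (the temperatures are merely illustrative):

```python
# Clausius-Clapeyron rate of change of saturation vapour pressure:
# d(ln e_s)/dT = L / (R_v * T^2)
L = 2.5e6      # latent heat of vaporization, J/kg (approximate)
R_v = 461.5    # specific gas constant for water vapour, J/(kg K)

for T in (280.0, 288.0, 300.0):   # illustrative temperatures, K
    print(f"T = {T:.0f} K: ~{100 * L / (R_v * T**2):.1f}% per K")
# Roughly 6.9, 6.5 and 6.0% per K -- the origin of the ~7% per K
# scaling quoted from the Romps paper.
```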

      • So somebody just made up a theory and then named it post-facto after Clausius and Clapeyron in order to lend credibility to it. Nice. Maybe I’ll make up an Einstein theory.

      • P.E.

        CC scaling seems to be perfectly fine in the tropics as an estimate. It doesn’t seem that it would be all that good in regions with less predictable cloud coverage. Most things don’t seem to be all that good in regions where temperature and humidity extremes are pushed.

      • Dallas, that wasn’t the point. The point is that CC is a famous thermodynamic relationship, and fluxes aren’t thermodynamics. They’re transport. I’m protesting this kind of name-jacking.

      • Then I will concede name-jacking :) The increase in vertical velocity with CO2 increase, which was not anticipated in the models, may require a new name. How about enHansening name-jacked theories?

      • The model must be wrong, because precipitation is presently decreasing with carbon dioxide emissions.

    • Mattstat,

      Nice link. I believe I mentioned something about the height of the effective radiant layer and solar atmospheric absorption increasing upper-layer convection. Of course, the Galileo of Global Warming has a fairly low effective radiant layer and a slightly low estimate for average cloud base.

      I am sure that the savvy climate science crew caught all that before building those complex global circulation models. :)

  10. Norm Kalmanovitch

    Risk assessment requires grappling with probability and ambiguity (uncertainty in the Knightian sense) and assessing the ethical, logical, philosophical and economic underpinnings of whether a target of ‘50 per cent chance of remaining under +2°C’ is either ‘right’ or ‘safe’.

    A linear fit through the last 1200 monthly global temperature values of the HadCRUT3 temperature dataset shows overall warming of approximately 0.73°C (www.climate4you.com) for the past 100 years.
    The rate of increase in global temperature is virtually identical in the first 50 years and the last 50 years, in spite of CO2 emissions from fossil fuels doubling from 4 Gt/yr to 8 Gt/yr in the first 50 years but quadrupling from 8 Gt/yr to 32 Gt/yr in the last 50 years (values rounded greatly for simplification purposes).
    With no evidence to support any increase in the rate of warming from increased CO2 emissions from fossil fuels, the only logical prediction based on the record of the past 100 years is a further 0.73°C.
    The question therefore is: what is the risk of the rate of global warming exceeding this 0.73°C/100 years rate?
    There has been no warming for the past ten years in spite of a 26% increase in CO2 emissions, so we have no evidence from current data that this rate of 0.73°C/100 years is increasing, and therefore no justifiable reason to expect it to do so.
    The long-term record of the past 10,000 years shows an overall cooling trend for the last 5000 years, so there is no evidence for an increase above this rate of 0.73°C/100 years based on the long-term temperature record either.
    Therefore all evidence points to virtually zero risk that global temperature will increase at a rate greater than 0.73°C/100 years.
    So where exactly does this 50% risk of 2°C come from, when the certainty that it will not occur is close to 100%?!
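    For anyone who wants to reproduce Norm’s trend figure, it is a one-line least-squares fit once you have a monthly anomaly series; the file name and column layout below are hypothetical placeholders, not a real dataset path:

```python
# Least-squares linear trend through the last 1200 monthly anomalies.
# "hadcrut3_monthly.csv" and its column layout are hypothetical.
import numpy as np

data = np.loadtxt("hadcrut3_monthly.csv", delimiter=",")
t, anom = data[-1200:, 0], data[-1200:, 1]   # decimal year, anomaly (C)

slope, intercept = np.polyfit(t, anom, 1)    # slope in C per year
print(f"trend: {slope * 100:.2f} C per century")
```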

    • Norm

      Let me provide evidence for your statements.

      A linear fit through the last 1200 monthly global temperature values of the HadCRUT3 temperature dataset shows overall warming of approximately 0.73°C (www.climate4you.com) for the past 100 years.

      Evidence=>http://bit.ly/s2TIsJ

      The rate of increase in global temperature is virtually identical in the first 50 years and the last 50 years, in spite of CO2 emissions from fossil fuels doubling from 4 Gt/yr to 8 Gt/yr in the first 50 years but quadrupling from 8 Gt/yr to 32 Gt/yr in the last 50 years (values rounded greatly for simplification purposes).

      Evidence=>http://bit.ly/de8ihf


      There has been no warming for the past ten years in spite of a 26% increase in CO2 emissions, so we have no evidence from current data that this rate of 0.73°C/100 years is increasing, and therefore no justifiable reason to expect it to do so.

      Evidence=> http://bit.ly/g28h0V

      • Norm and Girma

        What we are seeing here is the debate between those who see the physically observed data as empirical scientific evidence (following the normal scientific process) and those who prefer model estimates based primarily on theoretical deliberations.

        You two fall into the first category (as do I).

        Paul S falls into the latter category (see my exchange with him further down the page).

        What Paul S is unable to do, however, is to provide empirical scientific evidence based on real-life physical observations or reproducible experimentation to support his premise that AGW will represent a serious threat to humanity.

        That is what the whole debate here is all about – and that is the inherent weakness of the “theorists” who rely on models rather than empirical evidence.

        Max

  11. “Scientists who merely communicate results within the comfortable area of reliable theory abandon the decision stage to those who often have little engagement with the science.”

    I am at a loss to understand what this actually means.

    “within the comfortable area of reliable theory”

    That would mean operating where you have confidence in your predictive abilities. If you operate outside the ‘comfortable area of reliable theory’, why exactly should your opinion carry SCIENTIFIC weight?

    Are not the authors arguing that because they are actual scientists, who understand some scientific stuff, their guesses, based on gut feeling, should carry more weight than those of non-scientists?

    Are we expected to support the formalization of sympathetic magic in scientific advice?

    “abandon the decision stage to those who often have little engagement with the science”

    That would be elected officials, who are able to make judgments based upon the advice of people drawn from all fields of expertise.

    Given the choice of estimating the mass of a cow, one could ask a zoologist or 1,000 people at random. If you pick the median of the crowd you will always outperform the zoologist, who will be acting within the ‘comfortable area of reliable’ experience.

    I am gobsmacked that this was published. Talk about pouring oil on burning waters. Scientists are now to act as shamans.
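    Whether the crowd median ‘always’ beats the expert is easy to probe by simulation. The sketch below uses entirely made-up error distributions; under these assumptions the unbiased crowd median usually wins, but not always, and a systematically biased crowd would lose:

```python
# Toy Monte Carlo: median of 1,001 noisy guesses vs. one expert guess.
# All spreads below are assumptions made purely for illustration.
import random
import statistics

TRUE_MASS = 600.0   # kg, assumed true mass of the cow
trials, crowd_wins = 10_000, 0

for _ in range(trials):
    crowd = [random.gauss(TRUE_MASS, 200.0) for _ in range(1001)]
    expert = random.gauss(TRUE_MASS, 50.0)   # expert: smaller spread
    if abs(statistics.median(crowd) - TRUE_MASS) < abs(expert - TRUE_MASS):
        crowd_wins += 1

print(f"crowd median closer in {100 * crowd_wins / trials:.0f}% of trials")
```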

    • >> “Scientists who merely communicate results within the comfortable area of reliable theory abandon the decision stage to those who often have little engagement with the science.”

      > I am at a loss to understand what this actually means.

      It looks to me like an attempt to reformulate the precautionary principle. It continues.

      “Sound policy-making is then hindered by the lack of sound scientific speculation on high-impact events, which we cannot currently model but may plausibly experience. Failing to engage with the question ‘What might a 6°C warmer world look like, if it were to occur?’ leaves only naive answers on the table for policy-makers to work with.”

      The PP was the dominant argument/approach in policy making until the early 2000s.

      “In order to protect the environment, the precautionary approach shall be widely applied by States according to their capabilities. Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation.” Principle 15 of the Rio Declaration – http://www.unep.org/Documents.Multilingual/Default.asp?documentid=78&articleid=1163

      It was problematic, though, for obvious reasons. It was only when it became possible to ‘unequivocally’ attribute GW to A that the precautionary approach subsided and arguments claimed that the ‘science is settled’. Now, a decade later, people are more circumspect about science, which failed to provide the policy-making process with certainty (failure at COP15, Climategate, 2035, etc.). So the argument is returning to uncertainty again.

      • It looks like more of an attempt to reformulate/relabel post-normal science.

      • We are already in PNS. There is no wishing it away.

      • Correction: certain fields of academic science are in PNS. Industry and the more ‘core’ sciences have (in my experience) firmly rejected it.

      • Science strolls Mulberry Street, engineering hunts Bartholomew’s Hat.
        ==================

      • It’s not a matter of rejecting it or accepting it. PNS is a factual description of the state of science.
        You know you are IN PNS when:
        1. facts are uncertain
        2. values are in conflict
        3. stakes are high
        4. decisions are required quickly

        There are two aspects to PNS. The first is merely and completely descriptive. Descriptive.
        The second aspect is a proposal for governance. When you find yourself in PNS, then what?
        1. pretend that it’s not PNS
        2. use politics to take the politics out of science
        3. ??

        So, two aspects. The first is undeniable; it’s just a factual assessment of the situation. The second is where the arguments are.

      • steven mosher

        You write:

        you know you are IN PNS when
        1. facts are uncertain
        2. values are in conflict
        3. stakes are high
        4. decisions are required quickly

        Let’s check:

        1. facts are uncertain. TRUE
        2. values are in conflict. TRUE
        3. stakes are high. PARTLY TRUE*
        * The economic and social stakes of shutting down the world’s carbon-based economy are enormous – particularly for the poorest on our planet (as has been pointed out elsewhere). But the climate-related stakes are not. A warming of one or two degrees is not a real problem, and there isn’t enough carbon in all those fossil fuels out there to go much higher.
        4. decisions are required quickly. FALSE**
        ** What is required are not “QUICK” decisions, but avoiding WRONG decisions whose unintended consequences we are unable to ascertain. None of the specific mitigation proposals made so far would have a perceptible impact on our planet’s climate, yet they would all be extremely costly and disruptive to our society; therefore their implementation would involve WRONG decisions.

        Just my take on it, steven.

        Max

      • manaker, in PNS it only takes ONE to tango.

        That is, once you agree the facts are uncertain, and once you see that values are in conflict, then 3 and 4 necessarily follow for at least one side of the debate. Arguing about 3 and 4 just reaffirms 1 and 2.

        The fact that you argue about 3 and 4 shows that you are already in PNS, because no other science cares about 3 and 4.

        Tricky business.

      • Another way to look at it, manaker, is this:

        ALL that is required is 1 & 2.

        Give me 1 & 2 and somebody somewhere will manufacture an argument about 3 and 4. That is all that is required to steal the focus away from solving #1.

        Think of examples where uncertain facts interplay with values in conflict. It doesn’t take much for one side to then claim the stakes are high; in fact that is buried in #2, since our values are dear to us. Get it? 1 & 2 are all you really need to transform science from “disinterested investigation” into “value-driven investigation”.

        In fact, any science that impinges on human values is on the margin of PNS.

  12. ‘Even when technically coherent, failing to lay the limits of today’s insights in plain view, as with the presentation of ‘temperature anomalies’ in summaries for policy-makers, hinders communication of large systematic model errors in today’s models, and hence the relevant level of ambiguity.’

    Given that such problems are discussed quite frequently (from my vantage point anyway) I presume they mean specifically in documents aimed directly at policymakers i.e. IPCC Summary for Policymakers? It does seem that discussions of model weaknesses are often delivered in technical prose in the main report and not really mentioned in the SPM.

    Effectively their argument is that the SPM should become a more substantial explanation of the science in its own right, rather than something of a ‘best of the full report’ results collection that it has been(?)

  13. “Sound policy-making is then hindered by the lack of sound scientific speculation on high-impact events, which we cannot currently model but may plausibly experience. Failing to engage with the question ‘What might a 6°C warmer world look like, if it were to occur?’ leaves only naive answers on the table for policy-makers to work with.”

    So acknowledge uncertainty, but increase the production of scare scenarios. Combine that with urging climate scientists to get MORE involved in policy making (is that even possible?) and I don’t see this as a step forward at all.

  14. May I draw your attention to the code of ethics that forensic scientists have to adhere to when they give evidence in court. Two major points stand out with respect to Smith and Stern’s paper:

    II. ETHICS RELATING TO OPINIONS AND CONCLUSIONS

    C. Conclusions should be based on the information drawn from the evidence itself, not on extraneous information from other sources. Opinions stated in a scientific report should have a similar basis.

    III. ETHICS RELATING TO TESTIMONY

    B. Considering the above, the individual testifying should make it clear in his testimony which opinions he is providing are based on specific tests conducted and which are based primarily on his knowledge and experience. Likewise, if any opinions are based on information in the case other than or in addition to the scientific tests conducted, this should be clearly stated.

    http://www.mafs.net/index.php?id=codeofethics

    If a forensic scientist were to communicate outside the ‘comfortable area of reliable theory’ and in doing so were to “abandon the decision stage to those who often have little engagement with the science”, they would be in breach of their professional ethics and open to a lawsuit.

    • Thank you for this link to the MAFS Code of Ethics, which is directly relevant to areas of my own work. In return, I offer the following citations to codes of professional conduct in other professions, most significantly engineering, accounting, appraisals and law. They too regularly require independent and impartial testimony, as well as imposing a duty to provide full information, including information adverse to the opinion expressed by the person testifying. However, those obligations do not seem to transfer into the world of public policy advocacy, where one-sided arguments are accepted practice.

      Interestingly, there is no similar code of professional conduct for academics generally that I can find, although US universities and, for US government-sponsored research, the US Government prohibit “research misconduct” by policy and, for the US Government, by regulation.

      American Institute of Certified Public Accountants (AICPA) Practice Aid 10-1 (2011), entitled Serving as an Expert Witness or Consultant

      American Society of Appraisers Principles of Appraisal Practice and Code of Ethics, Secs. 4.3 and 7.5.

      Institution of Civil Engineers Code of Professional Conduct (UK), Rule 1 and related guidance.

      National Society of Professional Engineers, Code of Ethics for Engineers, Rule II.1, III.1, III.3.

      Society of Petroleum Evaluation Engineers, Discussion and Guidance on Ethics (2005).

      American Society of Civil Engineers, Code of Ethics, Canon 3.

      Institution of Engineers Australia, Association of Professional Engineers Scientists and Managers, Australia, and Association of Consulting Engineers, Australia, common Code of Ethics.

      National Association of Forensic Economists, Statement of Ethical Principles and Principles of Professional Conduct, Sections 3 and 4.

      American Bar Association Formal Opinion 97-407 and DC Bar Legal Ethics Committee Opinion 337.

      See also International Bar Association Rules for the Taking of Evidence in International Arbitration, Rules 5.2(a) and 8.4 (2010).

      If this technical topic interests you (I can of course easily see that it might not :-)), you can find lots more detail in Kantor, A Code of Conduct for Party-Appointed Experts – Can One be Found?, 26 Arbitration International 323 (2010) [peer-reviewed, for what it is worth].

      I hope this is useful, or at least not too boring.

      MK

      • Let’s hypothesize that there is a universal code of ethics for scientists/professors. Regardless of what’s written, how easy is it to rationalize that the end justifies the means, and the ethical imperative is to stop climate change, and everything else is banal? I’m convinced that such a document would be a dead letter in the face of this kind of rationalization.

      • P.E., there is no need to reinvent the wheel.
        Other scientific disciplines have screwed up big time in the past and developed systematic ethical structures and institutional knowledge so as to stop this happening again.
        The ease with which one can ‘rationalize that the end justifies the means’ is one of the main reasons for a code of ethics.
        A forensic scientist should not care if they are a witness for the defense or prosecution.

      • I understand that. But that only works because the institutional culture says that you obey the piece of paper. The institutional culture of climate science says that they’re all superheroes saving the world and none of that other stuff matters. Putting a piece of paper in front of them won’t change any of that. It isn’t the paper that enforces ethics, it’s the culture.

        Remember that Soviet constitution that guaranteed all those human rights?

  15. — scientific speculation on policy-relevant aspects of plausible, high-impact, scenarios even where we can neither model them realistically nor provide a precise estimate of their probability;

    PRIME MINISTER: A 7°C rise in temperatures by century’s end. Is that plausible, Stern?
    STERN: Why yes, Prime Minister.
    PM: And what is your estimate of its probability?
    STERN: We don’t have precise estimates of probabilities, Prime Minister.
    PM: So this is just speculation then, is it, Stern?
    STERN: We always deliver on what we’re asked, Prime Minister.

    — specifying the spatial and temporal scales at which today’s climate models are likely to be mis-informative, and how those scales change as we look farther into the future;

    PM: You can’t tell me what the weather will be like next week or next month, but you know what it will be like in a hundred years, Stern? I thought you told me this was mere speculation.
    STERN: Precisely, Prime Minister.

    — designing model experiments to meet the needs of policy-making.

    PM: What does this mean Stern?
    STERN: You tell us what you want the models to say, and we design them, Prime Minister. I thought we made that clear back in 1988.

    — accepting that the current state of the science may not be able to answer questions as originally posed;

    PM: Do you mean you can’t answer the questions I pose?
    STERN: Not at all, Prime Minister. It means we design the questions you should ask of us. Cuts out any unforeseen political difficulties, you understand.
    PM: I see, yes, yes, quite.

    — identifying weaknesses in the science that are likely to reduce the robustness of policy options;

    PM: What are the weaknesses, Stern?
    STERN: We need bigger, more powerful computers, Prime Minister.
    PM: But I just got you a multi-million-pound one just a couple of years ago.
    STERN: Yes, yes, well, I’m sorry to inform you that it appears that particular computer was burnt out, Prime Minister.
    PM: (yelling) Burnt out? How?
    STERN: Errmm, it seems a fellow by the name of Jones burnt it out whilst frantically adjusting temperatures, Prime Minister.
    PM: In our favour?
    STERN: Why, of course, Prime Minister. He works at East Anglia; he is one of us.
    PM: Aww, do tell him to be careful next time, there’s a good chap.

  16. scientific speculation on policy-relevant aspects of plausible, high-impact, scenarios even where we can neither model them realistically nor provide a precise estimate of their probability;
    This says it all. They are not modeled realistically and there is no good estimate of the probability.

    • If they can neither model high-impact scenarios realistically nor provide a precise estimate of their probability, then how do they decide whether a particular scenario is plausible or not?
      For instance, in this high-impact scenario: increased temperature equals increased water vapor, so more and larger hailstones fall from a greater height, increasing head injuries, and so the world’s population will need to wear helmets.

  17. Willis Eschenbach

    The authors start out their list of recommendations with this whopper:

    In this paper, it has been suggested that the communication of science to policy-makers could be aided by:

    — scientific speculation on policy-relevant aspects of plausible, high-impact, scenarios even where we can neither model them realistically nor provide a precise estimate of their probability;

    Yeah, that’s what we need. More scientific speculation. More “may be” and “could influence” and “is possibly” and “may well cause” and “perhaps as soon as 2020” and all the rest of the weasel words and phrases that we get when scientists start to speculate.

    Because now that I think about it, Stern and Smith are right. There’s nowhere near enough scientific speculation in the media these days. We’re desperately short on scientists talking about “death trains”. We don’t have enough “maybe”s in our scientific papers. We’re lacking in scientists who are willing to make up “scary scenarios”. We haven’t heard enough SWAGs* about the upcoming Thermageddon. Clearly, that’s the problem right there …

    I swear, I don’t know what planet S&S live on that they think we’re short of “scientific speculation” about the horrors that assuredly and absolutely are guaranteed to have a good chance to possibly be awaiting us. By 2010. Honest. Unless this is now 2010 or beyond, in which case I actually meant the horrors that, no question, ought to have a distinct possibility of definitely, perhaps catastrophically, affecting us by 2020.

    w.

    PS: *

    WAG: Wild A**ed Guess

    SWAG: Scientific Wild A**ed Guess

    • So what’s a CGWAG?

    • Willis,

      It is too late for the people who will freeze and starve to death.
      MASSIVE precipitation is in the works due to the salt-water changes that started 4 decades ago.
      IS THAT TEMPERATURE BASED???
      No!

    • Willis –

      I can enlighten us all with something I found in a peer-reviewed scientific paper – nothing like speculation or guesswork or imagination. It is from Hansen & Sato (2011), which I read for my edification and understanding of the future of the climate.

      They say, as a result of their scientific calculations and reasoning (no WAGs..) –

      “..goals of limiting human-made warming to 2 degrees and CO2 to 450 ppm are prescriptions for DISASTER”

      I have to assume that disaster has a very technical meaning and also that it will be inevitable, but either way, to my layman’s ears it really doesn’t sound like good news. It sounds very much like it’s going to be ‘worse than we thought’ which I think was fairly catastrophic – as far as I remember. And seeing as we’ve had the best part of one degree already, I can only assume that the disaster is quite close at hand – Hansen didn’t give a date, but maybe he was sparing us the really bad news until next week, when the first bit has sunk in.

      Isn’t this what the finest of scientists are obliged to do? To cut through the uncertainty with rigour and detachment and give us the hard truth, straight and undiluted? That the ‘evidence’ shows that we should be running around like headless chickens screaming that the end is nigh? Or at least that’s what it can look like to a fearful person on a dark night when they let their imagination run away with them? As human beings have done since the beginning of history?

      Oh. Right. It’s all nonsense then?

      Thought so.

    • Agreed, Willis. The fact that he’s advocating more extrapolation and speculation EVEN when there is no, or vanishingly little, evidence to support it erodes any credibility he had.

      It’s astonishing just how many proposals or suggestions related to climate change and its ‘improvement’ fly in the face of everything I know and have been trained to do in science.

      There seems to be a massive collective blindness going on here. It’s like Taggart Rail / Rearden Steel all over again…

    • “More “may be” and “could influence” and “is possibly” and “may well cause” and “perhaps as soon as 2020″ and all the rest of the weasel words and phrases that we get when scientists start to speculate.”

      Well, the thing is, those aren’t weasel words, they are the truth. Whenever we start talking about inferences from a set of data, until actual analysis and experiments are done, all we can say is “possibly”, “potentially”, etc. Even with our analysis, we are never 100% complete on our own, and there’s always room for massive reinterpretation when new data or techniques get used.

      So… science is highly fluid. And I guess that’s the problem, the disconnect. How can we base policies, which have tangible, real-world impacts, on fluid concepts that are still undergoing the long trial of testing? Or when there are many alternate hypotheses and interpretations of the data?

      See, us scientists are not being dishonest or weaselly, we’re being completely forthright when we use those words, as science is never truly settled, and that’s the beauty of it–always more to discover even in what we think is firm and done.

      In the end, I suppose, the real issue breaks down into how policy-makers can choose one set of hypotheses, data, and interpretations over another, and choose to carry forward real-world plans that affect all our lives based on that. When is it appropriate? When do we know it’s safe to act? And that really is the discussion of uncertainties, I feel.

      I hope my words help in showing us scientists are being totally honest (with some exceptions), it’s just what people do and where they take our information from there, that’s where problems can arise and stuff can get lost in translation. Someone always has to attempt to apply science, that’s where our technology comes from, but with such huge things as policy we need to be ever the more careful… There really is no easy answer, but I think discussing uncertainties and what it all means is definitely a step in the right direction.

      • Ged (2.32 pm) – yes, these words can be used to convey scientific uncertainty, but too often they are used to give plausible deniability to alarmist statements that do not represent the study results fairly. A neutral statement would be ‘the likely range is between … and …’. That is, it gives as much prominence to the equally likely ‘less alarming’ end of the range found by the study as to the ‘more alarming’ end. But clearly it does not make for as arresting a headline as ‘could be as much as …’.

  18. What about the idea of an international body or group of nations with a focus on enabling rather than denying – a focus on doing things instead of preventing things from being done?
    No doubt one thinks of throwing money at it, something like an International Monetary Bank. That isn’t what I mean. Instead I mean a treaty that allowed something. An example of this is the provision of the NPT that is supposed to encourage nuclear energy use. But instead of that being one provision, it would be the sole purpose of a treaty. It would have a single focus.

    One application could have to do with seeding the ocean with iron so as to capture CO2. What I am talking about isn’t an organization which funds such projects, but rather an organization that governs them. So it would establish rules which its members follow. How the members get capital or funds is a separate matter.

    Or, for something which has profit associated with it, there is methane hydrate mining. Again, the purpose is to allow it, to formulate the laws that govern it, and to shield members from arbitrary national bodies [which assumes these nations are part of a treaty agreement granting this power].

    You could couple fish farming in international waters with iron seeding – thereby making iron seeding “profitable”.

  19. Smith and Stern refer frequently to uncertainties in the “models”, but I think they mean uncertainties in GCMs. If we are interested only in global temperatures, we can make reasonably confident predictions about responses to CO2, within a relatively constrained range, for the rest of this century from transient climate response estimates based on simple energy balance models.
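    Fred’s point about simple energy-balance estimates can be illustrated with the usual back-of-envelope scaling ΔT ≈ TCR · log2(C/C0). The TCR value and concentrations below are illustrative assumptions, not predictions:

```python
# Back-of-envelope transient warming: delta_T ~ TCR * log2(C / C0).
# The TCR and the concentrations are illustrative assumptions.
from math import log2

TCR = 1.8     # K per CO2 doubling (assumed transient climate response)
C0 = 280.0    # ppm, preindustrial CO2

for C in (390.0, 450.0, 560.0):   # ppm, illustrative concentrations
    print(f"{C:.0f} ppm: ~{TCR * log2(C / C0):.1f} K above preindustrial")
```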

    I see the problems for policy makers as twofold. First, policy makers are relatively uninterested in global temperature. Rather, they want to know what will happen where they live – to temperature and to the prospects for floods and droughts. The simple models can’t do this for them, and the GCMs can’t yet do it very well and will never do it perfectly despite improvements in downscaling. Ultimately, global temperature is likely to catch up with everyone, but “ultimately” is a long time from now for most of us, and an eternity for policy makers.

    The second problem is I think more formidable if mitigation is a policy option under consideration. A society that reduces CO2 emissions will pay the entire cost of that effort, but the benefit will be diffused among all nations (plus the oceans). If even a large affluent society reduces emissions by 50%, for example, it will bear the cost, but if no other nation acts, the reduction in future warming will be almost indiscernibly small and thus no commensurate benefit will ensue. Even global participation will fail to produce clear benefits for decades, an interval longer than the political lifetime of most policy makers.

    This is the familiar Tragedy of the Commons. Its implications for very short-sighted policy makers will be to do nothing. The larger problem, perhaps, involves implications for policy makers with some foresight, who want to avoid potential future adversity that their society might incur. These individuals will be motivated to act, but not necessarily to mitigate. Rather, their best option, where “best” is defined in terms of multidecadal national self interest, is to invest exclusively in adaptive measures – higher seawalls, irrigation systems in drought-prone areas, and better air conditioning for their summer homes. The society will pay the entire cost of adaptation, but it will also reap all the benefits. This is particularly advantageous for affluent nations that can afford the technology.

    The optimal balance between mitigation and adaptation is a huge topic and not one I’m qualified to address except to say that most expert opinion sees a need for both. In particular, adaptation requires a relatively stable future, even if it’s warmer, and adaptive measures that must change every few decades are a poor investment compared with adaptation combined with mitigation that will reduce the frequency of required upgrades to the adaptations. Some consequences of rising CO2 will be extremely difficult to adapt to – one example is the potential damage to the marine food chain from ocean acidification, another huge topic that can’t be adequately addressed in this thread but deserves attention.

    If I had to choose the most daunting obstacle from the above list in translating science into policy, scientific uncertainty, while an important concern, would deserve a lower priority than the difficulty, in a world of nations that cherish their national independence, of overcoming the tragedy of the commons without an unacceptable loss of national sovereignties – in other words, the need to balance national and global interests. I don’t know the best solution for that balance, but in my view, the worst two would be the two extremes.

    • Fred Moolten –

      You say that ‘ultimately, temperature is likely to catch up with everyone..’ which from your usual style of reasoning I guess is not meant in just a facile sense. I assume you mean that a temperature change is likely, in the end, to be BAD for everyone.

      I notice that it is on just this point of the argument that I find myself in sharp disagreement. I think there is a subtle point where scientific reasoning melds into imagination [unnoticed] that leads many of us astray.

      The arguments for non-feedback responses to changes in CO2 are clear and bright and not generally prone to subjectivity or confirmation bias. However, from that point onwards, though we persuade ourselves that we are remaining objective, I think we can be less and less certain of our understanding. But imagination takes over! From all the assumptions necessary to construct a high sensitivity, to speculation about disproportionate ice-melt, to the guesswork involved in prophesying floods or droughts….. I have much less confidence than you do that it is ‘likely’ that bad things will happen to us all. I doubt very much whether we can make any ‘good’ or ‘bad’ speculations about a degree or two of temperature rise on the scale of, say, a century.

      So much for one aspect of my scepticism..

      I also have grave doubts that talk of ‘mitigation’ is based on any kind of solid foundation. I have been arguing on another thread that I don’t see the last 23 years as having ‘mitigated’ any climate change because I don’t view any carbon as having been left in the ground. I accept that one perspective is that some countries have ‘reduced’ their emissions. I don’t believe that has meant quite what the individual nations like to believe. In a global market for fossil fuels, a slight reduction in consumption from one or two players merely reduces the price (and therefore increases the consumption) for the rest. Even if it was measurable, a tiny reduction in Europe’s consumption would only pass on savings to India, China and Brazil.

      For believers in the efficacy of mitigation (assuming CCS is a dead duck) there has to be belief in the vision of many hundreds of billions of barrels of oil equivalent being left in the ground, essentially for ever – delaying its combustion for a year or a decade will serve no purpose. So, in a global market we have to believe that there will be this vast quantity of economically recoverable fuel simply left in the ground – by everyone – when there are people very willing and able to dig it up and people desperate to purchase it.

      It is worth remembering that fossil fuels have many meanings. They can mean electricity for those without it, hospitals for those who are sick and food for those that are hungry. You are going to have to make a case to all these people that they must do without those things that fossil fuel can become because you have a ‘belief’ that it is ‘likely’ that some negative consequences will ensue – at some point in the future, to someone else. Good luck with that one..

      To put it another way. If a hundred billion barrels represents something like a tenth or two of a degree Celsius, you are going to have to explain that the alleviation of suffering for multiple generations of people is going to have to be suspended because of something that we cannot even reliably measure.
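
      For what it’s worth, arithmetic of this sort can be checked on the back of an envelope. Below is a rough sketch in which every parameter (emission factor, airborne fraction, baseline concentration, TCR) is an assumption the answer is sensitive to; under these particular numbers the result comes out nearer a hundredth than a tenth of a degree, which only underlines how much the claim depends on the inputs.

```python
import math

# Rough back-of-envelope estimate: warming from burning 100 billion barrels.
# All parameter values below are assumptions chosen for illustration.

BARRELS           = 100e9   # 100 billion barrels of oil
TCO2_PER_BARREL   = 0.43    # tonnes CO2 per barrel (rough emission factor)
AIRBORNE_FRACTION = 0.45    # assumed fraction of emitted CO2 staying in the air
GTC_PER_PPM       = 2.13    # GtC per ppm of atmospheric CO2
BASELINE_PPM      = 390.0   # assumed current concentration
TCR               = 1.8     # deg C per CO2 doubling, assumed central value

gtc  = BARRELS * TCO2_PER_BARREL * (12.0 / 44.0) / 1e9   # GtCO2 -> GtC
dppm = gtc * AIRBORNE_FRACTION / GTC_PER_PPM             # added ppm in the air
dt   = TCR * math.log2((BASELINE_PPM + dppm) / BASELINE_PPM)
print(f"~{dppm:.1f} ppm -> ~{dt:.3f} C of transient warming")
```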

      A third way of looking at it is that all the expensive windmills in the world will have been pointless (in terms of their inefficiency) unless we can point to vast quantities of fossil fuels remaining in the ground that we have prevented EVERYONE from using. That no poor country has been allowed to buy, or sell it if it is under their territory.

      As you might guess, this vision isn’t one I’ll be investing in any time soon. Is there any other way you visualise ‘mitigation’?

      • Hi Anteros – Thanks for your long comment. The issue of consequences is an enormous topic, impossible to address adequately here. Some have already occurred (mainly from sea level rise and the consequent increase in the damage from hurricane storm surges), others are uncertain, and still others are predictable but on uncertain timescales. An example of the latter is future sea level rise. A rise of several centimeters by the end of the century will threaten the lives and welfare of at least several million people. A rise of 1 meter, which is somewhat unlikely but not impossible during the next 89 years, would impact close to 100 million; that will happen eventually, although not necessarily by 2100. At any rate, those are my understandings, but multiple posts and threads would be needed for an adequate discussion.

        The other aspects involve areas where I’m poorly qualified to comment, so I will offer only a very tentative perspective. I’ve seen various estimates of the quantity of fossil carbon underground. If the largest ones are accepted, I would guess that even if we did nothing to curtail carbon emissions, more than 50% of fossil carbon would remain underground permanently – that’s because it would be too uneconomical to extract. The remaining fraction, however, can’t simply be categorized as easily recoverable, because its recoverability varies considerably from deposits that are worth extracting even at low sale prices to others that would only be worth the cost at high prices. To me, this means that the fraction that will remain underground permanently will be highly responsive to the cost and availability of alternative energy. If alternatives can be developed to the point of low cost availability, it should be possible to leave most fossil fuel underground, and this is an incentive to subsidize alternative energy development; the question of whether to subsidize fully developed alternative energy in order to maintain a low price is a political one that I’m not prepared to address.

        The issue is further complicated by the difficulty of estimating the true cost of fossil fuels. An argument can be made that the cost should include the societal and human economic and health costs of rising CO2, which of course would add to the apparent costs, but both the economics and politics of this consideration are formidable.

        I expect that as alternative energy is further developed and as oil, coal, and ultimately gas supplies dwindle to the point of increased extraction costs, a substantial substitution for fossil fuels will occur. Since this will be permanent, it will consign some of the fossil carbon to remain untapped forever, but I don’t know how much.

        As to the cost/benefit calculations regarding the poor or the sick, that’s an easy one. The poorest people and nations will be most harmed from continued CO2 emissions, and there is no reason why the costs of mitigation should be imposed on them – that’s a societal decision on both a national and international basis. The contribution to global warming and ocean acidification from fossil fuel use by the underprivileged is trivial compared with the contributions from the affluent, and so I see harm to the poor as very much a straw man argument.

        This obviously only touches the surface of the issues, but might be worth further discussion down the road.

      • “As to the cost/benefit calculations regarding the poor or the sick, that’s an easy one. The poorest people and nations will be most harmed from continued CO2 emissions”

        I fail to see why it is “an easy one”.
        Modern society has been built on CO2 emissions. Remove CO2 emissions without some replacement technology and you would not have the modern world. Without CO2 emissions one would not have solar panels, nor modern wind turbines, nor nuclear energy. Nor would we have cheap steel – few bridges, no high-rises, etc. So we would be back to horse carriages and oxen to plow the fields, not that these older technologies lack CO2 emissions [and methane].
        It seems the poor and sick have benefited as much as the rest of humanity. And India and China are dependent on technology related to CO2 emissions, and these countries have a large percentage of the world’s poorest. And China and India have benefited by being able to use technology developed elsewhere.

      • gbaikie – I agree that the entire world has benefited from the industrial era, which depended heavily on fossil fuel energy. My point was a different one – in underprivileged nations and among poor people, the use of fossil fuel derived energy is not enough to cause a significant climate problem – they should be able to have all the electricity they need while the world begins to substitute alternative energy sources for carbon based energy, with affluent nations leading the way. In addition, the costs of both adaptive and mitigative measures should be borne by those able to afford them, but I see that not as an impassable barrier but as an issue that needs to be resolved in the moral, social, and political arena.

      • I would just add that the human race may not choose to act in the way I suggested above. I was simply suggesting that if we want to, we can protect the poor from adverse effects of climate change, permit them all the fossil fuel energy they need for a decent quality of life, and still transition to alternative energies as fast as technology permits (while also reducing carbon emissions by conservation and increased energy efficiency). If we don’t do it that way, it’s not because we can’t.

      • Fred

        Sorry, but no we can’t.

        Your idea cannot be implemented given the long-term budget situation in the EU and the USA. Both areas have fundamental, structural budgetary problems, and both will have to incur tax increases and reductions in services to their citizens.

        Providing the world’s “poor people” with just electricity would require the construction of a vast number of conventionally powered electricity generation facilities and a massive amount of infrastructure construction in the areas where you wanted electricity distributed. The cost would be in the trillions, since there are over 4 billion people in the world who do not have regular access to electricity. In the end, it would result in a tremendous increase in CO2 emissions from all the new power plants and from the additional concrete required for the infrastructure.

      • Rob – Perhaps you’re right and I should merely have said that we can protect the poor from adverse effects of climate change while providing enough fossil fuel energy to improve their living conditions, even if it doesn’t give every poor person in every poor nation optimal conditions. I’m even more convinced that we could do it without worsening their circumstances as others have implied. Protecting them while maintaining and perhaps improving their access to energy would be compatible with a reduction in global carbon emissions. I suspect some of this could be done through nuclear power rather than fossil fuel energy, and some fossil fuel energy that substitutes gas for coal could also reduce CO2 emissions while we transition to alternatives. The transition will inevitably take time, and to me that means that as technologies develop, they can first be employed in affluent nations with high per capita energy consumption rates. These include the U.S.

      • I’m not very knowledgeable about rural electrification, but it appears that many societies have embarked on rural electrification programs that focus more on better distribution and on renewable energy than on greatly expanded use of fossil fuels.

      • Fred

        I am in many of these countries frequently, and that is what forms my opinions. The problems that westerners see in SW Asia and South America will not be solved by foreigners.

    • Fred

      How do you come to the conclusion

      “If we are interested only in global temperatures, we can make reasonably confident predictions about responses to CO2, “

      When the paper you cite only concludes

      “Although our estimates are certainly sensitive to these uncertainties and to natural variability, they may be sufficiently narrow as to still be useful.”

      Seems like YOU are making a huge leap based on your belief of what is known vs. what the data actually justifies.

      • Rob – If you read both of the papers discussed in that thread (Padilla et al and Gregory and Forster 2008) as well as others that were referenced by Piers Forster in a previous thread, I think you’ll see that the statement you quoted refers to the fact that the range can’t be narrowed to a very precise figure, but not that the cited ranges are very uncertain. From the two papers, the transient sensitivity to CO2 doubling ranges from 1.3 C to 2.3 or 2.6 C, and these translate into the 2 to 4.5 C range commonly cited for equilibrium sensitivity. These are high confidence estimates (check the confidence intervals).

    • Fred: “…we can make reasonably confident predictions about responses to CO2, within a relatively constrained range, for the rest of this century…”.
      Even the IPCC (WG1 report, 2007, titles of chapters 10 and 11) seems to lack confidence about using the word ‘prediction’ with regard to climate. They prefer ‘projection’, which I take to mean a continuation of a trend. So what makes you so confident?

      • Coldish (and also Rob Starkey) – I wasn’t referring to climate in general, which is a more difficult challenge, but rather to our ability to predict the warming from a given increase in CO2. The transient climate sensitivity methods have now complemented GCM approaches, with both leading to estimates in the ranges reported in the earlier literature and cited by the IPCC. Since this is not a technical thread focused on climate sensitivity, if you want to see more of the evidence and discuss the details, I recommend visiting the thread on Probabilistic Estimates of Transient Climate Sensitivity. There is extensive discussion there, including comments of mine and valuable inputs from others on both the Padilla et al paper and Gregory and Forster 2008. The various estimates still span a range rather than offering a precise figure, but there is now enough evidence to be confident that climate sensitivity is very unlikely to be much lower or higher than the spanned range. This conclusion, I should add, applies to sensitivity to long term forcing from CO2 or similar forcing modalities, and doesn’t apply to climate responses to short term fluctuations such as those due to ENSO – phenomena addressed in reports by Lindzen/Choi, Spencer/Braswell, and Dessler. The latter estimates involve different climate dynamics and sensitivities.
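
        The regression idea behind transient-sensitivity estimates of the Gregory and Forster type can be sketched in a few lines: regress temperature change on radiative forcing and scale the slope by the forcing of a CO2 doubling. The series below are synthetic stand-ins, not the observed data those papers use.

```python
import numpy as np

# Sketch of a Gregory & Forster style TCR estimate on synthetic data.
# The forcing ramp, noise level, and seed are all assumptions.

rng = np.random.default_rng(0)
years   = np.arange(1970, 2011)
forcing = 0.04 * (years - 1970)                           # W/m^2, assumed ramp
temp    = 0.5 * forcing + rng.normal(0.0, 0.1, years.size)  # K, with noise

slope, intercept = np.polyfit(forcing, temp, 1)  # K per (W/m^2)
F_2X = 3.7                                       # W/m^2 per CO2 doubling
print(f"implied TCR ~ {F_2X * slope:.2f} C per doubling")
```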

      • Fred

        I suggest that we understand this only within a pretty wide range of error, and not as narrowly as you seem to believe. My evidence is that the “models” did not accurately forecast the degree of observed warming over the last 10 years. From what I read, we still do not seem to fully understand the degree of impact from the deep oceans.

        If you still disagree, then make a prediction of how much it will warm between now and the end of 2014, along with your margin of error, and I may be willing to make a “friendly” wager with you.

      • Rob – I referred to climate responses to CO2, which we can predict with reasonable confidence to be within the range cited. Predicting global temperatures involves much additional computation, including the problems created by uncertainties regarding aerosols. The only point I was trying to make above regarded our increasing confidence in quantifying the temperature response to CO2.

        (Also, as noted in the Pause thread and other threads, 10 years is too short to interpret or predict multidecadal global temperature trends because of short-term variability that would be less of a problem over more than one decade. The Pause thread would be a place for further discussion.)

      • Joachim Seifert

        Fred,
        don’t forget the great millennium climate forecast made by 40 institutes, with the participation of the most famous climate brains – called the TAR and SRES, from 2001. They had all the means at hand, the fastest computer systems, and not one of them predicted the global temp plateau starting from the very day of the TAR publication in March 2001……
        Their models included ENSO, albedo, CO2 increase (I even have a paper by Mr. Cox, UEA, proving the multiplier effect of newly emitted CO2 – like “interest” on the “interest” in banking….)
        The most effective and scrutinizing millennium climate work had been undertaken…

        Now you say: listen, the 40 institutes did a lousy job, so many unknowns,
        climate is full of uncertainties – but NOT THE CO2, this is the ONLY thing certain…..????
        You downgrade the institutes and their models…….
        Why didn’t those climate brains ask you to forecast the present temp plateau? What does the feeling in your left knee tell you about when this explosive force of CO2 will shoot the temp up to African levels…..
        Any reply???
        (I calculated the plateau for the first time in 2006, and it will continue for
        decades, with or without CO2, which is not the driver in the climate seat….)

      • Smith et al predicted natural variability would overwhelm anthropogenic-caused warming in the years 2005 through 2009, and that half of the years after 2009 would be warmer than 1998, and they gave a specific temperature range for 2014.

      • JCH

        Would you care to review all their other predictions to see how they did?

      • Rob – the paper is behind a pay wall.

        The earth is not tied to HadCrut. On two temperature series, 2010 beat 1998.

      • Fred, thanks for your response. Before I get time to read the thread you mention the caravan will have moved on, but cheers anyway.

  20. Corporate Message

    JCurry,

    I especially like this one describing the Gaian exhortations: The Pathetic Fallacy
    wiki:
    “When human-like qualities are attributed as well, it is a special case of reification, known as pathetic fallacy (or anthropomorphic fallacy).”

  21. No scientific extrapolation is complete without a quantitative estimate of the chance of its own irrelevance.

    That’s a good line.

  22. I think that you’d have a terrible time translating the recommendations of the authors into operational terms, that is actual instructions for actual cases.

    Brisbane, Australia, this last year represents an ideal laboratory for discovering why scientists’ science translates into terrible public policy. A billion-plus desalination plant was built and is mothballed. A reservoir held onto water in the face of a torrential rainfall. The inability of governmental policy makers to adapt to the actual climate/weather situation as it presented itself speaks to the horrible interface of governmental officials and scientists.

    Governmental officials universally erect barriers to identifiable accountability. Nobody wants to be on the “hot seat.” So the decision-making process becomes diffuse and obscure. Scientists are asked: “When will the rain come, and how much?” Scientists who are readily seduced say: “Tuesday, and 12 inches.” The real answer is: “I don’t know; it could be a lot or a little.” Policy makers might have built the second reservoir, as the rain upon the plain may happen again, as it did 12 years earlier. Policy makers believed that global climate change would lead to extended, never-ending drought; hence the desalination plant. The climate scientists who made the policy recommendations, and the policy makers, have to my knowledge not been held accountable: i.e., their jobs and old-age benefits. “Bad things happen.”

    Here in the USA we have another group of scientists recommending policy decisions, and they are not on the “hot seat.” They are not held accountable: jobs and old-age benefits. If such accountability were present, I believe there would be a more deliberate, gradual and informative process, and we would all be “on board.” If you tell me that in 5 years you predict X, Y and Z, and if it doesn’t happen you are prepared to lose your job and benefits, I might find you more credible, as you would have real skin in the game. As it is right now, Mann, Hansen, Schmidt, etc. do not have enough skin in the game to be believable, nor are their opinions valuable. A calculation widget to handle uncertainty is not a basis on which to make large-scale public policies. If one is sure of the direction the uncertainty is headed, then by all means not only say so, but be willing to predict over a short enough time period that you can be seen to be correct or otherwise. Step aside if you are wrong.

    • Same thing happened here with weapons. We bought a lot we never used. Anyone remember the B-58?

      • Unfortunate command decision about the tactical deployment. The plane was designed for high-altitude penetration, and stupid generals (McNamara) decided that the Soviet high-altitude threat (Guideline) could only be countered by a low-altitude mission profile. Basically, the higher-ups overreacted to an overblown threat assessment and changed the mission to one the platform could not perform (benefits of working as a threat analyst). Given that the SA-2 could only engage one target at a time, and given that it was a fixed (non-mobile) defensive asset, the right decision would have been to
        1. conduct defense suppression missions first
        2. use wild weasels

        So, you picked an example where a good asset was unused because people in high command were risk averse

      • Well that, and I helped build them. The B-58 was an awesome little bomber. It was faster than many fighters. I will never forget seeing one explode at altitude. Just by chance I was looking up at the plane when that happened. The theory was an engine sucked in a weather balloon.

      • Steven,

        McNamara was an idiot, but the B58 decision was not among his costly mistakes. The mission the B58 was designed for was high-speed, high altitude deep penetration (into the Soviet Union) strategic bombing, which boiled down to delivery of a single nuclear bomb. Seemed like a plausible idea when development began, but advances in SAM capabilities rendered that mission impractical. The wild weasels of the time would have been F100s, replaced in the mid-60s by F105s. They had some success in Vietnam but were very vulnerable to even the antique Mig 17s. Things would have been worse against more modern and far more numerous aircraft flown by Russian pilots. Besides, I don’t think they had the range for the deep penetration mission. The B58 had outlived its potential usefulness about the time it became operational. But we did buy a lot of weapons that proved to be very useful; see the continuing bang for the buck from the B52. They more than make up for the B58 mistake.

        I will tell you how stupid McNamara and his boss LBJ were. For almost entirely political reasons, they restricted use of the B52 strategic bomber to tactical missions in S Vietnam, and used the little A4s, F4s, F105s fighter-bombers for strategic bombing in the North. I often wondered whose side they were on, when I was on the radio trying to round up a couple of Phantoms to drop some desperately needed bombs, and I knew there were dozens of them flying around Hanoi attacking the same bridge, day after day. I guess what we needed were more scientists and technocrats involved in making political decisions. In other words, more McNamaras.

        Don, I shared a cubicle with a wild weasel (Bill Bryant – maybe you knew him:
        http://mysite.verizon.net/t.gummo/id6.html)
        Hell of a teacher, and I learned most of what I know about Guideline from him and the other wizards. No love for McNamara. haha.. that bridge. I heard plenty of stories about that bridge (or ones just like it)

        Or maybe you knew LtCol Richie Graham (64th Aggressor Squadron at Nellis), also a great teacher.
        Or the first slot man, Bob McCormick – learned a bunch from him:
        http://aerobaticteams.net/sabre-dancers.html. He was a terror in the office, but weekend golf with him and Chuck Knight was always fun
        http://boards.ancestry.com/topics.obits/43178/mb.ashx?pnt=1

        Everest Riccioni, also a brilliant man to learn from.
        http://en.wikipedia.org/wiki/Fighter_Mafia

        I believe the aircraft had a payload of 5 nukes; maybe M carey can confirm. Working in crew systems, the ejection pods of the B58 were most interesting. (Working on supercruise, the high-Q ejection requirements were a bitch.)

      • Steven,

        I knew a few pilots casually, none of those you mentioned. I was on the ground with the 82nd Airborne. We were essentially groundborne there. Most of my conversations with pilots were about me asking for help. I only knew them by their call signs.

      • Yes, except the purpose of purchasing weapons is so that you don’t have to use them…Si vis pacem, para bellum.

      • That’s true. Weapons deter. The desalination plant was not built to deter, but both it and weapons could be thought of as insurance.

      • Fair enough, provided it was billed as such prior to construction (for the record, I don’t know whether it was or not).

  24. Broadly speaking there are two traditions in Public Economics (Stern is a public economist). One of those is prescriptive, and imagines policy set by a “benevolent social planner.” This doesn’t mean the economist imagines a dictator; rather, he or she imagines that governments want to choose policies that maximize social welfare (suitably defined). Stern is a member of that tradition. He understands free rider and commons problems, but he ignores them for the purpose of trying to describe optimal collective policy. There is a value in doing that. Skeptics, calm down: Nick Stern is doing what he sees as the job of the prescriptive public economist. Fred Moolten, Nick Stern has forgotten more about commons problems than you know. He is simply setting these aside, imagining that nations can solve them, for the purpose of thinking about optimal policy.

    The other tradition in public economics is descriptive. It is about interest groups, free riding, political processes and how these operate within democratic and non-democratic institutions to produce actual, observed policy outcomes. That is an interesting subject, in fact crucial to thinking about what policies actually get put in place. But one would have nothing to measure those against without the kind of work Stern and other “prescriptive public economists” do. I disagree with much of what Stern says, but people do him an injustice by imagining he is some elite anti-democratic monster. He is trying to imagine what kinds of information would be most beneficial for improving an optimal world-economy-wide policy, not trying to advocate that a small cabal should run the world.

    • Although actually he may privately believe that a small (or perhaps not so small; there must be plenty of jobs for the cronies) cabal could do a better job…

    • NW is being naive.

      “Nick Stern is doing what he sees as the job of the prescriptive public economist. […] not trying to advocate that a small cabal should run the world.”

      Technocrats can tell themselves whatever they like. The facts are that the politics of the world (and of nation states) are increasingly post-democratic, and increasingly organised around environmental issues.

      Nick Stern is uncompromising about climate change. He’s not merely some academic, reflecting benignly on ‘how to make the world a slightly better place’ by ‘optimising’ it. E.g. http://www.youtube.com/watch?feature=player_embedded&v=hDRUXA9s854 There’s nothing subtle about his message there. No nuance, or caution about his claims. Just ‘do as I say or the planet gets it’.

  25. Little wonder that Joe n Jenny Sixpack are leaving the ranks of believers in droves.
    Post modern science, false confidence, inaccurate models.
    Once the tone turned nasty and the volume elevated, the few observers remaining have been reduced to trying to devine the error bars in confused studies full of obtuse parsing and meaningless phrasing.
    Serandipity….
    Cheers
    Big Dave

    • Serendipity….
      An unexpected surprise (the future climate not the poor spelling skills)
      Cheers
      Big Dave

  26. “specifying the spatial and temporal scales at which today’s climate models are likely to be mis-informative, and how those scales change as we look farther into the future”

    The various scales aren’t divorced from one another, nor are time & space (i.e. marginal temporal & spatial distributions are NOT collectively equivalent to the joint spatiotemporal distribution). Paradoxical scale discontinuities can & do exist.

    It would be VERY helpful if someone – academic or otherwise – found time to illustrate a large variety of examples of spatiotemporal paradoxes to help educate those involved in the climate discussion (including climate academics). Statistical summary paradoxes are the sorts of things that most people don’t even know to imagine. What makes them a particularly weighty consideration is that they easily lurk beyond conventional notice due to hidden structural deficiencies in mainstream statistical inference paradigms (that should be made clearly explicit).
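
    As one minimal illustration of a statistical summary paradox of the kind this comment asks for, here is Simpson’s paradox in a few lines: two groups each trend downward, while the pooled data trend upward. The numbers are invented purely for illustration.

```python
import numpy as np

# Toy instance of Simpson's paradox: each group trends downward, but the
# pooled data trend upward because the groups sit at different levels.
# All numbers are invented for illustration only.

x1 = np.array([0.0, 1.0, 2.0, 3.0])
y1 = 10.0 - 0.5 * x1                  # group 1: falling
x2 = np.array([6.0, 7.0, 8.0, 9.0])
y2 = 16.0 - 0.5 * x2                  # group 2: falling, but higher up

datasets = [("group 1", x1, y1),
            ("group 2", x2, y2),
            ("pooled ", np.concatenate([x1, x2]), np.concatenate([y1, y2]))]
for name, x, y in datasets:
    slope = np.polyfit(x, y, 1)[0]    # least-squares slope
    print(f"{name}: slope = {slope:+.2f}")   # groups: -0.50, pooled: +0.38
```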

  27. Judith,

    Uncertainty is the crutch that keeps AGW alive.

  28. Unfortunately the fallacy of misplaced concreteness is not the only one to have infected science. It is common to see fallacies of composition, division and equivocation, especially in climate science.

  29. Maybe it’s because it’s late, but I am having a great deal of difficulty discerning exactly what these authors are talking about.

    • Don’t worry.

      It’s really early in England and they’re not making much sense here either :)

  30. Summarizing the post:
    A need for risk assessment of a 2 C increase in global temperature.
    Policy making involves many aspects. Global models are not proven adequate.
    Climate scientists should speak with their audience, and focus on
    speculations regarding changes in climate while indicating how inadequate
    such guessing is. Discuss the large scale and focus on small-scale details regarding future guesses. Discuss what you are going to focus on in the future which would allow more precise guesses for policy wonks.
    And basically insist on yakking for long periods of time rather than providing a brief
    which a politician can easily forget about before returning to playing golf, etc.

    So what is the risk of a 2 C global temperature increase? Well, time is a very important element in any risk assessment. And since the time involved is more than 50 years, that essentially concludes the risk assessment.
    As in: when it is closer than 50 years, let’s talk about it. And perhaps by that time you could have more precise information, and pols could have a better chance of getting any public support for such long-term forecasts.
    So that is the obvious answer, which we shall ignore.
    It’s my guess that a 1 C increase in global temperature over 2 centuries would have more total effect, in terms of climate changes, than a 2 C increase in 100 years. Or, a 10 C increase in one year would have less effect upon climate. A 1-year 10 C increase would have an enormous effect upon the minds of humans, as it would be thought nearly impossible. One would have enormous uncertainty, and therefore one couldn’t predict with any confidence what the temperatures would be in coming years.
    But a 10 C increase in global temperature would not necessarily kill animals or people; ocean temperatures wouldn’t be affected much, nor glaciers.
    Whereas global temperature remaining near constant, or increasing by 1 or 2 C over the next century [and the century after that], would cause larger changes in climate. And we [meaning modern western civilization] have had a larger increase in temperature, and over a few centuries.

    So, a 2 C increase in global temperature isn’t only about climate science; it involves an understanding of recent history. Recent history will tell you it’s possible we could have volcanic events which erase a summer, there could be dust-bowl droughts, and various other regional or global effects on climate.

  31. This reminds me of the situation in Europe in the 16th century when forests were disappearing and it looked like the collapse of civilization was imminent. What happened of course was that new sources of fuel were discovered, namely coal. I think all this hand wringing originates from a lack of imagination. There are plenty of options for replacing fossil fuels that are coming on line. I’m reminded of Malthus and his predictions of doom regarding population. We have plenty of options, including nuclear power, geo-engineering, and adaptation. I do get tired of the doom and gloom doctrine that somehow mankind is going to destroy the planet. This idea is an old one (going back to ancient times) and it’s pretty effectively debunked by Bertrand Russell among others.

    • Please point me to Russell hustling Malthusians. I’ll never forget ‘as red pepper to dynamite’. Oh, wait, that’s Shaw.
      ================

  32. OT but timely news … Michael Mann awarded Hans Oeschger Medal

    “Michael Mann, professor of meteorology and geosciences and director, Earth System Science Center, Penn State, was awarded the Hans Oeschger Medal of the European Geosciences Union.
    The medal was established in 2001 in recognition of the scientific achievements of Hans Oeschger to honor outstanding scientists whose work is related to climate: past, present and future.”

    http://live.psu.edu/tag/research

    • I would not give them *anything*. I would not respond or even acknowledge receipt of their emails. There is no reason to give them any data, in my opinion, and I think we do so at our own peril!

      • It’s not hard to understand why researchers would be reluctant to share their data with auditors who provide fodder for histrionics.

      • That’s a lovely defense that anyone could use. However, you forget that science doesn’t care about the histrionics. The science claim can only be tested if the data as used and the method as employed are shared.
        Will some people misuse the data? Yup; you saw people stumble around with BEST data. In the end you trust the scientific method. If I share my data and my code, I share my power. I can’t control what you do with that, but I can say that I’ve taken all measures to remove my personality from the result. In your world, data would only be shared if scientist X trusted scientist Y not to be emotional – hardly a test anyone can apply rigorously. I’ve yet to find a case where data and code were released and the end result was WORSE than the end result one sees from hiding code and data. So, it’s easy to see WHY researchers don’t share; it’s impossible to JUSTIFY it, regardless.


      • Attached are the calibration residual series for experiments based on available networks back to: AD 1000, AD 1400, AD 1600… You only want to look at the first column (year) and second column (residual) of the files. I can’t even remember what the other columns are! mike
        p.s. I know I probably don’t need to mention this, but just to insure absolutely clarify on this, I’m providing these for your own personal use, since you’re a trusted colleague. So please don’t pass this along to others without checking w/ me first. This is the sort of “dirty laundry” one doesn’t want to fall into the hands of those who might potentially try to distort things…

      • M carey has no snappy answer for that

      • Au contraire, who can say he doesn’t have dirty laundry?

        Dirty laundry feeds an industry.

        http://www.youtube.com/watch?v=8icJnavt2So

      • snappy? hardly, you just equated Mann with Murdoch.

        own goal.

        Your first excuse was that Mann was afraid of what others would do with his data. Your second defense is that everybody does it, or some people are worse. Please get the hell off my team. There is a planet to save and you are not helping.

      • Steve, anyone who could conclude I equated Mann to Murdoch could read McIntyre’s blog and conclude he equated Mann to Sandusky. I’m sure equating wasn’t my intention, and I imagine McIntyre would deny equating was his intention. However, I am not so sure it wasn’t McIntyre’s intention.

        I doubt many people would claim to have no dirty laundry. I would be mistrustful of anyone who made such a claim.


      • M. Carey is also disingenuous, as he knows that McIntyre does not equate Sandusky with Mann, but rather compares the systematic failure of internal inquiries related to Sandusky, Kyle, Neisworth and Lasaga to elucidate truth with the (entirely predictable, with hindsight) similar result of the Mann inquiry.

        Apparently, Carey remains absolutely convinced that the Hockey Stick inquiry, carried out by the same people for the same institution, some of whom have now been fired for gross misconduct, remains an unquestionable vindication of Mann.

        Good luck to Penn State’s counsel at the FOI hearing; given that the prominent statements in support of refusal of release from now-fired President Spanier and ex-Counsel Courtney are a matter of public record, they’re going to need it.

      • Gras Albert said on Nov 18, 2011 at 10:33 am

        “M. Carey is also disingenuous, as he knows that McIntyre does not equate Sandusky with Mann, but rather compares the systematic failure of internal inquiries related to Sandusky, Kyle, Neisworth and Lasaga to elucidate truth with the (entirely predictable, with hindsight) similar result of the Mann inquiry.

        Apparently, Carey remains absolutely convinced that the Hockey Stick inquiry, carried out by the same people for the same institution, some of whom have now been fired for gross misconduct, remains an unquestionable vindication of Mann.
        ________________

        I believe McIntyre’s attempt to draw a parallel between Sandusky and Mann is obvious and shameless.

        I believe you have been misinformed if you think the staff and faculty members who investigated Mann carried out a similar investigation of Sandusky. The names of those who carried out the Mann inquiry are given in the linked reports. Please identify those you have in mind and present evidence that they investigated Sandusky.

        RA-10 Inquiry Report: Case of Dr. Michael E. Mann

        http://www.research.psu.edu/orp/Findings_Mann_Inquiry.pdf

        RA-10 Final Investigation Report Involving Dr. Michael E. Mann

        http://live.psu.edu/pdf/Final_Investigation_Report.pdf

      • RA-10 Inquiry Report, Page 5 Paragraph 1
        On January 26, 2010, Dr. Foley convened the inquiry committee along with University counsel, Mr. Wendell Courtney, Esq. in case issues of procedure arose.

        McIntyre Blog Article, Wendell Courtney’s Last Day
        http://climateaudit.org/2011/11/12/wendell-courtneys-last-day/

        Presumably, Courtney, at the time Penn State’s legal Counsel, gave the advice that the Mann inquiry panel could determine that Mann had no case to answer without ever taking evidence from any of Mann’s critics – an action that would seem, from information now in the public domain, to be standard operating procedure when allegations of wrongdoing were laid against Penn State employees, e.g. Paul McLaughlin’s allegations against Professor John T. Neisworth

        McLaughlin tried to report sex abuse by Penn State professor
        http://www.therepublic.com/view/story/FBC-PENNSTATE-PROFESSOR_6604813/FBC-PENNSTATE-PROFESSOR_6604813/

        Wendell Courtney was legal Counsel to Penn State during the 1998 ‘investigation’ of Sandusky abuse allegations, he was also Counsel during the non investigation into the McLaughlin allegations and he is named in the Mann inquiry report.

        To me, the deficiencies in the internal investigations into Sandusky in 1998, Neisworth in 2001/2 and Mann in 2009/10 are suggestive of a systematic approach by the University to covering up wrongdoing by Penn State employees.

        I doubt that the lawyers in the Mann FOI case will have much difficulty in persuading the Judge to come to a similar conclusion.

      • “you just equated Mann with Murdoch.”

        I would say they are roughly equal as scientists.


      • I think that trying to adopt a timeframe of 2K, rather than the usual 1K, addresses a good earlier point that Peck made w/ regard to the memo, that it would be nice to try to “contain” the putative “MWP”, even if we don’t yet have a hemispheric mean reconstruction available that far back

      • Girma, I am always skeptical of quotes plucked out of context. Before jumping to the conclusion Michael Mann meant what you would like to think he meant, read the context and the discussion.

        http://unlocked-wordhoard.blogspot.com/2009/11/medieval-warm-period-and-cru-e-mails.html

        But even if you don’t want to examine the context of your quote, I am puzzled if you prefer a 1K timeframe rather than the 2K timeframe Mann is proposing. It suggests you want to hide the first 1K. Why would you want to hide it?

      • Words mean what they say:

        it would be nice to try to “contain” the putative “MWP”

        WHY?

      • Is that a rhetorical “why”?

      • Girma, the word “putative” means assumed.

        If you start with a 1K (AD 1000–2000) timeframe temperature graph, the left part (AD 1000) has warmer temperatures (MWP) than the middle part (LIA), so some might assume the temperatures were even warmer before AD 1000. I think Mann wanted to contain this assumption by showing a 2K timeframe (AD 1–2000). Obviously, this would not hide the MWP.

      • Would it be nice to have a reason to call it global?

  33. What ho, Virginia.
    What’s been seen in the shower?
    Water closet mail.
    ==========

  34. We must be living in a parallel universe where people continue to tell a story that is contrary to observation.

    The IPCC projected, for the next couple of decades, a warming of 0.1 deg C per decade for the case in which human emissions of CO2 were held constant at the 2000 level, and 0.2 deg C per decade for the business-as-usual case.

    What does the observation show?

    Observed temperatures are below even the case in which emissions of CO2 were held constant at the 2000 level.

    http://bit.ly/qbt4g6

    What is the reality here?

    There is no uncertainty here: the projections have been found to be wrong.

    No policy response is required for AGW as it is an unconfirmed theory.

    • Girma

      Do I take it from your plot that you no longer subscribe to the hypothesis that there is a 60-year cycle determining the temperature?

      Well, that makes this graph superfluous:

      http://www.woodfortrees.org/plot/hadcrut3vgl/from:1880/to:1891.5/trend/plot/hadcrut3vgl/from:1993/trend/plot/hadcrut3vgl/from:2000/trend/plot/hadcrut3vgl/from:1933/to:1951.5/trend/plot/hadcrut3vgl/from:1940/to:1951.5/trend/plot/hadcrut3vgl/from:1873/to:1891.5/trend/plot/hadcrut3vgl/mean:157/mean:163

      (It compares the short trends around the 60-year points in HadCRUT; note that the actual trends in 1880 and 1940 really did fall on both the 17-year and 10-year lines, while in 2000 both the 17-year and 10-year trends are rising.)

      The probability of seeing the trends in that graph were there an actual fall in temperature (based on examining all falling 10-year trends in HadCRUT and applying Bayes’) is extremely small, and the method is fairly reliable as a predictor.
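
      A toy version of that trend-counting exercise looks like this (not Bart R’s actual computation; a synthetic series with an assumed trend and noise level stands in for HadCRUT):

```python
import numpy as np

# Slide a 10-year window along a series, fit a least-squares trend in each
# window, and tabulate the slopes. The series is synthetic: an assumed
# 0.05 C/decade trend plus monthly noise, standing in for HadCRUT.

rng = np.random.default_rng(1)
n_months = 12 * 160
series = (0.05 / 120) * np.arange(n_months) \
         + rng.normal(0.0, 0.15, n_months)

window = 120                                   # 10 years of monthly data
slopes = np.array([
    np.polyfit(np.arange(window), series[i:i + window], 1)[0] * 120
    for i in range(0, n_months - window, 12)   # step one year at a time
])                                             # slopes in C/decade
print(f"falling 10-year trends: {100 * np.mean(slopes < 0):.0f}% of windows")
```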

      Go ahead and check for yourself, if you like, and comment if you wish on the validity of the method.

      And.. the probability of the rise being as small as 0.06 deg C/decade, comparing priors with the given trends? Far smaller than the probability of the rise being as high as 0.2 deg C/decade, using the same method.

      Sadly, it’s a lot of data, and WfT doesn’t particularly store collections of thousands of trend lines and their slopes very conveniently.

      AGW is far better confirmed than any theory presented by Girma.

      • Bart R

        Thanks for a post without any name calling for a change.

        Bart, these are my honest interpretation of the data:

        http://bit.ly/sxEJpK

        http://bit.ly/szoJf8

        1) Why does a line pass through all the annual global mean temperature peaks?

        2) Why does another line pass through all the annual global mean temperature valleys?

        3) Why are these two lines parallel with a slope of 0.06 deg C per decade?

        Bart, you must address these questions!

      • Annual global mean temperature peaks => [1880s, 1940s & 2000s]
        Annual global mean temperature valleys => [1910s & 1970s]

      • Girma

        1) Why does a line pass through the “compress (12) global mean peaks” every 60 years from 1880 to 2000?

        Well, no line does this.

        I know, on your graph it appears this may be so. However, let’s decompose your graph by parts, stepwise examining the many issues of graphical analysis we encounter on the way.

        A) The first thing we notice is the superfluous-seeming offset 14.

        What does this do to the graph, and what is its benefit?

        Offset 14 changes the scale of the graph, de-emphasizing vertical aspect, leading the eye to traverse the 160 years in a more elongated trail that seems to undulate down and up and down and up.

        Can we dispense of the offset 14 without losing any information and with better resolution?

        It turns out we can, easily, and the image becomes almost ten times clearer to the eye.

        http://www.woodfortrees.org/plot/hadcrut3vgl/compress:12/to:2002/plot/gistemp/compress:12/offset:-0.015/detrend:-0.02/to:2002/plot/hadcrut3vgl/trend/offset:-0.42/detrend:-0.23/plot/hadcrut3vgl/trend/offset:0.1/detrend:-0.23/plot/hadcrut3vgl/from:1880/to:2010/trend/plot/hadcrut3vgl/from:1880/to:1910/trend/plot/hadcrut3vgl/from:1910/to:1940/trend/plot/hadcrut3vgl/from:1910/to:1940/trend/plot/hadcrut3vgl/from:1940/to:1970/trend/plot/hadcrut3vgl/from:1970/to:2000/trend

        We do have to remove the superfluous scale lines, but as they’re vestiges of the step used to construct the lines above and below the central trend and had disappeared under the linear trend lines, nothing is lost.

        Now what does the curve look like?

        For one thing, it appears to be rising much more dramatically.

        For another, we can clearly see how poorly GIS and HadCRUT fit together, the peaks at 1880 and 1940 being supplied only and entirely by the offset, detrended GIS, and HadCRUT not peaking in 1880 but rather falling through that ‘peak line’ from a higher trend back to the start of the curve in 1850. Indeed, almost to the bottom of the HadCRUT valley, the GIS continues to pass through the ‘peak line’ even as late as 1900.

        B) But we know a better way to look at curves than compress(12). We can use a prime two pass annual filter to remove seasonal effects and smooth the relations in the monthly lines, and while we’re at it, let’s include the full run of the temperature curves (but not touch the linear trend lines yet):

        http://www.woodfortrees.org/plot/hadcrut3vgl/mean:11/mean:13/plot/gistemp/mean:11/mean:13/detrend:-0.02/offset:-0.015/plot/hadcrut3vgl/trend/offset:-0.42/detrend:-0.23/plot/hadcrut3vgl/trend/offset:0.1/detrend:-0.23/plot/hadcrut3vgl/from:1880/to:2010/trend/plot/hadcrut3vgl/from:1880/to:1910/trend/plot/hadcrut3vgl/from:1910/to:1940/trend/plot/hadcrut3vgl/from:1910/to:1940/trend/plot/hadcrut3vgl/from:1940/to:1970/trend/plot/hadcrut3vgl/from:1970/to:2000/trend
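
        (For readers unfamiliar with the WfT notation: mean:11 followed by mean:13 is two successive centred running means with near-annual prime windows, which largely cancels the 12-month seasonal cycle. A small sketch on synthetic monthly data; all values below are assumed.)

```python
import numpy as np

# Two successive centred running means (windows 11 and 13 months, as in the
# WfT "mean:11/mean:13" step) strongly attenuate a 12-month seasonal cycle.
# The series here is synthetic: seasonal wiggle + slow trend + noise.

def running_mean(x, k):
    """Centred running mean of window k; the ends are trimmed."""
    return np.convolve(x, np.ones(k) / k, mode="valid")

rng = np.random.default_rng(2)
t = np.arange(12 * 30)                                   # 30 years, monthly
x = (0.3 * np.sin(2 * np.pi * t / 12)                    # seasonal cycle
     + 0.001 * t + rng.normal(0.0, 0.05, t.size))        # trend + noise

smooth = running_mean(running_mean(x, 11), 13)           # mean:11 then mean:13
print(f"std before: {x.std():.3f}, after: {smooth.std():.3f}")
```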

        C) Well, that makes us think, “How strange.” We _could_ make the two curves more coincident, by using the Normalise function. Is there a point to this? Let’s try and see.

        http://www.woodfortrees.org/plot/hadcrut3vgl/mean:11/mean:13/plot/gistemp/mean:11/mean:13/normalise/plot/hadcrut3vgl/trend/offset:-0.42/detrend:-0.23/plot/hadcrut3vgl/trend/offset:0.1/detrend:-0.23/plot/hadcrut3vgl/from:1880/to:2010/trend/plot/hadcrut3vgl/from:1880/to:1910/trend/plot/hadcrut3vgl/from:1910/to:1940/trend/plot/hadcrut3vgl/from:1910/to:1940/trend/plot/hadcrut3vgl/from:1940/to:1970/trend/plot/hadcrut3vgl/from:1970/to:2000/trend

        Well, now the illusion of the peak line coinciding with the tops of the curve is very degenerated. Where’d that appearance go? It didn’t go anywhere. It was never really there.

        D) And what’s this? Why does the central trend line describing the mean rise of HadCRUT start in 1880? And end in 2010? Why end in 2010? The rest of the lines end in 2000 or 2002, don’t they? And HadCRUT starts in 1850, no? And both the peak and valley lines start in 1850, too!

        Let’s see what the graph looks like with the same start and end points:

        http://www.woodfortrees.org/plot/hadcrut3vgl/mean:11/mean:13/plot/gistemp/mean:11/mean:13/normalise/plot/hadcrut3vgl/trend/offset:-0.42/detrend:-0.23/plot/hadcrut3vgl/trend/offset:0.1/detrend:-0.23/plot/hadcrut3vgl/trend/plot/hadcrut3vgl/from:1880/to:1910/trend/plot/hadcrut3vgl/from:1910/to:1940/trend/plot/hadcrut3vgl/from:1910/to:1940/trend/plot/hadcrut3vgl/from:1940/to:1970/trend/plot/hadcrut3vgl/from:1970/to:2000/trend

        Wow. Mindblowing. The trendline is wildly and radically different looking, isn’t it, when you base it on the whole record, or even if you just make the final endpoint 2000, where the 60-year cycle is supposed to end.

        http://www.woodfortrees.org/plot/hadcrut3vgl/mean:11/mean:13/plot/gistemp/mean:11/mean:13/normalise/plot/hadcrut3vgl/trend/offset:-0.42/detrend:-0.23/plot/hadcrut3vgl/trend/offset:0.1/detrend:-0.23/plot/hadcrut3vgl/from:1880/to:2000/trend/plot/hadcrut3vgl/from:1880/to:1910/trend/plot/hadcrut3vgl/from:1910/to:1940/trend/plot/hadcrut3vgl/from:1910/to:1940/trend/plot/hadcrut3vgl/from:1940/to:1970/trend/plot/hadcrut3vgl/from:1970/to:2000/trend

        2) Well, the valleys: that’s a straight line passing arbitrarily between two points roughly 60 years apart. Well, eight points, really, that are somewhat close to the extreme lows but not actually equal to them, and not exactly fitting either GIS or HadCRUT, but looking more like a fit to the ensemble of both after normalisation, detrending and offsetting manipulations.

        3) These two parallel lines with a slope of 0.06 deg C per decade, which appear to coincide with nothing but a few random datapoints on your combination of two graphs, are there because you wanted them. They have no apparent objective meaning. They’re advertising.

        I’ve illustrated FOUR illegitimate practices of graphical presentation A-D and removed them from your graph to reveal that it is nothing like what it purports; I’ve outlined 3 false claims about the graph revealed when it is presented with its cosmetics removed.

        The graphical analysis you manufactured is simply wrong seven different ways.

        Actually, it’s EIGHT ways wrong, and counting: there is no validity to combining GIS and HadCRUT in the way you’ve done for the purposes you purport. If you must compare the two, then present them offset (or one at a time) to separate them so the eye can distinguish them.

        Why? We know GIS and HadCRUT use some stations in common, and have some manipulations on the station data, and so we may produce invalid weighting of station data by recombining these two sibling sets. (This, btw, is a problem with the WfT ensemble set, too, so you’re not the only one to make this mistake.)

        Likewise, we _can_ look at shorter trends than the 160 year trendline with good confidence.

        On HadCRUT or GIS (separately), 30-year trendlines are certainly valid, and even 17-year trendlines overcome signal:noise 95% of the time, 19 times in 20.

        For all the temperature datasets on WfT, the final 30 and 17 year trends are far above the 0.06 deg C level. This tells us that the recent temperature rate of increase is above the average 0.06 deg C.. and, for those datasets exceeding 60 years, we can also dismiss that this rise is somehow related to a cyclic 60-year trend. Or 70 year trend.
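
        The reason longer windows are preferred can be shown directly: the spread of fitted trends shrinks rapidly with window length. A sketch on synthetic monthly noise (the noise level and underlying trend are assumptions, not fitted values):

```python
import numpy as np

# Spread of fitted OLS trends vs window length. Synthetic monthly noise
# (std 0.1 C) around an assumed 0.06 C/decade trend; values illustrative.

rng = np.random.default_rng(4)
for years in (10, 17, 30):
    n = 12 * years
    t = np.arange(n) / 120.0                    # time in decades
    trials = [np.polyfit(t, 0.06 * t + rng.normal(0.0, 0.1, n), 1)[0]
              for _ in range(2000)]             # fitted slopes, C/decade
    print(f"{years:2d}-yr window: trend std ~ {np.std(trials):.3f} C/decade")
```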

        Indeed, we can explain the appearance of a rising 60-year trend by looking at interference patterns of the solar cycle, the AMO and the PDO.. at least until 1980. After 1980, the signal of the solar cycle is long since lost (I say destroyed by GHG perturbation), and the rising trend defies drops in AMO and PDO and pressure from albedo and aerosols and solar changes, upheld — it appears — by GHE.

        Girma, you could address these significant challenges, but I’d be satisfied so long as you just stop using those invalid techniques.

        You’re teaching bad habits to observers who in their naive innocence of the requirements of graphical analysis have no means to defend themselves from these very grave errors you defend.

      • Bart R

        Thank you for your time.

        I will respond to the above long post of yours.

      • Bart R and Girma

        Have been following your exchange with interest.

        It appears that no matter how Bart R adjusts the data we are left with the fact that temperature has shown a cyclical trend of warming and slight cooling overlaid on a long term trend line of 0.6C warming per century.

        At the same time atmospheric CO2 has shown no such cycles, but has increased slowly at first and most recently exponentially at a CAGR of around 0.45% per year.

        The physically observed CO2/temperature correlation is not robust and where there is no robust correlation, the case for causation is weak.

        Any projections made based on the last 30-year warming “blip” are likely to be erroneous, as the latest decade has shown.

        This is pretty much the summary in words.

        Since no one knows what the future will bring we can say that Girma’s projection (based on the past repeating itself) is as sound as that of IPCC (based on model simulations with all sorts of assumed inputs).

        We’ll just have to wait and see who is right.

        Being a rational skeptic who places more weight on actual physical observations than on model simulations based primarily on theoretical deliberations, I would tend to favor Girma’s forecast over that of IPCC.

        Max
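
        Max’s growth figure is easy to sanity-check. With approximate Mauna Loa endpoint values as assumptions, the compound rate comes out around 0.4% per year, in the same ballpark as the ~0.45% he quotes:

```python
import math

# Quick compound-annual-growth-rate check. The endpoint concentrations are
# rough assumptions (approximate Mauna Loa annual means), so treat the
# answer as ballpark only.

c_start, c_end = 315.0, 390.0        # ppm in 1958 and 2011 (assumed)
years = 2011 - 1958
cagr = (c_end / c_start) ** (1.0 / years) - 1.0
print(f"CO2 CAGR 1958-2011 ~ {100 * cagr:.2f}% per year")  # ~0.4%
```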

      • Joachim Seifert

        Max,
        I am working on a 10-15 page paper in English, to be ready in the coming spring.
        Then the “wait and see” period will be over and the climate prediction resolved for all……. no more temp increase; from the 21st-century top plateau on, it can only go downwards…..
        JS

      • Bart R


        Why does a line pass through the “compress (12) global mean peaks” every 60 years from 1880 to 2000?

        Well, now the illusion of the peak line coinciding with the tops of the curve is very degenerated. Where’d that appearance go? It didn’t go anywhere. It was never really there.

        I have used your recommendations, and the following is the graph I get:

        http://bit.ly/nDIJzJ

        Don’t you see that a straight line passes through all the peaks and another straight line passes through all the valleys, and that these lines are parallel?

      • Girma

        Don’t you see that a straight line passes through all the peaks and another straight line passes through all the valleys, and that these lines are parallel?

        I’ve added the other two straight lines that pass through at least two peaks. As you will note, no straight line passes through all three peaks.

        And note that the valley trend line has not the expected two but three points of contact with the temperature curve; within error bars, the whole period between the mid-50s and the late 70s contains three distinct points of contact instead of one.

        I’ve also added the two-pass-filtered temperature trend, which suggests approximately two partial rising cycles in the 1880-2000 graph (cropped there because the apparent cycles don’t continue beyond those points).

        http://www.woodfortrees.org/plot/hadcrut3vgl/from:1880/to:2000/compress:12/plot/hadcrut3vgl/from:1880/to:2000/trend/plot/hadcrut3vgl/from:1880/to:2000/trend/offset:0.25/plot/hadcrut3vgl/from:1880/to:2000/trend/offset:-0.25/plot/hadcrut3vgl/from:1880/to:2000/trend/offset:0.25/detrend:-0.155/plot/hadcrut3vgl/from:1880/to:2000/trend/offset:0.065/detrend:-0.35/plot/hadcrut3vgl/mean:199/mean:211/from:1880

        Please note several elements that strongly disqualify the contention of a dominant 60-year cycle:

        1. The volume between the mean trend and the curve is not the same at the top and bottom of the curve. This powerfully suggests a non-cyclic nature.

        2. The volume of no two homogeneous timeslices is the same: not symmetrical with rise and fall in the curve, not identical from one cycle to the next, not forming a discernible pattern that might be explained by modulation.

        3. There are only two cycles. No graphical analysis of any competence can argue a trigonometric fit on fewer than three full cycles.. and we cropped because we know that both before and after this graph the apparent fit degenerates even more than what we observe within the 1880-2000 span.

        4. The period appears to shift slightly, from about 61 years in the first half to about 57 years in the next.

        This just isn’t a sine curve if it changes period without frequency modulation.. and if you do have frequency modulation, you need not three cycles but hundreds of cycles to confirm the hypothesis, and also must not have amplitude modulation.. which we clearly see even in only two ‘periods’.

        In total, these four observations form a rather absolute disqualification of the claim of a periodic curve dominating temperature. Any single one of these indications might not be enough to deprecate a curve’s periodic nature; however, with all four observed, there is not only no mathematical reason to consider the sine curve hypothesis, but no way to consider it at all.

        What we _could_ consider is the influence of the following:

        http://www.woodfortrees.org/plot/esrl-amo/mean:179/mean:181/from:1880/plot/jisao-pdo/mean:137/mean:139/normalise/plot/hadcrut3vgl/mean:89/mean:97/from:1880/plot/esrl-co2/mean:7/mean:11/normalise

        AMO is highly periodic at 60 year intervals, lagged about 5 years from the global mean appearance of peaks and valleys. The record is pretty good and goes back fairly far with this pattern.

        PDO is somewhat periodic (though subject to wild amplitude modulation) for as long as figures have been available, at about 46-year intervals.

        The coincidence of lows and highs in AMO and PDO starting around 1910 (perhaps even before) and lasting for about 70 years until 1980 created the appearance of a rising global temperature trend with an approximate 60-year period, especially with the inverse relationship with the solar cycle (not shown) thrown in.

        After 1980, if these cycles were all there were to consider, then the global temperature ought to have flattened or even dropped. By this time, the solar cycle had completely ceased to show correlation of any sort with temperature. The best, strongest correlation to explain why global temperature continued to rise while the PDO plummeted is CO2.

        Indeed, there are at least five major factors that ought to be pushing global temperatures toward century lows at this time, and instead we have the warmest decade on record and a warming trend on all global datasets for anything longer than a decade, as expected given the CO2 rise.

        When those four sources of natural variability, plus aerosols, shift together to their peaks, we ought to expect rapid and record warming, though thankfully we don’t soon expect a peak triplepoint (say around 2070) if patterns hold.

      • Bart R

        Tip:

        go to

        https://bitly.com/

        to shorten your URLs.

      • manacker

        I can see some reason for your concerns.

        If my only choice were Girma or IPCC, I’d be in somewhat of a quandary too. The virtual horns of a dilemma. A rock and a hard place.

        IPCC’s errors are not quite as manifold as Girma’s, but there’s no reason to put confidence in them, either.

        There is a warming trend, that’s undeniable. However, the IPCC doesn’t own the trend, and there’s no reason it will follow IPCC orders.

        Making a prediction based on an ensemble of projections in the way some have done it is silly, and has been shown meaningless in other fields long before the IPCC started doing it.

        I’m more of the, “there has been anthropogenic warming (including but not limited to UHI and land use); the best explanation includes CO2E correlations; as CO2E increases, this contribution to temperature rise compared to natural variability will increase if all other things remain equal, no new unknown negative feedbacks emerge, and present trends continue to hold,” school of thought.

        (Girma, thanks for the tip, however I like to see where I’m going on the internet and find WfT links highly readable; if I really needed to shorten my links, I could use WordPress’ tools for aliasing addresses.)

      • Bart R

        If my only choice were Girma or IPCC, I’d be in somewhat of a quandary too. The virtual horns of a dilemma. A rock and a hard place.

        IPCC’s errors are not quite as manifold as Girma’s, but there’s no reason to put confidence in them, either.

        Bart you cannot compare an amateur with the professionals at the IPCC.

        At least, thank you for your “If my only choice were Girma or IPCC, I’d be in somewhat of a quandary too.”

        We must believe what we see:

        http://bit.ly/rxmhWh

        This result shows that the global mean temperature has never exceeded the upper boundary line for long since the record began 160 years ago.

      • Girma

        If we must believe what our eyes see, then we must believe everything our eyes see.

        http://www.woodfortrees.org/plot/hadcrut3vgl/mean:203/mean:207/plot/hadcrut3vgl/from:1900/to:1980/trend/plot/hadcrut3vgl/from:1980/trend/plot/hadcrut3vgl/to:1900/trend/plot/esrl-co2/mean:7/mean:11/normalise

        Step 1: Smooth the temperature series to remove short-term variations that are not related to the effects we are trying to find. (A two-pass, relatively prime filter at about the 17-year level.)

        Step 2: Construct trend lines to remove the triplepoint peaks and valleys where we know PDO and AMO and solar cycles combined in the 1910-1915 and 1940-1945 spans, leaving only those points where ocean oscillations largely cancel out.

        Step 3: Demonstrate the extraordinary correlation of CO2 level with temperature.

        These things are hidden by the cosmetic effects in your graph until, like drawing apart a veil, we apply proven techniques to reveal what the evidence is trying to show us.
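
        For anyone who wants to reproduce Step 1 away from WfT, a minimal sketch of the two-pass moving average; the 203/207-month windows follow the linked graph, and the input series is a synthetic stand-in:

        ```python
        import numpy as np

        def running_mean(x, window):
            """Centered moving average; the ends are trimmed, as WoodForTrees does."""
            kernel = np.ones(window) / window
            return np.convolve(x, kernel, mode="valid")

        def two_pass_filter(x, w1=203, w2=207):
            """Cascade two boxcar filters with relatively prime window lengths
            (~17 years of monthly data); the second pass damps the sidelobes
            that a single boxcar leaves behind."""
            return running_mean(running_mean(x, w1), w2)

        # Synthetic stand-in for ~161 years of monthly anomalies:
        rng = np.random.default_rng(1)
        monthly = np.linspace(-0.4, 0.5, 1932) + rng.normal(0.0, 0.15, 1932)
        smoothed = two_pass_filter(monthly)
        print(len(monthly), "->", len(smoothed), "points after trimming")
        ```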

      • Bart R

        Look what your smoothing has done to the ANNUAL global mean temperature. The smoothed curve ignores the 1880s and 1940s peaks and the 1970s valleys. Your smoothed curve is off by about 0.3 deg C (half of the global warming per century)!
        http://bit.ly/uoEXKj

        A much better fit between model and observation for the peak and valley values is the following graph of mine.

        http://bit.ly/cO94in

        Bart, sorry to say, but your smoothed curve is unfortunately wrong.

        It is wrong because:

        It ignores the 1940s peak.

        It ignores the 1880s peak.

        Thanks to the climategate emails, we know that is what happened:

        “Indeed, in the verification period, the biggest “miss” was an apparently very warm year in the late 19th century that we did not get right at all.”

        “It would be good to remove at least part of the 1940s blip,”

        SMOOTHING HIDES DATA. Please leave the annual global mean temperature values. Don’t touch them.

      • Girma

        I hadn’t realized until your latest reply in this thread just how much you don’t get about graphical analysis.

        No two dimensional image can reveal everything about a three dimensional world.

        Have a look at http://xkcd.com/977/ to see a tiny sampling of how many perfectly valid alternate projections of the globe, for example, have been built to the many purposes map-users may have.

        “Look what your smoothing has done to the ANNUAL global mean temperature. The smoothed curve ignores the 1880s, 1940s peaks and the 1970s valleys. Your smoothed curve is off by about 0.3 deg C! (half of the global warming per century).”

        When I posted my graph, did you forget what you knew of the 1880’s, 1940’s and 1970’s?

        They’re still there, on other less smoothed graphs.

        The information hasn’t gone away entirely just because it was filtered from the graph in question.

        Its removal, like removing the rind of an orange to reveal the flesh of the fruit within, like removing the case of a Swiss watch to reveal its works, was intentional to reveal the core of the strongest underlying mechanisms of climate.

        A much better fit between model and observation for the peak and valley values is the following graph of mine.

        http://bit.ly/cO94in

        Let’s take a closer look at your graph.

        It includes projections past the date it was created. (How long ago was that, Girma? I see a sizeable discrepancy between the temperatures you show after 2008 and what actually happened.)

        These projections of yours, like the error above caused by including the “offset 14” term, draw the eye to suggest a pattern that simply does not exist. Why would you include guesses up to 2100 in such a graph, if not intentionally to confuse the observer?

        Also, and I’ve pointed this out before about the same graph, it cuts off instrumental readings prior to 1880 which are inconsistent with the pattern you claim. This cropping is entirely illegitimate, if your purpose is to demonstrate goodness of fit.

        You do note excursions from your pattern in the 1890-1900’s and 1950’s (overperformance), however you ignore excursions that based on volume are more sizeable in the 1920-1940’s and 1990’s (underperformance).

        All told, your graph is a poor fit for half of the actual data. (Though it’s impossible to discuss its fit for 2012-2100 data, as there is no data.)

        Your ‘periodic’ graph includes three peaks (two curtailed) and two valleys.

        It is thus only half as long as a graph would need to be to establish a periodic relationship, and that only if there were no indications of double modulation (i.e. both amplitude and frequency), which it also fails on.

        Further, your graph proposes no mechanism to explain the periodicity you claim.

        I can counter these weak claims with the very well-established AMO, PDO and solar cycles. I have illustrated how they interact constructively at exactly the points you claim are part of your trend (which they will do roughly every seven decades, alternating between peaks and valleys, and which in the first half of the 20th century they coincidentally did for a valley, a consequent peak and then again a consequent valley), and I see nothing in your work that explains the divergences which my proposal does.

        It ignores the 1940s peak.

        And I explained why. Also, you may wish to note, this ‘ignoring’ of the 1940’s peak is a mathematical outcome of the weakness of the peak compared to the overall trend. That the peak was so substantially diminished was due to a real physical effect, as is the rest of the smoothed curve’s direction and shape.

        It ignores the 1880s peak.

        And I explained why this would be, likewise.

        Your curve, on the other hand, cropped the entire first 30 years of the data, implied its last 92 years solely from imagined fit, and de-emphasized half of the excursions from the claims in the small timeframe you did use, based on the one dataset that best matched your expectations.

        The hemispherical datasets don’t show your claimed trends. The two other global datasets from station data don’t show your claimed trends. BEST doesn’t show your claimed trends. AMO and PDO show what causes the appearance of your trends.

        SMOOTHING HIDES DATA. Please leave the annual global mean temperature values. Don’t touch them.

        Of course smoothing hides data. That’s unavoidable, as all presentations hide or change something. We must add a caution when we do something that hides data (as I did in my narrative, and as you _never_ do) so the observer is reminded to consider more than just the image presented. We must also be careful to avoid the many pitfalls of invalid graphical technique, which your graphs repeat so often and with such flagrance that they might have been taken from a manual on how to fool the eye.

        If you think my graph was due criticism, let me show another one that makes clear how terrible that graph is:

        http://www.woodfortrees.org/plot/best-upper/mean:97/mean:89/from:1825/plot/best/mean:97/mean:89/from:1825/plot/best-lower/mean:97/mean:89/from:1825/plot/best-upper/from:1950/trend/plot/best-lower/from:1825/to:1950/trend/plot/best-upper/from:1825/to:1950/trend/plot/best/from:1950/trend/plot/best-lower/from:1950/trend/plot/best/from:1825/to:1950/trend/plot/hadsst2gl/mean:11/mean:13

        Be warned, smoothing has hidden excursions and extreme values — as has the long timespan presented.

        Note also that the graph is very busy, which will distract the eye, but it does show the vastness of our uncertainty in the instrumental record, too.

        We might still be below the 1825 global temperature, based on all that instruments can tell us, if the 95% confidence upper BEST curve is accurate. We might alternatively have been warming since 1825 at as steep a rate as in the past 60 years (or steeper), if the 95% lower BEST temperatures are right (and the 1940s 95% upper BEST accurate).

        (And recall, instrumental data is at least an order of magnitude more precise than all the proxy data — all the things that tell us there may even have been an LIA or MWP or earlier global temperature event.)

        And look at that oceanic trend line! The HadSST series looks for its first half century as if it were calibrated exactly on the global record including land, then as if it were sandwiched between the lower and average trends until 1980, and now appears to be dropping away from the land trends rapidly.

        I ought to have included that trend too, to discuss the global-ocean relationship.. if I knew what the relationship were. So much to speculate on, so little information.

        But whatever else, we can be extremely certain, it’s not a 60 year wavelength temperature trend starting in 1880 and moving up 0.06 deg C/decade.

      • Bart R

        I enjoyed the civil interaction with you in this thread. Thank you. I think we now genuinely understand each other’s positions.

        Let the observation in the coming decade resolve our argument.

        In the following graph, in the next decade, if the annual global mean temperature data lie in the red region, your and IPCC’s position regarding AGW is right. If it lies under the red region, we sceptics are right.

        http://bit.ly/oI8dws

        Bart, do you agree with that?

      • Girma

        (A small primer for those who wish a background on the addition of two waves: http://www.youtube.com/watch?v=WHjZf1PrFcs, given we have a 60-year AMO and 46-year PDO.)
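
        For those who prefer code to video, an illustrative sketch of the same idea: summing a 60-year and a 46-year sinusoid (amplitudes and phases here are arbitrary) and watching where they reinforce and where they cancel:

        ```python
        import numpy as np

        years = np.arange(1880, 2101)
        amo = np.sin(2 * np.pi * (years - 1880) / 60.0)   # stylized 60-year AMO
        pdo = np.sin(2 * np.pi * (years - 1880) / 46.0)   # stylized 46-year PDO
        combined = amo + pdo

        # Where the cycles reinforce, the sum approaches +/-2; where they
        # cancel, it hovers near zero.
        for y in range(1880, 2101, 20):
            print(y, "%+.2f" % combined[y - 1880])
        ```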

        Again, we must disagree; however, we’re making progress, as now the reason for dispute is primarily Bayesian, and not from strict Graphical Methods.

        Let the observation in the coming decade resolve our argument.

        In the following graph, in the next decade, if the annual global mean temperature data lie in the red region, your and IPCC’s position regarding AGW is right. If it lies under the red region, we sceptics are right.

        http://bit.ly/oI8dws

        Bart, do you agree with that?

        Any one decade, as we’ve seen in countless cases, might deliver a surprise, an unexpected or ambiguous or marginal outcome, or even an expected outcome but for demonstrably other reasons.

        Suppose a major volcanic event tomorrow, or a string of five significant volcano events spread over the next decade.

        Suppose China or another major economy has been mistaken about its aerosol emissions, and this remains undiscovered for whatever reason for more than a decade.

        Or clathrates begin to bubble up in the thawing permafrost of the Arctic and from the seabeds.

        Or suppose some Chaos effect causes the solar and ocean cycles to synchronize acausally, as sometimes happens in complex systems?

        No, wagering on outcomes will not give us meaningful answers. Further, regardless of the outcome (though with some asymmetry), the evidence presented by one further decade will give only a marginally different probability for both proposed priors.

        The next decade cannot much further prove you wrong, nor can it much further prove me right.

        We go — strictly on the global temperature record and ignoring all else — from 1000:3 odds in favor of AGW (though not IPCC’s prediction from ensembles) over Girma to about 1001:3 odds, even if the next decade is an unprecedented vertical line up; and likewise from 1000:3 to about 999:3 odds even if the next decade is likewise vertical down.

        What it takes to shift statistical understandings significantly is to come to a better mechanical understanding of the system, to dramatically increase data quality (as BEST has done in many ways), and to seek meaningful consilience from other kinds of datasets – such as metrics of extreme events.

        While paleo/geophysical proxies are generally poorer in a number of ways, the sheer number of potential proxies and size of their datasets also may help.

        All of this, however, is predicated on proposed mechanisms.

        Which your hypothesis does not establish very clearly.

        So, no. Waiting another decade and looking at thermometers only.. is a wasted effort, at best misleading, and at worst teaching bad logic.

      • Bart

        The next decade cannot much further prove you wrong, nor can it much further prove me right.

        I will change my position from a sceptic to accepting AGW if in the next decade the annual global mean temperatures lie in the region shaded red in the following graph.

        http://bit.ly/oI8dws

        This is my position because in the last 160 years the annual global mean temperatures have not exceeded the upper boundary line for long as shown in the following graph.

        http://bit.ly/rxmhWh

        Thanks again Bart

        Cheers

      • Girma

        We already have enough people who believe in good science for bad reasons, so I encourage you to rethink your position.

        I’ve more than once heard people opine that Einstein was right because he had such soulful eyes.

        Will you still change your mind to support AGW if there is a sudden drop in cloud albedo for the rest of the decade, caused by some heretofore unknown mechanism unrelated to GHGs?

        If overnight several substantial sources of aerosol emissions shut down for the rest of the decade?

        If suddenly volcanic activity dramatically shifts?

        If there is a dramatic and unprecedented change in solar activity?

        And.. is this the only thing that will change your mind?

        Waiting 10 years to see whether, by coincidence, a poorly-worded prediction from someone who ought to know better, speaking against the advice of his IPCC colleagues, turns out to be momentarily right?

        Wouldn’t logic, mathematics, collection of much better evidence and clearer, cleaner presentation, design of some experiment with the undeniability of Rutherford’s or Millikan’s work of a century ago, or the like, be more satisfying to you than a wager?

      • Bart R

        I believe there is a signal in the annual global mean temperature data. I believe we have no clue why it is on a trajectory of 0.06 deg C per decade warming as shown below.

        http://bit.ly/vpkwvv
        (To emphasize the trend I have changed the vertical scale)

        All the aerosols, volcanoes and greenhouse emissions have not changed this trajectory for the last 160 years. As a result, the effect of aerosols, volcanoes and greenhouse emissions on the climate is nil.

        I believe this trend will continue until the climate changes like it did at the end of the little ice age.

  35. Stern and his collaborators have always emphasized the uncertainties of the consequences of climate change and have approached those uncertainties in the spirit of the precautionary principle, formulating it with the methods of the economics of risk and time (risk aversion and discounting). This article follows the same logic.

    The paper emphasizes the need to understand scientific uncertainties, but appears to do so only for one part of the problem. The authors state that no model can extend reliably far into the future – and conclude that the uncertainty should only add to the significance of the climate issue.

    The papers of Stern and his collaborators on environmental economics fail seriously, in my judgment, on the other side of the issue. How they calculate the present value of the risks of climate change is highly questionable; how they calculate the value of mitigation measures is, however, worse than that. If the climate models are very unreliable in estimating the risk of catastrophic consequences, the understanding of the consequences of the policy options seems to be totally lacking. The future influence of the policies is calculated simplistically, in some sense in a linear and additive fashion, although the real consequences are influenced very essentially by political feedbacks, i.e. by the future decisions of future decision-makers. Any attempt to include these feedbacks shows that the calculations are really meaningless.

    It’s not possible to compare long-term risks and benefits for human well-being in the way environmental economists do when they search for quantitative results. Stern’s results sit at one extreme, using the lack of valid theory to produce results that support one political view. Many others (like William Nordhaus) are more balanced, but the lack of valid theory makes their quantitative results highly questionable as well.
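
    To make the scale of the disagreement concrete, a toy present-value calculation shows how strongly any long-horizon damage estimate depends on the chosen discount rate. The damage figure is hypothetical and the rates are only roughly in the Stern-like and Nordhaus-like ranges; nothing here comes from either author's models:

    ```python
    # One hypothetical climate damage of $1 trillion incurred in 2100,
    # discounted back to the present (2011) at three different rates.
    damage = 1.0e12
    years = 89

    for rate in (0.014, 0.03, 0.055):   # roughly Stern-like, middle, Nordhaus-like
        pv = damage / (1.0 + rate) ** years
        print("discount rate %.1f%% -> present value $%.0f billion"
              % (100 * rate, pv / 1e9))
    ```

    The same future damage is worth roughly $290 billion today at the low rate and under $10 billion at the high one, which is why the quantitative conclusions diverge so sharply.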

    • Pekka,

      And yet these people have gained political power through years of manipulation with IPCC reports and many other bogus articles, pushing the panic button of uncontrollable warming through man’s activities and tying it all to the rise of CO2.
      It takes years of data to get a pattern of warming and cooling of temperatures, in what is an insignificant amount of planetary time.
      An Ice Age is precipitation based, so why ignore precipitation and focus strictly on temperatures?
      This year there has been a drop of 6mm in sea levels. That is three times the rate at which they had been rising. Looks like we are in for some extremely heavy precipitation.

  36. For the attention of Steve Mosher:
    Getting concerned with uncertainties would not get me very far, but I got this one through (only after emailing the data) past ‘St. Svalgaard, WUWT’s heliospheric gatekeeper’:
    http://www.vukcevic.talktalk.net/HMF-T.htm
    Notice the swing of up to 1.5 degrees C in the temperatures (not global). Details will be online soon, but it is based on the findings in this article
    http://www.vukcevic.talktalk.net/theAMO.htm (pages 11+)
    “It’s the Arctic” – the faint lone voice in wilderness.

  37. The paper advocates “scientific speculation on policy-relevant aspects of plausible, high-impact, scenarios even where we can neither model them realistically nor provide a precise estimate of their probability”

    How does this differ from the current common practice of dreaming up and endlessly dramatizing catastrophic scenarios (Greenland to melt, Africa to dry up, etc.) without bothering to mention how wildly unlikely they are?

    And “designing model experiments to meet the needs of policy-making”. Is it possible to be more explicit that the politics are to be prior to the science?

    I’m not at all sure I see a new leaf being turned over here; it looks more like same old, same old.

  38. Joachim Seifert

    Dear Judith,
    The paper calls for:
    “…….identifying weaknesses in the science that are likely to reduce the robustness of policy options;”
    “……. clarifying where adaptation to current climate is currently lacking; — identifying observations which, in a few decades, we will wish we had taken today………”

    But,….. if you do exactly THIS by writing to and informing the 40 institutes on whose calculations the AR3 and SRES scenarios are based……
    then…… once they apprehend that you come from the sceptics’ camp, they all practice absolute ignorance, like a child behaving obstinately….

    I wrote to the major German physics organizations……. with the tiny request that they just acknowledge having received my humble information…. no reply whatsoever….. some English institutes have responded (the gentlemen’s culture),…… but the German institutes are entrenched to the last bone, which reminds me of the time before 1945……very comparable… they do not want to deal with skeptics…. it’s them or nobody…..
    JS

  39. Judith Curry

    Thanks for posting the entire Smith and Stern article, which was behind paywall.

    The authors start off with:

    Policy-making is usually about risk management. Thus, the handling of uncertainty in science is central to its support of sound policy-making.

    This is discussed in the context of

    whether a target of ‘50 per cent chance of remaining under +2°C’ is either ‘right’ or ‘safe’.

    In this context the authors bring up a salient point:

    Science often focuses on what is known and what is almost known; dwelling on what one is unlikely to know even at the end of one’s career may not aid the scientist’s career, yet exactly this information can aid the policy-maker. Scientific speculation, which is often deprecated within science, can be of value to the policy-maker as long as it is clearly labelled as speculation. Given that we cannot deduce a clear scientific view of what a 5°C warmer world would look like, for example, speculation on what such a world might look like is of value if only because the policy-maker may erroneously conclude that adapting to the impacts of 5°C would be straightforward. Science can be certain that the impacts would be huge even when it cannot quantify those impacts.

    Before one starts making serious risk analyses about what a “5°C warmer world would look like” one should attempt to ascertain whether or not such a world could even be physically caused by human GHG emissions.

    The authors accept “a priori” the IPCC position that such a world is not only a physical possibility, but even that it is something that might actually occur. This is IMO a “fatal flaw” in their analysis.

    Let us forget all the climate model simulations, upon which such a postulation is based, and look simply at empirical evidence.

    Humans have been emitting CO2 into the atmosphere as a result of industrial activity since the beginning of the Industrial Revolution.

    Fortunately, we have a modern record of globally and annually averaged land and sea surface temperature anomalies, which goes back to 1850, at a very early stage of the Industrial Revolution. This record shows that it has warmed by 0.7°C since 1850. There have been some challenges to the validity or accuracy of this record, but as far as I have seen there have been no serious claims that it understates the actual warming, so let’s say the record is sound.

    Through data from the Vostok ice core, we have an estimated value of atmospheric CO2 concentration in 1850 (290 ppmv), and from Mauna Loa we have measurements of the current level (390 ppmv).

    What we do not know, however, is how much of the past warming was caused by human activities as opposed to natural factors, nor how much of the past human impact was caused by CO2.

    Here we must rely on IPCC estimates, rather than observed empirical data.

    IPCC (AR4 WG1) estimates that 7% of the past warming was from natural forcing factors (solar) and that all other anthropogenic factors other than CO2 essentially cancelled one another out. IPCC also tells us that the CO2/temperature relation is logarithmic.

    From these data points we can calculate the observed temperature response to the observed increase in atmospheric CO2.

    So far this is fairly straightforward.

    Now we have to do some “crystal ball gazing” to attempt to ascertain what the future will bring.

    This is, at best, a very nebulous undertaking (see Nassim Taleb’s The Black Swan).

    First we must try to guess how much CO2 humans are likely to emit, let’s say to the year 2100. [Predictions that go beyond a few years are already very suspect; those that go further than a decade or two are usually totally meaningless, but let’s have a go, anyway.]

    Today humans emit around 30 GtCO2 annually and atmospheric CO2 concentration is at 390 ppmv and increasing by about 2 ppmv per year.

    Over the past several years the atmospheric CO2 level has increased at an exponential rate of 0.45% per year.

    If we assume that the future increase will be exponential, we have:
    1.0045^90 * 390 = 585 ppmv by 2100

    [This turns out to be the same as IPCC “SRES scenario and storyline B1”, a “business as usual” scenario with no climate measures, moderate economic growth and global population leveling off at end of century.]

    Extrapolating the observed CO2/temperature response to the estimated CO2 level by 2100, we arrive at an additional warming above today of:

    dT(390-585 ppmv) =
    dT(290-390 ppmv) * ln ( 585 / 390) / ln (390 / 290) =
    0.93 * 0.7 * (0.4055 / 0.2963) = 0.9°C

    This is not anywhere even close to the suggested “5°C warmer world”.

    But let’s look at whether such a world is even physically possible (from human CO2 emissions).

    The WEC issued a 2010 report listing the “proven fossil fuel reserves” as well as the much higher “total inferred possible fossil fuel resources” of our planet.
    http://www.worldenergy.org/documents/ser_2010_report_1.pdf

    From these estimates one can calculate that if ALL the optimistically inferred fossil fuels on our planet were consumed, the atmospheric CO2 concentration would increase by 675 ppmv, to a total “maximum possible ever” level of around 1065 ppmv.

    At this level the physically observed CO2/temperature response would tell us that we would have the following warming:

    dT(390-1065 ppmv) =
    dT(290-390 ppmv) * ln ( 1065 / 390) / ln (390 / 290) =
    0.93 * 0.7 * (1.0046 / 0.2963) = 2.2°C

    That’s it! Even here we are far from reaching the suggested “5°C warmer world”.
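
    For anyone who wants to check this arithmetic, a short sketch reproducing it; every input (the 0.7°C observed warming, the 93% attribution, the 290/390 ppmv endpoints, the 0.45% CAGR and the 675 ppmv from WEC resources) is an assumption stated above, not a settled value:

    ```python
    import math

    dT_obs = 0.7            # observed warming since 1850 (deg C)
    f_co2 = 0.93            # fraction attributed to CO2 (reading of AR4 above)
    c0, c1 = 290.0, 390.0   # ppmv in 1850 and today

    def warming(c_future):
        """Extrapolate the observed CO2/temperature response logarithmically."""
        return f_co2 * dT_obs * math.log(c_future / c1) / math.log(c1 / c0)

    c_2100 = c1 * 1.0045 ** 90                  # 0.45%/yr compounded for 90 years
    print("CO2 in 2100: %.0f ppmv" % c_2100)                         # ~585
    print("extra warming by 2100: %.1f C" % warming(c_2100))         # ~0.9

    c_max = c1 + 675.0                          # all WEC-inferred resources burned
    print("warming at %.0f ppmv: %.1f C" % (c_max, warming(c_max)))  # ~2.2
    ```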

    Later on the authors write:

    Avoiding the question of what the probability of a given climate outcome is, and asking instead if that outcome has a probability of, say, less than half a percent, would ease many of the difficulties that distinguishing imprecision and ambiguity pose for climate scientists, while potentially retaining much of the information of value to policy-making.

    We do not need to “worry” about “what a 5°C warmer world would look like” (from human CO2 emissions, at least), even at the level of “less than a half a percent” probability because it is not even a physical possibility, based on the empirical evidence we have at hand.
    So let’s get on with worrying about real things instead.

    Max

    • Max,

      The fatal flaw is strictly following temperatures.
      Many other circulation drivers are not temperature based.
      But don’t tell that to the climate scientists as it will only be seen as interesting with no bearing on the data manipulation.

    • Max,

      A small technical point: Estimates of pre-industrial CO2 concentration usually derive from Law Dome ice core data, not Vostok.

      Regarding your calculations: one important thing you should keep in mind is that the factors you’re addressing have not been increasing linearly since 1850. About 70% of the CO2 increase from pre-industrial levels has occurred since 1960, about 50% since 1980. Most of the ~0.7ºC warming has occurred in the last 30-40 years. What we’ve observed has effectively been a transient response to perturbation.

      The standard definition of transient response is the temperature change after 70 years of a 1% per annum increase (or decrease) in CO2 concentration (1% pa gives a doubling (or halving) in 70 years). Your figures for such a response (dT(290-585 ppmv) = 1.6ºC) agree quite well with figures derived from the IPCC’s GCM ensemble. However, to project future temperature changes you also need to consider the equilibrium response to an already present energy imbalance.
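
      A quick numeric check of both points, using only figures already quoted in this exchange:

      ```python
      import math

      # 1% per annum compounding does indeed double CO2 in about 70 years:
      print(1.01 ** 70)                                    # ~2.007

      # Transient response implied by the figures above:
      dT_290_390 = 0.93 * 0.7                              # ~0.65 C attributed to CO2
      dT_390_585 = dT_290_390 * math.log(585 / 390) / math.log(390 / 290)  # ~0.89 C
      print(dT_290_390 + dT_390_585)   # ~1.5 C for the 290 -> 585 ppmv near-doubling
      ```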

      You can get a feel for the equilibrium response by looking at the orange line on this chart. It shows a forecast of what will happen if all forcing factors are frozen at year 2000 values: there is a further 0.3-0.4ºC warming through the century. Obviously the size of the equilibrium response will be proportional to the size of the imbalance, so you would expect more warming from stabilisation after a doubling compared to stabilisation after a 35% increase.

      Also implicit in your calculations is an assumption that negative forcing factors (mainly aerosols) would keep pace with positive forcing factors (mainly long-lived GHGs) through the century. Unfortunately this wouldn’t be the case. The atmospheric lifetimes of well-mixed GHG concentration changes are measured in decades to centuries, whereas lifetimes of aerosols are measured in days to weeks. This being the case, well-mixed GHG concentration increases act like compound interest, whereas aerosols have to replenish their balance constantly: there will be an increasing divergence between aerosol negative forcing and GHG positive forcing. Also, as fossil fuel use decreases, warming will continue or even temporarily accelerate, since the first effect will be a drop in aerosol numbers.

      I think if you take these factors into consideration you’ll get figures quite close to the IPCC’s. Still 5ºC is perhaps unlikely by 2100, but the higher end emissions scenarios should see such an increase by ~2200.

      • Paul S

        Thanks for your response.

        Let’s go through it.

        OK on ice core source.

        Of course, neither CO2 nor temperature have been “increasing linearly since 1850” (I agree with you here).

        CO2 increased gradually at first (based on ice core estimates cited by IPCC). Since Mauna Loa measurements began, the rise has accelerated. Most recently it has been increasing exponentially at a compounded annual growth rate of around 0.45% per year.

        Temperature has increased by 0.04 to 0.05°C per decade on average, but this has occurred in ~30-year warming cycles with ~30-year cycles of slight cooling in between, bearing no resemblance whatsoever to the CO2 record (see Girma’s analysis).

        Your mention of the theoretical “transient response” (and its “standard definition”) is of less interest to me than the actual physically observed CO2/temperature response.

        You apparently agree with my estimation of the warming to 2100:

        Your figures for such a response (dT(290-585 ppmv) = 1.6ºC) agree quite well with figures derived from the IPCC’s GCM ensemble.

        93% of 0.7ºC from 1850 to today from CO2, plus another 0.9ºC from today to 585 ppmv in 2100, gives ~1.6°C, so we agree.

        You then switch from observed CO2/temperature response to model-based theory with:

        However, to project out to future temperature changes you also need to consider the equilibrium response to an already present energy imbalance.

        I may be trampling on a sacred cow of climatology here, but I believe that one should start with empirical evidence, based on physical observations or reproducible experimentation, following the scientific method, rather than simply cranking hypotheses into computer models and seeing what comes out the other end.

        The notion of a different “equilibrium response” from that which we have physically observed over the 160+ year period from 1850 to today has not been validated by actual physical observations, but is a model construct based on theory.

        Your second point: the notion that “aerosol forcings will not keep pace with CO2 forcings” is not based on any physical observations, but on model simulations with very dicey inputs. We see how climatologists are running around trying to blame the recent lack of warming on Chinese aerosol emissions.

        Your third point on the lifetime of CO2 in the atmosphere is a red herring, Paul. I have simply assumed future CO2 will stay in the atmosphere just as long as past CO2 has done.

        You then write:

        Still 5ºC is perhaps unlikely by 2100, but the higher end emissions scenarios should see such an increase by ~2200.

        You may be right in writing that IPCC’s higher end emissions scenarios should see such an increase by ~2200, but, as I have shown you based on WEC data on the amount of fossil fuels remaining on our planet, the postulated CO2 levels of these higher end emission scenarios are physically impossible to ever reach, so these scenarios are silly.

        Sorry, Paul.

        NO SALE.

        Max

      • Your mention of the theoretical “transient response” (and its “standard definition”) is of less interest to me than the actual physically observed CO2/temperature response.

        Transient response isn’t a theory. It’s a concept, a way of understanding forced temperature changes and it is precisely related to the physically observed CO2/temperature response. The point is that your figures were arrived at by calculating the transient warming response across two periods. However, forcing changes not only produce transient surface warming but also an energy imbalance at the top of the atmosphere which will cause further equilibrium warming even if forcings are stabilised.

        You then switch from observed CO2/temperature response to model-based theory

        This is not model-based theory. It’s something shown in models, certainly, but the theory is derived from quite basic physics and is backed up by empirical observations. Perhaps the simplest way to see it is by looking at the land-ocean warming contrast. Over the past 30 years land surface temperatures have increased by ~0.8ºC whereas sea surface temperatures have increased by ~0.35ºC. This is expected because both land and ocean will receive (roughly) the same energy increase but uniquely in the oceans a large portion of that will be distributed vertically several hundred metres down (this is what we see in the ocean heat content data), rather than simply being used to warm the surface. However, at some point the sea surfaces must warm up by about the same amount as the land so they are radiating out enough energy to close the imbalance.

        So, looking at the past 30 years we can posit a transient response to the total net forcing changes over this period of ~0.5ºC (that being the observed land-ocean surface change), and tentatively infer an equilibrium sensitivity of ~0.8ºC (Note that this is NOT equilibrium sensitivity to a CO2 doubling, just to forcing changes since 1980).

        Your second point: the notion that “aerosol forcings will not keep pace with CO2 forcings” is not based on any physical observations, but on model simulations with very dicey inputs. We see how climatologists are running around trying to blame the recent lack of warming on Chinese aerosol emissions.

        No, this has nothing to do with model simulations. Perhaps I’ll put this in a simpler way: The primary anthropogenic source of aerosol emissions is fossil fuel burning so as we continue burning we are increasing CO2 concentration and replenishing aerosol stocks. However, aerosols drop out of the sky very quickly so think about what happens if burning suddenly stopped. CO2 will start to gradually decline over the years but the aerosols, which have mitigated some of the warming, will be gone within months, and the things they previously mitigated against will still be hanging around.

      • Nice try, Paul, but you are not very convincing when you write:

        However, forcing changes not only produce transient surface warming but also an energy imbalance at the top of the atmosphere which will cause further equilibrium warming even if forcings are stabilised.

        And follow this up with:

        So, looking at the past 30 years we can posit a transient response to the total net forcing changes over this period of ~0.5ºC (that being the observed land-ocean surface change), and tentatively infer an equilibrium sensitivity of ~0.8ºC

        The “past 30 years” have been another warming cycle, statistically indistinguishable from the 30 year warming cycle starting around 1910, before there was much human CO2, and not much different from an even earlier one in the late 19th century, when there was essentially no human CO2.

        Don’t look at a single 30-year “blip” to “posit” something, Paul – look at the 160+ year record instead. It will tell you a lot more, as I pointed out earlier.

        Bring some empirical evidence to support your “concept”; this “further equilibrium warming” has not been seen over a 160+ year period (and I am not referring to model simulations or dicey, subjectively interpreted paleo-climate data from carefully selected geological periods, but real-life data based on physical observations or reproducible experimentation).

        The “aerosol” rationalization has worn thin, Paul. First it was used to try to rationalize the roughly 30-year period of slight cooling starting around 1946, which occurred despite the fact that CO2 emissions had begun to increase exponentially in the post-WWII boom years.

        Now that it has again stopped warming 30 years later despite CO2 reaching record levels, it is being dusted off and brought back as a rationalization (this time it’s the Chinese).

        Fool me once, shame on you. Fool me twice, shame on me.

        Paul, the facts of the matter are that CO2 and temperature have shown very little correlation, when one really looks at the past 160+ years.

        And we both know that lack of robust correlation makes the case for causation very weak.

        Even so, using the actually observed CO2 and temperature record plus the IPCC assumptions on human versus natural forcing, we see that CO2 has not been a major driver of climate and that postulations of 5C warming by 2100 (or 2200 for that matter) are ludicrous, as I pointed out, because there just isn’t enough carbon out there to get there, even if it were all 100% consumed.

        The S+S deliberations on “risk management” are based on a silly assumption of a “5 degree C warmer world” (caused by AGW, of course), and can therefore be ignored.

        This was my point, which you have been unable to refute.

        Max

      • You began with these premises, to which I have been responding by pointing out that you’re missing some of the picture:

        IPCC (AR4 WG1) estimates that 7% of the past warming was from natural forcing factors (solar) and that all other anthropogenic factors other than CO2 essentially cancelled one another out.

        dT(390-585 ppmv) =
        dT(290-390 ppmv) * ln ( 585 / 390) / ln (390 / 290) =
        0.93 * 0.7 * (0.4055 / 0.2963) = 0.9°C

        You’re now introducing a completely different ‘cycles’ concept and dismissing the role of aerosols which are the main basis for anthropogenic effects ‘cancelling one another out’, which makes this whole premise redundant. Is there any point in continuing this discussion if you’re simply going to abandon the premise and talk about something completely different when things aren’t going so well?

      • Paul S

        You are waffling.

        We are doing a “reality check” on the S+S paper, which discusses the “risk” associated with a “5°C warmer world” (from AGW, of course).

        I simply pointed out that this assumption was absurd, and why this is so.

        Now you state:

        You’re now introducing a completely different ‘cycles’ concept and dismissing the role of aerosols

        Let’s go through both parts of this claim.

        The “cycles” concept is not something “completely different”, which I have introduced at all. It is an inherent part of the physically observed temperature record, like it or not (see Girma)

        You are the one who brought up the latest warming cycle (stating erroneously that “most of the warming” observed to date had occurred during this period), and I simply pointed out that this was one of three similar cycles and that these short “blips” should not be used to draw conclusions about CO2 forcing, but rather that one should look at the entire 160+ year record instead.

        And when one looks at the entire record, one sees a) a poor correlation between CO2 and temperature and b) a CO2/temperature response of less than 0.7°C for a CO2 increase from 290 to 390 ppmv.

        You then bring up aerosols as the “wild card”.

        IPCC has told us that all anthropogenic forcing components other than CO2 (aerosols, other GHGs, etc.) have cancelled one another out in the past. I accepted that at face value.

        There are no physical observations that would support the notion that aerosols have played a major part in the mid-century cooling (or in the current “lack of warming” for that matter). This is all just rationalization.

        Finally, we have the IPCC estimate of natural forcing, which I accepted at face value, as well.

        IPCC has given us its estimate that natural forcing (solar) only represented 7% of the total forcing since industrialization, with the rest attributed to human-induced forcing.

        At the same time, IPCC has conceded that its “level of scientific understanding” of “solar forcing” was “low”.

        There have been many solar studies, which would attribute around half of the observed warming to the unusually high level of solar activity of the 20th century (rather than just 7%).

        In spite of these studies and the IPCC admission that its LOSU was low, I accepted the IPCC claim for the purposes of critiquing the S+S paper on risk.

        Finally to “equilibrium” versus “transient” climate response: The “equilibrium” concept is based on the theoretical deliberation that net feedbacks are strongly positive, so that 2xCO2 sensitivity is estimated to be around 3°C. Since the actual observations do not support such a high sensitivity, there needs to be an equilibrium delay to explain the discrepancy.

        I have seen no empirical data supporting the equilibrium concept as you have described it.

        Finally – and that is the crucial point – there is the embarrassing problem that there just isn’t enough carbon in all the optimistically estimated fossil fuels left on our planet to reach the CO2 levels required to even come close to a “5°C warmer world”, as S+S have conjured up.

        So, back to the main topic here, the S+S deliberation about a “5°C warmer world” is total rubbish, as I demonstrated to you and you have been unable to refute.

        Max

      • Most recently it has been increasing exponentially at a compounded annual growth rate of around 0.45% per year.

        How could that be? The preindustrial level was around 280 ppmv, as part of the natural carbon cycle. According to you, humans are multiplying the natural level by 1.0045 each year. How could they be multiplying it by anything? What mechanism would accomplish that?

        By emitting CO2, humans are adding to the natural level each year, not multiplying it by something. Redo your math on the assumption of addition of an exponentially growing anthropogenic component to the natural level of 280 ppmv and you’ll see that your 0.45% CAGR is low by a factor of five.

        Also your fit is bad. If you multiply instead of adding anthropogenic CO2 you get a much worse r2, one for which the unexplained variance is five times higher than what you get if you view fossil fuel CO2 as being added to the atmosphere. Your multiplicative model is neither a good fit to the data nor physically meaningful: it makes no sense to multiply annual fossil fuel emissions by the natural CO2 level.
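
        The fit comparison is easy to sketch. The data below are synthetic, generated from an additive model (a 280 ppmv base plus an exponentially growing excess), so the point is only to show the mechanics of comparing the two functional forms by their unexplained variance:

        ```python
        import numpy as np
        from scipy.optimize import curve_fit

        # Synthetic stand-in for annual Mauna Loa values, 1959-2011.
        rng = np.random.default_rng(2)
        year = np.arange(1959, 2012)
        co2 = 280.0 + 35.0 * np.exp(0.022 * (year - 1959)) \
              + rng.normal(0.0, 0.3, year.size)

        def additive(t, a, r):
            """Anthropogenic excess grows exponentially on top of the natural level."""
            return 280.0 + a * np.exp(r * (t - 1959))

        def multiplicative(t, c0, g):
            """The whole concentration is multiplied by (1 + g) each year."""
            return c0 * (1.0 + g) ** (t - 1959)

        for name, model, p0 in [("additive", additive, (35.0, 0.02)),
                                ("multiplicative", multiplicative, (315.0, 0.004))]:
            params, _ = curve_fit(model, year, co2, p0=p0)
            resid = co2 - model(year, *params)
            print("%-14s unexplained variance: %.3f" % (name, resid.var()))
        ```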

      • Vaughan Pratt

        It appears that you have trouble with observed data.

        In 1958 Mauna Loa measurements started.

        For the first 20 years, atmospheric CO2 concentrations increased at a compounded annual growth rate (CAGR) of around 0.35%.

        This exponential rate then accelerated around 1980, and since then the CAGR has been around 0.45% per year.

        Over the same period, human CO2 emissions have grown by a CAGR of around 5% per year.

        Over the entire Mauna Loa record the exponential increase in atmospheric concentration has been 0.41% per year.

        At the current CAGR, the level in 2100 would be 585 ppmv. This is the same as estimated for IPCC “SRES scenario and storyline B1”, business as usual (no climate initiatives) with moderate economic growth and global population leveling off after mid-century (per UN projections).

        Over the past 10 years the atmospheric CO2 level has grown by an average of 2 ppmv per year. If one assumes that the future growth will be linear rather than exponential, we would see an increase of 89 * 2 = 178 ppmv, or a level of 568 ppmv by 2100.
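
        The arithmetic behind these numbers, as a sketch; the endpoint concentrations are round figures, not official readings:

        ```python
        def cagr(c_start, c_end, years):
            """Compound annual growth rate between two concentration readings."""
            return (c_end / c_start) ** (1.0 / years) - 1.0

        # Full Mauna Loa record, using round endpoints (~315 ppmv in 1958):
        print("CAGR 1958-2011: %.2f%%" % (100 * cagr(315.0, 390.0, 53)))   # ~0.4%

        # The two extrapolations to 2100 from 390 ppmv today:
        print("exponential: %.0f ppmv" % (390 * 1.0045 ** 90))   # ~585
        print("linear:      %.0f ppmv" % (390 + 89 * 2))         # 568
        ```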

        No matter how you slice it, Vaughan, the “B1 scenario” looks reasonable to me, provided there are no “climate initiatives” and the UN’s population growth estimates are about right.

        Of course, if fossil fuels keep getting more difficult to develop and extract, and hence more expensive, there will be more economic push for added energy conservation measures, and if cost-competitive non-fossil fuel energy sources are developed over the next several decades, this will cause a further slow-down in the increase of human CO2 emissions as well as atmospheric CO2 concentrations.

        And nuclear power will undoubtedly play an increasing role in those regions where most of the future growth will occur (China, India, etc.), as their coal deposits start to dwindle.

        So “B1” may be sort of an upper limit, in actual fact.

        Max

      • He is among those poor dear folks who think their financial calculators can grow CO2 in the same manner as a savings bond.

        I’ll lend you so much atmospheric CO2 if you’ll pay me back that CO2 plus some CO2 interest.

        It’s absurd, and it is wrong.

        When comparing CO2 to money, CO2 is like a piggybank. The money inside a piggybank does not grow itself. The only means to get more money into the piggybank is human behavior. Grandma Natural gave you 280 cents. Want more, get a paper route.

        Each year the number starts at ZERO. Then humans get to work at doing stuff. They work hard all year. If they add 2.42 ppm, then they’ve worked especially hard. Great job in 2010, human race. Way to go. Thumbs up. You make those humans from 1958 look inferior and lazy and unimaginative, and you did it with government on your back. Be proud. Here’s to an even greater performance in 2011! Max says you are limited. I know you’re not. You can produce more fossil fuels than ever. Mash that pedal. Chill out by turning down the AC. Toast away that winter chill. Burn it, baby.

      • JCH

        You and Vaughan Pratt seem to ignore that there is a finite amount of fossil fuel out there.

        People who understand more about this than you apparently do (the WEC) have figured out roughly how much might optimistically be on our planet, including all inferred possible sources.

        This is roughly four times as much as the current “proven reserves”. (Other estimates, such as “peak oil”, result in a much lower figure.)

        And when it’s ALL gone some day in the far distant future, atmospheric CO2 will have increased to around 1065 ppmv, if the WEC estimates are right.

        That’s it JCH, no matter how hard we press on the accelerator. The tank will be empty.

        Will that happen by 2150? By 2200? Or never? Who knows?

        Max

      • So much nonsense, so few paragraphs. One must admire Max’s chutzpah, ignoring the point that humans can’t multiply nature’s 280 ppmv by anything. Talking past people instead of with them seems to be his modus operandi. With that m.o. he can’t lose, no matter how illogical he gets.

        Over the same period, human CO2 emissions have grown by a CAGR of around 5% per year.

        This would be true for 1868-1873 and 1960-1972. Otherwise the CAGR of human CO2 emissions fluctuates between 18% and −17%, as can be seen in this graph. Since 1980 it has hovered around 2-3%, in excellent agreement with the CAGR of 2.2% for the accumulating CO2 being measured at Mauna Loa.

  40. A truly balanced selection of quotes can influence the discussion that follows. From the conclusion of the same paper:

    “Large scientific uncertainty is never an argument for acting as if the risks are small”

    “Within a risk-management framework, a lack of confidence in the best available probabilities of the day is no argument for inaction”

    “Policy-relevant science and economics can communicate the costs of delay as clearly as it does the cost of actions”

    These quotes, along with all that is presented and discussed in the rest of the paper, reflect the approach and reasoning that provides strong support for substantial global emissions reductions as well as planning for adaptation, basically in parallel with the IPCC.

    • Martha

      You point out that S+S opine:

      “Large scientific uncertainty is never an argument for acting as if the risks are small”

      But when it is physically impossible to ever reach a 5°C warmer world from human CO2 emissions, the risk of doing so becomes zero.

      This then becomes something that no sane person would worry about.

      That was my point.

      Max

      • I see. Scientists do not work with an agreed-upon value but you have one. That’s fascinating. Maybe you should email the world’s thousands of scientists immediately with your input.

        In the meantime, what the science says and what we can already observe is that even a modest rise in temperature is associated with a continuum of effects, including rising sea level, shifting seasons, increased storm damage, serious loss of soil and water sources, changes to ecology and marine biodiversity, and all that goes with these changes.

        As many people already understand, the effects will not be evenly distributed around the globe. How dangerously temperature climbs over the next decades, and whether or not temperatures stabilize, is at least partly related to future social and economic scenarios, as well as to how the earth responds to AGW-induced conditions and to natural influences.

        If you read the most current science, i.e. from the past 5 years, the IPCC estimates may be too low.

        Please familiarize yourself with the core science and related issues. Your level of ignorance is inexcusable. :-(

      • Martha,
        You keep on about sea level rise, but it is not happening as you seem to believe.
        You keep on about a continuum of bad things that are not actually occurring in reality.
        Perhaps you need to get out more?

      • Martha – If we restricted ourselves to the end of the century, Max would be correct in asserting the improbability of a 5 C warming, but when he says it wouldn’t be possible to “ever” reach it, he is clearly wrong. For example, if we assume that CO2 rises to 1000 ppm from 390 and that there is already about 0.8 C warming in climate commitment from current forcing, then a mid-range equilibrium climate sensitivity value yields a warming of 4.98 C (essentially 5 C), and if the upper boundary of the generally estimated range is used – 4.5 C per CO2 doubling – then a warming of 6.91 C would result. This neglects the possibility of a substantially greater warming that might result if a tipping point such as massive methane release were triggered. It also neglects the possibility of greater fossil fuel deposits than are currently estimated as potentially recoverable.

      • Just to add a small point – the currently estimated equilibrium sensitivity range for CO2 doubling is 2 to 4.5 C. Nothing in recent data has done much to change this for a persistent CO2 forcing. It’s worth noting that sensitivity values derived for short term climate responses such as due to ENSO are inapplicable, because the dynamics are very different. This relates, for example, to studies by Lindzen/Choi, Spencer/Braswell, or Dessler, none of which addressed CO2 or other long term forcings.
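
        The arithmetic in the two comments above follows the standard logarithmic rule: equilibrium warming is roughly sensitivity × log2(C/C0), plus the already-committed warming. Here is a minimal Python sketch under those stated figures; the ~3.08 C mid-range value is back-solved to reproduce the 4.98 C number, since the comment does not state the exact value it used:

          import math

          def equilibrium_warming(c_final, c_now=390.0, sensitivity=3.0, committed=0.8):
              """Warming (C) for CO2 rising from c_now to c_final ppm."""
              return sensitivity * math.log2(c_final / c_now) + committed

          # Back-solved mid-range (~3.08) reproduces 4.98 C; the 4.5 upper bound gives 6.91 C.
          print(round(equilibrium_warming(1000, sensitivity=3.08), 2))  # 4.98
          print(round(equilibrium_warming(1000, sensitivity=4.5), 2))   # 6.91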

      • Fred, have you bothered to calculate what kind of economic scenario it would take to get to 1000 ppm? I’d be more worried about a Klingon invasion.

      • P.E. – 1000 ppm is inevitable if we don’t permanently curtail carbon emissions. However, if you are claiming it might not occur by the end of the century, I think you may be right but that’s not a certainty, given the rate at which global energy demands are increasing and the rate at which China and other populous nations are industrializing.

        “then a mid-range equilibrium climate sensitivity value yields a warming of 4.98 C (essentially 5 C), and if the upper boundary of the generally estimated range is used – 4.5 C per CO2 doubling – then a warming of 6.91 C would result. This neglects the possibility of a substantially greater warming that might result if a tipping point such as massive methane release were triggered. It also neglects the possibility of greater fossil fuel deposits than are currently estimated as potentially recoverable.”

        I am interested in where you would see this 4.98 C or 6.91 C increase in temperature, roughly.
        Would it be mostly even, with half of the increase in daytime highs and the other half in nighttime temperatures?
        Or would the “evenness” mostly reflect increasing nighttime temperature? Since most tropical temperatures do not have much “room” to increase at night, that nighttime increase would be mostly in the temperate zones.
        Or would most of the increase in average temperature be in nighttime polar regions and the temperate zones?

        Would either the 4.98 C or the 6.91 C increase see the Antarctic average temperature of -50 C rise to, say, -10 C, with similar increases in Arctic regions?

        It seems quite possible that in 50 to 100 years, considering the possible tightness in the supply of fossil fuels, methane hydrates could be mined,
        thereby greatly reducing the amount of methane hydrate which could be released by warming temperatures. And in terms of ocean deposits, these must be in deep water- below 200 meters.
        It seems temperatures 200+ meters under the ocean would require centuries of warming in order to show even modest increases- therefore it’s quite possible that a number such as 90% of deposits would be mined before warmer conditions affected them.
        As for methane hydrate in permafrost, most permafrost isn’t very old, and if it is lacking in age, it lacks the time needed for the accumulation of deposits.
        In addition, such deposits would previously have been released during this interglacial period and during the more recent warming of the last hundred years.

      • gbaikie – I can’t answer your question about the future distribution of elevated temperatures except to suggest that it might follow past patterns of warming due to increased greenhouse effects. In general, these have involved greater warming at higher than lower latitudes (particularly in the Arctic, where sea ice melting reduces albedo), greater Northern than Southern hemisphere warming, greater night time than daytime warming, and greater winter than summer warming. Some lines of evidence are stronger than others, but the above seem to be the general patterns.

        The prospect of mining methane deposits for energy is intriguing and I think should be pursued.

      • Let me keep pursuing this 1000 ppm thing. We hear a number of claims being made:

        1) Business as usual will lead to 1000 ppm eventually.
        2) Fossil fuels are expensive, and green tech is cheaper.
        3) Peak oil will make the question moot.
        4) There’s a conspiracy of fossil fuel companies to prevent green technologies.

        And probably a few others that escape me for the time being.

        These can’t all be true, can they?

        It’s possible that, looking beyond the 2100 horizon, fossil fuels will still be the bread and butter of energy. But perhaps a more important question than the odds of a 5C rise is: what are the odds that fossil fuels will still be the primary source of energy in 2100?

        Some of this, of course, has to do with politics. There is a rather straightforward way to get to carbon-free electricity in the short term, and it’s called nuclear fission. But conventional uranium fission is in the political dog house. Other technologies just over the horizon include molten-salt thorium-based fission. And a little further over the horizon is possibly fusion.

        So we get back to the question, what will the mix look like in 2100? Some of it is based on an unforeseeable breakthrough in fusion technology (here’s the uncertainty monster in another cloak), and some of it has to do with some grand political bargain (will these Gen IV fission technologies be politically acceptable?).

        Keep one more thing in mind: by 2040 or 2050 the climate sensitivity should be settled beyond any reasonable doubt. The earth will either warm unambiguously or it won’t. So policy makers then, if the warming appears, will feel a sense of urgency that exceeds what they feel now (they talk a good game, but nobody in the policy arena is acting like they’re really concerned), and that urgency might force some serious choices to be made wrt Gen IV technology.

        Against that backdrop, the 1000 ppm scenario seems kind of silly, doesn’t it?

      • P.E. – I agree with much of what you say, although I think we have fairly adequate data already on climate sensitivity to CO2, and much as we would like to narrow the range of estimates, that may not happen quickly. As to whether the 1000 ppm scenario is “silly” – well, I would like to think so, but I’m doubtful. In general, though, I tend to agree with your implied message of trying to develop alternative energy sources, including the nuclear ones.

      • “P.E. – 1000 ppm is inevitable if we don’t permanently curtail carbon emissions. ”

        It seems that when talking about a century into the future, almost anything can be called “inevitable”. Though we might get the Jetsons yet.
        No doubt people will eventually die, but in a century or two they could live twice as long. And though taxes seem eternal, they could be significantly reduced sometime in the future.
        It is also possible that in a century or two people will live, on average, half the current life expectancy, that CO2 will be permanently curtailed, and that we could still exceed 1000 ppm at some point in the distant future.

        It seems extremely unlikely that CO2 will reach 1000 ppm before 2100. And it seems possible that by the year 2200 people will consider there to be a desperate shortage of CO2 in the atmosphere.
        And it seems far smarter to think about the next 20 years than about 50 years or longer- or to imagine that people 50 years in the future will not have a far greater ability to deal with any future problems.

        “It’s possible that, looking beyond the 2100 horizon, fossil fuels will still be the bread and butter of energy. But perhaps a more important question than the odds of a 5C rise is: what are the odds that fossil fuels will still be the primary source of energy in 2100?”

        My guess is that by 2100, whether global temperatures have stayed around the same level or have risen by 1-2 C, fossil fuel will not be the primary source of energy.
        There are many factors which could converge to bring this about.
        And political forces will probably have little to do with it.
        But one can start with the political aspects.
        First, by 2100 will there be a global superpower?
        And if so, which country- or will some new country be formed?
        Will the UN exist? Or will it be seen in history as an entity formed
        to deal with the conflict between the US and the USSR?
        Could superpower status be about the size of a population and its purchasing power [GDP]? Or will it be about some other characteristic?
        Perhaps others could give a list of what these could be.
        Instead of politics, let’s focus on technology- which has been the dominant factor in altering the future over the last 500 years.
        Let’s list potential technologies which could influence energy use:
        Nanotechnology
        Computer technology
        Nuclear technology
        Space technology
        Material technology.
        And all of the above are interrelated. Nanotechnology is perhaps the least developed and could be the most transformational. It probably also has the most dollars being invested in it, if one includes microbiology as an aspect of nanotechnology [microbiology is “natural” nanotechnology].
        Nuclear technology could include fusion, or could include traditional fission- one form being pebble-bed reactors that are widely available.
        I would assume that by 2100 everyone uses LED lighting or something better, and that computer technology eliminates the need for human travel for work. Travel involving recreation may be more sport than a need to get somewhere- one may mostly travel because one wants the traveling aspect itself. An easy guess is that home heating will be “geothermal”- using heat pumps.
        Space technology should continue to evolve- even if the more exciting things do not happen [which I find very unlikely], one still has what it is doing at the moment plus some improvement. Right now everyone uses it every day, and it is a 100+ billion dollar yearly market, so mere continuation may make it a trillion-dollar market by 2100.

        So what is most fossil fuel currently used for- focusing on the fossil fuel use that makes the most CO2?
        The first thing that should be pointed out [because people are hopelessly misinformed] is that passenger car use does not make much CO2- it never has. If one mixes in all transportation elements it becomes slightly more significant, but not as significant as coal power generation, which is the dominant source of human CO2 emissions- and why China, with its large use of coal, leads the world in CO2 emissions.
        So the CO2 emission is mainly from coal for power generation, and most of that power isn’t used for residential purposes- it is mostly commercial and industrial use.
        So anyway, the simple solution would be replacing coal with nuclear power plants- if CO2 were considered a problem. Since no new coal plants are being built in the US, nor are many going to be built, as far as the US is concerned its only problem is the lack of new nuclear power plants.
        In regards to China, it’s likely they can’t maintain their 10% growth for another 50 years. And without such a need for growth, there will be less need for coal power plants [cheap and fast to build]- plus they will probably have a significant coal shortage within a decade or three.

      • Fred Moolten

        Check the WEC 2010 report I cited for “inferred possible total fossil fuel resources” of our planet.

        You will see that there are just enough optimistically estimated fossil fuels left to reach 1,065 ppmv atmospheric concentration when they have ALL been 100% used up.

        That’s it, Fred.

        The “absolute maximum ever possible” CO2 level from human fossil fuel combustion.

        Ain’t no mo’.

        Your temperature calculation is flawed, as well.

        It’s all about empirical data, Fred, not model simulations.

        Look at the actual physically observed CO2/temperature relationship over the past 160+ years rather than using model-derived “climate commitment from current forcing” and an exaggerated “mid-range equilibrium climate sensitivity value”.

        Max

      • gbaikie

        Your assessment makes good sense.

        China’s coal reserves are large, but they will become increasingly difficult and expensive to exploit, so they will start (indeed, already have started) to depend more on nuclear power.

        Same goes for India, which has a fast breeder reactor prototype running today.

        The future will not be anything like the past. As the song went: “the past is dead and the future is blind”.

        There were serious studies in the late 19th century warning that cities like New York, London (and even Manchester) would be inundated by two meters of horse manure by 1930, given the rapidly growing number of horse carriages.

        So much for IPCC computer-generated SRES scenarios and storylines for the future.

        Bring out the shovels…

        Max

      • I am always amused by Fred’s pronouncements regarding climate sensitivity: how so many lines of evidence yield the same result, and how we can ignore papers claiming a lower sensitivity because they are based on confused signals generated by ENSO. History tells us that we should distrust these estimates. Hansen in 1988 had 4.2K for sensitivity. The latest GISS value is 2.6K, which is a huge decrease from the point of view of policy. I’m sure that Fred circa 1988 would have defended Hansen’s dramatic overestimate because it was stated in a peer-reviewed paper and the vast majority of the literature agreed. Fred, the question in my mind is what evidence would convince you that these estimates are highly questionable? You are not allowed to resort to argument from authority. I have heard via “internet buzz” that there is a new paper that is giving a lower estimate, approximately 2.3K.

    • Martha,
      Quoting from the IPCC, when they do not follow reasonable policies or procedures for disclosure, analysis, attribution, or review, and have no guidelines for conflict of interest, is not really a useful tactic on your part.

      • “Quoting from the IPCC”

        hunter, What I quoted is from the paper that is the topic of this thread. Clearly, if you think I am quoting from the IPCC, you have not read what we are discussing. Have you, now? ;-)

  41. Martha writes “These quotes, along with all that is presented and discussed in the rest of the paper, reflect the approach and reasoning that provides strong support for substantial global emissions reductions as well as planning for adaptation, basically in parallel with the IPCC”

    Once again, shades of Tom Lehrer’s Vatican Rag. “Genuflect, genuflect, genuflect, genuflect”.

  42. “Climate change is a result of the greatest market failure the world has seen. The evidence on the seriousness of the risks from inaction or delayed action is now overwhelming… The problem of climate change involves a fundamental failure of markets: those who damage others by emitting greenhouse gases generally do not pay.” From the Stern review.
    Where’s the uncertainty in this statement?
    However, one should give Stern the benefit of the doubt; after all, the British economy is in such excellent shape, with high growth and low unemployment, all of which can be directly attributed to his time from 2003, when he became second permanent secretary at H.M. Treasury, with responsibility for public finances, and head of the Government Economic Service.
    No doubting the talents of this man?

    • The temperature of Earth is well inside the range we have enjoyed for the past ten thousand years. The overwhelming evidence is that we will stay inside this same temperature range, and this will happen whether we take no action or extreme action that is bad for us.

  43. Sorry, it should say 2003, and the source was Wikipedia.

  44. –> “… and assessing the ethical, logical, philosophical and economic underpinnings…”

    … and the sociological underpinnings. The Education Industrial Machine has failed America, and there is nothing to slow the inexorable slide of the society into ignominy, because the secular, socialist bureaucracy need only continue to fool succeeding generations of increasingly ignorant graduates of the state-funded dropout factories to maintain its hegemony over the productive.

  45. How is it that Stern has any credibility after authoring his report described in Wikipedia as:

    “The Stern Review on the Economics of Climate Change is a 700-page report released for the British government on October 30, 2006 by economist Nicholas Stern, chair of the Grantham Research Institute on Climate Change and the Environment at the London School of Economics and also chair of the Centre for Climate Change Economics and Policy (CCCEP) at Leeds University and LSE. The report discusses the effect of global warming on the world economy. Although not the first economic report on climate change, it is significant as the largest and most widely known and discussed report of its kind.[1]”

    The Stern report is total rubbish and has been largely discredited, notwithstanding that the report was featured throughout the IPCC AR4.

  46. All these people who think a more honest presentation of uncertainty will make anti-CO2 policy more likely are just fooling themselves. As L. Ron Hubbard is credited with saying, if you want to control people you have to lie to them.

    The fundamental problem is that anti-CO2 policies are just plain bad ideas. A carbon market is a giant tax on energy consumption combined with the largest subsidy to financiers in human history. If people are going to accept such a thing, they have to believe 1) that the tax will not effect a reduction in emissions by capping economic activity (that it will not function as a knob which controls economic output), and 2) that it is absolutely necessary to avoid a real catastrophe down the road.

    Notice both those things are utter lies. If you tell voters the truth they’ll never support a carbon market.

    The same is true of other policies. Wind and solar are economic debacles by themselves. For voters to support throwing taxpayer money down those holes, they have to believe it’s needed to avoid a real catastrophe and that the investment will actually create jobs on net. If one is honest and says “the catastrophe is just a possibility, and we can’t really say how likely it is,” then voters will not be afraid of it.

  47. Dear Dr. Curry,

    Here’s an essay I feel is rather interesting, and gets at the heart of a lot of the issues going around about how science is being done, and the sort of critical questions we should ask others and ourselves (whatever spectrum of the debate one is on).

    http://theconversation.edu.au/critically-important-the-need-for-self-criticism-in-science-4160

    I thought maybe it would be interesting to you as well. I fear that in all these debates the basics of what makes good science (how it’s done, and how we apply it) are getting lost, and that makes for a lot of the problems and unnecessary uncertainties (and/or assertions) we’ve been discussing lately, as well as muddling the conversation. A firm foundation from which to measure and ground the conversation is, I believe, utterly important; and this essay is a good message, and reminder, to all of us doing science, whatever the field.

  48. This all presupposes that the decision making model is the best one. There’s a whole other planning method where the risks are just noted, but contingencies aren’t developed.

    By the time all the uncertainties and unknowns are chained through, acceptance of the results is most likely to be based on faith.

  49. Libertarian Ethics would solve the problem quite easily:

    Require insurance to publish, and hold the publisher, the insured, and the insurer liable for their actions in the same way we hold drug companies liable for their actions.

    Under that model, the quality of all scientific research will go up, and work will not be published that includes speculative claims. It is currently in no one’s interest to act ethically, since the entire system of academic research is populated with perverse incentives. The data pretty clearly suggests that the peer review process does not work – that it enshrines existing paradigms. The data suggests that publishing articles rather than books allows shoddy work that would be prevented by the more ready criticism and higher expense of larger works. The data suggests that few of the people who use citations have read the original papers. The data suggests that the top people who are capable of reviews do not perform them.

    The way the market solves this problem, in general, is to enact penalties and require third party insurance of the results.

    • I would love to see your so-called data, as would the rest of the “science of science policy” community, but I doubt it exists. Where is it? If you mean impressions rather than data then yours are largely mistaken, albeit popular.

      As for your insurance scheme, if reading papers could directly injure people the way taking drugs does, it might be rational, even for fiction, but it is not. Nor is the notion of “Libertarian Ethics”, unless that is your book title.

    • Why restrict the requirement to science? You could require everyone to have insurance before they publish anything. You can’t have too much Libertarian Ethics.

    • “Under that model, the quality of all scientific research will go up, and work will not be published that includes speculative claims. It is currently in no one’s interest to act ethically, since the entire system of academic research is populated with perverse incentives. The data pretty clearly suggests that the peer review process does not work – that it enshrines existing paradigms. The data suggests that publishing articles rather than books allows shoddy work that would be prevented by the more ready criticism and higher expense of larger works.”

      If you reduce publication by, say, 50%, you would reduce the crap by close to 50%,
      but you would still have a lot of worthless papers. A court case will not prove anything- other than, perhaps, that one followed some stupid laws [which, ironically, are desired in the name of “Libertarian Ethics”].
      So we keep all the existing crap and merely slow down the yearly addition of more of it- with the new stuff having the stamp of court approval.

      The other point suggests that books are somehow better. I think Nature does better in general compared to books.

  50. Didn’t there used to be links to Science of Doom and Skeptical Science on the blog roll?

    http://scienceofdoom.com/
    http://www.skepticalscience.com/

    • Science of Doom is one of the best sites on the Web for offering accurate explanations of basic climate science and geophysics principles at a level that is detailed without being excessively technical.

    • There have been some blog roll dynamics lately, I’ve noticed, as well.
      Jo Nova is MIA, and Jennifer Marohasy would be a good addition- both educated women with a lot to say about climate and enviro issues from the Australian point of view.
      Solar Cycle 24 is still the best snapshot of solar activity.
      SoD is a classic site that gets the details right but misses the big picture.
      But the only constant in life is change.

      • But Bishop Hill and No Frakking Consensus?

        They just look a bit too political and not scientific to me. I’m interested in the science, and I think we should keep politics out of that.

      • Louise,
        AGW is largely a political movement using a veneer of science.
        Donna and the Bishop both document the confusion, and worse, that occurs as the AGW movement continues to distort the scientific process to gain social capital.
        And they do a very good job of it.

  51. Louise

    Like you, “I’m interested in the science and I think we should keep politics out of that”.

    How would you rate IPCC in that regard?

    Max

    • Max and Louise, if you look back I think you will see that we are discussing the entire climate debate, not just the scientific portion. The science is only important in the context of the policy issues.

      • David Wojick

        You are right, of course.

        Sure, there was Arrhenius, etc., but the politics came first as far as the current CAGW hysteria is concerned, with the establishment of the IPCC, charged with the express agenda of finding alarming human-induced climate change and coming up with proposals to mitigate it, namely direct or indirect global taxes on carbon.

        So, from the start, we had agenda-driven science and a corrupted “consensus” process led by IPCC and fueled by the prospect of trillions of dollars of tax revenues.

        Climategate etc. and the scramble to “act now!” made that abundantly clear.

        But IMO what is important now is to continue to expose the many holes in this agenda-driven science so that the general public can see that it is being bamboozled.

        Max

  52. The history of the creation of the 2℃ limit to the rise in global temperatures (so often quoted as some sort of barrier which any global temperature increase must not cross without the probability of inducing catastrophic consequences), and the complete lack of any scientific backing for it, can be found in:
    “Spiegel Online International”,
    “The Invention of the Two Degree Target”, Jan 2010.

    http://www.spiegel.de/international/world/0,1518,686697-8,00.html

  53. Confusing science with woo-doo climatology is not prudent. Physics, arithmetic and chemistry are sciences – they instantly admit when they make a mistake. From predicting nuclear winter for the year 2000, before we even defrosted, they turned 180 degrees into GLOBAL warming – without admitting that they got it wrong… Then they try to make you forget about the phony GLOBAL warming by jumping into climate change…? Climate never stopped changing in 4 billion years, and never will.

    Climate was changing 300-500 years ago, when there were fewer cars on the road and even less electricity was used – and there were more trees… Proving that climate is changing is not science, but a woo-doo tactic for fleecing the Urban Sheep. Q: would the climate have stopped changing if there hadn’t been any industrial revolution? Looting the ignorant shouldn’t be called science!

  54. As long as S+S were deliberating

    whether a target of ‘50 per cent chance of remaining under +2◦C’ is either ‘right’ or ‘safe’

    the essay had some semblance of credibility.

    But the minute they started bloviating about the risks associated with a “6◦C warmer world” (from AGW, of course), the paper lost all credibility.

    Sound policy-making is then hindered by the lack of sound scientific speculation on high-impact events, which we cannot currently model but may plausibly experience. Failing to engage with the question ‘What might a 6◦C warmer world look like, if it were to occur?’ leaves only naive answers on the table for policy-makers to work with.

    This is pure fear mongering for which there is no credible scientific basis.

    There is no way we could “plausibly experience” a 6◦C warmer world from AGW. It is pure fantasy.

    Max

  55. Judith Curry

    Looks like we have pretty much exhausted the dialog on Smith + Stern.

    Any chance that you can get access to the full papers of Stewart or Spiegelhalter/Riesch?

    The abstracts sound interesting.

    Thanks.

    Max

  56. Joachim Seifert

    Hi, Fred Moolten, you seem to be the greatest CO2 expert of all of us.
    Let’s suppose that from the bottom of the Earth (as happened in Africa in conjunction with volcanic activity) there arose a tremendous cloud of CO2,
    and that it covered the planet up to 500 yards high…… (since it is heavier than air, it would stay at the bottom, below the air; forget mixing with wind).
    Question to the great CO2 expert: what would the temperatures of Earth be with a 500-yard cover of CO2?
    Awaiting your appreciated judgement….
    JS

  57. ‘fallacy of misplaced concreteness’

    This is at the heart of the problem with modelling the future: the assumption that the future is a concrete “place” that we are traveling to, one that can be predicted through cause-and-effect models.

    No matter how sophisticated your models, Victorian age physics cannot accurately model the future. It quickly goes off the rails as is seen through weather forecasting. The underlying assumption about the nature of the future is wrong.

    Victorian age physics sees the future as deterministic. If you know all the inputs (physical laws, initial state, etc.) you should be able to calculate the outputs (future state). We know that at a fundamental level that view of the future is wrong.

    The future exists as a probability only. It is not set in concrete. The future is not written. There are an infinite number of futures ahead of us, and we will end up in one of them. Which one is a matter of probabilities.

    When you throw 2 dice, the most likely future is 7. That does not preclude any future between 2 and 12. To assume that the future will be 7 because that is the most likely is the fallacy of misplaced concreteness.

    To bet your future on the dice turning up 7 is a misplaced strategy. Humans in the past have succeeded largely because they have the ability to adapt to any throw of the dice, and prepare accordingly.

    • ps: the reason that betting on 7 (the most likely future) is misplaced, is because while it is the most likely as compared to any other result, it is not the most likely when compared to all possible results. Of the 36 possible futures, only 6 of them are 7. The other 30 futures are a different result.

      Therefore there is only a 1 in 6 chance of arriving at the most likely future. So, if you put all your money on 7 and neglect the other possibilities, there is a 5 in 6 chance you will end up bankrupt.
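
      A minimal Python sketch of this dice arithmetic, enumerating all 36 equally likely outcomes of two dice:

        from collections import Counter

        # Count the ways each total of two dice can occur.
        totals = Counter(a + b for a in range(1, 7) for b in range(1, 7))
        most_likely, ways = totals.most_common(1)[0]
        print(most_likely, ways / 36)   # 7 occurs 6/36, i.e. a 1 in 6 chance
        print((36 - ways) / 36)         # 5 in 6 chance the outcome is NOT the most likely one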

    • Joachim Seifert

      Hi, Ferd, sometimes you really get things completely, but completely, wrong…. : “The Victorians (I am a Victorian) see the future as deterministic…”
      This is exactly the case when you have ALL the VARIABLES in your model.
      Once you leave one out (a major one is missing, as is the case today), then the NUMBERS won’t add up and they go into the PROBABILITY OF THE REMAINING VARIABLES…… all this “likely, very likely” of the IPCC, the Probability of the Remaining Variables (PRV), is nonsense… only computer gaming, no scientific value whatsoever…. Don’t let them talk you into this rubbish… The models have to include all the variables, or they are worth nothing….
      JS

      • In an infinite universe you can never know all the variables.

      • Joachim Seifert

        We can leave out the little ones, the small short-timers,
        but what is important is that you have the elephants in the game….
        which is not the case with the IPCC…… and CO2 is no more
        than a mosquito in size, about which all the bloggers reminisce… and,
        this way, they are, regrettably, completely off and only guessing…
        JS

  58. “Let’s suppose that from the bottom of the Earth (as happened in Africa in conjunction with volcanic activity) there arose a tremendous cloud of CO2,
    and that it covered the planet up to 500 yards high…… (since it is heavier than air, it would stay at the bottom, below the air; forget mixing with wind).
    Question to the great CO2 expert: what would the temperatures of Earth be with a 500-yard cover of CO2?”

    If CO2 exceeded 100,000 ppm, animals would die quickly; at 50,000 ppm [5%] one would begin to lose consciousness.
    “Atmospheric pressure drops by approximately 50 percent at an altitude of about 5 km. (In other words, about 50 percent of the total atmospheric mass is within the lowest 5 km).”
    http://www.newworldencyclopedia.org/entry/Earth%27s_atmosphere
    500 yards is roughly 0.5 km. The density of air lessens as you go higher, so even if it mixed, one would have 10% or more CO2- and so everyone dies.

    As for temperature: Earth’s atmosphere has a mass equal to about 10 meters of water. That is, if you were to liquefy all of Earth’s gases, they would cover the Earth to a depth of about 10 meters.
    The composition of earth atmosphere is:
    “78.09% nitrogen, 20.95% oxygen, 0.93% argon, 0.039% carbon dioxide”
    http://en.wikipedia.org/wiki/Atmosphere_of_Earth
    Or 7.809 meters of nitrogen, 2.095 meters of oxygen, 0.093 meters of argon and 0.0039 meters
    of CO2 [3.9 mm], with water vapor varying from a few mm to 0.4 meters [400 mm]. Roughly.
    Your addition would be roughly 1 meter of CO2, an increase of about 250 times [btw, it would require about 750 trillion tonnes].
    If you assume that one gets 1 C per doubling (390 to 780 to 1560 to 3120 to 6240 ppm, etc.), one would have a lot of warming- about 8 C.
    Venus has about 92 times Earth’s atmosphere, so this would be about 1/900th of Venus’s CO2.
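
    A minimal Python sketch of that doubling count, keeping the commenter’s assumed 1 C per doubling (well below mainstream central estimates; the point here is only the logarithm):

      import math

      factor = 250.0                   # ~250-fold increase in CO2, as estimated above
      doublings = math.log2(factor)    # ~8 doublings
      warming = doublings * 1.0        # assumed 1 C per doubling
      print(f"{doublings:.1f} doublings -> ~{warming:.0f} C")  # ~8 C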

    • Joachim Seifert

      Earth temps and CO2:
      You say: “IF YOU ASSUME” – again and again, the assumers have
      spoken…. If you assume, the holy spirit has encircled my house…
      …… are we doing science or storytelling…..?

      Well, you can make good: the CO2 content has doubled various times
      in paleohistory….. did warmings follow because of it?
      Just refer to the historic dates, I will have a look…..
      JS

        “Well, you can make good: the CO2 content has doubled various times
        in paleohistory….. did warmings follow because of it?”
        No.
        Warming preceded CO2 rise.

        “Just refer to the historic dates, I will have a look…..”

        http://en.wikipedia.org/wiki/File:Vostok_Petit_data.svg
        http://en.wikipedia.org/wiki/Ice_core
        btw, it looks like dust levels sometimes precede temperature rise.

        I know of no example in paleohistory in which CO2 can be shown to have directly caused warming, though CO2 does seem to rise whenever temperature rises and drop whenever temperature decreases. It’s commonly “assumed” that rising CO2 levels help maintain global temperatures.
        It was once a hypothesis that CO2 levels were the major element causing the ice age and interglacial cycles, but that is now mostly regarded as false.

        The best evidence that rising CO2 causes warming is in the last 100 years, and it seems to me that, at most, one gets a 1 C rise in temperature per doubling of CO2.
        I don’t regard this as having a large degree of certainty.
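
        One way to sanity-check a figure like that is to attribute all of the observed warming to CO2 and divide by the number of doublings. Here is a minimal Python sketch using round numbers from this thread (about 0.8 C of warming and a 280-to-390 ppmv rise); it ignores lags and other forcings, and it actually yields closer to 1.7 C than 1 C:

          import math

          dT = 0.8                              # assumed observed warming, C
          doublings = math.log2(390.0 / 280.0)  # ~0.48 doublings since pre-industrial
          print(f"~{dT / doublings:.1f} C per doubling")  # ~1.7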

  59. “David Young | November 18, 2011 at 1:40 am | Reply
    Hansen in 1988 had 4.2K for sensitivity. The latest GISS has 2.6K which is a huge decrease from the point of view of policy.”

    Extrapolating this trend, we end up with a CO2 sensitivity of 0.0 K in 2048.

    This result is just as valid as any climate-science projection of the future. It is based on observation of trends, using a computer model that follows the laws of science as laid out by al-Khwārizmī in 820 AD.
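
    For what it’s worth, the tongue-in-cheek extrapolation checks out arithmetically: a straight line through the two quoted sensitivity values crosses zero around 2048. A minimal Python sketch, using only the endpoints given in the comment being quoted:

      y1988, y2011 = 4.2, 2.6                  # K per CO2 doubling, as quoted
      slope = (y2011 - y1988) / (2011 - 1988)  # ~ -0.07 K per year
      zero_year = 2011 + (0 - y2011) / slope
      print(round(zero_year))                  # 2048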

  60. Interesting paper. I think it would be useful for policy makers if they could assess how young a science is in establishing uncertainty. How long does it take for a new discovery to be overturned? How many new discoveries occur per unit time? What impact do the new discoveries have in terms of climate sensitivities and temperature? As an example, consider the role played by aerosols in reflecting sunlight, acting as cloud nucleation sites, or absorbing infrared radiation. Think of how dynamic this research area is and how many new discoveries are announced, and try to quantify how established that science is. Quantifying the rate of change in uncertainty would be very useful for policy makers.

  61. The folks responsible for “The National Climate Ethics Campaign (2011)” don’t feel that it is ethically right to invoke uncertainty:
    “Because the risks to humanity and the planet associated with inaction are extremely high, it is also morally wrong to use scientific uncertainty as an excuse to delay or prevent emission reductions or preparation for the consequences of climate change.”……….. (pg 4)
    http://climateethicscampaign.org/storage/CEC%20Handbook%20final%2011-9.pdf

  62. Repletion with bureaucratese and code-words (reading as A to lay readers and as not-A to insiders) is not a virtue. And horse-pucky like “climate policy, where the scale of the risk is great even if we cannot provide precise probabilities of specific events, and where many plausible changes are effectively irreversible should they occur” is a dead giveaway.

    Self-serving speculation is not a basis for policy — or, rather, should not be.