Uncertainty: lost in translation

by Judith Curry

We have looked at what uncertainty means and doesn’t mean in science, how it is measured, when it can’t be measured and how that might change through research into the big questions. Above all we asked how other people can grapple constructively with advances in knowledge and changes in thinking, instead of despairing at ‘those uncertain scientists’. – Tracey Brown and Tabitha Innocent

Sense About Science has a superb document entitled Making sense of uncertainty: Why uncertainty is part of science. From the Introduction:

Uncertainty is normal currency in scientific research. Research goes on because we don’t know everything. Researchers then have to estimate how much of the picture is known and how confident we can all be that their findings tell us what’s happening or what’s going to happen. This is uncertainty.

But in public discussion scientific uncertainty is presented as a deficiency of research. We want (even expect) certainty – safety, effective public policies, useful public expenditure.

Uncertainty is seen as worrying, and even a reason to be cynical about scientific research – particularly on subjects such as climate science, the threat of disease or the prediction of natural disasters. In some discussions, uncertainty is taken by commentators to mean that anything could be true, including things that are highly unlikely or discredited, or that nothing is known.

Some clearer ideas about what researchers mean by scientific uncertainty – and where uncertainty can be measured and where it can’t – would help everyone with how to respond to the uncertainty in evidence.

Contributors to the document include Michael Hanlon, Paul Hardaker, Ed Hawkins, Ken Mylne, Tim Palmer, Lenny Smith, David Spiegelhalter, David Stainforth, and Ian Stewart.

The document has 6 sections:

  • Those uncertain scientists
  • So what is uncertainty for scientists?
  • Predictions and models?
  • Do we even need more certainty?
  • Playing on uncertainty
  • Delving deeper

Much of the substance presented here is very familiar to those who have followed Climate Etc.’s uncertainty series. A few tidbits that I thought were particularly well stated:

[I]t is misleading to quantify uncertainty that cannot be quantified – in these cases there is an even greater need to talk equally clearly about what researchers do not know as what they do. ‘Unknown unknowns’ cannot be identified, much less quantified, and the best approach is to recognise this.

On emotive, economically important and political subjects, uncertainty in research findings has been played up and played down according to whether people favour or object to the implications. 

‘Consensus’ suggests that scientists aim to agree. This incorrectly implies that scientists try to minimise uncertainty for the sake of finding consensus. When researchers in a field assess the ‘weight of evidence’, they don’t simply mean the number of studies on a particular question, or patients per study, but how compelling the evidence is and how thoroughly alternative explanations have been looked at – a result of the scientific process and peer review of new research.

Until we understand scientific uncertainty, we risk being seduced or confused by misrepresentation and misunderstanding. Without an understanding of uncertainty amongst the public and policymakers alike, scientists will struggle to talk about uncertainty in their research and we will all find it hard to separate evidence from opinion.

Lost in translation

The Carbon Brief comments on this document in the context of climate change, in a post entitled Lost in translation: Scientific uncertainty and belief in climate change. Excerpts:

‘Making Sense of Uncertainty’ calls this the “misuse of uncertainty”: the politicisation of scientific uncertainty itself to gloss over or ignore evidence. “Smoke and mirrors”, they call it. I call it bad translation, but it all boils down to the same thing.

A simple way to remedy bad translation is to adapt the communication to fit the audience. A number of excellent publications, like this article from Somerville and Hassol – which identifies areas of confusion and offers alternatives for scientists – make plausible suggestions.

But what’s of interest to us here is the mechanics of bad translation. So, as it pops up again and again, let’s take a closer look at uncertainty and likelihood.

When scientists talk about uncertainty – and other terms like probability – they refer to how likely it is that something will happen. What the audience may hear is, “we don’t know if this is right”.

For example, the IPCC says:

“[M]ost of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations”.

‘Very likely’ in this case means a probability of more than 90 per cent. But the way that uncertainty is presented in the media strongly indicates that the public ‘translates’ this term as a probable cause, but with significant doubt. This is an interpretation which is significantly less certain than the original intention.

Of course, misunderstanding is possible or likely, but not guaranteed. The inexactness of the transmission means that there is a reasonably large margin of possible interpretations for the audience. They will not necessarily misunderstand, but it is likely that a significant proportion of a non-scientific audience will understand uncertainty to mean doubt, as it does in the dictionary, and likelihood to indicate possibility, not probability.
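The calibrated language behind terms like ‘very likely’ can be made concrete. A minimal sketch of the mapping between the IPCC’s likelihood terms and probability thresholds (the values below follow the AR4 uncertainty guidance; treat the lookup itself as illustrative):

```python
# IPCC AR4 calibrated likelihood language: each term conveys a minimum
# probability. Thresholds per the AR4 uncertainty guidance (illustrative).
LIKELIHOOD_SCALE = [
    ("virtually certain",    0.99),
    ("extremely likely",     0.95),
    ("very likely",          0.90),
    ("likely",               0.66),
    ("more likely than not", 0.50),
]

def likelihood_term(p):
    """Return the strongest calibrated term whose threshold p exceeds."""
    for term, threshold in LIKELIHOOD_SCALE:
        if p > threshold:
            return term
    return "about as likely as not"

print(likelihood_term(0.92))  # -> very likely
```

The point of the calibration is exactly the translation problem discussed above: a reader who hears ‘very likely’ as “probable, but with significant doubt” has silently moved the threshold well below 90 per cent.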

Bad translation does not just mean that the content of the message is misunderstood. It also foments a secondary kind of uncertainty in the audience’s mind. This secondary uncertainty is psychological rather than scientific, and is characterised by doubt. To use the metaphor of ‘sowing the seeds of doubt’, if the badly-translated scientific uncertainty is the seed, this secondary uncertainty is the plant that grows from it.

As a final point, we will look briefly at the politicisation of uncertainty mentioned in Sense about Science’s report.  In his book, ‘Understanding Uncertainty’, the statistician, Dennis Lindley, states: “uncertainty [for most people] is a personal matter: it’s not the uncertainty but your uncertainty”.  This is often overlooked in communicating scientific uncertainty, and is a key element of its politicisation and “misuse”.

Much of the strength of arguments which try to undermine the areas of agreement in climate science is in their clarity, repetition of memorable phrases and their certainty about both their own rightness and their opposition’s wrongness. Confidence is persuasive.

However, green politicians are starting to play the same game. UK climate secretary Ed Davey’s speech at the Met Office last month is another example. It was loaded with emphatic statements like “The facts don’t lie, the physics is proven. Climate change is real and it is happening now”.

Davey’s speech reached a much wider audience than the Met Office, largely because of his diatribe against “absolutely wrong and really quite dangerous” misinformation. Certainly, it riled The Telegraph and the Mail enough to respond. In the process, the newspapers accidentally promoted his speech. There may well be a question over how comfortable scientists are with this approach, but this is certainly a strategy.

JC comments: In the tens of thousands of words that I have written here on uncertainty, this material raises a few points that I haven’t covered adequately or that bear re-emphasizing.

With regard to how the ‘public’ interprets uncertainty, I am reminded of this conversation that Peter Webster had with a Bangladeshi farmer who had little formal education (which Webster has used in numerous presentations):

PJW: We hope to provide you with seasonal forecasts to help you plan your agricultural activities

FARMER: That would be good.

PJW: But we will not always be correct: Perhaps 7 times out of 10.

FARMER: (after some thought): That is fine! Only God knows 100% what will happen, and he is not telling  and you are not God! Right now, we guess each year and that means we are right as often as wrong. 70% means I am ahead! 

The point is this.  Nearly everyone has some informal understanding of the notion of probability.  Public officials and particularly those in decision making positions at government agencies have at least an operational understanding of probability, likelihood and risk, and in some agencies the relevant people have a highly sophisticated understanding of probability and risk.  Journalists dealing with political, policy and technical issues also arguably have an operational understanding of uncertainty and risk.

Overegging the pudding with emphatic and overconfident statements about the science (effectively minimizing uncertainty)  causes a number of problems. It motivates the other side to make contradictory statements even more emphatically and confidently.  And then if your understanding turns out to be incomplete and predictions are not realized, there is a public loss of confidence in your position, which can spill over to science in general.

The most worrisome problem to me, and one that I have not hitherto seen discussed, is the impact of overly emphatic and overconfident statements on the science itself. Such overconfident assertions take away the motivation for scientists to challenge the consensus on detection and attribution, particularly when they can expect to be called a ‘denier’ for their efforts. The consensus among scientists about attribution of climate change extends far beyond the small community of climate scientists who actively work on detection and attribution and either publish in or extensively read the primary literature. This extended group may work on climate change impacts or related problems, with individuals deriving their confidence in the consensus in a second-hand way from the emphatic manner in which the consensus is portrayed.

So there is a positive feedback loop whereby overconfidence in statements about the consensus has become a self-fulfilling prophecy, at least in the short term. This is one reason why the ‘pause’ is so interesting, with recent observations not matching the expectations from climate model projections and some people projecting that the pause could continue for several decades. How scientists react to this, and how partisans in the debate play this, will be interesting to watch. I suspect that there will be a backlash against the overconfidence if the pause does continue.

201 responses to “Uncertainty: lost in translation”

  1. The positive feedback is very much based on “broken telephones” all around.

    1. The scientist in the scientific paper details the uncertainties

    2. The IPCC chapter mentions the uncertainties

    3. The IPCC SPM (summary for policymakers) contains some indication of the uncertainties inside one item in the bibliography

    4. The Press Release doesn’t have space for the uncertainties apart from a side remark in the middle of the text

    5. The interviewed scientist is not asked about the uncertainties

    6. The journalistic article isn’t interested in the uncertainties

    7. The policymaker either doesn’t know the uncertainties exist, or pivots all his/her career about some of the uncertainties as reported to him/her third- or fourth-hand.

    And then

    8. The scientist gets funding from the policymaker with the (mis)understanding that the uncertainties don’t really exist

    9. Back to point 1

    • I like the broken telephone analogy

thank you Judith. Some time ago I actually detailed a very good example of “broken telephones” in AR4.

And yes, the IPCC will never be shown “wrong”. Misunderstood, even too prone to misunderstanding itself, but the core of the science will still be there. At the other end of the broken telephone.

        —-

I remember when alignments of Moon craters were used to demonstrate their volcanic origin. Nowadays, the same alignments of the same craters are used to demonstrate their asteroidal origin.

      • curryja | July 29, 2013 at 4:56 pm |

        Except if your policymaker is such a dim bulb that they JUST RELY ON NEWSPAPERS FOR SCIENCE and don’t know to research the heck out of the issue prior to the evidence-based policy decision process, then there’s something seriously wrong here.

The policymaker might be predisposed to one bias or another prior to beginning the evidence-based policy decision process, including about uncertainty, but if the evidence-based policy decision process is so flawed as to not consider the evidence thoroughly and competently, then maybe we ought to get people to stop re-electing Jim Inhofe.

        Erm, I mean, incompetent policymakers.

        Hey, anyone read http://www.nytimes.com/2013/07/28/us/politics/obama-says-hell-evaluate-pipeline-project-depending-on-pollution.html?_r=0

        I know TransCanada Pipelines’ spin department has read it; there’ve been dozens of articles responding all over the oil-producing world with criticisms and nitpicking and outright proof one should not JUST RELY ON NEWSPAPERS for unbiased journalism.

        There’s a difference between a broken telephone and a guy making hissing noises at the other end pretending there’s a bad connection.

      • BartR your suggestion that politicians should be more interested in Science than newspaper reports of Science, presupposes that they are more interested in scientific facts, than public opinions about scientific facts. It is public opinion that is their life, and consequently those who form those opinions whom they most respect. Or why Inhofe’s state newspaper is the “Daily Oklahoman” and Bloomberg’s is the “New York Times”.

      • Is the unstated goal of the projectors to maximize confidence in and reliance on their pronouncements? Consider the etiology of the “90% = highly likely” assertions. It is consensus of a clique of self-selected and designated experts, retro-translated into a number which is trotted out to inspire reliance on the consensus creators! Self-serving blather.

    • David L. Hagen

      Quantifying & Reducing Uncertainties
Nigel Fox of the UK’s National Physical Lab provides an excellent perspective on how large the uncertainties actually are, how they could be reduced by an order of magnitude, and the large impact that would have on differentiating between models. See TRUTHS at NPL.

      See: Nigel Fox’s lectures:
      “Resolving Uncertainty in Climate Change Data” (30 min)
      ‘Seeking the TRUTHS about climate change’. (57 min)

      Technical:
      Accurate radiometry from space: an essential tool for climate studies doi: 10.1098/rsta.2011.0246 Phil. Trans. R. Soc. A 28 October 2011 vol. 369 no. 1953 4028-4063

Further links to Nigel Fox TRUTHS.

My involvement in climate science, after a career in elementary particle physics, was triggered by an email offer at the college where I was teaching: I could have as many copies of “An Inconvenient Truth” as I wanted to convince my students that dangerous global warming was occurring. I wrote back asking instead for a number of copies of Feynman’s Lectures on Physics for my students, and noting how it seemed to me that the video showed that increased temperatures caused most of the increase in CO2, not the other way around. I was immediately challenged to a debate by another physicist, because a real climate scientist doesn’t debate, they pontificate.
    In any case, I knew immediately, listening to their responses, that they were wrong and didn’t know what they didn’t know, which has been Al Gore’s problem forever. I didn’t have to know any details to know that they were uncertain enough that they couldn’t sustain a debate.
Of course, uncertainty calculations are determined by what we know is undetermined, but can never estimate what we don’t know. My hobby horse is the effect of cosmic rays on energy flow, which as far as I know is nowhere taken into account but which is demonstrably important, just as basic as the fact that CO2 is a greenhouse gas.

    • David Springer

Hi Charles,

      FYI the hyperlink to your blog doesn’t work.

Very well said. There appears to be a strong correlation between solar magnetic field strength (which has the effect of throttling high-energy extra-solar particles reaching TOA) and climate, using sunspot numbers going back 400 years as a proxy for field strength. The latter half of the twentieth century is a solar grand maximum, and almost like a light switch near the turn of the century it ended and appears to be going scary quiet, i.e. quiet enough to make Dalton and Maunder look frisky.

The difference of course is there’s a half-baked greenhouse theory that works for a pure radiative and conductive system but is woefully incomplete on the latent energy transport. Unfortunately latent energy transport is the big kahuna in the vast majority of the troposphere. I suspect it sucks pretty bad at predicting horizontal and vertical mechanical transport by ocean currents too.

    • Charles Jordan,

      Hear, hear!

      More Feynman, less “climatology”.

      Live well and prosper,

      Mike Flynn.

      • > More Feynman, less “climatology”.

        Here’s one from Michael Scott:

        There were 183 of us freshmen, and a bowling ball hanging from the three-story ceiling to just above the floor. Feynman walked in and, without a word, grabbed the ball and backed against the wall with the ball touching his nose. He let go, and the ball swung slowly 60 feet across the room and back – stopping naturally just short of crushing his face. Then he took the ball again, stepped forward, and said: “I wanted to show you that I believe in what I’m going to teach you over the next two years.”

        Another one, from another person:

        When I took the same freshman physics class in 1974-75, in the same lecture hall (“201 Bridge — it snows here even on the hottest days!”) the same demo was performed. (Actually, the pendulum bob was not a bowling ball by this time, but a shiny brass sphere about 18 inches in diameter.) The first quarter of the course was taught by a hot young field theorist and Feynman-sycophant.

        As she released the massive pendulum, she gave it a slight push, then calmly awaited its return. Another professor saw the push, leaped from his seat in the front row, and tackled the first professor out of the way, a few moments before the pendulum returned and lightly kissed the wall.

        http://www.langston.com/Fun_People/1994/1994ABX.html

      • David Springer

        @Willard

re: the professor tackling the other to save her from her own miscalculation

I can sympathize with the tackler. He knew what would happen from the moment the other released the pendulum. He had several seconds to think about whether he should let her learn a lesson or not.

        One day in college I had a similar experience with my roommate. We’d had a party the night before. Beer bottles were all scattered about. On my desk was a full bottle with no cap on it. Heineken in green bottles. I even remember the brand. It’s like it was yesterday. My roommate was a frugal sort and when he saw the bottle he said “Damn it Dave I hate when you open up a beer and then don’t drink it.” He walked over, picked it up, and was about to take a swig from it so it didn’t go to waste. Only it wasn’t beer. The bathroom was occupied the night before and I had to piss so I refilled an empty beer bottle. It was beer piss and probably had no flavor, odor, or color. But it was still piss. But my roommate had just chastised me. I had a split second to make a decision. At the last possible instant I yelled “BILL DON’T!!!!” and snatched the bottle out of his hand just inches from his lips.

        He never bitched at me again for opening a beer and not finishing the whole thing. We remained best friends for decades. We even married two girls who themselves were best friends. We’d been in the Marine Corps together for a few years and by then had both gotten out and were attending the same college. I think he was a little pissed (so to speak) about how close I let him get to taking a swig. I admitted it was a tough decision.

    • Charles Jordan | July 29, 2013 at 5:05 pm
      ” how it seemed to me the that the video showed that increased temperatures caused most of the increase in CO2″

      Not today, that’s for sure. Going from 280 PPM to 400 PPM in atmospheric levels of CO2 is not caused by an increase in temperature.

      • David Springer

        Really. Exactly how much of the CO2 rise is attributable to CO2.

        Please show your work. ;-)

      • David Springer

        Ummm… I mean how much of the CO2 rise is attributable to temperature rise.

        If you know it isn’t the entire 120ppm from 280 to 400 then surely you know how much it actually is, right? Otherwise you couldn’t say what you said. Unless you were just a babbling ass that tends to say the first thing that comes into his pointy little head of course. Which you aren’t, right?

      • Rob Starkey

Webby – You wrote, “Going from 280 PPM to 400 PPM in atmospheric levels of CO2 is not caused by an increase in temperature.” There is a compound effect. A rise in temperature causes an increase in the release of CO2 from soil. It is not possible to state how much of the increase in CO2 from 280 to 400 is due directly to humans. We know we release a lot of CO2. We do not know what percentage of the CO2 in the atmosphere is from humans.

      • Don’t act so clueless SpringyBoy. I have modeled that in a book chapter and several times in blog posts. A small fraction of the CO2 rise is due to long-term temperature increase in the surface promoting outgassing. Since this temperature increase is only on the order of 1 C, the outgassing is small relative to 40%.

        http://theoilconundrum.blogspot.com/2011/10/temperature-induced-co2-release-adds-to.html

        This post is cleverly titled Temperature Induced CO2 Release Adds to the Problem

      • David Springer

        Your article suggests 1ppm/year increase from outgassing per degree of sustained temperature anomaly. A number of sanity checks arrive inside the ballpark. Excellent. I’ll accept that for now.

        The last cold episode of the little ice age ended in 1850. That’s 160 years where the ocean has been ostensibly warming up with some positive anomaly. According to this:

        the anomaly in 1850 was -0.6C and by 1900 was -0.2C and by 2000 was 0.2C. That seems adequate per your 1ppm/year/degree to outgas about two thirds of the 120ppm increase since 1850.

        Please explain how that’s not the case and your arrival at only 20ppm.

I see one flaw in that you start your impulse function in the year 1900, which ignores the -0.4C sustained anomaly built up between 1850 and 1990 and continued for the next 110 years. That works out to about half a degree C anomaly sustained over 120 years, or 60ppm outgassing by the 1ppm/year/degree rule.

        Now lets do a sanity check. You calculate 20ppm outgassing for temp anomaly starting with 1900 baseline. I get an additional 60ppm for the sustained 0.4C anomaly created between 1850 and 1900. That’s 80ppm total or two thirds the total observed increase. Checkarooni.

        Thanks for playing. Nice try but global warming began in 1850 with the end of the LIA not 1900 which appears to be an arbitrary baseline you pulled out of your ass to get the result you wanted.
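The back-of-envelope arithmetic in this exchange is easy to sketch in code. The anomaly points ((1850, -0.6), (1900, -0.2), (2000, 0.2)) and the 1 ppm/year per degree rule are the commenter’s own illustrative numbers, not measured values; a straightforward trapezoidal integration of the same points relative to an 1850 baseline gives roughly 70 ppm, in the same ballpark as the ~80 ppm claimed above:

```python
# Illustrative sketch of the "1 ppm/yr per deg C of sustained anomaly" rule
# debated in this thread. Anomaly points and the rule itself are taken from
# the comment above, not from observational data.
ANOMALIES = [(1850, -0.6), (1900, -0.2), (2000, 0.2)]  # (year, deg C)
PPM_PER_DEGREE_YEAR = 1.0  # the comment's rule of thumb

def outgassed_ppm(points, baseline):
    """Trapezoidal integral of (T - baseline) over time, scaled by the rule."""
    total = 0.0
    for (y0, t0), (y1, t1) in zip(points, points[1:]):
        mean_anomaly = ((t0 - baseline) + (t1 - baseline)) / 2
        total += mean_anomaly * (y1 - y0)
    return total * PPM_PER_DEGREE_YEAR

print(outgassed_ppm(ANOMALIES, baseline=-0.6))  # ~70 ppm under these assumptions
```

The whole disagreement in the thread reduces to the choice of baseline year: integrating from an 1850 equilibrium rather than a 1900 one triples the implied outgassing.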

      • David Springer

        typo correcto

-0.4C sustained anomaly built up between 1850 and 1990 → 1900

      • no good, you can’t do math that loosely springyboy.

      • David Springer

        No good. You can’t criticize that broadly.

  3. The first sentence reads “We have looked at what uncertainty means and doesn’t mean in science, how it is measured, when it can’t be measured and how that might change through research into the big questions.”

    I was intrigued, and read the rest. The point I have tried to make over and over again, is that when a quantity is measured in physics, we always know how accurately this measurement has been made. We always get a +/- that goes with the measured value. The amount of the uncertainty is, then, certain.

When it comes to CAGW, the absolutely vital quantity is the climate sensitivity for a doubling of CO2, however defined. Because we cannot do controlled experiments on the atmosphere, with current technology we cannot measure climate sensitivity, and this is where all the uncertainty in CAGW lies, and why our hostess, and many others, have written as much as they have on the “uncertainty monster”.

    I am sorry, but if some quantity in physics cannot be measured, then there is a fundamental problem. We can do all the theorizing we like, but in the end we cannot know whether this theorizing makes any sense, because we cannot measure the vital quantity; in the case of CAGW, climate sensitivity. So we will always be unable to use physics to tell us what will happen when we add more CO2 to the atmosphere from current levels. Physics cannot tell us whether CAGW exists or not. No-one can prove that CAGW either exists or does not exist. It will remain a permanent hypothesis. That is what our hostess ought to conclude from her musings on the uncertainty monster. And that is what the IPCC should have stated many decades ago.

    • Jim Cripwell

      +100

      Max

    • …and you’re quite certain of your statements, are you James?

    • Jim:
      Your point about direct immeasurability is well made. One can, however, draw sound scientific conclusions about what sort of measurements and analyses CONTRADICT the CO2-driven climate-change hypothesis. I have in mind here the demonstrable lack of cross-spectral coherence between CO2 and temperature variations on multi-decadal time-scales in the currently measured time-histories.

      • John, you write “draw sound scientific conclusions”

I agree. I have noted several times in the past that the point you make gives a strong indication that the climate sensitivity for a doubling of CO2 is indistinguishable from zero. But being strictly pedantic, it is not proof that the hypothesis of CAGW is wrong. So CAGW remains a viable hypothesis.

      • Jim:
        Being more practical than pedantic, I would characterize the CAGW hypothesis as one that lacks convincing empirical evidence. Although I never claimed climate sensitivity “indistinguishable from zero,” CO2 is almost certainly not the principal driver of surface temperature changes observed in the 20th century. There’s much left undiscovered in the infant science of planetary climate systems, whose workings cannot be explained by radiative processes alone.

      • John, You and I agree. I think that what you have written is absolutely correct.

  4. A long time ago in climate blog time – two or three days – at the end of the Consensus 11 post I asked a question, to which I was hoping to get an answer or two. It may have appeared to be rhetorical; it was not meant to be. I would seriously appreciate an answer from anyone!
    – and I know many who will participate in this thread did so there.

  5. David in Cal

    Assigning probabilities to unknown unknowns is merely a communication device IMHO. It’s not a measurable number, the way roulette probabilities are. It cannot be tested or falsified. It doesn’t come out of a scientific formula. So a “90% probability” is merely a way for a researcher to say that s/he’s highly confident. I think the media give too much weight to such personal probabilities, not too little.

  6. people might be more inclined to revisit if you provide a loink. Going back thru this stuff can be a pain…just a thought, darrylb.

    • hmmm, a ‘link’ would be better than a ‘loink’, but I see Dr C has so noted. regards

  7. David in Cal

Leonard Savage’s classic “Foundations of Statistics” defines personal probabilities in terms of the odds at which one would make a bet. Under his definition, for someone to say that his personal probability exceeds 90% that (most of the observed increase in global average temperatures since the mid-20th century is due to the observed increase in anthropogenic greenhouse gas concentrations) would mean that he’s willing to give odds of greater than 9 to 1 that this proposition is correct. I wonder how many people at the IPCC would give those odds if they were betting their own money.
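Savage’s definition is simple arithmetic: a personal probability p corresponds to fair betting odds of p/(1−p) to 1. A minimal sketch of that conversion:

```python
def odds_for(p):
    """Betting odds (x-to-1) implied by a personal probability p, per Savage's
    definition of personal probability as willingness to bet."""
    if not 0 < p < 1:
        raise ValueError("probability must be strictly between 0 and 1")
    return p / (1 - p)

# A '90% probability' claim implies willingness to give 9-to-1 odds;
# 95% would imply 19-to-1.
print(odds_for(0.9))   # ~9, i.e. 9-to-1
print(odds_for(0.95))  # ~19, i.e. 19-to-1
```

Framed this way, the difference between ‘very likely’ (>90%) and ‘extremely likely’ (>95%) is the difference between giving 9-to-1 and 19-to-1 odds, which makes the question of who would actually take the bet quite concrete.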

  8. Chief Hydrologist

    The most incredible thing happened in climate in 1998/2001. Climate spontaneously reorganized into a new – and cooler – configuration. This was predicted by no one and understanding the fundamental significance of this climate shift is still emerging.

    Climate had previously shifted in 1976/77 – to a warmer mode characterized by a warm PDO and more frequent and intense El Nino. We were aware of this – it was widely known as the Great Pacific Climate Shift. It was speculated that the 1976/77 shift was caused by global warming and that this was a new and permanent state of the Pacific Ocean. The shift back to cooler conditions reveals underlying climate dynamics that transforms notions in a new paradigm of how climate works.

    The old paradigm applies inappropriate methodologies of reductionism to a dynamically complex system. The new paradigm allows progress to a state where the source of uncertainty is known and the new methods of complexity science can be applied.

    • well put, Chief

      • The Chef said:


“The most incredible thing happened in climate in 1998/2001. Climate spontaneously reorganized into a new – and cooler – configuration.”

        And that is said with complete certainty. The irony of it all.

Look at the words as well: “most incredible” and “spontaneously”. How scientific!

        That is why The Chef is such a valuable commenter, as he keeps the level of FUD so high.

      • Chief Hydrologist

        Uncertainty is about the future – we are entitled to some certainty about the past. The 1998/2001 climate shift is as obvious as the 1976/77 ‘Great Pacific Climate Shift’.

It is the most significant climate event this century yet, and it happened abruptly as a spontaneously emergent behavior of the climate system – most prominently a shift to a cool PDO and an increase in the frequency and intensity of La Nina.

    • I agree that these climate shifts took place, and indeed I pointed this out in my several books in detail. But isn’t this scenario superimposed upon a background of a steadily upward trend (albeit less steep than alarmists claim) probably due to rising GHG concentrations?

Actually, I think the external forcing projects onto these shifts, making the steady upward trend not a very good characterization.

      • Chief Hydrologist

        CO2 should lead to increased temperatures – all other things being equal. We are pretty much guaranteed that all other things will not be equal in unpredictable ways – and this extends to the planetary energy budget.

        Of course there is always the chance that CO2 will push the system past a threshold.

      • The “steadily upward trend” is not to be found in the century-long records from the least-altered station sites. Such records show multidecadal variations, rather than secular trends, dominating the lowest spectral frequencies. Multidecadal components are also very prominent in the most reliable proxy series, such as GISP2 del18O.

      • JohnS, “The “steadily upward trend” is not to be found in the century-long records from the least-altered station sites. Such records show multidecadal variations, rather than secular trends, dominating the lowest spectral frequencies.”
        That is not true. There is evidence of a steady long-term trend if you look at regions that have less impact from the shorter-term variability.

        Since land has a stronger response to radiant forcing than ocean, the difference gives a reasonable estimate of the impact of CO2e forcing. About 0.8C per doubling.

        Using tropical paleo you can get a longer-term estimate, with uncertainty of course, but a fair indication.

        If you were looking for natural variability you would find it.

      • Capt:
        I do look for natural variability–in century-long time series that are products of actual measurement at hundreds of fixed locations, rather than of ad hoc statistical syntheses of inadequate data fragments from ever-changing locations, such as you show. Eating such data “sausages” leads to scientific indigestion, if not something worse.

      • JohnS, “I do look for natural variability–in century-long time series that are products of actual measurement at hundreds of fixed locations, rather than of ad hoc statistical syntheses of inadequate data fragments from ever-changing locations, such as you show.”

        If you take enough fixed locations which have varying seasonal cycles and average them out, you will get pretty much a straight line except for a dip in the 1600 to 1880 time frame. Since the polar regions can be in phase or out of phase with the mid-latitudes and tropics, you can get more or less wiggles depending on where you look. When your ad hoc data fragment represents over 60% of the energy in the system, you will likely get a more reasonable estimate.

        If you are not into paleo, you are pretty much stuck with half a globe of information up to 1955, then a less-than-stellar collection of data for the other half, which really shows the ~60-year cyclic pattern.

        Of course the southern data is highly suspect, but you could compare that with the SOI.

        Northern extratropics has the best coverage for both land and oceans so you might find that more interesting.

        If you are worried about the splice of instrumental to the Indo-Pacific Warm Pool reconstruction, you could make a more involved splice comparison.

        Or you could Marcott the whole mess and not find much of anything.

      • Captn:
        My original remarks clearly concerned “century-long records from least-altered station sites” as a requisite basis for bona fide scientific analyses. Such records from properly sheltered thermometers are available world-wide only for the 20th century. In 1750, there were only a dozen towns in Northern Europe, a couple in the US, and Rio de Janeiro recording temps daily. No such recordings were made then in all of Asia, Africa and Australia. And outside of well-traveled sea lanes, SST observations by ships of opportunity left vast expanses of the oceans uncovered. That is hardly a scientifically credible foundation for any “global average” time-series. My point is that by sticking to empirical evidence required for hard science, truly secular trends are not to be found in the aggregate, except in locations whose environment strongly changed during the past century: urban centers and lands newly cleared to sustain an exploding global population.
        By bringing up paleo reconstructions–which seldom agree amongst themselves–to contradict my point, you’re venturing into the field of tea-leaf readings. I now regret mentioning GISP2, because that’s not where I would go for PRIMARY scientific indication.

      • To clarify a murky sentence, please read: …SST observations by ships of opportunity left vast expanses of the oceans largely uncovered until the satellite era.

      • Johns, “By bringing up paleo reconstructions–which seldom agree amongst themselves–to contradict my point, you’re venturing into the field of tea-leaf readings.”

        I wasn’t bringing up paleo to contradict your point; I just didn’t think you had a very good point. The data is what it is. You can’t throw any of it away without a valid reason. The tropical SST and land I used are for well-traveled regions. The trade winds as trade routes kinda force that. The North Atlantic is well covered as well. If you like, you can do a correlation of those regions with GAT to get a margin of error. Since the Tropics agree well with the base signal in GISS or any other surface record, globally, hemispherically and regionally, that is a pretty good indication that there is a longer-term secular trend. When that secular trend is dead nuts on the Indo-Pacific Warm Pool reconstruction, there just might be a probability that things are making sense.

      • Captn:
        There is a big difference between temperature data gathered under the various circumstances I described, which is seldom appreciated by those who lack professional experience in making in situ measurements. And there is a categorical difference between actual measurements and proxy reconstructions. Reliable data is NOT simply “what it is,” as you would have it! BTW, by what analytic criterion do you conclude that a run of rising values constitutes a SECULAR trend, rather than a feature of seeing only a fraction of a truly long-term cycle?

      • JohnS, “BTW, by what analytic criterion do you conclude that a run of rising values constitutes a SECULAR trend, rather than a feature of seeing only a fraction of a truly long-term cycle?”

        The secular trend I am referring to is the rise from approximately 1600 AD in the paleo and from 1900 AD in the instrumental. There is also a cyclic/pseudo-cyclic trend of ~60 years in the instrumental that appears to vary from 60 to 150 years in the paleo. There is even an atmospheric forcing trend, mainly in the NH, that is likely due to CO2e and amplification of the other trends.

        I prefer the SST because even though it is sparser, it is an actual temperature, not an average of Tmax and Tmin, and I have compared the SST with the regional Tmins available from BEST; it is reasonably in agreement other than the pre-1955 near-Antarctic region. For example, Oceania Tmin shows the trend. The Galapagos Islands Tmin shows ENSO with the secular trend. You can pick any number of coastal or island surface stations that have a large thermal mass, ocean/sea/lake, to filter out some of the atmospheric noise. There are a lot of ways to double check.

        That uses the equatorial temperature imbalance to help look for the secular trend.

        Just cutting out the 1976 climate shift you can see a fairly common longer term trend. Some of this is a bit like reading tea leaves though. You can use sequential linear regressions by region and standard deviation to help isolate change points, but most of the more novel methods are above my pay grade.
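The “sequential linear regressions … to help isolate change points” idea mentioned above can be sketched in a few lines. This is a minimal, hypothetical illustration on synthetic data; the scan procedure, the series, and all numbers are assumptions for demonstration, not any commenter’s actual method:

```python
import numpy as np

def best_breakpoint(t, y, min_seg=10):
    """Scan candidate breakpoints; fit a least-squares line to each
    side and return the split index minimizing total squared residuals."""
    best_k, best_sse = None, np.inf
    for k in range(min_seg, len(t) - min_seg):
        sse = 0.0
        for tt, yy in ((t[:k], y[:k]), (t[k:], y[k:])):
            coef = np.polyfit(tt, yy, 1)      # slope + intercept per segment
            r = yy - np.polyval(coef, tt)
            sse += float(r @ r)
        if sse < best_sse:
            best_k, best_sse = k, sse
    return best_k

# Synthetic "temperature" series: a slow trend plus a step shift at index 60
rng = np.random.default_rng(0)
t = np.arange(100.0)
y = 0.01 * t + np.where(t >= 60, 0.5, 0.0) + rng.normal(0.0, 0.1, 100)
print(best_breakpoint(t, y))   # should land near 60
```

Real change-point methods add significance tests so that noise is not mistaken for a shift, but the two-segment scan is the core of the approach.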

      • Captn:
        Since the power density spectra of the most reliable long-term temperature records, both instrumented and paleo, unmistakably show wide-band (highly irregular) cyclical behavior over multidecadal to quasi-millennial scales, the use of linear regression to establish “trend” over arbitrarily chosen time-intervals is analytically untenable. That you do that with synthesized indices by “removing the 1976…climate shift” makes such results even more fanciful.
        Although I, too, would prefer to work with SST rather than land-station records, the stark fact is that the Surface Marine Observations are all-too-often inadequate to establish reliably even monthly climatological summaries, let alone long time series, in most Marsden squares. Well-traveled sea-lanes may cross the tropics, but they don’t run extensively inside them. Ships don’t report Max and Min temps, only the temps observed at standard six-hourly GMT times. Thus at least 1,460 data points (four per day) are required for a determination of a yearly average. Furthermore, the sampling methods for SST have been highly inconsistent. It requires a leap of faith to accept such synthesized indices as ERSST, which relies upon unacceptable spatio-temporal interpolations to construct long time series, for scientific inference.
        Similar objections apply to BEST’s imprudent use of “scalpeled” snippets of record. Their results are generally deviant in their spectral structure and are often highly corrupted by UHI (which affects Tmin in particular) at major cities, which usually have the longest records. This corruption is then spread more widely by “kriging” into regions of sparse station coverage.

      • JohnS, “That you do that with synthesized indices by “removing the 1976…climate shift” make such results even more fanciful.”

        I think you are missing what I did there. That is the difference between the NH and SH latitude bands, or the hemispheric imbalance based on absolute temperature, not anomaly. Since they both would have the ~60-year cycle, the differencing highlights trends not common to both. In fact, since the “regain” from an NH-dominant Little Ice Age would be primarily in the NH, it is more compelling.

        “Ships don’t report Max and Min temps, only the temps observed at standard six-hourly GMT times.” Right, so 71% of the surface is not reported as Tmax and Tmin, so I should base “global” on 29% of the surface? I still say you have to use the data that is available. I can double check SST with satellite data, which is also not reported as Tmax and Tmin, and I can compare SST to coastal Tmin from surface stations. There is an error margin that increases as you go back in time, but it is what you have for the MAJORITY of the surface.

      • Captn:
        The bottom line is, no matter what you do, if you don’t have reliable, unbiased data, then you don’t have a firm basis for empirical inferences. Such data, unfortunately, is available globally only since the satellite era. Thoroughly scrutinized and validated station records in barely sufficient density to provide reasonable estimates for continents extend no further back than the 20th century. Enough said!

    • David Springer

      That’s a bunch of crap that it wasn’t predicted by anyone. It was predicted by climatology (as opposed to climate science). Climatology is pretty much actuarial so once you have data with identifiable spatio-temporal patterns you can predict based on those. Just like how hurricane tracks are predicted.

      You have to be nearly blind (or very stupid) to look at NH land temperature record, even old ones before satellites, and not see an approximate 60-year sine wave in it.

      Predicting that the sine wave will continue is pure climatology and works like a charm in this case. The goofy climate scientists with their CO2 control knob simply failed in the climatology department because they were all about theory which turns out, unsurprisingly, to be a huge gorpy hypothesis spanning just about every branch of science riddled with assumptions some of which are wrong. How wrong even David Springer doesn’t know.

      • Chief Hydrologist

        I was concerned with hydrological regime theory in Australia especially. These are related to Pacific conditions that produce drought- and flood-dominated regimes of approximately 30-year durations. These were traced back in the mid-1980s, in flood records, to 1850. A mere moment in the scheme of things. So I had an inkling that things might change back again – but only time would confirm the hypothesis. I know of no one who was predicting a turnaround at that stage – and it would have been a courageous, not to say reckless, prediction.

        Before the turn of the century, proxies were limited and even more unreliable than they are now. There were alternative explanations for the mid-century dip in temperatures in the instrument records. There were simple CO2 measurements and radiative physics that still suggest CO2 changes climate – therefore how to disentangle natural from anthropogenic change was the question of the day. The instrumental records are of such limited duration as to be statistically useless for these multi-decadal patterns. The PDO was not described until 1996, and much else has emerged since then.

        The critical insight into dynamical systems theory specifically applied to these ocean and atmospheric patterns did not emerge until the Tsonis et al. paper in 2007. (Tsonis, A. A., K. Swanson, and S. Kravtsov (2007), A new dynamical mechanism for major climate shifts, Geophys. Res. Lett., 34, L13705, doi:10.1029/2007GL030288 – http://heartland.org/sites/all/modules/custom/heartland_migration/files/pdfs/21743.pdf )

        This is not about sine waves or cycles but about abrupt change in a non-stationary, non-Gaussian, dynamically complex system. 20/20 hindsight is all very well – but the abrupt climate shift of 1998/2001 was certainly not predicted.

      • David Springer

        You misunderstand. Do you know the difference between climate science and climatology? The pause, give or take five years, was perfectly predictable with climatology. Right on time. Magnitude perfect. A repeat of the pause 60 years ago.

      • David Springer

        Ask Curry. She knows her climatology and when pressed for a prediction makes the same one I make – a multi-decadal pause just like what happened in the 1950’s. Texas is even having the same decadal drought as we had in the 1950’s. This is how climatology works. It isn’t climate science. Theory takes a back seat to historical data and repetition of weather patterns that occur under similar circumstances. So for example we take everything we know about every hurricane in the past and find the most similar past set of circumstances to make our prediction of how the current storm will evolve.

        In other words it doesn’t take an iota of climate science to predict 40+ inches of snowfall in Buffalo in January, February, and March of 2014. Nor does it take Copernicus or even Ptolemy to predict 2014 will have almost precisely 365 days and the sun will rise in the east and set in the west on each of those days.

        We still don’t have a theory of gravity but we can put a man on the moon and return him safely despite our ignorance. We have a law of gravity but no theory that explains the law. There’s a phuck of a lot of predicting you can do without any theory behind it.

      • Chief Hydrologist

        ‘climatology

        The scientific study of climates, including the causes and long-term effects of variation in regional and global climates. Climatology also studies how climate changes over time and is affected by human actions.’

        I think you have confused climatology with trendology.

        Moreover – I am still talking about the ‘new dynamical mechanism for major climate shifts’ and not decadal trends. Much more significant.

      • Chief Hydrologist

        There is a difference between predicting shifts and realizing when the world is in one mode or another. I have been talking multi-decadal cooling for a long time. Although unless you have some idea that it changes the energy budget of the planet, it makes little difference to the overall outcome if the planet continues to slowly accumulate energy.

        I have discussed with you the multi-decadal Texas drought and its origins in precisely these oceanic and atmospheric patterns. It is a little like the mirror image of Australian drought and flooding regimes.

        e.g. http://s1114.photobucket.com/user/Chief_Hydrologist/media/USdrought_zps2629bb8c.jpg.html?sort=3&o=40

        http://oceanworld.tamu.edu/resources/oceanography-book/oceananddrought.html

        But prediction of the timing and nature of the next climate shift is still quite impossible. This is where the new techniques of complexity science may be of some use.

        e.g. http://www.ucl.ac.uk/~ucess21/00%20Thompson2010%20off%20JS%20web.pdf

      • “David Springer | July 29, 2013 at 10:31 pm |

        Ask Curry. She knows her climatology and when pressed for a prediction makes the same one I make – a multi-decadal pause just like what happened in the 1950′s. ”

        Look at the record. She is a huge supporter of Tsonis, and that is why she routinely eggs the Chief of Water to continue to wave the Tsonis flag of chaotic uncertainty.

        Look over there. Chaotic squirrels!

        Nothing works better to promote uncertainty than to point at chaos as a culprit.

      • Willard, “Nothing works better to promote uncertainty than to point at chaos as a culprit.”

        Squirrel? Hadley cells, Ferrel cells, polar cells and polar vortexes are atmospheric circulations that depend on Coriolis effects, latitudinal temperature range and stable ranges or space. Changing the temperature range and space, by, say, creating a circumpolar current in one hemisphere instead of both, would create an asymmetry: one hemisphere would be more stable than the other. The stable hemisphere would be less sensitive to change than the less stable. There would be two climate sensitivities.
        I don’t recall seeing that very basic part of atmospheric dynamics discussed much; pretty much only “global” sensitivity is discussed by the linear no-threshold groupies.

        http://arxiv.org/ftp/physics/papers/0408/0408089.pdf

        Instead of a squirrel, it is an elephant.

      • Chief Hydrologist

        ‘The climate system has jumped from one mode of operation to another in the past. We are trying to understand how the earth’s climate system is engineered, so we can understand what it takes to trigger mode switches. Until we do, we cannot make good predictions about future climate change… Over the last several hundred thousand years, climate change has come mainly in discrete jumps that appear to be related to changes in the mode of thermohaline circulation.’ Wally Broecker – http://www.earth.columbia.edu/articles/view/2246

        Notwithstanding webby’s stupidity – abrupt climate change is mainstream science.

      • > Willard […]

        You talking to me, Cap’n?

      • Willard, “You talking to me, Cap’n?”

        Yep. Chaos and turbulence are just a part of the problem and creates most of the uncertainty that gets lost in the translation. Kinda odd for you to call it a squirrel in this thread.

      • That’s Web who said that, Cap’n.

      • Chef references Broecker, who has scientifically determined that aCO2 will be damaging to the planet.

        Chef references Broecker’s studies of past abrupt climate change.

        Chef cannot put 2 and 2 together.

        He tries and gets a negative number. Apparently Chef believes that the abrupt climate change due to aCO2 will lead to cooling. Chef is a denier so that’s where he has to take Broecker’s statements.

        In fact, Broecker is pointing to a deterministic outcome caused by a forcing which will lead to a significant warming. That is not a chaotic squirrel. It is a homing pigeon.

        And then you have Cap’n attributing something that I said to Willard.

        Do these guys of the 3% persuasion have control over their faculties?

      • Chief Hydrologist

        Wally Broecker characterized carbon emissions as poking a stick at an angry beast.

        ‘No one doubts that abrupt climate change has happened – the evidence for it is overwhelming – but there are considerable controversies as to where, how, why or even when it happened. The uncertainties bear on the following points:

        • Spatial footprints: What is the geographical extent of a given climate event? Did it occur in both hemispheres? Did it happen at the same time everywhere, or can we track down its propagation on the Earth’s surface? For example, there is an argument as to whether the Younger Dryas (YD) cooling occurred in the Southern Hemisphere as well as the Northern Hemisphere. The main evidence in the Southern Hemisphere is a glacial advance in New Zealand, the date of which is unsure. It is a matter of pressing urgency to determine if the YD occurred in other parts of the Southern Hemisphere and in the Tropics.

        • Interpretation of climate records: accurate, global coverage of relevant climate variables is very recent, and in many respects still insufficient. Thus to gain knowledge of the past, paleoclimatologists must rely on “proxy variables” measured in various geological objects, such as ice cores, sediment cores (deep-sea and lakes), tree-rings, corals, etc. Since those are, by nature, indirect measurements, the exact amplitude of some isolated events is very difficult to determine. In the colorful prose of the NAS report: imagine that the climate system is “a device […] hidden in a box in a dark room.You have no knowledge of the hand that occasionally sets things in motion, and you are trying to figure out the system’s behavior on the basis of some old 78-rpm recordings of the muffled sounds made by the device. Plus, the recordings are badly scratched, so some of what was recorded is lost or garbled beyond recognition. If you can imagine this, you have some appreciation of the difficulties of paleoclimate research and of predicting the results of abrupt changes in the climate system.”

        • Causes of abrupt climate change: A viable scenario for abrupt climate change must fulfill 4 requirements (Broecker, 2003).

        1. the scenario must characterize the states among which the climate system has jumped.

        2. it must identify a mechanism by which the system can be triggered to jump from one of these states to another (see question 7).

        3. it must invoke a teleconnection mechanism by which the message can be rapidly transmitted across the planet.

        4. it must have a flywheel capable of holding the system in a given state for many centuries.’

        http://www.ldeo.columbia.edu/res/div/ocp/arch/

        ‘Modern climate records include abrupt changes that are smaller and briefer than in paleoclimate records but show that abrupt climate change is not restricted to the distant past.’ http://www.nap.edu/openbook.php?record_id=10136&page=19

        You are an utter idiot Webster.

      • No worries, Cap’n.

        Seems that the Financial Times are also tackling chaos:

        What makes the inaction more remarkable is that we have been hearing so much hysteria about the dire consequences of piling up a big burden of public debt on our children and grandchildren. But all that is being bequeathed is financial claims of some people on other people. If the worst comes to the worst, a default will occur. Some people will be unhappy. But life will go on. Bequeathing a planet in climatic chaos is a rather bigger concern. There is nowhere else for people to go and no way to reset the planet’s climate system. If we are to take a prudential view of public finances, we should surely take a prudential view of something irreversible and much costlier.

        http://www.ft.com/cms/s/0/c926f6e8-bbf9-11e2-a4b4-00144feab7de.html

      • Willard, “If we are to take a prudential view of public finances, we should surely take a prudential view of something irreversible and much costlier.”

        They kind of miss the point there. Chaotic would imply reversible, just not predictably so.
        “Chaos: When the present determines the future, but the approximate present does not approximately determine the future.” – Lorenz
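Lorenz’s dictum can be seen directly in his own 1963 system: two runs whose initial states differ by one part in 10^8 end up as far apart as the attractor allows. A minimal sketch, using forward Euler with the conventional parameters (step size and run length are arbitrary choices for illustration):

```python
import numpy as np

def lorenz_step(s, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system."""
    x, y, z = s
    return np.array([x + dt * sigma * (y - x),
                     y + dt * (x * (rho - z) - y),
                     z + dt * (x * y - beta * z)])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-8, 0.0, 0.0])   # the "approximate present"
for _ in range(10000):               # integrate both runs to t = 50
    a, b = lorenz_step(a), lorenz_step(b)

# The 1e-8 offset has grown by many orders of magnitude
print(np.linalg.norm(a - b))
```

The system is fully deterministic, yet the tiny initial offset renders the two trajectories mutually unpredictable, which is exactly the distinction being argued over here.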

      • I don’t think they had the same concept of chaos as you in mind, Cap’n.

        Here might be something more like yours:

        [D]eterministic chaos is a well-defined mathematical concept. It refers to a situation in which you can completely understand and be able to model (mathematically) some system, but cannot actually determine the initial conditions with sufficient precision to be sure that any simulation of the system’s evolution will actually match what is observed.

        http://wottsupwiththatblog.wordpress.com/2013/07/28/watt-about-deterministic-chaos/

        Have you seen the original discussion at Tony’s?

      • Chief Hydrologist

        Models have in fact several sources of instability: scale, parameterizations, coupling breadth and depth, data error bounds.

        Climate chaos involves small changes in control variables cascading through numerous sub-systems before settling into a new and emergent pattern.

      • Willard, “I don’t think they had the same concept of chaos as you in mind, Cap’n.”

        What kind of chaos you can expect would depend on the system you are dealing with. If you know the actual bounds of the system, you have a simple probability density. The problem is the bounds, or space. Climate science neatly avoided the upper bounds and did not consider asymmetry. Now asymmetry is getting a harder look, and the system limits have expanded to the stratosphere and deeper oceans. Just about all the previous “projections” were based on pitiful slab models with simplistic radiant kernels that did not consider 14 to 25% of the atmosphere. Flat-earth models with two-dimensional physics don’t cut it.

      • Chef lists four criteria easily met by GHG theory

        1. STATE TRANSITION
        Transition from a low CO2 to high CO2 state. A significant fraction of CO2 that was geologically sequestered underground over an epochal time-scale is now residing above ground, and more is to come

        2. FORCING MECHANISM
        Mechanism of state change is CO2 forcing caused by fossil-fuel combustion

        3. CONNECTIVITY
        CO2 quickly diffuses to blanket the planet.

        4. PERSISTENCE
        The flywheel is the long-term persistence of CO2 in the atmosphere.

        Chef, you are a useful idiot to the people that run this blog. Even though they gush at your commentary, they don’t ask you to publish your findings or otherwise work with them. Ever wonder why that is?

      • Chief Hydrologist

        I copy from – http://www.ldeo.columbia.edu/res/div/ocp/arch/ – 4 criteria from Wally Broecker for recognizing abrupt climate changes.

        Webby gives 4 criteria for recognizing a twit.

    • R. Gates the Skeptical Warmist

      Chief,

      This is a good post, but you said:

      “The new paradigm allows progress to a state where the source of uncertainty is known and the new methods of complexity science can be applied.”

      ______
      I would only suggest this be amended to say:

      “The new paradigm allows progress to a state where one of the sources of uncertainty is known…”

      Certainly the shift to a cool PDO (which is really what you’re getting at) is only one of the sources of uncertainty. There are several others, with the AMO being one significant one. The role of clouds and aerosols is another, and the sleepy Gleissberg sun yet one more. When the dust settles, both literally and figuratively, those scientists who embraced these areas of uncertainty will prove to be the leaders of the next phase of our continually evolving understanding of the climate.

  9. By translating “manipulation” into “scientific data processing,” Somerville and Hassol reveal their astonishing spin. Without blatant manipulation of station data and reliance upon UHI-corrupted records, the 20th-century time-history of surface temperature would look remarkably different, with a deep low in 1976 and end-of-century levels only slightly higher than those experienced in the first few decades.

  10. I get the feeling that these authors don’t understand their own hypothesis. They are using uncertainty in climate science as an example of uncertainty acting against the science which is really quite well known – just not quite well known enough to make over-emphatic statements. But this, I think, is to miss their own main point: Uncertainty means that there is a possibility that all or part of the scientific theory being discussed is significantly wrong. One way forward – I hesitate to say the best way since that is too certain – is to test specific assumptions that the theory is wrong. Until each of those assumptions can be disproved or reduced, the relevant parts of the theory should be treated as being unusable.

    There will still be mistakes, but the end effect should be that the bounds of a theory become reasonably well known and hence the theory can be applied with a measure of informed confidence in some situations, and not be applied in others.

    In the case of climate science, some of the specific assumptions to be tested might be: we don’t understand and can’t predict ocean oscillations, solar cycles, cloud formation, the hydrological cycle, tropical storm cycles, etc., or any of the interactions between them. My goodness, it’s a long list. The idea that we know anything major about climate with “90% certainty” is starting to look like hubris.

    • I would like to know the ratio of energy that is converted into latent and sensible heat when different wavelengths of light are absorbed by sea water. I know that in the case of blue light almost all of the energy absorbed is manifest as sensible heat, but I do not know if this is the case with red or infra-red radiation.
      As back-radiation is infra-red, it would be nice to know the albedo of the Earth, from 400-4000 nm, as well.
      Easily knowable, unknowns.

      • David Springer

        DocMartyn | July 29, 2013 at 6:38 pm | Reply

        “I would like to know the ratio of energy that is converted into latent and sensible heat when different wavelengths of light are absorbed by sea water.”

        Heh. You and me both. There are economical (for a university lab) continuous mid-range infrared lasers which can be brought to bear to isolate the variables of interest and test them. Climate scientists, near as I can tell, are like neo-Darwinists in that they’re not really interested in subjecting their dogma to experimental science. It can only end badly when you already have a 97% consensus. Speaking of which, neo-Darwinism and climate warmism are the only scientific theories which rely on consensus instead of experimental evidence.

      • David Springer

        I figure you for a neo-Darwinist, doc.

        Predict for me, if you will, when P. falciparum will evolve a means of digesting sickle-cell hemoglobin. I mean here we have a eukaryote that’s probably the most studied organism in history, completely sequenced, and it predictably evolves (or not) resistance to anti-malarial drugs based solely on eukaryotic DNA copy errors of one SPM per 10^9 cell divisions multiplied by how many trillions of trillions of times it reproduces in a year or a decade. If resistance can be acquired in a single SPM there’s about a 50-50 chance it’ll find it in any single infected human. If it takes two SPMs where one alone does nothing, it takes about 10 years. If it takes three co-dependent SPMs it is unlikely to find a solution. Evidently it takes three SPMs (or more) to get over the human sickle-cell mutation.

        Now get this. In the past few decades alone P. falciparum has reproduced more times than all the reptiles and mammals that ever lived. I’m asked to accept the notion that while P. falciparum hardly changed at all that fish, with the same number of reproductive events, became mammals. I ain’t buying it. Sorry.
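The back-of-envelope reasoning above (a fixed per-division mutation rate, with co-dependent mutations multiplying the per-division odds) can be made explicit. The mutation rate and division count below are illustrative assumptions chosen for the sketch, not measured values for P. falciparum:

```python
import math

# Illustrative numbers only: both figures are assumptions for the sketch.
mu = 1e-9          # assumed point mutations per site per cell division
divisions = 1e12   # assumed cell divisions over the relevant period

def p_at_least_once(p_per_division, n):
    """Probability that an event with per-division probability p occurs
    at least once in n divisions: 1 - (1 - p)^n, computed stably for
    very small p via log1p/expm1."""
    return -math.expm1(n * math.log1p(-p_per_division))

for k in (1, 2, 3):   # number of co-dependent mutations required at once
    print(k, p_at_least_once(mu ** k, divisions))
```

Under these assumptions a single required mutation is essentially certain to appear, two together are rare, and three together are vanishingly unlikely, which is the shape of the argument being made, whatever one thinks of its conclusion.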

      • Sensible and latent fluxes don’t depend on incoming wavelengths. They occur equally under all lighting conditions, and only depend on the water and air temperatures and wind speed.

      • David, the ability of an organism to colonize a new niche depends on the cost/benefit of what the adaptation causes it to have in its original niche.
        The human blood stream is the only niche for P. falciparum. At the moment the largest pool of humans are wild-type and the maximum HbAS population is about 30%. P. falciparum survives but does not thrive in HbAS individuals. If P. falciparum were to evolve to thrive in individuals with HbAS, but in doing so lose against the WT P. falciparum in WT humans, their population density would fall, as would their ability to infect.
        It is a swings-and-roundabouts situation. Evolving to fill a smaller niche, without an island effect, is difficult.

      • JimD, have you got a reference for that? I would truly like to know if this experiment has been performed in pure water and in sea water.

        The thing is that the penetration depth of photons in water is wavelength-dependent, so that IR will be absorbed at the very surface and blue light deep down.

      • DocMartyn,

        Do you mean that someone would have done an experiment on the dependence of evaporation and convective heat transfer on the wavelength of incoming radiation?

        I’m pretty sure nobody has done such an experiment with any kind of water. Some things are so obvious to every working physicist that they would certainly consider doing experiments to confirm those expectations a total waste of effort.

        It’s good to maintain critical skepticism but there are also limits on that.

        What people have measured is the penetration depth of various wavelengths in water. The results show that all wavelengths penetrate far enough (thousands of molecular distances or more on average) into water to guarantee that they release essentially all their energy as heat. Thus the actual surface does not get any information about the wavelength of the radiation that has been heating the water; it just gets warmed from below and releases that energy either by releasing water molecules (which involves latent heat) or through collisions with other molecules of the atmosphere. These processes depend on local conditions at the skin, including the temperature and moisture gradient of the atmosphere very near the surface, not on the surrounding radiation.

      • Pekka, your tone was uncalled for and your ‘everyone knows’ explanation shows you do not have any understanding of your trade.
        A packet of light at 400 nm will travel more than 9 orders of magnitude farther through water than a packet at 3,000 nm. Half of a packet of light at 3,000 nm will be absorbed within the first micrometer of the water.
        The possibility that energy entering the system at 1 micrometer behaves differently from energy entering at 1 meter is quite reasonable.
        If you actually used your imagination and did experiments, you would not be turning into the dumb-bunny of physics.
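The order-of-magnitude contrast DocMartyn describes can be checked with a Beer-Lambert sketch; the absorption coefficients below are rough, round-number values assumed for pure water, not measurements.

```python
import math

# Beer-Lambert: intensity falls as I(z) = I0 * exp(-alpha * z), so the
# depth at which half the incident light has been absorbed is ln(2)/alpha.
def half_absorption_depth_m(alpha_per_m: float) -> float:
    return math.log(2) / alpha_per_m

# Assumed orders of magnitude for pure water:
alpha_blue = 1e-2  # 1/m at ~400 nm (blue penetrates tens of metres)
alpha_ir = 1e6     # 1/m at ~3000 nm (IR absorbed within a micrometre)

d_blue = half_absorption_depth_m(alpha_blue)  # ~70 m
d_ir = half_absorption_depth_m(alpha_ir)      # ~0.7 micrometre
print(f"penetration ratio ~ {d_blue / d_ir:.0e}")
```

With these assumed coefficients the ratio comes out near 10^8; the exact figure depends on the precise wavelengths chosen and on dissolved material in the water.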

      • I was not referring to everybody but to every practicing physicist. I seriously do believe that my statement is valid. For a physicist the situation is fully clear, and I did explain why.

        When the ratio of evaporation and sensible heat transfer from surface is controlled by the first few nanometers it does not matter what happens at the depth of 1µm, except that the surface temperature may change a little. It’s really irrelevant whether the radiation penetrates 1 µm, 1 m, or 1 km before being absorbed.

        There’s no direct link between absorbing radiation and evaporation (or sensible heat transfer to air). The only link goes through temperature.

      • So Pekka, you believe that illuminating the surface of the sea with an equal amount of energy, composed of photons at 400 nm or at 3,000 nm, will show no differences.
        You state this even though you know that the sea surface comprises waves, foam, and airborne droplets ranging from 1 to 150 microns in diameter.
        You state this even though the physics of water evaporation from brine pools is complex and unexplained, and phenomenological models are still used in commercial salt pans, as physics-based investigations have failed.
        You also know that water vapor has a very similar absorbance spectrum to liquid water, but water droplets do not*, and that at the sea/air interface there exists a multitude of brine droplets in the 1-150 micron diameter range**.
        You are aware that brine droplets will absorb more heat from an energy packet delivered at 3,000 nm than they will from a similarly sized packet at 400 nm.
        I believe it possible that if you illuminate a model sea/air interface, with light of 400 nm and of 3,000 nm, one will have a difference in the rate at which the bulk brine heats and the rate water evaporates.
        You, on the other hand, not only don’t think it possible, you don’t believe it worth the time of a physicist to investigate the possibility that such a phenomenon could exist.
        You display all the arrogance of an Aristotelian, and not a Popperian.

        *http://www.thermopedia.com/content/145/

        **http://onlinelibrary.wiley.com/doi/10.1111/j.2153-3490.1954.tb01085.x/pdf

      • Apparently no one save Doc Martyn has any clue about evaporation into the atmosphere, which is covered fairly well in Brutsaert’s text by that name.

      • DocMartyn, you can look at textbooks in meteorology, like Judith’s, or in engineering, to see that standard evaporation formulas depend only on the wind speed, water and air temperatures, and air humidity. These formulas are derived from observations, and those are the controlling variables, not the amount of light around.
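A minimal sketch of the kind of bulk formula described above, assuming a Magnus-type saturation curve and an illustrative transfer coefficient (neither taken from any particular textbook): the flux depends on wind speed, temperatures, and humidity, with no radiation term anywhere.

```python
import math

def saturation_vapor_pressure_hpa(T_c: float) -> float:
    """Magnus-type approximation for saturation vapor pressure over water."""
    return 6.112 * math.exp(17.62 * T_c / (243.12 + T_c))

def evaporation_flux(wind_ms: float, T_water_c: float, T_air_c: float,
                     rh_air: float, c_e: float = 1.3e-3) -> float:
    """Bulk evaporation estimate (arbitrary units); c_e is an assumed
    transfer coefficient. Driven by wind and the humidity deficit."""
    e_surface = saturation_vapor_pressure_hpa(T_water_c)
    e_air = rh_air * saturation_vapor_pressure_hpa(T_air_c)
    return c_e * wind_ms * max(e_surface - e_air, 0.0)

# The same surface temperature, wind, and humidity give the same flux,
# regardless of how the water reached that temperature:
day = evaporation_flux(5.0, 20.0, 18.0, 0.7)
night = evaporation_flux(5.0, 20.0, 18.0, 0.7)
print(day == night)  # True
```

Heating the water by any means to the same surface temperature gives the same computed flux, which is the point being argued: the wavelength of the heating radiation never enters the formula.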

      • A truly scholarly introduction to evaporation–one of great historical value–is provided by: http://authors.library.caltech.edu/2467/1/BOWpr26b.pdf

      • DocMartyn,

        The only important property of water that affects the evaporation and the transfer of sensible heat is the skin temperature. The properties of the atmosphere (temperature, moisture, winds) have their influence, as does the roughness of the surface (calm, waves, droplets, ..).

        What does not matter is the type of radiation that has resulted in that particular temperature. The radiation does affect the skin temperature, but nothing else relevant to this discussion.

        The skin is cooled by evaporation and transfer of sensible heat; it’s warmed by heat flux from below. The net effect of IR is to cool layers slightly below the skin (first few µm); solar SW penetrates deeper and warms mainly the uppermost few meters, and to a lesser degree also deeper water. The heat deposited by solar SW warms the skin from below. At locations and times without solar SW heating, the skin is still colder than the layers just below. In those cases the heat has been stored at another time or transported from other parts of the ocean.

      • John S.,

        It’s indeed typical that scientific publications that discuss these kinds of basic issues are very old. Such old publications have often been significant steps forward, but seldom fully correct in view of present understanding.

        The paper of Bowen certainly agrees with what JimD and I have been trying to explain: the ratio of evaporation to conduction is determined by temperatures and properties of the atmosphere (pressure and humidity).

      • Pekka:
        Certainly there have been advances in understanding evaporation since Bowen’s seminal work, and empirical adjustments in the calculation of the Bowen ratio. But what both you and Jim D clearly miss is that the temperature of the water is a function of the solar heating; thus the sensible and latent fluxes have an unmistakable diurnal cycle. Doc Martyn addresses the fundamental question of solar heating at various depths. Textbook physics has little to offer on that subject.

      • Just check what DocMartyn asked originally. The answer to that has been given:

        1. IR does not warm the water; it cools it.
        2. Everything related to absorption and emission goes through sensible heat, also in the case of IR.

        The albedo for IR has not been discussed. Its significance is, however, influenced strongly by point 1. The albedo for IR is very close to zero for most surfaces, including water and snow. For dry sand and some types of stone it’s a bit higher, but still small.

      • Pekka:
        Here’s Doc Martyn: “I believe it possible that if you illuminate a model sea/air interface, with light of 400 nm and of 3,000 nm, one will have a difference in the rate at which the bulk brine heats and the rate water evaporates.”
        Here’s JimD: “Sensible and latent fluxes don’t depend on incoming wavelengths. They occur equally under all lighting conditions, and only depend on the water and air temperatures and wind speed.”
        Here’s you (prior to reading Bowen): “There’s no direct link from absorbing radiation and evaporation (or sensible heat transfer to air). The only link goes through temperature.”
        In light of the fact that it requires high-entropy insolation to turn liquid water into vapor, which accounts for minimal evaporation at night, I find your terpsichorean skills very impressive.

      • John S, it is not the light photons that cause evaporation. They are absorbed causing warming of the water, independently of evaporation occurring or not. Evaporation occurs if the air above is not saturated already. Evaporation can occur at night. This is why DocM’s original question made no sense. There is no connection between the photon wavelengths absorbed and sensible and latent heating that occurs. Photon absorption causes heating first, and for a given heating amount by any wavelength or other means, the evaporation would respond to that heating amount.

      • Jim D:
        Instead of examining what happens in situ, you keep reverting to well-known textbook material as if it explains evaporation from the oceans completely. Neither Clausius-Clapeyron nor the Bowen ratio in its various formulations can do that. The essence of Doc’s question, to my mind, concerns the RATE of heating at various depths and the consequent effect upon modes of heat transfer, not their theoretical possibility. This is precisely where the stark difference between the effect of SW insolation and IR backradiation becomes evident. The former is high-entropy, diurnally varying and is absorbed within tens of meters; it heats the uppermost layer. The latter is low-entropy, quite steady and is absorbed within micrometers; it has virtually no effect upon temperature beyond the evaporating surface skin. The wild card, of course, is convection and turbulent wind-mixing. Nevertheless, the rates of both sensible and latent heat transfer show strong diurnal variation, with daytime rates typically far exceeding those at night.

      • John,

        The first question of DocMartyn was very clear and left no questions of interpretation. The later questions were not as clear taken by themselves, but joining them with the first question, it’s clear that he was not discussing how wavelength affects warming at various depths, but the possibility that the wavelength affects the ratio of evaporation to sensible heat transfer in some other way.

        On this question the answer is known reliably, not with absolute accuracy but with an accuracy that’s sufficient for all practical applications. That answer is: the wavelength of the incoming radiation, within the range from UV to LWIR, has no significant effect other than through warming.

        Trust in that answer is based on textbook physics, not on direct answers from textbooks, but on the theory that the textbooks describe. All that physics is well verified by innumerable confirming experiments, and perhaps even more importantly by the innumerable successful applications of that physics in engineering.

        As I wrote in one of the above comments:

        It’s good to maintain critical skepticism but there are also limits on that.

        Errors are made even when the best-known parts of physics are applied; some details may remain neglected for a long time until someone brings them up. I cannot exclude with absolute certainty the possibility of such an error in what I write above, but I can tell that the likelihood is small enough to ignore.

        I repeat, in a little more detail, one of my above comments:

        The absorptivity and emissivity of radiation are equal for any single wavelength. Radiation passes through water equally well in both directions. For these reasons the depth distribution of the source of IR emission from water is the same as that of IR absorption. The skin of water and small droplets in air close to the surface have a temperature that’s close to that of the air in their immediate neighborhood, because that air is essentially saturated with moisture. Under typical conditions the temperature and moisture of the air drop as we move a little higher. At some wavelengths IR penetrates only a few meters in the air, but over most of the spectrum much further.

        Water emits with a very high emissivity over the whole IR spectrum, but the lowest and warmest atmosphere dominates the downwelling radiation over only a small part of it. For this reason the emission has the power of the Stefan-Boltzmann law at the temperature of the skin, while the downwelling radiation is always less intense.

        All the above tells us that the net effect of IR on water is cooling at every depth, not only in total but at every depth. IR does not heat small brine droplets or any other liquid present; it has a modest cooling influence in all cases. There are three cooling mechanisms of comparable strength: net IR, convection, and evaporation. Their relative strengths depend mainly on temperature and the state of the atmosphere near the surface. For heating of the skin we have only convective mixing of ocean water and conduction, which is important over very small distances near the skin. Solar SW contributes indirectly as the source of the heat brought to the skin by convective mixing and conduction.
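The Stefan-Boltzmann point above can be put in rough numbers; the skin temperature and the effective sky temperature below are illustrative assumptions, not measured values.

```python
# Net IR cooling of the water skin: emission at the skin temperature
# always exceeds the downwelling IR, which corresponds to a colder
# effective sky, so the net IR flux cools the water.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def emitted_flux(T_kelvin: float, emissivity: float = 1.0) -> float:
    return emissivity * SIGMA * T_kelvin ** 4

up = emitted_flux(290.0)    # skin at ~17 C: about 400 W/m^2 upward
down = emitted_flux(275.0)  # assumed effective sky temperature
print(round(up - down, 1))  # net IR cooling of the skin, W/m^2
```

Whatever effective sky temperature one assumes, as long as it is below the skin temperature the difference is positive, i.e. net IR cools the surface, which is the claim being made.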

      • Pekka:
        Heat transfer studies over oceans repeatedly show that, over climatic time-scales, the effect of evaporation exceeds all other transfer mechanisms combined. On a diurnal basis, high-entropy insolation is what drives that massive effect. Your understanding of what happens at the air-sea interface is not shared by oceanographers who specialize in such interactions. I simply cannot take the time to write a mini-essay or track down the references that may be available on the web. You might benefit, however, by acquainting yourself with the Knudsen layer.

      • John S., it doesn’t help to conflate absorption and evaporation. They are independent processes, the former governed by radiative transfer in the ocean, and the latter governed by thermodynamical phase equilibrium at the air-water interface. Experiments could easily be designed to show their independence, by heating water up in various ways to the same temperature and then seeing that the evaporation at that temperature is the same anyway.

      • Chief Hydrologist

        ‘Evaporation, or more precisely its energy equivalent, the latent heat flux, is the main process that compensates for this imbalance between surface and atmosphere, since the latent heat dominates the convective energy flux over sensible heating. The radiative energy surplus at the surface is thus mainly consumed by evaporation and moist convection and subsequently released in the atmosphere through condensation. This implies that any alterations in the available radiative energy will induce changes in the water fluxes. Our focus in this editorial is therefore on the surface radiation balance as the principal driver of the global hydrological cycle.’

        http://jdsweb.jinr.ru/record/51856/files/Understanding%2520Non-equilibrium%2520Thermodynamics.pdf

      • In the past here at Climate Etc Pekka has been involved in discussions of the Knudsen layer.

      • CH, exactly, John S. needs to understand that radiation causes the surface temperature to be what it is and that in turn causes evaporation. He has a missing link, which is the temperature.

      • Chief Hydrologist

        Heat transfer studies was what he said Jim.

      • CH, then we all agree except DocM who wants to conflate radiation into this. I think the subject gradually changed to remove the radiation part, which was the right direction. No one argues whether latent heat flux is important over the ocean.

  11. David Springer

    Research is one thing. What to do about it is another. Engineers eschew uncertainty before bending metal and avoid it like the plague for one-off or otherwise untested designs where there is substantial blood and treasure at stake.

    I suggest you consider the history of manned space flight for an example of how engineers treat uncertainty in the science when the stakes are large.

    • Andrew Russell

      Steve McIntyre has asked for an “engineering quality” analysis of the CAGW hypothesis from the IPCC and other warmists. He has never received any response. They know they cannot provide one.

    • This is how Boeing tested the wing strength of their 777.

      The wing outperformed the computer model; breakage occurred at 154% of maximum load, not at the 150% predicted.
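The wing-test numbers can be read as a safety-margin calculation, with loads normalized to the maximum expected in-service load per the figures quoted above:

```python
# Certification requires the wing to survive 150% of the maximum
# expected in-service ("limit") load; the 777 test article broke at 154%.
limit_load = 1.00          # normalized maximum in-service load
required_ultimate = 1.50   # certification factor applied to limit load
observed_failure = 1.54    # load at which breakage actually occurred

margin_over_requirement = observed_failure / required_ultimate - 1.0
print(f"{margin_over_requirement:.1%} beyond the required ultimate load")
```

This is the engineering treatment of uncertainty under discussion: the unknowns in materials and modeling are absorbed by a fixed overdesign factor, then validated by destructive test.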

      • David Springer

        What amount of overloading was in the design spec? Designing for 150% of max wing loading and getting 154% after bending metal is incredibly good. Of course it’s not exactly Boeing’s first rodeo, is it? They were one of Dell’s largest commercial customers, by the way, along with Ford. :-)

      • David, Boeing have some experience in over-engineering

        124406 was rebuilt and returned to action by the 50th Service Squadron.

      • Steven Mosher

        Glad you brought that up doc.

        Yes when designing planes we handled uncertainty by overdesign.

        Watching a plane go through destructive testing was quite emotional.

        Springer doesn’t know, but there are huge uncertainties and unknowns when it comes to designing a plane.

      • Nice video, thanks for posting.
        *****
        Question for Steven Mosher:

        Please can you expand on your point about

        “huge uncertainties and unknowns when it comes to designing a plane”.

        What are (say) the top five uncertainties and unknowns, along with their approximate ranges? Thanks.

      • Huge uncertainties in aircraft design does not sit well with potential passengers or with test pilots either.

      • Rob Starkey

        Mosher writes- ” there are huge uncertainties and unknowns when it comes to designing a plane.”

        Steve

        You overstate the uncertainty when it comes to airplane design. We are uncertain of specifically where within a range something will fail, but sufficient margin is built into the design to ensure that if a failure does occur it will not be catastrophic to those on the plane.
        The “ultimate testing” of wing strength by a mechanical test is an example of how the models used in wing design are validated. If we found that the wing had failed earlier than the model predicted was possible, then a vast amount of additional testing would be required to find the root cause of the discrepancy before the plane would ever fly. Was the model wrong, was the test set-up wrong, was the wing built wrong, or was the wing design wrong?
        All that would be done to ensure that the models and observed conditions were in sync. In climate science, do we ensure that same process is followed? NO. Are the models in sync with observed conditions within predicted margins of error? NO.
        Do people continue to advocate the use of models that have demonstrated that they do not match observed conditions? Yes. You never have answered why you think that makes any sense at all. Please do!

    • In engineering terms, earth’s habitability in regions has a projected failure mode and we are pushing the climate towards that. What would an engineer suggest? Ignore projections due to uncertainty and keep going? Take precautions in case the projections are true?

      • Further to the analogy, there are managers who look at costs and take the cheaper route regardless of safety, ignoring the engineer’s recommendations.

      • Recommended movie on this subject: The China Syndrome.

        That is one of the most stupid comments that has ever appeared on this website. Prior to the industrial revolution the Inuit lived in the polar regions and the Australian Aboriginals lived in the great Australian deserts.
        Earth has a capital BTW.

      • David Springer

        Unless you want to go to war with China, India, and the other billions who live in developing countries, I suggest making sure we (by we I mean the US) have the strongest possible economy in 50 years’ time so we can better adapt to the worst-case scenario and come out whole.

        Until you figure out there’s not a goddamn thing you can do to slow greenhouse gas emissions enough to matter, it’s no use even discussing with you what’s the best course of action. Any action must be weighed in relation to how it affects our economy in 50 years’ time. Making energy more expensive today reduces economic output in the present, and the economic retardation is compounded as time marches on, because that’s just the way future monetary discounting works.

        So the first bit of advice I have for western warmists is stop being such phucking naive dumbasses.

      • Jim D,

        Given that “climate” is the average of “weather”, how can you “push” the average of anything? You obviously can’t calculate the average of events yet to occur. You can guess or assume, but so can my 10 year old granddaughter.

        In answer to your question, an engineer will ask to see your experimental data, results of tests and so on, before giving an opinion. You would also need to pay a fairly hefty fee for such advice.

        As you have none of the things an engineer needs, and are not prepared to pay the engineer anything at all, it is highly likely that an engineer would suggest that you stop wasting his time.

        Feel free to waste as much of my time as you can. I find your comments quite amusing.

        Live well and prosper,

        Mike Flynn.

      • David Springer

        “Unless you want to go to war with China,…”

        Ha! I am beginning to see nuclear war with China or Russia as a much greater risk than anything associated with climate change. The probability of such a war in any given year appears small, but when you consider the multiple decades over which a climate change disaster needs to unfold, and the really severe consequences of a major nuclear war…well, who gives a damn about the warming.

        Not intending to pivot here, just an observation/thought.

        Regards

      • David Springer

        Mutual assured destruction. China and Russia are sane. It’s pissant petty inbred dictators (oh hey, Kim Dumb Dong the 35th, or whoever is running N. Korea at the moment) with small arsenals and nothing to lose that we need to worry about.

      • In this analogy to engineering, scientists are the engineers, being the ones who understand the natural system components in the way engineers understand their materials and components. The scientists are saying nature can’t withstand any more of this stress and maintain itself in any form like the present, so there are various breaking points being approached. That is the context for the question about action under uncertainty, which is similar to that faced by engineers all the time. The action could be something like shoring up the weaker parts against failure. Low-lying areas with large populations come to mind, but we also need to consider weaknesses in food, water, health and energy systems brought about by the large changes in nature.

      • Mike Flynn, your engineer knows what load his bridge was built for, and will warn if it is being approached.

      • Jim D,

        You were talking about the Earth’s habitability, and a projected failure mode for same.

        Nothing about bridges at all. Spare me the irrelevant “analogies” so beloved of the fact free Climate Cultists.

        Engineers like facts, and useful data, and concrete examples. They produce useful things (most of the time, anyway!)

        Climate Cultists have produced nothing of any use whatsoever. Not one thing. None of the “models” they so amateurishly concoct even agree with each other. A thousand monkeys banging away on a thousand keyboards are just as likely to come up with something useful. Facts, Jim D. Facts.

        Provide a few.

        Live well and prosper,

        Mike Flynn.

      • Mike Flynn, likewise scientists have a pretty good idea what it takes to melt icecaps and raise sea levels or how easy it is to raise global temperatures to levels last seen in deep paleoclimate, and they see that is what is being done to the earth system now. Sound alarms or keep quiet?

      • Jim D,

        Well, then, if “scientists have a pretty good idea . . “, we should immediately put the “scientists” in charge of the world’s population and all the world’s resources.

        I am a little unsure as to which “scientists” you are nominating for this task.

        Is part of their task to first ensure world peace, before imposing their will on the world’s population?

        Will I be allowed to refrain from participation in this grand scheme?

        Can you provide a list of the “scientists” in question?

        Live well and prosper,

        Mike Flynn.

      • OstrichBoy doesn’t understand that engineers only work on materials that they own or have accountability or authority over. In human terms, the earth’s climate is owned collectively by the nations of the world, and so engineering will only apply when considered at that scale.

        So JimD has a valid point that only scientists are currently involved in the study of climate and how it will evolve.

        If at some point geoengineering is required then I assume engineers will get involved.

        The same behavior occurs with petroleum engineering. I find it very odd, as it should be part of their lifeblood, but a petroleum engineer is the last guy that you would ask to predict world fossil fuel supplies. Textbooks on PE do not discuss this aspect at all. They are taught to only work on problems that they have accountability over, so that amounts to the ground under their employer’s “ownership”.

      • Mao’s people believe their engineers can jack the world by 2100.

        http://freebeacon.com/china-military-preparing-for-peoples-war-in-cyberspace-space/

        An Army of People have shown us the new way forward. It is not a gas.

      • Mike Flynn, if you have questions about nature you go to the scientists who have studied it. As I said, they know through many lines of reasoning what it takes to change the climate by degrees, mainly because their ideas work on past changes, and are consistent with the basic physics governing earth’s energy flow and fluid motions. Feel free to consult others, but check how much they understand about past climate first as a minimum bar.

    • Actually these thoughts are roughly in line with recent ‘studies’ and not mwg ‘boogey-man driven’ or a desire for Armageddon. MAD is not a lock. It is just that small-finite-probability, huge-consequences thing for a big-boys conflict over time. But it is all numbers, like GCMs. For sure NK or some other pissant could be a trigger, or even inflict metro-size damage without being a trigger. Leave you in peace now…

      Regards

  12. Judith Curry writes,
    “The most worrisome problem to me is one that I have not hitherto seen discussed: the impact of overly emphatic and overconfident statements on the science itself. This does not motivate scientists to challenge the consensus on detection and attribution, particularly when they can expect to be called a ‘denier’ for their efforts.”

    And an extreme result would be for some Scientists to leave the field. A less extreme result would be for them to work on ‘safer’ related issues that are not as likely to generate controversy, to distance themselves a bit from the controversy. If the Emphatics are gaining and your base of Scientists is being pushed a bit away, where does that lead? Emphatics may attract other Emphatics. So those with more of that kind of skill and less Scientific skill may join the Consensus.
    I’ve become familiar with a phrase as I have an interest in the health of a 1000 acre watershed I live in. The phrase is, ‘Best Practice’. “A best practice is a method or technique that has consistently shown results superior to those achieved with other means.” Best Practice Science?

  13. Charles Jordan writes:

    “…that I could have as many copies of “An Inconvenient Truth” as I wanted to convince my students that dangerous global warming was occurring. I wrote back asking instead for a number of copies of Feynman’s Lectures in Physics for my students…”

    Pretty good. Looking out for your students first.

  14. Right now, we guess each year and that means we are right as often as wrong. 70% means I am ahead!

    You must run 20 trials, and score only 10 successes in those 20, before you can exclude p = 0.70 from the 90% binomial confidence interval.

    So it will be two decades before the farmer can know whether he is really ahead. I suspect, however, there can be some unintended consequences. Will the farmer deal with his own uncertainty the same way he will deal with advice from an authority (with p(correct) = 0.7)? Should I plant all my fields now? Should I spread the planting in time? Should I plant the flats? Should I plant the slopes?

    When the farmer and the authority agree, there is no difference in the action. What happens when the farmer’s “gut” disagrees with the authority?
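The 10-in-20 figure above can be checked directly with the binomial distribution. This sketch computes the one-sided tail probability that a forecaster who is really right 70% of the time scores 10 or fewer hits in 20 years.

```python
from math import comb

def binom_cdf(k: int, n: int, p: float) -> float:
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k + 1))

# If the authority is truly correct with p = 0.70, how likely are
# 10 or fewer correct calls in 20 years?
tail = binom_cdf(10, 20, 0.70)
print(round(tail, 3))  # 0.048: below 0.05, so p = 0.70 just falls
                       # outside the 90% (two-sided) interval
```

So the farmer needs roughly two decades of records before a genuinely 70%-accurate authority can be statistically distinguished from a coin flip that merely got unlucky.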

  15. Judith Curry

    The cited articles are interesting, but one comes away with the feeling that one has just heard a very slick “sales pitch” for the so-called “consensus” position on “climate change”.

    The paragraph stating that models are good for making predictions made me think of Nassim Taleb’s The Black Swan, which argues just the opposite.

    The unwritten point is that there are just too many uncertainties in what makes our climate behave the way it does to make any meaningful statements on attribution of past climate change, let alone project what will happen in the future with any degree of confidence. There is no justification for the overconfidence expressed by IPCC.

    The current “pause” is a prime example. It was NOT “projected” by the models, which instead projected warming to continue at an accelerated rate as a result of unabated human GHG emissions.

    Your closing paragraph tells it all:

    So there is a positive feedback loop whereby overconfidence in statements about the consensus has become a self-fulfilling prophecy, at least in the short term. This is one reason why the ‘pause’ is so interesting, with recent observations not matching the expectations from climate model projections and some people projecting that pause could continue for several decades. How scientists react to this, and how partisans in the debate play this, will be interesting to watch. I suspect that there will be backlash against the overconfidence if the pause does continue.

    The facts on the ground appear to be unraveling any confidence that existed for the IPCC consensus position and its projections of CAGW.

    Sic transit gloria.

    Max

  16. Johnston’s snake bite cure was heavily promoted in the late 19th century but is now quoted as a joke. When the IPCC said that a rare gas in the atmosphere, carbon dioxide, less than 1%, could drastically alter the climate, the reaction of most people was disbelief. It could not be true. How could such a rare gas have so much power?

    That was the IPCC’s first mistake. They refused to address this paradox. And to this day the IPCC has refused to address exactly how a rare and otherwise innocent gas could have such a voracious appetite for heat. The doctor was saying this stuff will very likely kill you unless you take Johnston’s snake bite cure, or that is how the warning was interpreted.

    Why did it not occur to the IPCC that such a claim was bound to be disputed? Worse still, that it would be exploited by the politician Al Gore against his old rival George W?

    Why? Because the IPCC were themselves uncertain of their claim but decided like a poker player to bluff it out. This suited their corporate consensus culture.

  17. Berényi Péter

    For example, the IPCC says:

    “[M]ost of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations”.

    ‘Very likely’ in this case means a probability of more than 90 per cent.

    Well, if the IPCC means that (and it seems to be the case indeed), then it managed to say something devoid of meaning.

    One can’t talk about “probability” without a definite sample space and a proper sigma-algebra on it. If those are given, one can try to estimate the probability measure by running experiments and counting outcomes.

    However, it is not easy to see what the “experiment” is for the proposition above and what are the “possible outcomes”.

    To keep it simple let’s suppose one of the possible outcomes is A=”most of the observed increase in global average temperatures since the mid-20th century is due to the observed increase in anthropogenic greenhouse gas concentrations”, the other one is its negation, B=~A. The sigma algebra is {{},{A},{B},{A,B}} and the probability measure is P({})=0, P({A})=p, P({B})=q, P({A,B})=1 with p+q=1.

    With this the IPCC’s proposition translates to p > 0.9

    To verify this proposition, one obviously needs a well defined experiment with 2 possible outcomes, A and B, which can be performed arbitrarily many times, along with a decision procedure which settles the outcome for each experimental run.

    In principle it is easy. One only needs 2×N identical copies of the mid-20th century Earth system, half with greenhouse gas concentrations increasing as observed, the other half with those concentrations kept constant at their initial level, with no other differences whatsoever. The experiment consists of running the two kinds alongside one another N times and measuring the rate of average temperature increase from 1950 to 2007 for each pair. If this quantity is at least twice as large in the former case as in the latter, we have met event A, otherwise event B. We only have to count hits & misses, a straightforward task, even for beginners.

    Number of outcomes A divided by N is an estimate of p. As N increases, this number converges (stochastically) to the true probability. However, the rate of convergence is slow, the error term being proportional to the inverse of the square root of N. That means to decide if the proposition p > 0.9 was true or not, one needs a fairly large number of experiments, especially if the actual ratio measured is never substantially larger (or smaller) than ninety percent. If, for example, p happens to be 0.91, one needs some ten thousand experiments to be able to say p is larger than 0.9 with any degree of certainty. But the point is one can use standard statistical methods to decide if N is already large enough or more experiments are needed.
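    The sample-size arithmetic above can be checked with a short simulation. This is only a sketch: the hypothetical true probability of 0.91 and the 0.9 threshold come from the comment, while the two-sigma criterion and everything else are illustrative choices.

```python
import math
import random

def trials_needed(p_true, threshold, z=1.96):
    """Rough number of Bernoulli trials needed before the standard error
    of the estimated frequency, sqrt(p(1-p)/N), is small enough to
    separate p_true from the threshold at about the z-sigma level."""
    margin = abs(p_true - threshold)
    return math.ceil(p_true * (1 - p_true) * (z / margin) ** 2)

def estimate_p(p_true, n, seed=42):
    """Frequentist estimate of p: count occurrences of outcome A
    in n simulated experimental runs."""
    rng = random.Random(seed)
    hits = sum(rng.random() < p_true for _ in range(n))
    return hits / n

n = trials_needed(0.91, 0.90)       # hypothetical true p of 0.91
print("trials needed:", n)          # a few thousand at the ~2-sigma level
print("estimated p:", estimate_p(0.91, n))
```

    With a stricter three-sigma criterion the requirement grows to several thousand more trials, approaching the comment’s “ten thousand” figure; either way, the slow inverse-square-root convergence is what drives the large N.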

    So far so good. There is only a minor issue with this approach, namely the utter lack of Earth replicas.

    Fortunately climate scientists came up with a clever workaround to this outrageous obstacle, the modelling paradigm. They are constructing virtual Earth systems and they are running the experiment described above on those computational instances.

    Unfortunately with this very move they have left the realm of science behind as well and ventured into the marshland of pseudoscience.

    That is so, because running experiments on virtual instances is NOT the same as running them on actual ones. In this case one needs an additional step. One has to make sure the model behaves exactly like the real one. And that’s tough.

    Computational models can never be validated against a single run of a unique physical entity. They may be useful in designing cars, but only because we have wind tunnels to check the computational aerodynamic model as many times as we wish (and the budget allows). This option is missing in the case of climate.

    There is no reason whatsoever to believe the outcome of virtual experiments would be identical to those (impossible) experiments performed on the real one. What is more, there is no proper procedure to estimate the difference.

    In fact we do know for sure models are unfit for this purpose, because estimating p using different models, or even the same ones with different parameter settings and/or boundary conditions, yields hugely different numerical results for p. If they are not identical in this respect, there can be no more than one which behaves as the Earth system does, possibly none. However, there is no decision procedure whatsoever which would tell us with any degree of certainty which is the correct one among them, if any.

    Using “ensembles”, as it is the case, is no remedy, because averaging over a universe of unknown (and unknowable) statistical properties is meaningless.

    • Berenyi: there is a way to validate such models. You have to isolate every single process in the climate model and validate it against an experiment set up for the purpose, or against the process fortuitously provided by Nature. Even when you have done that, because of truly random variables in the climate model, all you will be able to say after many, many runs of the model is that its outputs belong to the same population as the real world’s. That is the final test.

    • Steven Mosher

      • Nice video. Thanks.

        Now how does one sell/engage Bayesian (or Dempster-Shafer, or …) approaches to decision-making as a viable way forward within the debate when so many strongly biased voices are around? Details can be very nuanced and some voices impatient. Perhaps some of the most ‘difficult’ people to deal with are ingrained empirical frequentists or classicists trained in the sciences. This is just a concern I have, rooted in limited experience years ago. In some ways I wonder whether the task would have been easier in the golden age (1980s). [I still think it would be a healthy exercise for serious participants to look at decision making in medicine.]

        One thing I would definitely do, would be getting some DA professional involved–not writing ‘white-papers’, but instead getting structure and data out of the scientists and politicos. We’re not on the Ted Mack Amateur Hour.

        Maybe I read a lot into your video morsel, but that’s what went thru my mind–can’t help that. Keep throwing it out. Some will stick.

        It’s late here…

      • Berényi Péter

        Not this “subjective probability” BS again, please.

        “What probabilities represent, are not features of the external world, but rather features of your personal subjective mental states.”

        With that you are proposing climate science does not belong to natural sciences, but is a behavioral science. For the only way you can assign numbers to mental states is by observing behavior.

        Now, Mosher, would you kindly tell us whose behavior is to be observed and in what specific experimental setup to verify the IPCC’s 90%?

      • Steven Mosher

        “With that you are proposing climate science does not belong to natural sciences, but is a behavioral science. For the only way you can assign numbers to mental states is by observing behavior.”

        haha. another false claim of certainty. There are many ways to assign numbers to “mental states”, assuming they exist. Think about it.

        The point is you have a notion about how probabilities are assigned and determined. That notion, that theory of probability, is not in fact certain, although you pretend it is.

      • Berényi Péter

        @Mosher, if you were actually saying something instead of prevaricating, that would be informative.

        You could, for example, enumerate several ways of identifying sigma-algebras of mental states and assigning a probability measure to them which do not refer to outward behavior. You should be able to do that, for you claim there are many ways to assign numbers to “mental states”.

        Once you have done that, supply a verification method for each, please.

    • I’ve always thought that the study linked below was very informative. IMO it is precisely the visual presentation of uncertainty among experts estimating key parameters (here for a HLW repository) having different backgrounds and preconceptions that makes this sort of approach valuable. Whether or not you like what comes out, you can’t run away from the underlying problems such methodologies highlight, e.g., figure 3:

      http://s1285.photobucket.com/user/mwgrant1/media/InteriorEffectivePorosity_zps82b7e168.png.html

      And subjective evaluations are very, very difficult to eliminate. (We can save hierarchies of ignorance for another day :O) )
      ————————————————–
      {Edited report abstract and link:}
      Probability encoding of hydrologic parameters for basalt. Elicitation of expert opinions from a panel of five consulting hydrologists
      Description/Abstract

      The Columbia River basalts underlying the Hanford Site in Washington State are being considered as a possible location for a geologic repository for high-level nuclear waste. To investigate the feasibility of a repository at this site, the hydrologic parameters of the site must be evaluated. Among hydrologic parameters of particular interest are …
      Site-specific data for these hydrologic parameters are currently inadequate for the purpose of preliminary assessment of candidate repository performance. To obtain credible, auditable, and independently derived estimates of the specified hydrologic parameters, a panel of five nationally recognized hydrologists was assembled. Their expert judgments were quantified during two rounds of Delphi process by means of a probability encoding method developed to estimate the probability distributions of the selected hydrologic variables. The results indicate significant differences of expert opinion for cumulative probabilities of less than 10% and greater than 90%, but relatively close agreement in the middle ranges of values. The principal causes of the diversity of opinion are believed to be the lack of site-specific data and the absence of a single, widely accepted, conceptual or theoretical basis for analyzing these variables.

      http://www.osti.gov/energycitations/product.biblio.jsp?osti_id=5563618

    • Berényi Péter,

      A subjectivist friend once told me:

      > If anyone could suggest any sort of non-trivial objective interpretation of the question “what is the probability of rain tomorrow”, then I’d be all ears

      Do you have an objective interpretation for the probability that it will rain tomorrow?

      Many thanks!

      w

  18. Steven Mosher

    I find it ironic that in a thread about uncertainty so many people make claims of certitude, especially claims about having an understanding of uncertainty. I won’t go through and point them out. exercise left to the student

    • “I won’t go though and point them out. ”

      Yes, it seems somehow appropriate to leave have uncertainty in naming names.

      • leave/how … pffft uncertainty in neuron access

      • Steven Mosher

        hehe, glad you got that

      • Creating fake certainty at the same time as promoting uncertainty is a hallmark of the political framing technique of FUD (Fear, Uncertainty, and Doubt).

        Notice that Uncertainty is distinct from Doubt, as Doubt is the intended goal.

        I used to work at IBM TJ Watson, but didn’t realize until just now the story behind the term FUD:

        Wikipedia: “FUD was first defined with its specific current meaning by Gene Amdahl the same year, 1975, after he left IBM to found his own company, Amdahl Corp.: ‘FUD is the fear, uncertainty, and doubt that IBM sales people instill in the minds of potential customers who might be considering Amdahl products.’”

        So one generates fake certainty in suggesting that Tsonis is correct in his theories behind climate shifts (see Chief Hydro), while manufacturing doubt in other theories (see SpringyBoy).

        Where FUD doesn’t quite always work is the Fear part. Apart from those that equate enviros and greens with “Scary Commies”, the 3% denialists are more likely to substitute a What-Me-Worry framing for the fear component.

        No worries if you buy an IBM product. Be fearful if “business as usual” is disrupted. Going to alternative energies is loaded with uncertainties, and there is massive doubt over whether BAU can then continue.

        One thing for certain, fossil fuels are finite in supply and are non-renewable on the human time scale. Alternatives will have to take their place, regardless of the uncertainty in climate.

        How do you like them apples?

      • Web, from what you have been writing here we are all aware of your view that the world will soon be running out of apples. The ones for sale will all have some worm problems. The methane in the water used to grow them is what has turned them the strange green color. Tell us now what is a fair price that we should have to pay? An apple a day…

    • you can be certain of uncertainty. For example we can propose that the irreducibility of the uncertainty of climate sensitivity in the last 3 decades is evidence of the irreducibility of climate sensitivity. Hence climate sensitivity is random.

      • Steven Mosher

        I’ll add one to my list of people who are overconfident of what they think they know.

  19. > Nearly everyone has some informal understanding of the notion of probability.

    There is lots of research on this. For instance:

    Piaget and Inhelder (1951/1975) concluded that, by the age of 12, most children can reason probabilistically about a variety of random generating devices. […] Nisbett, Krantz, Jepson, and Kunda (1983) argued that people with little formal training in probability tend to analyze a situation probabilistically when (a) the sample space is easily recognizable (e.g., when the event is repeatable and outcomes are symmetric) and (b) the role of chance is salient (e.g., in coin flipping and urn drawing). On the other hand, even people who have had considerable training in the application of probabilistic models can be led to the unconscious application of natural assessments for situations that they know call for a probabilistic analysis (Tversky & Kahneman, 1971).

    http://www.srri.umass.edu/sites/srri/files/Konold1989.pdf

    Even if we grant that probabilistic reasoning is easy, decision under uncertainty might still be hard. I see no other reason why most of the chess players I know turned to Poker.

    • Willard,

      “The Scientific Reasoning Research Institute”?

      Please tell me I’m dreaming. Please.

      Live well and prosper,

      Mike Flynn.

    • While everyone has some intuitive understanding of probabilities, the intuition fails as soon as the events are so infrequent that the probability does not get properly reflected in personal experience.

      Understanding and building a correct intuitive view of very low probability high consequence risks is not natural for human thinking even assuming that the quantitative risk analyses are accepted as correct and trustworthy. People have the tendency of worrying much more about large catastrophes than their quantitative likelihoods would warrant, and correspondingly less about the smaller scale accidents that they are much more likely to meet with equally serious consequences for themselves.

      Numerous studies of risk perception have shown that people’s reactions strongly contradict quantitative risk estimates. One consequence of that has been that it seems impossible to determine the level of risk aversion that individuals or a wider population have. Reaching an agreement on the related utility function would allow for quantitative comparison of risks of different natures. Lacking knowledge of the right utility functions, scientists have opted for functions whose only real advantage is that they make the calculations easier, while the results may be nonsense at worst, as evidenced also in some cases where climatic risks have been analyzed. (The case of Weitzman’s Dismal Theorem provides such an example, according to some other scientists.)
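      The dependence on the chosen utility function can be made concrete with a toy calculation. All the numbers below are invented for illustration, and CRRA is just one of the calculation-friendly utility functions of the kind mentioned above.

```python
import math

WEALTH = 100_000.0  # illustrative baseline wealth

def crra_utility(w, gamma=2.0):
    """Constant relative risk aversion utility; gamma > 0 is risk-averse,
    gamma = 0 is risk-neutral."""
    return math.log(w) if gamma == 1 else w ** (1 - gamma) / (1 - gamma)

def expected_utility(p_loss, loss, gamma=2.0):
    """Expected utility of facing a loss of `loss` with probability p_loss."""
    return (p_loss * crra_utility(WEALTH - loss, gamma)
            + (1 - p_loss) * crra_utility(WEALTH, gamma))

# Two risks with the SAME expected monetary loss of 1,000:
frequent_small = expected_utility(p_loss=0.10, loss=10_000)
rare_large = expected_utility(p_loss=1_000 / 90_000, loss=90_000)

# A risk-averse agent prefers the frequent small losses, so the rare
# catastrophe "weighs" more than its expected value alone suggests.
print(frequent_small > rare_large)   # True for gamma = 2
```

      Both risks have the same expected loss, yet the risk-averse agent ranks them differently; with a risk-neutral utility (gamma = 0) they are tied. The choice of utility function, not the expected value, drives the comparison, which is why the choice matters so much in the analyses described above.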

      • Thanks, Pekka. May I ask which studies on risk aversion you have in mind?

      • Nice post Pekka.

      • My main source is the book Christian Gollier: The Economics of Risk and Time. It was the most important source for the lecture course on risk management in energy markets and energy investments that I gave several years ago. (Another important source of a different nature was Dixit and Pindyck: Investment under Uncertainty).

        At that time I read also other related material, some of that discussed empirical studies of risk perception, but I don’t have any list of those texts any more. Being interested in these questions I have seen the difficulties mentioned in different connections from attitudes towards nuclear energy to investment theory.

      • Pissant Progressive

        I don’t know where to post this on this thread but i haven’t seen it noted yet. Since at least one recent blog post was economics-themed it seems worthwhile to ask: What did the Bangladeshi farmer say when he was asked how much he was willing to pay for the forecast?

      • I noticed that a journalist who posts here on the very subject of risk has photographic evidence of a BMI closer to 35 than 25, which carries with it huge risks of premature death. Still, I will not cast the first stone.

    • willard, They were bored and knew in their hearts they wanted to raise again.

  20. I happened to pick up Cullen Murphy’s book God’s Jury (The Inquisition and the Making of the Modern World) at an airport this summer. There is much in it that might be relevant to the climate change debate. But on uncertainty in particular: “The presumption is now widespread, though rarely articulated in these terms, that a lack of certainty is unacceptable. It is the presumption that if we only knew enough, and paid enough attention, and applied sufficient resources, then ills of all kinds would disappear.” And: “Locke’s ideas about religious toleration turned on the very idea of uncertainty. Human beings can’t know for sure which truths are “true”, and in any case, attempting to compel belief only leads to trouble.”

  21. Latimer Alder

    ‘When researchers in a field assess the ‘weight of evidence’, they don’t simply mean the number of studies on a particular question, or patients per study, but how compelling the evidence is and how thoroughly alternative explanations have been looked at – a result of the scientific process and peer review of new research.’

    … and what their mates are saying and what the chances of publication are for various possibilities and where the next grant/tenure is coming from and what their Professor’s particular hot buttons are and all the human baggage that comes along with making a value judgement.

    The myth of the ‘objective’ climatologist seeking only ‘Truth’ was shattered on 17/11/09. The day that the Climategate e-mails were liberated. Behind the mask, we see that they are just like anybody else. Or in some cases, considerably worse.

    • The “Climategate” emails have resulted in precisely zero retractions and zero instances of professional misconduct proceedings: there was nothing in those emails that was in any way interesting or relevant to the substance of the science of climate change.
      People who couldn’t argue the science resorted to attacking the scientist, and they failed to achieve anything except publicise their disregard for the law and their disregard for the science.

      • Heh, it’s an autopsy on a patient not yet dead.
        ===========

      • Craig Thomas: You seem to be in agreement with Steve McIntyre on this.

      • Speaking as a skeptic/denier – or maybe as an “emotive contrarian” – I’d like to express my disapproval of the use of those emails. I’m happy to explain to fellow skeptics and others why I disapprove, but my reasons are not special. Purloined material, distorted context, disrespect of that professional privacy we should accord to the local dog-catcher…I could go on. Nothing was exposed for me, since I have never been in much doubt as to the incestuous, conformist nature of any “consensus” when egos and jobs are at stake and nobody knows all that much about the subject. (I believe a polite term for not knowing much is “uncertainty”.)

        Don’t need those emails. Don’t want them. Other people’s property.

      • Gesundheit

    • It’s all about the incentive structure of the field. If you count “success” as a) making discoveries and b) getting credit for them among other investigators or at least c) making arguments that others have to deal with, then you have pretty good alignment between individual incentives and the progress of science. Which empirical work should I believe? The stuff that I sincerely believe will help me make discoveries. How much data and method detail should I disclose? The amount that I believe optimally balances my credibility (to get others citing me and building on my work) with my ability to use my asymmetric information to beat others to further discoveries. In pure-science situations where the audience and funding sources are both peer-discoverer communities, these incentives work very well to create an “invisible hand” guiding ambitious individuals to act in a way that maximizes scientific progress. Scientific consensus on a topic achieved under these circumstances is likely to be the best guess available, because any sincere deviant who thought he could better make discoveries by departing from the consensus would have motivation to do so. (This doesn’t mean that there won’t be politicking and the like in the game for peer influence, but it does mean that in the long run users of research results have strong incentives to build on things that they believe are most correct.)

      The problems come in when other incentives dominate, such as the impact that a particular finding might have on public policy. Once particular findings are rewarded or punished for extrinsic reasons, incentives may no longer point toward sincerely acting to maximize the rate of discovery but rather toward maximizing those extrinsic rewards. In these situations, observers of the field will find it much harder to extract useful findings to believe in. Perversely, policy makers who insist on consensus from a science may destroy the usefulness of that science for getting at the best possible understanding of the world, even if they weren’t biased in seeking a particular answer to justify their favored policy. Merely forcing “consensus” will cause whatever view happens to be popular at the moment with the most influential scientists to be overconfidently promoted to dogma. I think this process is what has Prof. Curry worried.

  22. Climate alarmism’s message is often slipped subliminally in rather than blurted. Check out Tracey and Tabitha’s article. After the airy babble, vapid abstractions, circularities and commonplaces, consensus stands, ever the responsible adult, though humbled and misunderstood. That “climate change contrarian” getting “emotive” looks a little silly and arrogant, like some carping Republican bad guy from The West Wing…though our two professional science communicators would never say as much.

    Tired of being a shallow, emotive contrarian? Already quite a deep person, but you want to “delve deeper”? There’s this book by Naomi Oreskes you can read…

  23. @- Judith Curry
    “The most worrisome problem to me is one that I have not hitherto seen discussed is the impact of overly emphatic and overconfident statements on the science itself. This does not motivate scientists to challenge the consensus on detection and attribution, particularly when they can expect to be called a ‘denier’ for their efforts.”

    We all worry about different things I suppose, but given the well over 90% preponderance of the published scientific research that validates AGW theory, I think the intentional expression of overly emphatic and overconfident statements of uncertainty and doubt, propagated by certain fringe groups opposing the policy implications of AGW, is somewhat more damaging.

    As has been detailed by several writers, there is a long history of special interest groups casting doubt as an intentional tactic on scientific findings that threatened their financial interests.
    Tobacco is the most well known, but the opposition to action on asbestos, lead, CFCs, OPs, SOx etc. is all part of a pattern of anti-science lobbying that tries to use scientific uncertainty to block policy consideration of known threats to human safety.

  24. From one layman’s perspective:

    The climate system is made of 5 separate subsystems (atmosphere, cryosphere, hydrosphere, biosphere, and geosphere), each with its own set of complexities and uncertainties, which, I don’t think any scientist claims to understand fully, much less the complexities around the interactions of these 5 separate subsystems. Add to that the external variables like the sun, gravity, polarity, cosmic rays and probably countless others that have or have not been identified. Thus the description of climate as a massively complex, chaotic, non-linear, coupled system. To me, it defies any kind of logic for anyone to claim that despite our incomplete understanding of each of the individual subsystems, their interactions, the effect of the external variables, and the real possibility of a large number of unknown unknowns, they can claim any certainty, much less 100% certainty, that a trace gas is the dominant driver of climate, and that the increase of CO2 caused by man, which according to whom you want to believe ranges from close to 0 to the full difference between pre-industrial times and now, or about 125 ppm, will lead to catastrophic global warming, climate change or whatever it is you want to call it today. That degree of certainty is hubris at its worst.
    Also having read that in the geologic history of the planet, Co2 levels have been far higher than today, why was it not catastrophic then? That there have been other times in pre-industrial history where abrupt and violent climate change occurred. Do we understand what caused those changes? Do we have an explanation for the RWP, MWP, LIA, etc?
    Add to that the overall behavior of many of the scientists supporting the catastrophic theory (Climategate, and later Gleick), and the behavior of the IPCC itself (read the Delinquent Teenager).
    The reality of the IPCC process is that the SPM is the portion that gets read most widely and is what the media and politicians reference when discussing the latest report. But, the SPM is not written, reviewed, or approved by scientists. It is written by political appointees of 150 countries who effectively “negotiate” the language that will best serve their purposes – see http://nofrakkingconsensus.com/2013/06/13/the-ipcc-politicizing-science-since-1988/ .
    Re: AR3, besides just the hockey stick, Lindzen stated that the IPCC clearly uses the Summary for Policymakers to misrepresent what is in the report and gave an example from the chapter he worked on, chapter 7, addressing physical processes.
    The 35-page chapter, said Lindzen, pointed out many problems with the way climate computer models treat specific physical processes, such as water vapor, clouds, ocean currents, and so on. Clouds and water vapor in clouds, for example, are badly misrepresented in the models. The physics are all wrong, he said. Those things the models do well are irrelevant to the all-important feedback effects.
    “The treatment of water vapor in clouds is crucial to models producing a lot of warming,” explained Lindzen. “Without them [positive feedbacks], no model would produce much warming.”
    The IPCC summarizes the 35-page chapter in one sentence: “Understanding of climate processes and their incorporation in climate models have improved, including water vapor, sea dynamics and ocean heat transport.”
    That, said Lindzen, does not summarize the chapter at all. “That is why a lot of us have said that the document itself is informative; the summary is not.”

    http://news.heartland.org/newspaper-article/2001/06/01/ipcc-report-criticized-one-its-lead-authors

    Likewise, in AR4, from http://www.amlibpub.com/essays/ipcc-global-warming-report.html the SPM was written several months prior to “the 1600 pages of scientific information underlying their summary” . Appendix A to the Principles Governing IPCC work, states “Changes (other than grammatical or minor editorial changes) made after acceptance by the working group or the Panel shall be those necessary to ensure consistency with the Summary for Policy Makers or the Overview chapter”. In other words, the “science” will be adjusted to agree with whatever politicians and bureaucrats want it to say.
    The first and second assessments had similar problems. So far, by my count, the IPCC is zero for four in producing an SPM that is a valid summary of the science. Moreover, much of the “scientific” information provided in the detailed reports comes from questionable or politicized sources – again see “The Delinquent Teenager”. In short, the IPCC has been shown to be almost purely political and the SPM produced is worse than valueless. If governments act on the SPM, real economic harm will be the result, and the climate will continue to operate as it always does – that is, in ways we cannot possibly predict or control.

    As for those claiming that Climategate emails should be ignored since they were “stolen”, I completely disagree. The emails were part of work product generated thru public funding, and therefore, subject to public scrutiny. Any illegal action was taken by the authors of said emails by stonewalling FOI requests and other actions.

  25. Judith, the scientist, wrote: “Overegging the pudding with emphatic and overconfident statements about the science (effectively minimizing uncertainty) causes a number of problems. It motivates the other side to make contradictory statements even more emphatically and confidently. And then if your understanding turns out to be incomplete and predictions are not realized, there is a public loss of confidence in your position, which can spill over to science in general.”

    In the nonscientific world, the other side will always contradict with emphatic, confident statements – no matter how carefully the first side makes its statements. And if your predictions (careful or rash) are not borne out, your supporters will find a way to obscure your mistakes and pretend they never existed. Finally, activist scientists on both sides don’t give a XXXX about the credibility of science in general; their greater goal is to save the planet or the economy.

    • Individual scientists certainly care about their individual credibility, so why is the IPCC not perceived by many to be scientifically credible? Some possibilities:

      What credible scientist would allow a group of government appointees under the direction of the UN to control every word in their Summary for Policymakers? (And change their peer-reviewed report to agree with the summary, if necessary.)

      Which scientists and economists will the IPCC select as authors to provide a consensus report on man’s role in climate change and what might be done about it?

      Who is responsible for the scientific credibility of the report? The smallest unit of the report is roughly a one-hundred-page chapter with roughly one thousand references jointly written by a dozen authors. Under these circumstances, no one’s personal credibility is at stake. This problem would change if the names of three scientists responsible for each section (usually one to two pages) were listed at the top of each section: one author responsible for drafting the consensus language in the section; one reviewer who has checked each point reference by reference; one editor who has made sure that comments from peer review have been fairly addressed.

  26. Lauri Heimonen

    Judith Curry,

    the uncertainty debated here can be avoided, as we understand that it is nonexistent.

    There are two biased views that make the anthropogenic warming, based on model calculations, impossible to observe by measurements in reality:

    i) In the climate model calculations adopted by the IPCC, it is assumed that the increase of CO2 content in the atmosphere during the industrialized era is caused entirely by human CO2 emissions. However, according to natural laws, the CO2 content in the atmosphere is controlled jointly by all the CO2 emissions to the atmosphere and by all the CO2 absorptions from the atmosphere into the CO2 sinks. As I have stated many times before, the recent CO2 emissions control only about 4 % of the total CO2 increase in the atmosphere. For instance, the anthropogenic share is only about 0.08 ppm of the recent total increase of 2 ppm CO2 in the atmosphere, and of the present CO2 content of about 400 ppm in the atmosphere, the share caused by human CO2 emissions is only about 16 ppm at the most.
    ii) IPCC scientists and many others believe that the recent warming has followed the recent increase of CO2 content in the atmosphere. But they have not presented any proper evidence in reality for this belief. Whereas there is increasing evidence in reality that the trends of increasing CO2 content in the atmosphere follow warming, and not vice versa; and the warming is natural, without any observable anthropogenic influence. During trends of decades, the global mean content of CO2 in the atmosphere is controlled by the sea surface temperature of the areas where the CO2 sinks are. The sea surface temperature of the CO2 sinks determines how much CO2 from the total CO2 emissions to the atmosphere remains in the atmosphere to increase the CO2 content.

    Look e.g. at comments http://judithcurry.com/2011/08/04/carbon-cycle-questions/#comment-198992 ; and http://judithcurry.com/2013/01/16/hansen-on-the-standstill/#comment-287036 .

    • Lauri: Perhaps the following analogy will show why scientists believe humans are responsible for all of the increase in CO2 in the atmosphere.

      Suppose my family has $300,000 (from an inheritance) and we make and spend $100,000 per year. Nothing will change and we will still have $300,000. Suppose our child now gets a job and adds $4,000 (4%) to our family income, but our spending doesn’t increase. In 25 years, we will have $400,000; in 50 years, $500,000. Most people would say that the increase in our net worth is due to the additional income provided by our child, even though that income is only a small fraction of our total income and outgo. After all, our net worth would certainly stop increasing if our child stopped working!

      The $100,000 we earned and spent every year is analogous to the 220? GT/year of carbon the earth naturally emitted and absorbed every year before the industrial revolution; our inheritance of $300,000 is analogous to the roughly 300 ppm of carbon dioxide (600? Gt) present before we began to burn large amounts of fossil fuel. The 9? GT/year of carbon we emit by burning fossil fuels is only 4% of the carbon that nature emits and absorbs; but, just like our son’s $4,000, the additional carbon will accumulate in the atmosphere as long as the natural emission and absorption doesn’t change. Therefore, most scientists attribute the accumulation of carbon dioxide in the atmosphere to the burning of fossil fuels, even though it amounts to only 4% of natural emissions.

      Suppose we wanted to show our child how important their earnings were to our family’s savings. We keep all of our money as $1 bills in a very large box and write our child’s name on every $1 bill they earned that went into the box. Unfortunately, the box is well-stirred every time money goes in or out, and some of the bills with our child’s name on them usually go out every time we spend money. Over a very long period of time, the fraction of bills in the box with our child’s name on them will gradually approach 4%, even though our child’s income is responsible for all of the increase in our net worth. We would mistakenly conclude that our child’s income was responsible for only 4% of our accumulating fortune, but in reality nothing would be accumulating without our child’s income. This is analogous to the mistake you make when you compare the 30 GT/yr man emits by burning fossil fuels to the 750 GT/yr emitted and absorbed by nature.

      We could pursue the analogy a little further and say that our child’s earnings started at $500/year and increased by $500/year, analogous to our growing use of fossil fuel. And instead of saving all of our child’s earnings, we spent half and saved (accumulated) half. The higher concentration of CO2 in the atmosphere has sped up the rate at which carbon dioxide is taken up by natural “sinks” (compared with pre-industrial), leaving only about half of what we emit to accumulate in the atmosphere.
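The money-box arithmetic above is easy to run as a toy simulation. This is only an illustrative sketch of the commenter's analogy, not a carbon-cycle model; the function name is mine, and the dollar figures simply follow the hypothetical numbers in the comment ($100,000/yr natural flux, $4,000/yr from the child, $300,000 to start).

```python
def simulate_box(years, natural_flux=100_000, child_income=4_000, start=300_000):
    """Toy model of the well-stirred money box from the analogy above."""
    total = float(start)  # dollars in the box
    labeled = 0.0         # dollars bearing the child's name
    for _ in range(years):
        # Deposits: the natural flux plus the child's labeled earnings.
        total += natural_flux + child_income
        labeled += child_income
        # Withdrawals are well-mixed, so labeled bills leave in proportion.
        labeled -= natural_flux * (labeled / total)
        total -= natural_flux
    return total, labeled / total

total, labeled_frac = simulate_box(50)
# After 50 years the box holds $500,000; the whole $200,000 gain traces to
# the child's income, yet only about 3.8% of the bills carry the child's
# name (the child's share of each year's deposits, 4,000/104,000).
```

The sketch makes the comment's point concrete: the labeled fraction settles near 4%, even though every dollar of net accumulation is attributable to the child.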

      • Wow Frank, nice to see someone who still cares about logical reasoning. I took the knee-jerk immediate dismissal approach below, as this is not the first time that Lauri has gone through the misguided argument.

      • Thanks Web. Rarely, one can shine a little light into dark corners. And I actually learned a lot about Lauri’s position by constructing the analogy. If my child’s $4,000/yr is responsible for our increasing wealth, why isn’t my $100,000/yr? I think the best answer to that question is to ask what would have happened if my child wasn’t earning anything or man wasn’t burning fossil fuels.

        Barnes: Your trace gas argument is bogus. If you condensed the atmosphere to a liquid that stayed in one place – so that it would be more tangible and intuitive – you would have about 10 m of liquid air covering the surface of the earth. (Atmospheric pressure can raise an evacuated column of water about 10 m, or 32 feet.) If the layers separated into their components, the CO2 layer would have thickened from 3 mm to 4 mm over the past century. (300 ppm of 10 m is 3 mm.) This is about the thickness of a pane of glass. I presume you are familiar with how hot your car can get in the sun because visible light, but not infrared, passes through glass. I suspect you also apply a much thinner layer of sunscreen to protect yourself from the sun’s UV rays. “Trace” amounts of materials with the right optical properties can have big effects; you have had personal experience with those effects.

        “Why would anyone assume that the climate’s ability to absorb CO2 stays at any constant rate?” I wouldn’t assume anything. I’d recognize that atmospheric CO2 has increased 40%, and that the rate of uptake by sinks is probably linear to a first approximation in some cases and varies with the change in CO2 in others. So I would expect the current trend to continue, with half of each year’s fossil fuel emissions disappearing into sinks. I don’t know what will happen when we get far from current experience. My family might continue to save half of my child’s earnings when our net worth is $300,000–$400,000, but change our behavior when we get near $1,000,000.
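The condensed-atmosphere figure in this exchange is just proportional arithmetic and can be checked in a couple of lines. A minimal sketch using the comment's round numbers (a ~10 m equivalent liquid column, CO2 as a volume fraction in ppm); the function name is mine:

```python
def co2_layer_mm(ppm, column_m=10.0):
    """Thickness of a pure-CO2 layer if a ~10 m liquid-air column separated."""
    return column_m * ppm * 1e-6 * 1000.0  # the ppm fraction of the column, in mm

pre_industrial = co2_layer_mm(300)  # about 3 mm
current = co2_layer_mm(400)         # about 4 mm
```

As the comment says, 300 ppm of a 10 m column is 3 mm and 400 ppm is 4 mm – roughly the thickness of a pane of glass.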

      • Frank – why would you assume that anything in a system as complex as our climate system is linear, sinks included, since sinks can and likely do change dynamically with changing conditions? The analogy to how my car heats up is also bogus – my car, like a greenhouse, is a static, linear, non-coupled, closed system – everything our climate system is not. So, it is very predictable what will happen when you leave a car out in the sun – not so predictable when you add tiny amounts of CO2 to a massively complex, dynamic, non-linear, coupled system. As for converting the atmosphere into a liquid where CO2 magically becomes a solid layer a few mm thick – how does that begin to compare to gases dispersed and intermixed with other gases at continually variable volumes? What “optical” properties does CO2 possess?

      • Barnes: Don’t twist my argument; I’m objecting only to your implying that we can ignore rising CO2 because it’s “a fraction of an already trace gas”. If it were condensed to a liquid or solid, there is enough of the TRACE gas CO2 in the atmosphere to make a layer approximately 4 mm thick, and that layer has thickened from 3 to 4 mm in the past century. Common phenomena associated with glass and sunscreen demonstrate that the TRACE amount of CO2 in the atmosphere is more than enough to have dramatic effects on radiative energy transfer – if CO2 has the right optical properties to do so. Laboratory measurements made well before the current hysteria tell us that gaseous CO2 does have such properties. I am simply trying to make the 100 ppm rise in an invisible trace gas into something more tangible and relate it to common phenomena with things (glass, sunscreen) that are familiar. IMO, the “trace gas” argument is bogus in this setting.

        The optical properties of CO2 are changed modestly by its environment: liquid, pure gas, or dilute gas. The individual lines that can be seen at low pressure are smeared out by collisions, higher temperature, and contact with neighboring molecules in condensed media. All of these phenomena have been studied carefully in the laboratory and incorporated into GCMs. You might read:

        scienceofdoom.com/2011/03/12/understanding-atmospheric-radiation-and-the-“greenhouse”-effect-–-part-nine/

      • Frank – you seem to think that gases, liquids and solids have the same physical properties. They do not, which is why your analogy falls flat. You also seem to think that controlled lab experiments will manifest the same way in a highly chaotic, complex, coupled, non-linear system such as the climate – they do not. Finally, you say that it has all been captured in GCMs. Got news for you: the GCMs have demonstrably failed, and failed spectacularly. You might want to find another source to bolster your arguments.

    • A classic 3% denier, can’t even admit that excess atmospheric CO2 is caused by the combustion of fossil fuel.

      • Big deal. Even if 100% of the excess CO2 can be attributed to combustion of fossil fuels, it is still a fraction of an already trace gas. Frank’s financial analogy is spurious at best. Why would anyone assume that the climate’s ability to absorb CO2 stays at any constant rate? If increased CO2 results in increased plant growth, which would result in increased CO2 absorption, wouldn’t you have a change in the carbon cycle?

  27. Some recent comments by Michael Mann were perhaps informative – about science being about credible theories and best explanations. “Best explanations” led this novice to the Abductive Reasoning Wikipedia page, where the question “Why is the lawn wet in the morning?” is asked. Most likely it rained last night; there are other, less likely answers. Does abductive reasoning explain part of how we got to the current state?

    Abductive reasoning seems to follow one of my own guidelines: qualify what one says. Prefer “most likely” over unambiguous statements. I use “qualify” as accountants do. An unqualified opinion comes without a list of reservations, because you don’t have any to list. You’ve gone almost all in, and trying to un-ring that bell in the future is going to be messy. A qualified opinion has a list of your reservations and uncertainties. One purpose of a qualified opinion, to be honest, is to protect the accountant – but also to protect what? The profession. Its integrity. Its reputation. Public trust, at times, is the whole ballgame.

    An example of the opinions I am talking about is auditing Walmart’s books and then giving an assurance on how closely those books match what accounting rules say. Are they a fair representation, in numbers, of what happened and what is? That opinion is then given to, among others, the shareholders and the SEC. The company’s financial numbers have now been legitimized following the standards of the profession. We still don’t have certainty, though. They could have hidden something really well, and the opinion given allows for that. A high level of confidence, but not an all-seeing confidence, as a lot of sampling was used to manage the vast amounts of financial data.

    I think Mann made a useful point about Abductive Reasoning and Best Explanations. He taught me something today.

  28. if a climatologist is ”uncertain” – shouldn’t make wild / unsubstantiated statements: http://globalwarmingdenier.wordpress.com/

  29. Pingback: Weekly Climate and Energy News Roundup | Watts Up With That?