Taylor and Ravetz on the value of uncertainty

by Judith Curry

. . . this “crisp number” mode of thinking has promoted the use of over-simplistic models and masking of uncertainties that can in turn lead to incomplete understanding of problems and bad decisions. – Peter Taylor and Jerome Ravetz

It’s been a while since we’ve had an uncertainty post.  Previously we’ve covered a lot of territory on that topic, but I think this talk provides some fresh insights and some great quotes.

Last March, Peter Taylor and Jerome Ravetz gave a presentation at the University of Oxford Centre for Practical Ethics [link to ppt, audio].

Title: The Value of Uncertainty

Abstract:  The faith that truth lies in numbers goes back to the Pythagorean attempt to unify both practical and theoretical sciences.  Its current manifestation is the idolisation of pre-Einsteinian physics in the quantification of social, economic, and behavioural sciences.  The talk will explain how this “crisp number” mode of thinking has promoted the use of over-simplistic models and masking of uncertainties that can in turn lead to incomplete understanding of problems and bad decisions.  The quality of a model in terms of its fitness for purpose can be ignored when convenience, especially computerised convenience, offers more easily calculated crisp numbers.  Yet these inadequacies matter when computerised models generate pseudo-realities of their own through structures such as financial derivatives and processes such as algorithmic trading.  Like Frankenstein’s monster, we have already seen financial market pseudo-reality take on an uncontrolled, unstable and dangerous life of its own, all the more beguiling when it generated income for all parties in the merry-go-round.  Despite its manifest failings, it is still going on.

We believe the urgent task is to integrate uncertainty and quality into the quantitative sciences of complex systems, and we will offer some practical techniques that illustrate how this could be accomplished.

Below are some excerpts from the text of the .ppt:

The Value of Uncertainty

Perceived need to eliminate uncertainty

  • Confusing science with removing uncertainty
  • Delusional certitude
  • Wilful blindness

Recognition of uncertainty can have value

  • Appreciation of possibilities
  • Adaptation to circumstances
  • Better decisions

The Ethics of Uncertainty

Uncertainty mostly seen as undesirable, yet

  • False certainties of “Useless Arithmetic”
  • Consequences of authority metrics

Is recognising uncertainty good or bad?

  • Prevents or delays crucial action? (overuse of the “precautionary principle” or, on the other hand, tobacco “manufacturing doubt”)
  • Confuses the public or causes loss of trust?
  • Makes us feel uncomfortable?
  • Surely someone must know the “truth”?

Aristotle on appropriate precision: 

“It is the mark of an educated man to look for precision in each class of things just so far as the nature of the subject admits”

Delusional certitude

“It is hard to overstate the damage done in the recent past by people who thought they knew more about the world than they really did.”  – John Kay in “Obliquity” 2010

“Understanding the models, particularly their limitations and sensitivity to assumptions, is the new task we face. Many of the banking and financial institution problems and failures of the past decade can be directly tied to model failure or overly optimistic judgements in the setting of assumptions or the parameterization of a model.”  – Tad Montross, 2010, Chairman and CEO of GenRe, in “Model Mania”

“Seek transparency and ease of interrogation of any model, with clear expression of the provenance of assumptions. Communicate the estimates with humility, communicate the uncertainty with confidence. Fully acknowledge the role of judgement.”  – D. J. Spiegelhalter and H. Riesch

Tools For Judgement

  • Blobograms
  • Decision Portraits
  • Nomograms
  • The combination of these tools will enable us to reason rigorously about uncertain quantities
  • In particular, the use of blobs with nomograms would enable the identification of models that are strictly nonsensical: GIGO
  • That is, where uncertainties in inputs must be suppressed lest outputs become indeterminate

The Logic of Failure in Human Decision Making

Losers:

  • Acted without prior analysis
  • Didn’t test against evidence
  • Assumed absence of negative meant correct decisions made
  • Blind to emerging circumstances
  • Focused on the local not the global
  • Avoided uncertainty

‘good participants differed from the bad ones … in how often they tested their hypotheses. The bad participants failed to do this. For them, to propose a hypothesis was to understand reality; testing that hypothesis was unnecessary. Instead of generating hypotheses, they generated “truths” ’

“ … we do not feel it is generally appropriate to respond to limitations in formal analysis by increasing the complexity of the modelling. Instead, we feel a need to have an external perspective on the adequacy and robustness of the whole modelling endeavour, rather than relying on within-model calculations of uncertainties, which are inevitably contingent on yet more assumptions that may turn out to be misguided. ”  – D. J. Spiegelhalter and H. Riesch

Whilst uncertainty is not to be glorified

  • We should not disguise our ignorance with delusionally certain models
  • We can take advantage of the greater scope uncertainty offers
  • Tools are needed to support judgement

276 responses to “Taylor and Ravetz on the value of uncertainty”

  1. Judith,

    When you start to assimilate the vast areas of climate into a form of understanding, it is astronomically complex.

    • Is reducing all climate change to a single variable, the increase in atmospheric CO2 measured in ppm, The ONE (to the exclusion even of the Sun), exemplary of this “crisp number” mode of thinking?

    • It is a number that leaves little room for skepticism. For an amount burned everyone can have quite good agreement on the ppm number mainly because both the burn rate and ppm have been so measurable for half a century.

    • thisisnotgoodtogo

      “It is a number that leaves little room for skepticism. For an amount burned everyone can have quite good agreement on the ppm number mainly because both the burn rate and ppm have been so measurable for half a century.”

      You forgot again, Jim D.
      “…except for The Pause, which we attribute to other things, things like this’n’that, ‘n’sun, ‘n’ ocean – stuff which never behaved like that until The Pause hit”.

      Uh huh! Very agreeable.

    • thisnot, that is just the point. The CO2 burn rate and rise rate are independent of those things and are quantifiable. If you want crisp numbers this is it. Do we want 700 ppm or 500 ppm? This choice governs a burn rate.

    • Jim D: Your assumption that the ppm rise is precisely determined by the burn rate is a fine example of crisp number thinking. This is actually quite uncertain.

    • Astronomically complex and thus beyond all human comprehension. Which leads to a psychological defense mechanism with just a delicious name, delusional certitude. Ain’t it the truth.

    • It is to believe in the certainty of something when it’s costing someone else. People do have a remarkable ability to abandon a strongly held belief when it becomes obvious it is costing themselves and not just others. That will be the real test of faith. ~Walter Starck

    • thisisnotgoodtogo

      Jim D | December 13, 2013 at 12:35 am |

      “thisnot, that is just the point. The CO2 burn rate and rise rate are independent of those things and are quantifiable. If you want crisp numbers this is it. Do we want 700 ppm or 500 ppm? This choice governs a burn rate.”

      Jim D,
      You forgot again.
      CO2 does not just go into the atmosphere in one shot and stay there at 700.
      Try reality. A refreshing change.

    • I am surprised (not) that the “skeptics” don’t believe that a burn rate is linearly connected to a rise rate despite the 50-year history of CO2 measurement that shows this connection with about the highest correlation you could imagine. It is very clear what it takes to reach 500 ppm and stop there versus 750 ppm and continuing to rise. If you want “crisp numbers” the ppm levels are as crisp as they get for different possible burn rate scenarios.

  2. I like the thrust of this. However, when you have to include a bullet point such as “Tools are needed to support judgement,” I’m not sure how much hope it is possible to hold for a successful righting of the debate.

  3. I understand there have been very sudden climate changes in the past – e.g. from ice age temperatures to near current temperatures in 7 years and 9 years at 12,500 and 11,500 years ago. However, both these were very beneficial for flora and fauna (Figures 15.21 and 15.22, pp. 391–392, http://eprints.nuim.ie/1983/1/McCarron.pdf ).

    Given this, it seems that the climate change projections over the next century are not of much use. What we really need to know is the probability that GHG emissions will increase or decrease the likelihood of a sudden climate change, and increase or decrease the time until such a sudden climate change may occur. Furthermore, will our GHG emissions reduce or increase the net costs/benefits of a sudden climate change? That is, what is the probability that they will reduce the severity and net damages of a cooling event, and what is the probability they will increase the severity of a warming event? This is what I want to know.

  4. If the warmistas embrace uncertainty, perhaps they will become more tolerant ;)

    • thisisnotgoodtogo

      Michael said:
      ““crisp number” mode of thinking.”

      Yeah, boo hisss to crisp numbers.

      Thankgod the IPCC gives us a range for metrics like CS, and uses phrases like ‘best estimate’.

      You’re forgetting again, Michael, that in Otto et al., so many of the IPCC sensitivity study authors just concluded differently than the IPCC reported.

  5. ““crisp number” mode of thinking.”

    Yeah, boo hisss to crisp numbers.

    Thankgod the IPCC gives us a range for metrics like CS, and uses phrases like ‘best estimate’.

  6. Having read the slides, it seems to me that Ravetz has done a complete about-face. The last time he was prominent on this site or at WUWT he was pushing a post-modern science that enabled policy decisions to play a role in the science. Now he has gone so far in the other direction that he has produced the following slide:

    “ … we do not feel it is generally appropriate to respond to limitations in formal analysis by increasing the complexity of the modelling. Instead, we feel a need to have an external perspective on the adequacy and robustness of the whole modelling endeavour, rather than relying on within-model calculations of uncertainties, which are inevitably contingent on yet more assumptions that may turn out to be misguided. ” – D. J. Spiegelhalter and H. Riesch

    Seems to me that many of us were telling him this all along. Maybe he learned. Also, there is a great emphasis on the value of testing. I guess Kuhn and the Frankfurt School have fallen out of favor with him.

    • Sometimes
      we fallible
      humans develop
      tools that
      transcend
      our
      fall-ib-ill-ity,
      tools of
      technology,
      water wheels ‘n such,
      so our machines do
      the work of meny,
      includin’ serfs.
      allowin’ us leisure
      ter think
      ‘n problem solve,
      develop tools
      of logic ‘n criticism
      Oh my god!
      So we transcend
      our human
      sub-jectivity
      ‘provisionally,’
      develop tests
      that tentatively,
      allow us
      ter escape
      the human
      con-dish-un.

    • Ravetz was misunderstood at WUWT all along. I read his stuff 20 years ago and I knew he didn’t mean what people were attributing to him. He’s partly to blame for the misunderstanding. “Post-normal science” is a misnomer. It’s not science as such, and was never meant to be. It’s intended as a way of using science in decision making.

    • Dagfinn | December 13, 2013 at 5:32 am

      Agree. I think some of this is due merely to the way it is written. Expressing the concept as ‘post normal-science’ would help. This implies something happens (decision making) after ‘normal science’. The more usual depiction of ‘post-normal science’ flips many folks into thinking this is something about science that is ‘after normal’, i.e. ‘abnormal’ in some way. In practice however, a lot of what is happening in climate science *is* kind of abnormal in that various forms of bias have caused the true scientific method to be left behind. Hence, ironically, the misunderstood label from Ravetz has become a curiously appropriate one.

      From the sceptical Cli-Fi novelette ‘Truth’: “Yet the hissing of that social worm drowned out reason, and its spit of overstated urgency corroded post normal-science into negatively post-normal science. Reason would be the beast’s death, which maudlin hacks helped avoid by portraying CO2 immorality with la muerte de los nietos, a profitable product-line for seers of old ironically inconvenienced by El Nino and La Nina.”
      https://www.smashwords.com/books/view/273983

  7. ” idolisation of pre-Einsteinian physics in the quantification of social, economic, and behavioural sciences.”

    Isn’t climate science a physical science? I hate to break it to you, but human behavior is far more complex than a climate system. And behavior depends on variables that are often difficult to measure or even to define. I can see why they would say that models in these areas should be taken with a grain of salt.

    • ” idolisation of pre-Einsteinian physics in the quantification of social, economic, and behavioural sciences.”

      Isn’t climate science a physical science?

      Climate science is the science of the behavior of the ocean/atmosphere/ice cover system.

    • “science of the behavior”

      It’s a matter of physical processes that can be quantified and measured. In the social sciences you are dealing with individuals or groups whose behavior is determined by multiple factors, many of which we can’t quantify and have difficulty measuring.

    • This really cracks me up. I mean, here’s physics, whose objects of study DO NOT come with little numbers on them… those have to be made up by the physicists. On the other hand, here’s economics, whose objects of study talk in terms of quantities themselves–how many for how much? So I am supposed to buy this utter nonsense that quantification in all the social sciences is somehow MORE of a loose metaphor than it is in (say) physics? Oh please. Have an ounce of sophistication please.

    • AK-
      I have to amend your statement a little, AK. Based on the relative infancy of climate science, perhaps this is more accurate.
      “Climate science is the science of the behavior of the climate scientists” :)

    • @dennis adams…

      Or should we call that “Climate Scientology”?

    • So you really believe the study of economics is on the same ground as physics? They make a distinction between soft sciences (e.g. social sciences) and hard sciences for good reason. And it has to do with uncertainty, because of the reasons I stated before.

    • Joseph, I meant that this quote from the maroons, “idolisation of pre-Einsteinian physics in the quantification of social, economic, and behavioural sciences,” really cracks me up, not your commentary on it.

      As far as this goes, “So you really believe the study of Economics is on the same ground as physics,” no I don’t believe that. What I believe is that the above insinuation that “physics envy” is the REASON for quantification in economics is just plain stupid, and that the things to be explained in economics are in numerical form even if there is no discipline of economics. Trade is as old as art, and humans have been asking each other “how much for how many” much, much longer than they’ve been applying the (very fruitful) metaphor that a falling rock is like the square of a real number in some sense.

      The reasons why economics is hard, I submit, have almost nothing to do with the appropriateness of quantification in a discipline that studies human trade. To repeat: I don’t object at all to what you said. I object to the garbage written by Taylor and Ravetz.

  8. Judith Curry

    Until the “uncertainty” regarding the magnitude of natural as well as anthropogenic climate changes can be narrowed down, we are unable to make any meaningful suggestions for policy initiatives to alter our future climate accordingly.

    The 2011 study by Richard Tol basically accepts the IPCC view on certainty, but takes it with a grain of salt (uncertainties in the “science” are much greater than those in the economic study itself).

    The study concludes that the warming experienced to date has had a positive impact for humanity.
    http://www.copenhagenconsensus.com/sites/default/files/climate_change.pdf

    Climate change increased welfare by the equivalent of a 0.5% increase in income for the first half of the 20th century. After 1950, impacts became more positive, edging up to 1.4% of GDP by 2000. However, impacts roughly stabilize after that, reaching their maximum at 1.5% of GDP in 2025 and then falling precipitously to reach −1.2% of GDP in 2100.

    “1.4% of GDP” represents around $1 trillion benefit from climate change to date and this resulted from warming of 0.75°C, up to 2000.

    Tol also estimates that, on balance, warming of up to 2.2°C above today’s temperature would be beneficial for humans.

    Many individual components were estimated to be beneficial beyond 2.2°C. The largest negative component came from imputed higher energy costs, rather than from environmental impacts. IOW, if unit energy costs could be kept low, this impact would be reduced and the net overall impact would be positive at temperatures even higher than 2.2°C.

    This level is not likely to be reached until quite late this century, with the “breakeven point” occurring around 2080 under the IPCC worst case scenarios. It could well never be reached at all by 2100, if these scenarios are exaggerated.

    So the risk of negative impacts for humanity from global warming within this century is low.

    What about global cooling?

    The unusually high level of 20thC solar activity has stopped. SC23 was weaker than normal and current SC24 is very inactive. Several solar studies suggest that we may be heading for a prolonged period of lower than average solar activity, possibly even a period like the Maunder minimum.

    From Tol’s study, we can conclude that the first 0.75°C cooling would cost the world the $1 trillion it gained from the equivalent warming to date.

    It is most likely that an added 0.75°C cooling would be even more costly for humanity.

    Could continued unabated human CO2 emissions avert a prolonged period of cooling?

    Nobody knows the answer to that question, although the observed slight cooling of the past decade despite unabated CO2 emissions and concentrations reaching record levels would suggest that we are unable to stop cooling with CO2 emissions.

    But it seems clear to me that jumping into costly actions today to reduce CO2 emissions in order to stop warming that is likely to be beneficial through most of this century, and thereby risking cooling that could be harmful right away, makes no sense.

    Correct my logic if I’m wrong here.

    Max

    • P.S. Just to add Tol’s figures:

      Figure 2 shows the economic impact of climate change.

      (Year:Temperature change:%GDP economic impact)

      1900: 0ºC: 0%GDP
      1950: +0.3ºC: +0.5%GDP
      2000: +0.75ºC: +1.4%GDP
      Projections:
      2030: +1.2ºC: +1.2%GDP
      2050: +1.7ºC: +1.0%GDP
      2080: +2.7ºC: 0%GDP
      2100: +3.7ºC: -1.2%GDP

      So the 20thC warming of 0.75ºC has added 1.4% GDP (~$1 trillion).

      And if we now had an extended period of cooling, which would reverse the 0.75ºC warming experienced over the 20thC, we would have a negative impact of −1.4% GDP (IOW we would “give back” the net $1 trillion gain we saw). If the cooling were twice this amount (1.5ºC), the net loss would be at least $2 trillion.

      “Breakeven” occurs in 2080 (under IPCC “worst case scenario”).

      By 2100, IPCC “worst case scenario” results in a net loss of 1.2% GDP (~$0.8 trillion).

      If his numbers are right this looks like a no brainer to me: do nothing today to reduce CO2 and hope it doesn’t get colder.
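The percent-of-GDP figures in the comment above convert to dollar amounts with simple arithmetic; a minimal sketch, where the global GDP value (~$70 trillion, circa 2013) is an assumption of mine rather than a figure from Tol’s study:

```python
# Convert Tol's %GDP climate impacts into rough dollar figures.
# Assumption (not from the study): global GDP of ~$70 trillion (circa 2013).
GLOBAL_GDP_TRILLIONS = 70.0

def impact_in_trillions(pct_gdp: float) -> float:
    """Turn a climate impact expressed as % of GDP into trillions of dollars."""
    return GLOBAL_GDP_TRILLIONS * pct_gdp / 100.0

gain_2000 = impact_in_trillions(1.4)   # ~0.98, i.e. roughly the "$1 trillion" cited
loss_2100 = impact_in_trillions(-1.2)  # ~-0.84, i.e. roughly the "-$0.8 trillion" cited
print(gain_2000, loss_2100)
```

This is only a sanity check on the orders of magnitude quoted in the thread, not a reproduction of Tol’s methodology, which works with income-equivalent welfare measures rather than a single fixed GDP figure.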

    • Manacker,

      Your logic looks pretty sound to me.

    • Now suppose we went into a solar minimum. I would think the question then would be how much of a decrease in temperature (if any) would there be and would there be a subsequent decrease in CO2 due to cold and enhanced carbon sink? If there is no decrease in CO2 what would the ultimate effect be?

      A prolonged cold spell would probably totally discredit the CAGW crowd. If there is merit to AGW and CO2 kept increasing during the minimum then after the cold spell whooa HEAT WAVE!! I guess humanity better hope we use up all the carbs before that time. Or maybe they can just shoot sulfur particles into the air or detonate some nukes.

    • A Maunder Minimum is about 10-20% the forcing effect of doubling CO2, so there is some perspective to help you see what it does.

    • Max,
      But the ice in the northern hemisphere is still melting, the oceans are still warming and you are neglecting the large uncertainty in your observation that there has been slight cooling in the last 10 years.

      But I’ll propose that any Solar scientist would agree that the sun will in the long term continue to grow brighter and put out more energy.

      And the CO2 at 400 ppm will maintain temperatures for the rest of this century; your cooling trend will eventually dissipate.

    • Well, I certainly hope to get warmer again before the next glaciation.
      ============

    • Jim D writes “A Maunder Minimum is about 10-20% the forcing effect of doubling CO2, so there is some perspective to help you see what it does.”

      The farce continues. Warmists trot out these numbers as if they were written on tablets of stone. First, no-one has the slightest idea what the climate sensitivity for a doubling of CO2 is. All we have are estimates, and no measurements, so any value ascribed to the CS of CO2 is just a guess. What little empirical data we have, that there is no measured CO2 signal in any modern temperature/time graph, gives a strong indication that the climate sensitivity for a doubling of CO2, however defined, is 0.0 C, to one place of decimals, or two significant figures. Now 20% of 0 is 0, and we know that during the Maunder minimum there was a significant cooling event.

      Sorry, Jim. No sale.

    • “A Maunder Minimum is about 10-20% the forcing effect of doubling CO2, so there is some perspective to help you see what it does.”

      Have to admire this stuff. Make it up as you go along while making sure not to blush. You have no idea, nor does anyone else, what the “forcing effect” is of a Maunder-type minimum. The effect of a doubling of CO2 is also not known with anything like the precision you want to claim.

    • Walt Allensworth

      “If his numbers are right this looks like a no brainer to me: do nothing today to reduce CO2 and hope it doesn’t get colder.”

      Indeed.

      And yet, our POTUS is hell-bent on having the world pay extraordinary sums of money in an attempt to do just the opposite. The best face I can put on this behavior is “misguided,” though many other more nefarious terms come to mind.

      The TSI during a Maunder-type solar minimum is 1360.1 watts/square meter, and maximum is 1361.9 (Wang, Lean and Sheeley, Astrophysical Journal, 2005).

      Only gives you about 0.1 C cooling, not enough for a little ice age, let alone a glaciation. Doggerland remains in Davy Jones’s locker.
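The ~0.1 C figure follows from the TSI values quoted above under standard textbook simplifications; a rough sketch, where the albedo (0.3) and the sensitivity value (~0.3 C per W/m²) are assumptions of mine, not stated in the comment:

```python
# Back-of-envelope cooling from a Maunder-type TSI drop, using the
# Wang, Lean and Sheeley values quoted above. The albedo (0.3) and the
# climate sensitivity (~0.3 C per W/m^2) are assumed, not from the comment.
TSI_MIN = 1360.1       # W/m^2 during a Maunder-type minimum
TSI_MAX = 1361.9       # W/m^2 at solar maximum
ALBEDO = 0.3           # fraction of incoming sunlight reflected
SENSITIVITY = 0.3      # C of surface temperature change per W/m^2 of forcing

delta_tsi = TSI_MAX - TSI_MIN               # ~1.8 W/m^2 at top of atmosphere
forcing = delta_tsi / 4.0 * (1.0 - ALBEDO)  # /4 spreads sunlight over the sphere
cooling = forcing * SENSITIVITY             # ~0.09 C, i.e. "about 0.1 C"
print(forcing, cooling)
```

Under these assumptions the forcing comes out near 0.3 W/m² and the temperature response near 0.1 C, consistent with the comment; different sensitivity assumptions would scale the answer proportionally.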

    • @ Jim D: A theoretical question:

      If, by decree, we halted ACO2 completely by 1 Jan 2015, what does the physics say the TOE would be in 2050, and how would that TOE be better than the TOE in 2050 if we ignored ‘climate change’, established NO policies to address it, and produced the energy to run our civilization by whatever means made economic sense?

    • Bob Ludwick, the difference between business as usual and stopping cold now by 2050 is more than 1000 Gt CO2, which amounts to about 0.5 C in eventual warming. Of course, what we do past 2050 matters just as much.

    • I sense that the “skeptics” don’t believe the size of the solar variations that led to the Maunder Minimum. Is this just motivated rejection of facts, or is it based on any scientific publication? Have they also not noticed that even the sunspot cycles are seen statistically in the temperature record? It doesn’t take much forcing to be noticed. The forcing change of the sunspot cycle is about 20 times less than doubling CO2. By the way, forcings are also “crisp numbers” that can be compared with each other. An example is that doubling CO2 is 3.7 W/m2. Increasing the sun’s strength by 1% is 3.4 W/m2. Crisp numbers for you.
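The “crisp numbers” quoted in this comment can be reproduced with the standard simplified forcing expression ΔF = 5.35 ln(C/C₀) for CO2; the 5.35 coefficient is the widely used Myhre et al. approximation, an assumption on my part rather than something stated in the comment:

```python
import math

# Reproduce the forcing comparison quoted above. The CO2 expression
# dF = 5.35 * ln(C/C0) is the common simplified approximation
# (an assumption here, not stated in the comment).
def co2_forcing(c_ppm: float, c0_ppm: float) -> float:
    """Radiative forcing in W/m^2 from a CO2 change c0_ppm -> c_ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

doubling = co2_forcing(560.0, 280.0)   # ~3.7 W/m^2, as quoted

# A 1% brighter sun, averaged over the sphere (TSI divided by 4):
TSI = 1361.0                           # W/m^2, assumed present-day value
solar_1pct = TSI * 0.01 / 4.0          # ~3.4 W/m^2, as quoted

print(doubling, solar_1pct)
```

Note the solar figure here omits the albedo correction; including reflection would reduce the effective forcing by roughly 30%, which is one reason such comparisons carry more uncertainty than the crisp decimals suggest.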

    • @Jim D

      My question actually had two parts.

      Your answer to the first one, if we completely stopped ACO2, how much warming would be averted by 2050, was 0.5 C.

      So I take 0.5 C as the upper bound of the efficacy of any possible CO2 control measures that we could POSSIBLY enact.

      The second, unanswered, part was: in what way would preventing a half-degree rise, amortized over 36 years, be better than if we just ignore climate and produce energy as needed, by the most efficient and economic means available?

      Accepting your 0.5 C ACO2 warming/36 years as gospel and if given a choice, which seems increasingly unlikely, I’ll take the 0.5 C warming, abundant plant food, and cheap, plentiful energy in a heartbeat.

  9. What can be computed without much uncertainty is the future CO2 level for given fossil fuel use scenarios. For a strong mitigation strategy limiting CO2 to 1500 Gt in the next century we will have 500 ppm. For a business as usual growth with 5000 Gt CO2 in the next century we go past 700 ppm. With our emission rate we are choosing between 500 and 700 ppm. Paleoclimate tells us this is not a small difference for the climate. Choose a target ppm and that tells you the strategy. Some have promoted this way of thinking in terms of target levels.
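The scenario figures in this comment can be roughly reproduced with two rules of thumb, both assumptions of mine rather than anything stated in the comment: about 7.8 GtCO2 of emissions raises atmospheric CO2 by 1 ppm, and roughly 45% of emitted CO2 stays airborne (the rest taken up by ocean and land sinks):

```python
# Rough reproduction of the 500 ppm / 700 ppm scenario numbers above.
# Assumptions (mine, not stated in the comment): ~7.8 Gt of CO2 emissions
# per 1 ppm of atmospheric CO2, and an airborne fraction of ~45%.
GT_CO2_PER_PPM = 7.8
AIRBORNE_FRACTION = 0.45
BASELINE_PPM = 400.0    # approximate level at the time of this 2013 thread

def ppm_after_emitting(gt_co2: float) -> float:
    """Atmospheric CO2 (ppm) after emitting gt_co2 gigatonnes of CO2."""
    return BASELINE_PPM + gt_co2 * AIRBORNE_FRACTION / GT_CO2_PER_PPM

mitigation = ppm_after_emitting(1500.0)         # ~487 ppm, near the "500 ppm" case
business_as_usual = ppm_after_emitting(5000.0)  # ~688 ppm, near the "700 ppm" case
print(mitigation, business_as_usual)
```

The airborne fraction is itself an empirical regularity, not a physical constant, which is precisely the point several replies below contest; the sketch shows the arithmetic behind the scenarios, not the certainty of the underlying carbon-cycle assumptions.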

    • Jim D

      Agree that a BaU scenario could get us to around 650 ppmv CO2 by 2100.

      Implementing various “no regrets” initiatives could lower this by as much as 100 ppmv (mostly coming from switching from coal to nuke, or to natural gas where nuke is not feasible or gas is plentiful).

      So our guesstimates are in the same ballpark.

      Max

    • Jim D,

      Paleoclimate tells us this is not a small difference for the climate.

      What are you trying to say?

      Are you saying it is good or bad?

      How good or how bad?

      How do you know?

      Choose a target ppm and that tells you the strategy.

      How do we choose a target ppm rationally?

      How does choosing a target ppm tell us the strategy?

    • manacker, 650 ppm in 100 years from now is not likely under business as usual because it allows for hardly any growth in emissions to keep up with population and development. In real terms it is a reduction in CO2 per capita, which sounds like a mitigation scenario. It also leaves fossil fuels in the ground if you want to stabilize at 650 ppm rather than continue to emit past that.

    • What can be computed without much uncertainty is the future CO2 level for given fossil fuel use scenarios.

      A very good example of “Delusional certitude”.

      The “model(s)” that statement is based on is/are full of unproven, and IMO unwarranted, assumptions. Anybody who has actually studied the ecology of the various carbon sinks/sources can see this, unless they are engaged in “Wilful blindness”.

      Oh, there may well be a direct connection between emission levels and increasing pCO2, but there’s a variety of other mechanisms that might be responsible, or partly so. And we mustn’t forget Salby’s suggestion that CO2 levels have been bouncing around a lot more in the last few centuries than the current “consensus” position claims.

      In Kuhnian terms, any time somebody insists that the current paradigm is “certainly correct in all particulars”, they’re guilty of “Delusional certitude”.

    • Peter Lang, well, paleoclimate indicates that the Antarctic ice sheet would become unsupported by its local climate somewhere less than 700 ppm, and that would be the last permanent ice on earth, so I would say it is not good, at least for future sea levels, to reach that level. A target nearer 500 ppm would be safer from that perspective, but still a bit on the edge of that cliff, and Greenland would be melted anyway for 10 meters of sea level. These are the choices: 10 meters or 70 meters.

    • AK, I think that there are some people, believe it or not, who know that Man has added nearly 2000 Gt CO2 to the atmosphere, and when the level in the atmosphere rises by 1000 Gt in the exact same period with a matching accelerating rate with ocean acidification taking care of the rest, it is completely unrelated. I don’t understand those people. Do you? Where is the CO2 coming from if both the ocean and atmosphere are increasing, and where is Man’s contribution going instead? Senseless denialism.

    • Jim D

      paleoclimate indicates that the Antarctic ice sheet would become unsupported by its local climate somewhere less than 700 ppm, and that would be the last permanent ice on earth, so I would say it is not good, at least for future sea levels, to reach that level.

      You seem to be ignoring the time dimension. It’s not possible to have a rational discussion if you try to imply that scenarios that would take thousands of years to happen are likely to happen during the next 100 years. So, please try to be sensible and give up on the scaremongering. I asked you serious questions and you haven’t even attempted to answer them sensibly.

      If you can’t or don’t attempt to answer the questions sensibly I am left with an even stronger impression that the CAGWers arguments are little more than unfounded beliefs and scaremongering. I accept you are concerned and believe your concerns are well founded, but dodging the questions reinforces the impression your concerns do not have a sound basis.

    • @Jim D…

      I don’t understand those people. Do you? […] Senseless denialism.

      They’re just like you, only on the other side of the question. Both they and you (and those like you) demand certainty rather than the more tentative theories real science always deals with.

      The rate of increasing pCO2, as measured by recent direct measurements and proxies that may be correct, can be correlated very roughly with the exponential advance of the Industrial Revolution, as measured by estimated emissions. Thus, we can fairly say that there’s a good probability of cause and effect.

      But those same estimated emissions would probably be a good proxy for the general advance of the Industrial Revolution, and widespread mechanization of processes that were manual before, to the extent that they even existed.

      A careful look at the details of the Earth’s carbon budget will show that the anthropogenic component is a tiny fraction (~3%?) of the total amount emitted and absorbed on an annual basis. Changes to the ecology of (some of) the major sinks and sources could easily swamp the effect of human emissions from fossil sources.

      Any such change that was driven by mechanization could plausibly play (have played) a part, even perhaps a major part, in the supposed increase in atmospheric pCO2. A very incomplete list would include whaling, deforestation, bog drainage (especially peat bogs), mechanized farming including phosphate pollution (I’d expect the latter to work in the opposite direction, but there’s no certainty), and aerosol pollution from internal combustion engines. Others could easily be conceived.

      In the face of these possible and plausible alternative “causes”, especially given the possibility that the proxies used to estimate pCO2 prior to 1958 (which wasn’t that long ago) are incorrect, continued insistence on the “certainty” of “future CO2 level for given fossil fuel use scenarios” would seem to be just as much “Senseless denialism” as insisting that “it is completely unrelated.”

    • Peter Lang, the best target ppm is one that is not in unknown territory for life on earth. You may not think that a continuously rising sea level is an important inconvenience, but others do and would prefer not to have it, given a choice. You may think that global average temperatures can’t increase by 4 C at 700 ppm, and do it quickly, but that is more likely than not to happen, and continental areas would warm to those levels faster still, because they are already warming at a pace consistent with that.

    • AK, those people don’t have a balanced carbon budget. The ocean can’t outgas and acidify at the same time, but they don’t notice this discrepancy. The Keeling curve looks as manmade as can be. What else is needed to convince them? It is just Denial, plain and simple. They ignore the easy explanation with a balanced carbon budget, and wander off looking for something irrational instead.
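The “balanced carbon budget” point reduces to a one-line mass balance: the observed atmospheric rise is smaller than human emissions, so nature as a whole must currently be a net sink, not a net source. A sketch with illustrative round numbers for the early 2010s (my assumptions, not figures stated in the thread):

```python
# Back-of-envelope mass balance behind the "balanced carbon budget" argument.
# Illustrative round numbers (my assumptions):
EMISSIONS = 36.0      # GtCO2 per year from fossil fuels and cement
RISE_PPM = 2.0        # observed atmospheric CO2 rise, ppm per year
GTCO2_PER_PPM = 7.81  # mass of CO2 corresponding to 1 ppm in the atmosphere

rise_mass = RISE_PPM * GTCO2_PER_PPM  # ~15.6 GtCO2/yr stays in the air
net_natural = EMISSIONS - rise_mass   # ~20 GtCO2/yr absorbed by nature

print(round(rise_mass, 1), round(net_natural, 1))
```

Since `net_natural` comes out positive, the natural system is removing carbon on net, which is hard to square with it simultaneously being the source of the rise.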

    • @Jim D…

      I’ve argued with people who blame outgassing. It’s as pointless as arguing with you. I’m talking about flow of carbon into/out of real sinks and sources, which include two pumps for surface carbon into the depths, each of which is dependent, in a probably very non-linear manner, on pCO2. Also smaller storage areas, including peat bogs, lake and river bottom deposits, forest wood, root mass, etc. All of which are also dependent on pCO2, among a host of other things again in a very non-linear manner.

    • AK, and so you assert that the way the Keeling curve matches emission rates over the last 50 years is just one big unexplainable coincidence, and you seem quite sure of that too.

    • Jim D,

      You didn’t answer any of my questions. Instead you restated your beliefs and emotions. So you have confirmed that your beliefs are not based on rational analysis, but just on the emotional beliefs of others. For example you said:

      You may not think that a continuously rising sea level is an important inconvenience, but others do and would prefer not to have it, given a choice.

      But that is irrational. We know the damage cost of a 0.5 m sea level rise is trivial: about $0.2 trillion out of $35,000 trillion in total GDP to 2100.

      People would like all sorts of things given a choice and without considering the consequences of their actions. The consequences of the mitigation policies advocated to date would be costly, reduce the rate of improvement in human wellbeing, and for no measurable benefits. You seem to not recognise these things, so I cannot give your views and opinions much credence.

    • Peter Lang, if you can rationalize how a 4 C global average rise in temperature in the space of a century is good instead of bad, you will be getting somewhere, but so far no one has managed to do that, and we only see lots of negative estimates. You should try to counter them with facts, because otherwise it just looks like you are talking hopes and wishes.

    • Jim D

      No, Jim, you are wrong

      650 ppmv by 2100 (87 years from today) takes into account the projected growth in population (UN estimates) plus an increase in overall average per capita CO2 generation of 30% resulting in a total cumulative CO2 generation from today to 2100 of around 4000 GtCO2.

      Run the figures yourself.

      Sure, fossil fuels will still be “left in the ground” by 2100.

      If we believe WEC 2010 estimates, there are enough in the ground to get us to a maximum of a bit less than 1000 ppmv when they are all gone, but that will not happen by 2100 (if ever).

      Max
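Max’s 650-ppmv figure can be roughed out in a few lines. This is only a hedged sketch: the conversion factor and the airborne fraction below are my assumptions, not figures Max states. Roughly 7.81 GtCO2 corresponds to 1 ppmv of atmospheric CO2, and historically only about 45% of emitted CO2 has remained airborne.

```python
# Rough sketch of the 650-ppmv arithmetic above (assumptions mine):
# ~7.81 GtCO2 would raise atmospheric CO2 by 1 ppmv if it all stayed
# airborne, and historically only ~45% of emissions have stayed there.

GTCO2_PER_PPM = 7.81      # ~2.13 GtC per ppmv x 3.664 GtCO2 per GtC
AIRBORNE_FRACTION = 0.45  # rough historical average, assumed constant

def ppm_in_2100(start_ppm, cumulative_gtco2):
    """Project the 2100 CO2 concentration from cumulative emissions."""
    return start_ppm + AIRBORNE_FRACTION * cumulative_gtco2 / GTCO2_PER_PPM

print(round(ppm_in_2100(395.0, 4000.0)))  # ~625 ppmv
```

With 4000 GtCO2 of cumulative emissions this lands near 625 ppmv, in the neighborhood of the 650 ppmv quoted; a different assumed airborne fraction shifts the answer accordingly.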

    • Jim D

      Looks to me like you are getting carried away in hysteria in your exchange with Peter Lang on future GH warming.

      So let’s see if I can get you back on the ground.

      IPCC has a “worst case” business as usual scenario RCP8.5, which projects around 1000 ppmv CO2 by 2100 and an increase in temperature of 3.7C above current level.

      This scenario ASS-U-MEs that human CO2 emissions will exceed 100 GtCO2 per year by 2100, three times today’s level, with a population that has increased by 1.5 times, so a doubling of the per capita CO2 generation of today.

      This appears overly exaggerated, since it would mean that the entire world population averages the same per capita CO2 generation as the industrialized nations (N. America, W. Europe, Japan, AUS/NZ) have today.

      The per capita emissions of the industrialized nations are decreasing steadily today, and it is highly unlikely that the poorest nations will reach this same level by 2100, so this does not make sense.

      A more reasonable estimate is that the world per capita average increases by 30% and total emissions reach twice today’s level, or around 66 GtCO2 per year. That would get us to around 650 ppmv by 2100 and, using the same IPCC 2xCO2 TCR as the “worst case”, a warming of 2C over today’s temperature.

      And if we do some “no regrets” initiatives (primarily new nuclear instead of coal), we could get this down by 60 to 80 ppmv, with a reduction of warming of up to 0.6C.

      All of this ASS-U-MEs that the IPCC 2xCO2 CS is not exaggerated. If it is (and this appears likely in view of the many recent observation-based studies out there), then the projected warming would be lower.

      So rejoice, Jim!

      Max
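Max’s comparison can be checked with the usual logarithmic-forcing approximation, warming = TCR × log2(C/C0). This is only a sketch: the ~395 ppmv present-day baseline and the idea of backing a TCR out of the quoted RCP8.5 numbers are my assumptions, not Max’s stated method.

```python
import math

def warming(c_final, c_start, tcr):
    """Logarithmic CO2 response: warming = TCR * log2(C_final / C_start)."""
    return tcr * math.log2(c_final / c_start)

# Back out the TCR implied by the RCP8.5 figures quoted above
# (1000 ppmv and 3.7C over today, with today taken as ~395 ppmv).
tcr = 3.7 / math.log2(1000 / 395)
print(round(tcr, 1))                     # ~2.8 C per doubling

# Apply the same implied TCR to the 650-ppmv scenario
print(round(warming(650, 395, tcr), 1))  # ~2.0 C, matching the comment
```

The same formula makes it easy to see how the projected warming scales down if the assumed sensitivity is lower, which is Max’s closing point.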


    • Jim D

      Keeling curve matches temperature?

      Not really, Jim.

      The Keeling curve is based on ice core data prior to 1959, so should be taken with a grain of salt.

      But it shows a steady slow increase, with a slight acceleration toward the late 1950s, when Mauna Loa measurements start.

      Over this period we have multi-decadal cycles of warming and slight cooling, lasting about 30 years each, with no apparent correlation at all with the steadily increasing CO2 curve. Temperature goes up by 0.5C from 1910 to 1940, with very little increase in CO2, and then when CO2 increase starts to accelerate after WWII, temperature starts to cool slightly. The late 20thC warming cycle fits accelerated CO2 increase pretty well with another 0.5C warming, but since around 2001 it has again started to cool slightly, despite unabated CO2 emissions and levels reaching record heights.

      So the overall CO2 temperature correlation is statistically not very robust. And where there is no robust statistical correlation, the case for causation is weak, Jim.

      Max

    • There was good correlation between temperature rise and CO2 rise only in the last quarter of the last century, and not before and not since.
      =====================

    • @Jim D | December 13, 2013 at 12:08 am |

      AK, and so you assert that the way the Keeling curve matches emission rates over the last 50 years is just one big unexplainable coincidence, and you seem quite sure of that too.

      The “way the Keeling curve matches emission rates over the last 50 years” is actually a very rough correlation, on a narrow range of time-scale, and what I’m asserting is that there is some unquantifiable probability that human emissions aren’t the primary cause. The general correlation is with the roughly exponential growth of the mechanization with the Industrial Revolution, and there is a variety of other mechanisms that might be involved.

      Of course, the smart money would bet on emissions, but it would be wise, IMO, to hedge those bets.

      The general picture of the link between emissions and the increase in atmospheric CO2 concentration is certainly correct. We know empirically enough about the changes in the amount of CO2 in the atmosphere, oceans, vegetation and soil to be sure of that. Arguments like those of Salby totally lack validity, and are so badly in contradiction with this knowledge that they can be dismissed.

      Having said that, the quantitative knowledge, as well as the accuracy and reliability of the carbon-cycle models, is not very good. That’s manifested most clearly by the recent multi-model analysis of Joos et al that’s used as the main source for IPCC AR5. The figures of that paper show, as an example, that the 5%-95% confidence limits for the fraction of a sudden pulse left in the atmosphere after 100 years are 28%-52% (as read from Fig. 1). This is a really wide range for a quantity supposed to be known very well.

      The uncertainty builds up largely within 20 years of the release. Thus estimates of the increase in concentration are already inaccurate over periods this short.
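The pulse-decay behaviour can be illustrated with the multi-model mean impulse-response fit from Joos et al. (2013). The coefficients below are quoted from memory and should be treated as indicative; the point is the shape of the curve, namely that a best-estimate fit leaves roughly 40% of a pulse airborne after a century, near the middle of the 28%-52% range cited above.

```python
import math

# Multi-model mean impulse-response fit from Joos et al. (2013), with
# coefficients as I recall them (treat the exact digits as indicative).
A0 = 0.2173  # fraction that effectively remains airborne "forever"
TERMS = [(0.2240, 394.4), (0.2824, 36.54), (0.2763, 4.304)]  # (a_i, tau_i yr)

def airborne_fraction(t_years):
    """Fraction of a CO2 pulse still in the atmosphere after t years."""
    return A0 + sum(a * math.exp(-t_years / tau) for a, tau in TERMS)

print(round(airborne_fraction(100), 2))  # ~0.41, inside the 28%-52% range
```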

    • Matthew R Marler

      Jim D:
      You may think that global average temperatures can’t increase by 4 C at 700 ppm, and do it quickly, but that is more likely than not to happen, and continental areas would continue to warm to those levels faster because they are already at a rate to do that.

      What is the rate of warming as a function of CO2 concentration? At the surface, land vs water, Equator vs poles? In the deep sea? If 2C is the TCS to doubling, and 3C is the ECS, how much time passes between doubling and a 1.9C increase? How much time passes between a 1.9C increase and a 2.9C increase?

      Rates are what tell us how much time we have to mitigate and adapt, if the bad-case scenarios are accurate.

    • thisisnotgoodtogo

      Jim D | December 12, 2013 at 11:34 pm |

      “AK, those people don’t have a balanced carbon budget. The ocean can’t outgas and acidify at the same time, but they don’t notice this discrepancy. The Keeling curve looks as manmade as can be.”

      Jim D, I reminded you right at the start about what you were forgetting.
      It was only a matter of time before you forgot what you forgot.

      You forgot again, Jim D.
      “…except for The Pause, which we attribute to other things, things like this’n’that, ‘n’sun, ‘n’ ocean – stuff which never behaved like that until The Pause hit”.

      Uh huh! Very agreeable.
      Jim D | December 13, 2013 at 12:35 am |

      thisnot, that is just the point. The CO2 burn rate and rise rate are independent of those things and are quantifiable. If you want crisp numbers this is it. Do we want 700 ppm or 500 ppm? This choice governs a burn rate.

  10. I can give a perfect example of the value of uncertainty with regards to this notion: “Yet these inadequacies matter when computerised models generate pseudo-realities of their own through structures such as financial derivatives and processes such as algorithmic trading. Like Frankenstein’s monster, we have already seen financial market pseudo-reality take on an uncontrolled, unstable and dangerous life of its own, all the more beguiling when it generated income for all parties in the merry-go-round. Despite its manifest failings, it is still going on.”

    That example’s name is Myron Scholes. He is the co-creator of the pricing formula that made modern futures and options trading possible; today we have a $1.2 quadrillion (plus) derivatives market that dwarfs world GDP. http://www.dailyfinance.com/2010/06/09/risk-quadrillion-derivatives-market-gdp/

    Scholes went on to form Long Term Capital Management, using algorithmic computer models to invest. The result: “The fund, which started operations with $1 billion of investor capital, was extremely successful in the first years, with annualized returns of over 40%. However, following the 1997 Asian financial crisis and the 1998 Russian financial crisis the highly leveraged fund in 1998 lost $4.6 billion in less than four months and failed, becoming one of the most prominent examples of risk potential in the investment industry.” Alan Greenspan ended up bailing them out, saying they were too large to fail.

    Today the banking industry is heavily involved in selling put and call options and futures on margin, the humongous market made possible by Scholes, Black, and Merton’s Nobel-prize-winning equation.
    http://en.wikipedia.org/wiki/Myron_Scholes
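For the curious, the Black-Scholes formula at the heart of this story is short enough to sketch. A “crisp number” comes out, but it is only as good as the constant-volatility and log-normal assumptions fed in; the numbers in the example are hypothetical.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(s, k, r, sigma, t):
    """Black-Scholes price of a European call option.

    s: spot price, k: strike, r: risk-free rate,
    sigma: volatility, t: time to expiry in years.
    """
    d1 = (math.log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

# Textbook example: at-the-money call, 5% rate, 20% volatility, 1 year
print(round(bs_call(100, 100, 0.05, 0.2, 1.0), 2))  # ~10.45
```

Note that sigma, the volatility, is the one input that cannot be observed directly; the model returns a crisp price while the uncertainty is hidden inside that single estimated parameter.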

    • Here’s an article written shortly after the Long Term Capital Management fiasco. Rereading it, after all this time, brings up memories.

      • AK
        From your article “Privatizing The Regulatory Environment”, it sounds like that person had a pretty good idea! As it is, powerful bank lobbies seem to prevent or pervert any regulation, public or private. I don’t know what will become of it, and it does do away with normal economic inflation by sucking up every drop of monetary enhancement, but it is definitely not your grandfather’s economy anymore.

        Thanks for the read.

      • Interesting you had that on your blog over a year ago :-)

    • @ordvic…

      Glad you enjoyed it.

      As it is, powerful bank lobbies seem to prevent or pervert any regulation, public or private.

      Perhaps the answer is Counterparty Surveillance

  11. Oops, it’s suddenly become ungraceful to brag on my innumeracy.
    ==========

  12. As an engineer, I like my numbers crisp, the crisper the better.

    I wonder how many would let their children fly on an airliner designed by this man…

    I suppose this is what is referred to as “Post-Normal Science” AKA the “Revenge of the Bolloxologists”, something to do with their physics envy (I nearly typed something else there!).

  13. ” crisp numbers”

    Lovely. I like the kineticist’s term “very pretty rubbish”.

  14. A fan of *MORE* discourse

    On-line and free-as-in-freedom:

    Computer Science Theory
    for the Information Age

    by John Hopcroft and Ravi Kannan

    We have written this book to cover the theory likely to be useful in the next 40 years […] one of the major changes is the switch from discrete mathematics to more of an emphasis on probability and statistics.

    Some topics that have become important are high-dimensional data, large random graphs, and singular-value decomposition, along with other topics covered in this book.

    Aye Climate Etc lassies and laddies, *here* are the Keys to the Kingdom … the kingdom of 21st century science, economics, and engineering.

    Confidence-from-complexity? These folks teach the nuts-and-bolts.

    Written by two masters-of-the-art. For free. Have fun!


    • That isn’t science, that is photographing your stamp collection.

      When are you going to write a program to design a chemotherapeutic?
      It would be a lot cheaper to reopen the asylums, fill them with computers, and just let the reality-challenged wander in and stay.

  15. Heh, why doesn’t he just say the models are crispy critters? i did like ‘all the more beguiling when it generated income for all parties in the merry-go-round’ speaking of the financial markets but with a sideways look at climate science and energy policy, blinking at the splendor of climate and energy finance.
    ============

  16. –Confusing science with removing uncertainty

    Well, that is interesting – because it seems to describe the fallacious thinking of some “skeptics” who seem to think that the analysis of climate change isn’t science because of the integral uncertainty.

    • Some, maybe, but the general approach of lumping everything into one uncertainty range is pretty odd. The actual “forcing” of CO2 can be estimated more accurately than aerosols, clouds, etc., but “sensitivity” has everything included, creating a huge mess. There is no valid reason to assume solar forcing should have the same weight as CO2, or aerosols the same weight as albedo. So they took an already non-linear problem and made it basically unsolvable in an attempt to “simplify”.

    • Perhaps the handling of uncertainty isn’t comprehensive. I don’t understand the science well enough to say, although I wouldn’t doubt it (a lot of people have trouble fully engaging with uncertainty). But despite what “skeptics” say, climate scientists do acknowledge uncertainty.

      Some “skeptics,” on the other hand, seek to exploit uncertainty to hold climate science to an unscientific standard, so they can confirm their biases and ironically argue that climate science isn’t a science because it doesn’t meet that unscientific standard.

    • Joshua, “Some “skeptics,” on the other hand, seek to exploit uncertainty to hold climate science to an unscientific standard.”

      “Warmists” exploit uncertainty as well and are actually better at it. Heck they even have the anti-uncertainty defense shield, The Merchant of Doubt.

    • > Heck they even have the anti-uncertainty defense shield, The Merchant of Doubt.

      Are you suggesting that uncertainty is the 2.0 version of the doubt product, Cap’n?

    • Cap’n –

      Let me be more specific. Consider the diatribes here, or at someplace like WUWT, against the use of conditionals when climate scientists talk about projections. Or, consider the way that some “skeptics” habitually ignore or misrepresent stated conditionals when they characterize what climate scientists say. My favorites include the ubiquitous references to that statement by that guy about the future snowfall in England, the ubiquitous and righteous denouncements of all those climate scientists who say that “the science is settled,” the way that Mojib Latif’s statements about projections were deliberately distorted into “climate scientist says that the globe is cooling,” etc.

      Fundamentally, those examples are a rejection, on the part of those “skeptics,” of the importance of uncertainty.

    • Joshua, “My favorites include the ubiquitous references to that statement by that guy about the future snowfall in England.”

      That is a good example. The person making the “never know snow” statement was a scientist with the MET who was wrong. He never should have made that statement with his “science” hat on. He screwed up, and the MET is responsible, since politically crap initially flows uphill. The MET guy “used” his position to inspire concern. His statement is now “bulletin board” material. One of the reasons NASA and the Bush administration wanted Hansen to chill out is perfectly illustrated by the MET situation. Professional organizations should have policies in place to limit announcements, but that is considered conspiratorial by warm and fuzzy idealists who want their freedom of speech but don’t mind if yours is limited.

      The MET dealt the hand and just has to fold on that one. You can continue to whine about that or try to actually play the distant observer.

    • Joshua, I’m wondering what you’re referring to by “that statement by that guy about the future snowfall in England”. Which statement, which guy, and which stated conditionals? There is an Independent article that refers mainly to David Viner, who appears to have no doubts. “Children just aren’t going to know what snow is”, etc. So are you referring to some other statement? Or was Viner misquoted, perhaps?

    • > That is a good example.

      If only because it allows Cap’n to escape for his Uncertainty as Doubt 2.0 blunder.

    • Dagfinn-

      Actually, I meant to reference how “skeptics” have dealt with that guy who said that stuff about an “ice-free arctic.”

      But let’s roll with this for a while anyway…

      Google David Viner and statement on snowfall in the UK, and you will return a veritable who’s who of climate “skepticism.” Google page after Google page of references.

      What time frame was Viner referencing? Do you know?

      Was that an off-the-cuff statement to someone in the press, or was it consistent with how he represented certainty about short- and long-term trends w/r/t weather events?

      Was it more a statement of long-term trends w/r/t climate?

      Does continued snowfall in the UK disprove a warming effect to ACO2 emissions? Is this, perhaps, “skeptics” using short-term weather phenomena as a facile rhetoric to address the more complicated issue of long-term climate change?

      How characteristic was that one statement by one scientist in the full context of what the “consensus” view is w/r/t uncertainty in short- and long-term climate trends, or snow in the UK? How characteristic was that statement in the full context, even, of Viner’s full body of work?

      So instead of comprehensively assessing and then representing the full context, “skeptic” after “skeptic” lines up to make the argument that one error – even if it can truly be described as an error (when the full context is assessed) – is a justification for dismissing the full body of work of climate scientists who work hard to quantify uncertainty (which is true regardless of whether you agree with the details of their quantification).

      This is a mindset that holds opposing viewpoints to a standard of perfection, equates “science” with removing uncertainty, and then discounts climate science because not all statements by all climate scientists are perfect.

      And now, (only if you’re ready), we could move on to statements from climate scientists about an ice-free Arctic, and how they have been dealt with by some “skeptics.”

    • Joshua, I agree with you that such statements are useless as a guide to what the scientific consensus might be, if there is one. However, as an example of how climate scientists speak to the media, it’s quite interesting although perhaps extreme. It’s active disinformation on Viner’s part, unless he did what he could to rectify it soon afterwards.

    • Meh, Julia Slingo @ the Met Office is still pitifully trying to defend the mindset that led to Viner’s hilarious words. Y’all just add to the hilarity and the pity.
      =========

    • “How characteristic was that one statement by one scientist in the full context of what the “consensus” view is w/r/t uncertainty in short- and long-term climate trends, or snow in the UK? ”

      If you’re going to argue about context, at least read the link that he provided for you. In summary, it was an article written in the year 2000 in which climate scientists affirmed – based on the consensus of their work – that global warming was responsible for the trend of snow-less winters and confidently predicted that the trend would continue and get worse.

      But yes, in general, the complete utter failure of confident predictions based on a short-term warming trend that has all but disappeared has, indeed, resulted in skepticism (without the scare quotes.) It takes a fake “realist” to claim this is all too complicated and nuanced to say for sure.

    • Alla children of Egypt holla ‘Alla Dr. Viner, alla the time.’
      ====================

    • More “context”: it was the fact that we were bombarded with Viner-esque propaganda for more than a decade that makes it so much fun to point out that it’s snowing in Cairo for the first time in 100 years. http://www.latimes.com/world/worldnow/la-fg-wn-snow-israel-egypt-20131213,0,1691393.story#axzz2nMzV6vMp
      It’s fun to point out because of all the little fake “realists” who will rush about saying that a weather event means nothing in a climate context* (unless it’s a hurricane, typhoon, drought, heatwave or tornado or some other politically exploitable situation that doesn’t too obviously contradict everything said to date.)

    • Dagfinn –

      So there we go. You’re willing to take the step to discuss his statement with an appropriate standard. This is where skepticism (minus the quotation marks) resides.

      But I do take issue with “active disinformation” without further context, as such a phrasing suggests to me a knowledge of intent that I assume you don’t have.

      What time frame was he referencing? Was he intending to speak to long-term changes?

      Did he fail to represent uncertainty w/r/t long-term trends in snowfall concurrent with continued accumulation of ACO2? Perhaps. I imagine that more careful analysis of that among climate scientists reflects uncertainty.

      If someone says something that is willfully exploited by those who use uncertainty as a rhetorical weapon, is he then responsible for correcting the record? Seeing the reaction to his prediction, should he have rectified his statement soon afterward?

      Sure, I’ll go with that. I’d call it an error in judgement, and poor scientific practice.

      Does it become “active disinformation” if it is a case where he didn’t rectify a previous statement because his intent was willfully exploited?

      I don’t think I can go with that. How about you?

  17. I’m curious. When did we switch to using the word ‘uncertainty’ instead of using the word ‘accuracy’? The word ‘uncertainty’ produces a negative emotional response in folks not used to statistics terminology. The two words may be used interchangeably in most physical contexts. Why did we switch away from using a word with a positive emotional ring to it: ‘accuracy’?

    • Gary

      Accuracy has a specific definition in science, as the difference between the measured value and the true value of a data point. In most cases this is actually unknowable, although some reassurance on the accuracy of a method can be obtained by undertaking measurements against some certified ‘known’ or at least by calibrating apparatus in such a manner that it can be traced back to comparison with such a known value.

      Uncertainty is a considerably more vague term, including some elements of accuracy, precision (i.e. the spread of results obtained for replicate measurements) and measurement resolution (an issue in climate change studies, where the putative changes in, for example, sea temperatures are an order of magnitude lower than the reliability of any individual measurement). As such, uncertainty is the correct (or at least a much better) term to be using in this context.

    • Gary, you write “I’m curious. When did we switch to using the word ‘uncertainty’ instead of using the word ‘accuracy’?”

      Your question goes to the very heart of the shell game the warmists are playing; you have to be careful to spot where the pea is. Simplistically, according to the scientific method, developed by Galileo and Newton, first you do hypothetical estimations to arrive at the numeric value of a parameter, and then you go out and measure it. When you measure the value of the parameter, you automatically get the accuracy with which the measurement was made. You have the accuracy, and you know how certain you can be. Until you have made the measurement, it is wrong and unscientific to claim that you know the value of the parameter.

      With CAGW, it is impractical to actually MEASURE the key number, namely climate sensitivity; we cannot do controlled experiments on the earth’s atmosphere. So the warmists have pulled a fast one; they claim that there is no need to actually measure climate sensitivity, as the scientific method demands; we can be 95% certain about things, just by asking people’s opinions.

      Now since we do not know the accuracy of the numeric value of climate sensitivity, then the warmists have to fudge the issue, and talk about uncertainty. That is why we have changed from talking about accuracy, and talk about uncertainty.

      When you concentrate on the lack of empirical data in the study of CAGW you get to understand the answer to your question.

    • Ian B.
      Sorry, you don’t get off that easy. Once upper and lower limits on your ‘uncertainty’ are stated, you have made an accuracy claim. When you say something like ‘with an uncertainty of plus or minus 0.1 degree Celsius’, you have claimed that you expect the value to be no more than 0.1 degree above or below the value stated. How would that not be an accuracy claim?
      Or are you saying that claiming an “uncertainty of +/- 0.1 degree Celsius” should be interpreted as “we don’t know how accurate our number is but we are guessing it is +/- 0.1 degree Celsius”? If so, do you really expect people outside the climate science field to realize that is what you really mean?

    • “At the turn of the 17th century, Galileo rolled a brass ball down a wooden board and concluded that the acceleration he observed confirmed his theory of the law of the motion of falling bodies. Several years later, Marin Mersenne attempted the same experiment and failed to achieve similar precision, causing him to suspect that Galileo fabricated his experiment.

      Early in the 19th century, after mixing oxygen with nitrogen, John Dalton concluded that the combinatorial ratio of the elements proved his theory of the law of multiple proportions. Over a century later, J. R. Partington tried to replicate the test and concluded that “…it is almost impossible to get these simple ratios in mixing nitric oxide and air over water.”

      At the beginning of the 20th century, Robert Millikan suspended drops of oil in an electric field, concluding that electrons have a single charge. Shortly afterwards, Felix Ehrenhaft attempted the same experiment and not only failed to arrive at an identical value, but also observed enough variability to support his own theory of fractional charges.”

    • Steven, you write “Shortly afterwards, Felix Ehrenhaft attempted the same experiment and not only failed to arrive at an identical value, ”

      I have no idea what your post means. It seems to be irrelevant to everything. However, Millikan used the wrong value for the viscosity of air, so his measurement of the charge on the electron was wrong.

    • Gary,
      Suppose you have a room with 100 people and you want to know their mean weight. You take a random sample of 10 of them, and the sample mean is 167.5 pounds. Is 167.5 an accurate estimate of the mean weight of those in the room? It is a vague question: it depends on what you mean by “accurate”. If accuracy means “within a pound, plus or minus” it is a bit more clear, but there is no way that you can give a simple yes or no answer to the question, because a lot depends on how well a random sample of size 10 can reflect the properties of a population of size 100. Under some assumptions you can construct a confidence interval for the desired quantity, but the resulting statement about the accuracy of your sample estimate will have uncertainty attached to it. This example may seem oversimplified in the context of climate model parameters that are estimated from data, but it isn’t.

    • Jim C of course you have no idea.

    • John Carpenter

      “first you do hypothetical estimations to arrive at the numeric value of parameter, and then you go out an measure it.”

      Hmmm, so…. theory informs… gives an idea…. points the direction of what value to expect when you finally measure?

      “When you measure the value of the parameter, you automatically get the accuracy with which the measurement was made. You have the accuracy, and you know how certain you can be. Until you have made the measurement, it is wrong and unscientific to claim that you know the value of the parameter.”

      Jim, How do you get this ‘automatic’ accuracy when you measure a value of a parameter? Are you saying accuracy is obtained by comparing the estimated value of the parameter vs the measured value?

    • John, you write “Jim, How do you get this ‘automatic’ accuracy when you measure a value of a parameter? Are you saying accuracy is obtained by comparing the estimated value of the parameter vs the measured value?”

      I suspect you did not do Physics 101. It is over 65 years since I did, and I cannot prove what I am writing, or even give a reference. But we did practical physics labs. The first lesson is that whenever you measure a quantity, you ALWAYS give a +/- value. No exceptions. If you present your results with no +/-, you automatically get 0 out of 100. If you quote a wrong +/- you could still earn up to 95.

      If you understand basic physics, you know that the +/- is always there. I defy you to provide ONE example from the whole history of physics, where a proper measurement has been made, and it does NOT include a +/-. Start with the recent measurement of the Higgs boson.

      Maybe there are other denizens of CE who can provide a better explanation than I can.
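      For readers who did miss Physics 101, the standard lab rule for combining independent sources of error into a single +/- is root-sum-square (quadrature). A minimal sketch in Python, with invented numbers:

```python
import math

def combine_in_quadrature(*sigmas):
    """Root-sum-square of independent 1-sigma uncertainties."""
    return math.sqrt(sum(s * s for s in sigmas))

# Hypothetical reading: ruler resolution +/- 0.5 mm, alignment +/- 0.2 mm
total = combine_in_quadrature(0.5, 0.2)
print(f"length = 125.0 +/- {total:.2f} mm")  # never a bare number
```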

    • Bob K,
      I agree with you up to the point where someone provides upper/lower bounds on the uncertainty of the calculated value. Notice how many words you took to explain why the value may not be accurate. In fact, you didn’t even attempt to provide an accuracy or ‘uncertainty’ range, and you were perfectly correct not to do so. Unfortunately, I see time and again people using the word ‘uncertainty’ while supplying range bounds they cannot justify.
      The term accuracy is not limited to measuring-instrument accuracy but includes all factors that can degrade accuracy. If you cannot calculate the numeric value of that degradation, then you must supply a reasonable upper limit and note that it is an estimate. In real-world measurements, overall reading accuracy is always worse than measuring-instrument accuracy. We often trust that accuracy degradation is relatively minor for informal purposes, but that is not an option when the data are used for safety-related values. Either the measurement data or the calculations based upon that data (or both) must be validated against the real world. That validation process is often more expensive and difficult than collecting the original data.
      So, if you don’t know how accurate your number is, don’t claim it doesn’t matter just because your value was appended with ‘uncertainty’ bounds.

    • Jim,

      In Physics 101 the experiments are straightforward and presenting the error estimates almost trivial. Your error is that you seem to imagine all good science is equally straightforward. In real, perfectly valid science, many things are done that go far beyond Physics 101. Your comments are written as if you did not know that.

    • Pekka, you write “In Physics 101 the experiments are straightforward and presenting the error estimates almost trivial.”

      I know that. It does not detract from what I am trying to say to Bob. We were being taught HOW to make measurements. And the how ALWAYS includes the +/-. Surely you can confirm that I am right. Are you saying that you can make a valid measurement in physics and NOT have a +/-?

    • Gary

      I’m not really sure what you were complaining about re my earlier comment. Your comment regarding the uncertainty range is much more closely related to issues of precision than accuracy. In most cases you can’t really estimate how close your central value is to being correct (i.e. accurate in the scientific sense), and your uncertainty range is based on the precision of your measurements. If you suspect there is a bias in your results, or that you have insufficient measurements to allow confidence in your statistical processing of the uncertainty range, it would be feasible to increase the range you report based on expertise / expectation, but then we’re getting into a whole other discussion, and one where the word Bayesian will probably rear its head (and go straight over mine).

      To use a hypothetical climate science example – A GCM is constructed that, when run multiple times with what are considered reasonably plausible forcings, gives a range of sensitivities of 3 +/- 1 deg C for a doubling of CO2. Clearly, the model is giving results with uncertainty of +/-1 degree, but does that tell us anything about the accuracy of the model output (in the sense of how accurately it is reflecting the real world, not just model space)? Only ‘ground truthing’, by waiting to see what actually happens, is going to allow an assessment of that. (I’d take the under, based on recent observational evidence).
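      The distinction between model-space spread and real-world accuracy can be made concrete with a toy ensemble (everything here is invented, including the “true” value): the scatter of runs measures precision within model space and says nothing about the ensemble’s bias against reality.

```python
import random
import statistics

random.seed(2)
TRUE_SENSITIVITY = 2.0  # hypothetical real-world value, deg C per doubling

# A toy ensemble of 50 model runs scattering around 3 deg C.
ensemble = [random.gauss(3.0, 0.5) for _ in range(50)]
mean = statistics.mean(ensemble)
spread = statistics.stdev(ensemble)

# The spread is knowable from the runs alone; the bias is not, without
# 'ground truthing' against observations.
print(f"ensemble estimate: {mean:.1f} +/- {spread:.1f} deg C")
print(f"bias vs (hypothetical) truth: {mean - TRUE_SENSITIVITY:+.1f} deg C")
```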

    • Chief Hydrologist

      In both engineering and environmental science the most common issue is finding ways to proceed in the absence of complete information. In many cases it is the application of experience and rules of thumb that provides a solution. Learn from experience – especially that of others – or perish.

    • John Carpenter

      “I suspect you did not do Physics 101”

      I did, and then a lot more after that. But that is beside the point. You continue to say you automatically get a +/- attached to a measured value. I don’t contest that, and I agree all measured values should be expressed with an error or estimated error. The question I am trying to get you to answer is: how are those pesky errors derived? You know they are not ‘automatically’ issued with the answer… don’t you? They have to be figured out as well. Error in measurement is not always easy to determine, especially for complex measurements that have never been made before.

      You said, “first you do hypothetical estimations to arrive at the numeric value of a parameter, and then you go out and measure it.” This may not always be the order in which things are done, but when it is, doesn’t theory help inform what answer to expect in experiment? Doesn’t it help give confidence that the measurement made could be accurate? So in the case of something like CS, why are you so stuck on the notion that theoretically derived estimates may as well be worthless since no measurement has been made? (Even though measurements have been made, by paleo/proxy techniques.)

      Again, you said “first you do hypothetical estimations to arrive at the numeric value of a parameter, and then you go out and measure it.” Is this not what is being done? We have an estimated range for likely CS to CO2, theoretically derived by calculations. Some investigators have used measured data (from ice cores, for instance) to also determine CS to CO2. The two values can then be compared. Has it resulted in an absolute final value that everyone agrees with? No. Are the errors in measurement large? Yes. Does it mean we know nothing, or have no idea what the likely CS range is? No… Also, just because a measurement has been made, it doesn’t mean it is correct either… go back and reread Mosher’s comment you didn’t understand.

    • John, you write “Is this not what is being done?”

      No, it is not what is being done. The paleo data is useless since it is impossible to determine the timing with sufficient accuracy. The paleo data, in fact, indicates that the rise in temperature PRECEDES, not follows, the rise in CO2.

      One cannot do controlled experiments of the earth’s atmosphere, and that is why it is impractical to actually measure climate sensitivity. What I object to is the warmists, particularly Steven Mosher, insisting that estimates are as good as measurements. They are not. Estimates are little more than guesses until they are confirmed by measurement.

      My objection is not so much about the value of climate sensitivity. My objection is to the IPCC being 95% certain that claims about CAGW are correct, when there has been no measurement of the CS of CO2. How can you possibly support such downright scientific nonsense? And then to claim that this figure is justified because the majority of climate scientists think it is true. The mind boggles.

      Do you really support the 95% certainty the IPCC quotes in the AR5?

    • Ian B,
      You said “Your comment regarding the uncertainty range is much more closely related to issues of precision than accuracy. In most cases you can’t really estimate how close your central value is to being correct (i.e. accurate in the scientific sense), and your uncertainty range is based on the precision of your measurements.”
      Now that is a tack I had not expected. Uncertainty range is based on the precision of your measurements? So you are saying you believe uncertainty bounds have nothing to do with the accuracy of a stated value. That is certainly not what non-GCM folks understand uncertainty limits to mean. When a statement such as “temperatures are going to rise 2.2 degrees Celsius with an uncertainty range of +3 to +1 degrees Celsius by the end of the century” is made, folks naturally will take that to mean you expect the temperature will rise by 1 to 3 degrees Celsius. They do not take it to mean “our computer output is precise to +/- 1 degrees Celsius but we can’t tell you how accurate our calculated value of 2.2 degrees Celsius is.”

      Precision is how fine a measurement division you can obtain from your instrument, regardless of its accuracy (or inaccuracy). Accuracy is how close the measured value is to its true real-world value. Presenting a precision value in a situation where folks will naturally expect an accuracy value is misleading. Justifying that with the use of the term “uncertainty” is not acceptable. I hope nobody else uses “uncertainty” in this way.
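      The precision/accuracy distinction is easy to simulate: a biased but stable instrument reports a tight spread (high precision) around the wrong value (low accuracy). A minimal sketch, with invented instruments:

```python
import random
import statistics

random.seed(0)
TRUE_VALUE = 100.0

# Precise but inaccurate: a tight spread around a biased reading.
biased = [random.gauss(TRUE_VALUE + 5.0, 0.1) for _ in range(1000)]
# Accurate but imprecise: a wide spread centred on the true value.
noisy = [random.gauss(TRUE_VALUE, 5.0) for _ in range(1000)]

for name, data in (("biased", biased), ("noisy", noisy)):
    print(f"{name}: mean = {statistics.mean(data):.2f}, "
          f"spread (precision) = {statistics.stdev(data):.2f}, "
          f"error vs truth (accuracy) = {statistics.mean(data) - TRUE_VALUE:+.2f}")
```

      A +/- derived from the spread of the “biased” instrument would be tiny, and wholly misleading as a statement of accuracy.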

  18. I quote:

    “We believe the urgent task is to integrate uncertainty and quality into the quantitative sciences of complex systems,…”

    Social science Bull…. gives me worms.

  19. “…these inadequacies matter when computerised models generate pseudo-realities of their own through structures such as financial derivatives and processes such as algorithmic trading. Like Frankenstein’s monster, we have already seen financial market pseudo-reality take on an uncontrolled, unstable and dangerous life of its own…”

    What a stupid, narrow-minded, ahistorical and trendy-crap theory of financial instability. There have been centuries of bubbles, panics and crashes, long before there was algorithmic trading or any seriously well-developed derivatives. If the only empirical example in the abstract (the Abstract, fer cryin’ out loud) is neither necessary nor sufficient for the phenomena that interest the authors, include me out of this wretched amateur-hour pseudo social science. And they sneer at them! What a pair of maroons.

    • NW,
      I commented on the same: http://judithcurry.com/2013/12/12/taylor-and-ravetz-on-the-value-of-uncertainty/#comment-424569

      I do think their take on it reflects a basic misunderstanding of the instruments, but I have, to a degree, shared that view, although it in no way resembles a growth-type bubble to me. It looks to me more like a bingo machine that could keep calling out higher and higher stakes.

      I don’t pretend to know the ramifications of the derivatives markets. In fact I always thought it seemed like the perfect kind of market since, unlike the stock market, it is not predicated on growth but just has a daily win/lose component. I do wonder, however, whether more exposure similar to Long-Term Capital Management, by banks running margins, isn’t a risk of some catastrophic loss that puts the whole system at risk?

    • Coupla remarks.

      (1) This is the most recent review of the centuries of financial mayhem:
      http://www.amazon.com/This-Time-Different-Centuries-Financial/dp/0691152640
      I think anyone who is really interested in the subject will get a lot out of this book whether they agree with all the conclusions or not, because it is packed with history, facts and interesting looks at systematic data.

      (2) We can now create asset market bubbles at will in experimental asset markets. Several of my colleagues were the innovators in this research and they keep at it today. Here is the Ur paper:

      http://teaching.ust.hk/~bee/papers/040918/1988-Smith_etal-bubbles.pdf

      You get bubbles quite reliably without any program trading, without any derivatives, and with an asset whose random dividend process is stationary (that is, it does not grow) and when all the subjects know this at the outset. It follows that neither growth nor program trading nor derivatives are necessary (much less sufficient) conditions for the development of asset market bubbles, at least in a laboratory asset market.

      (3) My reading of the experimental results on asset market bubbles is: As long as we have periods when new groups of traders are coming together in asset markets for the first time, or there is a heavy influx of novice traders into some well-established market, bubbles will bloom and crash. Fundamentally, the issue revolves around beliefs, and beliefs about beliefs, and so on down the common knowledge rabbit hole.

    • ???

      Seems to me that the reference is rather specific – to the significant role in the recent financial crisis, of computer modeling that systematically underestimated uncertainty (risk) w/r/t complex financial securities – not saying that financial instability would only arise from such modeling (i.e., not a generic “theory of financial instability”).

      Perhaps you should put down the shoehorn?

    • Whenever the work of Rogoff and Reinhart is mentioned a warning should be included

      http://www.newyorker.com/online/blogs/johncassidy/2013/04/the-rogoff-and-reinhart-controversy-a-summing-up.html

      I have their book, and it certainly contains a lot of relevant information, but the error that they made in their analysis tells, how difficult it is to draw reliable conclusions on issues this complex.

    • Ah, I see. You were objecting to them insinuating that the black-box trading mechanism is in itself, as they describe, ‘a Frankenstein monster of pseudo-reality taking on a dangerous life of its own’. What you’re saying is that bubbles happen and should not necessarily be attributed to this means of trading as a culprit. I agree with that.

      The one thing that sticks out to me about derivatives trading is that the market itself cannot (by its very nature) become overvalued, as it does not have a valuation like a stock, whose value is determined by its holdings. Derivatives are a straight bet between one party and another and have no value outside of a timed-out pricing that favors one or the other party in the bet. When the transaction is over there is a winner and a loser, but the trade itself has no value outside of the players.

      I do think it has a huge economic impact due to the size of the market. If the banks were not cycling money through the bingo machine where would that money be? I have always supposed it would be in a monetary inflation. So I see that market as an inflation market able to gobble up any governmental money creation.

    • Chief Hydrologist

      To sum up, there may well be a threshold at which high levels of public debt tend to be associated with very bad growth outcomes and financial crises, but it isn’t ninety per cent of G.D.P., or even a hundred per cent. Maybe it’s a hundred and twenty per cent, although that figure isn’t a firm one, either.

      So far, I haven’t mentioned the issue of causation. Do high debts cause low growth, or vice versa? Theoretically, it could go either way. High levels of public debt, through their impact on interest rates and business confidence, can crowd out private-sector spending and reduce growth. But low growth depresses tax revenues and forces the government to spend more on things like unemployment insurance and food stamps. That increases the budget deficit, which necessitates the issuance of more debt.

      Reinhart and Rogoff acknowledge this ambiguity. “Our view has always been that causality runs in both directions,” they said in their Times Op-Ed, “and that there is no rule that applies across all times and places … Nowhere did we assert that 90 percent was a magic threshold that transforms outcomes, as conservative politicians have suggested.”

      From Pekka’s link. The argument seems to be about what levels of debt start to have unfortunate consequences. Economic theory suggests optimum growth with a government sector of about 25% of GDP. High debt tends to inflate the government sector. Debt of course needs to be paid back sometime – creating the austerity at some stage.

      In Australia we have pursued both balanced budgets and management of interest rates to contain inflation within a target range for decades with some success. Ultimately that is all that I – and indeed most of the Australian public – expect from government in economic management.

      Printing money on a scale never seen before is another story entirely.

    • Pekka, Whenever the New Yorker’s criticism of Rogoff and Reinhart is mentioned a warning should be included

      http://www.bloomberg.com/news/2013-04-28/refereeing-the-reinhart-rogoff-debate.html

      It’s important, when asking yourself what you think of Reinhart and Rogoff, to keep in mind that:

      (1) They put their data on the web and helped their potential critics find their errors by sending their own spreadsheet calculations.

      (2) That even when all critical suggestions for alternative analytical procedures are followed, the bottom line relationship between debt and growth holds up; and

      (3) That what the “right” analytical procedure is, depends on the question one wants to answer.

      Betsy and Justin are very clear about all this in the link above.

    • NW,

      Economics is always controversial enough to make economists spread jokes about themselves, and to support fierce fighting between opposing schools.

      It’s sure that most comments on the net, like the two we have seen here, are biased towards either side on the controversy.

      One essential question that’s very difficult to answer is whether high debt levels are more a cause to slow growth than a consequence of the other problems of the economies that have also resulted in high debt levels.

      Economics is a field of research where two persons of opposite views may share the Nobel prize, and where some of the best-known names continue to hold strongly contradictory views on important issues.

    • Pekka,

      My reason for offering up “This time is different” to the denizens was not to promote their results on growth and debt. If this isn’t crystal clear from the context of the offering, I don’t know what could be. The discussion I want to have is about what we know about the causes of financial crises. I am arguing that there is almost nothing generalizable about the claim that the world-wide housing market bubble was caused by program trading or derivatives trading. Everyone thinks this sounds smart and hip. It doesn’t: It sounds irrelevant to anyone with a passing acquaintance with the history of bubbles and crashes.

      It also sounds irrelevant to anyone familiar with how easy it is to create asset market bubbles in laboratory settings, where there is no program trading and (usually) there is no derivatives market.

      The generalized economics snark is tiresome and irrelevant. What, exactly, do you (or anyone else) not buy about the (very large now) experimental evidence on asset market bubbles? If you don’t know, that’s fine, but spare me vague references to some irrelevant controversy between macroeconomists, who face all the same epistemic problems faced by climate scientists. Vast precincts of economics are not macroeconomics.

    • Chief Hydrologist

      The direction of causality is irrelevant and we have moved beyond a debt discussion. It can hardly be doubted that mountains of debt are a problem for many national economies. Missing the main point and relying instead on a claim of complexity and uncertainty hardly constitutes a rational argument at all.

      We are beyond debt – higher debt is unsupportable in much of the western world – notwithstanding claims that just a little more will keep the economy growing in the current emergency. In the US and elsewhere the debt is a generational problem. The party is definitively over. They have nowhere to move on interest rates either – having kept them low enough for long enough to facilitate the housing bubble that led to the 2008 crash.

      We are in the era of ‘quantitative easing’ creating an overhang of inflation and instability. Forecast? It is likely to end badly.

    • Pekka, “One essential question that’s very difficult to answer is whether high debt levels are more a cause to slow growth than a consequence of the other problems of the economies that have also resulted in high debt levels.”

      Not being a Nobel Prize winning economist this may be silly but the quality of the debt is the question. Frivolous debt and unfunded “entitlements” are the dangerous debt which should be limited. I consider a lot of the “green” debt, frivolous.

    • NW

      From what I can tell, bubbles are caused by greedy lambs thinking they can get fat for free and end up getting slaughtered.

    • NW,

      I wrote my comment because the work of Reinhart and Rogoff is, even in economics, an unusually clear example of controversy. Those of opposing views have certainly tried to take advantage of the errors in their research, just as those who liked their conclusions tried to take advantage of those conclusions.

      I have also followed many aspects of the housing bubble and debt crisis. Thus I believe that I understand the technical issues related to mortgage based derivatives including those related to subprime loans. Similarly I do believe that I have an idea on how the rating agencies contributed to misplaced trust in highly rated tranches of credit derivatives (CDOs etc.). (I don’t buy the idea that program trading would have been an important factor in the dominating developments.)

      I have usually not entered discussion of non climate-related economics on this site, as I don’t see this as the right place for that. Perhaps I should stop here as well after this comment.

    • @Pekka Pirilä…

      The one thing that sticks out to me about derivatives trading is that the market itself cannot (by it’s very nature) become overvalued as it does not have a valuation like a stock where it’s value is determined by it’s holdings.

      That totally doesn’t square with my understanding. The derivatives involved in the ’07–’08 crash were very definitely overvalued: they were carried at unrealistically high book values due to underestimation of their risk, in off-balance-sheet instruments that banks then carried as reserves. When foreclosures started popping up, and the risk evaluation was modified to reflect that, their values dropped, forcing banks to find other assets to use as reserves.

      I suppose my understanding could be incorrect, but as far as I can tell, people like Steve Denning at Forbes are saying much the same thing.

    • AK,

      You quote Ordvic, not me.

      The risks of structured derivatives were underestimated, and that led to overpricing those derivatives.

    • AK, derivatives trading. What is it called when you keep doing the same thing and expect different results?

      Since financial derivatives are pretty much intangibles, perhaps the US should short AGW to hedge their green investments? We can call them carbon default swaps.

    • I can’t help but be reminded how certain people believe that a correlation between x and y observed over the past 150 years or so demonstrates that x causes y, while others believe that since x didn’t exist until 150 years ago, but y kept happening over and over long before 150 years, that the recent correlation between x and y might not mean what the first group of people think it does.

      Now change past 150 years to past 10 years.

    • > The risks of structured derivatives were underestimated[.]

      They were ignored a bit too:

      http://www.pbs.org/wgbh/pages/frontline/warning/view/

      ***

      NW,

      Would you side for Eugene Fama or Robert Shiller regarding bubbles?

      http://www.npr.org/blogs/money/2013/11/01/242351065/episode-493-whats-a-bubble-nobel-edition

    • AK,
      The derivatives market I’m talking about and the housing-bubble bank instruments are two different things. I mean the true derivatives market, i.e. the Chicago Board of Trade (the quadrillion-plus market), which involves pricing based on things like the S&P 500, oil, gold, pork bellies etc. for trading put and call options and futures. The bank and institutional derivatives based on housing are an entirely different kettle of fish.

    • Everyone thinks this sounds smart and hip. It doesn’t: It sounds irrelevant to anyone with a passing acquaintance with the history of bubbles and crashes.

      I wonder whether NW is a “skeptic,” and if so, whether he complains about the arrogance of climate scientists.

    • I guess it should be spelled out when talking about derivatives. Property derivatives do have an underlying value and are entirely different from financial derivatives. I don’t know very much about property derivatives but here it is explained:
      http://en.wikipedia.org/wiki/Property_derivatives

      Financial derivatives are pretty much as follows:
      http://en.wikipedia.org/wiki/Derivative_%28finance%29
      Some of the common variants of derivative contracts are as follows:

      Forwards: A tailored contract between two parties, where payment takes place at a specific time in the future at today’s pre-determined price.
      Futures: are contracts to buy or sell an asset on or before a future date at a price specified today. A futures contract differs from a forward contract in that the futures contract is a standardized contract written by a clearing house that operates an exchange where the contract can be bought and sold; the forward contract is a non-standardized contract written by the parties themselves.
      Options are contracts that give the owner the right, but not the obligation, to buy (in the case of a call option) or sell (in the case of a put option) an asset. The price at which the sale takes place is known as the strike price and is specified at the time the parties enter into the option. The option contract also specifies a maturity date. In the case of a European option, the owner has the right to require the sale to take place on (but not before) the maturity date; in the case of an American option, the owner can require the sale to take place at any time up to the maturity date. If the owner of the contract exercises this right, the counter-party has the obligation to carry out the transaction.
      Binary options are contracts that provide the owner with an all-or-nothing profit profile.
      Warrants: Apart from the commonly used short-dated options, which have a maximum maturity period of 1 year, there exist certain long-dated options as well, known as warrants. These are generally traded over-the-counter.
      Swaps are contracts to exchange cash (flows) on or before a specified future date based on the underlying value of currency exchange rates, bonds/interest rates, commodities, stocks or other assets. Another term commonly associated with swaps is the swaption, which is basically an option on a forward swap. Like call and put options, a swaption comes in two kinds: a receiver swaption and a payer swaption. A receiver swaption is an option to receive fixed and pay floating; a payer swaption is an option to pay fixed and receive floating.
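      The “right but not obligation” clause in the list above is exactly the asymmetry between forward and option payoffs at expiry; a minimal sketch (all prices invented):

```python
def forward_payoff(spot, delivery_price):
    """Forward: an obligation, so the payoff can go negative."""
    return spot - delivery_price

def call_payoff(spot, strike):
    """Call option: a right, so the payoff is floored at zero."""
    return max(spot - strike, 0.0)

def put_payoff(spot, strike):
    """Put option: the mirror-image right to sell."""
    return max(strike - spot, 0.0)

# Compare payoffs at three expiry prices against a 100 strike/delivery price.
for spot in (80.0, 100.0, 120.0):
    print(spot, forward_payoff(spot, 100.0),
          call_payoff(spot, 100.0), put_payoff(spot, 100.0))
```

      The option holder pays a premium up front precisely because the downside is capped at zero while the forward holder’s is not.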

      This is an interesting take on Financial Derivatives:
      http://www.dailyfinance.com/2010/06/09/risk-quadrillion-derivatives-market-gdp/

    • NW is on the correct page here. For a readable five-year-old popular article on the bubbles literature I recommend (with only minor bias)

      http://www.theatlantic.com/magazine/archive/2008/12/pop-psychology/307135/

      The cheap shots at Reinhart and Rogoff, who played along with the new rules of ultra-open science, corrected their mistakes, and found almost the same answers as their critics to the same questions, are examples of no good deed going unpunished.

    • Willard asked “Would you side for Eugene Fama or Robert Shiller regarding bubbles?”

      I’m the wrong person to ask about The Evil Fama, but the old bastard does have a point that his winged monkeys have already typed out everything Shiller ever wrote. Sorry, this is a Chicago thing that is impossible to explain to outsiders. Shiller talks a lot of sense. It would be nice if someone would build an expert system Shiller. Then we could have The Oracle forever. But The Oracle needs to write down somewhere, in plain language, how we can do it too.

      Finance is the last redoubt of incredibly stubborn resistance to the Experimental Borg Collective. They will not comply. They do not wish to be assimilated. What else can I say?

  20. I could be persuaded to go along with this whole “internal” “uncertainty” thing for a price but I wouldn’t do it for free.

  21. JC says: “It’s been a while since we’ve had an uncertainty post.”
    _______
    But not long enough. Ha Ha !

    In science uncertainty is a matter of degree. Here are the only two things I need to know about uncertainty:

    1. 1% uncertainty doesn’t mean everything is known.

    2. 99% uncertainty does not mean nothing is known.

    Judith Curry is obsessed about uncertainty. It’s like an OCD. Oh well, there are worse obsessions.

    BTW, I ain’t perfect.

    • David Springer

      Max_OK | December 13, 2013 at 12:34 am | Reply

      “BTW, I ain’t perfect.”

      [sarc]

      No, really?

    • Chief Hydrologist

      Surely this counts as snark with a thin veneer of trivial and uninformed drivel. That someone like Max imagines that he has the intellectual chops to begin to understand Judith’s nuanced position is utterly laughable.

      What Max doesn’t know he doesn’t know is legion. Here’s another quote from someone in their 80’s.

      The climate system has jumped from one mode of operation to another in the past. We are trying to understand how the earth’s climate system is engineered, so we can understand what it takes to trigger mode switches. Until we do, we cannot make good predictions about future climate change… Over the last several hundred thousand years, climate change has come mainly in discrete jumps that appear to be related to changes in the mode of thermohaline circulation. Wally Broecker

      Broecker is of course the ‘father’ of global warming.

      If man-made dust is unimportant as a major cause of climatic change, then a strong case can be made that the present cooling trend will, within a decade or so, give way to a pronounced warming induced by carbon dioxide. By analogy with similar events in the past, the natural climatic cooling which, since 1940, has more than compensated for the carbon dioxide effect, will soon bottom out. Once this happens, the exponential rise in the atmospheric carbon dioxide content will tend to become a significant factor and by early in the next century will have driven the mean planetary temperature beyond the limits experienced during the last 1000 years.

      Nice call – although both natural cooling and natural warming play a role – but reality has been distorted to fit an agenda ever since.

      The only certainty is that we are in a decadal cooling mode and that these modes last for 20 to 40 years. Beyond that climate – by the nature of Broecker’s beast – is unpredictable. The sooner people like Mann and Hansen catch up with this – the sooner their acolytes will get a clue.

      What are the chances?

  22. In every other human endeavor, uncertainty in the consequences of what we are doing dictates slowing down, or stopping, until we are more certain. Continuing in the face of uncertainty is regarded as reckless.

    • Yes, Jim D, and doing nothing in the face of uncertainty can be reckless too. I know from experience that inaction can turn out bad.

    • Jim D If we sped it up we could be more certain we are less uncertain. More cars on more roads to nowhere would make a lot of people happy now but I guess you don’t care about people here now, only in some future.

    • There are solutions where early action is much less painful than late action or inaction.

    • Jim D

      In every other human endeavor, uncertainty in the consequences of what we are doing dictates slowing down, or stopping, until we are more certain. Continuing in the face of uncertainty is regarded as reckless.

      OK. Let’s check out your premise, Jim. First of all, we would never have been able to send a man to the moon with that approach, but let’s look specifically at our options today with regard to the “uncertainties” surrounding climate change.

      We have economic studies (Tol), which tell us that the small amount of warming we have seen since 1900 has been beneficial for humanity to date on balance, adding an estimated $1 trillion to global wealth (1.4% of World GDP). Most of this resulted since 1950. It is “uncertain” how much of this past warming was caused by human activities, but IPCC cites model studies, which estimate that most of the warming since 1950 was caused by AGW.

      We also have projections of significant future AGW, which are based on “uncertain” IPCC “worst case” scenarios, leading to warming over the rest of the 21stC. This warming is likely to be beneficial at first, peaking at 1.5% of GDP around 2030, reaching “breakeven” on balance around 2080, and then becoming detrimental on balance, with the net overall negative impact of $0.8 trillion by 2100 slightly less than the net overall positive impact of the warming experienced over the 20thC.

      We are “uncertain” whether or not the slowdown in solar activity, which we have witnessed since weak SC23 and even weaker SC24 have replaced the unusually high levels during the 20thC, will lead to a prolonged cooling trend, as some solar scientists predict.

      It is reasonable to conclude that a prolonged cooling trend equivalent to the warming we have seen over the 20thC would mean we would have to essentially “give back” the net benefit we realized to date from this past warming.

      Although some “no regrets” actions could be taken today, which could have a small impact on our climate, we know that it will take drastic reductions in CO2 generation on a global basis in order to have any perceptible impact on our future climate; we also know that these are very likely to be costly and painful.

      So what should we do?

      – Start immediately on a costly global program to reduce CO2 emissions, in order to avert the possible warming trend, taking the risk that this could work against us if the low solar activity continues, resulting in a long-term cooling trend?

      – Do nothing with regard to CO2 for now other than “no regrets” actions, but continue with efforts to reduce “uncertainties” of future climate change and prepare to adapt to any potential challenges resulting from any climate changes Nature throws at us, if and when it becomes apparent that these challenges might become imminent?

      I think you can figure out what the logical course of action is, Jim.

      Max

    • Max_CH in part of his 2:08 AM post says:

      “So what should we do?

      – Start immediately on a costly global program to reduce CO2 emissions, in order to avert the possible warming trend, taking the risk that this could work against us if the low solar activity continues, resulting in a long-term cooling trend?”
      _______

      Max_CH finally admits he believes humans can cause climate change. He also believes it might be a good idea for man to burn up all of earth’s fossil fuels to prevent more ice ages. Apparently, it hasn’t occurred to him that after all the fossil fuel has been used up humans will have no way to combat additional ice ages and will freeze unless evolution covers people with lots of thick hair and everyone looks like furry apes.

    • Jim D wrote:
      | December 13, 2013 at 12:42 am |

      “In every other human endeavor, uncertainty in the consequences of what we are doing dictates slowing down, or stopping, until we are more certain. Continuing in the face of uncertainty is regarded as reckless.”

      I view the whole “internal” “uncertainty” narrative as a terribly ill-advised gamble that the general public will remain too ignorant to sensibly interpret hard-constrained (indeed law-constrained) evidence that the narrative is strictly inadmissible (in a black & white logical sense, as in 1+1=2).

      Why would anyone gamble like that?

      Artists have the answer:

      Metallica articulates the attitude concisely:

      “Those people who tell you not to take chances
      They’re all missing what life’s about
      You only live once so take hold of the chance
      […]
      Don’t stop for nothing it’s full speed or nothing
      I’m taking down whatever’s in my way”

      — Metallica – “Motorbreath”

      Artists are masters of provocative soundbite.

      “Jump in the Fire” accurately describes dark antics of creepy stealth political agents corrupting the climate discussion:

      “Down in the depths of my fiery home
      […]
      Tempting you and all the earth
      to join our sinful kind
      […]
      With h*ll in my eyes and with death in my veins
      […]
      Jump by your will or be taken by force
      I’ll get you either way
      Trying to keep the hellfire lit
      I am stalking you as prey”

      — Metallica – “Jump in the Fire”

      A few gamblers at the table are a manageable nuisance, but decisively throw the stalkers (like the small californian army of devilish climate discussion thought police operating at wuwt) under the bus with no remorse.

      An alliance with gamblers to outlaw stalkers is practical.

      Sincerely
      _______

      Gamblers ready to get serious about improving odds:
      Start with section 8.7:

      Nikolay Sidorenkov (2009) — The Interaction Between Earth’s Rotation and Geophysical Processes

    • Proud Narrative tumbles over Nemesis Nature. Haven’t I seen this movie before?
      ========

    • slowing down is relative.

      we slow down CO2 emissions by speeding up renewables.

      There is no simple analogy that will govern what needs to be done

      The fact that one has to use analogies should indicate one thing:

      we are in uncharted waters.

    • thisisnotgoodtogo

      Mosher the fake scientist says:

      “slowing down is relative.

      we slow down CO2 emissions by speeding up renewables.”

      No doubt “renewables” is one of Mosher’s private definitions.

      It would not be the one used which includes wood burned in open fires.

    • “thisisnotgoodtogo”

      of course it would. where do you come up with these silly notions?

    • manacker, regarding never having got to the Moon by slowing down in the face of uncertainty, you should realize that astronauts and test pilots are paid well for taking risk, but subjecting mankind to uncertain risk without them volunteering or being paid extra is just immoral. It is also the difference between being a race car driver and a bus driver. There is voluntary risk for yourself, or subjecting everyone else to your risk-taking.

  23. As a [poor*] gambler with a blind faith in models I find it hard to rationalize a statement such as “We can take advantage of the greater scope uncertainty offers”
    The author of this statement either implies that he can pick winners out of the uncertainty, or, like a merchant at a goldfield, he sits on the fence making his money by selling the miners their equipment and food.
    The first position is obviously foolish, the second isn’t a position at all.

    * in both senses

  24. “Many of the banking and financial institution problems and failures of the past decade can be directly tied to model failure or overly optimistic judgements in the setting of assumptions or the parameterization of a model.” –Tad Montross

    Is this supposed to be unique to the past decade? And did this Tad Montross just discover it, the way your teenage children just discovered sex? How very wise. I suppose Tulip Mania was due to “overly optimistic judgments in the… parameterization of a model” or something very like that, metaphorically speaking. Well you know, the 1630s Dutch had these models in their heads, you know…

  25. Berényi Péter

    Uncertainty & ignorance are two separate issues. In the case of climate prediction we are dealing with the latter.

    • +1000

    • looks like you are ignorant about uncertainty.

    • Am I apathetic or just ignorant? I don’t know and I don’t care.

    • Berényi Péter

      Steven Mosher | December 13, 2013 at 12:59 pm |
      looks like you are ignorant about uncertainty.

      Nope, it is you, who needs education.

      In the olden days, when Ptolemaic astronomy still prevailed, there was a certain uncertainty in the calculated celestial positions of planets compared to their observed behavior. It was huge, and could be observed with the naked eye using the inaccurate clocks of that age. Astronomers were well aware of this fact, even if the model was sufficiently accurate for practical purposes, that is, to construct horoscopes. Some even tried to improve it by introducing additional epicycles.

      The Copernican system with its circular orbits around the Sun was even worse in this respect, part of the reason for its rejection by contemporary astronomers (the other being the unobservability of stellar parallaxes). Those guys were not complete fools.

      Kepler was needed to do better than Ptolemy, but it was Newton’s dynamical theory which could account for interplanetary perturbations, to the extent it led to the discovery of Neptune. This model still had some minor uncertainty, observable only in the orbit of the innermost planet Mercury, instrumental in confirmation of Einstein’s geometrodynamics.

      It is a mathematical fact that Newton’s theory yields inherently chaotic solutions for the general multibody problem, so we still can’t calculate the detailed long-term behavior of the solar system beyond a time horizon of several million years. That’s what I would call genuine uncertainty; the rest was simple ignorance.

      Similarly, climate science struggles with the lack of a general theory of irreproducible quasi-stationary nonequilibrium thermodynamic systems, the class to which terrestrial climate belongs.

      A system is irreproducible if microstates belonging to the same macrostate can evolve to different macrostates with time.

      We do have a theory for the reproducible case, but that offers no help in calculating future climate states, even if there is tantalizing empirical evidence for further restrictions, utterly ignored by current computational models; see

      Journal of Climate, Volume 26, Issue 2 (January 2013)
      doi: 10.1175/JCLI-D-12-00132.1
      The Observed Hemispheric Symmetry in Reflected Shortwave Irradiance
      Aiko Voigt, Bjorn Stevens, Jürgen Bader and Thorsten Mauritsen

      Therefore it is not uncertainty, but ignorance, which ails us in this field.
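      Berényi’s distinction can be made concrete with a toy calculation. The sketch below is a minimal illustration only: the Lorenz system stands in for any chaotic dynamics (it is not a solar-system integration). Two trajectories starting from nearly identical initial states diverge exponentially, which is the “genuine uncertainty” of chaos: no refinement of the equations removes it; only better knowledge of the initial conditions delays it.

      ```python
      # Sensitive dependence on initial conditions: the signature of genuine
      # uncertainty (chaos), as opposed to ignorance of the governing equations.
      # The Lorenz system is used here purely as a stand-in for chaotic dynamics.

      def lorenz(x, y, z, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
          """Right-hand side of the Lorenz equations."""
          return sigma * (y - x), x * (rho - z) - y, x * y - beta * z

      def rk4_step(state, dt):
          """One classical Runge-Kutta step for the Lorenz system."""
          def add(s, k, h):
              return tuple(si + h * ki for si, ki in zip(s, k))
          k1 = lorenz(*state)
          k2 = lorenz(*add(state, k1, dt / 2))
          k3 = lorenz(*add(state, k2, dt / 2))
          k4 = lorenz(*add(state, k3, dt))
          return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                       for s, a, b, c, d in zip(state, k1, k2, k3, k4))

      def separation(a, b):
          return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

      a = (1.0, 1.0, 1.0)
      b = (1.0, 1.0, 1.0 + 1e-8)   # perturb one coordinate by one part in 1e8
      dt, steps = 0.01, 3000        # integrate to t = 30

      for _ in range(steps):
          a, b = rk4_step(a, dt), rk4_step(b, dt)

      final = separation(a, b)
      print(final)  # many orders of magnitude larger than the initial 1e-8
      ```

      The perturbation grows by many orders of magnitude until it saturates at the size of the attractor itself, after which the two trajectories carry no mutual information: a long-horizon point forecast is impossible in principle, not merely for lack of computing power.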

  26. Antonio (AKA "Un físico")

    Let’s be clear. Uncertainty in climate, finance, industry, etc., can be accurately modelled with stochastic variables.
    Models have a range of validity. But if you intentionally oversimplify your assumptions, then your models are not going to be valid:
    – It is very easy to understand why many financial models are wrong: for example, in those derived from Black-Scholes equation, some of their previous assumptions are critically unreal.
    – And it is very easy to understand why IPCC’s models are wrong:
    https://docs.google.com/file/d/0B4r_7eooq1u2VHpYemRBV3FQRjA
    But for: industrial chain improvements, traffic jam preventions, etc., their stochastic models could work quite well.
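    Antonio’s point about oversimplified assumptions can be illustrated numerically. Black–Scholes assumes Gaussian log-returns; real market returns are fat-tailed. The sketch below is a minimal Monte Carlo illustration with made-up parameters, not a pricing model: it compares the frequency of “5-sigma” daily losses under a Gaussian return model against a Student-t(3) model rescaled to the same variance, so the two models agree on the crisp number (the variance) while disagreeing wildly in the tails.

    ```python
    # Why assumption choice matters: tail risk under Gaussian vs fat-tailed
    # returns. Both samplers are scaled to unit variance, so the models agree
    # on the "crisp number" (the variance) yet disagree wildly in the tails.
    import random

    random.seed(42)
    N = 200_000          # number of simulated daily returns
    THRESHOLD = -5.0     # a "5-sigma" daily loss

    def gaussian_return():
        return random.gauss(0.0, 1.0)

    def student_t3_return():
        """Student-t with 3 degrees of freedom, rescaled to unit variance.

        Uses t = Z / sqrt(chi2_3 / 3); Var(t_3) = 3, so divide by sqrt(3).
        """
        z = random.gauss(0.0, 1.0)
        chi2 = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(3))
        t = z / (chi2 / 3) ** 0.5
        return t / 3 ** 0.5

    p_gauss = sum(gaussian_return() < THRESHOLD for _ in range(N)) / N
    p_t = sum(student_t3_return() < THRESHOLD for _ in range(N)) / N

    # The Gaussian model says a 5-sigma loss essentially never happens
    # (true rate ~3e-7); the fat-tailed model with the identical variance
    # produces them routinely.
    print(p_gauss, p_t)
    ```

    A risk model calibrated only to the variance would pass validation under either assumption, yet the two imply utterly different exposures to ruin; this is the “masking of uncertainties” the post describes.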

  27. “…masking of uncertainties that can in turn lead to incomplete understanding of problems and bad decisions.”

    It’s tragic that one has to state the screaming obvious in needlessly cluttered and indirect language.

    However, it is a relief to hear the screaming obvious stated by the educated when so many of the most educated have become the most obtuse.

    • Thought fer Today:
      ‘Inspect every piece of pseudo science and you will find
      a security blanket, a thumb to suck, a skirt to hold.’

      H/t Isaac Asimov.

  28. A fan of *MORE* discourse

    BREAKING NEWS
    !!! Jerome Ravetz’ Post Normal Times defends Mann !!!
    !!! Hockey Stick Real! Mann’s ethics upheld !!!
    !!! James Hansen/colleagues lead PLOS ONE takeover !!!

    More reasons to read
    The Hockey Stick and the Climate Wars

    from The Post Normal Times
    Jerry Ravetz, Advisory Board

    Even if you have read it, you might want to read the new foreword, by Bill Nye the Science Guy, and the update which shows continued vindication of the Hockey Stick, as it continues to be confirmed by additional studies.

    Those studies, and the now nine investigations in which no improprieties were found, surely make Mann the Most Vindicated Professor (MVP) ever – as he was referred to in a tweet by Peter Dykstra.

    ———-
    James Hansen and Colleagues
    Offer Evidence for
    a Disruptive Call to Action

    A PLOS ONE editorial

    The article PLOS ONE publishes today from James Hansen and colleagues, “Assessing Dangerous Climate Change: Required Reduction of Carbon Emissions to Protect Young People, Future Generations and Nature,” is extraordinary in many ways. From its diverse list of authors to the breadth of the analysis and the conclusions that emerge, the paper goes beyond the scope of a traditional research article by dismantling boundaries between disciplines and adding a moral dimension to the collective dialogue.

    Call for Papers  We are drawing on the findings of Hansen et al. to announce a call for papers in a new PLOS ONE Collection entitled, ‘Responding to Climate Change.’ The Collection will incorporate the broad range of areas covered in Hansen et al.’s paper, with a particular focus on work aimed at reducing fossil fuel emissions and returning the Earth and its ecosystems to a state of energy balance.

    Conclusion  Judith Curry asks: What is “Post-Normal Science”, and how does it deal with complexity, uncertainty, and society?

    There’s no need to wonder, Climate Etc readers! Michael Mann and James Hansen, working with hundreds of colleagues, have simplified the complexity, reduced the uncertainty, and are charting a moral and economic path for society.

    Aye Lassies and Laddies, the The Post Normal Times has got it right … the foresighted science of Michael Mann and James Hansen, acting step-by-step, is relentlessly forcing back denialism’s darkness!


    • A fan of *MORE* discourse

      MORE BREAKING NEWS
      !!! Reinsurance Association of America
         allies with Hansen-style Climate-Science!!!
      !!! Extreme Climate Events on the Rise !!!

      Our Industry Is Science-Based
      by Frank Nutter (RAA President)

      Insurers see climate primarily through the prism of extreme natural events. […] With changes in weather patterns, intensity, and number of events, the result of course is an inevitable rise in insured and uninsured damages, globally and in the US.

      Conclusion  The folks at RAA are risk-and-uncertainty professionals … that’s why they’re solid advocates of Hansen-style/Mann-style climate change science.

      Aye Lassies and Laddies, the RAA folks have got it right … the foresighted science of Michael Mann and James Hansen, acting step-by-step, is relentlessly forcing back denialism’s darkness!



    • Fan says “the foresighted science of Michael Mann and James Hansen, acting step-by-step, is relentlessly forcing back denialism’s darkness!”
      ______

      AMEN to that !

      Attrition also is taking its toll on denialism. The naysayers are older than average.

    • David Springer

      Max_OK | December 13, 2013 at 7:46 am |

      “The naysayers are older than average.”

      With age comes wisdom.

      Write that down.

    • Fan says: “Even if you have read it, you might want to read the new forward, by Bill Nye the Science Guy, and the update which shows continued vindication of the Hockey Stick, as it continues to be confirmed by additional studies.
      Does Bill Nye present the ‘Best evidence available’? Or is he a useful idiot?:
      http://wattsupwiththat.com/climate-fail-files/gore-and-bill-nye-fail-at-doing-a-simple-co2-experiment/
      Perhaps his moniker should read ‘The anti-science guy’ (he seems to be a pet peeve of Watts):
      http://wattsupwiththat.com/?s=bill+nye

      I don’t know why you and the other Mann Munchkins keep insisting on turning a boomerang into a hockey stick. The idea of climate having been a straight line before 1975 is absurd on its face.

    • R. Gates aka Skeptical Warmist

      David Springer quipped:

      “With age comes wisdom.”

      ——
      This is of course a simpleton phrase trying to defend the notion that older people are wiser. Older people are often just …older, and of course more prone to dementia.

    • I suspect proximity to the Death Panel improves attention, or should.
      ==============

    • A fan of *MORE* discourse

      An Old Person Is … anyone fifteen years older than you are.


    • Wisdom is far more likely to come from age than from youth

    • Chief Hydrologist

      Of course Hansen is over 70 and Mann in his late 40s. But they must be honorary teenagers. The 40s and 50s are the peak career years. Experience and a lifetime of study count for much – except in the case of Hansen and Mann, it seems.

      Working into the 60’s and 70’s is quite common these days. Elinor Ostrom – one of the most brilliant people ever – worked up to her death at age 79 last year.

      This triviality about age is a distraction from not having rational, scientifically founded knowledge on climate.

    • David Springer

      R. Gates aka Skeptical Warmist | December 13, 2013 at 12:14 pm |

      David Springer quipped:

      “With age comes wisdom.”

      ——

      R. Gates, with as yet undetermined appendage, writes:

      This is of course a simpleton phrase trying to defend the notion that older people are wiser. Older people are often just …older, and of course more prone to dementia.

      ———————————————————-

      I was hoping since I used a simpleton phrase even you would be able to understand it. I’ll try to use even smaller words for you in the future.

    • FOMD, you are clearly delusional. I will read your posts when you are rational enough to declare Hansen for what he really is, and Mann a fake, phony, and fraud.

  29. @
    Berényi Péter | December 13, 2013 at 4:11 am said:
    “Uncertainty & ignorance are two separate issues. In the case of climate prediction we are dealing with the latter.”

    You have hit the nail on the head. When it comes to forecasting oil production or the climate, ignorance is the operative word. Statistical trick studies and Hail Mary models are ploys born in Hubris. There is a lot of hubris in the warming camp. When will they realize it, is the question. You can’t model that, either.

    • jim2, can’t no forecast be a forecast ?

    • No.

    • Think about it, Jim2. A policy of do nothing is based on a forecast that nothing will change so nothing needs to be done.

    • Not logical, Max_OK.

    • Jim2, it’s logical to me. My decisions to act or not act are based on what I predict will happen at a future time. If I decide not to act , it will be based on my prediction that nothing will happen that will make action necessary.

      But maybe you are different and your decisions (to act or not act) are based on coin flips or nothing at all.

    • Max, you threw in policy when we were discussing forecasts based in ignorance. When it comes to policy, the government doesn’t have to have one!! I realize this possibility hasn’t occurred to you, but if we are ignorant about the true nature of climate, just do nothing – other than continue to gather DATA and try to use it to figure out how the climate really works. Spending blood and gold on fantasies about what might happen is idiotic because the list of fantasy problems is almost infinite.

    • Yes, jim2, I am talking about forecasts, both explicit and implicit. If you believe the policy should be to do nothing about man-made global warming, implicit in that policy is a forecast that doing nothing will be better than doing something.

    • Max_OK, ” If you believe the policy should be to do nothing about man-made global warming, implicit in that policy is a forecast that doing nothing will be better than doing something.”

      Not really, doing nothing would be better than doing something wrong. No-regrets policies are more geared to not screwing up one thing in order to fix another. The US national electrical grids are a good example. Intermittent solar and wind power alternatives are lovely as long as they help, not hurt. Since the grids were not designed for very much intermittent supply, we now have issues that could have been avoided.

      Mandates are another example. If you aren’t 100% sure, don’t mandate. There is nothing wrong with trying something; just don’t get stuck with a not-so-brilliant idea.

    • captdallas 0.8 or less said on December 13, 2013 at 3:07 pm |
      Max_OK, ” If you believe the policy should be to do nothing about man-made global warming, implicit in that policy is a forecast that doing nothing will be better than doing something.”

      Not really, doing nothing would be better than doing something wrong….
      ——–

      JEEZZZ, that was implicit in what I said. The ” do nothings” are forecasters, like it or not.

      Do something, and something will go wrong. Do nothing, and nothing will go wrong. HA HA ! I know better than that from experience.

    • Max_OK, “JEEZZZ, that was implicit in what I said. The ” do nothings” are forecasters, like it or not. ”

      No the “do nothings” are pragmatists. If a pragmatist hears someone profess they know THE solution to a complex problem, the pragmatist knows he is listening to an idiot. For example; when Greg Laden made his four block model to convince the world that risking everything to combat global warming was worth the risk, the sane members of society pretty much knew he was an idiot. When the AGU invited Laden to their fall meeting and he had his predictable meltdown, the sane of the world pretty much knew that at least some members of AGU had let the cheese slip off their crackers.

    • “Not really, doing nothing would be better than doing something wrong” – Capn.

      Just remember that next time you are tempted to do the HM on a drowning victim.

    • captdallas 0.8 or less said on December 13, 2013 at 4:44 pm |
      Max_OK, “JEEZZZ, that was implicit in what I said. The ” do nothings” are forecasters, like it or not. ”

      No the “do nothings” are pragmatists.
      ——–

      Nah, “do nothings” are afraid of change. I think a lot of them are just old fuddy-duddies. But I am sympathetic because I can understand how getting old would make a person afraid of change. The kinds of changes old people experience usually aren’t good. To the elderly change too often means loss … loss of health, loss of loved ones, loss of friends, loss of standing, etc. Unfortunately, this fear impedes rational thinking and results in resistance to change in general. That’s why I don’t put much stock in the notion that older means wiser.

      I should add that I feel indebted to people older than me, both the living and those who have passed on. I have benefitted from their sacrifices.

      Michael, “Just remember that next time you are tempted to do the HM on a drowning victim.”

      So far I am three for three with no HM or CPR. Doing nothing or little can be very effective.

      Max_OK, “Unfortunately, this fear impedes rational thinking and results in resistance to change in general. That’s why I don’t put much stock in the notion that older means wiser.”

      It seems that fear can drive either direction but the most fearful are typically the least rational.

      > So far I am three for three with no HM or CPR.

      FWIW, they don’t teach CPR anymore around here. Try to pump your heart for 3 minutes. Then, try 10. Then go on for 25 more minutes:

      http://newsroom.heart.org/news/cpr-for-38-minutes-or-longer-improves-chance-to-survive-cardiac-arrest

      The stats for CPR are pretty awful:

      http://en.wikipedia.org/wiki/Cardiopulmonary_resuscitation#Effectiveness

      First-aid training nowadays focuses on how to use a defibrillator.

    • Willard, “The stats for CPR are pretty awful:”

      Right and if you get there late none of it is very effective or pretty. That is why I am captain buttwipe when it comes to swimmers staying together and staying close.

      I figure I may be good for ten minutes tops for CPR so it is definitely a no-pulse, last-ditch effort. In-water, about anything you do will likely cause puking, so at least with the modified Heimlich their head is down so there is a better chance the puke flows out instead of in. The best first step appears to be just tilting their head back to open the airway after making sure there is nothing in their mouth, if possible. That is not easy in the water all the time. If they still have a pulse, they generally start breathing, at least so far for me, knock on wood.

      btw, modified Heimlich is just top of the abdomen or low chest. Sometimes called modified chest thrusts.

  30. “Uncertainty is normal in scientific research but to policy makers, journalists and wider society it sounds ‘unreliable.’ Despite life telling us otherwise, the assumption in many debates is that we should expect certainty.”

    http://www.lse.ac.uk/CATS/Media/SAS012-MakingSenseofUncertainty.pdf

    The linked Making Sense of Uncertainty is worth reading.

    • Uncertainty plays a huge role in this issue. It’s not that we expect disaster, it’s that the uncertainty is said to offer the possibility of disaster: implausible, but high consequence. Somewhere it has to be like the possible asteroid impact: Live with it. ~Richard Lindzen

    • Lindzen thought we should live with second hand smoke.
      COUGH ! GAG !

      He likes to go against the mainstream. It get’s him attention.

    • Then there’s second hand dung smoke. Hey, Americans once built houses of the plentiful resource in Buffalo Country.
      ============

    • kim, I think the pioneers on the Great Plains built sod houses rather than dung houses. Perhaps you are thinking of outhouses, which weren’t built of dung but were built to house dung.

    • “The outlook for the climate over the 21st Century is highly uncertain. There is a word in the English language to express high uncertainty. That word is ‘ignorance.’ And ignorance is not a basis for responsible government action.”

      (Appendix 2, Uncertainty, the Precautionary Principle, and Climate Change, by Kesten C. Green & J. Scott Armstrong, August 9, 2008)

       

    • Sooner nor you were there, Maxie come lately.
      ============

    • Chief Hydrologist

      Cow or buffalo dung is optionally used in adobe. It is used to line walls for thermal insulation. It was also used by the US settlers and Indians alike as fuel. Of course the more popular usage is to bury any rational discussion under a pile of it. In this Max_OK excels.

    • Sooners were cheaters. Boomers weren’t much better. My ancestors were neither. They bought their land from Native Americans.

    • Waggy quotes J. Scott Armstrong, a forecasting expert who believes you can’t forecast. Well, time is passing 76-year-old J. Scott by, so I don’t blame him for trying to do something to get attention.

    • David Springer

      Max_OK | December 13, 2013 at 11:34 am |

      “He likes to go against the mainstream. It get’s him attention.”

      With age also comes knowledge of where to use an apostrophe. Unfortunately that age is supposed to be pre-teen which doesn’t speak well of your education, Max. You are an adult, right?

  31. Climate alarmism is a modern age digital version of crop circles.

  32. Feynman’s unicorns.

    ‘The 60-year-old method for calculating scattering amplitudes — a major innovation at the time — was pioneered by the Nobel Prize-winning physicist Richard Feynman. He sketched line drawings of all the ways a scattering process could occur and then summed the likelihoods of the different drawings. The simplest Feynman diagrams look like trees: The particles involved in a collision come together like roots, and the particles that result shoot out like branches. More complicated diagrams have loops, where colliding particles turn into unobservable “virtual particles” that interact with each other before branching out as real final products. There are diagrams with one loop, two loops, three loops and so on — increasingly baroque iterations of the scattering process that contribute progressively less to its total amplitude. Virtual particles are never observed in nature, but they were considered mathematically necessary for unitarity — the requirement that probabilities sum to one.

    “The number of Feynman diagrams is so explosively large that even computations of really simple processes weren’t done until the age of computers,” Bourjaily said. A seemingly simple event, such as two subatomic particles called gluons colliding to produce four less energetic gluons (which happens billions of times a second during collisions at the Large Hadron Collider), involves 220 diagrams, which collectively contribute thousands of terms to the calculation of the scattering amplitude.

    In 1986, it became apparent that Feynman’s apparatus was a Rube Goldberg machine.

    To prepare for the construction of the Superconducting Super Collider in Texas (a project that was later canceled), theorists wanted to calculate the scattering amplitudes of known particle interactions to establish a background against which interesting or exotic signals would stand out. But even 2-gluon to 4-gluon processes were so complex, a group of physicists had written two years earlier, “that they may not be evaluated in the foreseeable future.”

    Stephen Parke and Tommy Taylor, theorists at Fermi National Accelerator Laboratory in Illinois, took that statement as a challenge. Using a few mathematical tricks, they managed to simplify the 2-gluon to 4-gluon amplitude calculation from several billion terms to a 9-page-long formula, which a 1980s supercomputer could handle. Then, based on a pattern they observed in the scattering amplitudes of other gluon interactions, Parke and Taylor guessed a simple one-term expression for the amplitude. It was, the computer verified, equivalent to the 9-page formula. In other words, the traditional machinery of quantum field theory, involving hundreds of Feynman diagrams worth thousands of mathematical terms, was obfuscating something much simpler. As Bourjaily put it: “Why are you summing up millions of things when the answer is just one function?”

    “We knew at the time that we had an important result,” Parke said. “We knew it instantly. But what to do with it?”
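    For the record, the one-term expression Parke and Taylor guessed — now known as the Parke–Taylor, or maximally-helicity-violating (MHV), amplitude — can be written down in a few symbols. In standard spinor-helicity notation (suppressing the coupling, colour factors and momentum-conservation delta function), the tree-level amplitude for n gluons with gluons 1 and 2 carrying negative helicity is:

```latex
% Parke–Taylor tree-level MHV amplitude for n gluons, two of which
% (here 1 and 2) carry negative helicity; <i j> denotes the
% spinor-helicity angle bracket. Colour ordering and overall
% coupling/normalisation factors are omitted.
A_n\left(1^{-}, 2^{-}, 3^{+}, \dots, n^{+}\right)
  = \frac{\langle 1\,2\rangle^{4}}
         {\langle 1\,2\rangle \langle 2\,3\rangle \cdots \langle n\,1\rangle}
```

    This single ratio of angle brackets replaces the thousands of terms produced by summing Feynman diagrams, which is exactly the point of Bourjaily’s rhetorical question above.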

  33. Idiot climatists really should stop comparing humanity’s release of CO2 into the atmosphere to government involvement in the anti-smoking agenda of the liberal-Left. I’ve never met a smoker that didn’t know smoking was bad. And, that was before cigarettes were raised to $12 a pack by government fiat. If the Left and liberals really cared about anything but getting their hands on someone else’s money they wouldn’t promote lottery tickets to those who can least afford to throw away their family’s money and simply gamble away their government transfer payments in the hope the public school will feed their kid in the morning.

  34. Chief Hydrologist | December 13, 2013 at 12:21 pm |
    “Cow or buffalo dung is optionally used in adobe.”
    —-
    Sod houses don’t need dung for mortar.

    You don’t know dung about sod, so stick to what you know, which should be easy because it ain’t much.

    • Chief Hydrologist

      I see your reading comprehension skills have not improved, Max. Adobe and not sod.


      http://www.backwoodshome.com/articles2/hooker110.html

      If I were you I would consider suing your teachers.

      The fact remains that Max_OK is incapable of anything but the most superficial analysis of anything. Like Michael and a number of others – his specialty is short and unfunny snark – which nonetheless seems to amuse him greatly. It is an utter waste of time and space.

      I am still inclined to think he is 16 years old now – with a zit problem and a tube of lubricant.

    • When I was a boy there were still sod houses on the prairie. They had wood siding added and they looked like regular houses, only shorter. There is no mortar in a sod house, and there is no dung. There would be no need for it. Prairie sod has extensive roots in it that knits the “bricks” into a stable dirt block. We made sod bricks when I was in 4-H.

      They used buffalo dung for fuel.

      The adobe houses in the southwest used dung, but the buffalo roamed hardly at all out there; their range was Texas and eastern New Mexico. Maybe they used human BS. Around here, it’s plentiful, and probably was out there as well.

    • Chief Hydrologist

      Not sure I want to labour an obviously trivial and irrelevant point that emerged from a throwaway line. However, the point was clearly adobe and not sods. However plentiful sods are around here.

      All sorts of dung are used in adobe.

      As for the home of the Buffalo?

      ‘Their historical range roughly comprised a triangle between the Great Bear Lake in Canada’s far northwest, south to the Mexican states of Durango and Nuevo León, and east to the Atlantic Seaboard of the United States (nearly to the Atlantic tidewater in some areas) from New York to Georgia and per some sources down to Florida.[2][3]’

    • Lol. You’re an analytical hydrologist. Why would a buffalo be unlikely to venture into adobe country?

    • Chief Hydrologist

      Sigh – perhaps the warminista problem is just functional illiteracy compounded by an utter lack of humour.

      I wonder which part of Mexico JCH didn’t understand.

    • Max said Great Plains. Kim said Americans. In America, adobe is pretty much Arizona and NW New Mexico. The buffalo did not roam there. LMAO.

    • Chief Hydrologist

      The range of the bison extended to Mexico. I think you are pretty much showing yourself to be an obnoxious fool now with a line in tendentious and irrelevant triviality.

      No plains adobe housing ever used buffalo pats? Seriously why do you bother?

      http://plainshumanities.unl.edu/encyclopedia/doc/egp.arc.029

    • David Springer

      No American Bison (buffalo) in Arizona or California. Ever. They were exterminated in New Mexico by 1840.

      http://upload.wikimedia.org/wikipedia/commons/1/17/Extermination_of_bison_to_1889.png

    • Heh, waiting for a historian to weigh in. Some places, dung was the only structural material. Sure they were temporary.
      ============

    • It was once widespread practice to pack dung around the umbilical cord stump. The only main pathogen was tetanus, and it struck randomly enough not to clue in the people, ignorant of germ theory and infectious pathology.
      =============

    • The difference between you morons and me is, we had a buffalo herd. They don’t like mountains. They don’t like rocks. They’re big and heavy. It splits their hooves. They like to drink a lot of water. When wild, they stayed close to rivers. River valleys have grass and dirt. Dirt is friendly to hooves.

      When the Spaniards introduced cattle to Mexico, they started killing off the buffalo, and their range quickly shrank to the Great Plains.

      Adobe in America is Arizona and NW New Mexico. They also built similar adobe in Northern Mexico, but not in the range of the buffalo. It was south of Arizona and west to the Pacific. They were masters at it, and it is highly unlikely they used dung as they do not appear to have had domesticated livestock, and they did have readily available fiber right next to their houses. They called it their cropland. If you want to believe they wandered all over the E NM grasslands looking for widely scattered buffalo pies, have fun with that.

      When the Spaniards started building European-style adobe in Texas, there were very few buffalo, if any. I do not know if they used cattle dung or not, and do not care. That is not American adobe. Places like Presidio, which has an excellent example of Spanish adobe, show this. Looking at the terrain around Presidio, I doubt that buffalo in any number ever actually ranged there. Buffalo are apparently not as stupid as the hydrologists who claim expertise on them.

    • Well, I’ve finally located my copy of ‘Buffalo Jones’ under a pile of ‘stuff’, a reproduction of an edition published in 1899. It’s not indexed and may not be my source after all, but it kinda looks like fun to reread, anyway.
      ====================

    • Chief Hydrologist

      “Cow or buffalo dung is optionally used in adobe.”
      —-
      Sod houses don’t need dung for mortar.

      You don’t know dung about sod, so stick to what you know, which should be easy because it ain’t much. Max_OK

      Was this where it started? JCH would need to show that American bison dung was never used as a fibre in adobe. A big ask – absurd on the face of it. Another epic fail from JCH?

    • My father started with two buffalo. He bought them on a lark at a Dakota sale barn. An old Sioux Indian who was there came up to Dad and told him never to sell just one; that he could not have just one buffalo as it would die of loneliness.

      A veterinarian, Dad did not place much stock in what sounded like mumbo jumbo to him. One day he decided to sell one of the buffalo, the bull, to a friend who had a herd. A week or so later his remaining buffalo mysteriously died. After that he would wisely assure people they could not have just one buffalo.

      Imo, you can have one buffalo.

    • > Imo, you can have one buffalo.

      Minnesota has one:

      A day after coach Mike Yeo explained all the reasons he didn’t want to move Jason Pominville off the Wild’s top line to anchor the second line, he did just that.

      http://www.twincities.com/sports/ci_24721053/wild-jason-pominville-joins-second-line-young-players

      They’re having a lull right now, but Jason will survive.

    • Obama will handle it.

    • Maybe I’ll stay tonight ’til 2AM or so just to see what happens…

    • Time to short sell reality:

      The budget deal, in short, puts House Republicans at a crossroads: They can pass a compromise that locks in incremental gains on spending cuts, declare a partial victory and show evidence that they are interested in governing. Or they can reject the deal as insufficiently conservative and cement the party’s growing reputation as unwilling to come to the table, despite the political reality that Democrats control the Senate and the White House.

      “We are either a party that is serious about governing when control of Washington is split, or we are an unserious party that doesn’t care about realistic incremental gains, only caring about unrealistically getting the whole ball of wax, which will never happen as long as there is a Democratic president and a Democratic majority leader,” said Glen Bolger, a leading Republican pollster. “We have to stop being the dysfunctional equivalent of the Washington Redskins.”

      Backers of the budget deal can be forgiven if they are feeling a queasy sense of deja vu. Even before the final details of the budget deal emerged, the outside groups that have so frustrated establishment Republicans declared it insufficient.

      Tim Phillips, who heads the conservative group Americans for Prosperity, said conservatives who voted for the plan would be “go[ing] back on their word to rein in government over-spending.” Heritage Action for America said Tuesday it would oppose a deal that included “woefully inadequate long-term [spending] reductions.” Club for Growth President Chris Chocola criticized Republicans “who don’t have the stomach for even relatively small spending reductions.”

      “If Republicans work with Democrats to pass this deal, it should surprise no one when Republican voters seek alternatives who actually believe in less spending when they go to the ballot box,” Chocola said in a statement. Already, Republicans challenging Sens. Mitch McConnell (R-Ky.) and Thad Cochran (R-Miss.) have said they oppose the deal. So have Sens. [The Son] Paul (R-Ky.) and Marco Rubio (R-Fla.).

      http://www.politico.com/story/2013/12/warnings-federal-reserve-ron-rand-paul-100857_Page2.html

      You, quarterbacks! Stop being the dysfunctional equivalent of the Washington Redskins!

    • Phase change in the Higgs field occurring nearby? Not too likely. On the other hand, I seem to have misplaced my vial of Ice Nine…

    • This just in: Scientists now as annoying as apocalyptic Christians

  35. Chief Hydrologist

    The faith that truth lies in numbers goes back to the Pythagorean attempt to unify both practical and theoretical sciences. Its current manifestation is the idolisation of pre-Einsteinian physics in the quantification of social, economic, and behavioural sciences. The talk will explain how this “crisp number” mode of thinking has promoted the use of over-simplistic models and masking of uncertainties that can in turn lead to incomplete understanding of problems and bad decisions.

    In reality climate models have multiple divergent solutions. Change a parameter within its feasible range and there is no guarantee of not getting a radically different solution. This has been known with certainty for the entire life of computer climate models. It follows from Lorenzian chaos theory.

    e.g. http://www.ipcc.ch/ipccreports/tar/wg1/505.htm

    and http://rsta.royalsocietypublishing.org/content/369/1956/4751/F8.expansion.html

    It leads to a phenomenon called irreducible imprecision.

    In each of these model–ensemble comparison studies, there are important but difficult questions: How well selected are the models for their plausibility? How much of the ensemble spread is reducible by further model improvements? How well can the spread be explained by analysis of model differences? How much is irreducible imprecision in an AOS?

    Simplistically, despite the opportunistic assemblage of the various AOS model ensembles, we can view the spreads in their results as upper bounds on their irreducible imprecision. Optimistically, we might think this upper bound is a substantial overestimate because AOS models are evolving and improving. Pessimistically, we can worry that the ensembles contain insufficient samples of possible plausible models, so the spreads may underestimate the true level of irreducible imprecision (cf., ref. 23). Realistically, we do not yet know how to make this assessment with confidence.

    Economic models don’t tend to be based on non-linear equations, which is a clear point of difference.
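    The claim that a small parameter change can flip a model onto a radically different trajectory is easy to demonstrate on the Lorenz system itself. Below is a minimal sketch in plain Python (fixed-step RK4; the classic Lorenz parameter values, with an arbitrary 0.1 nudge to rho as the illustrative perturbation — not any climate model's actual parameters):

```python
# Sensitivity of the Lorenz system to a small parameter change.
# Two copies of the Lorenz equations, differing only in rho, are
# integrated with a fixed-step 4th-order Runge-Kutta scheme.

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz equations."""
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(state, dt, **params):
    """One classical RK4 step for the Lorenz system."""
    def f(s):
        return lorenz(s, **params)
    k1 = f(state)
    k2 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

def trajectory(rho, steps=5000, dt=0.01, start=(1.0, 1.0, 1.0)):
    """Integrate from a fixed start point; return the final state."""
    state = start
    for _ in range(steps):
        state = rk4_step(state, dt, rho=rho)
    return state

a = trajectory(rho=28.0)
b = trajectory(rho=28.1)  # tiny change in one parameter
gap = max(abs(p - q) for p, q in zip(a, b))
print(gap)  # the two runs end up on very different parts of the attractor
```

    Both runs stay on the bounded Lorenz attractor, yet their end states bear no useful resemblance to each other: this is the "radically different solution" point, in miniature.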

    • So, you’re saying things have really changed a lot from long, long ago (a few years ago) when you could feed in white noise and always get a ‘hockey stick’?

    • Chief Hydrologist

      Of course, models can be formulated that eschew these practices. They are mathematically safer to use, but they are less plausibly similar to nature, with suppressed intrinsic variability, important missing effects, and excessive mixing and dissipation rates.

      AOS models are therefore to be judged by their degree of plausibility, not whether they are correct or best. This perspective extends to the component discrete algorithms, parameterizations, and coupling breadth: There are better or worse choices (some seemingly satisfactory for their purpose or others needing repair) but not correct or best ones. The bases for judging are a priori formulation, representing the relevant natural processes and choosing the discrete algorithms, and a posteriori solution behavior. http://www.pnas.org/content/104/21/8709.long

      Make anything you like. You just need to run the model until it comes up with a solution you like.

    • Chief, I sense that you might get a kick out of this oldish argument of poverty, which still makes me giggle. Save it for a rainy Sunday.

      http://www.international.ucla.edu/media/files/Leamer_article.pdf

    • Chief Hydrologist

      Thanks NW.

  36. A poster on another blog had this to say, accurately:

    “pdcorcoran says:
    December 13, 2013 at 1:56 am
    I remember the global cooling scare also. I also remember that the best way to counteract the cooling was to increase regulation of corporations which caused the cooling, more taxes, and more government control. Then when the scare switched to global warming, the proposed solutions were exactly the same.”

    And I might add, were proposed by the same individuals/organizations.

    A previous thread addressed ‘Pathological Altruism’. A more accurate term for the phenomenon would be ‘False Flag Altruism’.

    There was an old country song by Tex Williams (and probably others): ‘The Big Print Giveth while the Little Print Taketh Away’. The False Flag Altruist has the highly publicized solution for the ‘problem’ (the big print). The ‘little print’ is the inevitable concentration of power and money in the hands of the progressives, and the concomitant loss of freedom, personal autonomy, and lifestyle choices that are forced on the rest of us unconnected unfortunates when the ‘solution’ is implemented.

    The ‘False Flag Altruism’ is exemplified by the altruist (almost always progressive or the euphemism du jour for progressivism) identifying a critical problem, which requires IMMEDIATE action to stave off disaster, followed by the ‘solution’ which is always the same, as pointed out by ‘pdcorcoran’ above: more regulation of the evil corporations, more taxes, and more government control. See global cooling, global warming, health care crisis, banking crisis, housing crisis ad infinitum. No matter the problem, the solution is the same. And ALWAYS advances the goals of progressives, by whatever name.

  37. @Max_OK | December 13, 2013 at 2:53 pm | said:
    “Yes, jim2, I am talking about forecasts, both explicit and implicit. If you believe the policy should be to do nothing about man-made global warming, implicit in that policy is a forecast that doing nothing will be better than doing something.”

    Max_OK. There is no forecast in what I said.

    Here is how it works.

    Ready?

    OK:

    I don’t know if climate will continue to warm or not. If it does continue to warm, I don’t know if it will be catastrophic. So, what I’ll do is continue to look at data and analysis and listen to arguments about it. I will encourage government and scientists to gather yet more data and try to figure out how climate works for real, not just throw together some over-parametrized model of it and call it good.

    See, there is no forecast whatsoever there.

    • jim2, if you say you don’t know whether global warming will be a problem or not, and you don’t know whether something should be done about it or not, then you are not forecasting. But if you say nothing should be done, you are forecasting that if global warming occurs it will not be bad.

  38. The ‘uncertainty’ in the area of climate change is a creation of researchers, not the data–e.g.,

    “The papers of Mann et al. in themselves are written in a confusing manner, making it difficult for the reader to discern the actual methodology and what uncertainty is actually associated with these reconstructions. Vague terms such as “moderate certainty” (Mann et al. 1999) give no guidance to the reader as to how such conclusions should be weighed. While the works do have supplementary websites, they rely heavily on the reader’s ability to piece together the work and methodology from raw data. This is especially unsettling when the findings of these works are said to have global impact, yet only a small population could truly understand them. Thus, it is no surprise that Mann et al. claim a misunderstanding of their work by McIntyre and McKitrick.”

    ~Wegman EJ, et al., Ad Hoc Committee Report On The ‘Hockey Stick’ Global Climate Reconstruction

  39. Curious George

    There is a way to handle a behavior of uncertainties in computations explicitly – it is called Interval Arithmetic. It represents a value as an interval [Guaranteed Minimum, Guaranteed Maximum], e.g., for a 2-decimal digit computation [1,1]/[3,3] yields [0.33,0.34], or a temperature reading 25.67 would be represented as [25.66,25.68].

    To rewrite a model using interval arithmetic would be a major project, but we would not have to debate uncertainties any more.
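    A toy version of this is straightforward to sketch. The class below (plain Python; the class name and the decimal-based outward rounding are illustrative choices — real interval libraries round outward in the binary floating-point representation itself) reproduces the [1,1]/[3,3] example above:

```python
import math

# Toy interval arithmetic with outward rounding to a fixed number of
# decimal digits, mimicking the 2-decimal-digit example in the comment.
DIGITS = 2
SCALE = 10 ** DIGITS

def round_down(x):
    """Round toward minus infinity at DIGITS decimal places."""
    return math.floor(x * SCALE) / SCALE

def round_up(x):
    """Round toward plus infinity at DIGITS decimal places."""
    return math.ceil(x * SCALE) / SCALE

class Interval:
    """A value known only to lie in [lo, hi]; endpoints round outward."""

    def __init__(self, lo, hi=None):
        if hi is None:
            hi = lo
        self.lo, self.hi = round_down(lo), round_up(hi)

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # The extrema of a product lie among the endpoint products.
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __truediv__(self, other):
        if other.lo <= 0 <= other.hi:
            raise ZeroDivisionError("divisor interval contains zero")
        q = [self.lo / other.lo, self.lo / other.hi,
             self.hi / other.lo, self.hi / other.hi]
        return Interval(min(q), max(q))

    def __repr__(self):
        return f"[{self.lo},{self.hi}]"

print(Interval(1) / Interval(3))  # prints [0.33,0.34], as in the comment
```

    The guaranteed bounds come at a price: intervals tend to widen with every operation (the "dependency problem"), which is one reason rewriting a large model this way is a major project rather than a mechanical substitution.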

  40. CAIRO — Snow coated domes and minarets Friday as a record Middle East storm compounded the suffering of Syrian refugees, sent the Israeli army scrambling to dig out stranded motorists and gave Egyptians a rare glimpse of snow in their capital.

    http://www.latimes.com/world/worldnow/la-fg-wn-snow-israel-egypt-20131213,0,1691393.story#ixzz2nP4qgPgR

  41. I have to smile when I see the references to major Uncertainties in financial systems, climate science and so on. The Uncertainties inherent in food production, such as in agriculture, are many times those of most other systems, including Climate.
    Farmers and food producers at every level have to deal with a whole gamut of major Uncertainties in weather, in biological systems, in markets, in transportation systems and so it goes. Yet over the last 12,000 years humanity has slowly developed a whole range of food production systems that are adaptable enough to compensate for the Uncertainties of food production, so much so that we can now adequately feed the more than 7 billion humans on this planet.

    In defence of “Uncertainty” and the role it plays in human development both societal and technological.
    “Uncertainty” is a driver of new thinking and new developments as we constantly seek the nirvana of “Certainty”, the fixed, unchanging perfect solution and situation.
    But inherent in that very “Certainty” we are constantly seeking and trying to achieve in some aspect of our science / technology / existence / personal life / politics / society is the inherent “Uncertainty” introduced from some other quarter, which inevitably destroys the very “Certainty” we believed we had achieved in some field of endeavour.

    Uncertainty in a human psyche is a little like a random walk similar to that of foraging ants who with no fixed pattern or plan just wander around until some sense tells them that the goal they are seeking such as food is close at hand.
    “Uncertainty” stimulates the desire to explore other avenues when trying to establish that sought after but never achievable for more than a brief moment in time, “Certainty” that most of humanity aspires to one way or another.
    As such Uncertainty is one of the leading drivers for people and organisations in trying to find solutions to all the problems that seem to constantly beset mankind.
    In trying to find those solutions new aspects and new fields and new thought patterns are opened up and knowledge, science and technology and society and civilisation take another step forward.
    Uncertainty also leads to a subject or situation being seen in many shades of grey; as so many participants may be Uncertain of the realities or of the correct solution, if there is a permanent one, they are prepared to look at and seriously consider other solutions or aspects as suggested by alternative thinking to a problem or situation.

    “Certainty” on the other hand leads to a very sharp, hard edged unforgiving attitude towards any who dare to hold other views, a situation that is so very obvious in climate science where Certainty, at least in their public pronouncements, is one of the most striking characteristics of the alarmist side of climate science.

    “Certainty” on the part of one party or all parties can and often does lead to severe conflict whereas Uncertainty by all the different parties allows those others to consider the possibility that they might just be wrong and the other party might be right.
    Therefore it might be a good idea to sit down with the other parties and discuss and to consider the alternatives and to take care what you say or accuse the other party of as they might just be right and you in your Uncertainty may be wrong.

    Aviation was quoted as an engineering example of certainty in action, but the Boeing Dreamliner was aeronautical engineering Uncertainty at its most obvious. The Rolls Royce engine failures due to a weak point in an internal oil line were another example.
    This despite, in the Dreamliner’s case, a history of Fibre Reinforced Plastics used in aeronautical engineering going back to the 1940s, with the first fully FRP aircraft, the Phoenix glider, being built by the Darmstadt students in 1956.

    Just one aspect of aviation’s use of FRPs was the lack of confirmed and tested data on both the fatigue life and the fatigue rates and failure modes of FRP aircraft, right through until the unexpected and, as it turned out, serendipitous failure of a glider wing on a fatigue testing rig when the infrared heating lights were inadvertently left on over the weekend.
    The FRP aviation researchers and engineers were ecstatic, as the data from this wing failure at last allowed them to calibrate a major fatigue Uncertainty in the use of FRP in aviation.

    Uncertainty has the effect of channelling knowledge, experience and technology towards trying to establish that Holy Grail of “Certainty” in one field but does not provide Certainty outside of that very narrow field which will itself eventually again become “Uncertain” due to the Uncertainties in other fields that impact on the established “Certainty” leading to further “Uncertainty”.
    [ Umm, that’s convoluted to say the least! But so are the memes of Certainty and Uncertainty ]

    All in all “Uncertainty” is one of the great drivers of innovation and advancement in every field of human endeavour, including technology, politics, finance, society and the advancement of civilisation.

    “Uncertainty” can lead to great advancement.
    It can also lead to vicious conflict and killing.

    Like all such memes “Uncertainty” in the whole range of human existence is neither black nor white but is of many, many shades of grey. It is to our good, and it is due to “Uncertainty” in the whole field of human endeavour that we as a species have got as far as we have up the ladder of civilisation.

  42. –e.g., The certainty provided by lifetime employment makes Western academics lie to get tenure, and others to remain silent, like the German citizenry during WWII who stayed purposefully oblivious to the plight of the Jews?

  43. + many, ROM on, not a comment but a mini essay concernin’
    the holy grail of certainty and its opposite, uncertainty the driver
    of innovation.
    bts.

  44. Climate Change: Let’s Just Keep Teasing that Bull. Who’re You Going to Listen to – Grownups?
    http://therealtruthproject.blogspot.com/2013/12/climate-change-lets-just-keep-teasing.html


  46. Pingback: Weekly Climate and Energy News Roundup | Watts Up With That?

  47. Pingback: The value of uncertainty | Reflections
