Week in Review: April 2, 2011

by Judith Curry

It has been several months since we’ve had an open thread like this, so it is high time for another.

The first issue of the new journal Nature Climate Change has been launched; the first month is freely downloadable. My reaction to the first issue is that many of the articles have less meat and depth than posts on the more serious blogs. There is an interesting article entitled “Data on demand” that is definitely worth reading: “Climate scientists are under pressure to make their data — and their methods — more openly available, both to fellow scientists and the public. Now, open-access climate science is becoming easier than ever.”

AGW Observer has assembled links to an interesting group of papers on geoengineering.

Climate Central has a post on climate change and agriculture.

In case you’ve never visited this site, Climate Law Blog has some interesting posts.

Dotearth has a reasonable post about the kerfuffle regarding the Berkeley Earth project and Muller’s testimony. I don’t really get what all the angst was about. Muller was asked to testify about the integrity of the climate data and the process by which these datasets were assembled. His testimony addressed these issues, based on what he knew from his preliminary analysis. On this subject, Digging in the Clay has a post entitled “An indication of the growth of station bias?”

No Frakking Consensus has a post entitled “IPCC Insider’s Club.”

Let me know if you’ve spotted anything else of interest.

420 responses to “Week in Review: April 2, 2011”

  1. The angst about Muller is that he is making public statements on BEST’s behalf saying the skeptics are wrong. Seems simple enough.

    • Just to elaborate, Muller is acting like an academic scientist, which he is, who is oblivious to the political nature of the issue, which he should not be. There is standard language that he should be using — “I cannot comment on this project because our work is incomplete.” Instead he is releasing preliminary findings with no work to back them up, and much work left to be done. He made a strong promise of transparency and he has now broken that promise, so he is discredited. Transparency does not mean promising to publish the work at a later date, it means providing the work with the conclusions. It is painfully simple; he did something stupid, so now he has to pay the price.

    • Well, he IS a believer in AGW, therefore the sceptics MUST be wrong – from his POV.

      Expectations were too high – and that’s usually the prelude to disappointment.

    • See, I don’t get this – the temperature records (all 5 of them now) agree substantially, therefore cAGW is true and we are all going to die?

      Surely only total nutters were thinking that surface temperatures weren’t rising – to 1998 at least? For these people I was going to go with a Rudyard Kipling quote – ‘everyone is mad on at least one point.’ Having started a study of mad web sites (the Earth doesn’t rotate, Einstein is wrong because…, the Official Rolling Stones…, Kate and William’s Very Excellent Wedding, etc.) I decided to go with this good housekeeping tip instead.

      “For mad scientists who keep
      brains in jars, here’s a tip:
      Why not add a slice of lemon
      to each jar, for freshness.”

      No nutters here – right?

  2. These are the darkest days, of scientific hysteria, when every loyal academic can be expected to rally with his/her tribe of fellow academics. Roy Spencer has also just posted an article that warns he is focused upon fighting in the peer-review arena only, and will not be defending his work against blog attacks (no surprise there, coming from a “lukewarm skeptic” of the consensus). And of course Judith Curry will continue to be politically correct, and as incapable of focusing upon the critical evidence as is the “consensus”. The present epidemic of academic incompetence, and academic intransigence, will be swept away in due course, but not without monumental travail, as is obvious from the worldwide political consequences of the failed science. It has been a long time coming, and it will not go quietly and gracefully, much less rationally (because dispassionate reason was the first to go).

    • Harry, I wish I could write as well as you do. All I can add is a hearty “Hear, hear.” However, I would like to emphasise one of your sentences: “And of course Judith Curry will continue to be politically correct, and as incapable of focusing upon the critical evidence as is the ‘consensus’.” This, to me, has been one of the great disappointments of this blog, Climate Etc.

      • You guys are nuts. With this blog Dr. Curry has done more to promote debate and legitimize skepticism than anyone else on earth.

      • Agreed. It’s hard to tell where she’s going sometimes. That’s a good thing. One sure way to kill this blog is to have an obvious consensus.

      • I agree with ChE that we absolutely do NOT want to have a consensus on this blog.

        If we wanted that, we could simply tune in to RealClimate or ClimateProgress, where dissenting views are simply censored out if they become too convincing or embarrassing.

        I disagree that Judith is falling over to be “PC”.

        While I may personally not always agree with her view, I think she is delicately walking the tightrope between CAGW advocacy and skepticism, bringing interesting topics for discussion and, above all, letting the posters do the talking.


      • So we have a consensus that we don’t/shouldn’t have a consensus. Now that we’re all agreed on that…

      • ian (not the ash)

        Reminded of that wonderful scene in Monty Python’s ‘Life of Brian’:

        Brian to crowd: “You are all individuals”
        Crowd in concert: “We are all individuals”

      • you forgot the punchline – the one guy who says “I’m not!”

      • Agreed

        We disagree on everything.

      • So…. have we agreed to disagree?

        Or do we disagree? :-)

      • Jim

        I refer examination of that question to the mathematician and logician Alfred Tarski (http://en.wikipedia.org/wiki/Alfred_Tarski), whose life can teach us much on disagreement, and whose work can teach much about truth.

  3. Dr. Curry,
    If you have not been watching ClimateAudit, then you have missed out on a number of important posts this past week or two. Perhaps most interesting is the “hide the incline” that McIntyre found. Not only did Briffa delete data after 1960 in his reconstruction, he also deleted data prior to 1550, to hide the fact that the data was a little untidy there as well. Anthony Watts made this issue a top post for a few days. See http://climateaudit.org/2011/03/21/hide-the-decline-the-other-deletion/

    • Ron, I agree that McIntyre’s latest posts are very interesting. I am also wondering about historical record from 1850-1899, which isn’t included in the “trick” and which doesn’t seem to agree very well with the proxies.

      • Dr. Curry,

        I hope you are coming to appreciate that almost ALL of the published proxy climate reconstructions of the past millennium are deeply suspect, for arbitrary data manipulations, suspect statistical procedures, and grossly understated error bands.

        Peter D. Tillman
        Consulting Geologist, Arizona and New Mexico (USA)

      • Peter, I definitely agree that there are substantial problems with proxy records, and even bigger problems when you try to determine a global temperature anomaly from them.

      • Loehle’s temperature reconstruction can be considered a reasonable one, since it correlates relatively well with the changes (first differences) in the Arctic’s geomagnetic field. There is an odd ~110 year period (1380-1490) with a sharp temperature drop and the correlation turning temporarily negative, but subsequently returning to the original tendency. http://www.vukcevic.talktalk.net/T&dB.htm
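The kind of comparison described here, correlating one series against the first differences (year-over-year changes) of another, can be sketched generically. The two short series below are invented purely for illustration; they are not Loehle’s reconstruction or the geomagnetic record:

```python
import statistics

def first_diff(series):
    """Successive changes of a series: x[1]-x[0], x[2]-x[1], ..."""
    return [b - a for a, b in zip(series, series[1:])]

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Invented toy series whose year-over-year *changes* co-vary,
# even though their absolute levels are on different scales.
temperature = [0.0, 0.2, 0.1, 0.4, 0.3, 0.6, 0.5, 0.9]
field       = [5.0, 5.4, 5.2, 5.8, 5.6, 6.2, 6.0, 6.8]

r = pearson_r(first_diff(temperature), first_diff(field))
print(f"correlation of first differences: r = {r:.2f}")
```

A sub-period where the correlation temporarily flips sign, as described for 1380-1490, would show up by computing r over a sliding window rather than over the whole record at once.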

      • Paul Vaughan

        Good work vukcevic. It’s reassuring to see people cluing in to rates of change. The nature of the variability absolutely does NOT follow conventional mainstream assumptions. We’re going to have to start getting more assertive with those still pushing old-school bad-assumption-laden parametric inference.

      • Second to that. Interesting analysis, and I agree that Loehle’s reconstruction at least passes the giggle test — perhaps because both the claims and methodology are modest.

      • V–

        Do you have an introductory page to your stuff somewhere? Your work looks interesting, but I have no idea who you are or where to start.

        “It’s a sin to waste the reader’s time” — Larry Niven

      • David Bailey


        I think what people are pointing out here – perhaps a little harshly – is that when it is realised that key players in a scientific discipline have published fraudulent results, it is absolutely essential that this is recognised as such. Until the wound has been cleaned, the rest of the science is forever tainted. People want to know just how far this goes.

        I mean, obviously you don’t personally collect all the data that you use for your research, so don’t you now wonder if that data has been honestly processed?

        Every time I hear senior people refer to the inquiries that “cleared the CRU scientists of any wrong doing”, I am reminded that the moment of reckoning has yet to come, and that until then, every bit of climate science – your subject – is suspect!

      • David, re your statement “I mean, obviously you don’t personally collect all the data that you use for your research, so don’t you now wonder if that data has been honestly processed?” – I have made several public statements to this effect, shortly following Climategate, and also in a Discover magazine interview that was published last May.

      • David Bailey

        But Judith, imagine what impact you could have had if you had spelled out the problems when you gave your evidence to Congress – even perhaps restricting your comments to this question of honesty for added emphasis!

        I’m not a climatologist, so in a sense I am like a member of the general public. I know that the science is being manipulated by some of the major contributors, and that until that is stopped, nothing that you or any of your colleagues do has any value!

        I realise that you have stuck your neck out a fair way already, but clearly there are a lot of people – not least the current President of the Royal Society – who are willing to hide the truth; only a very bold and repeated assertion of the most fundamental requirement in science – honesty – can make any difference.

      • Actually, I was asked to testify on the subject of uncertainty.

      • David Bailey

        Please understand, I am not trying to get at you, but isn’t dishonesty in climate science the ultimate example of uncertainty?

      • Yes, it’s the ultimate uncertainty!

        There is also some certainty that there is a warming bias in all “processed” data, due to confirmation bias – which, again, is a kind of certainty in science.

      • There is uncertainty, and there is uncertainty, but, alas, like crying in baseball, there is no certainty in science.

        There is Uncertainty, the Subjective, as when investigators give their feelings about the goodness of their data or of the stupendousness of their models, and since AR4, providing numbers to morph the subjective into the objective. Making science out of whole cloth.

        Then there is Uncertainty, the Objective. This is the axiomatic probabilistic cloud around every fact, and around every prediction. Then when the new fact is fit to the prediction, a scientist can measure how close that new fact fits, and begin to score validation.

        I have a time-tested lecture on science and in it I include a set of axioms of science. The list begins with a semi-facetious Axiom Roman Numeral Zero: Science is first rational. This I introduced especially to answer the late Dr. P. K. Feyerabend, the Irrationalist, who formally denied the existence of the scientific method. But if you are going to admit dishonesty into the mix, my first Axiom needs to be modified: Science is first rational and honest.

  4. Thank you…..

    …for (Bob Heinmiller, the greatest of the great. …smile.), Hank Stommel,

    …Walter Munk. Peter Webster…..

    ….even, thank you for Carl Wunsch, Mel Briscoe, Bob Corell, Margaret Lienen…

    Once upon a time, science was science and the greatest were the most modest. That time will come, again. …smile. ………Lady in Red

  5. Hank Zentgraf

    Muller should have declined the invitation to testify until the BEST work was vetted and published. That is what is expected of scientists in other disciplines. What is it about climate scientists in particular that drives them breathlessly toward the political process to influence public policy?

    • Congressional hearings are not about presenting primary or original scientific results. This particular hearing was about process. I agree that there was no need to present any BEST results in the testimony. The point about process that Muller’s one graph makes is that their preliminary analysis is in broad agreement with the previous analysis, but that there are a number of outstanding issues in how these data should be interpreted in the context of creating a time series of global annual temperature anomalies.

      • The preliminary results had already been widely reported.

        Would it not have seemed odd if he were to provide testimony without addressing the findings that were already circulating like wildfire throughout the “climate debate?”

        He stated that the results were preliminary and subject to change as they continued in the process. Was he wrong in assuming that people understand what that means?

      • What he was wrong in is not releasing the data and methods that those results are based on. This is the integrity issue, which is arguably the sorest spot in the entire debate.

      • It seems reasonable to me to argue whether, in the end, a better result would be reached if data and a full description of the methodology were openly released even as the process was ongoing. It’s an interesting question about how science would best be carried out. I can see the validity of arguments on both sides of that debate – and I tend to lean towards the side of increased openness – but it doesn’t seem like a matter of “integrity” to me. I think that such language obfuscates the real debate and substitutes moral judgement.

        However, given that people choose to approach that question differently, I don’t get the outrage about an announcement of preliminary findings along with appropriate caveats. They have a right to close their process and announce their results just as people have the right to question their results because they aren’t provided access to the process.

        I’ll also note that the openness of their process isn’t quite so black and white. Anthony Watts posted about his inside view into BEST’s methodology. They have promised to make all data and methodology available upon completion of their study. It does not seem to me like they are lacking integrity in their approach, even if it might be better if they were more open as the process is ongoing.

        I will also note that they are being attacked for using Watts’ data in ways that others don’t approve (read Willis’ attack posted at WUWT).

        I find this rather ironic, as the fear of misuse of data is one of the oft-heard arguments from “consensus scientists” in defense of their more closed scientific process.

      • Yes, someone please get Watts to release the surfacestations project data.

      • You’re not paying attention – Watts’ paper will presumably be published later this year.

      • steven mosher

        To anyone familiar with the method as described on their website or familiar with the data there is nothing whatsoever shocking in Muller’s findings.

        We know, for example, that any randomly chosen subsample of all available climate data will produce a curve that matches existing curves. Pick any 100 stations from GHCN, from GCOS, and you will match GISS and CRU.
        We also know that the answer does not vary substantially when you use different methods. Use a CAM method or an RSM method or a least squares method and you will get the same answer.

        Muller’s team is building a least squares method. RomanM and JeffId have done the same. So has Nick Stokes. So has Tamino. We know that their answers match CRU and GISS. The shocking result would have been if BEST’s preliminary results did NOT match; that would have been evidence that they got something horribly wrong. Basically, all the methods, CAM, RSM and “least squares” models, will produce a correct result. The differences are minor but important to those of us interested in certain technical issues.

        The data they used is a random sample. They are using a random sample to test code and algorithms. This is a good discipline to follow and the first time to my knowledge that it has been done.
        As it is part of the testing process it is really of minor interest.

        For open code development there are a couple of approaches.
        1. Open the dev tree to anyone who wants to watch the progress
        2. Open the dev tree when you go final.

        The first approach is usually taken when we want or need community help to build the basic system. The second is taken when we need to provide a working reliable infrastructure for others. The BEST effort belongs in the latter category.

      • I could have simply waited for you to respond.

      • Steven, I have trouble squaring your picture with sampling theory. The whole point of confidence intervals is that different random samples of the same postulation should give very different results.

        Also, how do you “match GISS and CRU” when they do not agree?

      • Sorry, “population” not postulation. It is your postulation I am questioning.

      • steven mosher

        Maybe this will clarify an ambiguous sentence I wrote.
        If you look at the distribution of trends of all stations you will find that it is normal. That is, create a distribution of the trends of the 7,000 or so stations in GHCN or the 30K+ stations in GCOS: it’s normal (actually a bit peaky).
        So, let’s say from 1900 to today the population of station trends is centered on .8C per century. Pick 2% of those at random.

        I’ll guess the trend of that sample is .8C.

        So, I’m not the least bit amazed that BEST matched GISS and HADCRU by picking 2% and applying a method that is known to produce the same results as CAM and RSM.

        Here’s another thing. Pick all the airport stations. What’s your guess? Mine is .8C. I’ll beat your guess every day of the week. That’s because segregating the data by airport/no airport doesn’t substantially change the answer.

        Pick the longest records in the database. Answer? .8C.

        Now if I had said EVERY randomly chosen subsample, then you’d have a point, since obviously I could deliberately choose 100 sites with the lowest trends or 100 with the highest trends. But I’m not surprised, nor should you be, that they randomly selected 2% and came out with essentially the same answer. Maybe by saying ANY sample I was a bit unclear. If that’s the case, then perhaps what I wrote here clarifies your concern: the distribution of trends is normal, so select a subsample of them and your best guess for its mean is the mean of the trends.
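Mosher’s sampling argument, that the mean of a small random subsample of a roughly normal trend distribution lands very close to the population mean, is easy to check numerically. This is a toy sketch with synthetic data; the .8C/century center, the 0.3 spread, and the 7,000-station count are loose assumptions taken from the comment, not real GHCN values:

```python
import random
import statistics

random.seed(42)  # fixed seed so the sketch is reproducible

# Synthetic "population" of station trends (C/century), roughly normal.
population = [random.gauss(0.8, 0.3) for _ in range(7000)]

# Draw a 2% random subsample, as in the preliminary exercise described above.
sample = random.sample(population, int(0.02 * len(population)))

pop_mean = statistics.mean(population)
sample_mean = statistics.mean(sample)

# Standard error of the subsample mean is sigma/sqrt(n) ~ 0.3/sqrt(140) ~ 0.025,
# so the subsample mean should sit within a few hundredths of the population mean.
print(f"population mean trend: {pop_mean:.3f} C/century")
print(f"2% subsample mean:     {sample_mean:.3f} C/century")
```

Rerunning with different seeds moves the subsample mean by only a few hundredths of a degree, which is why a matching preliminary curve is unsurprising; deliberately picking the coolest or warmest stations, by contrast, can move it by several tenths.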

      • steven mosher


        Obviously, I’m using the term loosely. The match I am looking at is matching trends over the interesting periods.

        1900-1940: an interesting period because of the relative sparsity of the data and the magnitude of the trend, which is relevant to understanding natural variation (perhaps).
        1980 to present.

        What will be interesting is seeing if the new sources that they add will change anything in the period from 1900-1980.
        From 1980 to present I’d be surprised if anything changes much. GISS and CRU are already close in trend to UAH and RSS.

        When the final algorithms are done and when the UHI study (using MODIS) is done then you’ll have an interesting comparison. I have some concerns about MODIS that I cannot confirm, but in short order (some bug fixes required in R packages not under my control) I’ll have more to say.

        The most interesting thing will be the uncertainty levels.

        Basically, adding new sources will only smooth things out. That’s because the spatial correlation of the temperature field is pretty high (longitudinally). But hey, I could be wrong; all those places we are not sampling could be cooling.. NOT!

      • Steven Mosher,

        “We know, for example, that Any randomly choosen sub sample of all available climate data will produce a curve that matches existing curves.”

        Maybe I am missing something (it wouldn’t be the first time). It seems to me that the whole purpose of reviewing the UHI and station siting issues is to determine whether increases in temperature at individual stations due to such issues (in addition to climate) caused an increase in the reported “existing curves,” and whether adjustments for such issues in establishing those curves accurately account for those effects.

        If the average annual temperature in the U.S. is reported to have increased .8 degrees since 1900 (forgetting the issue of such accuracy over such a large area and time period), and raw temperatures at urban or poorly sited stations increased at a significantly higher rate, then it seems that the whole issue is what adjustments were made and how, and whether the adjustments were sufficient. Otherwise the reported trends were artificially inflated.

        “Pick all the airport stations. What’s your guess? Mine is .8C. I’ll beat your guess every day of the week. that’s because segregating the data by airport/no airport doesnt substantially change the answer. ”

        That should be true if there is no UHI effect, or if the data is properly adjusted to reflect the impact of UHI (and siting issues). But it seems to me that to say that a random sample of BEST data would necessarily match the existing reported trends just assumes that those trends were accurately reported in the first place. Which seems to beg the whole question behind the surfacestations.org project.

      • steven mosher


        “Maybe I am missing something (it wouldn’t be the first time). It seems to me that the whole purpose of reviewing the UHI and station siting issues is to determine whether increases in temperature at individual stations due to such issues (in addition to climate) caused an increase in the reported “existing curves,” and whether adjustments for such issues in establishing those curves accurately account for those effects.”

        The BEST study has not even begun to study the UHI effect. As it currently stands, GISS does an adjustment for UHI. CRU does not; CRU increases the error bars in one direction to envelope the effect. BEST proposes to use a very new data set from the MODIS program, MODIS urban extent. This product has a 15 arc second resolution (500 meters at the equator) that decides whether a grid square is “built” or “unbuilt”. There are other urban proxies as well, ISA for example. Further, the issue of SITING is different from the issue of UHI. SITING refers to the 100 meters surrounding a station and microclimate effects. UHI effects can be mesoscale.
        Simply, a site that is RURAL on the mesoscale (within 20km) can still have bad siting. It can be in the shade, for example.

        “If the average annual temperature in the U.S. is reported to have increased .8 degrees since 1900 (forgetting the issue of such accuracy over such a large area and time period), and raw temperatures at urban or poorly sited stations increased at a significantly higher rate, then is seems that the whole issue is what adjustments were made and how, and whether the adjustments were sufficient. Otherwise the reported trends were artificially inflated.”

        The issue is more complicated than that. There is no such thing as raw data. The rawest it gets is a handwritten form filled in by the observer, which of course has errors. The issues are:
        1. station histories, which are incomplete and potentially inaccurate or missing;
        2. known changes in observing practice:
        a. location changes,
        b. time of observation changes,
        c. instrument changes;
        3. location accuracies;
        4. historical data on past land forms;
        5. proxies used to define “urbanity” or other human alterations to the land form which may have an effect on the observed temp, both positive and negative.

        ““Pick all the airport stations. What’s your guess? Mine is .8C. I’ll beat your guess every day of the week. that’s because segregating the data by airport/no airport doesnt substantially change the answer. ”

        That should be true if there is no UHI effect, or the data is properly adjusted to reflect the impact of UHI (and siting issues). ”

        No, it’s true even WITHOUT adjustments for UHI. The reason is quite likely the following: some airports show a warming trend that is partly due to the general warming of the climate and partly due to the changes made in the heat capacity of the land, asphalt versus “natural” landscape. But that is not a universal phenomenon. Airports are typically constructed with long fetches, and wind, wind over 7 m/sec, effectively mitigates UHI due to changes in heat capacity. So, airport means no high buildings, long fetches, winds blowing, and good conditions for convection working to mitigate UHI.
        Also, some airports are actually cooler than their surroundings. Go figure.

        “But it seems to me that to say that a random sample of BEST data would necessarily match the existing reported trends, just assumes that those trends were accurately reported in the first place. Which seems to beg the whole question behind the surfacestations.org project.”

        Surfacestations delivers exactly what was missing: documentation of the SITE characteristics. The climate question is: how big is the effect of putting a site in a less than perfect place? That has never been established definitively. What is established is this:
        1. Scientists are CONCERNED about micro-site effects.
        2. To address those concerns they developed a siting guide.
        3. Current stations don’t follow the siting guide very well.
        What a proper study will address is this:
        1. How important are those concerns?
        2. How big is the ACTUAL effect of violating guidelines?

        The mere fact of a violation does not necessarily ENTAIL that the recorded temp is corrupt. That is an empirical question. Theoretical concern, empirical answer.
        So, it’s good that scientists were concerned. It’s good that citizens collected the data to test the ACTUAL effect. It’s good that the theory (bad siting = bad data) will be tested.
        Early indications are that the effect is small. More testing is in progress.

      • Steven Mosher,

        My question regarding “the whole purpose of reviewing the UHI and station siting issues” was not about the methods BEST is using, but with respect to Muller’s statements regarding the surfacestations.org project. The question was prompted by Muller’s testimony, after reviewing the surfacestations data, that the existing temperature trend analyses will not need to be adjusted.

        Muller’s testimony included this:

        “Many temperature stations in the U.S. are located near buildings, in parking lots, or close to heat sources. Anthony Watts and his team has shown that most of the current stations in the US Historical Climatology Network would be ranked “poor” by NOAA’s own standards, with error uncertainties up to 5 degrees C.

        Did such poor station quality exaggerate the estimates of global warming? We’ve studied this issue, and our preliminary answer is no.”

        Muller wasn’t just saying that “BEST matched GISS and HADCRU…,” but also that the surfacestations data will not change that conclusion.

        In another post on this blog, Dr. Curry quotes Anthony Watts as follows:

        “But here’s the thing: I have no certainty nor expectations in the results. Like them, I have no idea whether it will show more warming, about the same, no change, or cooling in the land surface temperature record they are analyzing.”
        “While NOAA and Dr. Muller have produced analyses using our preliminary data that suggest siting has no appreciable effect, our upcoming paper reaches a different conclusion.”

        Watts does not seem to be as sanguine as you are about the pronouncements being made by Muller.

        Muller expressly claims to answer the question of station siting (however preliminarily). You seem to agree: “early indications is that the effect [of poor siting] is small.” I guess I am wondering why you are agreeing with Muller’s comments on the surfacestations project before Watts has even published, particularly where Watts himself has already made clear his disagreement with Muller?

      • maksimovich

        “Any randomly chosen subsample of all available climate data will produce a curve that matches existing curves.”

        Here is the 30 yr record for Wellington NZ(1980-2009)

        The appearance of duck trajectories (canards and black swans) suggests otherwise.

      • steven mosher

        Pick any 100 stations randomly.

        I could of course pick high-latitude stations that show more warming. Given the distribution of trends (.8C +-) it’s trivially easy to pick locations that cool while the rest of the planet warms. It’s also easy to pick stations that warm at 2X or 3X the average rate. I imagine that during the MWP it’s easy to find places that were cooler than today. Therefore… what? The MWP didn’t exist? Put it this way: finding areas that cool over a 30-year period while the vast majority of places warm isn’t a discovery. It’s not inconsistent with theory. What would be interesting is seeing what geographical features of Wellington allowed it to buck the general trend. For example, if you look at stations that tend toward the cooler end of the distribution they often have unique geographical features (mountain valleys, bays, etc.).
        Is there anything unique about that location?

      • maksimovich

        Firstly, it is not an isolated station; others in the official series also show the same topology over 15 degrees of latitude.

        Second, this is a re-audited time series (by ABOM), so it allows us to ask legitimate questions about the robustness of the so-called global trends.

        Third, these stations are intentionally excluded from the GISS series, i.e. the updates finish in the 90’s; see Steve Mac.

        In addition, these mid-latitude stations are important indicator stations due to:
        a) the distance from industrial aerosols,
        b) the amount of solar energy that is available (7% greater than an equivalent NH station),
        c) the geometrical position in the Ferrel cell,
        d) their response to atmospheric and ocean transport,
        e) the swan population is mostly black (as noted by Popper during his time there).

        The most important observation, however, is the duck trajectories, i.e. canards.


      • steven mosher

        Second, this is a reaudited time series (by ABOM) so it allows us to ask legitimate questions on the robustness of the so called global trends.

        Third these stations are intentionally excluded from the GISS series ie the updates finish in the 90′s see Steve Mac.


        1. I fail to see how this “allows” you to ask questions. What you have is a local phenomenon: local in space and local in time. If this phenomenon were pervasive, we would not even be having this conversation. If the phenomenon were pervasive, you would see UAH and RSS diverge from GISS and CRU.
        At the surface (2 m), one can very well imagine pockets of the field (like eddies) where the trend runs counter to the average of the rest of the field. Whatever physical process leads to this entrainment would be a fun thing to investigate. Nevertheless, the heat at the surface rises and is sampled again in the troposphere. Measured there, we find the same average trend for the field.

        2. The stations are not “intentionally” dropped from GISS. For a current list of stations which are manually dropped for data-corruption errors, you need to look at the GISS code.
        For my analysis, and Zeke’s, and Nick Stokes’s, no such drops are made. Our answers match CRU and GISS. For grins, I’ll suggest you see Nick Stokes’s work on picking 60 long stations to cover the globe. The answer? Nothing changes. Or have a look at AR4 and the composite of the four longest stations on record.

        The world is getting warmer. You do believe in an LIA?

    • Hank

      Let’s forget what Muller said or didn’t say before Congress.

      More important will be what his committee says when they issue their report.

      If this ends up simply being another whitewash (“everything’s jes’ fine, folks”), it will be another nail in the coffin for the credibility of the “consensus” among an already wary general public.

      If it exposes some of the suspected problems, raising doubt on the temperature record, it will also be a nail in the coffin of the “consensus”.

      As I see it, the “consensus insiders” will be the eventual losers from this study either way.

      My opinion, of course.


      • What I find interesting about that comment is that it seems that even while you say we should wait to judge the outcome, you make it clear in your list of the possible outcomes that in fact you have made a judgement.

        You omit even the possibility that the outcome will neither be “everything’s jes’ fine, folks,” nor raise doubt about the temperature record.

        You have already determined that it isn’t even within the realm of possibility that the results will show just as Muller suggested the preliminary findings show: that the problems with existing temperature record analyses do not amount to statistical significance.

      • You have already determined that it isn’t even within the realm of possibility that the results will show just as Muller suggested the preliminary findings show: that the problems with existing temperature record analyses do not amount to statistical significance.

        You jump too fast and think too little, Josh – Max covered that possibility in his first scenario. There are known problems that need resolution even if the results show that the other records underrate the temp changes – although that’s not likely.

      • They may not be “resolvable.”

        Further, if they can be quantified and are determined to not be of statistical significance, it isn’t necessary to resolve them.

        Obviously, the process of quantification is the nut to be cracked – but Max has dismissed even the possibility of such.

        What I find interesting about that is that not more than a couple of weeks ago, the leading “skeptic/denier” w.r.t. those temperature records thought that the methodology BEST is using had the potential to quantify the problems.

        Max’s “first possibility” was that it would be a “whitewash.”

      • You’re talking nonsense, Josh – if those problems can be quantified then the “resolution” is, by definition, the significance, if any, that the numbers have.

        the process of quantification is the nut to be cracked – but Max has dismissed even the possibility of such.

        And just what words did he use to dismiss that possibility? He presented two possibilities – there are others. There are ALWAYS others. You’ve never done contingency planning for large systems, have you?

        a couple of weeks ago, the leading “skeptic/denier” w.r.t. those temperature records thought that the methodology BEST is using had the potential to quantify the problems.

        And what makes you assume that Anthony has changed his mind about the methodology?

  6. From the BEST Initial Findings page: “We are correcting our programs and methods while still “blind” to the results so that there is less chance of inadvertently introducing a bias.” (http://berkeleyearth.org/findings).

    Apparently, not so “blind”…

  7. This talk from Dr. Vincent Courtillot was posted on Bishop Hill and is well worth watching IMO… http://bishophill.squarespace.com/blog/2011/3/31/courtillot.html

    • GregP

      Thanks for link to Dr. Courtillot’s talk on Bishop Hill.

      Agree that this is a must-watch. Courtillot makes his case very clearly, starting with the statement that there has been a break in the balance among the three pillars of scientific progress, namely observation, theory and numerical modeling, and that in his opinion:

      “There has been much too much emphasis on the numerical side and absolutely not enough on the observational side, and observation is the key thing that should be supported in the coming decade.”

      (He got a round of applause for that observation).

      He then expresses surprise that there are only three groups recording global temperatures (GISS, NCDC, Hadley). As a sideline he mentions that he once asked Phil Jones for data and was refused (later came out in Climategate).

      Courtillot then systematically argues that the concept of a global temperature is flawed, and that regional temperature records over the 20th century (Europe/North America) show totally different patterns, with no correlation with each other or with CO2. He points out that there have been multi-decadal swings in the global record and no warming over the past 12 years. He then argues that 20th-century warming is neither without precedent over the past 2000 years nor faster than in earlier periods, and that these periods (MWP, LIA, Roman Optimum, etc.) can be shown to correlate well with changes in solar activity.

      He switches to the hypothesis of a multi-decadal solar effect on our climate, showing an “M curve” of solar activity correlating closely with temperatures of stations in the Netherlands over the 20th century. A similar correlation is seen between pulsations in the Madden Julian oscillation in the N. Pacific ocean and solar activity. He mentions a correlation between cosmic rays and the Earth’s rotation, related to the impact on winds. The cosmic ray/cloud mechanism (Svensmark et al.) is mentioned, as is the CLOUD experiment in progress at CERN and observed cyclical changes in the vertical electrical field as another possible mechanism. He makes the point that a 10% change in cloud cover would represent over twice the change in forcing as a doubling of CO2, and that there was an observed 2% change in cloud cover from 1980 to today, which correlates well with observed changes in cosmic rays.

      Courtillot then discusses problems with the climate models. One of the most serious is that they are not falsifiable, but in order to be scientific they must be. He points out that when one notes something wrong with the model results, the modelers simply “twitter a parameter” and it’s OK, without addressing the key problem. He points out that there is an enormous underestimation of uncertainty and a lack of acknowledgement of what we do not know in climate science today, that he is against frightening people, and that a temperature increase of 2C would be no problem, although he does not believe that it will occur.

      Finally, he concludes that there most likely has been some 20th century global warming with regional irregularity, but that we do not know the cause for this today. We need another 5-10 years of data (instead of just modeling) to see whether the IPCC models can be discarded or not. More observations are required.

      I may have missed something, but that seems to be the essence of his presentation. He mentions a preceding presentation by Nir Shaviv, particularly with regard to the solar influence on climate (but I have not been able to find a link to that presentation).


      • Max –
        I just watched the video – very impressive and highly recommended. He ties a lot of things together – but with uncertainty, as a true scientist should. He makes no claim wrt actual causation, but shows correlations that most of the present climate establishment has been sweeping under the rug for the last ten or more years. And those should be high on the list of items needing investigation/research – BEFORE political action to destroy lives and economies is taken.

        BTW – the Nir Shaviv presentation is also on YouTube at http://www.youtube.com/results?search_query=Dr+Nir+Shaviv&aq=f

      • Very good synopsis Max.

        It was very interesting, and it highlighted the issues with verifying causation. The complexity brings me back to the need for better non-linear math approaches like the ones Tomas has brought up. Whether it is UV variation, cosmic rays or some other factor, there is something, or some combination of things, that needs to be better understood in natural variation.

      • Max –
        In case you missed it –


        And to correct my previous error, the Shaviv segment was at –


      • Carter’s presentation is terrific – all politicians should see it.

    • Craig Loehle

      I looked into the publications of Dr. Vincent Courtillot re his video: he has a stellar publication record in geophysics and geomagnetism. Papers look impressive.

    • I have a few criticisms.

      1) He claims surprise that there are only three groups recording global surface temperatures, and thinks there should be more like 20 such groups. But the reason there are only 3 groups is that there is no need for more. As Mosher points out (April 2, 2011 at 5:23 pm), the results change little when the analysis is done using various different methods and samples of the input. Anyone who realizes this is not surprised there are only 3 global temperature records; the number is constrained by need. If we needed 20 records, there would be 20. In fact, I think we are lucky to have even as many as 3, given the redundancy among them.

      2) He dismisses the utility of global average temperature too much. First of all it does have meaning, or rather the trend in it does. If global average temperature is rising then that’s an indication the Earth is “warming” or gaining energy. As long as we make statements such as “12,000 years ago the Earth was cooler” then we are making a statement that implies global average temperature is a meaningful statistic. Additionally if we calculate it was 6 degrees C cooler 12,000 years ago than today, then we can contrast that change with changes in temperature in recent decades.
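      On point (1): the convergence of the three records is less mysterious once you see how mechanical the core computation is. Below is a minimal, purely illustrative sketch of the cosine-latitude area weighting that underlies any global mean temperature series; the function name and anomaly values are invented for the example, not taken from any real data set.

```python
import math

# Toy sketch of the area-weighting step behind a global mean temperature
# series: each latitude band's anomaly is weighted by cos(latitude), because
# equal-angle grid cells shrink toward the poles. Function name and anomaly
# values are illustrative assumptions, not from any real record.
def global_mean_anomaly(band_anomalies):
    """band_anomalies: {band-centre latitude in degrees: anomaly in C}"""
    weights = {lat: math.cos(math.radians(lat)) for lat in band_anomalies}
    total_w = sum(weights.values())
    return sum(weights[lat] * a for lat, a in band_anomalies.items()) / total_w

# Made-up anomalies: strong Arctic warming, modest warming elsewhere.
bands = {75: 2.0, 45: 0.8, 15: 0.4, -15: 0.4, -45: 0.5, -75: 0.3}
print(round(global_mean_anomaly(bands), 2))
```

      Because this averaging step is so simple, independent groups working from largely overlapping station data tend to converge; the interesting differences lie in station selection and adjustment, not in the averaging itself.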

      • Carter addresses these issues well in his video flagged above. I suggest you watch it.

  8. I think Muller is being unfairly criticized in this case. I can’t imagine him getting out of the hearing without stating his preliminary results. Just imagine the follow-up questions if he declined to state them. I also don’t see where he could win in this situation. State the preliminary results and people complain they are preliminary. Don’t state them and those that don’t like what they finally show make accusations that he needed time to massage the data. I think he probably made the best choice. Of course the accusations will come from one side or the other regardless of the final results. C’est la vie.

    • Indeed. He stated what he had, properly caveated that the results were based on a tiny random sample. I agree that there was little likelihood of his being able to refuse to comment. It’s interesting that no one is mentioning the comments where he attacked those who try to attribute current events with the “small warming we’ve seen thus far” or his statement that those who say that that warming has harmed the earth are unscientific. The fact that he and BEST are getting heat from both extremes is a positive in my book.

      • I suppose it bears repeating. From the BEST web site, “We are correcting our programs and methods while still “blind” to the results…” He has put BEST on thin ice now with his testimony IMO.

      • What do you take that phrase, “while still blind to the results”, to mean?

    • You are missing the point. This is not just about the testimony. He started reporting these results two weeks ago. Saying they are preliminary does not get him off the hook, because he has not released the preliminary work to back them up. The meaning, weight and validity of the conclusion are all based on the actual work that produced it and that has not been released. For that matter the hearing was on process and he could easily have discussed that without reporting results.

    I think that when the complete results come in, the raw data will show the many small (a couple of tenths of a degree) modifications from the TOB adjustments, homogenization, UHI adjustments, and the continual downward adjustment of the pre-1980 data sets.

      However, the total composite effect will still end up being small, from 0.1 degree to maybe more than the 0.7 degree CAGW rise in the GISS surface database. By careful further auditing, the where, by how much, and by whom will be able to be seen. To me, what is telling is the chatter among the usual suspects as to who is the most nervous, and who is just angry, about the final results.

      The only way the BEST data set will agree very closely with the other three is if they are all adjusted and homogenized the same way.

      Still to be considered long term are the changes in nebulized droplet size and albedo shifts in clouds due to static and ionic charges, increasing and decreasing with solar wind particle flux into and out of the ionosphere, and changes in the EUV content of TSI.

      If research can tie the two together, the net effects of changes in the flux of solar wind particles [not related to TSI] can be studied along with the cosmic ray/cloud mechanism (Svensmark et al.), and the effects of CMEs as well as solar flares might show a very complicated interaction that needs explaining.

  9. Michael Larkin


    Sure, Muller may have been invited rather than volunteered, but on the basis of a 2% survey, he has chosen to mention preliminary conclusions.

    This is not what people would have hoped he would say, based on his past pronouncements. That would have been something like: “We’ve done some preliminary analysis of only 2% of the data, and it is too soon to draw any conclusions from that. In due course, we will complete the study and publish our results, complete with all data, methods, and algorithms used.”

    You have to be able to put yourself in the position of sceptics. They wanted to believe that Muller’s actions would be in full accord with his stated intentions. If they could feel sure of that, they would be prepared to accept his results however the dice might fall.

    But now, he has introduced doubt by speaking precipitately, and sceptics are wondering if he really is the way he initially came across. Some have already concluded he’s betrayed their hopes, and he’s lost them. I myself, an agnostic, have noted that Pielke Sr. has agreed with Anthony’s perceptions, and this has greatly saddened me, because I respect and trust Pielke and now can’t place wholehearted trust in Muller. So we may be back in the same old situation, with the perception that the BEST project isn’t an honest brokerage.

    If that’s the case, then no end is in sight to the neverending disputes.

    I guess you don’t get it because you don’t perceive the need for honest brokerage in the same way sceptics/agnostics do. To you, Muller may seem to be onside with your own perceptions, and acting honourably. You’d have to walk a mile or two in my shoes to come to appreciate what a blow Muller’s testimony represents. You may not get it, but you have no option but to accept that’s the way it is.

    • And it’s not really that they’ve only looked at 2% of the data but that they haven’t applied all their methods to that 2% (in particular adjusting for poor quality data).

      The difficulty I’m having is not with releasing preliminary results but rather incomplete (therefore incorrect and possibly misleading) preliminary results. If this was a random 2% sample that had all the methods applied to it, that would be more understandable.

      Anytime someone makes public statements (especially to Congress), there’s a lot of pressure to defend those statements no matter how wrong they later prove to be rather than have to publicly retract and admit error (think Hockey Stick).

      It would be helpful to the success of the BEST project for them not to make statements until they have published their results. We should all be able to wait patiently until they’ve done so. Then let the fun begin…

      • Michael Larkin


        I don’t disagree with you. But whatever, the nub is loss of trust, and feelings of betrayal.

    • Actually, Muller seems less skeptical than I am about the “official” temperature record analyses. His work will have far more impact if he finds something of significant difference in his analysis. He understands the delicate line he is walking in terms of credibility on both sides of the debate. My advice on this has been to make the data publicly available ASAP (i.e. last Aug), and not to publish anything until they are ready. Being asked to give testimony at this juncture was awkward. Muller made his best judgment on how to handle this. There are people on both sides that are criticizing his judgment. For anything other than his conclusions about general adequacy of the U.S. data, he did not draw any conclusions from his analysis about “global warming.” So people will just have to wait a few more weeks until he posts the preliminary results of his analyses online and makes the 2% data available for other people to play with.

      • Judith

        I think this points out that BEST is a “lose-lose” proposition for the climate science orthodoxy, as I mentioned earlier:

        If Muller + Co. find no problems (especially if they exclude UHI from the investigation), it will be written off as a whitewash in the eyes of the general public, thereby further eroding the level of confidence in the climate science orthodoxy.

        If they find serious flaws, it will also undermine the credibility of the orthodoxy’s message in everyone’s eyes.

        Am I wrong here, i.e. could you see an outcome that would end up improving the credibility of the orthodoxy in the public’s eye?


      • Brandon Shollenberger

        If it is done openly and well, it will be a “win,” no matter what the results are. Some people will probably view it as a “whitewash,” but they will be a small minority.

      • Max – by what means do you estimate the reaction of the “general public”?

        There is a certain segment which will certainly write off as a “whitewash” any analysis that doesn’t disprove AGW. Polls show that such folks are a distinct minority.

        Despite some of the political controversy over climate science, three out of four Americans believe that the Earth has been gradually warming as the result of human activity and they want the government to institute regulations to stop it, according to a new survey by researchers at the Woods Institute for the Environment at Stanford University.

        The survey was conducted by Woods Institute Senior Fellow Jon Krosnick, a professor of communication and of political science at Stanford, with funding from the National Science Foundation. The results are based on telephone interviews conducted from June 1-7 with 1,000 randomly selected American adults.


        Several questions in the June survey addressed the so-called “climategate” controversy, which made headlines in late 2009 and early 2010.

        “Growing public skepticism has, in recent months, been attributed to news reports about e-mail messages hacked from the computer system at the University of East Anglia in Britain – characterized as showing climate scientists colluding to silence unconvinced colleagues – and by the discoveries of alleged flaws in reports by the Intergovernmental Panel on Climate Change (IPPC),” Krosnick said. “Our survey discredited this claim in multiple ways. “

        For example, only 9 percent of respondents said they knew about the East Anglia e-mail messages and believed they indicate that climate scientists should not be trusted, and only 13 percent said the same about the controversial IPPC reports.

        “Overall, we found no decline in Americans’ trust in environmental scientists,” Krosnick said. “Fully 71 percent of respondents said they trust scientists a moderate amount, a lot or completely.”


        The situation may have changed somewhat since that poll was conducted in June of 2010 – but I highly doubt that it has changed to an extent that would match anything close to your characterization.

        These articles also have polling data (some are more recent):




        Here’s a better article that looks at the questions asked in different polls and looks for biases:


        This article also looks at how different polls get widely disparate results:


        Finally, this article examines “expertise” level of scientists who are “doubters”:


      • Well, the Berkeley group is not measuring its success in terms of whether it supports the orthodoxy. Their goal is to produce more robust and transparent data sets and to use improved statistical methods in the analysis. Whatever answer they end up with, it will have a better foundation than the analyses we currently have. And most importantly, they are doing a serious uncertainty analysis, so that we can more objectively assign confidence levels to statements about the magnitude of the warming.

      • The Honest Broker
        For tangled climate chaos
        Is Mama Nature.

  10. Re the No Frakking Consensus post entitled “IPCC Insider’s Club” – see Judith’s main post for the link. The article suggests that IPCC Reports are written and controlled by a relatively small number of scientists.

    Does anyone know how authors/lead authors etc are actually chosen by the IPCC? Who decides which scientist get which roles for the construction of an IPCC Report? What criteria are used to choose people? What is the process?

    • The process is largely political and dominated by the mission of the IPCC, which is to support AGW theory, which in turn supports a statist mitigation solution in line with the U.N. and collectivists in general.

      • cwon1 – that does not really answer my question. It was a serious question for somebody who knows the answer. When we know the facts we can then make a judgement.

    • What a wonderful question! I don’t know, but I can guess to some degree, namely the usual UN insider stuff. I am sure it begins with the sponsors, who get to pick the people from their countries. The members of the IPCC are countries, not people. As far as I know, only the developed countries are sponsors. This is partly in the public record. Then I imagine there are informal national quotas to be filled, within which it is a lot of who knows whom, plus who pays whom. It would make a great study. Mind you, other segments of the UN may already have been studied.

      • Then there’s the seating of the NGOs. Somebody needs to explain to me how having a .org domain and a million bucks gets some nobody without any official credentials a seat at these things. And it isn’t just the IPCC. The entire UN is a club where you can get in if you whisper the right words into the bouncer’s ear and slip him something discreetly.

      • I have no idea what you mean. The NGO’s have no “seat” at the IPCC, only countries have that. But NGO’s do have a major role to play in the UN because one of the UN’s primary purposes, perhaps its main purpose, is to raise money for developing countries. Much of this money flows through NGO’s, rather than through local governments. So I suspect they may have some involvement with the IPCC, but I have never seen much evidence of that. (From 1994 to 2004 I covered the IPCC for Electricity Daily, plus doing several policy studies on them.)

        Keep in mind that the big sponsors, the USA (especially the State Dept.), Britain, Germany, etc., are heavily pro-AGW. Most of the scientists who volunteer to write the reports are AGW activists. Plus the IPCC is owned by UNEP, whose charter is environmental activism. That is all it takes. There is really no mystery.

      • David – you describe my own journey to a “T”… two years ago I would not have questioned AGW; today I cannot but question it.

        Commenting on the Muller controversy above, I find it incredible that his figures “match” the others, when the others keep getting adjusted — with the earlier years getting colder and colder. How could that be??

    • “Does anyone know how authors/lead authors etc are actually chosen by the IPCC? Who decides which scientist get which roles for the construction of an IPCC Report? What criteria are used to choose people? What is the process?”

      Interestingly, the InterAcademy Council (IAC) was charged with (inter alia) finding the answers to such questions. Their report (released on August 30, 2010 and published in October) notes the following in the Executive Summary (p. xvi) wrt “Transparency”:

      Given the high stakes in climate change decision making and IPCC’s role of providing policy-relevant information, the IPCC can expect that its reports will continue to be scrutinized closely. Therefore, it is essential that the processes and procedures used to produce assessment reports be as transparent as possible. From extensive oral and written input gathered by the Committee, it is clear that several stages of the assessment process are poorly understood, even to many scientists and government representatives who participate in the process. Most important are the absence of criteria for selecting key participants in the assessment process and the lack of documentation for selecting what scientific and technical information is assessed. The Committee recommends that the IPCC establish criteria for selecting participants for the scoping meeting, where preliminary decisions about the scope and outline of the assessment reports are made; for selecting the IPCC Chair, the Working Group Co-chairs, and other members of the Bureau; and for selecting the authors of the assessment reports. The Committee also recommends that Lead Authors document that they have considered the full range of thoughtful views, even if these views do not appear in the assessment report. [emphasis added -hro]

      But while on the subject of data disclosure and disappointment … of all the so-called enquiries pursuant to Climategate, the IAC’s review is probably the only one deserving of any credibility. Yet in the data disclosure department even the IAC disappoints.

      Notwithstanding the report’s claim (p. 11) that:

      More than 400 individuals, listed in Appendix C, provided input. The prevailing views of the questionnaire respondents about the various steps in the IPCC assessment process are summarized in this report and a compilation of all of the responses, with identifiers removed, is available from the IAC.

      to date, only 232 responses have been published. For details of this sorry saga, pls. see:


  11. As a “lay” skeptic (no science background), I have to rely on my own common sense and especially my God-given powers of discernment in trying to come to a judgment with respect to AGW. I came to the issue as a believer. Given my standard liberal politics including a strong bias, bordering on intellectual contempt against conservatives, how could it be otherwise? I bought, without thinking about it much, the premise that there really was a strong consensus among the vast majority of qualified scientists which was based on overwhelming evidence. That’s what the New York Times was telling me, and (so went my thinking) they ought to know. Why would they lie? Why would the scientists whose work they were reporting on lie? Or to put it more charitably as I do not believe even today these people (for the most part) are actually lying, how could they be mistaken? What possible vested interest could they have that would be strong enough to bias them sufficiently to get the science wrong? (Of course I didn’t realize at the time that the money and prestige and career advancement at stake were indeed sufficient.)

    Despite my faith in the AGW position, there was something about the way the warmists dismissed the skeptics, with a certain defensive contempt that came close to rage, that got me wondering. On the one hand, if the earth really was in danger and we really were reaching some sort of “tipping point,” rage might be appropriate. Still, I felt the need to investigate.

    Two years later, I’m firmly in the skeptics’ camp. There’s simply no doubt for anyone paying attention that the warmists are playing fast and loose with the facts, that they’re serial exaggerators, and that the science is in fact far from settled.

    Like many, I suppose, I get frustrated because I think Judith Curry could go further. I recall her being questioned by Congress a while back, and in answer to a question about whether the climategate emails, as bad as they looked, might reflect poorly on the science, she answered a firm “no.” It’s been months, and my memory is a little shaky, so if I have that wrong I apologize. Perhaps the question related to the many mistakes in the IPCC report. But I clearly recall feeling deeply disappointed. Of course these things would reflect badly on the underlying science. The whole AGW case is built on the premise that the current warming, as shown by the hockey stick graph, is “unprecedented.” Absent that foundational premise, does not the whole theory fall apart?

    That said, I respect J.C. tremendously. She’s obviously a courageous woman and she’s done a great deal to open up the debate. In the years ahead, when AGW has finally been shown to be the greatest scientific blunder in modern history, I think she’ll be regarded as something of a hero.

    • For the record, I’ve stated a number of times that I think the climategate emails reflect poorly on the scientists and on the science. The emails do not demonstrate prima facie that any of the conclusions in the IPCC are incorrect.

      • No, but climategate was about, among other things, the MBH hockey stick, and AR3 made a big deal about the hockey stick. Climategate doesn’t refute any conclusions at all, but it certainly does leave a bad odor about the IPCC, particularly AR3. AR4 backed off the hockey stick for other reasons, which partially redeems the IPCC as an institution.

      • I agree Dr. Curry. What the emails showed was the politicization of the science, neither more nor less. It was quite enough.

      • John F. Pittman

        Your statement does not agree with the understood definition of prima facie (http://www.google.com/search?q=prima+facie&rls=com.microsoft:en-us&ie=UTF-8&oe=UTF-8&startIndex=&startPage=1), nor with AR4. In AR4, in the attribution:

        “”The evidence from surface temperature observations is
        strong: The observed warming is highly signifi cant relative to
        estimates of internal climate variability which, while obtained
        from models, are consistent with estimates obtained from
        both instrumental data and palaeoclimate reconstructions. It
        is extremely unlikely (<5%) that recent global warming is due
        to internal variability alone such as might arise from El Niño
        (Section 9.4.1). The widespread nature of the warming (Figures
        3.9 and 9.6) reduces the possibility that the warming could have
        resulted from internal variability. No known mode of internal
        variability leads to such widespread, near universal warming
        as has been observed in the past few decades. Although modes
        of internal variability such as El Niño can lead to global
        average warming for limited periods of time, such warming is
        regionally variable, with some areas of cooling (Figures 3.27
        and 3.28). In addition, palaeoclimatic evidence indicates that El
        Niño variability during the 20th century is not unusual relative
        to earlier periods (Section; Chapter 6). Palaeoclimatic
        evidence suggests that such a widespread warming has not been
        observed in the NH in at least the past 1.3 kyr (Osborn and
        Briffa, 2006), further strengthening the evidence that the recent
        warming is not due to natural internal variability. Moreover, the
        response to anthropogenic forcing is detectable on all continents
        individually except Antarctica, and in some sub-continental
        regions. Climate models only reproduce the observed 20th-century
        global mean surface warming when both anthropogenic
        and natural forcings are included (Figure 9.5). No model that
        has used natural forcing only has reproduced the observed global mean warming trend or the continental mean warming
        trends in all individual continents (except Antarctica) over
        the second half of the 20th century. Detection and attribution
        of external influences on 20th-century and palaeoclimatic
        reconstructions, from both natural and anthropogenic sources
        (Figure 9.4 and Table 9.4), further strengthens the conclusion
        that the observed changes are very unusual relative to internal
        climate variability.""

        The emails show, if nothing else, that the confidence intervals should be much larger. The Trenberth quote about the inability of the models likewise indicates problems with the models. What the emails and Steve McI have shown is that this statement is suspect: ""Climate models only reproduce the observed 20th-century global mean surface warming when both anthropogenic and natural forcings are included (Figure 9.5)"" because these models depend on the palaeoclimatic record, as AR4 indicates under the subtitle ""Analyses of palaeoclimate data have increased confidence in the role of external influences on climate"" with the following text: ""Coupled climate models used to predict future climate have been used to understand past climatic conditions of the Last Glacial Maximum and the mid-Holocene. While many aspects
        of these past climates are still uncertain, key features have been
        reproduced by climate models using boundary conditions and
        radiative forcing for those periods. A substantial fraction of the
        reconstructed Northern Hemisphere inter-decadal temperature
        variability of the seven centuries prior to 1950 is very likely
        attributable to natural external forcing, and it is likely that
        anthropogenic forcing contributed to the early 20th-century
        warming evident in these records.""

        Perhaps a presentation of a more nuanced statement would make more sense.

      • I have criticized the AR4 detection and attribution statement at Climate Etc. in about 6 threads. The source of my criticism is published literature and reasoning, it has nothing to do with the CRU emails.

      • Wrong, I’m afraid – it is not demonstrably warmer than 1000 years ago

      • andrew adams

        They didn’t say it was “demonstrably warmer”. They said –

        “Palaeoclimatic evidence suggests that such a widespread warming has not been observed in the NH in at least the past 1.3 kyr (Osborn and Briffa, 2006).” (my italics)

      • asa –
        WHICH paleoclimatic evidence would that be? This perhaps?



  12. Rob B –
    The article suggests that IPCC Reports are written and controlled by a relatively small number of scientists.

    This has been common knowledge in some circles for a long time –




    • Jim – these are interesting links, but they don’t answer my questions. The AR editors seem to control the process. Who precisely is it that appoints the editors, and according to what criteria?

      • Rob –
        Those links were only in reply to the “relatively small number of scientists” thing. I could make a very good guess as to how the people are picked, but that wouldn’t answer your questions. But you might find the IPCC chapter of Christopher Horner’s book “Red Hot Lies” interesting. It would give you a clue about their operating processes.

  13. Dr. Curry writes: “I don’t really get what all the angst was about. Muller was asked to testify about the integrity of the climate data and the process by which these datasets were assembled.”

    From what I have read following the issue, the “angst” is likely based on the following:

    If Muller just testified about processes, there would have been no “angst.” But this is the second time he has announced conclusions, however preliminary, however based on incomplete analysis, using not only BEST’s data and processes, but Anthony Watts’ surfacestations.org data.

    Muller’s testimony included this: “Without the efforts of Anthony Watts and his team, we would have only a series of anecdotal images of poor temperature stations, and we would not be able to evaluate the integrity of the data.” Muller, using BEST as his platform, has essentially claimed to already have “evaluated the integrity” of not just the existing surface temperature records, but the existing analyses of that data, using the surfacestations.org data.

    As science, Muller’s pronouncements make no sense. As PR, they are quite clear and quite effective.

    I suspect that the angst, at its core, is based on mistrust of CAGW supporters based on past behavior. Just recently, James Delingpole was approached by the BBC to do a supposedly fair, objective interview about issues skeptics have with the climate “consensus scientists.” Delingpole wrote: “Nurse came to interview me at my home last summer, ostensibly – so his producer assured me – as a disinterested seeker-after-truth on a mission to discover why the public is losing its faith in scientists.”

    Delingpole proceeded to accept the assurances of good intention and the result was: “But as is clear from the Horizon documentary Nurse had already made up his mind. That’s why about the only section he used out of at least three hours’ worth of footage is the one where he tosses what he clearly imagines is the killer question: Suppose you were ill with cancer would you wish to be treated by “consensus” medicine or something from the quack fringe?”

    Delingpole, being taken aback by such a partisan sucker punch, fumbles his reply. And the CAGW blogosphere rejoices.

    Along comes Anthony Watts’ surfacestations.org endeavor. An undertaking to accumulate raw data, unprocessed by CAGW proponents, to achieve an unfiltered look at UHI and station siting issues. The CAGW proponents have been concerned about, and attempting to debunk, the surfacestations project almost from its inception. “Cherry picking confirmed” from Deltoid, September 2007.
    NOAA initially did an analysis based on all of 70 stations from the project, then The Journal of Geophysical Research – Atmospheres published a paper written in August 2009 that concluded that “In summary, we find no evidence that the CONUS average temperature trends are inflated due to poor station siting.”

    Watts has published an article with the Heartland Institute that found that “In other words, 9 of every 10 stations are likely reporting higher or rising temperatures because they are badly sited.” But as far as I know, he has not yet published an actual analysis of the data regarding “average temperature trends.”

    So Watts wants to create an openly available database of raw data, untouched by consensus hands. And following the mantra of the supremacy of peer review from the consensus, is waiting to publish the analysis of the project’s data in a peer reviewed journal. Shortly before he actually publishes, along comes a CAGW supporter, who has not heretofore been one of the more rabid agitators, Dr. Muller, and proposes a project to do an open, objective analysis of raw data. And BEST is born.

    Watts is himself not only allowed contact, but given an advance glimpse of some data and processes to be used. Seeing what he believes is someone with the same goal of an open, objective assessment of surface temperatures, he reciprocates. And very shortly thereafter come the pronouncements from Muller that UHI and station siting issues shown in the surfacestations.org data have no impact on the existing CAGW prepared global temperature analyses. Based on 2% of the data, before the full process has been completed. And one of the statements is before Congress no less.

    No one wants to be called a conspiracy theorist, so no one so far is really explaining what the “angst” is about. But the problem is that BEST, based on Muller’s statements to date, looks increasingly like just another CAGW public relations operation. There is at least then a chance that the whole purpose of creating BEST (a typical progressive acronym by the way) was to preempt whatever analysis of data is ultimately published by the surfacestations.org project.

    Anything published now can be characterized by the CAGWers as sour grapes. The CAGW blogosphere has already trumpeted Muller’s statements far and wide and will provide the leg work in using BEST to undermine whatever results Watts may publish, if they in fact differ from the consensus position on global average temperatures.

    Was Muller, like the BBC with Delingpole, intent on discrediting the surfacestations.org project from the start? Did he come around to the necessity to do so? Or has he just been sloppy and over eager in his pronouncements? The one explanation that won’t work is that Muller and BEST collected all the data, completed their analysis, then published their objective disagreements with Watts (if there are in fact any once the analyses are properly published). They have already made a preemptive release of their conclusions which makes that last alternative impossible.

    Personally, I hope Muller was just being sloppy and over eager to please his colleagues. But the question remains: was BEST just created as another iteration of the “climate rapid reaction team,” to do a preemptive “debunking” of the surfacestations.org project before the analysis is even published?

    • Muller is not spending all this time and effort with the idea that the end result will be that the analyses done by CRU, NOAA, GISS are “perfect” and are providing the best possible answer for global surface temperature anomalies. He thinks he has some better strategies for doing this, and is providing a more complete data set so others can test their strategies for doing this. Most importantly, he has some good ideas for characterizing the uncertainty in the dataset.

      The surfacestations.org project is only about the U.S. stations, so it is not of major relevance to the global temperature or even the global land temperature. That said, Muller has gone out of his way to compliment Watts for his efforts on this. Once the Fall, Watts et al. paper is published on the surfacestations.org classification, I’m sure other groups will analyze the data in different ways and it will be a few years before we have clarity on how to interpret what all this means.

      • I don’t think anyone believes Muller has said that “the analyses done by CRU, NOAA, GISS are ‘perfect.’” He has, however, said they are essentially correct. And said so before completing anywhere near a full analysis of the data.

        The appearance that BEST may be more concerned with PR than science is a product of Muller’s broad, premature statements confirming one of the key planks of the CAGW consensus. No matter what other interesting analyses, and descriptions of uncertainty may come, the key issue in the policy debate is the CAGW consensus. That is what is being trumpeted by the activists, and it was Muller who gave them the ammunition.

        If anything BEST publishes contradicts even the smallest part of the consensus, then it will show it has earned the credibility it was initially given. If Muller had waited and published properly, even an analysis supporting the existing temperature analyses would have been met with greater acceptance, if not necessarily agreement. But CAGW supporters who announce conclusions before doing the science just aren’t going to get the benefit of the doubt from skeptics.

      • I do think the surfacestations.org project should have global relevance with respect to showing the amount of systemic bias due to improper siting, etc. If it is shown that the bias is statistically significant in the US, I’d expect it to be as much or likely worse outside the US.

      • Lord Frijoles

        Others have been more detailed in their analysis of Muller’s words, so I’ll just attempt to paraphrase him:

        Muller to Congress: “well, don’t take my words too seriously because I could still be wrong, BUT, in case anybody is interested, our preliminary results are in accord with the CAGW consensus”.

        Professor Curry, I admire your work and efforts as much as the guy next door, but seriously, how can you defend Muller’s message that I just paraphrased? I mean, at the very least, his statements are indefensible from a scientific point of view. Moreover, that he didn’t pause to think about the importance his pronouncements would have (especially in a very highly politicised research area such as climate change) really makes us question his good judgement and/or the sincerity of his words when he said BEST would be all about science and not politics. Obviously, only he knows his true motivations, but that a man of his experience would be so careless and sloppy with his words, especially in a situation when he totally knew about all the publicity that would ensue ….. well ……

      • “The surfacestations.org project is only about the U.S. stations, so it is not of major relevance to the global temperature or even the global land temperature.”

        On the contrary, it is of supreme importance. Since the surface station survey is the only one of its kind, until a similar survey has been done on other surface stations around the world it becomes representative. If poor siting can influence the record, then we need to be assured that it has not influenced the rest of the record outside of the US.

        It shows that for the most fundamentally important issue of our time, the data from which our policy-informing conclusions are drawn are unreliable.

      • And what if poor siting hasn’t influenced the record? Because that’s exactly what the Watts-leaked abstract of Hall 2011 indicates: “the overall mean temperature trends are nearly identical across site classifications.”

        Which implies that for the most fundamentally important issue of our time, the data our conclusions are drawn from are, in fact, reliable.

  14. I suggest that those criticizing Muller ask themselves a question.

    If he had stated that the results are only preliminary, and should not be regarded as complete, but that they have found that their preliminary findings were in contrast to trends reported by other evaluators – how would they have reacted?

    • In law, a statement by a witness that is against his interests is given greater credibility. So a statement by Muller that BEST’s findings were in contrast to the previously reported trends would have been seen as premature, but would not have been as doubted by skeptics.

      Now Joshua, answer the reverse question. If Watts came out after an analysis of 2% of the surfacestations.org data, and claimed there was a contrast with the reported trends, what would be your reaction?

      • So a statement by Muller that BEST’s findings were in contrast to the previously reported trends would have been seen as premature, but would not have been as doubted by skeptics.

        Nor would have “skeptics/deniers” been positively “outraged” by his testimony, called him or implied that he is “agenda-driven,” “clueless,” a “con man,” “foolish,” “hiding behind Congress’ skirts,” making “dumb mistakes,” offering “colossally stupid” testimony, etc.

        As to my reaction had his testimony gone in the other direction: I would have viewed it as an indication of yet another problematic feature of the debate (just as I view the testimony he actually gave). Admittedly, I would have been concerned about how a different testimony would feed the “tribalism” on the “denier/skeptic” side of the debate, but it is not at all lost on me that Muller has also been attacked from the “warmist/believer” side of the debate.

        Ultimately, in terms of the debate itself, it seems to me that BEST’s results – given the massive dimensions of all the questions involved – are interesting and important to some degree but certainly not decisive or anywhere close to that; but the amount of weight BEST has been given – as evidenced by the level of feuding between the tribes regarding their work – just goes to show how tribalism on both sides obscures the merits of the arguments being made (at least for this non-expert).

        The jumping back and forth on perspectives about BEST (on both sides) reminds me of how the Supreme Court (both sides) jumps back and forth on their view of Constitutionality depending on the political implications of the has they are hearing.

      • Obviously, that should be…. “depending on the case they are hearing.”

      • Joshua,

        You answered a question I did not ask. I did not ask how you would react if Muller’s testimony “had gone the other direction.” I asked “If Watts came out after an analysis of 2% of the surfacestations.org data, and claimed there was a contrast with the reported trends, what would be your reaction?”

        A proponent announcing that his position is incorrect, based on a preliminary review of 2% of the data, might be at least interesting, but not very valuable. A proponent announcing that his original position was CORRECT based on that same flimsy analysis is worthless. No matter what side of the controversy you are on.

        It does suggest an over eagerness of the speaker to find confirmation in his data.

        So, how would you react to a similar self serving pronouncement from Watts on the same issue, based on similar skimpy work?

      • Sorry for answering the question I had wished you asked rather than the one you asked.

        I think that Muller’s caveats suffice to contextualize his findings – so honestly, if Watts had done something similar I would have considered it to be an announcement of preliminary findings, nothing more, nothing less.

        I see contrasting arguments from both sides about Muller being a “proponent” for the other side. In fact, Muller suggested that his preliminary findings were in contrast to what he expected to find. I see no solid reason not to take him at his word.

      • Muller also made his statements while apparently knowing that Watts has a paper in progress that states different conclusions. So how would you feel/react if you were in Watts’ position?

      • Joshua,

        If true, that’s just a rhetorical device.

        “I conducted the research expecting to find my own original beliefs were wrong. I was shocked, shocked to find out how right I have been all along.”

      • Joshua, perhaps you should consider the merits of the arguments rather than just who is making them. Your focus on who is making the arguments is in fact a form of ad hominem fallacy. The skeptics are outraged because Muller has said they are wrong without offering any basis. What the skeptical reaction would be if he said something different is irrelevant to the merits of the skeptical outrage.

        Instead of commenting on the debate from outside why don’t you try joining in.

      • David –

        “The skeptics are outraged because Muller has said they are wrong without offering any basis. ”

        I guess I agree with the first part of that statement – although I would word it differently: he said that the preliminary results showed that the effects of the phenomena he was examining, on the data trends as previously reported, did not reach statistical significance.

        As to the second part I would disagree: the basis he offered was that it was preliminary results of an extensive analysis, utilizing a methodology he shared with notable “deniers/skeptics” to some degree (and which met their stated approval before any results were reported).

        My specific point of interest is how “who is making them” affects the logic and consistency of the arguments. As someone who lacks the technical basis for evaluating the complicated science, it is difficult for me to use any other means for evaluating the debates and how my own biases interfere with my reasoning. Even lacking technical expertise, I can at least gain some measure in the debate by analyzing the clarity, logical constructions, and consistency of the arguments being offered.

      • Brandon Shollenberger

        I think “without offering any basis” was meant in the sense there is nothing these people can look at to verify Muller’s remarks. All they have to go on is his word. Given the history involved, that is hard for a lot of people to accept. How many times have skeptics been told something in the past with only someone’s word to support it only to find out it was untrue? Sometimes it has been completely false, other times misleading, other times unacceptably inaccurate…

        Maybe Muller is right. Maybe his representation of things is perfectly accurate, and it is based on solid work. Maybe not. But for how many weeks are skeptics going to be told, “Your concerns are all wrong” without being provided a shred of reason to believe it?

        If you tell people they are wrong for weeks without providing them any way of verifying your results, it’s understandable for them to be annoyed.

      • I think I touched on your points in my response to David.

        I see some merit in what you are saying, but I don’t think it reaches the bar of undermining Muller’s, or BEST’s, “integrity.”

        As I see it, when the reaction outsizes the “infraction,” as it were, it suggests that bias is coming into play. That is my response to the level of “outrage,” and I think that examining who is outraged – i.e., whose ox is being gored – becomes relevant.

      • Brandon Shollenberger

        People overreact to things, so it doesn’t surprise me people might overreact to something like this. They have a valid reason to be annoyed, but they’re letting feelings from other things come out in response to Muller. That’s unfair to Muller, but it isn’t surprising or particularly horrible.

        As an aside, in the same way the response does not fit the crime, it is important not to go a bridge too far with claims of bias. There is obviously some bias, but it’s hard to say how much.

        I would like to add one criticism of Muller. Announcing general results “without basis” like Muller did doesn’t bother me much. However, I think announcing specific results is inappropriate. Saying BEST’s results generally agree with other temperature records makes some sense to me. Specifying a level of influence from something like UHI or microclimate issues to three significant digits does not. No amount of caveats will make up for the impression of precision given by doing something like that.

      • Brandon Shollenberger

        Oh, I forgot to mention one other thing. If you are going to announce results in advance of publishing something, you should always be sure to announce when further material will be available. Telling people what needs to happen before you release that material isn’t enough on its own. They need to be able to know what to expect from you.

        Here’s a question for you. If Muller keeps saying things like he has said for the next six months, and in that time nothing is made available for people to examine, will you then criticize him?

      • “However, I think announcing specific results is inappropriate. “

        I agree with this point, and it is a fair criticism.

        If Muller keeps saying things like he has said for the next six months, and in that time nothing is made available for people to examine, will you then criticize him?

        Clearly, there is some point at which continuously announcing preliminary analysis without publishing or announcing final conclusions would become indefensible. I don’t know if six months could be considered as a cut off point as a generic rule – but given that in this case they have indicated that they are finalizing publications and that they will make their methodology and data fully transparent, six months would seem too long to keep people waiting and still maintain credibility.

      • Brandon Shollenberger

        Personally, I think at the point you are testifying in front of Congress about something you ought to be able to give a clear indication of when the basis for your work will be available for verification. I assume Muller can’t do so, as otherwise he would have. That bothers me.

      • Also, David – I honestly think that I do try to evaluate the merits of the arguments in addition to examining their relationship to the tribal orientation of the people making them.

    • Do you believe people who predict the final score of a game after the first two minutes of play?

    • That’s a nonsense question because no credible scientist on either side of this debate or any other would make that statement based on 2% of the data and incomplete processing results.

      • There may be reasons to criticize Muller’s announcements, but I’m surprised that the 2% figure continues to be an object of criticism. If you want to draw conclusions about a population from a sample, you want the sample data to be accurate, the sample to be representative, and the sample size to be adequate in absolute terms. If those criteria are met, what difference does the percentage make, regardless of whether it is 2%, 20%, or 0.02% of the population? Even a 0.02% figure could provide a highly accurate estimate of the entire population if the latter is very large in size.

      • One dollar “could” buy you a winning lottery ticket too. The word “could” is not a very helpful word if one purports to be discussing science, let alone making the kind of dispositive statements Muller made. Did Muller say the 2% was picked to be a “representative” and “adequate” sample? I didn’t see that in either of his pronouncements.

        I guess there is no use running the other 98% then. In a world of limited resources, running all the data seems a waste where 2% is adequate for final conclusions. In fact, why even collect the other 98%? Who knew we had too much data? We’re done here.

      • In your eagerness to indulge in ridicule, you missed the point, Gary, but I expect that others probably understood that I was pointing out that the 2% figure in its own right is not a basis for claiming that the results are unreliable, even if criticisms of other aspects of Muller’s claims might be warranted (or might not be). If you consult a standard statistics source, I believe you’ll find that what I stated was correct, and that the lottery ticket analogy was a false one.

        This is why citing the 2% figure as though that fact impugns the conclusions will be seen as signifying a misunderstanding of statistical sampling by those who cite it. Instead of resisting that principle, it might be a good idea to acknowledge it and move on to other aspects of Muller’s statements.

        If you are looking for a better analogy, it is the ability of sophisticated demographic models to predict an election outcome on election night when only a few percent of returns are in. Based on experience, those models have become quite accurate, even though they face unusual challenges not found with random sampling due to the severe inhomogeneity of precinct reporting that requires major adjustments to achieve a representative estimate (that, and not percentage values, is what limits them). They still require all the votes to be counted, but that does not make them inaccurate.

      • I have to agree with Fred Moolten. The size of the sample ceases to be of much interest when the population is very large. Properly constructed random samples of the US and Australian electorates aim for 2000 respondents, a number that allows some internal analysis. Half that number will do if all you are interested in is predicting the electoral outcome nationally, among men and among women, and in a few regions only. The American electorate is about 15 times larger than the Australian. For Australia, a sample of 2000 in an electorate of some 15 million represents a very tiny proportion indeed, and the US equivalent is much tinier still.

        Everything depends on the quality of sample, not (after a point) on its size.
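        This principle is easy to check with a quick simulation (a toy sketch, not anything from BEST or the polling literature; all numbers here are illustrative): drawing the same absolute number of values from well-mixed populations of very different sizes yields essentially the same sampling error, because the error of a sample mean scales with 1/sqrt(n), not with the fraction n/N.

```python
import random
import statistics

random.seed(42)

def sample_mean_error(pop_size, sample_size, trials=2000):
    """Average absolute error of a random sample's mean vs. the true population mean."""
    # Hypothetical "station" values: normally distributed around 15 with sd 5.
    population = [random.gauss(15.0, 5.0) for _ in range(pop_size)]
    true_mean = statistics.mean(population)
    errors = []
    for _ in range(trials):
        sample = random.sample(population, sample_size)
        errors.append(abs(statistics.mean(sample) - true_mean))
    return statistics.mean(errors)

# Same absolute sample size (n = 400) from two populations of very different
# size: the sampling fraction is 4% in one case and 0.4% in the other, yet
# the average error is essentially identical.
err_small_pop = sample_mean_error(10_000, 400)   # 4% of the population
err_large_pop = sample_mean_error(100_000, 400)  # 0.4% of the population
print(err_small_pop, err_large_pop)
```

        Of course, this only holds when the sample is genuinely random and representative; a biased or clustered 2% behaves very differently, which is the substantive question about BEST's subset.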

      • Don,

        In polling, that kind of statistical sampling also depends on more than just the percentage. Those samples are controlled for region, income, party affiliation and many other variables. The smaller the sample, the less likelihood of even being able to control for those factors.

        “Everything depends on the quality of sample, not (after a point) on its size.”
        Agreed. So where did Muller explain the quality of his sample as representative of the entire U.S.? He may have done so, but I haven’t seen it. That was why I asked in my original comment.

        The original BEST release said “the preliminary analysis includes only a very small subset (2%) of randomly chosen data.” “Randomly chosen” does not seem to equate with representative.

      • Don – and Fred –
        The problem here is that you’ve both assumed a “representative sample” while, IIRC, the 2% is representative of only a small part of the world. I believe the word was – Japan. I could be wrong, but that’s what I remember reading. Steve Mosher may know.

        In any case, 2% is an inadequate sample. This is not a production run that we’re sampling where we’re looking for minor variations from a fixed configuration/performance and can reasonably apply statistical sampling. What’s needed in this case is not 100%, but 200% – or more. Not that we have that, but “weather” – and therefore “climate” are regional phenomena. And since we lack sufficient data for many areas of the planet, 200% of what we have would still be insufficient data for “certainty” on either a regional or global basis.

        Fred – your “voting” analogy is no more valid than Gary’s lottery ticket because you know where those precincts are – but we don’t know the temp record for much of the planet for the last 150 years. Even for the US, with all of our data records, much of that data has an accuracy (or lack thereof) that precludes the accuracy that’s been touted by the present data sets.

      • Don – I had considered using pre-election polling as an example, like you did. The reason I thought that election night predictions might be even more directly relevant is that pre-election polls predict future voting behavior based on current opinions, while election night predictions try to estimate total voting behavior based on the actual voting behavior of a much smaller sample. In either case, though, the principle holds that while many aspects of the sampling are critical, their percentage of the total population, when the total population is very large, is irrelevant. The numerical size of the sample makes a difference, but not its percentage of the whole.

        It seems to me that Gary and Jim Owen would be wiser to focus on something other than that percentage figure if they want to raise questions about Muller’s conclusions. I hope it’s also clear from all my statements above that I was neither defending nor criticizing Muller, but I was criticizing the misuse of percentages as criteria for judging the use of samples to estimate the values found in a population from which the samples are drawn.
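        Fred’s percentage-versus-absolute-size point can be checked with a quick simulation (a purely illustrative sketch, not anything drawn from BEST’s actual method): draw repeated samples of the same absolute size n from two populations of very different sizes and compare how much the sample means scatter.

```python
import random
import statistics

random.seed(0)

def spread_of_sample_mean(pop_size, n, trials=1000):
    """Std. dev. of the sample mean over repeated random draws of size n."""
    population = [random.gauss(15.0, 5.0) for _ in range(pop_size)]
    means = [statistics.mean(random.sample(population, n))
             for _ in range(trials)]
    return statistics.stdev(means)

# Same absolute n = 200 drawn from two very different population sizes:
small_pop = spread_of_sample_mean(pop_size=10_000, n=200)   # n is 2% of population
large_pop = spread_of_sample_mean(pop_size=500_000, n=200)  # n is 0.04%

# Both spreads come out close to sigma / sqrt(n) = 5 / sqrt(200) ≈ 0.35:
# the percentage of the population sampled barely matters.
```

        For a simple random sample the standard error is σ/√n times the finite-population correction √((N−n)/(N−1)), which is about 0.99 even at the 2% level – which is why the percentage figure is the wrong thing to argue about. Whether the sample is representative, as Jim Owen raises, is a separate question the simulation does not address.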

      • Fred –
        It seems to me that Gary and Jim Owen would be wiser to focus on something other than that percentage figure if they want to raise questions about Muller’s conclusions.

        In this case the percentage is meaningless except as a measure of the data coverage of the planet. And if Muller uses 100%, it will still be inadequate to produce the product everyone’s looking for to the accuracy that’s being expected. Just as all previous data sets fail, in reality, to be as accurate as has been claimed.

        Keep in mind that the data coverage is not uniform – nor is most of it continuous – nor is it sufficiently accurate to produce a final output to the desired accuracy. But it’s what we’ve got, so it’s what will be used.

        Nor is what I’ve said here criticism of Muller or BEST. It’s simple fact.

      • Don – and Fred –
        The problem here is that you’ve both assumed a “representative sample” while, IIRC, the 2% is representative of only a small part of the world. I believe the word was – Japan. I could be wrong, but that’s what I remember reading.

        Yes, you’re wrong:

        ERRATA: I made a mistake regarding the 2% figure, I misheard what was being presented during my visit with the BEST team at Berkeley. As many of you may know I’m about 80% hearing impaired and the presentation made to me was entirely verbal with some printed graphs. Based on the confidentiality I agreed to, I did not get to come back with any of those graphs, notes, or data so I had to rely on what I heard. I simply misheard and thought the 2% were the Japan station analysis graphs that they showed me.

        I was in touch with Dr. Richard Muller on 3/28/2011 who graciously pointed out my misinterpretation. I regret the error, and thus issue this correction about the 2% figure being truly a random sample, and not just stations in the Japan test presentation shown to me.


      • I got your point, it wasn’t exactly complicated. You wrote “If you want to draw conclusions about a population from a sample, you want the sample data to be accurate, the sample to be representative, and the sample size to be adequate in absolute terms. If those criteria are met….”

        So, were those criteria met in Muller’s statements? If they were, please point out where. If they weren’t, then you are just engaging in reflexive defense of all things CAGW.

        My comment was not a “misunderstanding of statistical sampling.” There is absolutely nothing in Muller’s statements that suggest the 2% was chosen to be a representative sample.

        I’ll repeat, “could” (and your 2 “ifs”) just doesn’t say much, and says nothing scientific in this context. The sample “could” be accurate, it could be representative, it could be of a sufficient size, all depending on many factors. It would be a question of statistical significance if Muller claimed the 2% WAS all of those things. But Muller didn’t say that, nor did you. So your defense of him was substance free. That was why I made fun of it.

      • I was simply objecting to the notion that a 2% sample is inadequate of itself because of its size. Of course, as I said, a properly constructed sample has to take account of the nature and distribution of the population it is to be drawn from. But if that is the case, then size is, after a point, not a problem. I don’t know the nature of the Muller 2%, nor do I know what the population is like in the GISS or CRU homes. Can’t comment there.

      • Don,

        My 9:55 comment above yours was a response to Fred Moolten. Hard to tell sometimes with these threads.

        My comment to Fred was based on the statement from BEST that the 2% were randomly selected, not apparently selected based on their being representative, etc.

        My focus was not on the 2%, Fred’s was. His comment was that 2% “could” be a sufficiently large sample. And indeed it might, but there is no evidence I am aware of yet in this case that any attempt was made to choose a “representative” sample. And despite several requests, no evidence to the contrary has been forthcoming from the defenders of Muller’s statements.

        The surfacestations.org project as I understand it is looking at both poor siting, and UHI. What percentage of the 2% makes up each section? Are those 2% geographically representative? Are they representative of the various siting issues, altitude, proximity to heat sources etc.? Are they divided between rural and urban?

        In other words, a reasonable defense “could” be made of Muller’s pronouncements (if one had any evidence on those issues), but Fred did not make it.

      • Sample size isn’t really the problem. The issue is that BEST *hasn’t completed* their methodology on the sample, including, apparently, not adjusting for poor station data quality. Making *any* statement regarding results without having applied their methodology fully raises the specter of prejudice. I don’t understand how this is excusable.

        According to Watts’ surfacestations.org project, 85% of the stations could be off by 2-deg C. Making pronouncements (even preliminary ones, with caveats) before completing their methodology, regardless of sample size, seems reckless and to me implies that he is not “blind” to the results…

        As one of the ‘unconvinced’, it seems to me that we’ve been down this road before. BEST is supposed to be taking the high road.

      • Rattus Norvegicus

        As far as I can see, he presented his results as preliminary and said that complete application of the methods of their analysis might turn agreement into disagreement.

        I await the final results, but don’t think that they will be much different.

    • steven mosher

      You should also note that Watts’ abstract CONFIRMS Muller’s preliminary finding:

      station quality does not impact the MEAN temperature recorded.
      (it does hit DTR, which is a cool thing folks will have to look at)

      People miss that the mean temp is not affected. This comes as no surprise to those of us who:

      1. pointed surface stations at the CRN rating system document.
      2. conversed with the scientists who actually developed the rating system.
      3. did some preliminary work on the topic.

      The only field test of the rating system indicated a mean bias for the worst stations of .1C. A commenter named Sod and I have repeatedly pointed this out since 2007. The bias will be small and hard to find. JohnV and I couldn’t find it, Menne couldn’t find it, Muller (prelim) couldn’t find it, Watts (abstract) couldn’t find it.

      There are a couple more tests to do. Those tests depend upon metadata that has just recently been provided. We should not expect any huge changes; you MIGHT see an effect as large as .1C, but the planet is warming .8C/century over land, give or take ~.15C. And much of this warming is attributable to increased GHGs, or leprechauns. One can never rule them out.


  15. OK. Here is a model study by Burkhardt and Kärcher that estimates that jet contrails contribute:

    Globally, the long-wave radiative forcing due to contrail cirrus (after correcting the scattering component of the long-wave forcing from the model) amounts to 47.1 mW m−2 and short-wave radiative forcing to −9.6 mW m−2, resulting in a net radiative forcing of 37.5 mW m−2.

    This is about one-hundredth of the radiative forcing from 2xCO2 as estimated by IPCC (Myhre et al.).

    Without feedbacks, 2xCO2 results in a theoretical warming of just under 1C.

    So contrails would contribute 0.01C warming?
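    The arithmetic can be written out explicitly (my own back-of-envelope check, not a calculation from the paper; 3.71 W/m² is the commonly cited Myhre et al. 2xCO2 forcing):

```python
# Net contrail-cirrus forcing from the quoted Burkhardt & Kärcher numbers:
F_contrail = 47.1e-3 + (-9.6e-3)   # W/m^2: long-wave plus (negative) short-wave
# -> 0.0375 W/m^2, i.e. the 37.5 mW/m^2 quoted above

F_2xCO2 = 3.71                     # W/m^2, often-cited 2xCO2 forcing (Myhre et al.)
ratio = F_contrail / F_2xCO2       # ≈ 0.0101, about one-hundredth

dT_2xCO2 = 1.0                     # deg C, the "just under 1C" no-feedback figure
dT_contrail = ratio * dT_2xCO2     # ≈ 0.01 deg C
```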

    The study concludes:

    Clouds are influenced by small-scale processes that cannot be resolved by a large-scale climate model and which therefore need to be parametrized. The representation of clouds is a major source of uncertainty in climate simulations. The same problems also affect the representation of contrail cirrus.

    So the authors are basically confirming that we do not know if clouds exert a net positive or negative forcing. They point out that we also do not know whether contrails exert a net positive or negative forcing but, in either case, it is insignificant.

    The conclusion could well have been (to put it into IPCC parlance):

    Radiative forcing from contrails could be slightly positive or slightly negative, but is virtually certain to make no perceptible contribution to climate change.


  16. J.C. wrote: “For the record, I’ve stated a number of times that I think the climategate emails reflect poorly on the scientists and on the science. The emails do not demonstrate prima facie that any of the conclusions in the IPCC are incorrect.”

    I wish I could remember the exact question, but I’m pretty sure the gist of it was, “do the emails in any way weaken the AGW case?” If you’d given the answer you wrote above, you’d have left a very different impression than the one you created with your simple and definitive “no”.

    It seemed an important moment to me, maybe even pivotal for you personally because in that moment you could have defined your position in this debate a little more clearly for the record. A more nuanced answer along the lines of the one above would certainly have gotten noticed. Since you’ve made your position clear about this several times, I wonder why you seemed to step back just a bit at that key juncture.

    • I don’t recall answering “no” to such a question. If I did say something like this (and I don’t think I did), it must have been in 2009, before I had dug deeply. The position that I took in the wake of climategate was regarding integrity, transparency and uncertainty; that was the gist of my public statements immediately following climategate and remains what I continue to say.

      • Rattus Norvegicus

        On integrity I don’t think the emails show anything out of band. I’ve engaged in many engineering debates (I worked for a long time as a software engineer/hardware architect for a major computer manufacturer) in which all participants staked out strong positions. Arguments took place in email and in f2f meetings and eventually a consensus emerged about how to proceed.

        Reactions to FOI requests are more problematic. It is important to note that the UK does not have the history of openness that the US has. The UK FOI laws only went into effect in the early 2000s; the US FOI laws went into effect in the mid-1960s. Forty years of experience and case law vs. 4 years is a big difference. Still, invoking the IP restrictions initially would have been the best course.

        Everything else about Climategate is just noise.

      • John Carpenter


        Maybe we are not reading the same climategate emails, but the integrity of both the scientists and the science of paleoclimate reconstructions is very much in question. It is very clear they were gaming the system to gain the upper hand for the AR4 report in response to the MM 2005 paper further critiquing the hockey stick graph… in addition to the FOIA issues. Perhaps you don’t think they did anything wrong, but very clearly they were trying to create a narrative depicting a high degree of certainty where there was none.

      • Rattus N: I don’t understand how people just shrug off the Climategate emails.

        These scientists weren’t merely “staking out strong positions.” They were openly discussing deleting data, deleting emails, rigging peer review, refusing FOI requests, and more.

        I’ve worked at companies where dodgy stuff went on. I once had a manager tell me, after I expressed concern about the equipment I was demonstrating to a government vendor, “Relax. We don’t pay you enough for you to lie.”

        So yeah I know dodgy stuff goes on in the real world. I also know that when you see some, there is usually more.

        My question for climate scientists and their supporters is “Why should I trust you?” I haven’t gotten a good answer yet.

      • I can easily see how some would view the climategate emails as being of trifling significance when the world of academia sometimes conceals extraordinary levels of dishonesty (as do so many other spheres of human endeavour including my own profession). Judged by this metric, the email behaviour comes across as relatively benign. The problem arises when these emails are exposed to a naive public who expect “better.”

        At any rate, the numerous enquiries purporting to clear the UEA and associated researchers come as no surprise given that such behaviour is so “normative.”

        I find myself wondering if Dr Curry had perhaps never paid that much attention to the politics of her field until the emails. Some of us can be lucky and conduct our affairs free of professional “politics” until something comes along and opens our eyes.

        However, we do need to ask ourselves whether the attitudes and behaviours exposed by the emails are the kind of thing we really want to see in people who play key roles in influencing politicians and the public.

        After all, Richard Nixon never did do anything particularly unusual by the standards of the politics of the time other than lying about it in a rather clumsy way and getting caught. But is this what we really want to accept as normative for politicians?

        So too with science.

  17. To me, a serious study of what we know of the temperature data of the 20th century, and how we know it, and how large the error bars are, is simply essential. Without decent data here, we can’t really try to separate AGW from natural variability.

    I think such a study has to go past whether CRU, GISS etc have got it about right. It has to look at whether or not we know anything at all. I’ve asked a most knowledgeable meteorologist about the likely error in LIG (liquid in glass) thermometers, and he put it at 0.2 degrees C. Since the 1980s LIG instruments have been replaced by MMTS, and the error is much smaller. But what are the errors in measuring temperatures over the seas? How do we relate those errors to those associated with missing data over land (and over seas)? And so on.

    Again and again I wonder at the apparent precision of moves in the global temperature anomaly, and think of the truly awful state of the data that now seems to have an accuracy of three decimal places.

    Do we know anything at all about what happened in the 20th century to ‘global temperature’ — always assuming that these averages of averages of averages make any real sense to anyone?

    • Don

      I have been making similar points quite a lot. In the end, all this debate about the land temperature is of limited interest. We have satellite data now. What we need to do is develop a deeper understanding of the climate system and try to work out whether CO2 is part of the signal or part of the noise.

    • Don, I had letters in The Australian in 2007 suggesting that the most urgent need was to get better data and scientific understanding than was available before making significant policy decisions. Haven’t seen much advance since then.

    • Clive

      Yes. It is definitely a “must see”, for people in both camps.

      Rational skeptics will enjoy Courtillot’s straightforward style, simple deconstruction of the consensus paradigm and interesting hypotheses regarding possible solar mechanisms.

      Open-minded consensus supporters (the orthodoxy) will be able to get a glimpse of the black swan from outside the box that might become a threat to today’s mainstream paradigm.

      In the first part of his presentation he very effectively gets across the point that Judith Curry has made about the enormously understated uncertainty (“acknowledging what we don’t know”) inherent to the AGW hypothesis.

      He got a round of applause with his statement that there is too much emphasis in climate science on the numerical aspects (i.e. modeling) and absolutely not enough on observations plus his suggestion that observation is the key thing that should be supported in the coming decade.

      His comparison of 20th century North American and European temperature records was also a real eye-opener (warming trends are totally different from each other and from the trend of the globally and annually averaged land and sea temperature construct used in climate science).

      He points to the multi-decadal oscillations in the global record which bear no resemblance to CO2 concentrations. And (oh horror!) he states that it has not warmed for the past 12 years. He has concluded that it is likely that the current lack of warming or slight cooling will continue for the next few decades.

      He then states flatly that the “Mann curve does not hold anymore” (a statement with which almost everyone, except a few die-hards and Mann himself, would agree today), that the MWP and LIA plus the earlier Roman Optimum were real, and that neither the 20th century warming itself nor the rate of warming is without precedent.

      He then goes into three complementary hypotheses of solar effects on climate (which I won’t go into), referring to the preceding presentation by Nir Shaviv.

      He criticizes the climate models as being non-falsifiable (and thus not scientific), stating that if one says there is something wrong “they simply twitter a parameter and it’s OK”, without addressing the real cause.

      He closes saying that there is not much question that it has warmed over the 20th century with regional irregularity, but it is totally uncertain what the cause for this has been.

      He believes that with 5 to 10 years of new data we will be able to see whether or not the IPCC models can be discarded.

      The presentation is definitely worth hearing and seeing.


      • Excellent synopsis of what – for me – was one of the top presentations on climate change, if not one of the best.

        Just two footnotes:
        * He observed that the work was done by himself and established colleagues, not by students because such work would be detrimental to their further careers.
        That is a terrible state of affairs!

        * He mentioned that he and his colleagues have run into (the usual!) difficulties in regard to publishing this paper.
        Pal review at work again …

    • The Courtillot video is indeed worthwhile. As are Manacker’s and Viv E’s comments.

      The video is not a bust to orthodox AGW, but it is a red flag that the uncertainties are larger, the data iffier, the process dodgier, and the possibilities more expansive.

      Dr. Courtillot comes across as smart, learned, and even-handed. With scientists like him on hand, how is that we have such a rush to judgment on AGW?

  18. J.C. wrote “I don’t recall answering “no” to such a question.”

    Dr. C.,

    I think this was at a somewhat recent event (within the last 6 months) in which experts on both sides of the question (were there “teams?”) made statements and then fielded questions. The fellow who asked you this question was a congressman who I believe also happened to have a Ph.D. I’m certain you must recall what this event was.

    Perhaps it was after your testimony to the subcommittee on energy and environment?

    Of course I could be wrong. I don’t believe so however.

    • Whether JC said no or not, her answer seems consistent. Climategate didn’t really hurt anything that wasn’t already hurt. It explained quite a few things. I also think it has changed the way certain players are looked upon by their peers. Just a few more overly creative statistical approaches may just openly change more peer attitudes. At least it should.

    • Don’t recall this at all.

  19. Dr. Curry

    Thank you for the links and commentary. Drilling down through the first, admittedly slightly lightweight reference, one finds this little gem on CO2E labelling food for thought: http://www.nature.com/nclimate/journal/v1/n1/full/nclimate1071.html

    The AGW Observer collection is a truly chilling insight into the endless inventiveness of the human mind when presented with idle hands.

    The agriculture link is also food for thought, and would be more so if it could be brought into the body of climate knowledge more formally and better developed.

    The law opinion link from Columbia will broaden the perspective of many readers, and certainly — agree or violently disagree — lends some expertise to the topic.

    At least we can count on, with weeks like this, seldom being bored for want of reading.

  20. [I posted this above as a comment to a blogger named pokerguy, but am posting it again here so others can see it and hopefully comment on it.]


    I think your “conversion” from “dangerous AGW believer” to “skeptic” is representative of many others.

    I had a similar experience.

    A big factor that caused me to shift to skepticism was the enormous understating of uncertainty and the blatant overconfidence of IPCC in the validity of its findings.

    As Judith Curry wrote on another thread about IPCC:

    Overconfidence comes across as selling snake oil.

    And that is precisely what turned me off.

    Then, as I started checking out the IPCC claims in detail, I saw that many of them were exaggerated and others were simply full of holes.

    I saw that
    · claims of unusual 20th century warming for 1300 years were based on faulty hockey sticks (and “spaghetti” copies)
    · claims of accelerated sea level rise were based on bad science (switching methods and scope of measurement)
    · claims of accelerated warming were based on clever comparison of shorter and longer time periods
    · claims of insignificant natural climate forcing were based on measured direct solar irradiance alone, ignoring all other factors
    · claims of mass loss of the Antarctic Ice Sheet contradicted studies that showed just the opposite
    · claims of mass loss of the Greenland Ice Sheet contradicted studies that showed just the opposite
    · claims of decreased northern hemisphere snow cover were not supported by the observations
    · claims that Antarctic sea ice extent showed no trends were falsified by the data, which showed net growth
    · claims that the UHI had a negligible effect on temperature were poorly substantiated and falsified by many studies
    · claims that the satellite record shows faster warming than the surface (validating the GH hypothesis) were false
    · model assumptions that relative humidity remains constant with warming have been falsified by physical observations, which showed a much lower increase in water vapor content
    · model assumptions that clouds exert a strongly positive feedback with warming ignored physical observations, which showed a strongly negative feedback instead
    · claims that tropical cyclone activity increased in the latter part of the 20th century have been shown to be false
    · claims that severe weather events (floods, droughts, heat waves, etc.) increased in the latter part of the 20th century due in part to human activity were not substantiated with facts but simply “expert opinion”
    · projections of future CO2 levels in the two extreme “scenarios” (those that project alarming warming) were based on levels that exceed the carbon contained in our planet’s total optimistically estimated fossil fuel reserves
    · projections of future CO2 levels in the middle “scenarios” were based on a doubling of the past CAGR of CO2 despite a drastic projected slowdown in population growth
    · projections of future temperatures (to 2100) were based on the exaggerated assumptions on future CO2 levels (see above) – if we look at Figure SPM.5. and eliminate the top three scenarios, we are left with no alarming warming to year 2100 (even with the exaggerated IPCC assumptions on climate sensitivity)
    · sea level projections were based on the false conclusion that rates have accelerated in the last part of the 20th century and on the exaggerated temperature projections based on the exaggerated assumptions on CO2 (see above)
    · future projections of extreme weather events were based on the incorrect claims of events in the late 20th century (see above), with stated >50% likelihoods of past events extrapolated to >90% of future events

    What was also annoying in the report (back to Judith’s observation on “snake oil”) were the repeated suggestions of greater confidence in the conclusions:

    “new and more comprehensive data, more sophisticated analyses of data, improvements in understanding of processes and their simulation in models and more extensive exploration of uncertainty ranges”; “the understanding of anthropogenic warming and cooling influences on climate has improved since the TAR, leading to very high confidence [>90%] that the global average net effect of human activities since 1750 has been one of warming”; “warming of the climate is unequivocal”, “most of the observed increase in global average temperature since the mid-20th century is very likely [>90%] due to the observed increase in anthropogenic greenhouse gas concentrations. This is an advance since the TAR’s conclusion that ‘most of the observed warming over the last 50 years is likely [>66%] to have been due to the increase in greenhouse gas concentrations’. Discernable human influences now extend to other aspects of climate, including ocean warming, continental-average temperatures, temperature extremes and wind patterns”; “There is now higher confidence in projected patterns of warming and other regional-scale features, including changes in wind patterns, precipitation and some aspects of extremes and of ice”.

    This does, indeed “come across as selling snake oil”.


    PS For a good summary of IPCC exaggerations, distortions, fabrications, flawed assumptions and conclusions, etc. see:

    • Overconfidence comes across as selling snake oil.

      I won’t clutter the thread with the youtube link, but that always reminds me of the Simpsons Monorail scene.

    • John Carpenter


      A lot has been talked about in this thread about the (disappointing for some) testimony of Dr. Muller during the recent congressional hearing on the processes used to create science and policy. I am not going to offer an opinion on that. IMO, the far more important testimony given at the hearing was from Dr. Christy.


      His testimony points out the biggest problems with the current process of disseminating the current state of climate science to policymakers via the IPCC. I know you are well aware of these issues. I point this out as an additional source. His testimony is very damning of the current process. The process is in dire need of “adult supervision” and I agree with his position.

      A lot of snake oil is being sold and the current process allows it. The question is how can this be changed? Congresswoman Johnson queries “where are all the dissenting scientists?” I ask the same question. Why don’t more of the known, influential climate scientists who feel the process is broken speak up, and volunteer to speak at these hearings as well?

      One way to change the process is to have more testimony like that of Dr. Christy… questioning the methodology and process employed by the IPCC to advance the science to policymakers…. getting on the record. IMO, this is where real change can take place.

      • John

        I agree with you that it is the IPCC process which has become corrupt, as John Christy testified.

        As a result, IPCC reports contain distortions, exaggerations and understated uncertainties and simply ignore or omit any data, which do not support the desired message, as pointed out in my post and the cited summary report.

        Christy’s suggestion for correcting the process by putting it under “adult supervision” by independent outsiders makes sense; I would add statisticians to the auditors he has proposed.

        To make this work, however, it is of crucial importance that the “adult supervisor” team include members who are neutral and others who are even skeptical of the “dangerous AGW” message. Christy, himself, or Roy Spencer would be a good candidate, as would Judith Curry; Steve McIntyre or Ross McKitrick plus several other names also come to mind.

        Will it happen?

        Not as long as the current political leadership and membership of IPCC are in place, IMO.


      • Thanks for pointing to Dr Christy’s statement.

        I find it interesting that this doesn’t seem to have gained any traction in the ongoing debate, while much energy is spent on looking at what Prof Muller did.

  21. Open thread! Great!

    OK, Judy. You have now presented (and mocked) several theses in the “Slaying” book, but I think the most important issues still have not been discussed.

    I would be interested in comments on this view:


    It appears from my chair that the “experts” are trying to ignore this perspective. But I may just be missing something…

    Please snip all comments that do not present data or real science (well, then we might have a dead thread here, though, eh?).

    • JAE

      Perusing the uncongealed bucket of quibble of the introduction, one comes across the introductory naive discussion of ‘scientific fact’ vs. ‘theory’, a well-known and deprecated poisoning-the-well technique in this rehashed and reissued two-year-old compilation, and one’s hopes for something interesting or useful on Hans Schreuder’s website evaporate unless one is patient. (To present metadata, though one is hard-pressed to find discussion of this paper as science.)

      Such usages — to meet quibble with quibble — of ‘fact’ in contextual phrases like ‘scientific fact’ or ‘political fact’ or ‘schoolyard fact’ or ‘wingnut fact’ are well-established in natural language, and invalid sophistry belaboring trivial and peripheral issues of usage does not bode well for a scientific discussion.

      Much like my previous two paragraphs.

      At the least, the ‘Greenhouse Effect’ itself is scientific fact, by common usage of the phrase (i.e. observations repeatedly confirmed and accepted as true, “ORCAAAT”).

      1. Air contains CO2; (ORCAAAT)
      2. CO2 has absorption bands in the IR spectrum, some of which do not overlap other components of air; (ORCAAAT)
      3. ‘Direct’ radiant energy from the Sun reaches Earth in forms other than IR that air is transparent to; (ORCAAAT)
      4. Some ‘direct’ non-IR radiant energy reaching the Earth is absorbed and re-radiated as ‘indirect’ IR to the air; (ORCAAAT)
      5. Air then absorbs some of this ‘indirect’ IR radiant energy; (ORCAAAT)
      6. Air then re-radiates some of this ‘indirect’ IR radiant energy, including some of it downward. (ORCAAAT)

      What becomes of the energy after #6 we need not specify; we have a Greenhouse Effect at this point.

      A position disputing the above six sets of observations contradicts ORCAAAT in the bodies of knowledge comprising at least physics and climatology, and common widely-available statements defining or explaining the Greenhouse Effect.

      The semantic fact that also there are competing Greenhouse Effect Mechanism theories or hypotheses, disagreements on impacts and on interpretation does not diminish the truth of these repeated confirmed accepted observations.
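      The standard quantitative companion to these six observations is the zero-dimensional energy balance (textbook values below, not figures from Postma’s paper): without an IR-absorbing atmosphere, Earth’s equilibrium emission temperature comes out roughly 33 K below the observed mean surface temperature.

```python
# Zero-dimensional energy balance: T_eff = (S * (1 - albedo) / (4 * sigma))**0.25
sigma = 5.670e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0          # solar constant, W/m^2 (nominal)
albedo = 0.30       # Earth's Bond albedo (approximate)

# Equilibrium (effective) temperature with no IR-absorbing atmosphere:
T_eff = (S * (1.0 - albedo) / (4.0 * sigma)) ** 0.25  # ≈ 255 K

T_surface = 288.0                      # K, observed global mean surface temperature
greenhouse_warming = T_surface - T_eff  # ≈ 33 K
```

      That ~33 K gap is the magnitude the six observations jointly account for; the competing mechanism theories mentioned below dispute how the energy moves, not this bookkeeping.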

      If Postma uses another definition of ‘The Greenhouse Effect’, he does not make its source or provenance explicit.

      Does the straw man constructed out of this ‘Postma Greenhouse Effect’ (which becomes the topic of his paper, and which wastes a page and a half of our time to get to) matter? We can’t know for almost another two dozen pages.

      Postma’s introductory, “..the proposition that the atmosphere warms the surface of the Earth to a temperature warmer than it would otherwise be without an atmosphere, via a process called ‘back-scattered infrared radiative transfer’..just a fancy way of describing the idea that greenhouse gases act like a blanket around the Earth which traps infrared radiation, with the radiation causing it to be warmer than it otherwise would be,” contains significant elision, leaving out mention of ‘equilibrium temperature’ or concepts of dynamic equilibrium, or of chaotic systems, or of ergodicity or attractors or external forcings or perturbations. We know he must know and appreciate these terms, as he discusses related ideas later.

      It would have been more accurate for Postma to suggest instead the more widely accepted proposition that the atmosphere slows the cooling of the surface of the Earth, leading to an equilibrium temperature warmer than it would be without an atmosphere (as we see when comparing the temperature of the Earth to the Moon), via a process involving ‘back-scattered infrared radiative transfer’ .. which can be grossly simplified by likening it to a blanket around the Earth which traps infrared radiation, with the radiation remaining longer near the Earth than it otherwise would. We know he accepts such verbiage, as he uses it later about water vapour, and somewhat later when discussing Venus.

      Further, we know his, “this is supposed to be loosely analogous to how a botanist’s greenhouse works,” remark has been widely disparaged on all sides, has clearly and repeatedly been distanced from by many experts on both sides, and except at such levels as address the origin of the term stemming from days when botanist’s greenhouses were less well understood seems mostly to be used to fallaciously discredit an effect perhaps as unlike a botanist’s greenhouse as a running shoe is unlike a running nose.

      What a shame, as we then are called on by serious inquirers to read the rest of this ill-framed position and comment on it after so unpromising a start.

      It is a double shame, as Postma is quite readable later, informative, thorough and thoughtful, so we actually have good reason to read through the rest after wading through the first couple of pages of quagmire.

      Don’t get me wrong, I’m in favor of collaborative and open online evolution of scientific ideas into formal papers. However, some hard editing should happen in formalization, and a decently merciless editor would have been of substantial benefit in this case.

      It would save our time, as such gaffes need not derail productive skeptical questioning or the enjoyment of reading so early on.

      That similar inaccuracies or straw-man restatements of definitions recur from time to time throughout the paper, making us worry they may be there to allow Postma to defeat them more easily rhetorically, should not then surprise one, whether of terms like global warming or anthropogenic climate change or even blackbody. We see after moderately long discussion that Postma sometimes simply needs to be allowed time to get around to his point, and isn’t always building a straw man. He’s just a bit periodic.

      As to Postma’s claim that “it is usually sufficient simply to acknowledge that the equation has been written down and showed to you, but you are not required to work it out for yourself,” I’m staggered.

      I have no comment on this patent dyskeptia equal to such a stunning assertion, especially given that he doesn’t always write down his equations.

      Postma might save himself entirely, and us a great deal of time, if he appended to “it is absolutely fundamentally impossible for a blackbody to further warm itself up by its own radiation,” the clause ‘but this tells us little or nothing about its rate of cooling, or the total heat it will contain if it is a complex object.’

      Then we could talk about a complex ‘blackbody’ object constructed to have a fluffy asbestos wool outer shell and a tungsten inner core that can rearrange itself into a hard tungsten outer shell with a fluffy asbestos wool inner core, as an analogy for what happens when you take carbon from the interior of the Earth and paint its surface with it (to the degree of one additional pencil’s worth per square inch of sky every decade). Which will get hotter under the same Sun, the fluffy asbestos-Earth blackbody, or the hard black tungsten-Earth blackbody?

      Once we thus show the marginally faulty assumptions and seemingly false analogy Postma and the Slayers rely on, we must approach with extreme suspicion their remaining argument, which slows us down and diminishes the enjoyment of reading.

      What is significant is that Postma does not distinguish bottom from top of atmosphere (TOA) in {6} though he does remember the distinction later, and holds the surface brightness of the TOA constant.

      Surface brightness? How can an atmosphere have a surface brightness? What else do we call the thing darkened every decade by one pencil’s worth of CO2 for every square inch?

      Let’s ignore that {10} leaves out the (inconsequential) energy of the Moon at night on the dark side of the Earth, too.

      I’m uncomfortable with the phrasing of “the solar energy gets spread around to a lot more material in the atmosphere, and therefore gets diluted.” We must wait about ten pages for the purpose of this point to address our discomfiture.

      I’d prefer ‘spherical boundary,’ over ‘circular curve,’ and related usage corrected to disambiguate two from three dimensions in that paragraph on page 15 when describing the surface of the Earth.

      I’d also prefer if the ‘simple trigonometry’ were explicitly shown, but one can work it out for oneself, I suppose.

      By page 16, any editor would have screamed for better organization of thoughts.

      At the page 16-17 break, Postma displays an apparently perfect understanding of the Greenhouse Effect of water vapour, comparing desert to rainforest at night.

      Why doesn’t this occur to him by the page 1-2 split?

      Also, “The really nice thing about thermodynamics is that you can use the simplest equations to predict the outcome of complex systems..” is a bit of a mischaracterization of how a Chaotician might interpret the situation, which is that thermodynamics can make predictions about some non-complex aspects of systems with some complex characteristics. I can certainly predict a swarm of bees will be black and yellow to a high degree of precision, but make no precise comment on the future flight path of individual bees. It’s not a particularly relevant quibble here.

      What isn’t a quibble is the utter lack on page 23 of logical development for, “This implies that the only real way to increase the temperature on the surface of the Earth via an atmospheric “greenhouse” effect is to increase our atmosphere’s density.”

      What? Oh. I see. Those who claim Venus is as Venus is because of a runaway greenhouse effect imply when Postma’s clarification of his Venusian ORCAAAT is applied that density is the issue in their greenhouse effect.

      Someone holding a view about a Greenhouse Effect is wrong about Venus, therefore everyone with any view about a Greenhouse Effect is wrong about Earth?

      Straw man, that.

      Not the main point of the paper, of course, and could easily be dropped from the discussion as a digression that doesn’t contribute much to understanding in an already overlong piece.

      Here on page 24 we finally get to the explicit direct error at the heart of Postma’s article. Took him long enough. “It does not matter that some energy is re-emitted back down to the ground, because it can never be enough energy to heat itself.”

      It matters if the energy slows the rate of heat loss.

      Which Postma has stipulated to when discussing rainforests vs. deserts.

      All the rest is repetitive rhetorical device.

      Postma’s Thermodynamics is fine.

      I recommend reading Postma on this topic.

      He improves the quality of the debate by injecting scrupulous objection to imprecisions in language and errors in formalism he’s come across.

      He then trips over his own straw man, believing the imprecision, not the actual substance, is the case he’s opposing.

      Overall, the paper could be much shorter, and then I’d have had to spend much less time being disappointed by it.

      This paper’s earlier versions, if well-circulated online, must have drawn a substantial body of contrary points; such points ought to be recognized, addressed, discussed or at least acknowledged in a formal presentation for the sake of balance, if one’s objective is scientific truth to the extent that one begins by lecturing readers on the semantics of ‘scientific fact’.

      Also, crediting contributors under a collective nom de plume, although understandable, is not a confidence-building measure. A bit hypocritical of me, I know.

      Slayers indeed; one half expects Kristy Swanson, Eliza Dushku and Sarah Michelle Gellar listed as contributors.

      • Are we still debating this? Atmospheric CO2 can very slightly decrease Tmax and very slightly increase Tmin, but it can do nothing to Tavg. Even Buffy could figure this out.

        Claims of ever-increasing average temperatures caused by atmospheric “back radiation” are nonsense.

        When a molecule emits a photon, it cools. There is no way that photon (or the energy the photon represents) can come back and make the molecule as hot or hotter than it was.

      • When a molecule emits a photon by relaxing a vibrational energy state, it does not cool.

      • Ken Coffman

        I could find relatively little wrong with Postma’s paper, other than its conclusion, introduction and a few quibbles here and there, and its inordinate length.

        Your post has brevity on its side.

        In addition to Rod B’s point, technically, as a Physics professor of mine once explained, a metal object — like a wrought-iron fire poker or a tungsten klein bottle — is all one molecule, so long as the metal is homogeneous.

        We know from a familiar example — the fire poker — that a molecule can therefore be hot enough in one locus to incandesce while another locus may be quite cool.

        You can set the klein bottle in a vacuum and heat a point on its interior with a laser, and except for the photons that radiate out through the same path the laser beam took, every photon emitted by that molecule will be reabsorbed by the same molecule.

        With a clever arrangement of prisms and knowledge of critical angles, you can conceivably make every photon emitted by the klein bottle ‘molecule’ return to it, at least until the incandescence spreads to the exterior surface of the bottle.

        Silly abstracted physics example, but a proof by counterexample only needs one example to falsify a claim.

        Likewise, your assertions about Tx are at best unconvincing and vague, demonstrably not always true, and therefore do not contribute much to our discussion.

  22. Re Understanding_the_Atmosphere_Effect.pdf
    “Thermodynamics says that no object in the universe can heat itself by its own radiation, nor can heat flow from cold to hot”

    Thermodynamics does not forbid what is actually happening behind the greenhouse effect:

    1) The atmosphere emits infrared radiation downwards.
    2) The infrared is absorbed by the surface.
    3) Hence the surface is gaining more energy per second because of the presence of the atmosphere.
    4) More energy absorbed means a higher equilibrium temperature of the surface.

    In conclusion the presence of the colder atmosphere makes the surface warmer.

    If that was a violation of a law of thermodynamics (it isn’t), that law would be wrong and would have to be rewritten to take into account observations of the actual universe.
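    Points 1) to 4) can be sketched with the standard single-slab grey-atmosphere toy model — a textbook idealization, not anything derived in this thread; the absorbed-flux value is the usual ~239 W/m^2 global average:

```python
# Single-slab grey-atmosphere toy model: the slab absorbs all IR emitted
# by the surface and re-emits half upward, half downward. A textbook
# idealization, not a description of the real atmosphere.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
ABSORBED = 239.0   # globally averaged absorbed solar flux, W m^-2

# Without an atmosphere the surface balances absorbed sunlight directly.
T_bare = (ABSORBED / SIGMA) ** 0.25

# With one absorbing slab, top-of-atmosphere balance fixes the slab at
# T_bare, and the surface balances sunlight plus the slab's downward
# emission: sigma*Ts**4 = 2*sigma*T_bare**4, so Ts = 2**0.25 * T_bare.
T_slab = T_bare
T_surf = 2 ** 0.25 * T_bare

print(f"bare surface {T_bare:.0f} K; with slab: surface {T_surf:.0f} K, slab {T_slab:.0f} K")
```

    The slab is colder than the surface, yet its presence leaves the surface warmer; the net heat flow is still from the warm surface to the cold atmosphere, so no law of thermodynamics is troubled.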

    • “1) The atmosphere emits infrared radiation downwards.”

      That is where most of the confusion lies. CO2 slows the rate of outbound radiation. On a small scale, some LWR is directed back toward earth, but the rate of reduction of outbound LWR is the key. Does a thermos bottle shoot photons back at your hot coffee? To a point, but the heat is retained longer because the rate of heat loss is reduced. Down welling and back radiation are minor players; it is the reduced rate of outbound radiation that is the pooh.

      • A thermos bottle is probably a poor choice of example. It works by reducing conduction of heat energy; there is not necessarily any reduction in radiation, though there can be when mirrored surfaces are used.

      • True, there is no perfect analogy. I used a thermos so one could compare the reflective surface impact to the total insulation impact. The dam or blanket analogies are useful up to a point. It is best to avoid analogies and just understand what is happening. CO2 does get excited and does emit some energy when it drops back to a lower state. That energy is emitted in all directions randomly and only travels so far before it excites another molecule or eventually escapes. The impact is a reduced rate of outgoing LWR flow. Avoiding descriptions that confuse people should be the norm, not creating new terms that are little more than analogies.

      • steven mosher

        a space blanket is perhaps a better analogy which works by retarding heat loss by radiation.

      • Yes, I agree. And a blanket doesn’t have to be at body temperature to keep you warmer than no blanket, like the atmosphere can be cooler than the surface and still keep it warm.

      • Dallas

        If you look at the Kiehl and Trenberth Earth radiant energy balance cartoon, it shows 342 W/m^2 incoming solar SW radiation, of which around one-third, or 107 W/m^2, are reflected by clouds, surface, etc.

        It also shows that 40 W/m^2 are radiated directly out to space and 390 W/m^2 are radiated to the atmosphere by the surface as LW radiation (along with 102 W/m^2 from thermals and evaporation) but that 452 W/m^2 of this energy are absorbed by the GHGs and clouds in the atmosphere and 324 W/m^2 re-radiated back down to the surface where they are absorbed, with only 195 W/m^2 being radiated by the atmosphere out to space.

        An up-dated version has jiggled these large numbers slightly to accommodate a postulated “net imbalance” of 0.9 W/m^2 (a “plug number” taken directly from the Hansen et al. “hidden in the pipeline” study and rounded up from 0.85 to 0.9).

        So the back radiation is a major factor in the overall internal flow.
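        A quick arithmetic check — a minimal sketch using only the W/m^2 figures quoted above — confirms that these flows close at the top of the atmosphere:

```python
# Check that the quoted Kiehl & Trenberth flows close (all W/m^2,
# taken directly from the comment above).
incoming_solar  = 342
reflected       = 107
window_to_space = 40    # surface IR escaping directly to space
surface_lw_up   = 390   # IR emitted by the surface
thermals_evap   = 102   # non-radiative surface-to-atmosphere transfer
absorbed_by_atm = 452   # surface energy intercepted by GHGs and clouds
back_radiation  = 324
atm_to_space    = 195

# Top-of-atmosphere closure: absorbed solar equals outgoing longwave.
assert incoming_solar - reflected == window_to_space + atm_to_space  # 235 both

# The atmosphere's intake from the surface is everything sent upward
# minus what escapes directly through the window.
assert surface_lw_up + thermals_evap - window_to_space == absorbed_by_atm

# What the atmosphere emits (down plus up) minus what it intercepts from
# the surface must be supplied by solar absorbed within the atmosphere.
solar_absorbed_in_atm = back_radiation + atm_to_space - absorbed_by_atm
print(f"implied solar absorption in the atmosphere: {solar_absorbed_in_atm} W/m^2")
```

        The quoted numbers are internally consistent, which of course says nothing about how realistic they are — that is the empirical question raised below.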

        But how realistic is this all?

        Do we have any empirical data from actual physical observations? Can TOA numbers from ERBE or CERES tell us anything about this internal energy flow, or is it all essentially based on model simulations based on theoretical deliberations?


      • Max,

        Determining the impact of CO2 is the trick. Cartoons are fine for general descriptions but really little else. Does CO2 get excited and shortly after re-emit a packet of energy? Sure. Where would this have an impact? In dry (or dryer) air, else it is lost in water vapor. Is it hard to figure out what impact CO2 has? Hell yeah! But that doesn’t mean there is no impact.

        So as far as cartoon radiation balance diagrams go, they are humorous. They could be more descriptive and use COzilla or something interesting instead of down welling heat.

      • The Spam thing ate my homework! The cartoon is just that. Back radiation is the punch line. Back radiation or down welling heat are just poor analogies that do not describe the process. How far does a CO2 released packet of energy travel in dense humid air? Not far. The direction of heat flow never changes, its rate is just reduced.

      • John Carpenter

        Why do the cartoons like this always show GHG’s way up high in the atmosphere (stratosphere), like a giant reflector aiming the LW radiation back to earth? I understand these images are trying to help non-technical people understand the concept…. but the visual is such a bad example of what is really happening that it may be doing more damage to the understanding than good. As you allude to, this type of diagram ought to indicate the change in rate the LW can escape rather than wavy lines reflecting back toward earth. Just sends the wrong message. GHG’s are not mirrors, they are more like blockers that make the LW radiation escape through a much longer circuit than if the GHG was absent. Why don’t they show it like that?

      • The most serious problem with the energy budget is that it is static while climate is dynamic, and that climate changes are not restricted to, or even dominated by, greenhouse gases.

        There have been problems with satellite data – Wong et al 2006 for instance corrected the ERBE record for altitude drift. And much has been made of the problem ISCCP instruments have in seeing low level cloud. Nonetheless the data has been refined over a considerable period, both satellite sources agree and it is validated by physical observations of cloud changes especially in the Pacific. It is time that this issue came to the forefront – there is a problem with the long picked over satellite data or there is a problem for AGW theory.

        This is the infrared up at top of atmosphere in the tropics from both ERBS and ISCCP – showing huge cooling on the infrared http://isccp.giss.nasa.gov/zFD/an2020_LWup_toa.gif

        This is the equivalent in the shortwave showing considerable warming as a result of less shortwave reflected from less cloud –

        This is the net – showing residual warming.

        Anomalies are shown as these changes are an order of magnitude more accurate than absolute values.

        For completeness – this is the CERES data showing a little warming at the end of the period in the LW and a warming trend in the period in the SW.


        The size of the anomalies shows that cloud cover changes, rather than the slow changes from aerosols or greenhouse gases, are responsible for much of the change.

        It is well known that the Pacific Ocean is the dominant source of global climate variability and that this involves the formation and dissipation of low level clouds over a colder or warmer ocean respectively (e.g Burgmann et al 2008, Clement et al 2009).

        The interpretation of the satellite record shows decreasing cloud in the warm Pacific multi-decadal mode to the late 1990’s, followed by an abrupt increase. Cloud cover has declined a little in the interim in moderate El Nino conditions to mid 2010 and the planet warmed a little in the deep oceans especially. The current super La Nina has resulted in cooler conditions as one would expect.

        An abrupt increase in cloud in the Pacific in the late 1990’s was noted in observations by Burgmann et al 2008 and Clement et al 2009. They also noted an equivalent decrease in cloud cover in the 1976/77 Pacific climate shift.

        Secular changes in cloud as shown in both surface and satellite observations pose a fundamental problem in climate science. In popular communications, clouds and that other fundamental problem of climate science, ‘internal climate variability’, are invariably mentioned as afterthoughts. In the recent Royal Society climate summary, for instance, clouds appear solely as feedbacks and ‘internal climate variability’ gets a single short paragraph. These seem to be rather the most important modes of climate change and deserve much better treatment than they have been receiving.

      • Chief
        “This is the infrared up at top of atmosphere in the tropics from both ERBS and ISCCP – showing huge cooling on the infrared”

        I may have made a very basic misunderstanding of the science and, if so, I apologise. However, in your graph here upwelling radiation at the TOA has increased over time. Doesn’t this imply that the atmosphere has warmed rather than cooled?

        ie hotter = more radiation emitted.

      • Energy is everything in climate – all warming or cooling results from an energy imbalance. Energy in less energy out equals the change in global energy storage. This can be expressed as:

        Ein – Eout = d(GES)/dt – it must be this way from the 1st law of thermodynamics. Energy equals the radiative flux over time – 1 watt for 1 second is one Joule. If energy in is greater than energy out the planet warms and vice versa. Alternatively, if the planet is warming as it was in the period of the satellite record to the late 1990’s – then energy in is greater than energy out.
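        To put a scale on d(GES)/dt, here is a hedged back-of-envelope sketch: Earth’s surface area is a standard figure, and 0.9 W/m^2 is simply the illustrative imbalance number mentioned elsewhere in this thread.

```python
# Back-of-envelope scale for d(GES)/dt under a sustained imbalance.
# Surface area is a standard figure; 0.9 W/m^2 is illustrative only.
EARTH_AREA = 5.1e14        # m^2
YEAR = 3.156e7             # seconds
imbalance = 0.9            # W/m^2, i.e. J per second per m^2

joules_per_year = imbalance * EARTH_AREA * YEAR
print(f"energy accumulated per year: {joules_per_year:.1e} J")

# For comparison, warming the top 100 m of the oceans (~3.6e14 m^2 of
# ocean surface) by 1 K takes 100 m * 1000 kg/m^3 * 4186 J/(kg K) per m^2:
ocean_100m_per_K = 3.6e14 * 100 * 1000 * 4186
print(f"heat to warm top 100 m of ocean by 1 K: {ocean_100m_per_K:.1e} J")
```

        Roughly 1.4e22 J accumulate per year against roughly 1.5e23 J per kelvin for that ocean layer, so a decade of such an imbalance warms it only about 1 K — which is why the storage term, not the instantaneous fluxes, carries the signal.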

        What we have is anomalies in radiative flux and what we see is more radiative flux and therefore more energy leaving the planet in the late 1990’s than in the mid 1980’s in the infrared – the graph you refer to. So we have some cooling in the infrared as a result of an increase in energy out.

        At the same time a decrease in cloud cover resulted in less reflected light leaving the planet. Energy out decreased in the SW and the planet warmed.

        The net effect was for less energy leaving the planet – so overall the planet warmed.

        Energy in was also higher in the late 90’s than in the mid 80’s in the normal course of the quasi 11 year cycle.

        You are quite right that hotter equals more emissions in the IR. But to get hotter you need an energy imbalance. One way that arises is from more greenhouse gases absorbing more of the surface IR emissions heating the atmosphere. This energy is stored in oceans and atmosphere and not emitted. Ultimately, the planet is warmer and emits more IR. All other things being equal you don’t get an increase in emissions – they simply tend to return to the more balanced amount with a warmer planet. All other things are not equal.

      • Dallas – I have to disagree with you about the cartoon, which I believe provides a very accurate picture of exactly what happens regarding radiative transfer and the greenhouse effect. I recently gave a talk on climate change to a college audience and used a similar diagram to illustrate the basic principles.

        If you consider simply the multiple redirections of infrared (IR) photons in the atmosphere via greenhouse gas absorption and re-emission before escape to space, the process of escape is almost instantaneous, so that any “delay” from a CO2 increase involves only a fraction of a second – in other words, there is no meaningful reduction in the rate of escape of individual photons but rather a temporary reduction in the net flux (total number of IR photons escaping per second, all of which are still traveling at the speed of light). In that sense, it can give a misleading mental picture to refer to a “slowing” effect. Rather, the effect consists of (a) local thermalization (heating) in the atmosphere due to transfer of energy from excited GHG molecules to surrounding air, and (b) downward radiation that heats the surface. It is true that heat absorption by the surface, particularly the ocean, results in only gradual warming because of the enormous thermal inertia of the oceans, and so the restoration of balance between incoming and outgoing radiation via increased IR emission by a warmed surface takes much time; in that sense, the greenhouse warming is delayed, but not because it takes a long time for energy emitted by the surface to escape to space via delays in the atmosphere.

        Regarding questions of observational confirmation, I believe Judith Curry’s blog post on radiative transfer models contains links and references to the A.R.M. measurements of downwelling radiation at the surface, and her post on Ray Pierrehumbert’s article describes the spectral TOA measurements demonstrating the major “ditch” dug out of IR emissions in the CO2 absorption region. These should be consulted for some of the quantitative details.

      • Then let us agree to disagree. It all boils down to an analogy unless you accurately describe the process. Black body radiation “flows” from a warmer source to a cooler sink seeking thermal equilibrium. Pesky little LWR molesting molecules make the black body a gray body; they temporarily block the flow of heat in the form of LWR. Some are better blockers than others. That packet of LWR at a certain wavelength excites the molecule, which chills out and releases a small packet of LWR at the same wavelength in a random direction. Depending on the density of the good blocking molecules, another gets excited and chills. As the density of good blocking molecules decreases, the distance the packet travels increases. The odds of the packet traveling further in the less dense good catching molecule direction are greater than in the opposite direction. The less dense good catching molecule region just happens to be space, where the little photon of LWR was headed to begin with. While the little outgoing packet of energy got stuck in rush hour traffic for a while, it finally will make it to the interstellar freeway; it just takes longer. Whether the blockers stopped the little packet for a picosecond or some other minute time period is not relevant, the packet’s progress was impeded. As far as temperature of the atmosphere goes, it is the increased number of momentary stops and releases that increases temperature. The little LWR packet never returned to its starting point and gave up on its journey; it endeavored to persevere, driven by the laws of thermodynamics on its quest for thermal equilibrium.

        Yeah, that is kinda hard to put in a single cartoon cell. I guess it is more fun to imagine each CO2 molecule as a little space ship shooting heat rays back at Earth.
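        The rush-hour picture can actually be caricatured in a few lines of code. This is a toy 1-D random walk with made-up layer counts — not a radiative-transfer model: every packet still escapes, but more absorbing layers mean many more interception events on the way out.

```python
# Toy 1-D random walk for an IR "packet": at each absorbing layer the
# packet is re-emitted up or down with equal probability. Layer counts
# are invented for illustration; this is a caricature, not a
# radiative-transfer model.
import random

def hops_to_escape(n_layers, rng):
    """Count absorption/re-emission events before the packet exits the top."""
    pos, hops = 0, 0
    while pos < n_layers:
        pos += 1 if rng.random() < 0.5 else -1
        pos = max(pos, 0)          # the surface re-emits it upward
        hops += 1
    return hops

rng = random.Random(42)
for layers in (5, 10, 20):
    mean = sum(hops_to_escape(layers, rng) for _ in range(2000)) / 2000
    print(f"{layers:2d} layers: ~{mean:.0f} hops before escape")
```

        More absorbing layers mean many more interception events, even though each individual photon still moves at light speed and all of them eventually escape; it is the net outward flux that is temporarily reduced.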

      • I believe you’re repeating the same misconception that it is a “time delay” in the escape to space of energy released into the atmosphere that mediates the warming from increased CO2. That is not correct, Dallas. The warming is due to the heating of the atmosphere by CO2 (or other) GHG molecules that are absorbing more energy because they are increased in number and intercepting more surface emitted IR, and also to heating of the surface due to increased downwelling radiation resulting from the increase in GHG content. The increase in CO2 temporarily reduces total outward flux to space, but this is restored to earlier levels when the temperature increases. At this point, the rate of IR escape is back to normal but the temperature remains warmer.

        One way to visualize this is to consider the fate of an extra IR photon. If it is back-radiated into the ocean, the equilibration time may require centuries before the ocean warms to the point that it can emit an extra photon in return – that is the delay due to surface absorption. On the other hand, if an extra photon is emitted into the atmosphere from a warmed surface, that extra energy will escape to space almost immediately no matter how many times it bounces around in the atmosphere (as long as it doesn’t get absorbed by the surface). In other words, the tiny delay due to the extra bouncing around will have no discernible effect on temperature.

        The difference between the ocean-mediated delay and the immediate escape to space of atmospheric energy increases results from the high heat capacity of the ocean and the very low heat capacity of the atmosphere.

      • If we release greenhouse gases to the atmosphere we decrease the transparency of the atmosphere to IR – more IR is captured in the atmosphere before escaping into space. Thus the internal energy of the molecules increases along with the temperature of the atmosphere. Is this at the speed of light? It relies on internal changes to the kinetic energy of molecules but is probably pretty fast. The warmer atmosphere emits more infrared in all directions by a 4th power function – enough more up to offset the lower transparency.

        If we don’t continue to add greenhouse gases – the temperature of the atmosphere is maintained at the warmer temperature while the greenhouse gases remain in the atmosphere but doesn’t warm further.

        While driven at the quantum scale – it is a macroscopic gas problem involving heating of the entire atmosphere and then increased IR emissions from a warmer atmosphere.

        Nothing to do with the oceans at all – except that a warmer atmosphere will transfer heat to the top 150mm (?) of the ocean surface. Heat from the surface can go to 3000m deep and more and stay there for 1500 years.

        But the IR changes are the proverbial drop in the ocean when compared with SW that heats the ocean to 100m and more – and changes an order of magnitude more than IR.

      • Interesting. So a simple change in the Greek symbol e thingy in the Stefan-Boltzmann equation cannot describe the “greenhouse effect”, since we now have more sharpshooting CO2 molecules that heat the oceans with a mysterious lag instead of reducing the radiant heat transfer from the oceans to space, even though the impact of the sharpshooting CO2 molecules is not significant until the air is sufficiently dry (less water vapor) for CO2 influence to be on the same order of magnitude as the predominant “greenhouse gas”.

        I was under the impression that the CO2 impact would be most noticeable at the poles and upper troposphere where CO2 could do its thing.

        Perhaps you have an equation that includes a Greek b or d thingy representing change in back radiation or down welling? That could explain the different warming measured between the north and south poles. Or perhaps ozone has a double reverse down welling factor? Why bother keeping it simple, right?

      • Regarding the ocean’s role in absorption and eventual reradiation of absorbed heat, the link cited earlier and repeated below indicates that downwelling IR contributes much more to ocean heating than direct solar radiation – Earth Energy Budget.

        IR is absorbed almost entirely at the surface while visible solar radiation is absorbed over a larger range (mostly by particulate matter rather than water itself), but the two are quickly homogenized by turbulence and convection.

      • Gad zooks! I am running out of reply thingys! Fred, wherever you are. If you insist on the back radiation thing, why not show a solid sphere, Earth proper, inside a hollow sphere, Earth’s atmosphere, then show that when the temperature of the hollow sphere is greater than the solid sphere, there is radiant heat transfer to the solid sphere from the hollow sphere. Kinda like following the laws of thermodynamics.

        If you want to really trick out the sphere in sphere diagram, you can show the outer sphere as not truly a sphere, but a somewhat spherical shape with varying thickness representing varying thermal mass. We might need to bump the cartoon up to an animation.

      • John Carpenter


        I spent some time reading up on radiative energy transfer and the GHE. I did as you suggested, went to earlier posts by JC. I read at science of doom, RC, skeptical science and one other.. I forgot.. and I have to say I am not satisfied with the explanations so far. I am not a GHE denier, but I have studied IR spectroscopy quite extensively in my former life. Only skeptical science got the reason why some molecules can absorb radiation and others cannot partially correct (the molecule must have an oscillating dipole moment) but then ignored the importance of molecular symmetry as to why only certain vibrational modes are IR active while the others are not. RC glossed over this, but did include the addition of rotational-vibrational coupling to broaden the absorption spectrum of gases. {Side note: If you trap a molecule in a frozen inert gas like Ar where it is no longer able to rotate, you get the pure vibrational modes of the molecule and the absorption lines are very sharp. This method is called matrix isolation.}

        In the grand scheme of things, IR absorption (and emission) may not be the most critical aspect of describing how the GHE works, but it is a fundamental understanding that I would expect any expert in the field to know.

        Regardless of the poor explanation of how electro-magnetic radiation is absorbed by a species at the various sites I visited, I am more perplexed now of how the thermodynamics of energy transfer is understood. I think I need to review more from my old statistical thermodynamics text to see if there is a better explanation/derivation of vibrational to translational/collision energy transfer. I am not finding a very satisfactory mechanism so far on the web as to how that translates to atmospheric warming. If you (or anybody else that happens to read this) have anything good, let me know.

      • Hi John,

        I must admit that I have given up on the quantum scale explanations of IR opaqueness. But if you look at satellite data showing IR spectral absorption – it seems clear that there is an effect.

        Warming of the atmosphere follows without too much difficulty if you have more of these molecules distributed in the atmosphere.


      • John Carpenter


        Yes, you are right. I don’t question the absorption of radiation. I am trying to gain a better understanding of the mechanism by which that absorbed energy then leads to increased air temperature. One path is collision of excited state molecules with surrounding atmospheric molecules and vibrational-rotational energy transfer to translational energy, while the other path is re-emission of the absorbed energy back into the system. Two very different mechanisms that can have different outcomes. I am not satisfied with the more or less glossed over explanations I am finding so far.

      • John – Here are two good sources to answer your questions –

        Pierrehumbert On Infrared Radiation

        IR Spectroscopy

        Briefly, the time needed for an excited CO2 molecule to emit a photon is much longer than the average interval between collisions with neighboring molecules (N2, O2, etc.), and so most de-excitation is collisional, increasing the kinetic energy (and therefore the temperature) of the surrounding air. Similarly, most CO2 excitation is collision-based, with only rare excitations due to photon absorption – hence the dependence of emission on temperature.

        The relevant transitions in CO2 (i.e., those involving the creation of oscillating dipoles) are a bending vibration and rotations in the main absorption region (about 15 um wavelength), although I believe the asymmetric stretch is active at other wavelengths (the symmetric stretch does not create a dipole).

        I believe there are also details at the Modtran.org website.

        More quantitatively, fewer than one excitation in a thousand leads to emission, and more than 999 in a thousand release the energy in collisions. Similar numbers apply to the cause of excitation. Something like three to five percent of CO2 molecules are always in one of the excited states corresponding to 15 um radiation, due to the high rate of excitation by collisions.

        The radiative processes have little influence on the excitation level of CO2 molecules, but the radiation still provides an efficient energy transfer mechanism, as the mean free path varies from a few tens of meters close to the surface to progressively longer values at lower pressures. Such a mean free path leads to efficient energy transfer.
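These figures can be sanity-checked with a short sketch. The temperature, radiative lifetime, and collisional relaxation time below are assumed order-of-magnitude values chosen for illustration, not taken from the comment:

```python
import math

# Boltzmann estimate of the excited fraction of CO2 at the 15 um bending mode.
# Assumed: T = 288 K surface air; degeneracy factors ignored.
h = 6.626e-34   # Planck constant, J s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K
T = 288.0       # K, assumed surface temperature

E = h * c / 15e-6                          # photon energy at 15 um, J
excited_fraction = math.exp(-E / (k * T))  # Boltzmann factor, ~a few percent

# Radiative vs collisional de-excitation, using assumed illustrative lifetimes:
tau_rad = 1.0    # s, radiative lifetime of the bending mode (order of magnitude)
tau_coll = 1e-5  # s, collisional (V-T) relaxation time near sea level
radiating_fraction = tau_coll / (tau_coll + tau_rad)
# far fewer than one excitation in a thousand decays by emitting a photon
```

With these assumptions the excited fraction comes out near 3–4%, and the radiatively decaying fraction is well below one in a thousand, consistent with the numbers in the comment.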

      • John Carpenter


        Thanks for the links… some of what I read helped to jog my memory and helped me find a few more answers. As with anything, it also led to more questions… so I will continue my odyssey.

      • manacker 4/3/11 2:23 pm, week 4/2/11

        Your data are fine, reflecting K&T’s budget, but you linked to a chart that is wrong on a couple of counts. First, it doesn’t have the same K&T data that you cite. It omits the thermals and evapotranspiration; consequently it is not in balance at TOA and the surface boundary, which is a key feature of K&T’s budget.

        Second, it has technical errors. It says that solar energy “… is converted into heat”. It is converted into thermal energy, which is measured by temperature. Solar radiant energy to Earth, less the negligible radiant energy returning from Earth to Sun, is heat.

        Third, and most important, it is not what K&T developed, which is the initial condition and the basis for IPCC’s definition of its radiative forcing paradigm. There would be no AGW concern but for IPCC and its Reports.

        IPCC needs to be debunked on its own terms, not by posing alternatives, such as other models, other data, or other data reduction. Even if an alternative model could predict climate, which would be the ultimate scientific test, IPCC provides no usable prediction of its own which might be shown inferior. We can extract predictions from IPCC’s modeling, such as global average surface temperature over the past decade, or current climate sensitivity, both contradicted by measurements. However, as with the alternatives to the GCMs, the AGW proponents can reduce the question of whether the extraction is correct to a matter of opinion. Therefore, these various alternatives are all subjective, which is surely good enough for Congress and politics, but they are not susceptible to objective evaluation, as demanded in a scientific forum.

        One of many good ways to debunk IPCC is to show the errors, explicit and implied, in its use of that K&T diagram. For example, the budget is neither unique, nor even a conditionally stable point in the atmosphere. It is not peculiar to 1750, nor does it partition the atmosphere into multiple layers. Just think of all the implications!

        Here’s a link to that budget, gloriously colorized by IPCC: http://www.ipcc.ch/publications_and_data/ar4/wg1/en/faq-1-1-figure-1.html

    • No. Just leave it at that.

  23. lolwot writes “If that was a violation of a law of thermodynamics (it isn’t), that law would be wrong and would have to be rewritten to take into account observations of the actual universe.”

    Clearly, the greenhouse effect exists, so there can be no violation of the 2nd law of thermodynamics. However, my reading of the criticisms of CAGW is not that the greenhouse effect itself violates the 2nd law, but that the process whereby the proponents of CAGW claim the greenhouse effect works violates the 2nd law. That is, the physics used, for example, to estimate no-feedback climate sensitivity violates the 2nd law. So it is the process that the proponents of CAGW use to estimate the climate sensitivity for a doubling of CO2 that violates the 2nd law, not the greenhouse effect itself.

    However, my physics is not good enough to really understand this. My vague understanding of what Bill Kinninmonth writes, is that the whole of the greenhouse effect comes from a change in the lapse rate, which is not part of the estimation of no-feedback sensitivity.

  24. …we will be rewriting people’s perceived wisdom about
    the course of temperature change over the past millennium.


    Is this doing science?

  25. Correlation between PDO and Global Mean Temperature oscillation


    Is this correlation coincidental?

    • Girma

      “Is this correlation coincidental?”

      Spencer doesn’t think so. (Nor do I, for that matter.)

      But IPCC cannot even see it, Girma, because it lies outside the “box” of the CAGW paradigm.


  26. Here’s some more hot off the press settled science for consideration (of interesting note is that it was the result of an FOI request):


    “Victoria could experience between 21 per cent fewer and 25 per cent more one in 20 year events.”

    Now in local parlance, I would say this is an “about face” and “having two bob each way”. Still some work to do on the communications I’d suggest.

  27. Nonsense. For one thing – I was speaking of the change in SW and the change in LW – in the satellite record an order of magnitude difference in forcing to the end of the 1990’s.

    It is at any rate nonsensical to say that ‘downwelling IR contributes much more to ocean heating than direct solar radiation’. The planet gains heat from the Sun, the surface warms and re-radiates in the IR and the planet loses heat to space both directly and indirectly. All of the back radiation is the result of solar heating – there would be no back radiation if there were no Sun. If you are saying that there is a greenhouse effect? I am quite prepared to concede that. It doesn’t make your statement any less silly.

    The other gross error involves a glossing over of multiple processes in the oceans. Quickly homogenised? There are a couple of thermal barriers. The first is commonly at about 10m – this is a mixed layer in contact with the atmosphere. The mixing is the result of the rotational energy of waves. Convection is not the dominant process by a long way – the movement of heat is determined by buoyancy considerations and moves towards the surface and not to depth by processes other than downwelling in a few places on Earth. Like on land – much of the heat in this mixed surface layer is lost to the atmosphere at night.

    The other thermal barrier occurs at 100 m or more and is the result of shortwave penetration to that depth and beyond, depending on the clarity of the water. About 3% of the SW energy at the surface reaches 100 m depth.

      • Joe Lalonde


        Why is it no one includes ocean salts?
        It is like a big voodoo subject to stay away from.

    • The quantitation in the Kiehl/Trenberth Energy Budget Diagram (see the black and white diagram) shows that downwelling IR contributes about twice as much to ocean heating as direct solar radiation (the diagram doesn’t specify oceans alone, but they are the main energy absorbers at the surface). The homogenization of solar and IR absorbed energy in the upper parts of the ocean is demonstrated by the lack of a discontinuity between the skin temperature and the directly underlying water – in fact, the skin temperature is slightly cooler than the water immediately underneath, which is incompatible with a disproportionate distribution of IR into the surface as opposed to underlying layers –
      Ocean temperature distribution (a = night, b = day).

      The claim that IR can’t warm the ocean because it “can’t penetrate” is one of those enduring blogosphere myths that persists despite incontrovertible evidence to the contrary. It is probably due to misleading connotations of the word “penetrate”, which in the case of IR simply reflects the vastly more efficient absorption by water of IR than of visible light, which is hardly absorbed at all by pure water.
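The "about twice" figure can be checked against the round numbers of the 1997 Kiehl & Trenberth budget; a minimal sketch, treating the values as illustrative global averages in W/m^2:

```python
# Surface energy budget terms from Kiehl & Trenberth (1997), W/m^2.
absorbed_solar_surface = 168.0   # SW absorbed at the surface
back_radiation = 324.0           # downwelling LW absorbed at the surface
surface_emission = 390.0         # upwelling LW emitted by the surface
thermals = 24.0                  # sensible heat flux to the atmosphere
evapotranspiration = 78.0        # latent heat flux to the atmosphere

surface_in = absorbed_solar_surface + back_radiation    # 492 W/m^2
surface_out = surface_emission + thermals + evapotranspiration  # 492 W/m^2
ratio = back_radiation / absorbed_solar_surface         # ~1.9, "about twice"
```

The incoming and outgoing surface totals balance, which is exactly the closure property the manacker/Glassman exchange above says a correct version of the diagram must have.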

      • Just to be clear re the above, IR is absorbed at the skin layer, but is redistributed by turbulence and convection in such a way as to mix IR and solar absorbed radiation more or less homogeneously into the top layers.

      • Absolute piffle. The outdated K&T96 figure also shows IR and latent heat losses from the surface that are much greater than the downwelling IR – as it should be. I am still not clear what the point is because without solar emissions of both IR and visible light there would not be either downwelling or upwelling IR. I have ‘conceded’ that there is a greenhouse effect. What more can I say?

        I am still talking about delta LW and delta SW – a point you resolutely refuse to face.

        You show me a diurnal thermocline in what? This simply shows that water is cooler at night than in the day. This seems fairly obvious but again the point eludes me. The discussion on ‘skin’ is moot. The effect of surface tension is the result of the polarity of the water molecule. It is a weak alignment that has negligible effect in open water bodies exposed to wind shear and turbulence. That it is warmer apparently half a millimetre below a still surface has a significance that entirely eludes me again.

        I must admit that I have not seen this so-called blogosphere myth. I was simply discussing the optics of oceans without going at all into heat distribution in the ocean – which is obviously extremely complex and not susceptible to the gross simplifications of blogosphere post modernist science. And if you can find pure water in the oceans – I’d be surprised. Stick to the facts, Fred, and you won’t continue to make such a fool of yourself.

        I sincerely suggest that you openly explore topics instead of leaping to the defence of an ideologically driven position. It just makes things more pleasant and productive.

        Robert – I’m not sure what you are trying to say of a substantive nature, because it seems to be clothed in invective. My point is that downwelling IR from greenhouse gases contributes much more than direct solar radiation to ocean heating. The “myth” I referred to holds that the IR contributes much less than direct solar radiation because IR “can’t penetrate”. Understanding the actual contribution is important because it is a major element in the Earth’s energy balance.

      • You have a habit of commencing your responses with – you’re wrong, mistaken or otherwise misguided. I am just messin’ with you Fred – more for my own amusement sad to say.

        Then there is this other element of post-modern science that involves a narrative whose only justification is the assumed plausibility of the chain of causality. This is far too common these days and the chain of causality is most usually far too simple minded to have more than a passing familiarity with reality. I more often than not proceed from the assumption that I don’t understand and try to understand than otherwise. You can easily find an appropriate Richard Feynman quote.

        “I think I can safely say that nobody understands Quantum Mechanics”

        “I’m smart enough to know that I’m dumb.”

        This one from Albert Einstein caught my attention.

        “Once you can accept the universe as being something expanding into an infinite nothing which is something, wearing stripes with plaid is easy.”

        In something as complex as climate there are multiple chains of causality and the only reality is data. Here is the ERBE and ISCCP-FD data for the tropics. Discuss it or not – but I feel you would rather offer distractions with the substance of fairy floss from the trenches of the climate wars. And if you don’t like my responses – go play with someone else.


        Energy is everything in climate – the only ultimate measure is the change in radiative flux at the top of the atmosphere.

      • Fred,
        In this connection, those who don’t like to accept the concept of back-radiation at all, but only the net LW radiation, are perhaps closer to the truth than your description. The energy balance of the thin skin layer is dominated by

        1) Net LW radiation, which is the difference of outgoing and incoming LW radiation. Both OLR and ILR are larger than net LW and OLR is always larger than ILR. Thus the LW is always a cooling factor. The skin depth is the same for OLR and ILR.

        2) Convective heat transfer from layers below the skin. This is the only positive contribution of the three main components. Thus the skin must be cooler than the layers immediately below.

        3) Evaporative heat loss to atmosphere.

        In addition there are small conductive heat flows from layers below and to the atmospheric skin layer or from it. The SW has a very small immediate contribution, if we consider a very thin skin, but SW becomes essential when layer thickness is increased. For the thin skin the SW contributes through the mechanism 2), not directly.

        You look at OLR and ILR as separate components, but that way you lose the strong relationships between them (OLR always larger than ILR and both larger than the net). From the point of view of the skin it is better to look at the net effect only.

      • Pekka – this diagram professes to show the daylight temperature profile of the surface of the ocean.

        How does your explanation explain the shape of the profile between the blue marker and the green marker, and the green marker and the red marker? To me, what you are describing should result in an abrupt angular profile from the blue marker to the red marker:


      • JCH,
        The arguments are foolproof within their limits, as they are based on the first law, but such arguments require temporally constant temperatures. Thus the daily variations may lead to deviations. Where such deviations can occur can be decided through more detailed calculations.

        Taking averages over periods of several days or so, we know with certainty that SW radiation heats layers below the immediate surface, that much of this heating is released to the atmosphere through the skin, and that only a fraction will gradually heat the deeper ocean water.

        There are also exceptional situations with warmer moist (foggy) air covering a cold ocean. Under such local conditions the ILR might be larger than OLR but these exceptions are not significant for the wider averages.

        Anyway my main point was that OLR and ILR are strongly related and their net effect influences the skin energy balance.

      • ‘Why are sea surface temperatures rather than air temperatures used over the oceans?

        ‘Over the ocean areas the most plentiful and most consistent measurements of temperature have been taken of the sea surface. Marine air temperatures (MAT) are also taken and would, ideally, be preferable when combining with land temperatures, but they involve more complex problems with homogeneity than SSTs (Rayner et al., 2003). The problems are reduced using night only marine air temperature (NMAT) but at the expense of discarding approximately half the MAT data. Our use of SST anomalies implies that we are tacitly assuming that the anomalies of SST are in agreement with those of MAT. Many tests show that NMAT anomalies agree well with SST anomalies on seasonal and longer time scales in most open ocean areas. Globally the agreement is currently very good (Rayner et al, 2003), even better than in Folland et al. (2001b). However, some regional discrepancies in open ocean trends have recently been found in the tropics (Christy et al., 2001).’ – http://www.cru.uea.ac.uk/cru/data/temperature/

        Now obviously there is transfer of energy between the oceans and atmosphere. There is physics involved. The incoming visible light tends to be absorbed 50% in the first 1m, 80% in the first 10m and 97% in the first 100m. The ocean warms and emits IR up from the top microns, the atmosphere warms and re-radiates back into the top microns of the surface. The net IR flux at the surface is positive upward so the surface loses energy to the atmosphere just as the atmosphere loses energy to space. The rate of energy loss in the top microns is greater than the rate of mixing so the top microns are usually a little cooler than the underlying water.

        With greenhouse gases – we are looking at a reduction in the net IR – as the atmosphere warms and radiates more energy in all directions – and therefore in the heat loss from the oceans.

        Rather than this odd idea of mixing cooler water into warmer and getting a temperature increase at depth – there is just a simple reduction of energy loss which is then stored as heat in oceans and atmosphere.

        It matters not a hill of beans whether it is LW or SW – all that matters is the radiative flux, and the only place you get a longer term (dynamic – because you need to factor in planetary warming or cooling) energy equilibrium is at the top of the atmosphere.

        The oceans and atmosphere have a seasonal temperature equilibrium because net energy moves from a warmer to a cooler body – but it is not an energy equilibrium.

        Just to return to the quantum picture for a bit. Heat is a measure of the internal kinetic energy of a system. But radiative heat loss involves the emission of a photon. The molecule drops to a lower, more stable energy state, and the energy of the system as a whole is less by a photon’s worth.
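The visible-light absorption depths quoted above (50% by 1 m, 80% by 10 m, 97% by 100 m) cannot all be matched by a single exponential, since seawater attenuates different wavelengths at very different rates, but a crude two-band exponential reproduces them. The band split and attenuation coefficients below are fitted assumptions for illustration, not measured optical constants:

```python
import math

# Two-band decay model: a rapidly absorbed (red/NIR-like) band plus a
# weakly absorbed (blue/green-like) band. Parameters fitted by hand.
f_fast, k_fast = 0.753, 1.07   # fraction and attenuation (1/m) of the fast band
k_slow = 0.0211                 # attenuation (1/m) of the slow band

def transmitted(z_m):
    """Fraction of surface SW still present at depth z_m (meters)."""
    return f_fast * math.exp(-k_fast * z_m) + (1 - f_fast) * math.exp(-k_slow * z_m)
```

With these parameters the model transmits about 50% past 1 m, 20% past 10 m, and 3% past 100 m, matching the figures in the comment.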

      • Pekka – to me the daylight profile suggests the heat delivered to the skin layer by IR is heating water well below its depth of penetration.

        Also, that profile is replicated in every source I’ve been able to find. I’ve never seen a version for varying atmospheric conditions. Assuming a skin has formed, does that mean the profile exists regardless of atmospheric temperature?

      • JCH,
        Under those clear sky daylight conditions the net LW radiation is strongly negative, i.e. cooling. The absorption of solar SW is strong enough to heat the skin, when we define the skin as centimeters rather than micrometers thick. That picture had points at 1 mm and 1 m, nothing in between. I cannot tell where the maximum temperature is under those conditions, but my guess is a couple of centimeters below the skin.

        The thin skin is still heated from below as there are no other possibilities, but not from far below. Part of the SW penetrates tens of meters, but a large part is absorbed in the topmost one meter, while very little is absorbed in a thin skin of tens of micrometers.

      • JCH,

        That is why I think the whole back radiation thing is overly confusing and thus not very useful. In daylight, even though SW radiation dominates, there is significant ILR. Face it, the sun is much hotter than the Earth, so even though the distance is great its impact is net positive incoming LWR, but that is only in the daytime and most significant at noon.

        At night, the S-B law takes over. It takes a lot of rationalization to think that the laws of physics have to be reinterpreted to allow for reversal of heat flow. This discussion by TallBloke compares the Trenberth cartoon to the NASA cartoon, which I think is a lot more realistic.

      • Pekka – I believe the skin layer is usually defined in terms of 1 mm or less – often microns if one is interested in where almost all energy exchange with the atmosphere occurs, because IR is absorbed in that interval, emitted in that interval, and because evaporation occurs at the surface. I don’t disagree that if one looks at only net flux involving IR alone, it is upward from ocean to atmosphere, whereas if one looks at total flux (LW and SW), it is downward during the day and upward at night (when SW is absent). Since the outgoing flux is the response of the ocean to total incoming energy, I don’t think it provides a full picture to compare it with only one component of the incoming flux – the IR component.

        Those are convenient ways to look at net energy transfer. However, my main point was simpler. Expressed in W/m^2, downwelling IR contributes more to ocean heating than direct solar radiation. It has been argued that IR “can’t penetrate” below the skin layer and therefore contributes only (or very disproportionately) to evaporative cooling and other losses to the atmosphere. For this to be true, the skin layer where IR is absorbed would need to be warmer than the water immediately below it, but it is actually slightly cooler, and the rather shallow upward temperature gradient further below is what one expects from upward convection rather than disproportionate heating of the surface. In other words, the skin layer is not capable of losing heat faster than water below it would lose heat if put in the same location. Because of turbulent mixing, that lower level water has a temperature derived from both LW and SW absorption, and so skin layer heat loss can be considered quantitatively representative of heat loss from water whose energy comes from LW and SW absorption rather than the LW IR component alone. In the absence of turbulent mixing, it is likely that the skin layer would be considerably warmer and would be losing heat mainly from the energy it had directly absorbed rather than from the mixture that includes heat emanating from below.

      • It seems to me that it ought to be a pretty simple lab experiment to determine the IR absorption profile in a swimming pool full of water. Trying to infer it from temperature in a turbulent ocean seems like a foolish exercise. Even in the ocean, measuring IR radiation under the surface rather than temperature would settle this argument in short order.

      • Fred,
        I don’t think that our understanding differs in substance. It’s rather a question of the best way of using the facts in understanding a specific issue, like the energy balance of the skin layer.

        The main reason for looking at the net LW radiation is that the penetration is almost identical for ILR and OLR, while SW penetrates much deeper and releases only a tiny fraction in the thin skin of tens of micrometers. Therefore its role in the energy balance of the skin is different. LW directly affects the energy balance of the skin, while SW influences the skin indirectly by heating a thicker layer (from several centimeters to tens of meters) of the upper ocean.

        I agree that the numerical value of ILR exceeds SW, but ILR is absorbed totally in a layer which would cool without the net heating from below.

        Combining ILR and OLR works very well, because they influence exactly the same layer. For SW the deeper penetration has an important effect on the outcome. In calm waters the SW is certainly the main driver for the mixing that determines the temperature profile of the uppermost tens of meters.

      • ChE,
        The SST description page linked by JCH a couple of messages above tells us that 10-12 um IR penetrates approximately 20 um into ocean water. Microwave radiation penetrates about 1 mm, and visible light penetrates much deeper, losing most of its energy in a couple of meters but weakly heating a bit further down (10 m or so) into the ocean.
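Those penetration depths follow directly from the absorption coefficient of water in each band. A minimal sketch, where the coefficients are rough order-of-magnitude values for pure water assumed for illustration:

```python
# Absorption coefficients (1/m) for water in different bands; values are
# assumed order-of-magnitude estimates, not tabulated optical constants.
alpha = {
    "10um_IR": 8e4,            # thermal IR: absorbed within tens of micrometers
    "microwave": 1e3,          # ~ mm-scale penetration
    "visible_blue_green": 0.1, # clearest band: penetrates tens of meters
}

# 1/e penetration depth in meters is simply the reciprocal of alpha.
depth = {band: 1.0 / a for band, a in alpha.items()}
```

This gives roughly 10 um for thermal IR, 1 mm for microwaves, and 10 m for blue-green light, consistent with the figures Pekka quotes.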

      • Pekka, if that’s true, then the people who are claiming that IR doesn’t penetrate are right, and Fred’s theory is wrong, correct?

      • Also Pekka, it seems like there’s another piece of this. If you have a reasonably still ocean, and the IR warms a layer that’s millimeters or less, then the heat is as likely to be removed from the skin by evaporation as convection downward. It would be useful to know under which conditions evaporation dominates and under which conditions convection dominates. I think it’s safe to say that conduction, by itself, can’t compete with either convection or evaporation.

      • ChE,
        Concerning Fred’s comments I say only that they do not provide in my opinion the best approach to understanding the physics involved.

        It’s essential that solar SW heats a layer several meters thick. If there is a lot of mixing by wind-generated waves, that distributes this heat through the whole mixed layer. In the absence of such mixing, the layers below the skin warm until they get warm enough to induce mixing. Even a weak warming will do that in the absence of other mechanisms of heat transfer from those layers. This mixing may occur during the night when the skin cools.

        In any case this results again in essentially constant temperature in the uppermost ocean layers. The uppermost meter or two may warm faster during the day, as roughly half of the SW energy is absorbed in the top 2 meters, but this effect disappears during the night.

        Using the T&F&K (2009) average numbers the energy balance of the thin skin layer is

        – net energy loss of 57.4 W/m^2 through LW radiation (OLR 400.7, ILR 343.3)
        – net energy loss of 97.1 W/m^2 through evaporation
        – net energy loss of 12 W/m^2 as sensible heat to atmosphere
        – heat flow (convective mixing and conduction) from below: 166.5 W/m^2. The source of this heat is solar SW absorbed almost fully in the top 10 m.

        These numbers are worldwide and annual averages and may differ very much from those at a specific location at a specific moment.
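As a quick check, the three loss terms do balance the heat supplied from below (all values W/m^2, the T,F&K 2009 global averages quoted above):

```python
# Skin-layer energy budget from the comment (Trenberth, Fasullo & Kiehl 2009
# global annual averages, W/m^2).
olr, ilr = 400.7, 343.3
net_lw_loss = olr - ilr          # 57.4 W/m^2, LW always a net cooling term
evaporation_loss = 97.1          # latent heat loss to the atmosphere
sensible_loss = 12.0             # sensible heat loss to the atmosphere
heat_from_below = 166.5          # convective mixing + conduction, fed by solar SW

total_loss = net_lw_loss + evaporation_loss + sensible_loss
# total_loss equals heat_from_below, so the skin budget closes
```

The closure (57.4 + 97.1 + 12 = 166.5) is the point of Pekka's argument: the skin loses energy by all three mechanisms and must therefore be supplied from the solar-heated water below.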

        ChE – Downwelling IR is absorbed almost totally within the first few micrometers of ocean – i.e., within the skin layer. Those are the same few micrometers from which energy is lost upward. However, the absorbed IR energy is then distributed into the layers below by rapid mixing, due to turbulence, with contributions from convective mixing depending on circumstances. What is incompatible with a process by which most or almost all the absorbed energy is lost via evaporation and outgoing radiation, with little distributed into the ocean, is the observation that the skin temperature (where the IR is absorbed) is not hotter than the water below it. Using some of the figures that Pekka has cited as examples, a significant increase in absorption that added little or nothing to ocean heat would have to raise the skin temperature above that of the water below in order to increase outgoing heat loss by the indicated amounts; for a CO2 doubling, I estimate an increase in skin temperature (again assuming no downward heat distribution) of well over 2 deg C, which would reverse the observed differential between skin and lower layers. One can’t shed excess heat without warming up, and if none of the warming went downward, the surface must warm considerably. How much is lost by evaporation, as opposed to radiation (and slightly by conduction), depends on air temperature, relative humidity, wind speed, and other variables.

        In a totally calm ocean, it may be true that much energy absorbed in the skin layer would be lost from there, due to differential temperature increases in that layer, but the observed temperature profiles tell us that such is not the case in general.
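The "one can't shed excess heat without warming up" step can be made roughly quantitative with a Stefan-Boltzmann linearization. A minimal sketch, assuming a 288 K skin temperature and counting only the purely radiative part of the required warming (evaporative and conductive responses would change the number):

```python
# Linearized Stefan-Boltzmann: dF = 4 * sigma * T^3 * dT, so the skin must
# warm by dT = dF / (4 * sigma * T^3) to radiate away an extra dF W/m^2.
sigma = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
T = 288.0         # K, assumed skin temperature

dT_per_Wm2 = 1.0 / (4 * sigma * T**3)   # K of warming per extra W/m^2 shed
```

This comes to roughly 0.18 K per W/m^2, so if several extra W/m^2 of absorbed IR had to be shed radiatively from the skin alone, the skin would need to warm by order 1 K, reversing the observed cool-skin differential, which is the contradiction Fred is pointing to.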

        Pekka, those numbers are believable, but I would have expected evaporation to be higher and sensible heat to be lower. In any event, this is just a quibble, but conduction really doesn’t move heat in water, even if it’s very calm. Even calm seas will set up natural convection cells. The thermal conductivity of water is 0.58 W/(m·K), and if you calculate out the heat flux, it’s negligible compared to even weak convection.

        Water without natural convection isn’t a very good conductor of heat. Conduction does however, affect the natural convection parameters, i.e., it affects how the natural convection behaves.

      • Fred,

        I don’t like the following sentence from your text:

        “However, the absorbed IR energy is then distributed into the layers below by rapid mixing, due to turbulence, with contributions from convective mixing depending on circumstances. ”

        This is semantics, but to me it’s strange to discuss distributing absorbed energy into layers below, when the net energy balance from the skin to the atmosphere and space is negative. Formally you can talk about that, but then you also mix more “negative energy” at the same time and in exactly the same way. To me it’s much more natural to notice that the skin is losing energy and that the mixing (helped by conduction) is bringing that energy from below.

        The role of ILR is not to bring “new energy” to the skin but to reduce net heat loss by LW radiation. The atmosphere acts as an insulator for the radiative heat loss. Somebody compared it to a space blanket and it is indeed a rather good analogy.

      • Pekka – I realize that our perspectives differ more in semantics than content, but my comments were based on the following attempt to correct misconceptions frequently found on the Web. First, I believe it’s important to point out that downwelling IR contributes substantially to the heat content and temperature of the bulk ocean, and that claims that the IR can’t do this because the energy is all dissipated in OLR and evaporation are incompatible with both observations and theory. Those claims are ubiquitous on the Web and leave readers with a false impression of what happens. Second, when IR photons are absorbed in the skin layer, they increase the kinetic energy of the molecules there – a local heating effect. They do not impede the escape of IR photons from the skin layer, and so on a molecular level, they can be said to cause heating rather than a reduction in cooling. Third, during the day, the net flux is also one of heating -i.e., SW plus ILR exceeds OLR, and it is only at night that the net flux is outward. Fourth, turbulent mixing is known to be an important process in heat distribution in the very upper layers of the ocean. Given the very thin dimensions of the IR absorbing layer, even minimal turbulence will redistribute much of that energy into layers below, so that the surface does not heat disproportionately. Without turbulence, heat distribution and temperature profiles would be very different, and the effects of skin layer IR absorption and deeper SW absorption would be less well intertwined.

        Having said all that, I think you are absolutely right to state that the localized increase in skin layer energy absorption reduces the negative IR flux rather than creates a positive IR flux – i.e., OLR still exceeds ILR but to a lesser extent, and that can be considered reduced cooling. As long as readers understand what is happening at a more detailed level than net flux, I think your explanation offers the most accurate quantitative assessment.

        What would happen if downwelling IR absorption did little or nothing to add to bulk ocean heat content and temperature? In that case, rises in ILR would leave bulk ocean temperature unchanged, but the enormous energy that would not be directed downward would now be dissipated upward, with much of it resulting in increased evaporation. Current climate models predict rises in atmospheric humidity that are fairly well matched by observation, but if the atmosphere were forced to accept a much higher evaporation rate, water vapor feedback and climate sensitivity would probably be far higher than the current estimates, which are based on much of that absorbed energy being distributed into the ocean.

        It appears that Science of Doom has discussed some of this, including attempts at quantitation based on simple models that go well beyond the very general principles I’ve described.

      • Fred,
        One additional comment.

        Even in bright sun from a clear sky, the layer needed to reach a positive energy balance from absorbed and emitted LW and SW radiation is certainly thicker than the thin skin. I don’t know how thick, but my guess is that it would take 10 cm or more. Including evaporation and sensible heat losses, the layer gets even thicker. Thus the thin skin is always heated from below, if only from tens of centimeters below.

        When we discuss radiation in the atmosphere close to the surface, it appears much easier to understand the physics through the separate components of OLR and ILR. When we are looking at the physics of the ocean, I believe that we should start by calculating the net energy flows to and from different layers. I listed them in an earlier comment for the skin. For the layers below we have the positive contribution from SW, and convective heat transfer, which must remove most of the SW heating, since net warming of the oceans takes only a small fraction of the total heating.

        The balance of SW heating and net warming is evidently reached at a depth of a few tens of meters. (I.e. layers below that level warm as much on average as SW warming contributes to the layers below that level.)

      • While Dallas was correct to say a warmer atmosphere can warm a cooler ocean – this is not and cannot be the ruling condition. The oceans must lose energy to the atmosphere and the atmosphere to space just as it gains energy from the Sun – and these must be in what I call a dynamic imbalance with changing planetary warmth being the third term in the energy equation.

        All systems warmer than absolute zero emit energy, but for heat transfer between an on-average warmer ocean and an on-average cooler atmosphere, the net heat flow is and must be from the warmer to the cooler body.

        The radiant heat flux from the atmosphere to the ocean and the radiant heat flux from the ocean upward all occur in the top microns of the water column. What matters here is the net radiative flux, which is generally – as opposed to the condition of the warmer atmosphere – positive upward. Concentrating on ILR at the surface without considering OLR at the surface is not the holistic understanding that is needed to understand energy pathways – and is not simply semantics.

        The generally cooler condition of the ‘skin’ must be the result of energy dynamics – losing energy faster than it is replaced. Why that is so relates to the nature of the skin. Science of Doom suggests that turbulence ceases very near the free surface – such that conductance of the water in the very top microns of the surface and viscous drag of the atmosphere on the water dominate.

        I think that is an artificial construct remembering that we are discussing the oceans and that waves are always present and progress through a translation of rotational momentum. Individual particles in a wave describe a circular motion roughly the depth of the wave – and there is no physical reason to suggest that particles at the very surface are immune from viscous drag from the rotating cylinder of water. These molecules on the surface experience the same turbulent mixing in the rotational progress of the wave as particles just below – IMO.

        SW is absorbed on the order of 50% in the first 1 m, 80% in the top 10 m and 97% by 100 m. This warmer layer of water, some 100 m deep on average, floats on the super-cold waters of the abyssal deep – a fundamentally important system in global climate. Heat tends to move toward the surface through convection – the buoyancy of more energetic and therefore less dense molecules – and is mixed turbulently.

        The skin exists solely as a temperature differential as measured by airborne or satellite instruments. It has nothing to do, for instance, with surface tension – which is a result of the polar alignment of water molecules on a free surface. I don’t think it is long-lived, as it is constantly being mixed into the warmer water below – but then re-establishes as heat is again lost from the surface microns. A dynamic balance between quantum radiative energy emission and the Newtonian world of force and mass.
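        The absorption-with-depth figures quoted above can be roughly reproduced with a simple multi-band exponential decay. A minimal sketch: the band weights and attenuation coefficients below are illustrative values tuned to the quoted fractions, not measured optical properties of seawater.

```python
import math

# Hedged sketch: multi-band exponential (Beer-Lambert style) SW absorption.
# Band weights and attenuation coefficients (1/m) are illustrative values
# tuned to the comment's rough figures, not measured seawater optics.
WEIGHTS = (0.45, 0.33, 0.22)
K_PER_M = (3.0, 0.25, 0.022)

def fraction_absorbed(depth_m):
    """Cumulative fraction of surface shortwave absorbed above depth_m."""
    remaining = sum(w * math.exp(-k * depth_m) for w, k in zip(WEIGHTS, K_PER_M))
    return 1.0 - remaining

# Roughly ~50% by 1 m, ~80% by 10 m, ~97% by 100 m, as quoted.
```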

      • Fred Moolten 4/14/11 6:33 pm, week of 4/2/11 …

        Suppose you did a job for which you were paid 10 crisp new $100 bills. To do the job, you had to buy a can of paint, for which you put down one of those notes and received $96 in change. You claimed a business deduction against the $1,000 for the paint, presenting the receipt. The IRS said that from these transactions your income was $1,096.

        That may sound silly, and as a lesson for climate, it’s an obnoxious and utterly useless model by analogy. It’s like the pointless discussions here comparing Eugenics to Climatology. What would happen before all the dust settled is that you would rely on accounting definitions.

        Suppose I had two isolated containers of air at two different volumes and two different pressures. Suppose I connected them, but separated by a pair of diathermic walls, optically transparent over a band much wider than the shortwave and longwave bands of interest in climate, and that these walls were separated by a vacuum. This is highly hypothetical, but no worse than any other diathermic wall, nor the concept of a surface of an oblate-spheroid Earth, or that that surface has a global average temperature. The pair of walls blocks conduction and convection between the containers, but allows energy exchange by radiation. In the process the warmer volume will receive radiation from the cooler volume, but no heat in that direction. Like the can-of-paint transaction, the result is true by definition.

        Heat is defined as the flux of energy from a warmer body to a colder body due to thermal energy alone. It is a flux leading to thermal equilibrium. From that definition, temperature is defined by calibrating the thermometric properties of different things called thermometers. Radiant heat is the radiation, according to the Stefan-Boltzmann Law, that passes from the warmer body LESS the radiation from the colder to the warmer. See any edition of Zemansky, Heat and Thermodynamics. Radiant heat is proportional to the net radiation, whether in the shortwave or longwave bands. Heat is what warms and cools Earth’s surface.

        The idea that the back radiation warms the surface is rather jarring in thermodynamic terms, when it is the change returned from Earth spending its thermal energy to buy a warmer atmosphere. In a heat model, which is a lumped network of idealized nodes interconnected by branches in almost any configuration, thermal energy is the flow variable, temperature is the potential variable attached to each node, and complex impedances comprise the branches.

        Heat flow models can predict transient phenomena, and predict the system responses to variable forcings, like solar changes, and seasonal and diurnal effects, or to varying impedances. These models provide insight into the role of heat capacitance (thermal inertia in climatology vernacular), predicting, for example, that atmospheric thermal energy is trivial, verging on the negligible, compared to ocean thermal energy. Thermal capacitance, essential in heat flow models, is absent in IPCC’s radiative forcing model.

        Heat flow models do not split forward and back radiation, nor do they give any power to either component radiation alone to affect thermal energy.

        Science does not restrict how a system might be modeled. Forward and back radiation is OK; so would be dynamic equilibrium instead of thermodynamic equilibrium. But if the vocabulary is to be unique, science demands that novel words be made specific and unambiguous for the application. Karl Popper was quite wrong (again) when he said, “They say that this concept is meaningless (or that it is undefinable, which, incidentally, in my opinion does not matter, since definitions do not matter).” Popper, Objective Knowledge, etc., 1966. “It [does matter] what the meaning of is is.”

        You addressed some confusion recently about slow radiation. There would have been no confusion if the writer had just used the term radiant heat at the outset instead of radiation. Earth’s global average surface temperature is slow to respond to solar forcing because of the ocean, which acts like a tapped delay line. For example, in one configuration of the Solar Global Warming (SGW) model the ocean appears to have a major tap at 134 years and a minor one at 46 years. The skin depth of the radiation is irrelevant since, as you suggested subsequently, the turbulent surface layer mixes the warmed water into a relatively deep soup. That mixing occurs at a rate much faster than the rate of warming. It’s not the radiation that’s slowed, it’s the response to radiant heat.

        Science doesn’t deal in incontrovertible evidence. It deals with models, and facts relevant in the context of each model. The modeler is free to propose anything. The ultimate test has nothing to do with whether he published his model in a peer-reviewed journal, nor whether he accumulated a consensus in support of it. If his model has predictive power, the demonstration transcends all other considerations.
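        The definition of radiant heat quoted above – the Stefan-Boltzmann radiation from the warmer body less the radiation from the colder – can be sketched numerically. The skin and effective atmospheric temperatures below are illustrative assumptions, not measurements.

```python
# Radiant heat per Zemansky's definition: the Stefan-Boltzmann radiation
# from the warmer body LESS that from the colder, i.e. the net flux.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def net_radiant_heat(t_warm_k, t_cold_k, emissivity=1.0):
    """Net radiative flux (W/m^2), positive from the warmer to the colder body."""
    return emissivity * SIGMA * (t_warm_k ** 4 - t_cold_k ** 4)

# Illustrative (assumed) temperatures: ocean skin ~290 K exchanging with an
# effective atmospheric radiating temperature ~280 K.
q = net_radiant_heat(290.0, 280.0)  # ~52 W/m^2 net, upward
```

        Between two bodies at the same temperature the net flux is zero, which is the sense in which back radiation alone has no power to warm; only the net term is heat.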

  28. Joe Lalonde


    Science has never understood what efficiency is. We have many comments and claims about efficiency, but true efficiency has never been looked at. Build a machine and, if it works fine, slap on an efficiency label with no idea what it is.
    To truly understand the interactions of the sun with the planets, you will find that rotating planets are not truly inert objects. The sun contributes to the rotation through its massive magnetic field. Because a planet is round and has finite size, the field is stronger on one side and slightly weaker on the other across that distance. This also keeps the planets in line in the corridor that maintains the efficient flat plane of the solar system traveling through space. In essence the sun is slapping one side of our magnetic field like a spinning basketball on a finger. The planets too close suffer from this field being weak, yet the solar debris from the sun is greater, which adds friction.

    A simple example of why we do not understand efficiency and energy is the wind turbine. A 164-meter-diameter, 7-megawatt wind turbine sweeps an area of 21,124 square meters. Less the turbine blade area of 492 square meters, that leaves 20,632 square meters of wind NOT touching the blades. And this we deem efficient. This is the same problem with hydro-electric turbines. The only redeeming quality there is that hydro-electric turbines have the advantage of massive amounts of torque from water and gravity.
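    The swept-area arithmetic above can be checked directly; the 164 m diameter is from the comment, and the 492 m² blade figure is the commenter’s, not a manufacturer’s specification. A minimal sketch:

```python
import math

rotor_diameter_m = 164.0  # from the comment
blade_area_m2 = 492.0     # the commenter's blade-area figure (assumed)

swept_area_m2 = math.pi * (rotor_diameter_m / 2.0) ** 2
open_area_m2 = swept_area_m2 - blade_area_m2  # disc area not occupied by blade

# Note: a turbine extracts energy from the whole swept disc as the rotor
# turns, so low solidity (blade area / swept area) does not by itself
# imply low efficiency.
```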

  29. John Carpenter

    Our local paper had this article on the front page this morning:


    More scary scenarios being conjured up about sea level rise. We will soon be living next to bath water here near the Long Island Sound and can look forward to 2 ft of sea level rise, with 20 ft just around the bend….


  30. Dr. C.,

    It was Rep. Brian Baird’s final question to you during the so-called “Rational Discussion of Climate Change.”

    I’ve been trying to find a video of your testimony and follow-up questions, but so far no luck.

    I’m sure you can easily put your hands on it if you so desire.

    Dr. C., my intent is not to be “right” here. It’s a small point in the scheme of things, and as I’ve said many times I’ve all the respect in the world for you. But I think you could go further. If you do decide to take a look and find that you did indeed leave the impression that the c.g. emails do not reflect poorly on the science, that might be something worth thinking about, since you obviously do not actually think that. So then the question becomes what cautious impulse held you back.

    Even as I write that I hear that it might sound condescending, but it’s not how I mean it. And then of course, there’s always the chance I’m wrong. It does happen from time to time :>)

    • I don’t know where the testimony video is at this point and I don’t have time to listen to it. But if you find it, let me know. I do not recall this.

    • Pokerguy,

      It’s good to read an even-handed, fair post. Something to aspire to.

  31. David L. Hagen

    Alan Carlin published a peer-reviewed paper:
    A Multidisciplinary, Science-Based Approach to the Economics of Climate Change, International Journal of Environmental Research and Public Health, ISSN 1660-4601, Published: 1 April 2011

    He critically reviews the IPCC assumptions and models, finding them lacking.

    Carlin was fired from EPA for giving this politically incorrect evaluation rather than supporting EPA’s endangerment finding. See:
    Alan Carlin | July 9, 2009 Comments on Proposed EPA Endangerment Technical Support Document

    • David– I would caution you to not be so sure of why this guy was fired unless you have better information than I think you have.

      • his information right.

        Make sense?


      • [Somehow message got truncated]

        Rob Starkey

        Do you have better information than Dave on why Carlin left EPA?

        If so, what is it?

        If not, I would caution you to not be so sure of why this guy was fired unless you are sure that Dave did not get his information right.

        Make sense?


      • David L. Hagen

        Rob Starkey
        You are “correct”. Alan Carlin officially “retired” in late 2009. NCEE officially posted: Dr Alan Carlin, retired

        I infer that he retired after being put under insufferable professional conditions. Al McGartland, Office Director of EPA’s National Center for Environmental Economics, refused to submit Alan Carlin’s detailed climate science evaluation to EPA’s Office of Air and Radiation; forbade him to speak to anyone outside NCEE about it; and forbade him to do any further work on climate change. See: The EPA suppresses dissent and opinion, and apparently decides issues in advance of public comment

        Alan was muzzled. Others who tried to get the work group to evaluate his arguments ran into a brick wall. It is not that Alan’s comments were flawed. It is that the people who were in charge wanted him taken out of the process and his report “disappeared”. This was “politics” pure and simple.

        The EPA’s internal nightmare over global warming: Part 1

        “Dr. Carlin remains on the job and free to talk to the news media, and since the furor his comments on the finding have been posted on the E.P.A.’s Web site. Further, his supervisor, Al McGartland, also a career employee of the agency, received a reprimand in July for the way he had handled Dr. Carlin.”

        Behind the Furor Over a Climate Change Skeptic NYT Sept. 24, 2009

        . . .the principal current tangible adverse effect on me at EPA has been a continuing prohibition against working on climate change or even attending seminars on it. . . . the continuing prohibition against work on climate change does not make EPA a very attractive place for me to continue to work either.

        Thomas Fuller August 15, 2009

        Last year, Carlin went through all the proper channels in submitting a study to the EPA’s top leadership in which he raised serious questions about the credibility of scientific reports used to justify the agency’s decision to regulate greenhouse gases. Carlin’s study became public thanks to the Competitive Enterprise Institute. Carlin’s reward was to be publicly pilloried by President Obama’s EPA administrator, Lisa Jackson. His work was suppressed within the agency, and he was threatened with additional retaliation if he continued voicing his views. Rather than endure this bureaucratic muzzling, Carlin retired.

        Senate surrenders to the EPA Washington Examiner 06/11/10

    • Being fired from (or forced out of) a government agency for telling truth that doesn’t fit the required story line isn’t a negative.

  32. Up to you of course if you think it’s important enough to check. I’ve looked around but every link I’ve been able to find is no longer functional. I suppose it doesn’t even matter much at this point.

    Meanwhile, can’t tell you how grateful I am for your blog and your abundant personal courage.

    • Are you looking for Judith’s testimony? If so, C-Span 3, video archives, Judith Curry. I think she was on the 3rd panel.

  33. I’ll check that out. Thanks JCH…

  34. Dr. C.,
    Huh! Well, I found what I was looking for and your answer was actually much more nuanced than I’d remembered. So I owe you an apology for that…

    However, your answer went to trust, not the science. “It’s an issue of public trust,” you said with respect to the science and those behind the science. If you don’t trust the experts, so went your reply, then things like the IPCC assessment report will not carry the weight they deserve. I’m paraphrasing here. But I listened a couple of times, and the substance of your answer was that the facts and the data remain valid.

    “The data and the fundamental research is there,” you say at one point, as if the case for AGW is really settled. Perhaps that’s not what you meant to say, but it’s the clear impression you left.

    This really is different in substance from what you remember.

    • “The data and the fundamental research is there” sounds like something that I would say. This by itself does not connote settled science.

      • Judith Curry

        “The data and the fundamental research is there”

        I realize that this statement does not specifically indicate whether or not the “data and fundamental research” are conclusive, or even whether or not you, personally feel that they are, so it is de facto a neutral statement.

        But it conveys the impression that the “data and fundamental research” are conclusive and that you basically agree with them, even if this is not specifically stated or even meant to be implied.

        For example, one could ask whether or not you were including in “the data and the fundamental research” the studies based on physical observations from satellites such as Spencer + Braswell 2007 or Lindzen + Choi 2009?

        These physical observations show that the model estimates of climate sensitivity cited by IPCC, which in turn were based on interpretations of dicey paleo-climate data and theoretical deliberations, rather than “real life” empirical or physical “data and fundamental research”, gave greatly overstated sensitivity results and resulted in greatly exaggerated projections for the future.

        Since IPCC has not included these “data and fundamental research”, I would suggest a statement such as:

        “The data and the fundamental research are there, but these are still inconclusive to date, particularly in view of most recent physical observations”

        This would be closer to “telling the whole truth”.

        If questioned “why?”, one could have added (without getting into the “nitty-gritty”):

        “The IPCC conclusions and projections have not included more recent empirical findings, which conflict strongly with the IPCC estimates of climate sensitivity and raise great uncertainties regarding these estimates and the projections based on them.”

        Judith, I am not trying to “put words in your mouth”, but that is how I would have expressed it.


      • Thx max, but if you put this in the context of my previous testimony that was given about 30 minutes earlier, which was all about uncertainty, my statement has a different interpretation.

      • Thanks, Judith,

        I did go through the whole testimony and I agree (see my separate post below with my “take home” from the testimony).


  35. Sorry, should have left a link. It’s in the C-span archives as per JCH…


  36. Well that sounds a tad “legalistic.”

    In context it sure gives that impression, at least in my opinion. If you’re interested, and you seem to be, the link’s right there for you. It would take you all of two minutes.

  37. DeSmogBlog has an extensive database of individuals involved in the global warming denial industry. If you are not on it, you can recommend yourself at: http://www.desmogblog.com/contact_us

  38. Any comments on Doug Keenan’s article in the WSJ (4/5)? For full text, search Google for “How Scientific Is Climate Science?” and access the article using the Google link. He says that global temp is not AR1 and that warming is not statistically significant using better assumptions. Are Keenan’s ideas presented anywhere in a more scientific form?

    • Interannual variation has nothing to do with climate variation. His point that interannual changes basically don’t correlate beyond a year doesn’t account for decadal averages climbing steadily; that signal is masked by the short-term variability he limits himself to.

  39. IPCC: For the next two decades, a warming of about 0.2°C per decade is projected for a range of SRES emission scenarios. Even if the concentrations of all greenhouse gases and aerosols had been kept constant at year 2000 levels, a further warming of about 0.1°C per decade would be expected.


    Here is a graph that compares the above projections of the IPCC with the actual observed warming rate of 0.03 deg C per decade.
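    A decadal rate like the 0.03 °C/decade figure quoted above is typically obtained as a least-squares linear trend over a temperature anomaly series. A minimal sketch, using a synthetic series in place of any observed record (the seed, noise level and underlying slope are all assumptions):

```python
import numpy as np

# Synthetic monthly anomaly series over ten years; slope, noise level and
# seed are assumptions for illustration, not an observed record.
rng = np.random.default_rng(0)
years = 2001 + np.arange(120) / 12.0
anomaly = 0.003 * (years - 2001) + rng.normal(0.0, 0.02, years.size)

# Least-squares linear trend, converted to deg C per decade for comparison
# with the IPCC's ~0.2 C/decade projection.
slope_per_year = np.polyfit(years, anomaly, 1)[0]
trend_per_decade = 10.0 * slope_per_year
```

    Note that over a single decade the uncertainty on such a trend is large relative to the numbers being compared, which is part of why short records are contested.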

  40. Manacker wrote: “The data and the fundamental research is there”

    I realize that this statement does not specifically indicate whether or not the “data and fundamental research” are conclusive, or even whether or not you, personally feel that they are, so it is de facto a neutral statement.

    But it conveys the impression that the “data and fundamental research” are conclusive and that you basically agree with them, even if this is not specifically stated or even meant to be implied.”

    I’d hesitantly agree with this even out of context, but would also be ready to be proven wrong. In general usage, the statement does seem to connote a sense that the data and research are sufficient to draw conclusions…

    HOWEVER, once you listen to Dr. C.’s answer to Rep. Baird’s pointed question about whether the emails weakened the underlying science, it’s clear that she’s saying “no.” She says it several times, and in several different ways. Since this was recent testimony, this is not consistent with her current stated beliefs right here in this thread…

    My intent is not to play “gotcha.” but to wonder aloud, in a public forum, why she seems to have taken a step back here. Some of us on the skeptical side are frustrated by our feeling that she could be, and based on her avowed current position, perhaps should be doing more.

    There’s no doubt in my mind about the meaning of Dr. C’s answer in this video. It would be nice to get a response, but if none’s forthcoming I’ll let the matter drop. This is the last comment in an old thread and can easily slip through cracks…

    Again, here’s the link:

    right at the end, about 57 minutes in

    • At Tom Fuller’s once I had a brief argument with Richard Tol, who insisted that ClimateGate didn’t change the science. He’s right, if ‘the science’ is defined as the radiative effect of CO2 in Arrhenius’ laboratory. But my Daddy always insisted that all argument is a matter of definition, and ClimateGate has changed the definition of science, at least as popularly discussed.

      Perhaps the science hasn’t changed, but the world’s perception of it certainly has.

  41. Hey Kim,

    Agreed as to perception, but it goes further. The “trick” used to “hide the decline” calls into question the entire temperature record used in the infamous hockey stick graph. That graph galvanized the AGW community, and was the basis for their argument that the current warming is “unprecedented.”

    • Perception changed without a doubt! It is humorous how the normal cadre still defend results which were questioned before Climategate and explained by Climategate. Climate science is a new field which will have new refinements and discoveries. Nothing is written in stone other than a range of climate sensitivity that is too broad to be truly useful.

      A lot of times a five-year-old paper is antiquated, but still used to “prove” a point. New papers are still using not only old papers that have been revised by the original authors, but the same methods that have been questioned by “real” statisticians. The climate in-crowd complains about having to debunk the same things over and over but never addresses the real questions. It is funny really.

  42. “It is funny really.”

    Your attitude is healthier than mine. To me it’s maddening. Just absolutely maddening. I’m a 60-year-old guy, and thought I understood human nature.

    People will simply not admit when they’re wrong. And that’s on both sides. They deny, distort, and ignore. “Projection” is rampant. Paul Krugman’s calling skeptics “deniers,” for example, when he’s the one doing the denying.

    It gives me a deep feeling of hopelessness.

    • You are not too old to enjoy irony. You have to be careful though, “and thought I understood human nature. ” That’s like saying you understand women or Mike Mann saying he understands statistics. :)

  43. Pokerguy,

    Take heart, the edifice is crumbling and as temperatures cool (courtesy of the PDO which turned negative in 2007) we may get to focus on real problems with real solutions. The worst thing we can do is despair.

  44. Schrodinger's Cat

    I know that the GHG effect is largely accepted, but this reference challenges it in detail. I think it deserves serious consideration, and serious criticism if you don’t agree. (As opposed to non-scientific ridicule).

    • The article presents a lot of straw-man argumentation on pages 24–30, starting with the paragraphs “The greenhouse theory says…” and then presenting some totally misplaced or misunderstood statements, which are supposed to represent common claims. Some of these claims are intentional simplifications well known to be only that; some do not represent any real claims.

      There is evidently also much that is correct in the text, but nothing new in it.

      I cannot see any real point in the article.

  45. Thanks guys. And I think you’re right Clive. That is, I’ve no doubt at all that Joe Bastardi and company are right concerning the cold PDO and colder temps coming right around the corner. In fact they’re already here.

    But the warmists will never admit they’re wrong. It doesn’t matter in the short term, what the weather does, or what papers are published. Probably like you guys, I read all the blogs and see how the skeptics get all excited about this or that paper, some of them proclaiming the war is now over, but the reality is the other side simply does not pay attention.

    For a good example, the recent peer reviewed paper showing the oceans are not rising any faster than they have been over the last couple thousand years or whatever it is. And yet the newspaper articles and columns just keep coming. There was a major piece last Sunday in the Boston Globe about how the rising seas are dooming the coastline of NE.

    Ok, granted, it was one paper, but at least throw in an appropriate caveat. Fat chance. Meanwhile Paul Krugman and his ilk continue to call Mann’s “trick,” nothing but cosmetics. It’s worse than simple ignorance because it seems willful.

    These people are so invested in the AGW case, that there’s no turning back.

  46. Eric Ollivet

    Sea-Level Acceleration Based on U.S. Tide Gauges and Extensions of Previous Global-Gauge Analyses

    (Journal of Coastal Research – Still to print) http://www.jcronline.org/doi/pdf/10.2112/JCOASTRES-D-10-00157.1

    J.R. Houston: Director Emeritus, Eng. R&D Center – Corps of Engineers
    R.G. Dean: Professor Emeritus, Dept. of Civil & Coastal Engineering, University of Florida

    Abstract:
    Without sea-level acceleration, the 20th-century sea-level trend of 1.7 mm/y would produce a rise of only approximately 0.15 m from 2010 to 2100; therefore, sea-level acceleration is a critical component of projected sea-level rise. To determine this acceleration, we analyze monthly-averaged records for 57 U.S. tide gauges in the Permanent Service for Mean Sea Level (PSMSL) data base that have lengths of 60–156 years. Least-squares quadratic analysis of each of the 57 records are performed to quantify accelerations, and 25 gauge records having data spanning from 1930 to 2010 are analyzed. In both cases we obtain small average sea-level decelerations. To compare these results with worldwide data, we extend the analysis of Douglas (1992) by an additional 25 years and analyze revised data of Church and White (2006) from 1930 to 2007 and also obtain small sea-level decelerations similar to those we obtain from U.S. gauge records.

    Conclusion:
    Our analyses do not indicate acceleration in sea level in U.S. tide gauge records during the 20th century. Instead, for each time period we consider, the records show small decelerations that are consistent with a number of earlier studies of worldwide-gauge records. The decelerations that we obtain are opposite in sign and one to two orders of magnitude less than the +0.07 to +0.28 mm/y2 accelerations that are required to reach sea levels predicted for 2100 by Vermeer and Rahmstorf (2009), Jevrejeva, Moore, and Grinsted (2010), and Grinsted, Moore, and Jevrejeva (2010). Bindoff et al. (2007) note an increase in worldwide temperature from 1906 to 2005 of 0.74C.
    It is essential that investigations continue to address why this worldwide-temperature increase has not produced acceleration of global sea level over the past 100 years, and indeed why global sea level has possibly decelerated for at least the last 80 years.

    Personal comment:
    1) Bad times for alarmism, and a good example of how over-alarmist communication can generate distrust towards climate science and climate scientists, when catastrophic forecasts (up to a few meters according to some IPCC / AGW proponents) are formally disproved by observations.

    2) Climate science should rely more on observations than on sophisticated computerized models that have been formally proved invalid.
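    The paper’s method as described in the abstract – a least-squares quadratic fit, with acceleration read off as twice the quadratic coefficient – can be sketched as follows. The gauge record here is synthetic (an assumed 1.7 mm/y trend plus noise), standing in for PSMSL data, so the fitted acceleration should come out near zero.

```python
import numpy as np

def trend_and_acceleration(years, level_mm):
    """Least-squares quadratic fit of a gauge record.

    Returns (mean trend in mm/y, acceleration in mm/y^2); the acceleration
    is twice the quadratic coefficient, as in the paper's method.
    """
    t = years - years.mean()  # centre time so coefficients are interpretable
    a, b, _c = np.polyfit(t, level_mm, 2)
    return b, 2.0 * a

# Synthetic 81-year record: pure 1.7 mm/y trend plus noise (assumed values).
rng = np.random.default_rng(1)
years = np.arange(1930, 2011, dtype=float)
gauge_mm = 1.7 * (years - 1930) + rng.normal(0.0, 2.0, years.size)

trend, accel = trend_and_acceleration(years, gauge_mm)
```

    On a real record the sign and size of `accel` would be compared against the +0.07 to +0.28 mm/y² range the abstract says is required to reach the projected 2100 levels.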

  47. Schrodinger's Cat

    Pekka, you contest the claims about the greenhouse effect (without specific reasons) and you ignore the argument that conventional physics explains the observed global average temperature without the need to resort to a GHG theory.

    • I contested his claims on “The greenhouse theory”. “The greenhouse theory” as described by Postma is pure straw man.

      I have not read through the whole article. I have, however, noticed that he presents much of the conventional theory, but for some reason implies that it would not be the same theory that others are also describing.

      The introduction starts to lead in this misleading direction. It is not correct to say that “Greenhouse Effect is indeed a theory”, while it is correct to say that there are theoretical analyses of the greenhouse effect based on the theory of physics and on knowledge of the atmosphere.

      He is also only partially correct in his presentation of the adiabatic lapse rate and its role in the theory. This issue has been discussed many times in other threads here, and I’ll not go deeper into this problem again.

      In other points as well it’s clear that Postma understands much of the physics correctly, but errs on some important details.

      After reading a fair part of the text, I concluded that reading more of it would be just a waste of effort.

  48. Eric O. writes, re the no-sea-level-rise paper:

    “Personal comment:
    1) Bad times for alarmism, and a good example of how overly alarmist communication can generate distrust of climate science and climate scientists, when catastrophic forecasts (up to a few meters according to some IPCC / AGW proponents) are formally contradicted by observations.”

    It *should* be a bad time for alarmists, but how to break through that shell? It’s as if the two groups are living in separate realities. NPR’s Tom Ashbrook had his usual alarmist type guest opining this morning that the whole skeptical case is based on a couple of insignificant emails blown up way out of proportion by a bunch of “Republicans.”

    Now Tom Ashbrook is one smart fella. A true polymath. And from what I can tell, he’s genuinely well-intended. And yet he has no clue. How could he? I doubt he’s ever even spoken with an educated skeptic, much less had one as a guest. And it’s not as if this stuff is difficult to find out. How long would it take, say, Paul Krugman to educate himself on what “hide the decline” really means? Half an hour?

    Sorry Dr. Curry. I said I’d let the matter drop about your testimony, but my frustration boils over. You’re one of the few people both sides halfway listen to. It’s only when credentialed people of principle and courage like yourself stand up and tell the truth as they see it that progress can be made.

    All just my opinion naturally.

  49. Schrodinger's Cat

    Thank you for replying and making your objections more clear. As a simple seeker of the truth I still have gaps in my understanding and it is probably my lack of expertise that drives my desire to understand the detail of where Postma has got it wrong and where climate scientists have got it right.

    • It’s hopeless to go through every lengthy paper that claims to have a more correct description than the standard theoretical understanding of the greenhouse effect.

      I mentioned one point, which is the way the adiabatic lapse rate is introduced and used. It’s possible that Postma’s problems start here, but without going through the rest of the paper carefully, it’s not possible to tell for sure.

      As is common to many similar papers, his paper contains numerous strange parts. Some of them are certainly just empty sidetracks and ultimately irrelevant, but some of them are certainly behind the wrong conclusions. Usually papers of this kind don’t contain enough information for making a solid judgment on the source of the error, as they are likely to skip some essential steps in the argumentation.

      As discussed (and also contested, but that cannot be helped) in many threads here, the basic physics is well understood and on a solid basis. That is enough to explain what the greenhouse effect is and to give a rough order of magnitude, but certainly not enough for estimating the actual strength of the effect in the real atmosphere with all its complications.

      This well-understood, solid theory is enough to exclude with certainty conclusions like “there is no such thing as a radiative Theory of the Greenhouse Effect”, as long as we accept that the theory is incomplete and that the strength of the effect depends on factors beyond the coverage of any simple theory. By these I mean effects like changes in cloudiness and the resulting feedbacks.

      Postma has obviously missed some important points in how radiative effects enter the theory and how thermodynamics is in error without the inclusion of these effects. (This suggests that his error is in the handling of the role of the adiabatic lapse rate.)

  50. After listening to the youtube several times I came away with:

    Rep. Brian Baird was trying to get a group of experts on various climate-related subjects, including Dr. Judith Curry, to give him the statement that the “consensus” science is OK and sufficiently robust to tell us we should act now.

    He essentially got what he wanted from the other invited experts.

    One (William Geer) discussed a particular species of trout, which he stated is living at the limit of its “upper range of thermal tolerance” and will not have time to adapt to temperatures that might occur, because the pace of change is too rapid. Geer closed his testimony with an appeal for action (and $1 to 3 billion per year) invoking the “quality of living for our children and grandchildren” with the closing remark:

    Action on climate is justified not because science is certain, but precisely because it is not.

    In contrast to the others, Judith did not give Baird what he wanted in her testimony.

    She started off saying

    Anthropogenic climate change is a theory whose basic mechanism is well understood, but whose magnitude is highly uncertain.

    She spoke of the gaps in what is known and not known about natural climate variability, and about the feedback processes.

    These are the key issues that divide the supporters of the “mainstream consensus view” that AGW poses a serious threat from the “lukewarmers” who are rationally skeptical of this premise (a point that Baird probably missed).

    A few sentences later Judith said:

    The threat from global climate change does not seem to be an existential one on the time scale of the 21st century even in its most alarming incarnation.

    [This is definitely not in line with the “mainstream consensus” or IPCC.]


    It seems more important that robust policy responses be formulated rather than to respond urgently with policies that may fail to address the problem and whose unintended consequences have not been adequately explored.

    [Also not in line with the “mainstream consensus” and IPCC.]

    Judith called for a greater need for scientists to explore uncertainties and for improved and transparent historical and paleo-climate data records. She stated that citizen science groups in the blogosphere bring much needed scrutiny, particularly with regard to paleo-climate data [a reference to the “hockey stick saga”?].

    She closed her testimony with the remark:

    Anthropogenic climate change on time scales of decades is arguably less important in driving vulnerability than increasing population, land use practices and ecosystem degradation.

    This was all definitely not what Baird wanted to hear.

    Later in the Q/A part JC stated that population changes and how we engineer our environment are a big part of the problem.

    It appeared that Baird was hoping to get more definitive answers from her about the imminent dangers of AGW, but he was unsuccessful.

    JC pointed out that there are “no silver bullet solutions”, that climate change is only a part of the problem, that there will be “winners and losers” from warming, citing northern China, which would benefit from a warmer climate and more water that comes with it. She also stated that China is poisoning its soil, water and air (not with CO2, but with real pollutants), and that the competition for scarcer petroleum reserves will increase the need for alternative energy solutions.

    Baird then tried to invoke the “conspiracy theory”, referring to the blogosphere as polarizing and full of “snarky” remarks and “ad hominem attacks” rather than a venue for the rational discussion of the science.

    JC countered that there is a legitimate growing public interest in and concern over climate change and climate change policies. She agreed that some of the more extreme technical blogs had “snarky” posters and there were “ad homs”, but said that the “lukewarmer” sites tended to promote an open and civilized way of debating technical and scientific issues and to encourage dialogue and engagement.

    To Baird’s last question about whether Climategate had basically obliterated all the legitimate data JC gave her (now famous) quote (which has been discussed here) but added that the public has seen climate scientists as “arrogant” and that Climategate resulted in a public distrust of climate science. She added that the IPCC reports include a “heavy dose of judgment” rather than hard data and that it is also important how the message is communicated.

    So Baird got a confirmation that Climategate did not “obliterate” all the legitimate data, but I would say (in light of the earlier testimony and the way JC answered his other questions) he did not get an endorsement from climate science (as represented by JC) for urgent action (as he did from the other witnesses, whose expertise was not directly related to climate science).

    Baird (departing chairman of the committee) gave some closing remarks about decision making with “imperfect and uncertain data” but “the best we can do”, evoking his two sons and stating that he hopes we will

    weigh the consequences of inaction or inaccurate action against the consequences of acting in a responsible, reasonable and rational way for the broader good of not only our society but the globe itself.

    I’d say this was the usual political dog and pony show, with the “dompteur” (animal trainer) trying hard to get “scientific” confirmation that “urgent action” is needed, essentially getting this support from everyone except JC, and then giving his political farewell speech.

    As a rational skeptic, I might have worded things differently than JC did on the validity of the data (earlier post), but I think she was about as objective as one could be in her position, not giving Baird the desired “green light” for “urgent action”, but rather calling for better and more transparent data and definition of uncertainties regarding natural variability and feedback processes.

    Just my impression and opinion.


    • thx max, your summary is very much appreciated.

    • Max: Yes, excellent summary. Thanks.

      I’m impatient with videos when I’m trying to get the gist and the C-SPAN player doesn’t seem to allow one to jump ahead.

      I find myself in agreement with JC’s comments.

  51. Fred and Pekka, on the subject of Downwelling Infrared.

    Y’all keep running out of reply thingies. I think I said that downwelling was a confusing term? Imagine that.

    Downwelling or incoming LWR is measured with a pyrgeometer. It has a trick filter to block SWR and is tuned for LWR. The maximum incoming LWR is generally measured at noon. At noon on a cloudy day, the percentage of incoming LWR is much higher than on a clear-sky day. Hmm?

    At night, the pyrgeometer says there is incoming LWR. Is there really? The guts of a pyrgeometer is a regular old pyrometer that measures temperature. Do clouds have a temperature? Does a cloud temperature of say -55 C mean there is incoming LWR? There is of course no heat flow from colder to warmer. So why would we assume that an instrument designed to measure temperature via infrared radiation can magically be used to define a downwelling of energy at night that violates the laws of thermodynamics?

    The daytime pyrgeometer reading of incoming LWR does indeed indicate incoming (warming) LWR if it is greater than the nominal value (temperature) of the sky being measured; at night it indicates a temperature that is related to the S-B law’s emissivity. I wonder how many people are confused by the terms Incoming, Downwelling or Back radiation of infrared radiation? Could that confusion be the reason NASA’s radiation balance cartoon does not include downwelling radiation?

    • Well, Dallas, you’ve certainly got me confused, or at least unsure of what you are trying to say. Clouds as well as air with greenhouse gas molecules emit IR in all directions including downward, as a function of temperature. We can refer to this as “downwelling IR”, and it is substantial.

      Is there any other source of downwelling IR? Solar radiation includes a fairly small IR component, mostly in the “near IR” wavelengths that are shorter than IR emitted by greenhouse gases. Some of this reaches the surface, whereas another fraction is absorbed in the atmosphere, mainly by water vapor and clouds, although a little by aerosols as well, and is part of what I described in the first paragraph.

      Are all these the same as “back radiation”? Mostly yes, because except for the solar component, it all comes from energy emitted by the surface, absorbed in the atmosphere, and then redirected downward. The solar component is of course not “back radiated”. It is also absent from nighttime IR, which is all “back radiated”.

      The NASA cartoon doesn’t give a figure for the downwelling IR but the KT diagram does.
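      To put rough numbers on the cloud question raised above: a pyrgeometer reading is consistent with thermal emission at the emitter's temperature. A quick Python illustration (textbook Stefan-Boltzmann blackbody values, not an actual instrument calibration):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_flux(t_kelvin):
    """Hemispheric blackbody emission, W/m^2, from the Stefan-Boltzmann law."""
    return SIGMA * t_kelvin ** 4

def effective_sky_temp(flux_w_m2):
    """Temperature a blackbody would need in order to emit the measured flux."""
    return (flux_w_m2 / SIGMA) ** 0.25

# Even a -55 C (218 K) cloud deck emits substantial IR downward:
print(f"{blackbody_flux(218.15):.0f} W/m^2")   # ~128 W/m^2
# A low cloud base at -5 C emits far more, hence higher ILR on cloudy nights:
print(f"{blackbody_flux(268.15):.0f} W/m^2")   # ~293 W/m^2
```

      Real clouds are close to black in the thermal IR, so these idealized numbers are in the right ballpark for why cloudy nights show much larger downwelling readings than clear ones.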

      • I guess to understand it you would have to see the specifications for the test equipment and samples of the actual testing. That used to be my thing, testing stuff. The maximum incoming IR or LWR is measured at noon in the summer for any area on Earth. The Russian guys who invented the pyrgeometer developed the “so-called pyrgeometer equations” (that is the not-too-glowing term used in the encyclopedia) to measure ILWR.


        So while everything has a temperature greater than 0 K, that is not an indication of a flow of heat to an object with a higher temperature. The majority of the ILWR on the old KT diagram was from the sun. Confusingly, the pyrgeometers or balance meters have a reading at night that was called ILWR. This is a very fancy way of determining whether air temperature warms the Earth, melts snow or heats the ocean. I am inclined to agree that warm air is indeed warm.

        So when I said before that ILWR is indeed warming if the measured temperature of the sky is greater than the nominal temperature of that spot in the sky, that is what is being measured: is the sky warmer?

        So if you feel that warmer than normal air temperature due to cloud cover is caused by incoming LWR at night, hey, to each his own.

        So in my opinion, all of the terms for ILWR are confusing. What is wrong with saying that the air is warmer under some conditions when there is cloud cover, and that if the air temperature at the surface is warmer than the surface, it warms the surface?

        So all the incoming discussion boils down to is an odd interpretation of measuring temperature differential. Do note that the 340 W/m^2 is for a 24 hour period. Most of that total is recorded during daylight hours.

        I will try to find a few papers not behind paywalls, but this one very dryly discusses the ILWR issues.


        Trust me it is a snoozer. Note the field testing timing.

        But yes, warmer air does warm the oceans.

      • Dallas,
        The pyrgeometer measures the intensity of all IR reaching the device. By construction that means radiation coming from above. The window acts as a filter that prevents visible light, UV and also the near IR of wavelength less than 4.5 um from reaching the detector. Thus it’s sensitive precisely to those wavelengths that dominate the thermal radiation originating from atmosphere. The solar radiation extends to these wavelengths, but less than 1% of the energy of solar radiation is at wavelengths above 4.5 um and most of that is absorbed in the atmosphere. Perhaps 1% of the IR radiation measured by a good pyrgeometer in sunlight is directly from the sun.

        On the basis of the above discussion essentially all radiation observed by the pyrgeometer comes from the atmosphere, where it originates from clouds, aerosols and from greenhouse gases such as CO2. All these radiate continuously day and night with an intensity determined by the local temperature at the point of origin.

        Much of the atmospheric radiation is absorbed at another point in the atmosphere. From the earth’s surface the atmosphere is transparent at only a few IR wavelengths. It’s very opaque near 15 um. Therefore the 15 um radiation always corresponds to the temperature of the very low atmosphere (only a few tens of meters from the detector). At other wavelengths more distant sources like clouds dominate, and they radiate at an intensity set by the lower temperature that exists at their altitude.

        From a clear sky at the most transparent wavelengths we get practically no radiation. This is the reason that ILR is stronger under a cloudy sky than under a clear sky. The effect is particularly strong during clear nights at high altitude. Under those conditions we may get little ILR other than what we get from CO2 near the detector.
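        The “less than 1% of solar energy above 4.5 um” figure mentioned above can be checked by integrating Planck's law. A crude numerical sketch (pure NumPy, simple rectangle sum on a uniform grid; the uniform spacing cancels in the ratio):

```python
import numpy as np

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23   # Planck, speed of light, Boltzmann

def planck(lam_m, t_kelvin):
    """Blackbody spectral radiance B(lambda, T) from Planck's law."""
    return (2 * H * C**2 / lam_m**5) / np.expm1(H * C / (lam_m * KB * t_kelvin))

# 0.1-100 um covers essentially all of a 5778 K (solar photosphere) spectrum.
lam = np.linspace(0.1e-6, 100e-6, 200_000)
b_sun = planck(lam, 5778.0)
frac_beyond = b_sun[lam > 4.5e-6].sum() / b_sun.sum()
print(f"solar energy beyond 4.5 um: {frac_beyond:.1%}")   # under 1%
```

        The result comes out a bit under 1%, consistent with the claim that a filtered pyrgeometer sees almost no direct solar contribution.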

      • Hi Pekka,

        Since I am not a researcher, I cannot get access to the daily data, only the issues they are having.

        One of the issues is daytime FIR. While the sun only radiates a small percentage of FIR, the SWR interaction with clouds and water vapor (subvisible cirrus) is producing significant daytime incoming FIR. The net radiation instruments also have, or had, errors of 25 W/m^2 up to 55 W/m^2. So how much is real and how much is instrumentation, I don’t know. So while one may expect only about 1% daytime incoming FIR, the data show significantly more even after correcting the errors.

        There is also the issue of the sign of the “Incoming FIR”. It is negative at night, showing net outgoing as expected, and in most cases less negative in the day and at times positive, based on SIRS BSRN monthly comparisons. Still, there is +/- 20 W/m^2 accuracy.

        There is also no standardization of net radiation instrumentation that I could find. WHOI is using a different setup for the LWR. Instead of a tracking shade, they are using flat black sensors with a different algorithm, including a constant albedo.

        So in my opinion, the jury is still out on the utility of “incoming LWR” as a term or a measurement. Though I do agree that in a truly clear sky and a dry atmosphere, daytime LWR is minimal, as the desert stations in BSRN show.

      • Dallas,
        This is a point that I have not discussed, but it’s not in contradiction with what I have written. I have considered all radiation from the atmosphere as one whole, and that includes also the effects that the sun has through heating the atmosphere by SW radiation. These effects make estimating daytime ILR more difficult, but I have not tried to make any estimates on that level.

        You may have misunderstood me. I do not say that daytime ILR is minimal. It certainly is not minimal, but always very large, larger than nighttime ILR and one of the largest components in the energy balance of the skin layer. It is, however, almost always significantly smaller than OLR from the skin, the exception is discussed below. Under clear sky and sun high up in the sky the SW radiation reaching the surface is likely to exceed ILR, but SW is not absorbed strongly by the skin but penetrates deeper in the ocean as discussed in a recent message above also by Chief Hydrologist.

        Combining all possible effects there may be exceptional circumstances where the atmosphere heats the skin, but this is marginal at best. Evaporation cools the surface as long as the dew point of the air is below the skin temperature of the ocean. The skin radiates LW IR essentially as a black body, but the lowest layers of the atmosphere are usually not fully opaque to all wavelengths. Consequently I can imagine only a very local situation, where the air heats the water. That is when very moist foggy air is blown by light breeze from warmer land area to cold sea. Then we don’t have cooling evaporation but warming condensation, and the LW radiation balance may also be on the warming side.

      • “Under clear sky and sun high up in the sky the SW radiation reaching the surface is likely to exceed ILR, but SW is not absorbed strongly by the skin but penetrates deeper in the ocean as discussed in a recent message above also by Chief Hydrologist.”

        I try to relate most of this to my personal observations. Fishing, I am concerned with the surface temperature and the thermoclines. While the SW penetrates deeper, the lower end, red, is absorbed by 30 to 40 feet. It is not uncommon to have a first thermocline at that depth. So by my unscientific observations, most warming is in the first 10 meters. That appears to be much more significant than the thin skin layer. As mentioned, that calm skin layer is not a very normal occurrence, or I would not have been sitting at the dock the past week with the winds over 20 knots. In this area, mid-June through mid-August there are calm waters, so there is that ideal skin layer (between hurricanes of course). So as a matter of scale, my observations indicate that the skin layer impact is minuscule relative to the low end of SW in the first 10 meters and other wavelengths that seem to create the second thermocline at roughly 50 meters, that is marlin conditions here, over that second thermocline. Also blackfin tuna get stupid then which is a tasty treat.

        So the reason I feel that ILR is a bit confusing is that the scale of the warming is much more evident via SWR. And these are observations from 20 to 25 degrees North latitude.

        A major part of the confusion and much of the disagreement in the discussion is not related to the basic facts, but to the interpretation of the word “warming” and to the order in which the various components of the energy balance are taken into account. All this applies in particular to the ILR component.

        ILR is a large component and without it the surface would reach rapidly a very low temperature. ILR is warming at least in the sense that it counteracts cooling. In the language of many people counteracting cooling is warming, but for some others it’s not. Still the physical effect is exactly the same.

        In the case of the oceans, ILR contributes to the skin, which is the only part of the ocean that is losing energy. The skin releases to the atmosphere (and space) almost all the energy that is absorbed from SW in the layers below it (the remaining small fraction heats the ocean). As the skin is losing energy to the atmosphere, it’s again a semantic issue whether we wish to say that ILR is warming it, even when we admit that a lot of ILR is bringing energy to the skin.

      • “ILR is a large component and without it the surface would reach rapidly a very low temperature. ILR is warming at least in the sense that it counteracts cooling. ”

        That statement is exactly why I think ILR is confusing. While the surface is where cooling of the ocean occurs, mainly via LR, the flow of heat from the ocean to that surface layer does not have an LR component of any significance. Rapid is a relative term; the huge thermal mass and less-than-ideal heat conduction below the surface are the limiting factors for ocean cooling. Here in the Keys, the water temperature drops much more slowly than it rises. The drop is mainly due to lower air temperatures and increased outgoing LR. So while warmer air temperatures reduce the OLR and, in very few cases, actually provide some warming, the majority of the warming is SWR.

        So thinking of less cooling as warming muddles the understanding of the processes that create the majority of the warming and have regulating impacts on the cooling; the heat has to get to the surface, right? While the 10 and 30 meter thermoclines are small relative to the big thermocline at 100 meters, they are good examples of how poorly temperature layers in the ocean mix.

        I don’t know if that makes sense, but it is kinda like not seeing the forest because of “A” tree, in my opinion.

    • Solar radiation is about 40% IR – so max at noon on a clear sky day sounds about right – http://en.wikipedia.org/wiki/File:Solar_Spectrum.png

      All warmth comes from the Sun. The atmosphere and surface are warmed and re-radiate in the infrared. They are not warmed nearly enough to re-emit in the visible spectrum – although quite a large proportion of visible solar energy is reflected from cloud and surfaces.

      IR is re-radiated from the surface and quite a sizable proportion is captured by greenhouse gases, water vapour and cloud (which is simply a concentration of water), and thus the atmosphere warms a little. Increase greenhouse gases and it warms a little more. Any warm object will re-radiate energy in all directions. Back radiation is not a physical concept – radiation from a warming atmosphere doesn’t just happen downward.

      It is not even a useful concept. Far better to think in terms of top-of-atmosphere energy balances, where energy in less energy out is equal to the change in heat storage in the oceans and atmosphere. You seem like you might be able to cope with a 1st order differential equation?

      Ein/s – Eout/s = d(GES)/dt – where GES is the global energy storage.

      Greenhouse gases capture a proportion of the outgoing IR – thus reducing energy out – and the planet must warm, from the 1st law of thermodynamics. If the whole planet warms, there will be an increase in IR emissions to space.

      Are back radiation, latent heat or surface IR up emissions even quantifiable within any hopelessly inadequate limit of accuracy? I think not.
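      The 1st-order balance above can be sketched numerically. A toy forward-Euler integration with made-up round numbers (a ~50 m ocean mixed-layer heat capacity and a linearized outgoing flux Eout = A + B*T; nothing here is tuned to the real Earth):

```python
# Toy integration of Ein - Eout = d(GES)/dt, treating the planet as one
# heat reservoir. All numbers are illustrative round values, not data.
C_HEAT = 2.0e8        # J m^-2 K^-1, roughly 50 m of ocean mixed layer
E_IN = 240.0          # W m^-2 absorbed shortwave
A, B = 240.0, 2.0     # linearized Eout = A + B*T, T = anomaly in K
FORCING = 4.0         # W m^-2 step, roughly a CO2-doubling for scale

dt = 86400.0          # one-day steps
T = 0.0
for _ in range(365 * 200):                     # run 200 years, well past
    T += (E_IN + FORCING - (A + B * T)) * dt / C_HEAT   # the ~3 yr e-folding
print(round(T, 2))    # anomaly settles at FORCING / B = 2.0 K
```

      The point of the sketch is the structure, not the numbers: a reduction in energy out forces d(GES)/dt positive until the warmer planet emits enough extra IR to close the balance.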

      • I have a longer post that disappeared into the spam box – but the atmosphere does not warm the oceans. Quantum IR processes all happen in the top microns – which, because net IR is usually positive upwards, are usually cooler than the water beneath. You can’t mix cooler water with warmer and get net heating. Solar SW warms the oceans, and an increase in greenhouse gases reduces the energy loss – therefore the oceans (and atmosphere) remain warmer.

      • Actually Chief, if the air temperature is warmer than the water there will be a little warming from the air. Why that is harped on I have no clue. The net effect is that warmer air reduces heat loss and that there is ILWR during the day that warms the surface. Totally useless in my opinion, to use back radiation to re-describe something we already know.

      • We will need to disagree. Because it is all quantum IR radiative interaction between the oceans and atmosphere – it all happens in the top microns of the ocean. As the net IR flux is positive upwards – the ocean is cooling from the top microns almost always. All the ocean warming happens in the SW.

        I don’t know why either – it makes no difference to a cooler or warmer ocean – but I like to get the physics clear.

      • Actually I agree with you. The thermal mass of air is so much lower than that of the ocean that the warming of the ocean by the air is insignificant. Technically, there is a small amount of warming. Conversely, with much colder air over the water the rate of cooling of the water increases significantly.

        I hope you don’t think I disagree with you because a 10 degree warmer air front may warm the surface temperature of Florida Bay, where I fish, by a fraction of a degree. :)

      • If you take air that is cooler than the underlying ocean (which is typical), and add CO2 to it without changing its temperature, that air, which is still cooler than the ocean, will now cause the ocean temperature to rise. Whether you want to call that “warming the ocean” or something else is a matter of semantics. You can do the same thing by adding water vapor or clouds.

        In Judith Curry’s post on radiative transfer models, I believe, she cited links and references to ARM measurements of IR downwelling radiation (or “back radiation”), indicating that this can be measured with reasonably good precision. The latter term itself is widely used in geophysics, although not everyone likes it.

      • Did you read that link I posted? Rather, skim it; you would be asleep if you read it. There are quite a few recent papers about improving the ILWR algorithms. The whole point is that the concept of ILWR is new and a bit controversial in most circles. NASA dropped it, and don’t be surprised if more drop it. If Trenberth is missing heat, it is probably because of screwy ILWR numbers that are all over the place.

        Adding CO2 will reduce emissivity and increase temperature. That is not new or all that controversial. By how much is the question.

      • Dallas – Adding CO2 increases emissivity rather than reducing it.

        I skimmed the article you cited. As far as I can tell, it appears compatible with prevailing understanding of downwelling IR.

        I’m not sure what you are referring to as “controversial”. Some of the algorithms may be, but the importance of “back radiation” from CO2, water vapor, and clouds in adding heat content to the oceans and raising their temperature is well established and not controversial in the scientific literature.

      • Dallas – An apology about the tone of my comment above. It was too abrupt, as though I were dismissing your ability to discuss the subject, which is not the case.

        Technically, emissivity and absorptivity are two sides of the same coin (via Kirchhoff’s Law). When CO2 is added, the atmosphere can both absorb and emit more IR. You were referring to the increased absorptive capacity, and I was just making a technical point about the word emissivity that doesn’t contradict your understanding of increased absorption.

      • Fred,

        Do you remember when the H&T cartoon used to be in Wikipedia?


        What happened?

        From science of doom,”When energy is transferred by radiation from a colder body to a hotter body, it is important to understand that this incident radiation cannot be absorbed – otherwise it would be a clear violation of the second law of thermodynamics”


        All that is of concern is the net radiation flux.

        Chief thinks, or thought, I disagreed with him because air warmer than the ocean can warm the ocean a little. The “back radiation”, night-time ILWR, cannot. Incoming LWR from a warmer source can warm the ocean; that’s daytime ILWR.

        The idea that a few molecules could be warmed by the air if they happen to be cooler than the air, and then an elaborate mixing scenario in which those few molecules warm the ocean with heat from a colder body, is horse hockey!

        I don’t know how else to explain it. Everything that has a temperature emits LWR, but the warmer body always wins. So that huge back-radiation number is just a confusing way to say the air is warm. The only time there is any net ILWR at night is when the air is warmer than the surface, whether you use the “REAL” or the imaginary second law.

      • ”When energy is transferred by radiation from a colder body to a hotter body, it is important to understand that this incident radiation cannot be absorbed – otherwise it would be a clear violation of the second law of thermodynamics”

        Not quite correct. It’s a violation of the 2nd law for there to be a NET transfer of energy from cold to hot. Individual quantum transfers are allowed, as long as there are more going the other way.

        Remember, in a 200 mph hurricane, something north of 40% of the molecules are going against the wind. It’s the bulk net effect that has to obey the laws of thermo, not the individual actors.
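        The same point in numbers: between two surfaces both gross radiative streams are real and absorbed; only the net must run from hot to cold. A small sketch with idealized blackbody plates and standard Stefan-Boltzmann values (illustrative temperatures, not measured fluxes):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def streams(t_hot, t_cold):
    """Gross and net radiative exchange between two black surfaces, W/m^2."""
    hot_to_cold = SIGMA * t_hot ** 4
    cold_to_hot = SIGMA * t_cold ** 4    # nonzero: the cold body still emits
    return hot_to_cold, cold_to_hot, hot_to_cold - cold_to_hot

up, down, net = streams(288.0, 255.0)    # warm-surface-ish vs cold-sky-ish
print(f"hot->cold {up:.0f}, cold->hot {down:.0f}, net {net:.0f} W/m^2")
# Both gross streams are large; only the NET is constrained to run hot->cold.
```

        Each individual photon exchange is allowed in either direction; the second law constrains only the difference of the two streams.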

      • Dallas

        I think Trenberth’s missing heat is from TOA flux anomalies – which seem mostly to be shortwave.


        However, I have violated my first principle – never make categorical statements. You are quite right that a warmer atmosphere can transfer net heat to a cooler ocean.

        Under what circumstances does that happen? Is there a reversal of this ‘skin’ temp thing? Would Batman beat Aquaman in a fight?

        I think I’ll go to sleep now.

      • No you are absolutely mistaken on all counts.

        ‘Why are sea surface temperatures rather than air temperatures used over the oceans?

        Over the ocean areas the most plentiful and most consistent measurements of temperature have been taken of the sea surface. Marine air temperatures (MAT) are also taken and would, ideally, be preferable when combining with land temperatures, but they involve more complex problems with homogeneity than SSTs (Rayner et al., 2003). The problems are reduced using night only marine air temperature (NMAT) but at the expense of discarding approximately half the MAT data. Our use of SST anomalies implies that we are tacitly assuming that the anomalies of SST are in agreement with those of MAT. Many tests show that NMAT anomalies agree well with SST anomalies on seasonal and longer time scales in most open ocean areas. Globally the agreement is currently very good (Rayner et al, 2003), even better than in Folland et al. (2001b). However, some regional discrepancies in open ocean trends have recently been found in the tropics (Christy et al., 2001).’ http://www.cru.uea.ac.uk/cru/data/temperature/

        This was included in my longer post that disappeared for some annoying reason. There is a temperature equilibrium on a seasonal basis between the ocean and atmosphere – but no energy equilibrium. That is why the sea surface temperature is used in lieu of surface air temperature for the global surface temperature record. But the ocean is always giving off more radiant heat than it receives, heating the atmosphere, which in turn loses energy to space. This is obvious in any of the global energy budgets.

        The determination of absolute values of radiative flux is quite difficult – which is why there has recently been a substantial revision, around 5 W/m² from memory, in the solar constant. The limits of error in absolute flux are an order of magnitude greater than those for anomalies. I believe the error bound for absolute flux is at best now about 3.4 W/m². But simply measuring surface upward and downward flux at a point is not the major source of error – it is that these point values cannot realistically be interpolated across ocean and landscapes in the real world of cloud, dust, vegetation, ice and aerosols with any accuracy. You need to show understanding and not simply apply arguments based on an obscure appeal to authority.

        If you add CO2 to the atmosphere – and the atmosphere doesn’t warm – how can it possibly have any effect on downwards radiant heat? In reality the atmosphere must warm as radiant heat is captured by molecules in the atmosphere. The atmosphere warms and radiant heat from the atmosphere increases in all directions. The net radiant heat up at the surface of the oceans is reduced and the oceans warm a little. Whether you call this ocean warming or not is not simply semantics. It is a matter of more completely understanding the physics.

        You are hopelessly muddled – your insistence on some misconceived idea that downwelling IR is the overwhelmingly significant factor in ocean warming leads you to go further and further astray.

        You should simply admit your error and move on. It does nothing for the discussion to pile error on error. As the expression goes – if you want to get out of a hole first stop digging.

        If you wish to respond to me – please do it directly.

      • Just to repeat my out-of-sequence question, when you say, “…misconceived idea that downwelling IR is the overwhelmingly significant factor in ocean warming…” with downwelling’s 324 W/m² compared to incoming solar’s 168 W/m², why is it not the overwhelming factor (cause) of ocean warming?

      • Fred, or others – I asked this question on another thread and it was not answered. In the daylight diagram, where is the gradient that additional downwelling long wave radiation is warming, and which, because of this warming, is slowing the heat from below from exiting the ocean?

        Is this gradient between the colored markers, or within one?

        Consider this response by Stefan at RC to a reader’s comment:

        Thus we are talking not about the gradient between sea surface and overlying air, but we are talking about the gradient through the skin – i.e., the water temperature difference between the top and bottom of the skin layer, which controls how heat flows across this layer, from the bulk of ocean water below to the surface. Obviously, if you heat the top of the skin layer, this reduces the heat flow across this layer from below. Clear? Or still confusing? -stefan]

      • I suggest that the skin concept should be considered flexibly. Wherever there are waves there are molecules in circular orbits such that the particles are constantly being replaced by particles from below.


        The only explanation I have for a colder ‘skin’ layer is that radiative emission is much faster than a turbulent wave. Speed of light as opposed to m/s.

      • JCH,
        A full answer to your question is complicated by the many time scales and processes involved. Those pictures tell a few temperatures at two specific moments. They do not tell, how they develop over 24 hours or how cloudiness or winds influence the situation.

        Nor do they tell details of the profile in the top 1 m. One essential question is the location of the temperature maximum under the conditions of picture (b). My guess is that the maximum may be at a depth of 10 cm or so, not at 1 mm (the picture doesn’t really make any claims on that).

        The top few micrometers are important. That is where the heat transfer to the atmosphere takes place. They are continuously near to equilibrium with the atmosphere as the thin skin cools or warms very rapidly. As the picture indicates the thin skin is practically always colder than the layer immediately below, but the location of the maximum temperature and the size of the temperature differences varies depending on momentary conditions. The maximum may be within centimeters from the surface or it may be at a depth of several meters, perhaps even deeper under special conditions.

      • Pekka – I guess I will never understand this. To me, when the skin layer is formed, I sincerely doubt the maximum temperature would be at cm depths. With an existent skin layer, I think it will usually be in fairly close proximity to 0.5 mm.

        That would be where SW heat moving up first meets LW penetration. Sounds like a hot couple to me.

      • JCH,
        Most of the absorption of SW radiation occurs more than 1 m below the surface. Thus there is significant heating that occurs there. That will lead to a continuous warming of those layers unless the heat is transferred away and that can happen only up or down. We know that only little goes down as the oceans are warming very slowly. Thus almost all that heat must move up, but it cannot do that unless the temperature is lower above. (Mixing can cool only the warmer water, not the cooler.)

        This is a proof that there must be a maximum a fair distance below surface.

      • Pekka – from SoD’s site:

        50% of solar radiation is absorbed in the first meter, and 80% within 10 meters

        50% of ”back radiation” (atmospheric radiation) is absorbed in the first few microns (μm).

        Is the above incorrect?
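
        Those depth figures can be sanity-checked with a single-coefficient Beer-Lambert attenuation model (a sketch; the effective coefficients here are hypothetical, since real seawater attenuates each wavelength very differently, which is exactly why one coefficient cannot reproduce both quoted solar numbers at once):

```python
import math

# Beer-Lambert attenuation: the fraction absorbed above depth z
# is 1 - exp(-k*z) for an effective attenuation coefficient k (1/m).
def fraction_absorbed(k, z):
    return 1.0 - math.exp(-k * z)

# Pick k so that exactly 50% of solar is absorbed in the first metre:
k_sw = math.log(2.0)                    # ~0.693 per metre
print(fraction_absorbed(k_sw, 1.0))     # 0.5 by construction
print(fraction_absorbed(k_sw, 10.0))    # ~0.999, far above the quoted 80%:
                                        # broadband sunlight is not a single-k beam

# LW back radiation is absorbed within microns; k is of order 1e5-1e6 per metre:
k_lw = 1.0e6                            # assumed order-of-magnitude value
print(fraction_absorbed(k_lw, 10e-6))   # ~1.0 inside the first ten microns
```

        The mismatch at 10 m shows why the quoted solar numbers require a spectrum of attenuation coefficients, while the LW case saturates within the skin regardless of the exact k chosen.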

      • JCH 4/6/11, 1:02 am, week of 4/2/11 …

        The extant model for AGW uses the radiative forcing paradigm. It is defined as changes to a baseline state, given as the K&T energy budget. That budget has single nodes for the Sun, for deep space, for the atmosphere with subnodes of clouds and greenhouse gases, and for the surface. The surface has a subnode of ice and snow albedo, but otherwise is a mud, representing land and some kind of ocean surface layer, all with a nominal natural temperature of 288K returning energy from a half-day half-night Sun. These are macroparameters, the domain of thermodynamics. The GCMs have departed from that structure, breaking the atmosphere and the ocean into multiple layers. Changes to those layers are departures from radiative forcing, and are fictions on multiple levels.

        Science, including technology and especially the natural world, doesn’t deal with the real world, but with samples of it. It deals with measurements of the real world as that world impinges on our senses and instruments. Even physical samples excised from the real world and brought into the lab for experimentation are no longer of the real world. What we observe and reduce to facts by measurements compared to standards, form the basis for scientific models that express cause and effect.

        In ordinary conversations scientists will speak of their models as if they were speaking of the real world. They will mix macroparameters with mesoparameters and microparameters, parameters nowhere measurable, sensible, and unresolvable, respectively. Usually, they know better. Sometimes scientists err by using such imprecise language with the public or in their scientific papers.

        Sometimes the real ocean has a surface layer in such a froth that the boundary between ocean and atmosphere is indiscernible. The air in a hurricane several stories above the ground can be noticeably salty. In severe weather, nothing exists which might be identified as a skin. At these times, the thermocline can be hundreds of meters deep. At other times, the ocean can be glassy calm, mirroring ships and land features. At these times, the thermocline is likely to be a few tens of meters deep. In between, the surface layer has wave structures that extend well below the surface. During the day, these draw the newly warmed skin into the surface layer and replace it with cooler water. What is absorbed in the skin, whatever its thickness might be, is merely a transient phenomenon in the heating and cooling of the entire surface layer.

        The question then is what is the structure of the surface layer in a GCM? The answer is that it is a fiction. It is not an emulation of a real surface, nor of the land, nor even of the ocean. It is no kind of average. It is a set of parameters to be adjusted empirically in the search for a GCM configuration that might express some power to predict climate.

      • JCH,
        When I wrote “most is absorbed more than 1 m below”, I had in mind a little more than 50%. SoD tells that the percentage is 50%. Either one of us is slightly inaccurate, but we mean the same thing. (I don’t know the precise average value well enough to tell who was inaccurate.)

        Concerning LWR there is no disagreement, not even a slight one.

      • Adding to the comment of Jeff Glassman.

        Whenever a scientist or anybody else is describing real world phenomena she does it through simplifications and abstractions. Our language can describe only those. Any scientific model can present only those. There is no way around that.

        A common problem in communicating scientific results is that scientists very often use a different language from others. They may use words unknown outside their expertise or they may use common words in a different meaning. This latter fact is particularly confusing. It’s even more confusing in the names of model parameters and variables. The models are usually highly aggregative. Each aggregate variable is given a name that has some relationship with other uses of the same word, but the meaning of the variable is often very different in its details from the common meaning of the word.

        Interpreting the scientific results correctly usually requires that the precise definitions of the concepts and variables are known and understood. Even the scientists doing the work may sometimes get confused by these differences. They may, e.g., use input from statistical aggregates although the aggregation of the data is very different from the aggregation in the model.

      • Pekka, Fred, others – thank you very much for the help. I was looking last night for articles about maximum temperatures in the bulk layer, and I stumbled upon what I believe is
        Minnett’s article on the skin layer. If you can access it, I think you would find it very interesting.

      • Chief,

        LOL advection fog is rare, but happens, we may get one or two days per winter when warm air warms the water. Doesn’t play much of a role at all in ocean warming, but I guess every penny counts when balancing a budget. :)

      • Despite evidence that back radiation from atmospheric greenhouse gas molecules contributes more to ocean heat content and temperature than direct solar radiation, a notion persists in the blogosphere that the back radiation contributes little and that almost all heat transfer to the ocean comes from the solar component. This argument holds that the energy from absorbed back radiation remains confined to the skin layer (known to be where it is initially absorbed) and escapes from there to the atmosphere via radiation and evaporation with little or no downward redistribution, while the solar component is distributed over a greater range of depth and reaches the surface to escape mainly by convection and conduction.

        That this argument is untenable can be inferred, I believe, from the most recent Trenberth-Fasullo-Kiehl Energy Budget Diagram in Table 1b. The Table shows that of the roughly 400 W/m^2 upward radiation from the ocean surface, only a maximum of about 165 could come from absorbed solar (less, to the extent that some of the absorbed solar contributes to latent or sensible heat flux). This leaves about 235 W/m^2 or more to come from the back radiated contribution. If these two energy sources persisted in a different pattern of distribution until they combined at the surface before escaping, then they must have very different temperatures, with the skin layer much hotter than the underlying 10’s of meters of water. In fact, the opposite is observed. The skin layer is slightly cooler than the water immediately below, and further down, the water is only slightly warmer, exhibiting a shallow temperature gradient characteristic of the ocean mixed layer. The only plausible mechanism in my view involves a rapid mixing of energy from the two sources after absorption, mainly by turbulence, so that the temperatures are averaged out.

        The conclusion that the back radiation is the major component of radiant energy transfer to the oceans appears to be reinforced by these observations.
        (a small side note on emissivity for Robert – adding CO2 increases an atmosphere’s IR emissivity, so that it emits more IR at the same temperature)
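
        The arithmetic in the budget argument above can be checked in a few lines (a back-of-envelope sketch using the round numbers quoted there; the Stefan-Boltzmann inversion is my addition for illustration):

```python
# Back-of-envelope check of the surface energy-budget argument.
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

surface_up = 400.0       # approx. upward LW emission from the ocean surface
solar_max = 165.0        # maximum absorbed-solar contribution quoted above
back_rad_share = surface_up - solar_max
print(back_rad_share)    # 235.0 W/m^2 left for back radiation to sustain

# Temperature a blackbody would need in order to emit each flux on its own:
def T_for_flux(j):
    return (j / SIGMA) ** 0.25

print(round(T_for_flux(surface_up), 1))  # ~289.8 K, consistent with real SSTs
print(round(T_for_flux(solar_max), 1))   # ~232.3 K: absorbed solar alone could
                                         # not sustain the observed emission
```

        The ~58 K gap between those two temperatures is the point: if the two energy streams stayed unmixed, the water layers emitting them would have to sit at very different temperatures, which is not observed.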

      • Oh please – ‘the emissivity of a material (usually written ε or e) is the relative ability of its surface to emit energy by radiation. It is the ratio of energy radiated by a particular material to energy radiated by a black body at the same temperature.

        The emissivity of Earth’s atmosphere varies according to cloud cover and the concentration of gases that absorb and emit energy in the thermal infrared (i.e., wavelengths around 8 to 14 micrometres). These gases are often called greenhouse gases, from their role in the greenhouse effect. The main naturally-occurring greenhouse gases are water vapor, carbon dioxide, methane, and ozone.’

        So we have an increased ability to absorb and emit thermal energy.

        ‘A more general case is of a grey body, the one that doesn’t absorb or emit the full amount of radiative flux. Instead, it radiates a portion of it, characterized by its emissivity, ε:

        j* = εσT⁴’

        So how much has emissivity changed? By about 0.0005. http://climaterealists.com/index.php?id=5847

        You stick to your guns that the cooler body warms the warmer body. I am just going to assume that net radiative and latent heat flux at the surface is the relevant factor in ocean heat content changes.

      • The emissivity relevant to heat transfer into the ocean from back radiation relates to the thermal IR range. There, emissivity is quite high in general, and close to 1.0 in parts of that spectrum. It is also substantially increased by increases in CO2.

      • ‘(a small side note on emissivity for Robert – adding CO2 increases an atmosphere’s IR emissivity, so that it emits more IR at the same temperature)’

        I suspect Fred needs to consider numbers and visualisation rather than a qualitative narrative. Visualisation is a good technique – http://nrich.maths.org/6447 – allowing for cognitive models of complex physical systems. Relying on verbal constructs can be very misleading – and frustrating for others because qualitative narratives are especially difficult to nail down.

        Let’s try to put it in a wider context. The generalised Stefan-Boltzmann relationship is j* = εσT⁴, where ε is the emissivity – a dimensionless factor between 0 and 1.

        All warming or cooling on the planet happens because of an energy imbalance at the top of atmosphere (TOA). If for instance we add greenhouse gases to the atmosphere – more upwelling IR is absorbed and therefore less energy leaves the planet creating an energy imbalance at the TOA. The planet accordingly warms and therefore emits more energy in all directions – including up. All other things being equal – and there is no reason to suspect that they ever are – the warming continues until the imbalance is reduced to zero.

        The generalised S-B equation simply shows that the TOA longwave up increases as a result of emissivity as well as temperature. With increased emissivity – atmospheric temperature need not rise as much to balance the radiative budget.

      • “With increased emissivity – atmospheric temperature need not rise as much to balance the radiative budget.”

        Exactly the opposite is true, Robert. These are standard radiative transfer principles. Increased absorptivity/emissivity (the two go together) tell us that more IR from the Earth’s surface directed to space will be intercepted with some of it emitted downward. This increases the radiative imbalance at the tropopause and therefore increases the atmospheric and surface warming needed for OLR at the tropopause (or at the TOA when that framework is used) to again equal incoming absorbed solar radiation.

      • Increased IR absorption causes the radiant imbalance. Increased IR absorption warms the atmosphere at the same time – which then re-emits IR.

        The top of atmosphere is where there is a pure dynamic energy balance in accordance with the 1st law of thermodynamics. So please don’t confuse things more than necessary by bringing in superfluous boundaries at which there is no energy equilibrium.

        At TOA: Energy in – energy out = the change in global heat content – a simple but complete description of the global energy budget.

        Increased absorption of IR decreases energy out and the atmosphere warms. A warmer atmosphere emits more IR in all directions in accordance with the generalised S-B equation:

        j* = εσT⁴

        j* increases a little as a result of both the small increase in emissivity ε and the small increase in T. The math is quite simple if you stop long enough to look at the terms. The rhs has three factors – a constant and two variables. If j* stays the same, you can increase ε or decrease T – and T is far more important given the 4th-power dependence.

        Absorbed IR increases the temperature of the atmosphere which then re-emits IR. If ε increases, then for the same T more energy is emitted to return to the nominal equilibrium at TOA. This is exactly what you originally said – but failed to put into the radiative flux context.

        Changes in emissivity have a very minor role in changes in radiative flux – the change itself is small and the term is dominated by the 4th power of temperature.
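
        To put numbers on the ε-versus-T⁴ point: hold j* fixed and ask how much T must change to offset the Δε ≈ 0.0005 quoted above (a sketch; the baseline ε and T are illustrative effective-emission values, not measurements):

```python
# With j* = eps * SIGMA * T**4 held fixed, how much temperature change
# offsets a small emissivity change? Baseline values are illustrative.
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T0, eps0 = 255.0, 0.61   # assumed effective emission temperature and emissivity
j_star = eps0 * SIGMA * T0**4

d_eps = 0.0005           # the emissivity increase quoted upthread
# Invert j* = eps * SIGMA * T^4 for the new temperature:
T_new = (j_star / ((eps0 + d_eps) * SIGMA)) ** 0.25

print(round(T0 - T_new, 3))  # ~0.052 K: a tiny temperature shift compensates,
                             # because flux is linear in eps but quartic in T
```

        A change of a twentieth of a kelvin absorbs the whole emissivity change, which is the sense in which the T⁴ factor dominates.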

      • The gray-body Stefan-Boltzmann equation is not a good basis for looking at the TOA energy balance. It can be used in inverse to calculate an effective temperature for the Earth, but it is practically worthless for understanding what is going on. To understand that one must use Planck’s law with its wavelength dependence and handle each wavelength in a way that takes into account the transparency of the atmosphere at that particular wavelength.

        The role of emissivity is also very different for a gas than it is for a solid (or liquid) non-transparent surface. For a non-transparent surface the sum of absorptivity and albedo is one, and emissivity is equal to absorptivity. For a gas we do not have much albedo, while the alternatives are absorption and transmission, which leads to totally different effects. (Clouds and aerosols may combine features from both surface and gas.)

        Looking from TOA the role of increased absorptivity/emissivity is to move higher the altitude of origin of the radiation. That leads to a lower temperature and a reduced intensity of radiation, which leads further to an imbalance to be compensated by a warming atmosphere. Some of this effect is to replace emission from the surface by emission from higher levels of atmosphere.

        Looking from the surface the role of increased CO2 is to lower the altitude of origin of the downwelling radiation and to increase downwelling radiation at wavelengths where it was practically non-existent with less CO2. This leads to warming of the surface until the increase in OLR is sufficient to reach a balance.

        It is often useful to look at the balance both at TOA and at the surface. The results must be closely the same, because the heat capacity of the atmosphere is so small that differences must average out over a rather short period.
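
        Pekka’s wavelength point can be illustrated with Planck’s law: at a CO2-band wavelength, emission from a cold, high emission level is much weaker than from the warm surface (a sketch; the two temperatures are illustrative stand-ins for surface and upper troposphere):

```python
import math

# Planck spectral radiance B(lambda, T), in W m^-3 sr^-1.
H = 6.62607015e-34    # Planck constant, J s
C = 2.99792458e8      # speed of light, m/s
KB = 1.380649e-23     # Boltzmann constant, J/K

def planck(lam, T):
    return (2.0 * H * C**2 / lam**5) / (math.exp(H * C / (lam * KB * T)) - 1.0)

lam = 15e-6                       # 15 micron, in the CO2 absorption band
b_surface = planck(lam, 288.0)    # warm surface
b_aloft = planck(lam, 220.0)      # cold upper-tropospheric emission level
print(b_surface / b_aloft)        # roughly 3x weaker aloft: raising the
                                  # emission altitude reduces OLR at this band
```

        This is the per-wavelength reasoning the gray-body form hides: moving the effective emission level to a colder altitude cuts the outgoing intensity in that band.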

      • Hi Pekka,

        My use of the generalised S-B equation was in relation to this from Fred:

        ‘If you take air that is cooler than the underlying ocean (which is typical), and add CO2 to it without changing its temperature, that air, which is still cooler than the ocean, will now cause the ocean temperature to rise.’

        And this:

        ‘a small side note on emissivity for Robert – adding CO2 increases an atmosphere’s IR emissivity, so that it emits more IR at the same temperature’ .

        I feel this was a deeply non-physical formulation. There is no heat added to the system when CO2 is first added to the atmosphere, I believe – it is simply one half of a chemical reaction, and this cannot by itself warm the ocean/atmosphere. The warming commences when an energy imbalance is created at the top of the atmosphere and the warming acts to redress the imbalance by emission in accordance with the generalised S-B equation.

        It is very difficult to express the working of this formula in words because of the covariance of the variables. Add CO2 and emissivity increases, and increased temperature results from the increased CO2 in the atmosphere. In reality these variables cannot be isolated. Increase CO2 and an energy imbalance is created and temperature increases. Because emissivity also increases, the temperature increase needed to redress the imbalance is less than it otherwise would be – an inadequate verbal expression of a mathematical relationship.

        Looking at radiative flux at TOA is simply to emphasise the primacy of radiative flux in global energy dynamics. There is an energy equilibrium at top of atmosphere. By the 1st law of thermodynamics – Energy in – Energy out = the change in global heat storage. This can be expressed as:

        Ein/s – Eout/s = d(GHS)/dt – a perfect model of the global energy budget.

        Ein is relatively stable in SW and LW so Eout is relatively stable in net. But the proportion of Eout in SW or LW changes hugely as albedo changes – by about 85 W/m² from snowball Earth to the blue-green planet.
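
        The identity Ein/s − Eout/s = d(GHS)/dt can be stepped forward as a trivial integrator (a sketch; the imbalance value is an assumed placeholder, not a measurement):

```python
# d(GHS)/dt = Ein - Eout, integrated with a simple forward step.
def step_heat_content(ghs, e_in, e_out, dt):
    """Advance the global heat storage anomaly (J/m^2) by one time step."""
    return ghs + (e_in - e_out) * dt

ghs = 0.0                     # heat-storage anomaly, J/m^2
imbalance = 0.9               # assumed constant TOA imbalance, W/m^2
seconds_per_year = 3.156e7
for _ in range(10):           # ten years at constant imbalance
    ghs = step_heat_content(ghs, imbalance, 0.0, seconds_per_year)
print(ghs)  # ~2.8e8 J/m^2 accumulated over the decade
```

        The point of the identity is that any persistent TOA imbalance, however small, accumulates as stored heat somewhere in the system.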

      • Rob,
        Among the various ways of defining radiative forcing, one is based on a hypothetical sudden addition of CO2 to the atmosphere, keeping everything else fixed, and determining the immediate effects on the radiative energy balance at TOA.

        The answer is that a small part of that radiation from surface that earlier escaped to space is now stopped in the troposphere, and that the radiation originating in the troposphere to escape to the space is now originating from higher altitudes of lower temperature. Consequently there is an immediate imbalance that starts to warm the troposphere. The definition stops at this moment and the result is stated as the imbalance in radiative energy flux at TOA before any changes in temperature have materialized.

        Fred’s statements appear to be in agreement with this correct description of a hypothetical setup. While this setup cannot be realized in practice, it serves as one possible basis for defining a concept (the radiative forcing).

      • I think you are misunderstanding the nature of the formulation. It revolved around the addition of CO2 without changing the temperature of the atmosphere – a condition that was stated.

        This immediately resulted in an increase in downwelling radiation as a result of the emissivity factor – in fact just a result of an increase in the number of molecules emitting radiation. The increase in downwelling radiation warms the ocean – according to Fred.

        There are so many problems with this that it is pointless to continue. It needs to be thrown away to be replaced by considerations of net radiation flux – especially at TOA and at the surface of the oceans.

        It is all part of a web of unreality as far as I am concerned.

    • Dallas 4/5/11, 6:54 pm, week of 4/2,

      You said, Downwelling or incoming LWR is measured with a pyrgeometer. On 4/6/11 at 4:33 am, Pekka Pirilä repeated, The pyrgeometer measures the intensity of all IR reaching the device.

      Not exactly.

      This relates to the problem Fred Moolten and others are having with radiation and thermodynamics. For example, on 4/4/11 at 6:33 pm, Moolten said, downwelling IR contributes about twice as much to ocean heating as direct solar radiation, and on 4/5/11 at 7:22 pm, The solar component is of course not “back radiated”.

      The pyrgeometer measures the voltage across a thermopile. One side of the thermopile is at a known case temperature. In an ideal device, the other side is exposed to the incoming radiation, causing it to warm by heat proportional to the difference between the OLR from the thermopile itself and the ILR being measured. The instrument or the operator now calculates the incoming power as OLR plus the voltage drop, calibrated to Watts by an empirically developed instrument sensitivity. See Fairall, et al. (1998), A New Look at Calibration and Use of Eppley Precision Infrared Radiometers, etc.

      The instrument is measuring the warming from heat. This means that if the atmosphere is at the same temperature as the case, the instrument output is zero. That reading does not mean that the atmosphere is not radiating.

      The thermopile in the pyrgeometer is analogous to the surface of the earth, whether land or sea. The warming that takes place is proportional to the net radiation power, not the incoming radiation power.
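
      The reduction described above can be sketched in a few lines (a simplified, single-term version of the calibration; the sensitivity figure and temperatures below are hypothetical, not taken from Fairall et al.):

```python
# Simplified pyrgeometer reduction: incoming longwave (ILR) is recovered as
# the thermopile's own emission plus its voltage scaled by an empirically
# calibrated sensitivity. All numbers are illustrative.
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def incoming_lw(v_thermopile, sensitivity, T_case):
    """ILR in W/m^2 from thermopile voltage (V) and case temperature (K)."""
    self_emission = SIGMA * T_case**4          # the instrument's own OLR
    return self_emission + v_thermopile / sensitivity

# If sky and case sit at the same temperature, there is no net heat flow and
# the thermopile reads zero volts -- yet the inferred ILR is far from zero:
print(round(incoming_lw(0.0, 4.0e-6, 288.0), 1))  # ~390.1 W/m^2
```

      A zero thermopile reading therefore means zero net exchange, not zero sky radiation, which is exactly the distinction at issue.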

      Chief Hydrologist on 4/6/11 at 12:37 am said, In reality the atmosphere must warm as radiant heat is captured by molecules in the atmosphere. The atmosphere warms and radiant heat from the atmosphere increases in all directions. The net radiant heat up at the surface of the oceans is reduced and the oceans warm a little. This should be written with radiation in place of radiant heat at all three occurrences. The criticism is a bit nit picking for the first two, but not for the third. Thermodynamics does not deal with net heat between objects. Instead, radiant heat is net radiation energy.

      Chief Hydrologist also said in response to Fred Moolten, You are hopelessly muddled – your insistence on some misconceived idea that downwelling IR is the overwhelmingly significant factor in ocean warming leads you to go further and further astray. The cause of the muddle in part is imprecision in the language, causing a failure to adhere to the delicate requirements in the art of thermodynamics.

      A greater cause might be that Moolten and Pirilä are both indelible AGW believers, and so defend GCM modeling (and find much agreement between themselves). Those models might have developed as scientific models, able to predict climate from natural causes, much in keeping with IPCC’s charter in 1988. But IPCC changed its own charter in 1998 (1) to assume that AGW existed, and then (2) to task itself to assess the severity of human effects.

      To the extent that any model might have predicted warming from natural causes, those causes would have had to be removed and replaced with human causes. This had all sorts of grim side effects. Radiative forcing had to depart structurally from its definition as changes from the baseline Kiehl & Trenberth budget. The surface layer had to be made stagnant (ChE, 4/5/11, 1:04 pm, speculated insightfully about a reasonably still ocean), this to force inadequate ACO2 to accumulate in the atmosphere. (A beneficial side effect for IPCC’s fear mongering was that CO2 caused acidification.) The Revelle Buffer Factor had to be rehabilitated. CO2 had to be made long-lived, and required amplification by water vapor. Human fingerprints on CO2 had to be fabricated. IPCC raised causation by correlation to an art form with a bunch of hockey stick reductions. IPCC had to redefine cloud albedo, and it had to ignore other, real correlations.

      No wonder Trenberth found the latest work a travesty. No wonder Revelle thought in 1991 that greenhouse warming was too uncertain to pursue.

      Once again we see how a belief system can infect a scientific model.

      • True, I should have listed the thermopile or radiation balance meter. One thing I found amusing in all this was the Wikipedia revisions for the radiation budget entry.

      • Sir,

        I assure you that the use of the term radiant heat was quite deliberate. Sometimes I feel that people lack the physical reference to the phenomenon. Radiant heat is in quite common use for various heating systems where an element is excited giving off IR radiative flux. We feel this as heat on our skin and thus it is quite natural to call it radiant heat. The equivalent exists for any warm body such as the ocean or atmosphere.

        I often use the terms visible light and radiant heat to give a physical reference that people sometimes don’t understand or seem to forget – although of course in reality it is an incomprehensible ‘packet of energy’ with a specific wavelength.

        I keep throwing these up. Of course the origins of the data are a bit clouded (pun intended) with instrument drift, orbital drift, missing records because of breakdown and a shuttle disaster, cross calibration problems, etc. But it is what we have and it has been corrected to within an inch of its life.


        Both ERBE and ISCCP-FD show lots of cloud changes in the tropics – and this is confirmed by surface observation in that most significant of zones – the eastern Pacific.

        Kevin Trenberth’s travesty – the warming he failed to find – concerned the CERES data found here – http://pielkeclimatesci.wordpress.com/2010/04/27/april-26-2010-reply-by-kevin-trenberth/

        If you note the net graph at the bottom – the increase in energy retained in planetary systems is quite significant although it was offset by some net solar cooling in the quasi 11 year cycle. The atmosphere shows little warming – but by some accounts the extra heat is finding its way to 2000m and below in the oceans.

        There are two things apparent in the CERES data: the large variability associated with ENSO, and the fact that most of the change happened in the shortwave. The latter is something that is most clearly associated with cloud cover change. The ENSO connection suggests low level cloud changes negatively correlated with sea surface temperature.

        I am beginning to think that Kevin must be a bit slow if he hasn’t worked this out yet.

        I am anticipating an intensification of the frequency and intensity of La Niña in perhaps the next decade or so in a cool Pacific mode – so it will be interesting to see how this evolves as well as when the next bifurcation point will be and what happens then.


      • Chief Hydrologist, 4/7/11, 12:40 am, week of 4/2/11 …

        Radiant heat is a perfectly good term. It just isn’t a synonym for radiation, so a reader is likely to be confused by terms wrongly used interchangeably. When scientists talk to Congressmen or the media, the eyes of their audience can be seen to glass over. That’s because accuracy in conveying scientific information too often is just tedious. We can’t all be PhD entertainers like Carl Sagan, but then old Carl slid down the slippery slope of sloppy science when he launched into a connection between his nuclear winters conjecture and the Kuwaiti oil fires.

        IPCC wrote about the greenhouse effect that traps heat near Earth’s surface, determined by the amount of carbon dioxide and other greenhouse gases in the atmosphere. AR4, FAQ 1.1. That’s layman-talk. So some poor schnook with a BSME remembers just enough of his thermo to know that heat can’t be trapped to conclude that the greenhouse effect doesn’t exist and that CO2 is harmless. So he runs off to join some club of rabble-rousing, self-proclaimed, pseudo skeptics. His conclusions are half right, but for the wrong reason so he is easily marginalized. He is unhelpful.

        To the extent that IPCC might be seen as relying on the Laws of Thermodynamics, it needs to get the term heat right. Every mention of equilibrium by IPCC might be just such a reference. The notion that the K&T balanced energy budget might be a stable reference point has no justification, and might be a silent and false appeal to the Second Law.

        Your link to Trenberth’s travesty lament and the CERES charts was spot on. However, I had some problems with the reference. Going back through several layers of references, I couldn’t find the meaning of the blue and yellow shadings on the charts. Is that shading connected to your apparent … large variability associated with ENSO in the data? Without labels for ENSO events, I didn’t find the association at all apparent. My reference search collapsed when I was unable to find “The state of the climate in 2008”, although I did find an irrelevant document for 2009.

        You said:

        The latter is something that is most clearly associated with cloud cover change. The ENSO connection suggests low level cloud changes negatively correlated with sea surface temperature.

        I am beginning to think that Kevin must be a bit slow if he hasn’t worked this out yet.

        I am anticipating an intensification of the frequency and intensity of La Niña in perhaps the next decade or so in a cool Pacific mode – so it will be interesting to see how this evolves as well as when the next bifurcation point will be and what happens then.

        What is apparent in the CERES chart is that it is for the time period 2000 to almost 2010. Even the averages of the data records over that whole period are nothing but data points for average weather. These CERES charts are weather reports. A minimum of 30 years is the IPCC standard for climate. An important climate question to ask about ENSO is, what are its effects averaged over 30 years? My guess is nothing. ENSO certainly adds variability to climate data, but it is merely a regional redistribution of thermal energy.

        Trenberth might seem to have gone over the top in calling a little blip in weather a travesty in the big picture of climate predictions. In another regard, two great events have blunted the thrust of the AGW bamboozlement. One is the whistle blowing with the CRU e-mails, evidence justifying words like fraud and conspiracy. The other is the fortuitous cooling over the last decade.

        The excuse that a one decade cooling trend is merely variability is legitimate. Believers broached that argument, but it passed right over the heads of the public, the media, and politicians. Don’t try to explain it to them; their eyes will just glass over. Regardless, this prolonged weather event is having the beneficial public effect of creating an irrational doubt where ample evidence exists for rational rejection of the whole AGW concept. To a Believer, that would be the travesty.

        Viewed objectively, this decade-long weather event does prolong the claimed AGW catastrophe, seemingly giving governments an extra decade before taking action. Meanwhile real scientists can awake, and come forward to debunk AGW and IPCC.

      • CERES is one half of a complete description of the energy budget of the planet. Energy in – energy out = the change in global heat content

        This can be expressed as:

        Ein/s – Eout/s = d(GHC)/dt. Far more than a weather report – we can tell nearly instantaneously the state of the global energy budget. Energy is everything in climate.
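        The bookkeeping Ein/s – Eout/s = d(GHC)/dt can be made concrete with a back-of-envelope sketch (the 0.9 W/m^2 imbalance and the Earth's surface area are my own illustrative round numbers, not values from the CERES record):

```python
# Back-of-envelope for Ein - Eout = d(GHC)/dt: a sustained planetary
# energy imbalance integrates into a change in global heat content.
# The imbalance value and Earth's surface area are illustrative round
# numbers, not measurements.

SECONDS_PER_YEAR = 3.156e7
EARTH_AREA_M2 = 5.1e14   # total surface area of the Earth, roughly

def heat_gain_joules(imbalance_w_m2, years):
    """Heat accumulated by a constant global-mean imbalance."""
    return imbalance_w_m2 * EARTH_AREA_M2 * years * SECONDS_PER_YEAR

# e.g. 0.9 W/m^2 sustained for a decade:
print(f"{heat_gain_joules(0.9, 10):.2e} J")  # ~1.45e+23 J
```

        An imbalance of order 1 W/m^2 held for a decade is order 10^23 J, which is why the question of where that heat goes (atmosphere, or the ocean below 2000 m) matters so much.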

        In the satellite data we see evidence of secular cloud change on decadal timescales. This is one critical factor – nothing at all to do with the pedestrian assumptions of weather or lack of surface warming. Indeed – how often do I need to say that CERES shows warming as a result of an energy imbalance mostly in the shortwave as a result of cloud changes associated with ENSO? How is that mistakable for something else? There is and can be no bloody cooling.

        You need to jump the tracks in your thinking and cast a wider net – to mix metaphors – because you are showing very little depth of understanding at all. Merely tendentious bombast.

      • Chief Hydrologist, 4/7/11, 5:46 pm, week of 4/2/11 …


      • Perhaps so – perhaps I was oversensitised by a discussion on emissivity with Fred – whose understanding seems based on a global warming narrative, a word-based understanding which of course has some limitations. Slow, ponderous, and mistaking verbal dexterity for a right cognitive model formulation, and then a poor attempt at verbalisation. My apologies.

        The shaded areas relate to different instruments.

        I suggest that – from a hydrological perspective – decadal changes are well established. Most of these globally emerge from the Pacific and are associated with cloud cover and cloud radiative changes on decadal scales. There is every reason to suspect that variability is not limited to decadal changes: changes in cyclone frequency in northern Australia, for instance, or the shift 5000 years ago to El Niño conditions that dried the Sahel.

        It’s not personal – I quite often think physicists don’t have a clue.

      • Chief Hydrologist, 4/7/11, 11:48 pm, week of 4/2/11 …

        Apologies accepted, with thanks.

        Now in this moment of civility, I can broach a topic I wanted to raise with you, and with others who believe AGW exists so to act as surrogates for the cloistered IPCC: energy balance.

        Q: Why do the GCMs drive the climate to a state of energy balance? Why does anyone worry, as you did yesterday at 5:46 pm with respect to the CERES data, about an energy imbalance?

        One might argue the Second Law to say that the climate seeks equilibrium. I wouldn’t accept that because the Law would apply to a closed system, including the Sun and deep space, in which climate parameters are no longer changing values. Steady state oscillations would not count as equilibrium. The Sun would have to stop changing its output.

        Some principle needs to be invoked, e.g., a state of least energy, or conservation of something. If one had the power to modify the climate system to stabilize it in some preferred or conditionally stable state, feedback signals would have to be added to measure the pesky changing variables, and then find something to change to zero the error signal and keep or restore the balance. In other words, if the climate abhors an imbalance, how does any part of it know that an imbalance exists? I am satisfied that cloud albedo acts as a negative feedback to keep surface temperature nearly constant, but I can’t say that a constant surface temperature implies an energy balance.

        It seems to me that the initial conditions for the GCMs were designed by K&T to represent the contemporary atmosphere circa 1997. IPCC then made it arbitrary by attributing their energy budget to the year 1750. I have worked through the K&T model enough to convince myself that any surface temperature can be assigned, and an energy balance reasonably achieved by letting water vapor follow the Clausius-Clapeyron equation. The year 1750 could have been an ice age minimum, at least as far as the initial condition is concerned.

        I see neither a preferred state nor a conditionally stable warm state for the climate. Is an energy balance anything more than intuition and wishful thinking? What am I missing?
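        One way to see what a stabilizing cloud-albedo feedback of the kind described above would do – without claiming this is how the climate actually works – is a toy zero-dimensional balance with an assumed linear extra-reflection law; every coefficient here is invented for illustration:

```python
# Toy zero-dimensional energy balance (illustration only): emitted LW
# rises as SIGMA*T^4, and an assumed extra reflection term, growing
# linearly as T rises above a reference, mimics a cloud-albedo feedback.
# All coefficients are invented for this sketch.

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def equilibrium_T(absorbed_solar, feedback=0.0, T_ref=254.9):
    """Fixed point of absorbed_solar - feedback*(T - T_ref) = SIGMA*T^4."""
    T = T_ref
    for _ in range(500):  # simple fixed-point iteration
        T = ((absorbed_solar - feedback * (T - T_ref)) / SIGMA) ** 0.25
    return T

# Same +5 W/m^2 forcing, without and with the toy feedback (2 W/m^2 per K):
dT_no_fb = equilibrium_T(244.4) - equilibrium_T(239.4)
dT_fb = equilibrium_T(244.4, feedback=2.0) - equilibrium_T(239.4, feedback=2.0)
print(round(dT_no_fb, 2), round(dT_fb, 2))  # the feedback shrinks the response
```

        The point of the sketch is only that a feedback needs exactly what the comment asks for – something that senses the changing variable (here, T) and produces a restoring term – and that such a term damps, but does not eliminate, the response to a forcing.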

      • Jeff,
        The discussion in this thread has not really gone to the issues of AGW or GCMs, but we (and you, up to the last paragraphs of the latest message) have discussed some basic physical processes and their consequences. I have used Trenberth-Fasullo-Kiehl numbers as a source for order of magnitude, as they are certainly in the right ballpark as averages, while possibly far from the correct values of any specific situation.

      As far as I can see there is not very much disagreement on the issues I have discussed here. Your comments have been largely in agreement, and so have those of Chief Hydrologist. Everyone has different views on how best to explain issues too complicated to discuss in full detail, but that is not disagreement on the basic understanding.

  52. BTW, I need a new keyboard, this one is really screwing up my typing. I hope you can make sense of that last comment. Oh, and the KT cartoon is just another example of older results being reused when newer results, the NASA cartoon, are glossed over.

    • Dallas – The Trenberth-Fasullo-Kiehl energy budget diagram and article demonstrating the primacy of back radiation as a contributor to ocean heat content and temperature was actually published slightly later in 2009 than the NASA cartoon. However, both articles state similar principles. The NASA article, together with that cartoon and others, states the following (I’ve italicized the relevant passages):

      “Virtually all of the energy that heats the Earth’s surface is then transferred to the atmosphere and to space by several different mechanisms. First, all surfaces radiate (give off) energy back through the atmosphere toward space. Also, heating from the Earth’s surface causes upward motion of the air above (convection) and changes the state of water from liquid to vapor form (evaporation).
      Convection, evaporation, and radiation from the surface exceed the total amount of energy that was absorbed by the surface to begin with! This is impossible unless there is a missing element of the energy budget. In fact, there is: the Earth’s atmosphere contains water vapor, carbon dioxide, and other greenhouse gases, which absorb energy radiated toward space and then emit some to space and some back to the Earth’s surface. Greenhouse gases are responsible for keeping the Earth’s temperature warm enough to support life as we know it. The exercises presented in the following pages are intended to explore what governs the amount of energy absorbed by the Earth’s surface”.

      Despite blogosphere disagreements, a dominant role for back radiation is not seriously controversial within the scientific literature itself, even with scientists who disagree about the magnitude of other effects.

      • “Greenhouse gases are responsible for keeping the Earth’s temperature warm enough to support life as we know it.”

        What is responsible for keeping the earth warm is the sun, coupled with the phase change of water on a massive scale, within a large, thick blanket of nitrogen and oxygen. Add to this gravitational and electromagnetic fields, ionizing radiation, people driving around in cars and heating their houses, cow farts and a lot of political hot air.

      • How does a “thick blanket of nitrogen and oxygen” help keep the earth warm?

      • Fred Moolten, 4/7/11, 11:02 am, week of 4/2/11 …

        How does a “thick blanket of nitrogen and oxygen” help keep the earth warm?

        According to the model, the atmosphere is well-mixed. Consequently the greenhouse gases, warmed by absorption of LW emissions from the surface, warm the nitrogen and oxygen by convection and conduction. Non-greenhouse gases are just as prone to warming as GHGs, except by radiation. Those non-GHG gases radiate just like all bodies with a temperature. The return radiation doesn’t come just from GHGs, but from the full atmosphere.

        What counts in the warming and cooling of these gases with respect to each other and the exchange with the surface ultimately comes down to the heat capacities of all the components. A key relationship turns out to be the heat capacity of the atmosphere, not the GHGs, compared to the heat capacity of the ocean. Note: that’s a key relationship in the physics, not in the GCMs.

      • “The return radiation doesn’t come just from GHGs, but from the full atmosphere”

        That statement is incorrect. There is minuscule radiation from O2 and N2 at temperatures and pressures in the atmospheric layers contributing radiant energy to the earth (troposphere and stratosphere), because their emissivity/absorptivity is close to zero. Almost 100 percent of the return radiation emanates from the greenhouse gases and almost none from O2 and N2.

      • Hi Fred,

        I have an observation about this statement you made:

        “There is minuscule radiation from O2 and N2 at temperatures and pressures in atmospheric layers contributing radiant energy to the earth (troposphere and stratosphere), because their emissivity/absorptivity is close to zero.”

        I accept your assertion and your reasoning. However I think it misses part of the story, at least from the perspective of Jeff Glassman’s previous comment. The missing part is that oxygen and nitrogen both have a very efficient indirect radiative heat transfer path to the earth. First, the O2 and N2 heat is transferred via conduction to water vapor. Water vapor then relays this heat via radiation to the earth. So you are correct that “Almost 100 percent of the return radiation emanates from the greenhouse gases and almost none from O2 and N2”. But because almost all the atmospheric mass is O2 and N2, almost all the transferred heat originates in the O2 and N2 molecules.

        As an aside, there is also O2 and N2 heat transferred indirectly to the oceans via conduction and mass transfer, first via conduction to the small liquid water droplets in sea spray above the ocean surface. Then these heated droplets recycle back into the ocean.

      • Hi willb – The N2 and O2 molecules that transmit energy to water or CO2 acquired their energy from water or CO2 in the first place – the heat didn’t originate within them or reach them via radiation from the surface (but see below regarding surface conduction). Therefore, the process you describe should have almost no net effect on temperature. As I suggested in a response to Pekka, the presence of N2 and O2 may actually have a net cooling effect, because these molecules scatter sunlight – some of it back to space – thereby reducing the amount of solar radiation that reaches the surface. The effect is a small one, and so I don’t think one can say that N2 and O2 do very much in any direction to influence temperature.

        Conduction plays a very minimal role in the Earth’s energy budget, but the surface/atmosphere conduction net direction is upward from surface to atmosphere. The collisional exchange of energy among atmospheric molecules occurs within such infinitesimally small distances that it can’t be considered conduction in the usual sense of the concept, even though it is an important means of homogenizing temperature within local regions, as Pekka has pointed out. However, conductive heat transfer from surface to air, while very small, is an example of the process you cite whereby N2 or O2 could serve as an intermediary in transferring surface heat to greenhouse gases to be radiated to space. This would be a cooling rather than a warming phenomenon. However, the reason why this is quantitatively inconsequential is that the altitude of the radiation would be so close to the surface as to make little difference vis-a-vis escape from the surface per se. There might also be a tiny warming aspect to it, because heat transferred via N2 or O2 to greenhouse gases will be radiated in wavelengths that can be intercepted by greenhouse gas molecules at higher altitudes, whereas if it left the surface via radiation rather than conduction, some would escape to space in “window” wavelengths not absorbable by the greenhouse gases – again the quantitative consequences would, I expect, be trivial.

        Thanks for the interesting comment.

      • Fred Moolten, 4/7/11, 22:20 pm, week of 4/2/11 …

        FM: The N2 and O2 molecules that transmit energy to water or CO2 acquired their energy from water or CO2 in the first place – the heat didn’t originate within them or reach them via radiation from the surface (but see below) regarding surface conduction). … As I suggested in a response to Pekka, the presence of N2 and O2 may actually have a net cooling effect, because these molecules scatter sunlight – some of it back to space – thereby reducing the amount of solar radiation that reaches the surface. The effect is a small one, and so I don’t think one can say that N2 and O2 do very much in any direction to influence temperature.

        In the ultimate first place, almost three fourths of the warming of N2 and O2 molecules, and then the CO2, is from H2O molecules warmed mostly by short wave radiation. Even though water vapor is the dominant greenhouse gas, SW radiation is not part of the greenhouse effect. Water vapor wears many hats.

        IPCC initializes its models with K&T’s energy budget, and in this budget the atmosphere absorbs 67 W/m^2 in the short wave. In the K&T (1997) diagram, the authors compute this SW absorption for a cloudy day; over half of it is due to H2O, and the remainder they split roughly between ozone and cloud “overlap effects”. This flux is separate from the (cooling) scattering to space, which K&T lump into their albedo figure of 77 W/m^2. A reasonable assumption is that K&T’s SW flux of 168 W/m^2 arriving at the surface includes any (warming) Rayleigh scattering from the atmosphere.

        Per the budget, SW absorption is substantially larger than the net longwave absorption of 26 W/m^2 (350 – 324). (K&T have a longwave window of 40 W/m^2, which might be better represented today as the LW transmitted through the atmosphere. Correcting for that 40 W/m^2 would not change the results.)

        The budget appears to give atmospheric absorption of SW a major surface cooling effect, but some unknown amount of that energy must return in the LW back radiation K&T attribute to the greenhouse effect. Regardless, this is a major energy exchange that is not part of the greenhouse effect:

        greenhouse gases … act as a partial blanket for the longwave radiation coming from the surface. This blanketing is known as the natural greenhouse effect. Bold added, AR4, FAQ 1.1.

        Defined as a blanket, the greenhouse effect is to warm, and the amount of warming is the back radiation. This is a likely reason for K&T to separate the back radiation instead of modeling the heat from the surface to the atmosphere or greenhouse gases. That back radiation, above, is 324 W/m^2, and comprises the greenhouse effect plus any deficit in the SW cooling. As a result, to keep the radiative balance, any reduction in SW cooling is compensated by a reduction in the greenhouse effect.

      • Jeff,
        The total absorption of SW in the atmosphere is about 80 W/m^2 while the absorption of LW from the Earth surface is about 360 W/m^2. Thus the absorbed SW is less than 20% of the radiative warming of the atmosphere.
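        The fraction checks out with these round numbers:

```python
# Checking the fraction: atmospheric SW absorption as a share of the
# total radiative (SW + LW) warming of the atmosphere, with the round
# numbers quoted above.
sw_absorbed = 80.0    # W/m^2
lw_absorbed = 360.0   # W/m^2
share = sw_absorbed / (sw_absorbed + lw_absorbed)
print(round(share, 3))  # 0.182, i.e. "less than 20%"
```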

      • Hi Fred,

        Thank you for your informative response to my comment about heat transfer in the atmosphere. You make some thought-provoking statements and I would like to examine some of them more closely. If you would permit me to edit your comment a bit:

        “The N2 and O2 molecules that transmit energy to water or CO2 acquired their energy from water or CO2 in the first place… Therefore, the process you describe should have almost no net effect on temperature. …so I don’t think one can say that N2 and O2 do very much in any direction to influence temperature.”

        Perhaps I’m misunderstanding exactly what you are saying, but this assertion seems counter-intuitive to me. Isn’t it true that heat energy from water vapor and CO2 that would otherwise be radiating out into free space is instead being transferred to the N2-O2 mass of the atmosphere? If the energy were radiating out into free space, it would result in a cooling of the atmosphere. Instead, the heat remains captured within the N2-O2 heatsink. My conclusion is that nitrogen and oxygen do play a significant role in increasing atmospheric temperature. Is this not so?

        I was also hoping you might provide a bit more explanation of what you mean by this statement:

        “The collisional exchange of energy among atmospheric molecules occurs within such infinitesimally small distances that it can’t be considered conduction in the usual sense of the concept…”

        I was under the impression that collisional exchange of energy is in fact the usual concept of conduction, at least on a microscopic scale.

      • willb – As you point out, energy absorbed by a GHG such as CO2 is “thermalized” – transmitted to neighboring molecules, mostly N2 or O2. In turn, as a function of their temperature increase, they excite other CO2 molecules to higher energy states capable of radiating to space or back to Earth. The net result is photon absorption followed by photon emission, with N2 and O2 as intermediaries and the absorbing and emitting molecules almost always being different. However, the net absorption/emission balance is what would happen in the absence of the intermediaries (and does to a significant extent at very high altitudes where molecules are more sparse). As Pekka pointed out, an atmosphere without N2 or O2 would be characterized by more unevenness, but average thermodynamic behavior should not be greatly altered by the fact that the absorbed energy remained in the CO2 rather than being more widely distributed (note that a definition of temperature includes the vibrational/rotational energy in excited molecules). In particular, since downward radiation was the original topic, this would not be discernibly altered.

        Regarding conduction, the point is probably a semantic one. At most altitudes and temperatures, the collisional interactions between GHG molecules and N2 and O2 occur at extremely high rates in extremely small distances, and so the absorption – transfer to N2 or O2 – excitation events don’t entail the typical calculations of thermal conduction rates that limit heat transfer when macroscopic distances are involved. I’m not sure what to call it, but I didn’t want it to be confused with macroscopic conductance. As an example, conduction of heat from the Earth’s surface to the overlying atmosphere is an inefficient process because unlike the atmospheric collisional events, it involves heat transfer from one location to a different one.
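        The claim that the N2/O2 intermediaries leave the steady-state emission unchanged can be illustrated with a toy two-reservoir model. This is not real radiative transfer – the flux, emissivity, and coupling rate are all arbitrary – but it shows the bookkeeping:

```python
# Toy two-reservoir model of thermalization (illustration only, not real
# radiative transfer): a "GHG" reservoir absorbs a constant flux P and
# emits in proportion to its energy; a collisional coupling k exchanges
# energy with an "N2/O2" reservoir that neither absorbs nor emits.
# All rates are arbitrary.

def steady_emission(k, P=1.0, emissivity=0.05, steps=100000, dt=0.02):
    E_ghg, E_n2o2 = 0.0, 0.0
    for _ in range(steps):  # crude Euler integration to steady state
        emit = emissivity * E_ghg           # radiative loss, GHG only
        exchange = k * (E_ghg - E_n2o2)     # collisional sharing
        E_ghg += dt * (P - emit - exchange)
        E_n2o2 += dt * exchange
    return emissivity * E_ghg

# With or without the intermediary reservoir, steady-state emission
# equals the absorbed flux P; only the energy partitioning differs.
print(round(steady_emission(k=0.0), 3), round(steady_emission(k=1.0), 3))
```

        In steady state the exchange term nets to zero, so emission must balance absorption whether or not the non-radiating reservoir is coupled in – which is the point made above about net absorption/emission balance.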

      • Pekka Pirilä, 4/7/11, 4:01 pm, week of 4/2/11 …

        PP: The total absorption of SW in the atmosphere is about 80 W/m^2 while the absorption of LW from the Earth surface is about 360 W/m^2. Thus the absorbed SW is less than 20% of the radiative warming of the atmosphere.

        Didn’t you just turn the atmosphere into a blackbody in the longwave, absorbing all the incident LW radiation?

        Or, did you figure the atmosphere was at 0K?

        Or, is your use of the word total for the SW but not for the LW significant?

        If you were to use the same numbers for the same things as used by K&T (1997) and adopted by IPCC in its Reports, an interested reader could follow our discussion by searching those papers. K&T separated the longwave flux into up, Fu = 390, and down, Fd = 324, components, with a net of 66 W/m^2 at the surface. See id, Figure 7, p. 206 (AR4, FAQ 1.1, Figure 1); Table 2, p. 201. Of that 66 W/m^2, K&T have 40 W/m^2 passing through an atmospheric window. The difference between Fu = 390 and the 40 W/m^2 passing through the window is 350 W/m^2, apparently corresponding to your 360 W/m^2. Neither K&T nor IPCC gives this 350 W/m^2 a name, but the diagram indicates it is the LW radiation incident on the atmosphere.

        The nameless 350 W/m^2 less Fd leaves a net absorbed of 26 W/m^2. Net radiation is what warms, and is, by definition in thermodynamics, radiant heat.

        When the temperature of the atmosphere reaches the temperature of the surface, we have thermal equilibrium, and the radiant heat ceases, as one would want. The radiation could be any value. In terms of power and energy, and the relative effects of SW vs. LW, what counts is the net radiation.

        I stand by my comparison of 66 W/m^2 for the SW and 26 W/m^2 for the LW.
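        For readers following along, the longwave arithmetic above can be checked directly from the K&T figures quoted in this thread, with a blackbody sanity check on the 390 W/m^2 surface flux:

```python
# The longwave bookkeeping from the K&T (1997) figures quoted above,
# plus a blackbody check on the 390 W/m^2 surface emission.

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

Fu = 390.0        # surface-up LW, W/m^2
Fd = 324.0        # back radiation, W/m^2
window = 40.0     # LW passing through the atmospheric window, W/m^2

incident_on_atmosphere = Fu - window            # the "nameless" 350 W/m^2
net_lw_absorbed = incident_on_atmosphere - Fd   # 26 W/m^2 net
print(incident_on_atmosphere, net_lw_absorbed)  # 350.0 26.0

# 390 W/m^2 is very nearly what a blackbody at 288 K (about 15 C) emits:
print(round(SIGMA * 288.0 ** 4, 1))  # 390.1
```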

      • Fred, you say:
        “However, the net absorption/emission balance is what would happen in the absence of the intermediaries [N2 and O2]…”
        “In particular, since downward radiation was the original topic, this would not be discernibly altered.” (I presume in the presence or absence of N2 and O2?)

        I’m not saying that your above statements are wrong but I’m not ready just yet to accept them at face value.

        Where I live the temperature usually drops by between 10 and 20 deg F going from day to night, and a 25 degree delta is not uncommon. Clearly some form of heat transfer is occurring over this period (assuming latent heat flux is not a part of this process). Since almost all the atmospheric mass is O2 and N2, almost all the transferred heat is lost from the O2 and N2 molecules. I’ll make the conjecture that the heat transfer mechanism in this case is indirect radiative transfer:
        – First, the O2 and N2 heat is transferred via conduction (molecular collision) to water vapor and CO2;
        – Water vapor and CO2 then relay this heat away via radiation.

        Some of this radiation goes up, and some goes down to be absorbed by the earth and sea. But none of this radiation would be present if the N2 and O2 were absent. So on the face of it, downward radiation appears to be greatly altered by the presence of N2 and O2.

        So am I missing something? Is my conjecture unreasonable? How do I reconcile it with your statements above?

      • willb – I don’t quite follow your logic, but the day/night difference is not conceptually complicated. During the day, the surface gains heat from the combination of solar radiation and downward IR radiation from GHGs, while shedding energy upward at a rate less than the combined solar/IR downward effect. At night, there is a net loss because the solar component is gone, which is why the surface cools. In both cases, the surface is radiating IR upward. How much of this escapes to space without interception and redirection depends on the GHG concentration, but is independent of N2 and O2, which simply shift energy from one GHG molecule to another but don’t add or subtract energy in the process.

      • Jeff,
        There is no end to the semantic disagreement related to words such as “warming”. When and in which order different contributions should be netted cannot be determined by any binding logic.

        I’ll not continue to argue these issues, but I’ll tell you how I used the quantities. I used the total for the SW, because all SW comes from the sun and is absorbed from that component only. The situation is very different for LW, which is emitted both from the surface and within the atmosphere. The total sum of all these LW components is very large and hardly known to anybody (it could be calculated, but I doubt anybody has done that). For LW I calculated the part that originates from the surface and does not pass directly to space. I didn’t see any reason to net against it the radiation that leaves the atmosphere, hitting the surface or escaping to space, as the atmosphere emits that radiation independently of the mechanism that heats it.

        This way of splitting the energy flows seems reasonable to me, but there seem to be as many opinions on the right way (or more, recalling Keynes’ comment on economists) as there are people discussing this issue.

      • Fred,
        Sorry for the convoluted logic in my post of April 9, 2011 at 1:49 pm. All I’m really trying to say is that:
        – The N2-O2 mass cools at night.
        – The cooling is the result of heat transfer from N2-O2 to the GHG’s, which then radiate this heat away.
        – Some of the radiation is downward, contributing to the heat content of the sea and ground.
        – If the N2-O2 mass did not exist, neither would this radiation nor its contribution to surface heat content.
        – The radiation (and surface heat) is therefore not independent of the N2-O2 mass.

        Almost everything you say makes sense to me except when you talk about the role N2 and O2 play in atmospheric heat transfer. In your post on April 9, 2011 at 2:04 pm you say:
        “How much of this escapes to space without interception and redirection depends on the GHG concentration, but is independent of N2 and O2, which simply shift energy from one GHG molecule to another but don’t add or subtract energy in the process.”

        You say this as if the GHG’s are somehow doing more than “shifting energy”. But the way I see it that’s really all the GHG’s are doing as well. They are absorbing, emitting and scattering IR radiation. Just as with N2 and O2, they are not “adding or subtracting” any energy. All of the energy of interest originates only with the sun.

      • N2 and O2 neither absorb nor emit to any meaningful extent, so they can’t redirect energy, which is what the greenhouse effect is all about. For practical purposes, the atmosphere can be treated as though they didn’t exist except for the minor points I’ve already discussed. If you want to email me for further discussion, that would be fine.

      • Fred – So you claim my reasoning is faulty but you refuse to tell me why. Interesting. I guess we’ll just have to leave it at that.

      • willb,
        Most of your points are correct, but some issues remain.

        – If the N2-O2 mass did not exist, neither would this radiation nor its contribution to surface heat content.
        – The radiation (and surface heat) is therefore not independent of the N2-O2 mass.

        A very thin atmosphere would cool much more during the night, so its radiation would weaken much faster. The diurnal variations at low altitudes might then be similar to what they are now at high mountain tops. The total over 24 hours would not necessarily change much.

        You say this as if the GHG’s are somehow doing more than “shifting energy”. But the way I see it that’s really all the GHG’s are doing as well. They are absorbing, emitting and scattering IR radiation. Just as with N2 and O2, they are not “adding or subtracting” any energy. All of the energy of interest originates only with the sun.

        The radiation moves energy from one place to another. The role of N2 and O2 is mostly local. Their heat capacity is, however, important for the strength of convective heat transfer.

        The atmosphere would be very different without N2 and O2, but the heat transfer properties of the atmosphere are affected surprisingly little by its principal components. The minuscule amounts of GHG’s have a much stronger influence on the energy transfer.

      • willb – You are still welcome to email me (you can find my email address in the denizens thread), but I believe if you read through my explanations, you should be able to understand why N2 and O2 don’t contribute to atmospheric or surface heating. Your responses have been polite, but your persistence causes me to wonder whether you are really asking a question or simply trying to argue against current principles of atmospheric radiative transfer.

        You will find the same principles on almost any site you visit that discusses greenhouse effects, including the principle that only the greenhouse gases participate in the heating/cooling process. Perhaps I’ve not articulated them adequately, in which case you should visit the other sites, where you will find the same thing.

        (As a side note on earlier comments, Pekka is right that N2 and O2 exert stabilizing effects without contributing to warming, but I’ll leave you to review all the comments to figure out why an Earth without those gases would probably be slightly warmer than with them – i.e., why their net effect is a slight cooling. Good luck.)

      • Fred Moolten 4/10/11 11:12 am, week of 4/2/11 …

        You say, “You will find the same principles on almost any site you visit that discusses greenhouse effects, including the principle that only the greenhouse gases participate in the heating/cooling process.”

        Of course, only greenhouse gases participate in the greenhouse effect.

        However, water, while the dominant greenhouse gas, is the principal absorber of short wave radiation, accounting for three fourths of radiating heat to the atmosphere. That is not a greenhouse effect. The greenhouse effect is defined only in the longwave region.

      • To Jeff- Yes, water absorbs some incoming solar radiation, mainly in the near IR.

        To willb – Regarding small changes in heating/cooling that might result from the absence of N2 and O2, the reduction in light scattering would result in slightly more surface warming. However, I should also have mentioned an opposing effect – lesser line broadening of CO2 and H2O absorption bands (so-called “pressure broadening”). I’m not sure which would dominate, but this has nothing to do with the principle that N2 and O2 do not capture heat that would otherwise escape. That is a capacity limited to the greenhouse gases.

      • Pekka,
        Thank you for your reply.

        “A very thin atmosphere would cool much more during the night. Thus the radiation would weaken much faster during the night.”

        May I assume from this that, at least during the period of the day-to-night transition, the atmosphere would be a lot cooler without the N2-O2 mass? And that during this nighttime period there would be less radiative heat transfer to the earth’s surface than would otherwise occur?

        You say that the diurnal variations will probably average out to the same value regardless of whether N2 and O2 are present. If the atmosphere and surface were significantly colder at night without N2 and O2, then they would have to attain a significantly higher temperature during the day to compensate. I guess my question at this point would be, if N2 and O2 were absent, what would be the mechanism for getting to this higher temperature during the day?

        Regarding your second point about the local heat transfer role of N2 and O2, Fred has made this point also. I completely accept and agree with it. But with Fred I was trying to make the point that despite the localized nature of heat transfer for N2-O2, there is still substantial indirect interaction with the surface via an indirect heat transfer path.

      • willb,
        Giving precise answers to your questions would require a significant amount of calculations on a hypothetical, non-existent atmosphere. My answers have been qualitative and based on physical principles. On that basis I cannot say much more.

        I add only that there is enough radiation during daytime to heat a thin atmosphere above the present temperature range. A low density and the resulting low heat capacity act equally strongly in both directions. The average would change much less, but it’s impossible to say in which direction without a full analysis.

      • willb – Heating and cooling of N2 and O2 do not affect the Earth’s energy balance except in the very minor ways I’ve already described, because they can’t trap or emit radiation under atmospheric conditions. These molecules shift heat from one greenhouse gas molecule to another, but don’t intercept heat that would otherwise escape to space.

        Do they substantially alter day/night variations even though the overall average is unaffected? I don’t know the quantitation, but I suspect the effect is minor – perhaps Pekka disagrees. Radiative equilibrium in the atmosphere is almost instantaneous, because the heat capacity of the atmosphere is small and heating and cooling following molecular absorption/emission is an event localized to tiny regions in the vicinity of individual greenhouse gas molecules. The diurnal variations are therefore dictated almost exclusively by the presence or absence of solar radiation, and their pace at night is dictated, as the main rate-limiting step, by the pace of IR emission from the surface combined with interception and back radiation by greenhouse gas molecules (including cloud water). Convective adjustments occur over days and weeks.

        Here is one reasonable description of Greenhouse Gases.

        Note the section that states “Although the Earth’s atmosphere consists mainly of oxygen and nitrogen, neither plays a significant role in enhancing the greenhouse effect because both are essentially transparent to terrestrial radiation. The greenhouse effect is primarily a function of the concentration of water vapor, carbon dioxide, and other trace gases in the atmosphere that absorb the terrestrial radiation leaving the surface of the Earth (IPCC 1996).”

        I believe you can find similar descriptions on many other sites.

        Perhaps I shouldn’t intervene any more, because I have the impression you are determined not to believe this (which is perhaps why you addressed Pekka rather than me even though he and I don’t disagree about overall average effects). That’s fine, but if you want to understand these processes in greater detail, I remain available via email to contribute what I can.

      • Fred,
        I wasn’t trying to be obstinate with you, I was just trying to explain why I found some of your assertions counter-intuitive.
        I used the scenario of heat transfer during a day-to-night transition as being the easiest way to explain my viewpoint. I certainly wasn’t trying to argue against current principles of atmospheric radiative transfer and frankly I’m a bit surprised you would say that.

        Look, I gave you four premises and a conclusion. Each of the premises is very simple and falls well within the bounds of current scientific principles. If the conclusion is valid, it too will fall within the bounds of current scientific principles. Standard operating procedure would be for you to identify a faulty premise or show that the conclusion does not follow logically from the premises. This is in fact what I expected you to do, something like what Pekka Pirilä did above. But if you don’t feel like discussing this further, please don’t feel obligated. I’ve already taken up quite a bit of your time with this.

      • Fred Moolten, 4/10/11, 12:05 pm, week of 4/2/11 …

        Just to keep things in perspective, let’s not minimize atmospheric short wave absorption. It’s triple the absorption due to the vaunted greenhouse effect; about half of that SW absorption is directly due to water vapor, and variably more comes indirectly from cloud effects. K&T (1997), Table 4, p. 204.

        The atmosphere absorbs 26 W/m^2 due to the greenhouse effect out of 195 W/m^2, which is 13%. K&T (1997), Figure 7, p. 206; AR4, FAQ 1.1, Figure 1. Of that, CO2 contributes 26% (K&T (1997), p. 206), or 3.5% of the total atmospheric warming, or 6.8 W/m^2.

        What happens if we double the CO2? But wait, wait! Double your order and you pay only additional S&H [Scare & Hype]. IPCC says the radiative forcing for a doubling of CO2 is +3.7 W/m^2. AR4, ¶2.3.1, p. 140. This is enhanced (to improve something or to make it more attractive or more valuable, http://www.macmillandictionary.com/dictionary/british/enhance ) by a factor between 1.5 and 2 by water vapor feedback. 4AR, ¶, p. 631.

        Contrasting 3.7 W/m^2 for a doubling with 6.8 W/m^2 since 1750 is an interesting exercise. Regardless, the 3.7 W/m^2 is predictably wrong because IPCC models don’t close the powerful negative cloud albedo feedback through the parameter of cloud cover. The figure of 3.7 W/m^2 could be a tenth as large due to an undetectable change in cloud albedo. It’s beginning to appear from ERBE that the actual sensitivity may be 25% as large, or about 0.9 W/m^2.
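The percentages in the comment above can be checked with a few lines of Python. All inputs are the K&T (1997) and AR4 figures as quoted; this verifies only the arithmetic, not the attributions.

```python
# Check of the K&T (1997) figures quoted above (arithmetic only).
atm_warming_total = 195.0  # W/m^2, total atmospheric warming (K&T Fig. 7)
ghg_lw_absorbed = 26.0     # W/m^2, LW absorbed via the greenhouse effect
co2_share = 0.26           # CO2's quoted share of the 26 W/m^2

ghg_fraction = ghg_lw_absorbed / atm_warming_total  # ~13%
co2_flux = co2_share * ghg_lw_absorbed              # ~6.8 W/m^2
co2_fraction = co2_flux / atm_warming_total         # ~3.5%

# AR4 2xCO2 forcing, scaled by the quoted 1.5-2x water-vapor feedback
forcing_2x = 3.7                                    # W/m^2
enhanced = (1.5 * forcing_2x, 2.0 * forcing_2x)     # ~5.6 to ~7.4 W/m^2

print(round(100 * ghg_fraction, 1),  # 13.3
      round(co2_flux, 1),            # 6.8
      round(100 * co2_fraction, 1))  # 3.5
```

The quoted 26%-of-26-W/m^2 and 3.5%-of-total figures are mutually consistent.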

      • Pekka,
        I wasn’t really looking for anything more than a qualitative answer to my question, which you kindly provided:
        “…there is enough radiation during daytime to heat a thin atmosphere above the present temperature range. A low density and the resulting low heat capacity act equally strongly in both directions.”

        I see this as quite plausible. However, my devil’s advocate comment to this would be that the thin atmosphere that we are speculating about is composed entirely of GHG’s. This imaginary GHG atmosphere contains exactly the same quantities of water vapor and CO2 as our actual “high density” atmosphere. Perhaps the GHG atmosphere does not need the N2-O2 heat capacity to prevent its temperature from rising above the present range? Perhaps the GHG’s would be able to maintain the same daytime temperature as our actual atmosphere by radiating the heat away, just as they do now?

      • willb,
        One more point.

        Without convection we would not be limited by the adiabatic lapse rate, but the greenhouse effect would be much stronger. The surface would be about 30 degrees hotter than it is in reality. With a very thin atmosphere the apparent temperature that corresponds to the radiative heat transfer and the excitation levels of vibrational modes would be partially decoupled from the temperature related to the thermal motion of the molecules. It would, however, not be decoupled from the temperature of the surface.

        The most likely consequence of a very thin atmosphere of GHG’s without N2 and O2 is a much higher average surface temperature, but without a full analysis of the convection in the thin atmosphere I dare not say that this conclusion is true. The vertical density profile would also be different, and the adiabatic lapse rate would be different, because the Cp of H2O is significantly larger than that of N2 and O2.

        There are many differences, but what would not change as long as the amounts of all GHG’s are the same is the total transmissivity of the atmosphere for IR from the surface. This is the most important single factor influencing the strength of the greenhouse effect. Other factors modify the result significantly.

        The most important modification is the reduction of the effect by convection. Convection offers a parallel heat transfer mechanism to cool the surface by coupling the surface to the top of the troposphere, which can radiate effectively to free space. Thus convection weakens the insulation offered by the troposphere, but would not have any effect (or even exist) without the strong radiative cooling of the top of the troposphere. To have a high lapse rate we need an atmosphere heated at the bottom, cooled at the top, and with a strong absorption and emission of IR at all altitude levels. This combination would lead to a larger lapse rate than is stable. The lapse rate is then forced down to the adiabatic lapse rate by convection.
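Pekka’s remark that the larger Cp of H2O would change the adiabatic lapse rate can be made concrete with the standard dry formula Γ = g/cp. The water-vapor cp below is an approximate textbook value, used only for illustration.

```python
# Dry adiabatic lapse rate Gamma = g / c_p: ordinary N2-O2 air versus a
# hypothetical pure-water-vapor atmosphere. Larger c_p -> smaller lapse rate.
g = 9.81          # m/s^2
cp_air = 1004.0   # J/(kg K), dry air
cp_h2o = 1865.0   # J/(kg K), water vapor (approximate)

lapse_air = g / cp_air * 1000.0  # K per km, ~9.8
lapse_h2o = g / cp_h2o * 1000.0  # K per km, ~5.3
```

So an all-GHG water-vapor-rich atmosphere would, other things being equal, have a noticeably smaller adiabatic lapse rate than our N2-O2 one.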

      • The heat capacity of N2 and O2 makes the temperature of the atmosphere a bit more stable than it would be otherwise. N2 and O2 also strongly affect the collision frequency in the atmosphere and contribute to the maintenance of an equilibrium level of occupation for the vibrational and rotational excitations of CO2 and other GHG’s. Reducing their amount to one half or one third while keeping the amount of GHG’s constant would have very little influence on the surface temperature, as much less than the present amount is needed to maintain the equilibrium in the atmosphere.

        The Greenhouse Effect is almost totally due to the GHG’s, while the role of the other gases is small and indirect as explained above.

      • I believe at least one of the effects of removing N2 and O2 would be a warming of the surface due to the reduction in the component of albedo that comes from Rayleigh scattering.

        Fred Moolten, 4/7/11, 12:01 pm, week of 4/2/11 …

        Pekka Pirilä, 4/7/11, 12:18 pm, …

        I was in error, and your criticisms are well-taken.

        Some side observations:

        1. I also confess to giving too much credit to heat capacity. The heat capacity of the atmosphere is quite sensitive to the water vapor content.

        2. I would not have assumed that emissivity equals absorptivity. That is an a priori condition valid only at thermal equilibrium. The atmosphere is never in thermal equilibrium. Because science has no scale for equilibrium, the question of how close the atmosphere is to equilibrium has no answer. That observation also means the difference between emissivity and absorptivity is unknown.

        3. I almost agree with Fred’s statement, There is miniscule radiation from O2 and N2 at temperatures and pressures in atmospheric layers contributing radiant energy to the earth (troposphere and stratosphere), because their emissivity/absorptivity is close to zero. I would have added “in the longwave band”. The sentence taken as a whole is a tautology. Nonetheless, oxygen has strong bands in the UV that surely “help” by contributing some energy to the ocean, which is outside the domain of the greenhouse effect.

        4. I am puzzled by Pekka’s criterion of the maintenance of the equilibrium in the atmosphere. Undoubtedly that’s what the GCMs do because most of these problems have no known solution except at equilibrium. Hence these models put the surface layer of the ocean in equilibrium. Mechanics has principles like least work, and the laws of conservation, that help determine the response of a system. Thermodynamics has no equivalent. The Second Law and the maximum entropy principle would be helpful, provided the Sun would benignly burn out.

        5. The climate is stable at some minimum temperature, depending only on its internal thermal energy. But it has no known preferred state under solar radiation. The assumed GCM initial condition of the K&T energy budget is neither a preferred state nor a unique one. This makes the climate quite sensitive to solar radiation.

        6. Here again is a model for the climate that fits all the data. In the present warm state, climate is determined (1) by the mean and fine structure of solar radiation, (2) by Earth’s response to that radiation through cloud albedo and the oceans, and (3) by Earth’s energy loss to space. For the latter, the mean greenhouse effect is crucial, but variations in the greenhouse effect are regulated by the slow temperature feedback of cloud albedo acting through the hydrological cycle. Consequently, climate sensitivity is much less than IPCC estimates from its open-loop greenhouse effect. IPCC’s greenhouse effects are greatly exaggerated, and the answer to Fred’s question lies elsewhere than in a greenhouse analysis. Any climate effects of a thick blanket of O2 and N2 would be subject to the same mitigation by the same closed-loop gain.

      • Jeff,
        The equilibrium I was referring to concerns the local temperature of the gas and the occupation level of the vibrational excitations of the CO2 molecules. This occupation level has an equilibrium value determined by the temperature as long as the thermal excitation and de-excitation proceed much more rapidly than emission and absorption of 15 um radiation. This condition is satisfied when the density of the atmosphere is not too low, and N2 is the most important molecule due to its large share of all molecules.

        This condition is not satisfied in the upper stratosphere where the radiative energy transfer is partially decoupled from the thermal motion of the molecules.

      • John Carpenter


        How much of the LW radiation energy absorbed by CO2 eventually dissipates as microwave radiation from excited rotational states of H2O that are populated by collisions with N2, O2, CO2 and other H2O? Isn’t this a viable mechanism for radiative heat loss also?

      • The role of microwave radiation is minuscule due to the small energy of microwave “photons”. It is strong enough for serving observational purposes, but has practically no significance for the energy balance. I don’t have the numbers, but the above applies already to the longest wavelengths of IR and more strongly to the microwave. This can be seen from the shape of Planck’s law (http://en.wikipedia.org/wiki/Planck%27s_law).
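Pekka’s point about the small energy carried at microwave wavelengths can be illustrated directly from Planck’s law. The 255 K temperature and the two single wavelengths below (15 µm thermal IR vs. 1 cm microwave) are illustrative choices for a point comparison, not a band-integrated calculation.

```python
import math

# Planck spectral radiance B(lambda, T) in W / (m^2 sr m).
h, c, k = 6.626e-34, 2.998e8, 1.381e-23  # SI constants (approximate)

def planck(lam, T):
    # expm1 computes exp(x) - 1 accurately even for small x (microwave case)
    return (2.0 * h * c**2 / lam**5) / math.expm1(h * c / (lam * k * T))

T = 255.0                # K, illustrative atmospheric emission temperature
b_ir = planck(15e-6, T)  # thermal IR, near the CO2 15 um band
b_mw = planck(1e-2, T)   # 1 cm microwave
ratio = b_ir / b_mw      # many orders of magnitude in favor of the IR
```

The per-wavelength radiance at 15 µm exceeds that at 1 cm by roughly ten orders of magnitude, which is why microwave emission is observationally useful but energetically negligible.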

      • Jeff Glassman – “A key relationship turns out to be the heat capacity of the atmosphere, not the GHGs, compared to the heat capacity of the ocean. Note: that’s a key relationship in the physics, not in the GCMs.”

        Equilibrium temperatures of the Earth and atmosphere are dictated by the GHGs in relationship to incoming solar energy as well as planetary albedo. Heat capacity doesn’t determine this but rather the rate at which equilibrium is approached (very quickly in the atmosphere and very slowly in the oceans). In GCMs, oceanic and atmospheric heat capacity are not ignored but are critical parameters incorporated into the models.

      • Rob Starkey

        Obviously not very well yet, since the models are not working very well.

      • That is an interesting statement though. I would have worded it differently, “A key relationship turns out to be the heat capacity of the atmosphere and water vapor, not the other GHGs, compared to the heat capacity of the ocean. Note: that’s a key relationship in the physics, not fully incorporated in GCMs.” Stated that way it describes the uncertainty as I see it. The oceans are 70% of the surface and what, 80 something percent of the relevant thermal mass/heat capacity?

      • Rob – The models perform differently depending on spatial and temporal extent. Their skill is good for multidecadal global warming, less good for shorter intervals or smaller spatial scales, and rather poor for interannual short term internal variations such as ENSO. One reason for their better performance at longer timescales is that they even out short term deviations in ocean warming (i.e., 10 years or less), and so the longer term changes reflective of total ocean heat capacity dominate the projections.

      • Fred Moolten, 4/7/11, 12:53 pm, week of 4/2/11 …

        FM: Equilibrium temperatures of the Earth and atmosphere are dictated by the GHGs in relationship to incoming solar energy as well as planetary albedo.

        Agreed, if by equilibrium you mean steady state and not thermodynamic equilibrium. However, then you say, “Heat capacity doesn’t determine this but rather the rate at which equilibrium is approached (very quickly in the atmosphere and very slowly in the oceans).”

        I had said, “A key relationship turns out to be the heat capacity of the atmosphere, not the GHGs, compared to the heat capacity of the ocean.” Here’s support, based on observations, not modeling:

        The ocean plays an important role in climate and climate change. The ocean is influenced by mass, energy and momentum exchanges with the atmosphere. Its heat capacity is about 1000 times larger than that of the atmosphere and the ocean’s net heat uptake is therefore many times greater than that of the atmosphere. Global observations of the heat taken up by the ocean can now be shown to be a definitive test of changes in the global energy budget. Changes in the amount of energy taken up by the upper layers of the ocean also play a crucial role for climate variations on seasonal to interannual time scales, such as El Niño. Citation deleted, AR4, Technical Summary, ¶TS.3.3 Changes in the Ocean: Instrumental Record, p. 47.

        Heat capacity does determine the rates. But to have the atmosphere approach “equilibrium” quickly and the ocean slowly, don’t you have to decouple the atmosphere from the ocean?

        Because of its large heat capacity, ocean heat storage largely controls the time-scales of variability to changes in the ocean-atmosphere system, including the time-scales of adjustment to anthropogenic radiative forcing. The ocean is coupled to the atmosphere primarily through the fluxes of heat and fresh water which are strongly tied to the sea surface temperature, and also through the fluxes of radiatively active trace gases such as CO2 which can directly affect the atmospheric radiation balance. Citations deleted, bold added, TAR, ¶7.3, Oceanic Processes and Feedbacks, p. 435.

        The response time of the strongly coupled surface-troposphere system is, therefore, slow compared to that of the stratosphere, and mainly determined by the oceans. AR4, GLOSSARY, p. 951.

        FM: In GCMs, oceanic and atmospheric heat capacity are not ignored but are critical parameters incorporated into the models.

        Why don’t the IPCC reports provide those numbers in the ratio of about 1000:1?
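The ~1000:1 ratio quoted from AR4 can be sanity-checked in a couple of lines. The masses and specific heats below are textbook approximations, not values taken from AR4.

```python
# Sanity check of the ~1000:1 ocean-to-atmosphere heat capacity ratio.
m_ocean = 1.4e21   # kg, total ocean mass (approximate)
c_ocean = 3990.0   # J/(kg K), seawater
m_atm = 5.3e18     # kg, total atmospheric mass (approximate)
c_atm = 1004.0     # J/(kg K), air at constant pressure

ratio = (m_ocean * c_ocean) / (m_atm * c_atm)  # on the order of 1000
```

The result lands right around 1000, consistent with the AR4 Technical Summary figure quoted above.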

        Many of the experiments IPCC reports use AGCMs, Atmospheric Global Circulation Models, run with a specified SST forcing. Those results treat the ocean as a temperature source, i.e., supplying heat with no change in temperature, which by definition is infinite heat capacity.

        [W]hen an atmospheric model is run with specified SSTs, the fluxes are reversed in sign, showing the forcing of the atmosphere from the now infinite heat capacity of the ocean (implied by specified SSTs). TAR, ¶7.6.1 Ocean-atmosphere Interactions, p. 451.

        Doesn’t the following reduced SST response imply that the AOGCMs have an unrealistically high heat capacity?

        It should also be noted that the AOGCMs used here have relatively thick top oceanic grid boxes (typically 10 m or more), limiting the sea surface temperature (SST) response to frequent coupling. Citation deleted, AR4, ¶8.2.6, Coupling Advances p. 607.

      • Jeff – I don’t have time right now to try to address every one of your points, but I do want to agree with one of your major premises – the term “equilibrium” is often used loosely to refer to what is better described as a steady state, or even a state that is not completely steady and which might be called “quasi equilibrium”, simply meaning that the system does not appear to be changing discernibly, but not that net fluxes among all components are zero. In fact, as I understand the evidence, the time required to reach “quasi equilibrium” depends on the depth of the ocean layers whose coupling to the atmosphere we are analyzing. Apparent coupling can therefore vary from years to centuries, with the former representing a climate that is changing slowly, and the latter a climate that is changing even more slowly in response to a single applied forcing. In other words, “coupling” does not refer to a completely unchanging relationship, but rather to the response of one part of the system to another part.

        In a circumstance when solar or GHG forcing continues to change, the dynamics are complicated, but the simpler case when a single forcing is applied without further change – for example a positive forcing – is easier to describe. In that circumstance, atmospheric radiative temperature increase is almost instantaneous, and further convective adjustments are also rapid (days to weeks). This atmospheric temperature will now rise only slowly as the surface (mainly the upper ocean) gradually warms so that it is in near equilibrium with the atmosphere. The time constant for this appears to be in the order of a decade or two. After that, temperature will remain even more stable, but not completely so, because the additional heat in the upper ocean is gradually distributed into the much larger deeper ocean over the course of centuries. Only then, would a true steady state exist in which no further temperature trend would occur without some new perturbation (some internal oscillations might still occur).

        With continuing changes in forcing, on the other hand, temperature continues to change, but one can calculate time constants for the intervals over which a given change in forcing is matched in the climate system by a commensurate change in temperature, which I gather is done by an analysis of autocorrelation. Typically, the constants for upper ocean response time to atmospheric forcings again appear to be in the order of a decade or two, although these are really lag times between variables in the atmosphere and ocean rather than times needed for complete equilibration involving the deep ocean.
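The fast upper-ocean adjustment and slow deep-ocean equilibration described above can be sketched as a toy two-box energy-balance model. Every parameter below (forcing, restoring strength, exchange coefficient, layer heat capacities) is an illustrative guess chosen only to reproduce the qualitative timescales; this is not any published model.

```python
# Toy two-box (upper ocean / deep ocean) energy-balance sketch under a
# step forcing: fast decadal upper-ocean response, slow century-scale
# deep-ocean equilibration. All parameters are illustrative guesses.
F = 3.7     # W/m^2, step forcing (2xCO2-like)
lam = 1.2   # W/m^2/K, radiative restoring strength
gam = 0.7   # W/m^2/K, upper-to-deep ocean heat exchange
Cu = 13.0   # W yr m^-2 K^-1, ~100 m upper-ocean layer
Cd = 120.0  # W yr m^-2 K^-1, deep ocean

dt, T, Td = 0.1, 0.0, 0.0
history = []  # (time in years, upper-ocean T, deep-ocean T)
for step in range(1, int(500 / dt) + 1):
    dT = dt * (F - lam * T - gam * (T - Td)) / Cu    # upper-ocean budget
    dTd = dt * gam * (T - Td) / Cd                   # slow deep-ocean uptake
    T, Td = T + dT, Td + dTd
    history.append((step * dt, T, Td))

T_eq = F / lam  # full-equilibrium warming, ~3.1 K, approached only slowly
```

With these made-up numbers the upper box reaches most of its quasi-equilibrium within a couple of decades, while the deep box is still catching up centuries later, matching the qualitative picture in the comment.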

      • Concerning the different time scales of the atmosphere and the oceans and the related dynamics, Isaac Held’s Blog is a good place to start.

    • You guys need another reply thingy. Here is Pekka’s last:
      “The total absorption of SW in the atmosphere is about 80 W/m^2 while the absorption of LW from the Earth surface is about 360 W/m^2. Thus the absorbed SW is less than 20% of the radiative warming of the atmosphere.”

      BTW, 360 W/m^2 absorption plus 80 W/m^2 for SW sure sounds like a lot. Now if you used the net it would make sense, but 440 W/m^2 is a little confusing. Oops! That was my point to begin with.

      • Pekka said, Ref SW versus LW

        “I didn’t see any reason to net from that radiation that leaves the atmosphere hitting the surface or escaping to the space as atmosphere emits this radiation independently of the mechanism that heats it.”

        By the same token, 170 W/m^2 SW is absorbed by the ocean plus ~333 W/m^2 from the clouds. Since there is no reason to net, nearly 2/3 of something in the ocean is due to longwave from the atmosphere. Some people might even believe that 2/3 of the warming of the ocean is due to the atmosphere. If I could only get my gross income to equal my net.

      • Dallas,
        There is a specific reason to handle the ocean differently from the atmosphere. The reason is that in the ocean the energy balance of the skin is very different from the energy balance of the rest of the ocean, while there are no similar strong differences between various parts of the troposphere.

      • I have tried to describe how a physicist approaches these questions. He tries to find a way of combining the various factors so that the dominant physical processes and relationships can be understood most easily.

      • Actually, thinking about the communications issues that have been brought up in past posts, what warms the atmosphere?

        16% SW to atmosphere, 3% SW to clouds, 23% latent from surface, 7% conduction/rising air for 49% of 342 or ~167 W/m^2 via something other than LWR. If 360 W/m^2 gross is used, then more than half of atmospheric warming is due to LWR.

        If the net LWR is used, ~15% of the total energy from the sun is converted to LWR (~51 W/m^2) that is absorbed by the atmosphere from the Earth’s surface. Of the ~218 W/m^2 total net energy absorbed by the atmosphere, the ~51 W/m^2 via LWR is ~23.3%.

        By using the net, 15% LWR, 23% latent and 7% conduction would give you the ratio of current surface generated contributors to atmospheric warming at current Earth conditions.

        Now how would those percentages change if the surface air temperature were 3 degrees warmer? While you have no need to net LWR, somewhere along the line to determine warming you have to, or it is meaningless.
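Dallas’s net-versus-gross bookkeeping can be re-run directly, using only the percentages and the ~51 W/m^2 net LW figure quoted in the comment above.

```python
# Redoing the net-vs-gross arithmetic with the quoted percentages
# (16% SW to atmosphere, 3% SW to clouds, 23% latent, 7% thermals)
# against 342 W/m^2 incoming solar, plus ~51 W/m^2 net surface LW.
solar_in = 342.0
non_lwr = (0.16 + 0.03 + 0.23 + 0.07) * solar_in  # ~167 W/m^2, non-LWR input
net_lwr = 51.0                                    # W/m^2, net LW from surface
total = non_lwr + net_lwr                         # ~218 W/m^2

lwr_share = net_lwr / total                       # ~23.3% of the net total
```

The ~167 W/m^2, ~218 W/m^2 and ~23.3% figures in the comment all check out on a net basis.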

      • Dallas 4/8/11, 4:59 pm, week of 4/2,

        You asked, what warms the atmosphere?

        K&T (1997) provide the values below in W/m^2, symbols added. These are the parameters and values used by IPCC, and the basis by which AGW is claimed to exist. Consequently, nobody else’s baseline numbers are worth a fig, unless they show an IPCC error, of course.

        The numbers in the square brackets are my preliminary estimates for an upcoming paper with T = 291K (+3º), using the K&T method with reasonable parameter adjustments for humidity dictated by Clausius-Clapeyron. Note that K&T refers to Sa as absorbed, but not Ieg, which they diagram merely as incident. They label Ige inconsistently and incorrectly as absorbed when it, too, should have been incident.

        Sun short wave absorbed, Sa = 67 (34%); [77 (33%)]
        Earth thermals, Th = 24 (12%); [29 (13%)]
        Earth evapotranspiration, Ev = 78 (40%); [94 (41%)]
        Earth incident, Ieg = 350; [406]
        Greenhouse Gases incident, Ige = -324; [-375]
        Earth longwave absorbed, La = Ieg – Ige = 26 (13%); [31 (13%)]
        Atmosphere in (warming), Ain = 195 (100%); [231 (100%)]
        Cloud radiation to space, Icz = -30 (15%); [-36 (16%)]
        Gas and aerosol radiation to space, Iaz = -165 (85%); [-195 (84%)]
        Atmosphere out (cooling), Aout = -195 (100%); [-231 (100%)]
        Balance, Enet = 0
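The internal arithmetic of the tabulated budget can be checked in a few lines, treating the listed magnitudes as inputs. The bracketed column is the commenter’s own +3° estimate, not K&T’s.

```python
# Internal arithmetic check of the tabulated budget (magnitudes as listed).
def budget(Sa, Th, Ev, Ieg, Ige_mag):
    La = Ieg - Ige_mag        # Earth longwave absorbed: Ieg minus |Ige|
    Ain = Sa + Th + Ev + La   # atmosphere in (warming)
    return La, Ain

La0, Ain0 = budget(67, 24, 78, 350, 324)   # K&T (1997) baseline column
La1, Ain1 = budget(77, 29, 94, 406, 375)   # bracketed +3 degree estimates

Aout0 = 30 + 165   # cloud + gas/aerosol radiation to space, baseline
Aout1 = 36 + 195   # same, bracketed column
```

Both columns balance: warming input equals cooling output (195 and 231 W/m^2 respectively).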

      • Then you may like to take a shot at my new climate puzzle.


        The numbers listed in the puzzle are from the NASA cartoon, which I prefer. I will post my guesses tomorrow, based on less rigorous math of course.

      • Fred, I posted my answer. I believe it explains why you can fo’git’bout Stefan’s law for this puzzle. It may even give some clue why Arrhenius revised his estimate for a doubling downward. Unless the thermopause sink is a myth.

        Feel free to ream me a new one.

      • Dallas 4/9/11, 7:28 pm, week of 4/2/11,

        Your NASA cartoon energy budget is found in a NASA article, Investigating the Climate System Energy, A Balancing Act, an educational piece for grades 9-11. http://www.nasa.gov/pdf/62319main_ICS_Energy.pdf . It’s worth analyzing for the benefit of education.

        Your version of the cartoon changed the orange paths (designating absorption) to yellow (designating SW radiation). This wrongly turns the little 15% orange path with no arrowhead into SW instead of absorbed LW. The chart labeling is the same in both, but would be improved by adding “Radiated from Surface 21%”, per your table, to the bottom right red path.

        Some observations:

        1. The NASA cartoon is little more than a redrawing of the K&T energy budget, adjusted as discussed below. Like K&T’s budget, NASA’s conceals consideration of emissivity and the Stefan-Boltzmann Law. In addition, NASA omits the problematic back radiation, so as not to confuse the Little Children. AGW supporters even with doctorates manage to confuse one-way radiation with radiant heat. Both budgets balance the radiation at the surface, in the atmosphere, and at TOA, which one would want to retain for aesthetic if not scientific purposes. The popular balancing is a warm state of probability zero.

        2. The notion of a window is an approximation of regions of high transmissivity in the spectra. It is not a separate physical path. The latest from radiative transfer has no windows, just regions of extremely low absorption. The children should see all the LW radiant energy emerging from TOA combined into one bundle. That is a parameter estimated from measurements. The division between short wave and longwave is enough spectral analysis for grade 9-12 students following the environmentalism strand in place of science.

        3. NASA’s cloud albedo is quite high (89 vs. 77), while the surface is quite low (14 vs. 30). Combined though, NASA’s total albedo is slightly low (103 vs. 107). The total albedo would be better for the kids to get away from the subjective guess at the attribution between clouds and surface, and because Bond albedo is a parameter estimated from obvious measurements.

        4. NASA has about twice the LW absorbed in the atmosphere (51 vs. 26), indicating a much weaker greenhouse effect. The discrepancy arises principally from the window concept. NASA has trimmed energy from the window, assigned it to atmospheric warming, then bled the shifted energy off to space to keep everything in balance. In other words, NASA split the red arrow for the window, fed part of it into the atmosphere, and correspondingly widened the thick red arrow from the atmosphere to space. This weakens the greenhouse effect, suggesting the following scenario:

        The K&T diagram has too much greenhouse effect for 1750. It leaves too little room for scary AGW effects. So, let the K&T model represent modern day, and use NASA’s model for 1750. And to complete the story, throw in a degree or two warming since 1750.
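
        The albedo totals in point 3 above are easy to verify; a sketch in Python (the ~342 W/m^2 mean incoming solar used for the albedo fraction is an assumed conventional value, not one stated in the comment):

```python
# Reflected-energy attributions as quoted above (W/m^2): clouds vs. surface.
nasa = {"clouds": 89, "surface": 14}
kt   = {"clouds": 77, "surface": 30}   # Kiehl & Trenberth-style figures

INCOMING = 342.0  # assumed global-mean incoming solar, W/m^2

for name, budget in (("NASA", nasa), ("K&T", kt)):
    total = sum(budget.values())
    print(f"{name}: reflected {total} W/m^2, Bond albedo ~{total / INCOMING:.2f}")
```

        This reproduces the 103 vs. 107 comparison in the comment.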


        I share almost none of your blog’s enthusiasm for and dedication to hydrogen. It’s more like batteries than fossil fuels. When batteries and hydrogen can be mined, they will be like fossil fuels. They must be manufactured, and the process is thermodynamically inefficient, spending more in energy to make than can be delivered as fuel, and requiring exotic materials. As to portability, natural gas, unlike hydrogen, is readily available, efficient, and adaptable.

        Besides, natural gas emits a lot of CO2, a benign, beneficial greening agent. Hydrogen emits mostly water vapor, the dominant greenhouse gas. On the other hand, that same water vapor contributes to clouds and cloud albedo, the dominant feedback in climate, and the process that regulates Earth by keeping climate sensitivity low.

        Hydrogen is a tail pipe dream.

  53. Max,
    You’ve gone to great lengths to be fair, and that’s all well and good, but the very energy required to make your case I believe is telling. In the end I couldn’t disagree more. Dr. C. most certainly gave Baird what he was looking for in those last few minutes. I hardly see how her answer could have been more party line.

    “It’s the scientists that are the problem here, not the science itself” is the essence of what was being said. How many times have we heard that? You could see Baird fairly eating up Dr. Curry’s words.

    It was a terribly wasted moment in my view.

    • I suspect Judith answered the questions honestly. If you don’t like the answers she gave then convince her to your point of view. That way the next time she will give the answers that you would have given and still be answering the questions honestly.

    • pokerguy

      Before listening to the entire testimony of JC (plus the other expert witnesses) I would have agreed with you that she “gave Baird what he was looking for in those last few minutes”.

      But that was only a single sound bite (maybe an important one) and her overall testimony told a different story that was definitely not “party line”.

      – AGW is not an existential problem for the 21st century
      – other problems, such as overpopulation, land use changes, energy shortages and real pollution (soil, water, air) are more important for our future than AGW
      – it is better to formulate robust policy responses based on better knowledge than to act now with actions that may or may not bring any solutions and may cause more harm than benefit
      – we need more transparent and better historical and paleo-climate data
      – we must address the enormous uncertainties, especially with respect to natural climate forcing and feedbacks
      – don’t look for a “silver bullet” – there is none
      – there will be “winners and losers” from a warmer climate
      – IPCC reports include a heavy dose of judgment that is not backed by hard data
      – Climategate destroyed the public trust in climate science and scientists, even if it did not “obliterate” all of the legitimate data

      I am rationally skeptical of the “dangerous AGW” premise, primarily because it is not supported by empirical evidence based on physical observations or reproducible experimentation, but rather on model simulations based on theoretical deliberations and highly doubtful paleo-climate interpretations.

      A second reason is that, upon closer examination, I detect many exaggerations, fabrications, omissions and outright falsifications in the IPCC AR4 report, along with a lot of “expert judgment” rather than hard data.

      But despite this, I can agree with all of the statements made by JC.

      On Baird’s question of “obliteration of the legitimate data through Climategate” I, personally, would have answered that some of the data were not legitimate to start off with, some more recent physical observations showing low climate sensitivity were ignored and post-Climategate revelations have shown that IPCC had, indeed, reported bogus and exaggerated data in several instances, but that Climategate, itself, had not “obliterated” the legitimate data.

      Maybe we are parsing words here, pokerguy, and I know that JC may be a bit less skeptical than I am (or than you may be), but I do not think her testimony played into the hands of Rep. Baird (who definitely had a “now is the time for action” agenda).


  54. “I suspect Judith answered the questions honestly. If you don’t like the answers she gave then convince her to your point of view. That way the next time she will give the answers that you would have given and still be answering the questions honestly.”

    What you’re not understanding is that Dr. C. herself does not believe the substance of her answer re the emails. She’s said as much on this thread. The initial discussion here was about my memory of her testimony versus her contention that she would not have answered that way, given her current beliefs. She thought perhaps I was thinking of testimony prior to 2009.

    With another poster’s help, I was ultimately able to locate the video. Her answer was longer than I’d remembered, but otherwise consistent with what I’d thought.

    Look, we’re all human. Personally, I’d just as soon face a firing squad as testify in front of Congress. It can’t be easy for her.

    I’m only asking for her best, presumably the same thing she asks of herself. If Dr. C. believes, as she’s stated many times, that the emails do reflect poorly on the science, and not just the scientists, then it would be nice to hear her say so in such an important venue. There are now a bunch of congressmen running around with minds even more closed than they were before her testimony.

    • pokerguy

      Looks like our last two posts crossed.


    • What I get from the conversation is that you both agree you see a shade of gray. Judith described her shade of gray throughout her testimony and when asked if she saw the color black she said no. You say her saying no means she is saying it is white. She says no, I already described it as gray. I say if you wanted her to say yes then you need to convince her that she should be seeing a much darker shade of gray. If you only wanted a more precise explanation of her opinion she had already given that. If your complaint is people will hear only what they want to hear, they are going to do that anyway. If your complaint is a soundbite was created, I can only say sure, that’s why politicians ask things the way they do. Will Judith avoid a straight answer to a question in the future? Maybe, but she really shouldn’t have to. Can we parse words and second-guess answers indefinitely? Yes :)

  55. Dr. Curry,
    If you are looking for more topics to discuss on the blog, you might want to have denizens discuss this Youtube video presentation.

  56. Joe Lalonde


    Simple question NOT being asked:


    Every time I bring up surface salt changes, it falls on deaf ears.

  57. I am getting a lot out of this thread, but it’s a little hard to follow. I have a number of questions but am having difficulty organizing them in a reasonable fashion. I’m going to apologize for the messiness and try one-off shots.

    It has been stated that ocean warming is predominantly from incoming SW (and also refuted). Since the downwelling (back radiation or whatever) LWR significantly exceeds the incoming SW – and even ignoring the evaporative cooling – how can the SWR possibly be most responsible for warming the surface (ocean or terra)?

    I’m also skipping over some of the thread, so sorry again.

    • Yep, that is pretty much the debate. I look at the albedo of the ocean, the depth of SW penetration into water, the way ocean thermoclines just happen to match the different penetration depths, and, knowing that longwave doesn’t penetrate much at all, say it’s the sun that does the heating. Cooling is by longwave radiation mainly, so the atmosphere and clouds reduce the rate of cooling at the surface, but the stratification of temperatures in the ocean and the poor thermal conductivity of water are also limiting factors. I’m not much of a fan of downwelling or back radiation, though.

    • Rod B,
      All the listed components of the energy balance are essential. Some of them are larger than others, but even the smallest are very large compared to the net energy flows that cause changes in the temperature.

      Which is the most important is not a reasonable question, as they are also dependent on each other. LW IR directly influences the very thin skin layer, while SW radiation warms the layers a little deeper (the top one meter, the next 10 meters, and weakly even a bit further). Thus there are differences both in the size of the energy flow and in the target of this flow. That allows for many interpretations of the same physical phenomena.

    • Rod – Although there may be some disagreement or differing interpretations of how different energy sources are distributed, I believe any “debate” contesting the predominant role of back radiation as a contributor to ocean temperature and heat content exists primarily in the blogosphere. That back radiation contributes more than solar SW direct radiation is well accepted in the literature based on data such as the T-F-K energy budget data cited earlier. My own interpretation is that the LW component, although absorbed only at the surface, is rapidly mixed with absorbed SW energy – this is supported by the absence of a significantly warmer surface than water underneath noted in temperature profiles. Others emphasize the ability of absorbed LW to inhibit the dissipation into the atmosphere of energy from SW absorbed at lower depths. In either case, though, the back radiation contributes more than direct solar radiation to ocean heat and temperature, and the notion that it contributes little seems to be an Internet myth based on misconceptions of how energy fluxes are determined, and a confusion between directional fluxes and net flux.

    • All: other than a wee bit of heat energy coming from the bowels of the earth and maybe a small bit from the atmosphere exchange, incoming radiation is the sole heat energy source for the oceans. Of the 492 watts/m2 of radiation, 324 comes from LW IR, and all 492 watts go into raising the temperature, whether it stops off at the skin or progresses down to 10 meters or so – it doesn’t matter. If the claim is that the LW IR gets only into the molecular skin and is immediately emitted out, it doesn’t cut it: that’s exactly how reflection works, and LW IR is not reflected. Or one could quibble with the T-F-K numbers, but there’s a ton of quibbling required.

      I agree this is probably not the correct question and that it doesn’t cover the net flux and cooling effect of LW IR emissions, but I’m just probing the assertion of others. It seems most of you concur, so I don’t have to waste any more of your time with it. Thanks. It looks like a consensus! :-D
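
      Rod B’s figures can be restated in two lines; a sketch in Python (the 492 and 324 W/m^2 are the numbers quoted above from the T-F-K-style budget):

```python
# Surface energy input as quoted above: total absorbed radiation and the
# back-radiation (LW IR) portion, in W/m^2.
total_in = 492.0
lw_down = 324.0
sw_down = total_in - lw_down  # implied solar (SW) absorbed at the surface

print(sw_down)                       # 168.0
print(round(lw_down / total_in, 2))  # 0.66 – LW is about two-thirds of the input
```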

      • Rod B,
        LW IR is indeed not reflected. The ocean skin emits IR radiation with an intensity determined solely by the temperature of the skin. The ILR influences the intensity only through its effect on the skin temperature. This effect is very large and important as an imbalance of 324 W/m^2 would lead to a very fast cooling and soon also freezing of the surface even at the equator.

        On micro level the ILR has practically no influence on OLR, but through the temperature effect the connection is very strong. The same applies to the atmospheric side. OLR heats the atmosphere and that leads to strong radiation to all directions including down. Increasing the GHG concentration makes this mutually reinforcing coupling between the temperatures of the surface and the atmosphere even stronger. (By this I don’t say that other effects like evaporation would not be important for the energy transfer and for the relationships of the various temperatures, but the radiative coupling is one essential component.)
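
        The skin-emission point can be made concrete with the Stefan-Boltzmann law; a sketch (the 288 K skin temperature is an assumed typical global-mean value, not taken from the comment):

```python
# Gray-body emitted flux: intensity depends only on skin temperature
# (and emissivity), as stated above.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def emitted_flux(temp_k, emissivity=1.0):
    """Flux in W/m^2 from a surface at temp_k kelvin."""
    return emissivity * SIGMA * temp_k ** 4

print(round(emitted_flux(288.0)))  # ~390 W/m^2 for an assumed 288 K skin
```

        Losing the ~324 W/m^2 of incoming LW while still emitting ~390 W/m^2 would indeed be a large net deficit, which is the fast-cooling argument above.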

  58. cementafriend

    Fred Moolten seems not to have any engineering knowledge and clearly has never made any measurements in furnaces and heat exchangers.
    It is suggested he read the following, which is the basis of a presentation to KNMI (the official Government organisation for weather and climate in the Netherlands), which operates one of the three climate databases. http://climategate.nl/wp-content/uploads/2011/02/CO2_and_climate_v7.pdf . Dr Van Andel is a Chemical Engineer who understands heat and mass transfer (which is a core chemical engineering subject, as are thermodynamics and fluid dynamics).

    • That paper misses the most essential point. Nothing described in it would occur without the Greenhouse Effect. It starts by skipping over the steps of physical phenomena where the true effects arise, imagining that those are not important.

      The paper has no value.

      • Continuing from my previous comment.

        A very common misconception is that the adiabatic lapse rate is something that always forms in the atmosphere and cannot be avoided. It’s thought that this mechanism is an alternative to the Greenhouse Effect. That is all false. Without the Greenhouse Effect the atmosphere would be close to isothermal, with a very small lapse rate. The Greenhouse Effect is the driving force that creates a large lapse rate.

        Without the vertical convection it would create an even stronger lapse rate, and this is the point where the thermodynamics comes in. Thermodynamics tells us that the lapse rate cannot grow larger than the adiabatic limit. The attempt of the Greenhouse Effect to make a larger lapse rate leads to convection, which reduces the temperature gradient to the adiabatic lapse rate. (Lower values of the lapse rate are stable; values larger than the adiabatic lapse rate are not.)

        Increased Greenhouse Effect leads to more convection and also to more of the phenomena described in that paper. The temperature of the troposphere is higher and the tropopause is at a higher altitude. Thus the temperature at the new tropopause may be equal or even lower than before, but the temperature at any fixed altitude in troposphere is higher.

        Everything that I write above is qualitative. There may be exceptions at some locations, and there are feedbacks that influence the strength of the effects. Quantitative analysis requires essentially more than these qualitative arguments. The qualitative arguments are sufficient only for showing where the error of that paper, and of many other similar wrong arguments, lies.
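
        For the adiabatic limit invoked above, the dry rate follows from g/c_p alone; a back-of-envelope sketch with standard textbook values:

```python
# Dry adiabatic lapse rate: Gamma = g / c_p (standard values for dry air).
g = 9.81      # gravitational acceleration, m/s^2
c_p = 1004.0  # specific heat of dry air at constant pressure, J/(kg K)

gamma_dry = g / c_p * 1000.0  # K per km
print(round(gamma_dry, 1))    # 9.8 K/km

def convectively_unstable(env_lapse_k_per_km):
    """Lapse rates above the dry adiabatic limit are unstable; below it, stable."""
    return env_lapse_k_per_km > gamma_dry

print(convectively_unstable(6.5))  # False: a 6.5 K/km profile is stable
```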

      • al in kansas

        What Pekka Pirilä is saying here is what Miskolczi’s papers describe. Perhaps I am restating the obvious. To describe it another way, the earth already has runaway water vapor feedback. It has had it ever since there were oceans. In any positive feedback system, the feedback amplifies and continues until it triggers another, stronger process that overwhelms and halts it – a negative feedback. In the earth’s case, latent heat transfer (thunderstorms) and cloud formation. CO2 is so small in this that it is irrelevant.

      • I have never seen in any of Miskolczi’s papers anything that I could agree with, except for some totally uncontroversial trivialities.

      • al in kansas

        “Note On The Miskolczi Theory”, http://www.friendsofscience.org/assets/documents/Note_on_Miskolczi%27s_theory_25-05-2010.pdf describes things in a way that seems to be similar to the way you do. It has a more understandable explanation of what Miskolczi’s papers are saying and where he gets some numbers and relationships. Perhaps I am misunderstanding what you are saying, not having read through all these threads in detail. However, I would strongly suggest that anyone read the above note for a better understanding of Miskolczi, whether you agree or not. Obviously, at this point, I think he has nailed it.

      • I took a look at this; I got lost on the first page, where the apparently all-important Su is discussed without ever being defined. The other variables are also undefined, although there is a cryptic diagram with arrows that at least refers to these other variables (but not Su). With a glossary that actually defines what these variables are, this might be useful.

      • Al
        I have read some of Miskolczi’s texts. They are all based on wrong physics. As one example, some approximate rules of thumb are taken as exact and then used to derive new – and totally wrong – results. They are such nonsense that it’s not worthwhile to discuss them further. Valid critiques of these theories can easily be found on the web.

      • I have been withholding judgement on this because I find it to be incomprehensible. I am waiting for M and colleagues to translate their ideas into something that is coherently understandable. At this point, I can’t rule out that there may be a nugget or two of interest in their work.

      • al in Kansas, 4/8/11, 9:12 am, week of 4/2/11 …

        You wrote, What Pekka Pirilä is saying here is what Miskolczi’s papers describe. … To describe in another way, the earth already has runaway water vapor feedback. It has had ever since there were oceans. In any positive feedback system, the feedback amplifies and continues until it triggers another stronger process that overwhelms and halts it, a negative feedback.

        If the magnitude of the feedback is between zero and one, the affected parameter will be positive and runaway will not occur. If it is greater than one, the parameter might runaway or it might oscillate, depending on phasing. For climate, a sufficient criterion for discussion purposes is not whether the feedback is positive, but whether it is greater than one.
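
        That criterion is the standard linear-feedback series: with open-loop response dT0 and feedback fraction f, the closed-loop response is dT0/(1 − f), finite only when |f| < 1. A minimal sketch with illustrative numbers (not the comment author’s model):

```python
# Closed-loop response of a simple linear feedback: the geometric series
# dT0 * (1 + f + f^2 + ...) converges to dT0 / (1 - f) only when |f| < 1.
def closed_loop_response(dT0, f):
    if abs(f) >= 1.0:
        raise ValueError("|f| >= 1: series diverges (runaway or oscillation)")
    return dT0 / (1.0 - f)

print(closed_loop_response(1.0, 0.0))  # 1.0 – no feedback
print(closed_loop_response(1.0, 0.5))  # 2.0 – positive feedback, still bounded
```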

        As a philosophical matter, climate has no runaway condition, at least as long as it has surface water. Except during extreme events like eruptions, earthquakes, and novas, collectively explosions, natural systems exist in conditionally stable states. Round boulders are not found perched on the sides of hills, nor cones standing on their apexes. The imperative for modeling is to account for the feedback processes that create conditional stability, and then to predict the dynamic range of those control processes against possible disruptions.

        Exactly which of Miskolczi’s papers is under discussion is not always clear on this site. Three prime candidates are The Stable Stationary Value of the Earth’s Global Average Atmospheric Planck-Weighted Greenhouse-Gas Optical Thickness, E&E, August, 2010; Greenhouse effect in semi-transparent planetary atmospheres, Quart. J. Hungarian MetServ, March, 2007; and with M. G. Mlynczak, The greenhouse effect and the spectral decomposition of the clear-sky terrestrial radiation, Quart. J. Hungarian MetServ, December, 2004. Judith Curry’s 8:34 pm frustration with a definition of Su had to have come from the 2007 paper. The 2007 paper is incomplete without the 2004 paper.

        My detailed, math-free (getting into the math was not required) critique of Miskolczi’s 2007 and 2004 papers is a response to a commenter on 1/14/10. RSJ, http://www.rocketscientistsjournal.com/2009/03/_internal_modeling_mistakes_by.html . Miskolczi makes inflated, unscientific claims, but nevertheless concludes correctly that a control system must be in operation in the climate. However, he dismisses that the controlling feedback mechanism has anything to do with water vapor feedback because that “is difficult to imagine”. Of course, it exists, and it is due exactly to water vapor feedback through cloud albedo. All the requisite processes exist and are confirmed in IPCC Reports. Because Miskolczi, like IPCC, failed to discover and model this feedback, their modeling is open loop with respect to the dominant feedback in all of climate. Consequently, Miskolczi and IPCC share a fatal error: fitting an open loop model to closed loop data doesn’t work.

      • I have looked at several of his papers and found very severe errors in every single one – severe enough to make their results totally worthless. There have been errors in handling absorption in radiation, although this error has been corrected in later papers. There have been erroneous formulations of Kirchhoff’s law, and there have been balance equations that have appeared from nowhere and are erroneous. This is one error of the work linked by Al.

        And this is only a partial list of errors.

      • cementafriend

        Pekka, maybe your knowledge about thermodynamics and heat transfer is not complete. Dr (Ir) Noor Van Andel is highly respected in the Netherlands, where he was research director of Akzo Nobel and is the inventor of improved heat exchangers for domestic and commercial use. He gave a presentation to KNMI in Sept 2010, see here http://climategate.nl/wp-content/uploads/2010/09/KNMI_voordracht_VanAndel.pdf , and the paper I linked was for a presentation in Feb 2011.
        Do you think you know more than Van Andel about the chemical engineering subject of heat & mass transfer, and can you dispute actual measurements recorded in the KNMI Climate Explorer database?
        Do you think that the meteorologists and other researchers at KNMI who welcomed Van Andel’s presentations know nothing?

      • My understanding of thermodynamics is certainly sufficient to know that what I wrote is correct.

      • I could add that for meteorologists the cause of the warming of the sea and the origin of the driving force is not important. They look at the local consequences and that is a totally different matter. For that it’s perfectly legitimate to skip the earlier steps, but for understanding changing climate it’s not.

      • ‘There has been erroneous formulations of Kirchhoff’s law, there have been balance equations that have appeared from nowhere and are erroneous.’

        As an engineer I would say that this is most certainly so. When pressed I was told that these relationships, said to be based on fundamental physical principle, were in fact based on clear sky radiosonde data.

        There are 2 points.

        The radiosonde data is not nearly accurate enough to show what is purported.

        Your other link makes a great deal of cloud changes – which I accept without much quibble. But you can’t say that the data from clear skies shows this effect in operation and offer as proof of concept – cloudy skies.

        I would offer cloudy skies as proof that that theory is based on only half the facts.

  59. This is completely off topic, as far as climate, so feel free to skip reading further. But this article includes an excellent explication of the importance of listening to those with whom we most disagree, which seems to be one of the main themes of this blog. So it’s kinda on topic in a general sense. (And it was written by a progressive, on a progressive site, so maybe I can make a plea for amnesty.)


    These excerpts give a flavor of the article:

    “And yet, I think having a Republican friend is making me a better liberal. We need friends who differ from us. It’s easy to watch Republican extremism and think, ‘Wow, they’re crazy.’ But when someone is sitting face to face with us, when someone we admire and respect is telling us they believe differently, it is at this fine point that we find nuance, and we begin to understand exactly how we got to this point in history. We lose something critical when we surround ourselves with people who agree with us all the time. We lose out on the wisdom of seeing the other side.”


    “…Janet’s willingness to associate with so many liberal friends — though I know she seeks refuge in chat rooms and magazines that share her beliefs — makes her a better and more interesting person. She has her beliefs challenged constantly. She is more well-read and educated in her politics than most of the liberals I know. Too many liberals I know are lazy, they have a belief system that consists of making fun of Glenn Beck and watching ‘The Daily Show.’ Shouldn’t their beliefs be challenged, too?

    This is a democracy, after all. Isn’t it worth understanding a bit more about why approximately half the country votes differently than we do? Isn’t it important that we understand why people — good and legitimate Americans, whose votes count as much as ours — like Sarah Palin? Isn’t it crucial we figure out why any woman would want to defund Planned Parenthood, if only so we could then address the argument? Nobody benefits from sitting in a room, agreeing with everyone else.”

    I can read comments I agree with on some blogs. I can read comments that will make my head explode on others. But I genuinely enjoy the range and tone of the debate on this site. Some ridicule the tolerance for dissenting views here (in both directions), but to my mind that is this site’s greatest strength. My compliments to the chef.

  60. I just spotted this: Richard Alley PBS special on climate change and sustainable energy debuts April 10


    • cementafriend

      Dr C, you say you cannot understand Miskolczi. I hope you look at Dr Van Andel’s presentation http://climategate.nl/wp-content/uploads/2010/09/KNMI_voordracht_VanAndel.pdf which should clarify a few points.
      Pekka above is certainly not an engineer and one has to question his understanding of thermodynamics and heat& mass transfer.

      Re sustainable energy: engineers have to take account of economics and costs, as well as technology, performance, safety, public good etc. in making decisions on alternatives and design. If a process or project costs five or ten times as much as an alternative, and it is likely that a replacement process or project will become viable in a few years, that particular process/project is not sustainable. That is the case for energy production from wind, solar and crops at present. The only long-term sustainable energy source is nuclear, with Thorium leading the field, followed sometime in the future by fusion. The supply of Thorium will last many thousands of years, and the supply of deuterium in the ocean is endless.

      • I certainly have an engineering education with a great deal of physics included, and a lot of engineering-related experience. For further details check what I have listed in the “Denizens” thread or my very brief CV on my web site.

        I’m far from the only physicist or engineer to conclude that Miskolczi’s work is all in error.

      • cementafriend

        Out of courtesy I looked up your web site and CV. Not much evidence of engineering.
        However, credit where it is due. You mention that thermodynamics and fluid mechanics are of prime importance when assessing climate. These are core chemical engineering subjects, as are the equally important subjects of heat & mass transfer and reaction kinetics (ozone, CO2 dissolution in oceans etc.). You did mention the importance of thermal radiation, but that is only one of four aspects of heat transfer.
        As Van Andel has put it in his presentation, radiation to space occurs at the TOA, but in the lower atmosphere convection and phase change (evaporation & condensation) dominate. Condensation is the reason for the reduction in the lapse rate from 9.8 K/1000m (dry adiabatic) to the average environmental rate of 6.5 K/1000m. This gives a measurable result for evaporation. Are you aware of the Nusselt number, which is used in the determination of convection and evaporation? I have never seen it mentioned in any paper written by so-called climate scientists.
        Then you mentioned photons, which do not exist – see this http://www-3.unipv.it/fis/tamq/Anti-photon.pdf by Nobel prize physicist W. E. Lamb, this http://arxiv.org/abs/1009.5119 by another physicist, and this http://www.worldsci.org/pdf/abstracts/abstracts_5711.pdf by an experienced engineer.

        Finally, your backing of the discredited Stern report is laughable. No competent company director would allow a discount rate of 1% to be used in justifying a project, and if he then hid his values, as Stern tries to do, he could be jailed. There can be no case for wasting money on something that is not proven, and no case for ignoring the technical advances engineers can make to adapt and rectify problems when the need arises.

      • I agree with Pekka that Miskolczi’s theory is now almost universally regarded within the climate science field as invalid, not only because of theoretical errors, but also because it is falsified by observational data. Miskolczi requires that CO2-mediated warming be offset by reductions in upper tropospheric (UT) humidity, but multiple sets of observational data now show that UT humidity increases, including all the recent satellite data that have superseded the older radiosonde methodologies. There is some disagreement as to whether relative humidity (RH) keeps pace, with Minschwaner and Dessler (2004) concluding that it does not, while Soden et al (2005) and Gettelman et al (2007) conclude that it does, but there is no disagreement about the fact that humidity increases. To support his claims, Miskolczi has had to rely on the one old radiosonde dataset known to report spurious reductions in UT humidity – the NCEP/NCAR reanalysis, which was found to have been invalidated by errors due to instrumentation changes causing artifactual downward jumps in humidity recordings. Van Andel has cited the same data, and perhaps is unfamiliar with the literature on this subject.

    • al in kansas

      1. I agree, Miskolczi does not “show his work” adequately. He uses Su and Sg interchangeably. His 2010 paper is mostly, in my mind, actual observations to back up the theory. And yes, error bars and some discussion of accuracy and precision of measurements (radiosonde data) would be in order. The “Note” linked previously is apparently the middle of a discussion, so it is a bit disjointed, but it did start to fill in a few blanks for me. The IR optical depth was new to me but makes a certain amount of sense.
      2. Water vapor feedback is ill defined in some ways. Is it a damped amplifier (<1) or does it have gain (>1)? The cloud albedo is only part of the feedback, the other part being the latent heat transfer. It would seem to me that it would not matter what the source of heating is – solar increase, surface albedo decrease, GHG… This seems to generally be ignored. I’m of the >1 opinion: cloud albedo cutting the energy input and latent heat transfer removing the heat from the surface.
      General comments:
      Thanks to Dr.C for working on separating the politics from the science, and making the science more rigorous and transparent through this blog. It is obviously generating a great deal of spirited discussion.
      Personal note:
      I have only a few hours of dial-up internet access available on the weekends, so if my replies are slow in coming, disjointed and intermittent, don’t assume your intellectually and academically superior replies have silenced my ill-informed, wrong-headed opinions. ; ) ha ha