Uncertainty in Arctic temperatures

by Judith Curry

Motivated by the paper by Cowtan and Way, this post examines uncertainties in the recent variability of Arctic temperatures.

This post considers two papers that compare reanalyses with surface-based observations in the Arctic.  For background on reanalyses, see this previous post on reanalyses.org.

On the possibilities to use atmospheric reanalyses to evaluate the warming structure in the Arctic

C. E. Chung, H. Cha, T. Vihma, P. Räisänen, and D. Decremer

Abstract.  There has been growing interest in the vertical structure of the recent Arctic warming. We investigated temperatures at the surface, 925, 700, 500 and 300 hPa levels in the Arctic (north of 70 N) using observations and four reanalyses: ERA-Interim, CFSR, MERRA and NCEP II. For the period 1979–2011, the layers at 500 hPa and below show a warming trend in all seasons in all the chosen reanalyses and observations. Restricting the analysis to the 1998–2011 period, however, all the reanalyses show a cooling trend in the Arctic-mean 500 hPa temperature in autumn, and this also applies to both observations and the reanalyses when restricting the analysis to the locations with available IGRA radiosoundings. During this period, the surface observations mainly representing land areas surrounding the Arctic Ocean reveal no summertime trend, in contrast with the reanalyses whether restricted to the locations of the available surface observations or not.

In evaluating the reanalyses with observations, we find that the reanalyses agree better with each other at the available IGRA sounding locations than for the Arctic average, perhaps because the sounding observations were assimilated into reanalyses. Conversely, using the reanalysis data only from locations matching available surface (air) temperature observations does not improve the agreement between the reanalyses. At 925 hPa, CFSR deviates from the other three reanalyses, especially in summer after 2000, and it also deviates more from the IGRA radiosoundings than the other reanalyses do. The CFSR error in summer T925 is due mainly to underestimations in the Canadian-Atlantic sector between 120W and 0. The other reanalyses also have negative biases in this longitude band.

Published in Atmos. Chem. Phys. (2013), [link] to full manuscript.

The paper considers the three most recent reanalysis products since 1979:  ERA Interim (ECMWF), CFSR (NOAA), and MERRA (NASA).  Surface temperatures from the reanalyses are compared with GISTEMP, as described below:

To evaluate the surface air temperature in the reanalyses, we use the Goddard Institute for Space Studies (GISS) analysis of global surface temperature change, referred to here as “GISTEMP”. GISTEMP integrates in situ surface air temperature measurements over land, and ship-based and satellite-derived SST measurements. The SST measurements are, however, only used over year-round ice-free areas. The GISTEMP version we use here is a gridded monthly mean data set with a 250 km smoothing. The use of a smoothing distance of 250 km instead of 1200 km (the default of GISTEMP) avoids the uncertainty related to the extrapolation of temperature measurements made at Arctic observation sites to large distances over the open ocean or sea ice, thereby providing a more robust point of comparison for the reanalyses. On the other hand, a consequence of the 250 km smoothing distance is that it leaves a large number of data gaps. The GISTEMP data we use are defined only for a fraction of the Arctic area (on average, 27% from 1979 to 2011), mainly limited to the vicinity of the observation sites on the land and permanently ice-free parts of the Barents Sea and Greenland Sea (see Fig. 11). 

JC comment:  The bolded text (on the use of a 250 km rather than 1200 km smoothing distance) directly addresses concerns that I had about the Cowtan and Way analyses of Central Arctic surface temperatures.
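
As a rough illustration of what a smoothing (influence) radius does, the sketch below grids synthetic station anomalies with a simple distance-weighted scheme and compares coverage for 250 km and 1200 km radii. This is only a toy illustration of the general idea; it is not GISTEMP's actual gridding algorithm, and the station locations and anomalies are invented.

```python
import numpy as np

def gridded_anomaly(grid_xy, station_xy, station_anom, radius_km):
    """Distance-weighted average of station anomalies within radius_km of each grid point."""
    out = np.full(len(grid_xy), np.nan)
    for i, g in enumerate(grid_xy):
        d = np.linalg.norm(station_xy - g, axis=1)
        near = d < radius_km
        if near.any():
            w = 1.0 - d[near] / radius_km          # weight tapers linearly to zero at the radius
            out[i] = np.average(station_anom[near], weights=w)
    return out  # grid cells with no station within the radius stay NaN (the "data gaps")

# Toy example: 15 sparse "stations" on a 2000 km x 2000 km domain.
rng = np.random.default_rng(0)
stations = rng.uniform(0, 2000, size=(15, 2))
anoms = rng.normal(0.5, 1.0, size=15)
xs = np.arange(0, 2000, 100.0)
grid = np.stack(np.meshgrid(xs, xs), axis=-1).reshape(-1, 2)

for radius in (250, 1200):
    filled = gridded_anomaly(grid, stations, anoms, radius)
    print(f"radius {radius:4d} km: {np.isfinite(filled).mean():.0%} of grid cells covered")
```

The trade-off discussed above falls out directly: a small radius leaves many cells empty but never carries a station's value far from where it was measured, while a large radius fills the map at the cost of long-range extrapolation.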

The main figure of interest to me is Figure 2 (double click to enlarge):


Fig. 2. 70–90 N average Ts anomalies relative to 1979–2009 climatology from four reanalyses and GISTEMP. On the left, the true 70–90 N average is shown. On the right, the average is calculated using only those reanalysis data that correspond to the available GISTEMP data in location and time. Note that Ts refers to surface air temperature (SAT) for the reanalyses, and to a combination of SAT over land and SST over ocean for GISTEMP.

The things that strike me about Figure 2 are:

  • The left column and the right column show the same general interannual variability and trend, indicating that using only the locations where there are observations does a fairly good job of representing the entire region from 70-90N.
  • Comparing Figure 2a with Figure 4b from Cowtan and Way shows large differences (although Fig 4b is only for a small patch in the Central Arctic Ocean). A comparison of CW for 70-90N with Fig 2a from Chung et al. is needed.
  • The 2005 warming emphasized by Cowtan and Way (and earlier by Hansen) appears to be associated with a large spike in winter (DJF) of 2005, and does not appear in the other seasons.  From the Cryosphere Today plot, 2005 appears as a year with an unusually low annual cycle of sea ice extent.  I don't recall reading anything about this; it is worth looking into.
  • Most of the overall trend is associated with the jump in 2005; since 2005 (the period of the main sea ice loss), there was a decrease in SAT until about 2008, and then an increase; the last few years show substantial disagreement between MERRA and the other two reanalyses.

Vertical structure of recent Arctic warming from observed data and reanalysis products

Vladimir Alexeev, Igor Esau, Igor Polyakov, Sarah Byam, Svetlana Sorokina

Abstract.   Spatiotemporal patterns of recent (1979–2008) air temperature trends are evaluated using three reanalysis datasets and radiosonde data. Our analysis demonstrates large discrepancies between the reanalysis datasets, possibly due to differences in the data assimilation procedures as well as sparseness and inhomogeneity of high-latitude observations. We test the robustness of arctic tropospheric warming based on the ERA-40 dataset. ERA-40 Arctic atmosphere temperatures tend to be closer to the observed ones in terms of root mean square error compared to other reanalysis products used in the article. However, changes in the ERA-40 data assimilation procedure produce unphysical jumps in atmospheric temperatures, which may be the likely reason for the elevated tropospheric warming trend in 1979–2002. NCEP/NCAR Reanalysis data show that the near-surface upward temperature trend over the same period is greater than the tropospheric trend, which is consistent with direct radiosonde observations and inconsistent with ERA-40 results. A change of sign in the winter temperature trend from negative to positive in the late 1980s is documented in the upper troposphere/lower stratosphere with a maximum over the Canadian Arctic, based on radiosonde data. This change from cooling to warming tendency is associated with weakening of the stratospheric polar vortex and shift of its center toward the Siberian coast and possibly can be explained by the changes in the dynamics of the Arctic Oscillation. This temporal pattern is consistent with multi-decadal variations of key arctic climate parameters like, for example, surface air temperature and oceanic freshwater content. Elucidating the mechanisms behind these changes will be critical to understanding the complex nature of high-latitude variability and its impact on global climate change.

Published in Climatic Change, 2011, [link] to full manuscript

Section 2 on data provides a very useful explanation of available surface observations in the Arctic, reanalyses, and their limitations.

IMO the most interesting section is Section 5, Lower Stratospheric Temperature Trends.

See Figure 10 for maps of seasonal surface air temperature trends from reanalyses and from the IABP/POLES surface observations.

From the Conclusions, I find this paragraph to be particularly insightful:

This pattern of temporal changes may be associated with multi-decadal fluctuations on time scales of 50–80 years, which are known to be exceptionally strong in the Arctic and North Atlantic. Polyakov et al. (2008) demonstrated a strikingly coherent pattern of long-term variations of the key arctic climate parameters and strong coupling of long-term changes in the Arctic climate system with those at lower latitudes. Remarkably coherent low-frequency variations are expressed by the arctic SAT, Arctic Ocean fresh water content and intermediate Atlantic Water core temperature, fast-ice thickness, and North Atlantic sea surface temperature. For example, associated with this variability, the arctic SAT record shows two warmer periods in the 1930–40s and in recent decades, and two colder periods early in the 20th century and in the 1960–70s. The observed stratospheric air temperature variations are consistent with this pattern. The long-term changes in the upper troposphere/lower stratosphere seem to occur together with changes at the surface, including the extent of Eurasian snow cover and sea ice. Elucidating the mechanisms behind these relationships will be critical to our understanding of the complex nature of low-frequency variability found in the Arctic and at lower latitudes, and its impact on climate change.

JC comments:   These two papers provide additional background on the available data sets for Arctic Ocean surface temperatures.  In addition to the surface buoys and the reanalysis data sets, there are also field observations from a number of experiments over the past several decades.  So there is some information; the challenge is to usefully evaluate the information that is available, and assess the uncertainties.

Both papers raise interesting issues regarding the Arctic atmosphere, identifying a shift circa 1998.  Piecing together the coupled variability of the Arctic atmosphere, sea ice and ocean is a substantial challenge, particularly in view of Alexeev et al.’s concluding statement:

Elucidating the mechanisms behind these relationships will be critical to our understanding of the complex nature of low-frequency variability found in the Arctic and at lower latitudes, and its impact on climate change.

130 responses to “Uncertainty in Arctic temperatures”

  1. Apparently 1k km smoothing of temps can be deceptive and error-prone. Who knew?*

    *Not Goddard, evidently.

  2. Since you are reviewing Arctic temperature papers it would be useful to discuss these papers as well as some similar ones by the same authors more recently:

    Screen, J.A. and Simmonds, I. (2011). Erroneous Arctic Temperature Trends in the ERA-40 Reanalysis: A Closer Look. Journal of Climate, 24(10): 2620-2627.

    Screen, J.A., Deser, C. and Simmonds, I. (2012). Local and remote controls on observed Arctic warming. Geophysical Research Letters, 39(10).

    We have updated our FAQ to address several of the comments from here, including the incorrect assertion that we included reanalysis data in our reconstruction.

    http://www-users.york.ac.uk/~kdc3/papers/coverage2013/faq.html

    At the FAQ we have also provided some discussion of the synoptics explaining the larger differences for 1998 and 2010 respectively (relative to HadCrut4).

    Finally if you will be discussing our paper I was wondering if you could give your opinion on our cross-validation measures, our comparison with the IABP data, the tests with interpolating from SSTs versus land and the errors associated with leaving regions null (e.g. setting trends to global average).

    I look forward to this continued discussion but I feel that these issues were not addressed last time so it would be worth discussing further.

    • I was wondering if you could give your opinion on our cross-validation measures,…

      Since Judith was so dismissive of your paper, she must have rather extensive reasons for rejecting your cross-validation measures.

      It will be interesting to see her elaborate.

    • What are you going to do, Robert, if minimum summer ice extent in the Arctic starts to rise?
      =============

    • Dr Way,
      Would you care to comment on this posted on Dr Spencer’s site:
      http://wattsupwiththat.files.wordpress.com/2013/11/cowtan-wray_before-after.jpg

      • That figure was from the SkS site
        http://www.skepticalscience.com/global-warming-since-1997-more-than-twice-as-fast.html
        posted by the authors themselves.

        Do you have a problem with it?

      • Web,

        No, I don't have a problem with it, nor do I have a problem with Robert Way's work. I am not even close to your level and am not able to properly evaluate any of it; I am just trying to learn as much as I can to try and understand it all. When I looked at that, the implication was that the temperatures were out of kilter warm in comparison with the rest of the globe. Now, with both 2007 and 2012 showing significant Arctic ice loss, and the general decline along with the sea ice anomaly decline, perhaps it was warmer there and then; however, could it really be that significant, and why is it showing up to that degree since 1979? I am most certainly receptive to Robert Way's explanation or yours, and my interpretation may be way off base, but I would like to come to an understanding of it.

      • That's funny; I had previously read that article too and forgot that it was there.

      • ” Would you care to comment on this posted on Dr Spencer’s site:”

        With especial reference to the exaggeration of the significance of the polar areas due to the use of Mercator and similar projections, please!

      • They discussed that issue. Converting to an orange peel or Goode homolosine projection makes it more difficult to intuit the coverage of the warming regions.

        The numbers are more important in any case.

      • According to Mosher, Way does good work. I have no reason not to accept that. I’m sure his numbers are good. Did they use a homolosine?


      • ordvic | November 19, 2013 at 12:56 pm |

        According to Mosher, Way does good work. I have no reason not to accept that. I’m sure his numbers are good. Did they use a homolosine?

        That is only useful for visualizing. The math they obviously do correctly, by applying trigonometry on a spherical mapping.

      • Thanks, I didn't think it would play a part in the numbers, but since the homolosine is trig as well it gave me pause.

      • Well, just in case Robert Way ever reads any of this embarrassing babble on my part, I just want to add that the heated Arctic is predicted by the stadium wave:

        http://judithcurry.com/2013/11/19/the-2-8-effect/#comment-415636

        The peak of wave IV was close to 2005

    • Matthew R Marler

      Thank you for your visit, and for the link to the faq page. Before I go on a chase, are the two Screen papers available online?

      • I applaud Robert Way. In one memorable phrase he has made clear to hoi polloi what climate science is all about: ‘There is lots of statistical power to manipulate and make the data say what it needs to say’.

        It is said that people do their best work when they are young and foolish.
        I dunno, Stephen Schneider was a bit older when he bloomed with his insight into honesty and advocacy.
        ============

    • Thank you for visiting, Dr. Way. In considering your contributions, I must admit a great deal of curiosity regarding your participation in the “conspiracy to save Humanity”, that is the “secret” forum at Skeptical Science that evidently turned out not-so-secret.

      According to the recent post at Climate Audit regarding your paper:

      Co-author Way was an active participant at the secret SKS forum, where he actively fomented conspiracy theory allegations. Uniquely among participants in the secret SKS forum, he conceded that Climate Audit was frequently correct in its observations (“The fact of the matter is that a lot of the points he [McIntyre] brings up are valid”) and urged care in contradicting Climate Audit (“I wouldn’t want to go up against that group, between them there is a lot of statistical power to manipulate and make the data say what it needs to say.”)

      Pending your comments on the subject, this would appear to me to suggest (but not prove) that the purpose of your paper is to use “a lot of statistical power to manipulate and make the data say what it needs to say.” This, if true, wouldn’t invalidate your conclusions, but it does suggest that they should be judged in terms of a distinct bias.

      Any comments?

      JC comment: Any conspiracy theory issues should not be discussed on this thread, you can take this issue to the open thread. Further playing ‘gotcha’ with what a commenter says on another blog that is not directly related to the topic at hand is technically against blog rules. While it is a rule I rarely enforce, I will enforce it when an author of a paper under discussion graciously participates in the comment thread here.

      • In some of these situations the detective HackIntyre would be better suited to auditing whether his suspects have a clean pair of underwear on.

        OTOH, he certainly does a good job of giving away the store:
        http://contextearth.com/2013/11/19/csalt-ju-jutsu/

        thanks for all your efforts, much appreciated, saves me some time.

      • Web,

        Apparently Way is on the record in his private correspondence with his climate buddies saying that McIntyre is usually correct on these issues and that it would be wise not to mess with him. Interesting.

      • Bill –

        “Usually?”

      • ‘The fact of the matter is that a lot of the points he(McIntyre) brings up are valid’.

        Whatsa matta Joshua, you are not ‘usually’ this dull? That’s just upthread.
        =========

      • Oh yeah, McIntyre is “usually” right about this stuff.
        Like I said in my post, his findings “almost always” strengthen the AGW argument.

        He just doesn’t realize it. Same goes for Tisdale .

    • Perhaps cross-checking with the rate of decline in ice area would be useful too.
      http://climategrog.wordpress.com/?attachment_id=496

  3. R. Gates aka Skeptical Warmist

    There may be uncertainty in Arctic temperatures, but new and innovative ways are being found to determine how the sea ice has actually behaved for many centuries:

    http://phys.org/news/2013-11-underwater-tree-calcite-crusts-arctic.html

    The results corroborate other proxy studies and indicate that the current low extent and downward trend very likely represent a multi-century low point.

  4. Robert Way,

    I am curious. I wonder if you could let me know why you think anybody cares about whether Arctic temperatures have gone up, down, or sideways?

    Does it matter?

    Is there any point in endlessly going over dodgy and incomplete past temperature records?

    May I respectfully suggest the past is the past. Why not let it lie? Surely the present is paramount, and regardless of what assumptions you care to make about the future, you are no more likely to be right than I.

    If you can provide any information that has utility to anyone, please do so. I am surprised that there are so few research opportunities that might have the effect of providing eventual benefit to mankind. Obviously, you are unable to find anything more useful with which to occupy your no doubt considerable talents.

    I suppose you have to follow whatever compulsion you have. You have my sympathy. It just seems like a waste.

    Live well and prosper,

    Mike Flynn.

    • Mike Flynn, you get on a website about climate and seem genuinely surprised that people talk about climate here. What were you expecting? Perhaps you got here by mistake.

      • Jim D,

        I’m not surprised to find climatology discussion on a climate blog. I would expect to find astrological discussions on an astrology blog.

        The difference might be that the average person probably has more faith in astrological predictions. There presently seems to be precious little to support any faith in climatological predictions.

        Given that climate is the “average of weather”, it is impossible to predict climate, as the events of which it is the average, have not yet occurred. This is impossible to avoid, and no amount of irrelevant or silly analogies can change it.

        As I have stated before, the study of an average is fine. It is not a science by any stretch of the imagination. Research into past “averages” seems like a waste of money. What does it achieve?

        Parts of the Earth cool. The permafrost regions contain large amounts of previously living plants. The Antarctic continent once had plant and animal life on its surface rather than ice. Similarly, some desert regions were once fertile and green. Parts of the Earth become warmer.

        One might as well support a movement to “Save the Dinosaurs” as to try to prevent the climate in any region from changing. My view is simply that the billions wasted based on a physically impossible belief that CO2 can warm the Earth could have been applied in a much more beneficial way to the Earth's population.

        You obviously disagree.

        Live well and prosper,

        Mike Flynn.

      • The difference might be that the average person probably has more faith in astrological predictions

        I love it when “skeptics” formulate conclusions: (1) that are highly improbable and (2) for which they have absolutely no evidence.

      • Mike –

        My view is simply that the billions wasted based on a physically impossible belief that CO2 can warm the Earth could have been applied in a much more beneficial way to the Earth's population.

        Leaving aside your conclusions that CO2 cannot possibly warm the Earth, why would you be so concerned about those “wasted billions” when you could be focusing on the trillions that have been spent to support the flow of fossil fuels, and to cover the negative externalities that result from fossil fuel use? Given the orders of magnitude, it would seem curious that you would be focusing on exponentially less “waste.”

        Almost enough to make someone think that your concern is “motivated” (in the sense of motivated reasoning) by something more than you say.

      • My view is simply that the billions wasted based on a physically impossible belief that CO2 can warm the Earth

        OK, I’ll bite. Why is it physically impossible?

      • Joshua,

        Unlike a Warmist, I say what I mean, and I mean what I say.

        Many things are found to be true, even though they seem at first to be “highly improbable”. I am sure you don't need me to spell them out.

        I said “might be” for good reason. You are correct, inasmuch as I have no definitive proof. Unlike a Warmist, I used language that depended on the usual usage of English, as I understand it. Please feel free to correct me if I have erred.

        However, a quick perusal of periodical publications aimed at the general population indicates a large number of continuing advertisements offering astrological predictions of one sort or another, for a fee.

        On the other hand, I see precisely no (nil, nada, zero, zilch) evidence for a single member of the general public paying a climatologist directly for a prediction, projection, or scenario of future climate.

        I love it when unbelievers get it right, although I would rather be happy than right.

        Live well and prosper,

        Mike Flynn.


      • Joshua,

        Thank you for your suggestion that I stop pointing out that the study of weather averages (climate) has produced precisely nothing of consequence, in spite of the billions of dollars expended.

        If you wish to complain about aspects of fossil fuel use, I suggest you do so, as vigorously as you can. I have no interest in supporting you, however. You say you find my actions “curious”. Good for you – is there a particular reason that you think I might care?

        Live well and prosper,

        Mike Flynn.

      • Andrew Adams,

        Thank you for your query.

        Maximum transmission of radiant energy occurs in a vacuum. The concept that a luminiferous ether is required to conduct the energy has been shown to be unnecessary.

        Interposing any matter between the source of energy (the Sun), and the target (the Earth), reduces the amount of energy available to the target.

        If you examine the Warmist explanations (back radiation etc.), you will find that they eventually equate a reduction in the rate of cooling with “warming”. This is not so.

        The matter is fairly easily demonstrated by experiment. You will note that every experiment intended to show that you can raise the temperature of an object heated by a non-contiguous radiant heat source, by surrounding it with CO2, in fact shows the opposite. Warmists, certainly, are unable to demonstrate that which they claim to be true.

        I suggest, if you have not done so already, reading some of Feynman’s lectures relating to the interaction of energy and matter. He is quite easy to understand.

        Live well and prosper,

        Mike Flynn.

      • Joshua,

        Excellent. Thank you for your clarification that you have not “suggested” that I stop pointing out anything.

        I am most grateful that you support my right to free speech.

        I did try reading again. Unfortunately, I drew the same conclusion as the first time. I will let other denizens decide what you were suggesting.

        You have me at a disadvantage, of course. I am not fully fluent in the use of the Warmist language, where, for example, “cooling” means “warming”. Obviously, I misread your Warmist comment to mean something other than what you now claim it meant.

        Do you derive your main source of income from supposed “climate science” or any ancillary activity?

        As A Fan of More Discourse would say – the World wonders, eh?

        Yes, that was an attempt at sarcasm. I know I’m not very adept. I will try harder next time.

        Live well and prosper,

        Mike Flynn,

      • Unfortunately, I drew the same conclusion as the first time. I will let other denizens decide what you were suggesting.

        Interesting that my previous clarification, for some unknown reason, disappeared – but I see that you saw it before it bit the dust.

        Also interesting that although you read it again, you still read something that wasn’t there. Your string remains unbroken – and nothing that other denizens decide or don’t decide can change that.

      • Mike Flynn,

        Thanks for your answer.

        First of all, if you reduce the rate of cooling of an object then it is going to be warmer than it otherwise would have been, and if it is still receiving a constant heat source then its temperature will increase in absolute terms. I'm not sure what your objection is here.

        Secondly, the radiative properties of CO2 have been directly measured in the laboratory and applied in various ways in the real world. I don't know about the specific experiments you refer to, but I would guess that replicating the greenhouse effect in a laboratory is going to be difficult because the GHE is dependent on the temperature structure of an atmosphere which is surrounded by a vacuum, so you can't just test it by surrounding an object in a lab with CO2 and applying heat. In any case the existence of the GHE is confirmed by actual observations in the outside world, such as measurements of outgoing radiation at the TOA.

        As for Feynman’s lectures I’m sure they are interesting in their own right but I doubt that they contain any observations which are not understood by modern atmospheric scientists who accept the existence of the GHE.

        It might help for me to understand your objection if you explained precisely at which point the mechanism of the GHE, as it would be explained by an atmospheric scientist, breaks down.

      • There seems to be a conservation law on this site:

        There’s always at least one denizen, who claims to know physics better than others, while that belief is based on the great power of total ignorance.

        Trying to educate those denizens has never led to any positive result.

      • Pekka,

        Oh, absolutely. But as a layman I sometimes find it interesting to get into this kind of discussion as a way of testing my own understanding of the subject.

    • Mike, the Poley Bears are dying out. What? You say they aren’t? Well, nevermind, then.
      =======

      • Kim

        I haven’t seen a single Poley bear here in the south of England. Proof positive of something or other and that it’s all our fault.

        Tonyb

    • That is science mate. Scientists care about all sorts of things that most people don’t give a toss about. Before you declare this interest a waste of time and money, realise that every now and then a scientist doing this kind of pure research will stumble across something really interesting and potentially useful in an unlikely place. That is why society is generally willing to pay them while they satisfy their rather odd curiosities.

      • Ian H,

        If you think that is science, you go ahead and fund it. This is not scientific research by any reasonable standard. Members of the public do not willingly pay scientists, as a general rule. This has been as true in the past as it is now.

        I agree that pure research occasionally stumbles across things. The key words are “occasionally” and “stumbles”. How broke do you have to get before you start wondering about the cost effectiveness of the process?

        Live well, and prosper,

        Mike Flynn.

    • Matthew R Marler

      Mike Flynn: I wonder if you could let me know why you think anybody cares about whether Arctic temperatures have gone up, down, or sideways?

      The most obvious evidence to readers here is probably that the paper received a lot of comment here. It might have generated less comment on a weblog devoted to high temperature superconductivity.

      • Matthew R Marler,

        Thank you for your input. I was asking the author, because I cannot see any use for his exercise. I assume he had a reason for spending time doing what he did. Of course, if his purpose was to generate blog comments, then I accept it. I am surprised.

        Live well and prosper,

        Mike Flynn.

      • Matthew R Marler

        Mike Flynn: I was asking the author

        So you were. Now he has no need to answer you, if he ever felt one in the first place.

    • Mike Flynn,
      People care enough about good science that they have donated almost enough money to fully defray the open source publishing costs for the Cowtan & Way paper. That’s several thousand dollars that your veiled insults have no impact on, LOL.

      • WHT,

        Governments have spent billions of dollars, for no perceptible benefit to anyone except the CAGW community.

        I don’t find that anything to laugh about. You do, obviously.

        Live well and prosper,

        Mike Flynn.

      • Flynn, it is not governments that are helping to defray the costs of the C&W paper, but online readers like me. You on the other hand are likely paid-off by your Aussie buddy Rupert to be a nuisance … an embarrassing one at that, LOL

      • WHT,

        How much have you contributed? I have contributed nothing. If you want to donate money to support research, rather than to alleviate human suffering, that is your choice.

        I choose otherwise.

        I am sorry you feel embarrassed. This is becoming an entrenched Warmist trait.

        Live well and prosper,

        Mike Flynn.

      • WebHubTelescope,

        It’s very difficult to launch a war against an enemy that doesn’t exist.

        Perhaps you can provide an example of “science” that you believe I engage in “rhetorical wars” against.

        I ask again – how much have you contributed to the publication of the paper?

        Live well and prosper,

        Mike Flynn.

    • John Carpenter

      What an intellectually stimulating question. It’s this type of comment that really raises the bar here at CE. Really, why do scientists even bother studying anything? Why investigate any observable phenomenon? What does it matter? Who really cares? What would we do without the likes of Mike to point out what he deems to be unworthy time spent by some people? We are in the presence of true intellectual genius here, someone who truly knows what will or won’t benefit mankind. Someone who knows true utility to humans. Thanks Mike! I’m sure Robert will think twice about his next project and perhaps consult you directly for advice on its worthiness to mankind.

      • John Carpenter,

        Climate is the average of weather. It is a calculation. You may choose to study aspects of weather. To do this, you will need to observe it, measure it, and hopefully come up with some theories relating to your observations and/or measurements that can, sooner or later, be verified or disproven by experiment.

        This is what I understand to be science, in essence. It involves application of the scientific method.

        In this context, “climate science”, (consisting of observing nothing, measuring nothing, but certainly reanalysing observations and measurements made by others, and altering to suit predetermined conclusions in some cases, and preparing “averages” of these numbers), achieves nothing.

        If you consider this science, you fund it. You have my full support.

        If you consider that taxpayers should pay for it, that is your right. You might like to point out one specific benefit of all the resources devoted to “climate science”.

        Robert did the right thing. He did the recalculation, creating his best guess for the values of missing data, in his own time, as I understand it.

        I merely asked a question – who cares?

        You may care, I don't. If you are trying to tell me I should care, you need to provide a reason. So far you haven't.

        Live well and prosper,

        Mike Flynn.

      • John Carpenter

        “I merely asked a question – who cares?

        You may care, I don't. If you are trying to tell me I should care, you need to provide a reason. So far you haven't.”

        You don’t have to care. But if you don’t care… Then why even bother asking or making a comment in the first place? To me your comment represents an arrogant point of view that if you don’t care, why should anybody else? If you don’t care, what utility does it have to mankind? If you don’t care, why make a commentary on what you think is science?

        The only explanation that answers those questions is that you do care. You posting the comment is the only evidence needed, otherwise you would not have been compelled to comment in the first place. Simple.

      • John Carpenter,

        I take your point. I was unclear.

        I was referring to yet another recalculation of figures, both observed and non-observed, purporting to be meaningful.

        If people have nothing better to do, then good luck to them. I do not share their enthusiasm for creating data where none exists, and then using guessed data to draw a conclusion.

        What’s wrong with spending time actually doing “science”? It probably produces more utilitarian outcomes, in the long run. A personal opinion only. I accept you disagree.

        Live well and prosper,

        Mike Flynn.

  5. “Most of the overall trend is associated with the jump in 2005; since 2005 (the period of the main sea ice loss)”

    First, I'm with you on this one, Judith. I suspect the oceans, which we now know are for some reason quite cold, have a lot to do with sea ice melt.

    The Arctic is almost always below freezing, so it is pushing out warm ocean heat when it is “hot.” It’s ocean temperatures, perhaps winds, that determine the Arctic sea ice extent.

    What does it mean to have a lot of “Sea Ice Loss” in the Arctic? It's never very thick. I suppose the minimum is 1/4 of the 1979 value, but so what? The 2005 and 2004 minimum volumes were nearly the same, according to this. Minimum is what matters for volume, since everything else melted.

    Meanwhile, Arctic sea ice extent is .848 million square kilometers shy of “average,” whatever that is, and Antarctica is .972 million square kilometers in excess, again, whatever that means, but on the same baselines. So global sea ice extent is, right now, above average.

    Antarctic temperatures are low, and declining, provided they are right. Arctic temperatures are increasing, provided they are right. But I would guess Arctic surface temperatures are more variable because water is a better conductor of energy than is ice, there is a continent in Antarctica, and the sea ice there is increasing. So total Antarctic sea ice volume may be growing. It's not like it's possible to tell with the Antarctic very easily, but I would guess the increase in sea ice extent is a good indicator that volume is also increasing.
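
    Just to make the arithmetic in that comment explicit (reading both anomaly figures as million km^2 on the same baseline, which is an assumption):

```python
# Assumed readings of the two anomalies quoted above, in million km^2.
arctic_anomaly = -0.848      # Arctic extent below its baseline average
antarctic_anomaly = +0.972   # Antarctic extent above its baseline average

global_anomaly = arctic_anomaly + antarctic_anomaly
print(f"global sea ice extent anomaly: {global_anomaly:+.3f} million km^2")  # +0.124, i.e. above average
```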

  6. The recent rash of papers attempting to explain away the pause demonstrates that ‘there is a lot of statistical power to manipulate and make the data say what it needs to say’.

    What would ‘a constant defender of good practice in science’ say to that?
    =========

    • Oops, Hat tip to Ted Carmichael, and, of course, Robert Way.
      ================

    • Some of the new data is very stubborn, like in situ buoys (IABP). When the facts don’t support your view, what are you going to do?

  7. Motivated by the useful focus group participant McIntyre, here you go:
    http://contextearth.com/2013/11/19/csalt-ju-jutsu/

    • Matthew R Marler

      WebHubTelescope: http://contextearth.com/2013/11/19/csalt-ju-jutsu/

      So you will be right or wrong together. The future will tell. If the true trend in the Arctic temps is less than what they impute, then you will be wrong together. The C. E. Chung, H. Cha, T. Vihma, P. Räisänen, and D. Decremer paper is a warning not to be too confident. I am sure it won’t be the last word.

      Your coefficient for lnCO2 has changed to 2.818. Why is that? It's a negligible difference, so I am guessing something like new data added to the data used for the least-squares estimation. Is it something else?

      Thinking more about the time derivative of T, your model has

      dT/dt = 2.818 (1/CO2) (dCO2/dt), which is 0 if dCO2/dt is 0. With estimation error and approximation error and all that, your model is compatible with
      dT/dt = eps (1/CO2), where eps is epsilon, a very small number. Your model is compatible, therefore, with a time to go from TCR to TCR + 0.5 C near the surface (after doubling of CO2 concentration) on the order of 500 years.

      • “Your coefficient for lnCO2 has changed to 2.818. Why is that? It's a negligible difference, so I am guessing something like new data added to the data used for the least-squares estimation. Is it something else?”

        The inconsequential differences around 2.82 +/- 0.01 in the figures are due to the number of data points I use in the regression and whether I put in a yearly filter. No big deal. Unless you are confused between the TCR and the coefficient of ln(CO2), which differ by a multiplicative factor of ln(2).

        As far as your other concern, maybe you should tell that to McIntyre. He will “accidentally” show how correct my analysis is, LOL.
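
        A minimal sketch of the relationship being discussed, namely that a response per CO2 doubling equals the ln(CO2) regression coefficient times ln(2). The synthetic series and the 2.82 coefficient used here are illustrative placeholders; this is not the CSALT model or its data.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1880, 2014)
co2 = 290.0 * np.exp(0.004 * (years - 1880))     # assumed smooth CO2 growth, ppm
k_true = 2.82                                    # assumed coefficient of ln(CO2), K per ln-unit
temps = k_true * np.log(co2 / co2[0]) + rng.normal(0.0, 0.1, years.size)

# Least-squares estimate of the ln(CO2) coefficient, then convert to a per-doubling response.
k_hat = np.polyfit(np.log(co2), temps, 1)[0]
per_doubling = k_hat * np.log(2.0)               # the ln(2) multiplicative factor mentioned above
print(f"ln(CO2) coefficient ~ {k_hat:.2f} K, response per doubling ~ {per_doubling:.2f} K")
```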

  8. Russia is in the best position to supply Arctic temperature data. But they are smarting from Greenpeace interference with their drilling program in that region. I wonder whether they are cooperating.

    • To say Russia is “smarting” from Greenpeace interference is an odd way to express it. Russia is simply doing the sensible thing, teaching these adolescent protestors the facts of life in the real world.

      Other governments are hardly falling over themselves trying to get their nationals back.

  9. A fan of *MORE* discourse

    How does Arctic warming fit into the overall picture of climate-change science?

    • Lucarini et al., Mathematical and Physical Ideas for Climate Science (arXiv:1311.1190):  Energy-balance thermodynamics continues to dominate climate dynamics on decadal-and-longer timescales.

    • Krissansen-Totton and Davies et al., Investigation of cosmic ray-cloud connections using MISR (arXiv:1311.1308):  “ABC: anything but carbon” explanations for global warming continue to fizzle.

    • Deza, Masoller, and Barreiro et al., Distinguishing the effects of internal and forced atmospheric variability in climate networks (arXiv:1311.3089)  Statistical analysis continues to resolve decadal-and-shorter dynamical variability with ever-improving precision.

    Conclusion I  In the multi-decade long run, James Hansen’s climate change worldview continues to look simple, solid, and just plain right.

    Conclusion II  Continuing-and-accelerating Arctic warming, ice-mass loss, and sea-level rise are in the cards.

    Question  Is strengthening climate-change science the reason why denialist websites are focusing ever-more-obsessively on politics, personalities, conspiracy theories, nutjob science, far-right economics, and general whinging?


    • Thanks for the links. Like you, Fan, I am becoming more convinced that thermodynamic energy balance arguments are perfectly valid first-order scientific arguments.

      Performing the equivalent of a variational minimization of the Gibbs free energy on the earth’s thermodynamic parameters works surprisingly well.
      see http://contextearth.com/context_salt_model/

    • FOMD, thanks for the link to the GCR paper.

      supports what I found.

    • “Changes in the galactic cosmic ray (GCR) flux due to variations in solar activity may provide an indirect connection between the Sun's and the Earth's climates. Epoch superpositional (composite) analyses of high-magnitude GCR fluctuations, known as Forbush decrease (FD) events, have been widely used to test this hypothesis, with varied results. This work provides new information regarding the interpretation of this approach, suggesting that FD events do not isolate the impacts of GCR variations from those of solar irradiance changes. On average, irradiance changes of ∼0.4 W m−2 outside the atmosphere occur around 2 days in advance of FD-associated GCR decreases. Using this 2 day gap to separate the effects of irradiance variations from GCR variations on cloud cover, we demonstrate small, but statistically significant, anomalous cloud changes occurring only over areas of the Antarctic plateau in association with the irradiance changes, which previous workers had attributed to GCR variations. Further analysis of the sample shows that these cloud anomalies occurred primarily during polar darkness, precluding the possibility of a causal link to a direct total solar irradiance effect. This work suggests that previous FD-based studies may have ineffectively isolated the impacts of GCR variations on the Earth's atmosphere.”

      http://benlaken.com/documents/JGR_LKW_2011.pdf

      • Since these guys bring it up, the poles might be the place to look for magnetic field modulation (of radiation) of clouds.

        1. The normal atmospheric circulation will move sulphur and nitrogen compounds from lower latitudes to the poles.
        2. The Earth’s magnetic field funnels ionizing radiation to the poles.
        3. As a possible added bonus, frequently one pole is sun lit while the other is dark.

        The poles might be the most sensitive places to Forbush decreases due to the confluence of chemicals and radiation.

    • Fan once again hits the trifecta, with 3 positive responses. The first linked paper, Lucarini et al:

      “Bistability and tipping points: Based on the evidence supported by Hoffman and Schrag (2002) and from numerical models (Budyko, 1969; Ghil, 1976; Sellers, 1969), it is expected that the Earth is potentially capable of supporting multiple steady states for the same values of some parameters such as, for example, the solar constant. Such states are the presently observed warm state (W), and the entirely ice covered Snowball Earth state (SB). This is due to the presence of two disjoint strange (chaotic) attractors.”

      “In the W states, surface temperature are 40–60 K higher than in the corresponding SB state and the hydrological cycle dominates the dynamics… …with respect to the corresponding SB states. The SB state is eminently a dry climate, with entropy production mostly due to sensible heat fluxes and dissipation of kinetic energy.”

      “A general property which has been found is that, in both regimes, the efficiency increases for steady states getting closer to tipping points and dramatically drops at the transition to the new state belonging to the other attractor.”

      Fan, what are you betting on? Only two big attractors, or many sub-attractors and many smaller tipping points?

  10. Judith

    From the Climatic Change manuscript you excerpt this:

    “For example, associated with this variability, the arctic SAT record shows two warmer periods in the 1930–40s and in recent decades, and two colder periods early in the 20th century and in the 1960–70s. The observed stratospheric air temperature variations are consistent with this pattern.”

    With so few long-lived temperature records we are not always comparing apples with apples, but according to Prof. Phil Jones the two warmest consecutive decades in the Greenland record are the 1930s/1940s.

    We will not know if the period from 2000 will beat that until 2020. This variability in temperatures, and these spikes, are nothing new at all; in fact they appear to be the norm. I commented on them for the periods 1820-1860 and 1920-1950, the latter here;

    http://judithcurry.com/2013/04/10/historic-variations-in-arctic-sea-ice-part-ii-1920-1950/

    There is a very much longer piece available with dozens more papers cited, many contemporary and of Russian origin.

    At the Scott Polar Institute in Cambridge there is ample evidence of other periods of notable warmth, including periods in the 1700s and the early 1500s.

    I remain a little bemused as to why the brief period around 2005 is considered to be so crucial. Perhaps better context is needed that stretches beyond the very brief satellite record.
    tonyb

  11. Judith,

    You make comments:

    The paper considers the three most recent reanalysis products since 1979: ERA Interim (ECMWF), CFSR (NOAA), and MERRA (NASA).

    and

    the last few years show substantial disagreement between MERRA and the other two reanalyses.

    Put together these comments are misleading. From the paper we can read:

    We analyze monthly mean 2m air temperature fields from three latest-generation atmospheric reanalyses: ECMWF ERA-Interim reanalysis (ERA-I) (Dee et al., 2011), NASA Modern-Era Retrospective analysis for Research and Applications (MERRA) (Rienecker et al., 2011) and the NCEP Climate Forecast System Reanalysis (CFSR) (Saha et al., 2010).
    ..
    We also analyze NCEP’s earlier-generation reanalysis product, the so-called NCEP II reanalysis (Kanamitsu et al., 2002). The period of analysis is from 1979 to 2011. Due to data availability, CFSR is only analyzed until the year 2009.

    Thus they actually use four reanalyses, but two are from NCEP. Both are probably included because the newer version ends in 2009. From Figure 2 we see that the newer CFSR reanalysis agrees with MERRA in 2009, deviating at that point from NCEP II and ERA-I. Thus MERRA does not deviate from “the two others” but agrees rather well with one of the other newer reanalyses as far as results are available.

  12. Dear Judith

    I’ll try and work up a graph showing the monthly variations in our isolated test region later, which I think is instructive. However for now I’ve just had a chance to look at one other source of observations which Robert picked up a while back.

    A conference proceeding by Chapman (2013) quotes a 2003-2013 trend for the region 60N-90N in the satellite radiometer data of 0.08 K/yr. (This is a different kind of instrument, unrelated to the UAH microwave data which we used indirectly in our paper).

    Our result for the same region is 0.082K/yr (i.e. 0.8C/decade). Our global trends also look similar.

    I'm afraid I don't know enough about the IR data to comment on the comparison, and I understand that the clear-sky requirement is an issue; however, it is a further source of observations to take into account.

    Paper here: http://proceedings.spiedigitallibrary.org/proceeding.aspx?articleid=1690262
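
    For readers who want to reproduce the kind of comparison described above, here is a minimal sketch: fit an ordinary least-squares slope to monthly anomalies and express it in K/yr and C/decade. The series below is synthetic with an assumed 0.08 K/yr trend; it is not the radiometer data or the reconstruction.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(2003, 2013, 1.0 / 12.0)                    # monthly time axis, 2003-2013
anom = 0.08 * (t - t[0]) + rng.normal(0.0, 0.5, t.size)  # assumed trend plus weather noise

slope = np.polyfit(t, anom, 1)[0]                        # OLS trend in K per year
print(f"trend = {slope:.3f} K/yr = {10.0 * slope:.2f} C/decade")
```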

  13. Taking the Cowtan-Way paper at face value, we are still stuck with the premise that the paper was LOOKING for data to support a conclusion, not allowing the data to lead to a conclusion.

    • Links here, for anyone not familiar with the issue.

      • Bob T calls the break he and StevieMac found @ ’05 ‘funky’. It scares the Hell out of me.
        ========

    • wrong.

      Having had the pleasure of talking with Robert and working with Robert over the past two years (he's perhaps the most curious and most skeptical grad student I've ever run into), I can tell you that he set out with one purpose:
      come up with the best estimate for this poorly observed part of the world.
      Your mind reading sucks.

      • So why is this poor estimate hyped all out of proportion to its value? He may be curious and skeptical, and naive, too.
        ========

      • Also, moshe, you should read what ilmis has written, using Robert Way’s own words, about Robert’s intellectual development over at climateaudit.org

        Mebbe you are young and naive, too.
        ============

      • I claim no capability of clairvoyance. Your rush to judgement notwithstanding, learn to differentiate an observation from a claim of telepathy.

  14. Pingback: The 2.8% effect | Climate Etc.

  15. “For example, associated with this variability, the arctic SAT record shows two warmer periods in the 1930–40s and in recent decades, and two colder periods early in the 20th century and in the 1960–70s. The observed stratospheric air temperature variations are consistent with this pattern.”

    Graphically:
    http://www.woodfortrees.org/plot/hadcrut4gl/detrend:0.8/mean:12

  16. Pingback: On Cowtan and Ray (2013) “Coverage bias in the HadCRUT4 temperature series and its impact on recent temperature trends” | Bob Tisdale – Climate Observations

  17. Pingback: On Cowtan and Ray (2013) “Coverage bias in the HadCRUT4 temperature series and its impact on recent temperature trends” | Watts Up With That?

  18. “This pattern of temporal changes may be associated with multi-decadal fluctuations on time scales of 50–80 years, which are known to be exceptionally strong in the Arctic and North Atlantic. Polyakov et al. (2008) demonstrated a strikingly coherent pattern of long-term variations of the key arctic climate parameters and strong coupling of longterm changes in the Arctic climate system with those at lower latitudes. ”

    cf. the 71-year period found in the new Halfar et al. paper:
    http://www.pnas.org/content/suppl/2013/11/14/1313775110.DCSupplemental

  19. I am usually sympathetic to the skeptical views on here, but I have not seen a single comment responding to Dr Way's request for the following: “Finally if you will be discussing our paper I was wondering if you could give your opinion on our cross-validation measures, our comparison with the IABP data, the tests with interpolating from SSTs versus land and the errors associated with leaving regions null (e.g. setting trends to global average).”
    If there are problems with the paper, don't be a bunch of wusses and just cop out.

    • I am interested in the cross-validation; I would say I'm skeptical that you can krige your way to good data where there was once none.

      But the links I’ve seen for the paper are behind a paywall.

  20. A circa-80-year period would be more in line with the variation in decadal rates of change that I extracted in my article here on CE a few months back.
    http://climategrog.wordpress.com/?attachment_id=496

  21. Pingback: Judith Curry: The 2.8% Effect | The Global Warming Policy Foundation (GWPF)

  22. Judith,

    I'd be highly suspicious of any reanalysis data given what I've seen in comparisons with areas that are well sampled. In short, in areas that are well sampled, the reanalysis products (NARR and MERRA; I'll have to look at others) have spatial patterns of trends that appear to be non-physical.

    Hmm, we will have something at AGU.

    • Yes, I am definitely suspicious of reanalyses; that is why I found their Fig. 2 to be interesting, comparing the left and right columns.

    • All of these procedures are creating a spatially linear average of a surface that is actually chaotic. Until your samples are all within, say, 50 miles of each other, you don't know what the actual average temperature of any unsampled area is.

      This goes to why I'm skeptical on the Arctic temps: most stations (in the data sets I've looked at, GSoD, CRU) in the polar Arctic are mostly coastal, where, if the water is not ice, temperatures are all about 31-32F and not -80F. Big shift in the virtual average with a small movement of the boundary between warm and cold air masses.

      • “All of these procedures are creating a spatially linear average of a surface that is actually chaotic. Until your samples are all within, say, 50 miles of each other, you don't know what the actual average temperature of any unsampled area is.”

        Wrong.

        1. You assume the surface is chaotic.
        2. You can get a great prediction with samples farther apart than 50 miles; simple hold-out analysis shows you this.

      • Steven Mosher | November 19, 2013 at 9:32 pm |

        “Wrong.

        1. You assume the surface is chaotic.
        2. You can get a great prediction with samples farther apart than 50 miles; simple hold-out analysis shows you this.”

        1. You assume it isn’t.
        2. Can you point me to something not behind a paywall that has such an analysis? As I mention here http://judithcurry.com/2013/11/18/uncertainty-in-arctic-temperatures/#comment-415738, where I live we get all sorts of weather; maybe it does average out, but I know our weather is different (like different-microclimate different) from the weather 100 miles to the south.
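
        Since a worked example of "simple hold-out analysis" was asked for, here is a minimal sketch of the idea: withhold a fraction of stations, predict them from the rest, and score the error. The synthetic "stations" and the inverse-distance interpolator are illustrative assumptions, not the method used by Cowtan and Way, BEST, or any other product.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
xy = rng.uniform(0, 1000, size=(n, 2))                    # station coordinates (km)
temp = 5 * np.sin(xy[:, 0] / 300) + 3 * np.cos(xy[:, 1] / 200) + rng.normal(0, 0.5, n)

def idw_predict(train_xy, train_t, test_xy, power=2.0):
    """Inverse-distance-weighted prediction at test_xy from the training stations."""
    d = np.linalg.norm(test_xy[:, None, :] - train_xy[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-6) ** power
    return (w * train_t).sum(axis=1) / w.sum(axis=1)

# Hold out 20% of stations and check how well the remaining 80% predict them.
mask = rng.random(n) < 0.2
pred = idw_predict(xy[~mask], temp[~mask], xy[mask])
rmse = np.sqrt(np.mean((pred - temp[mask]) ** 2))
print(f"hold-out RMSE: {rmse:.2f} K")
```

        Whether a hold-out score computed where stations exist carries over to truly unsampled regions is, of course, exactly the point being argued in this thread.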

  23. “impacts on climate change”

    And not the other way around. Hmmmm, real science here?

  24. Seems like, in lockstep with the IPCC's increased certainty (95%), we have transitioned from global to regional warming debates. WOW, PROGRESS!

  25. Tomas Milanovic

    All of these procedures are creating a spatially linear average of a surface that is actually chaotic. Until your samples are all within, say, 50 miles of each other, you don't know what the actual average temperature of any unsampled area is.

    While this is (partially) true, it is not the strongest argument.
    The strongest arguments (the same ones that show how EOFs can lead to large errors) are non-isotropy and non-stationarity.
    All spatially interpolating methods using only one parameter (here distance between 2 points) basically only work correctly for stationary isotropic fields.
    What does that mean?
    It means that spatial correlations (or covariances) between points A and B depend on the vector AB and not only on the distance between A and B. In the presence of anisotropic transport phenomena (e.g. winds, oscillations and currents) the correlations are not even stationary.
    Actually the meteorologists know that very well, because experience has taught them that there are special, highly anisotropic regions where the correlations get destroyed within 50 km in one direction while they seem to hold for 300 km or more in another direction.
    Just imagine that you establish spatial covariance matrices for a region of the Pacific Ocean during an El Nino episode and then you do the same thing during a La Nina episode. You will obtain quite different results. And if you mix in some temporal averaging on top, you obtain a mess that has no significance whatsoever, because the spatial correlations in energy-transporting fluids critically depend on the time scales used for the sampling.
    I suppose that this is what you meant by mentioning spatiotemporal chaos.
    .
    I have always looked with deepest distrust on all statistical methods establishing spatial correlations especially if those use only a single distance parameter. Such approaches are so loaded with unstated assumptions (isotropy, continuity, stationarity, homogeneity) that more time has to be spent to verify these assumptions than to roll out the statistical tool looking for covariances.
    . Now in the case that interests us here we deal with a region that is particularly non homogeneous (interfaces ice-water, ice-continent, water-continent), non isotropic and temporally oscillating at many time scales. I would say that it is difficult to find a worse example for using spatially interpolating statistical tools.
    The fact that you cannot experimentally verify the interpolations because per definition no data is available, comes as an additional difficulty and, of course, verifying that the method works on some other places which may have very different dynamics doesn’t prove anything..
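
Tomas’s point about direction-dependent correlations can be illustrated with a toy advected field: when a signal is carried along one axis, the correlation between two points depends on the orientation of their separation, not just its length, so any interpolation keyed to distance alone is using the wrong statistic. The construction below is entirely artificial (an AR(1) “weather” signal advected along x plus local noise) and makes no claim about actual Arctic statistics.

```python
import numpy as np

rng = np.random.default_rng(1)
nt, nx, ny = 2000, 30, 30

# Red-noise "weather" signal, advected one grid cell per time step along x.
s = np.zeros(nt + nx)
for t in range(1, nt + nx):
    s[t] = 0.9 * s[t - 1] + rng.normal()

# field[t, i, j]: the advected signal (delayed by i steps) plus local noise.
field = np.empty((nt, nx, ny))
for i in range(nx):
    field[:, i, :] = s[nx - i: nx - i + nt, None] + 0.5 * rng.normal(size=(nt, ny))

def corr(p, q):
    a, b = field[:, p[0], p[1]], field[:, q[0], q[1]]
    return np.corrcoef(a, b)[0, 1]

# Two station pairs with the SAME separation distance (10 cells), different orientations.
print("correlation, separation along x (advection axis):", round(corr((5, 15), (15, 15)), 2))
print("correlation, separation along y (cross-stream):  ", round(corr((15, 5), (15, 15)), 2))
```

A distance-only weighting scheme would treat these two pairs identically; the printout shows they are not.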

    • Tomas, “verifying that the method works on some other places which may have very different dynamics doesn’t prove anything.”

      Exactly

    • Tomas, you have said in other comments that the concept of averages has no place in chaos theory or in the use of vectors in a Hilbert space. Hence the statistical approaches that have been shown on this blog to date seem to have very little basis for making progress in the science underlying climatology.

    • “I suppose that this is what you meant by mentioning spatiotemporal chaos.”
      It is.
      I’m not sure, but I sometimes wonder if people’s views on this might depend somewhat on the weather where they spend their time. I live at roughly 41 N, 81 W, in the shadow of one of the Great Lakes, and far enough north that the jet stream passes overhead a lot; we get both Canadian cold air and tropical warm air. We get all kinds of weather, and storms track in all different directions.

    • Investigations into the properties of the classic Lorenz model equations of 1963 are generally conducted with constant values of the parameters Ra, Pr and b, and do not consider temporal variations in the forcings associated with the model. The parameters are not constant in the Earth’s climate system, and the system experiences both internal and external temporally varying forcings (and spatially varying ones too, but space is not a part of the 1963 equations). Additionally, the original Lorenz system did not account for viscous dissipation, the conversion of fluid motions (kinetic energy) into thermal energy, which appears as a positive-definite increase in the temperature of the fluid.

      Each of these effects, the parameters, the variable forcings and viscous dissipation, has the potential to significantly alter the chaotic response of the system. Specifically, each can cause non-chaotic response. We can speculate about the effects of spatial variations of all of these, and all do vary spatially. My first guess is that the effects on a local basis are the same as those on a temporal basis.

      The papers cited below offer insights into these effects within the context of temporal chaotic response. Both show that it is possible for the chaotic response to disappear.

      Generally, brute-force / blind averaging over large spatial and temporal ranges always has the potential to annihilate regions in these domains that are critically important relative to some system-response functions of interest. Averaging a chaotic response with a non-chaotic response seems to me to be averaging apples and zebras.

      V. Pelino, F. Maimone, and A. Pasini, Oscillating forcings and new regimes in the Lorenz system: a four-lobe attractor, Nonlinear Processes in Geophysics, 19, 315–322, 2012, doi:10.5194/npg-19-315-2012

      Abstract
      It has been shown that forced Lorenz models generally maintain their two-lobe structure, just giving rise to changes in the occurrence of their regimes. Here, using the richness of a unified formalism for Kolmogorov-Lorenz systems, we show that introducing oscillating forcings can lead to the birth of new regimes and to a four-lobe attractor. Analogies within a climate dynamics framework are mentioned.

      The first paragraph of the Introduction

      1 Introduction
      Some years ago, in a pioneering study about the influence of external forcings on patterns of climate variability, Corti et al. (1999) suggested that these forcings led to a change in the frequency of occurrence of dominant regimes of the Northern Hemisphere atmospheric circulation in the second half of the twentieth century. The authors also showed that this situation is consistent with the simple dynamical-system picture obtained by the insertion of a constant forcing term in the Lorenz system. In fact, even in the latter case, one observes no creation of new regimes/lobes on the Lorenz attractor, but a change in the frequency of residence of the state in the two lobes is clearly detectable via calculation of the two associated values of the probability density function. Even increasing the forcing value does not lead to new regimes but just to the disappearing of chaos: after a certain threshold the attractor becomes a fixed point (see, for instance, Pasini, 2008 and references therein).
      [ my bold, ed. ]

      Valerio Lucarini and Klaus Fraedrich, Symmetry breaking, mixing, instability, and low-frequency variability in a minimal Lorenz-like system, Physical Review E, 80, 026313, 2009, DOI: 10.1103/PhysRevE.80.026313

      Abstract
      Starting from the classical Saltzman two-dimensional convection equations, we derive via a severe spectral truncation a minimal 10 ODE system which includes the thermal effect of viscous dissipation. Neglecting this process leads to a dynamical system which includes a decoupled generalized Lorenz system. The consideration of this process breaks an important symmetry and couples the dynamics of fast and slow variables, with the ensuing modifications to the structural properties of the attractor and of the spectral features. When the relevant non dimensional number (Eckert number Ec) is different from zero, an additional time scale of O(Ec^−1) is introduced in the system, as shown with standard multi scale analysis and made clear by several numerical evidences. Moreover, the system is ergodic and hyperbolic, the slow variables feature long-term memory with 1/f^3/2 power spectra, and the fast variables feature amplitude modulation. Increasing the strength of the thermal-viscous feedback has a stabilizing effect, as both the metric entropy and the Kaplan-Yorke attractor dimension decrease monotonically with Ec. The analyzed system features very rich dynamics: it overcomes some of the limitations of the Lorenz system and might have prototypical value in relevant processes in complex systems dynamics, such as the interaction between slow and fast variables, the presence of long-term memory, and the associated extreme value statistics. This analysis shows how neglecting the coupling of slow and fast variables only on the basis of scale analysis can be catastrophic. In fact, this leads to spurious invariances that affect essential dynamical properties (ergodicity, hyperbolicity) and that cause the model losing ability in describing intrinsically multi-scale processes.
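
For readers who want to experiment with the forced-Lorenz behaviour described in the quoted introduction, here is a minimal sketch. The forcing is added as a constant term f·cos(θ), f·sin(θ) in the x and y equations, which is one common way such experiments are set up; the particular form, parameter values and forcing amplitudes below are illustrative assumptions, not a reproduction of Pelino et al. or Lucarini and Fraedrich.

```python
import numpy as np
from scipy.integrate import solve_ivp

sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0

def lorenz_forced(t, v, f, theta):
    x, y, z = v
    return [sigma * (y - x) + f * np.cos(theta),
            x * (rho - z) - y + f * np.sin(theta),
            x * y - beta * z]

def lobe_stats(f, theta=np.pi / 4, t_end=500.0):
    sol = solve_ivp(lorenz_forced, (0.0, t_end), [1.0, 1.0, 20.0],
                    args=(f, theta), max_step=0.01)
    x = sol.y[0][sol.t > 50.0]          # discard the initial transient
    return (x > 0).mean(), x.std()      # lobe occupancy and variability

for f in (0.0, 5.0, 50.0):
    occ, spread = lobe_stats(f)
    print(f"f = {f:5.1f}: fraction of time on the x>0 lobe = {occ:.2f}, std(x) = {spread:.2f}")
```

Per the quoted introduction, a modest constant forcing mainly changes how often the two lobes are visited, while beyond some threshold chaos disappears and the attractor collapses to a fixed point; in this sketch a lobe-occupancy fraction drifting away from 0.5 indicates the former, and std(x) shrinking toward zero would indicate the latter.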

  26. OK, last night I did a comparison against the MERRA reanalysis data. Obviously testing a hypothesis always takes much longer than making one, so inevitably the discussion has moved on; however, there is some interesting science here.

    Background: one of our validations uses the most isolated part of the central Arctic. This is the place most isolated from observations, and it is sea ice, so if we can get this right, everywhere else should be easier.

    We need something against which to validate. The Rigor IABP/POLES work provides observational data up to 1998. Unfortunately we’re mainly interested in the period after that, so we also validated against ERA-Interim. ERA, alone of the reanalyses, assimilates land station observations. The other reanalyses we looked at (NCEP, NCEP2, 20CR) all go a bit bonkers (that’s a statistical term) in our test region, presumably because they aren’t anchored to air temperature observations. So I didn’t hold out strong hopes for MERRA.

    The results are here:
    http://www-users.york.ac.uk/~kdc3/papers/coverage2013/arctic_30merra.pdf

    Trends are:
    Hybrid: 1.51 C/decade
    MERRA: 1.77 C/decade
    ERA-I: 2.10 C/decade

    I must say I am very impressed with MERRA. It looks as though it provides a more realistic reconstruction than ERA without using the land station data, and it doesn’t show any of the bonkers behaviour of the other reconstructions. That also makes it useful as an independent test for our land reconstruction. I expect this will turn into another update, maybe in the new year.

    Thanks for bringing this to our attention!
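
For reference, trends like the “C/decade” figures above are ordinarily obtained from an ordinary least-squares fit of the area-averaged anomaly series against time, with the slope scaled to degrees per decade. A minimal sketch on synthetic data (the series below is not the Hybrid, MERRA or ERA-I data):

```python
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1997.0, 2013.0, 1.0 / 12.0)                        # monthly time axis
anom = 0.15 * (years - years[0]) + rng.normal(0.0, 0.8, years.size)  # ~1.5 C/decade plus noise

slope, intercept = np.polyfit(years, anom, 1)                        # OLS fit, degrees per year
print(f"fitted trend: {slope * 10:.2f} C/decade")
```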

  27. What I don’t understand is this: if the Arctic temperatures are so important, why haven’t scientists come up with a method of measuring the temperature over most of the Arctic? I am sure they could devise automatic instrumentation that could be seeded over large areas of the Arctic with some of the billions of dollars that have already gone to climate change research. Compare it to the effort put into the Argo buoys for sea temperature measurement.
    I can only think that it is not really an issue.

    • In the NCDC GSoD dataset, there are a total of 20 stations north of 80° N. It looks like 12 was the most operating at the same time. In the first and last years of a station’s record, data are missing on some dates. There isn’t a lot of station data for >80° N.
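
A quick way to check a station count like this against whatever GSoD-style station list you have on hand; the file name and column names below are hypothetical placeholders, not the actual NCDC metadata format:

```python
import pandas as pd

stations = pd.read_csv("gsod_station_list.csv")   # hypothetical file name
arctic = stations[stations["LAT"] > 80.0]         # hypothetical latitude column
print(len(arctic), "stations north of 80 N")
print(arctic[["STATION_NAME", "LAT", "LON"]])     # hypothetical column names
```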

  29. This paper is confusing because the climate signal is intertwined with the seasonal signal during the year. The paper would have been more useful had the temperature been measured at the conclusion of each year, to minimize the seasonal signal.

  31. Tomas Milanovic

    Peter Davies

    Tomas, you have said in other comments that the concept of averages has no place in chaos theory or in the use of vectors in a Hilbert space

    This is not exactly what I said.
    But read carefully the extract of the paper quoted by Dan Hughes:

    This analysis shows how neglecting the coupling of slow and fast variables only on the basis of scale analysis can be catastrophic. In fact, this leads to spurious invariances that affect essential dynamical properties (ergodicity, hyperbolicity) and that cause the model losing ability in describing intrinsically multi-scale processes.

    This must be understood as follows: chaotic systems (and weather and its averages are an example) cannot be uniquely predicted even if they are strictly deterministic.
    Therefore, and this is also something the Chief often reminds us of, the only way to interpret and predict them is statistical. This is analogous to quantum mechanics – the Schrödinger equation is strictly deterministic, but it is impossible to predict uniquely the result of a measurement; only the probabilities of measuring this or that value can be predicted.
    Now in fluid dynamics (e.g. weather and climate) we do not have an equivalent of the Schrödinger equation which would allow us to predict the probabilities of future states.
    However, we have something similar: ergodic theory.
    If a system is ergodic then, despite the chaos, it has an invariant probability distribution of future states and obeys the ergodic theorem, which allows temporal averages to be computed. The hard part is to prove that the system is indeed ergodic and then to find this probability distribution.

    That’s why the comment by Dan Hughes is fundamental – if you neglect (or destroy by averaging) the couplings between fast and slow variables, then what you obtain is an irrelevant mess.
    All the above applies strictly to temporal chaos, i.e. when space doesn’t play any role and the dynamics are described only by a system of ODEs.
    In the case of weather/climate the problem is much more difficult, because the existence of energy transport couples the variables much more strongly, and often with a high anisotropy at different locations.
    It is in this case that Fraedrich’s warning applies completely – by doing spatial and/or temporal averages before the analysis is done, one destroys couplings and the “results can be catastrophic”.
    In other words, if some correlations are found between such averaged variables, those correlations can be completely spurious.

    So again – using statistical tools on highly anisotropic and non-homogeneous spatial regions dominated by coupled “fast” and “slow” variables (like the Arctic) is not likely to yield any valid spatial correlations. Especially not if the dynamical parameters are temporally averaged and/or if the distance between two points is the only “explaining variable” used.
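
The practical content of the ergodic theorem invoked above can be seen in a deliberately simple stand-in: for a stationary, ergodic process, the time average along one long trajectory converges to the ensemble average over independent realizations. The AR(1) process below is chosen only because its invariant distribution is known exactly; nothing here is specific to weather or climate.

```python
import numpy as np

rng = np.random.default_rng(3)
phi = 0.95                                   # AR(1) memory parameter

def ar1(n, x0=0.0):
    """One realization of x[t] = phi*x[t-1] + unit-variance Gaussian noise."""
    x = np.empty(n)
    x[0] = x0
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

# Time average of x^2 along one long trajectory (after a burn-in).
long_run = ar1(200_000)
time_avg = np.mean(long_run[1_000:] ** 2)

# Ensemble average of x^2 at a fixed late time over many independent short runs.
ensemble = np.array([ar1(1_000)[-1] for _ in range(2_000)])
ens_avg = np.mean(ensemble ** 2)

print(f"time average of x^2:     {time_avg:.2f}")
print(f"ensemble average of x^2: {ens_avg:.2f}")
print(f"exact invariant value:   {1.0 / (1.0 - phi ** 2):.2f}")
```

For a non-ergodic system the first two numbers need not agree, and no single invariant value exists to compare them with; that is the distinction Tomas is drawing.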

    • Thanks for your response Tomas. I surmise that statistics do have a (somewhat limited) role in non-linear dynamics and that the assumption of ergodicity is crucial.

      As an economist and financial analyst I have dealt with non-ergodic time series data for most of my working life (in excess of 30 years). Are you of the opinion that climate time series data could be ergodic? Or even likely to be?

      Because if not, we are left with averages being of little assistance in climate research.

  32. Tomas Milanovic

    Peter Davies

    As an economist and financial analyst I have dealt with non-ergodic time series data for most of my working life (in excess of 30 years). Are you of the opinion that climate time series data could be ergodic? Or even likely to be?

    This is, for me, the single most interesting and important question.
    Most climate scientists do not ask it, and the few (generally non-climate scientists) who do ask it have no answer yet.
    Actually, while the ergodic theory of temporal systems (which led to the development of statistical mechanics) has been known since Boltzmann, the ergodic theory of spatio-temporal systems is still in its infancy.
    And of course the question about ergodicity is not a question about data series; it is a fundamental dynamical question about the behaviour of the whole system.
    Only when it is answered at the global level may a search for signs of it in data series begin. Before it is cleared up, you don’t even know exactly what you should be looking for.
    Semi-empirical approaches like Tsonis’s or the stadium wave are probably the beginnings of climate paradigms that move away from just simulating (GCMs) or using primitive low-dimensional equilibrium models, and that take the system’s dynamics and spatial couplings seriously.
    Of course economics is not ergodic – if it were, it would be easy to become very rich :)

    All this to say that, even if the question of what ergodicity means for spatio-temporal systems is not fully cleared up, I am not sure that the climate is ergodic all the time.
    The evidence shows that climate proceeds more by shifts than by slow and continuous evolution, which suggests a very complex, non-ergodic dynamics. My qualitative model would be that there are volumes of the attractor where an approximate ergodicity exists (for some time), and then volumes where fast shifts occur and the system transits to another, quite different quasi-ergodic behaviour.

    • So you are saying that the property of being ergodic is never set in stone for any system? It can change? If so, what exogenous influences in weather/climate could have this capability?

      Based on what I have learned from this dialogue so far, ergodicity stems from endogenous influences on a system that are capable of representation by a PDF, whereas exogenous influences are incapable of being so represented, leaving any such system non-ergodic and intractable?

    • SOME exogenous influences could be so described.

    • I like the message of Tomas’ last sentence:

      My qualitative model would be that there are volumes of the attractor where an approximate ergodicity exists (for some time), and then volumes where fast shifts occur and the system transits to another, quite different quasi-ergodic behaviour.

      What I like is the implication that assuming ergodicity may lead to good results in many cases, but we cannot trust the applicability of that assumption more generally.

      From a formal perspective it seems hopeless to get positive answers to all questions like:
      – Is the Earth system ergodic in the absence of external disturbances?
      – How do the frequent external disturbances affect the Earth system?
      – Are the large GCMs ergodic? (Studying the models tells us whether there’s any hope that they are useful in answering specific questions. A separate issue is whether the answer they give is the correct one for the real Earth system.)
      – …

      From a practical point of view, empirical observations tell us something about questions like:
      – Are there apparent strong deviations from ergodicity in studies performed at some spatial and temporal scales?
      – If there are, how have they manifested themselves in the observations?

      Any idea that present climate science can produce useful projections of the future is based on the assumption that both the models and the real Earth system are ergodic enough at the time scale considered. Evidence for and against that assumption can be gained both by making sure that the models are ergodic enough and by studying empirical data on the history of the Earth system. For this argument it’s essential that it be applied to specified spatial and time scales, avoiding claims that go beyond them.

  33. Tomas Milanovic

    So you are saying that the property of being ergodic is never set in stone for any system? It can change? If so, what exogenous influences in weather/climate could have this capability?

    No, it is actually set in stone. For any given dynamics (e.g. a system of PDEs or ODEs that describes the behaviour), and regardless of the “forcings”, you can just check the ergodic hypothesis and obtain a yes/no answer with no ambiguity.
    A system of hard balls is ergodic (= statistical mechanics).
    A biased or unbiased coin is ergodic. Etc.
    A non-ergodic system must be imagined as something like a biased die whose bias is dynamically changing all the time. However, if for a certain period of time the bias doesn’t change (much), the system would not be distinguishable from an ergodic one if we observed it only during this time.

    But as Pekka very rightly observes (I translate it into my terms), the ergodic hypothesis is quite easy to verify empirically or theoretically only for low-dimensional systems.
    The problem with weather and with climate is that they are infinite-dimensional – i.e. any state of the system may be expressed as an infinite sum of functions (fields) – so it is extremely difficult to verify the hypothesis empirically, and as for the theory, it doesn’t really exist yet for spatio-temporal systems.
    Empirical verification would mean, for example, answering the following question: given 100 different future states Si produced by a GCM, can we say that the probability of each of them is 1/100, or some fixed number Xi?
    Obviously you can’t answer it (and not only because it is not certain that the GCM simulates the dynamics correctly), because the 100 future states were produced by arbitrarily selecting 100 different perturbations from a possible infinity.
    Nothing guarantees that this arbitrary sample of 100 is representative of the probabilities among the infinity of possible future states.
    And observing the realization doesn’t help either, because you can’t redo the experiment many times – each realization happens exactly once.

    Expressed like this, it could appear that the problem is circular and can’t be solved.
    But that would be too pessimistic, and I also share what Pekka said.
    Actually, meteorologists are doing exactly that and are able to make (probabilistic) predictions with better odds than 1/N. Much of this skill is empirical and based on experience with a specific spatial region, but the fact that it works shows that the behaviour of the system, at least at small time and space scales, does exhibit some “ergodic” properties.
    Of course, when one goes over to much larger time and space scales it becomes more difficult, and some dose of ergodicity is less obvious to detect.
    That’s exactly why I strongly support approaches like Tsonis’s and the stadium wave, because those are ideas that try to tackle this very difficult problem.
    As in every scientific endeavour, one always starts with many rather simple attempts, most of which are wrong or only partially valid, until somebody comes along with the right idea that realizes the synthesis of all previous attempts and explains why this approach worked while that other one didn’t.
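
Tomas’s “biased die whose bias keeps changing” picture is easy to mimic numerically: within an epoch of roughly constant bias the observed frequencies settle down as if the system were ergodic, but frequencies estimated in different epochs disagree, so no single invariant distribution describes the whole record. The drift law below (two abrupt regime shifts in a biased coin) is an arbitrary illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 60_000

# Probability of "heads" shifts abruptly twice, mimicking regime shifts.
t = np.arange(n)
p = np.where(t < 20_000, 0.30, np.where(t < 40_000, 0.55, 0.45))
flips = rng.random(n) < p

for name, sl in [("epoch 1", slice(0, 20_000)),
                 ("epoch 2", slice(20_000, 40_000)),
                 ("epoch 3", slice(40_000, 60_000)),
                 ("whole record", slice(0, 60_000))]:
    print(f"{name:12s}: observed frequency of heads = {flips[sl].mean():.3f}")
```

Within any one epoch the sample frequency converges nicely; the whole-record average is a well-defined number but does not describe the statistics of any epoch, which is the sense in which averaging across regimes can mislead.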

    • Thanks for your thoughts, Tomas and Pekka. To my mind the assumption of ergodicity seems consistent with the real-time modelling of weather performed by meteorologists, but it seems dubious when attempts are made to predict over longer timescales with the current GCMs, and when such predictions are used for policy-making with respect to carbon emissions.

  34. I have commented before, but the total sea ice area for 2013 is going to be above average for a WHOLE YEAR very shortly. This will put a big crimp in any “kriging” rubbish by Cowtan et al.
    With the extent positive, any Arctic amplification will be wiped out by Antarctic de-amplification, and there will be a hockey-stick spike down toward global cooling for 2013. Cannot wait.
    Judy should do a post on this significant event, which runs strongly counter to this recent paper.

  35. Fred Moolten’s comment here (posting as fredmoolten) and David Young’s comment here are related to the recent discussions in this thread. [ Steve McIntyre has closed comments on that thread. ]

    David wrote:

    Fred, this is an interesting question. In my experience, in a complex nonlinear system all scales affect all other scales. So, if models don’t reproduce the recent climate system changes, this may affect the longer time scales too. My years of experience since our last discussions at Judith’s have convinced me that simple models constrained with good data may be more accurate than complex ones with many parameters.

    One common way of arguing about the models is “they don’t simulate weather, but they get the long term right.” If so, are they any better than simple conservation-of-energy models? Unphysical dissipation means details are smeared and damped and subtle effects are simply lost. But the overall energy balance might still be right.

    Coupling between the fast and slow phenomena and processes that are critical to system-response functions of interest must also be resolved by the numerical solution methods used in the GCMs. Because the models are made up of a very large number of PDEs, ODEs and algebraic equations, successfully attacking the chaos/ergodicity issues at the continuous-equation level for GCMs is very likely far beyond reach. The algebraic equations are in fact generally used to represent the small spatial- and temporal-scale phenomena and processes, and these representations replace the PDEs and ODEs that describe those phenomena and processes at a fundamental level. Small here means smaller than the discrete spatial and temporal increments used for numerical solution of the equation systems, and these are quite large for practical GCM applications. Auxiliary special-purpose differential-equation (PDE or ODE) models might be used for some small-scale phenomena and processes. These will sometimes be numerically integrated with methods and discrete approximations different from those used for the general equations.

    Importantly, in the numerical solution domain, where extensive computational work is required for practical applications, the algebraic models appearing on the right-hand side of the fundamental equations will generally be evaluated implicitly at the new time level. The objective is to allow use of the largest possible discrete time-step size. What this means is that the time constants for these models are more or less completely ignored, especially for those with the most rapid response. The temporally fast processes are generally not resolved in the numerical solution domain.

    Additionally, many of the algebraic parameterizations act like instantaneous switches whenever threshold values of the dependent variables are attained; the temperature of water at its phase-change states, for example. The actual value of the dependent variable will seldom, if ever (unless special considerations are introduced into the numerical solution methods), be exactly equal to a threshold value. These switches introduce discontinuities into the solution domain. Discontinuities can also be introduced if different algebraic parameterizations for a given phenomenon or process are needed to cover wider ranges of the states that the materials have previously attained; the transition from laminar to turbulent flow, for example. Some parameterizations involve gradients of the dependent variables; an atmospheric stability criterion is an example. It would be interesting to see the effects of discontinuities in the parameterizations on the temporal chaotic response.

    The presence, or absence, of chaotic response in a GCM calculation could be determined by calculating the Lyapunov exponents for the selected system responses. For this approach to be successful, it would first be necessary that the calculated numbers be free of numerical artifacts.
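
The Lyapunov-exponent diagnostic mentioned above is straightforward to sketch on a low-order system such as Lorenz-63 (doing it for a full GCM is another matter entirely, for exactly the reasons given). Two nearby trajectories are integrated with a fixed-step RK4 scheme, their separation is rescaled at regular intervals, and the mean logarithmic growth rate estimates the largest exponent; a positive value indicates chaotic response. This is the standard textbook two-trajectory method, not the procedure of any particular model.

```python
import numpy as np

sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0

def lorenz(v):
    x, y, z = v
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(v, h):
    k1 = lorenz(v)
    k2 = lorenz(v + 0.5 * h * k1)
    k3 = lorenz(v + 0.5 * h * k2)
    k4 = lorenz(v + h * k3)
    return v + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

h, steps_per_renorm, n_renorm, d0 = 0.01, 100, 400, 1e-8

a = np.array([1.0, 1.0, 20.0])
for _ in range(5_000):                      # spin up onto the attractor
    a = rk4_step(a, h)
b = a + np.array([d0, 0.0, 0.0])            # nearby companion trajectory

log_growth = []
for _ in range(n_renorm):
    for _ in range(steps_per_renorm):
        a, b = rk4_step(a, h), rk4_step(b, h)
    d = np.linalg.norm(b - a)
    log_growth.append(np.log(d / d0))
    b = a + (b - a) * (d0 / d)              # rescale the separation back to d0

lam = np.mean(log_growth) / (h * steps_per_renorm)
print(f"estimated largest Lyapunov exponent ~ {lam:.2f} per unit time (positive => chaos)")
```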

  37. Each of these people has some aspect of the Arctic temperature right, but collectively they act like blind men trying to guess what an elephant is. What is missing is the overall picture I published in 2011 [E&E 22(8):1069-1083]. It was based on the observations of Kaufman et al. (2010), who used circum-Arctic lake deposits to determine the Arctic temperature for the last two thousand years. What they found was a slow, linear cooling for most of those two thousand years, until the temperature suddenly turned up at the turn of the twentieth century. It looked very much like a hockey stick, so Joe Romm listed it in his collection of hockey sticks. That sudden upturn at the turn of the twentieth century initiated the current Arctic warming.

    But things are not quite that simple. The Kaufman et al. data did not have the time resolution to show what really happened during the twentieth century, but NOAA’s Arctic Report Card for 2010 did. With the help of their temperature chart I was able to determine that after the warming started there was a pause in mid-century, from 1940 to 1970. The warming did not simply stop; there was an actual reversal, with temperature decreasing at the rate of 0.3 degrees per decade during that pause. I checked what atmospheric carbon dioxide was doing during this entire period and it was doing nothing – just a smooth graph that lined up with the Mauna Loa data. From that it was clear that the warming (as well as the cooling) had nothing whatsoever to do with the greenhouse effect. None of the authors or the commentators here has said anything about it, but the context of their remarks makes it clear that to them “CO2 done it” and it is not worth even mentioning. Sorry, but that’s a no-go: CO2 does not warm the world.

    So what else could cause this warming? It appeared quite suddenly over a large area of the Arctic. I decided that the only thing that could cause it would be a rearrangement of the North Atlantic current system that began to carry warm Gulf Stream water into the Arctic Ocean. The mid-century pause could then be understood as a temporary return of the original flow path of the currents. This points to the necessity of researching the rest of the circum-Atlantic temperature history, something that is not readily available now.

    The peak warming of the early century was in the twenties and thirties. It even kept Russian Arctic ports ice free during World War II, thus permitting US Lend-Lease supplies to reach Stalin during the war. That is how the Red Army got their Spam. The warming restarted about 1970 and since about 2011 has exceeded the early-century warming. This, incidentally, also explains why the Arctic is still warming while the rest of the world is not.

    What worries me most about the Arctic is the possibility of some kind of cyclical process that could come out of nowhere. For example, if the cooling of the mid-twentieth century should return, it might put an end to the Arctic mineral exploration and trans-shipments that are now possible. The early-century warming lasted about 40 years. It was followed by 30 years of mid-century cooling. The warming that followed this cooling has lasted 43 years. What next?

    • Arno Arrak,
      I found that the surface station temperature record, when not all averaged into a lump, shows very similar results.
      You can see the difference between North American and Eurasian temperatures.

    • I’m sorry, I meant to add that you can follow the link in my name to see the data.

