Separating natural and anthropogenically-forced decadal climate variability

by Judith Curry

The issue of separating natural from anthropogenically forced variability, particularly in the context of the attribution of 20th century climate change, has been the topic of several previous threads at Climate Etc.  It has also been a key point of contention between the climate establishment and skeptics.  There are some encouraging signs that the climate establishment is maturing in its consideration of this issue.

Distinguishing the Roles of Natural and Anthropogenically Forced Decadal Climate Variability:  Implications for Prediction

Amy Solomon, Lisa Goddard, Arun Kumar, James Carton, Clara Deser, Ichiro Fukumori, Arthur M. Greene, Gabriele Hegerl, Ben Kirtman, Yochanan Kushnir, Matthew Newman, Doug Smith, Dan Vimont, Tom Delworth, Gerald A. Meehl, and Timothy Stockdale

Abstract. Given that over the course of the next 10–30 years the magnitude of natural decadal variations may rival that of anthropogenically forced climate change on regional scales, it is envisioned that initialized decadal predictions will provide important information for climate-related management and adaptation decisions. Such predictions are presently one of the grand challenges for the climate community. This requires identifying those physical phenomena—and their model equivalents—that may provide additional predictability on decadal time scales, including an assessment of the physical processes through which anthropogenic forcing may interact with or project upon natural variability. Such a physical framework is necessary to provide a consistent assessment (and insight into potential improvement) of the decadal prediction experiments planned to be assessed as part of the IPCC’s Fifth Assessment Report.

Citation:  Solomon, Amy, and Coauthors, 2011: Distinguishing the Roles of Natural and Anthropogenically Forced Decadal Climate Variability. Bull. Amer. Meteor. Soc., 92, 141–156.  doi: 10.1175/2010BAMS2962.1

Link to the complete article [here].

JC comment: The first sentence of the abstract really caught my attention: Given that over the course of the next 10–30 years the magnitude of natural decadal variations may rival that of anthropogenically forced climate change on regional scales. . . I don’t recall the climate establishment “giving” this one before.  The implication of this is that the warming from 1970 or 1980 to 2000 should be operating under the same givens also.

From the Introduction:

As the science of decadal prediction is in its infancy, one would like to assess and understand the following:

  1. the expectations for added regional climate information and skill achievable from initialized decadal predictions;
  2. what physical processes or modes of variability are important for the decadal predictability and prediction problem, and whether their relevance may evolve and change with time;
  3. what elements of the observing system are important for initializing and verifying decadal predictions; and
  4. in terms of attribution, to what extent are regional changes in the current climate due to natural climate variations and thus transitory, and to what extent are they due to anthropogenic forcing and thus likely to continue.

The purpose of this paper is to describe existing methodologies to separate decadal natural variability from anthropogenically forced variability, the degree to which those efforts have succeeded, and the ways in which the methods are limited or challenged by existing data. Note that the separation of decadal natural variability from anthropogenically forced variability goes beyond what has already been accomplished in previous studies that focused primarily on the detection of a long-term anthropogenic signal (Hegerl et al. 2007b) because on decadal time scales anthropogenic effects may be nonmonotonic, regionally dependent, and/or convolved with natural variability.

JC comment: the long-term signal from anthropogenic forcing was detected in the AR4 basically for the period 1970 or 1980 to 2000, without accounting for this: because on decadal time scales anthropogenic effects may be nonmonotonic, regionally dependent, and/or convolved with natural variability.

Observational uncertainties

Verification of the forced component of twentieth-century climate trends simulated in model experiments depends on the existence of accurate estimates of these trends in observations. Given the limited sampling in both space and time of the observations and proxy records, these verifications must be handled carefully. In particular, knowledge of the spatial patterns and magnitudes of climate trends over the oceans is hampered by the uneven and changing distribution of commercial shipping routes and other observational inputs as well as different approaches to merging analyses of the observations (Rayner et al. 2011).

An example of the impact of observational uncertainties on the interpretation of twentieth-century SST trends is shown in Fig. 7 based on an uninterpolated dataset [version 2 of the Hadley Centre SST dataset (HadSST2); Rayner et al. 2006] and two optimally interpolated reconstructions [the Hadley Centre Sea Ice and SST dataset (HadISST; Rayner et al. 2003) and version three of the National Oceanic and Atmospheric Administration’s (NOAA’s) extended reconstructed SST (ERSSTv3; Smith et al. 2008)]. Although trends from the three datasets share many features in common, such as a strengthening of the equatorial Pacific zonal temperature gradient (Karnauskas et al. 2009), there are also differences. Most notably, the eastern equatorial Pacific shows cooling in HadISST and warming in HadSST2 and ERSSTv3 (see also Vecchi et al. 2008). However, independently measured but related variables, such as nighttime marine air temperatures, provide some evidence that the eastern Pacific trends represented in the HadSST2 and ERSSTv3 datasets may be the more realistic ones (Deser et al. 2010b). These observational sampling issues underscore the challenge of providing a robust target for model validation of twentieth-century surface marine climate trends and perhaps the need to consider a suite of complementary measures for poorly sampled variables and/or regions.

A limitation of the instrumental record is that it spans at most a few realizations of decadal variability. Paleoclimate records—derived from tree rings, corals, lake sediments, or other “proxies”—have been used to extend this record to hundreds of years or more and are generally believed to be free of anthropogenic influence prior to the industrial age (Brook 2009; Jansen et al. 2007), thus constituting a potential means of model verification.

JC comment: with all these uncertainties in the observations of ocean temperature, “unequivocal” and “very likely” in the AR4 seem overconfident.

Modeling uncertainties

The spatial structure and dominant time scales of natural variations differ across models (see discussion of Fig. 5). Additionally, coupled climate models produce a range of responses, in space and time, to anthropogenic radiative forcing (Fig. 8). Such differences in model estimates of internal variability and response to external forcing limit our understanding of the potential of decadal climate predictions.

As an example, the historical changes and future response of the tropical Pacific mean state have been subjects of debate. Different proposed mechanisms disagree on the expected sign of change in the zonal SST gradient in the tropical Pacific in response to anthropogenic forcing. The observational record does little to clarify the situation, as trends in different observed SST records differ in even their sign (see Fig. 7). Models that simulate the largest El Niño–like response have the least realistic simulations of ENSO variability, while models with the most realistic simulations of ENSO project little change in the Pacific zonal SST gradient (Collins 2005). These differences in tropical Pacific interannual variability and change have implications for Pacific decadal variability through their impact on large-scale changes in the atmospheric circulation (e.g., Alexander et al. 2002; Vimont 2005).

Conclusion

The main conclusion drawn from the body of work reviewed in this paper is that distinguishing between natural and externally forced variations is a difficult problem that is nevertheless key to any assessment of decadal predictability and decadal prediction skill. Note that all the techniques are limited by some assumption intrinsic to their analysis, such as the spatial characteristics of the anthropogenic signal, independence of noise from signal, or statistical stationarity.

JC summary: The authors of this paper are members of the climate establishment, in terms of being involved with the WCRP CLIVAR Programme and also the IPCC.  This paper arguably provides more fodder for skepticism of the AR4 conclusions than anything that I have seen from the climate establishment (the authors may not realize this).  The issues surrounding natural internal decadal-scale variability are a huge challenge for separating natural from forced climate change.  The same issues and challenges raised for future projections remain also for the warming in the last few decades of the 20th century.  Sorting this out is the key challenge.  No more unequivocals or very likelys in the AR5, please.

524 responses to “Separating natural and anthropogenically-forced decadal climate variability”

  1. Judith, could you define “climate establishment?”

    And speaking of which, I believe some edits might be in order:

    I don’t recall the climate establishing “giving” this one before.

    and you might want to rewrite this sentence also?

    The implications of this that the warming between 1970-2000 should be operating under the same givens also.

    • Harold Pierce Jr

      The “climate establishment” are families of white-coated dons and wiseguys running a climate protection racket and shaking down a gullible public, gutless governments, clueless corporations and foolish foundations for cold hard cash.

      • OR the “climate establishment” is the loose collective of highly qualified people, talking about their area of expertise in some aspect of climate, in the full plurality of its form. It would also include the organisations providing the research infrastructure for the normal science research such individuals and research groups undertake. As a group of many thousands of people and many hundreds of organisations, the “establishment” comprises a wide variety of perspectives. As with any such collective, misunderstandings and errors will occur from time to time, and the group recognises this and includes checks and balances.

        As opposed to the white-coated dons and wiseguys running a climate protection racket and shaking down a gullible public, gutless governments, clueless corporations and foolish foundations for cold hard cash, ten of which are listed below:

        Mercatus center: ($9.2m received from Koch grants 2005-2008) Conservative thinktank at George Mason University. This group suggested in 2001 that global warming would be beneficial in winter and at the poles. In 2009 they recommended that nothing be done to cut emissions.

        Americans for prosperity. ($5.17m). Have built opposition to clean energy and climate legislation with events across US.

        Institute for humane studies ($1.96m). Several prominent climate sceptics have positions here, including Fred Singer and Robert Bradley.

        Heritage foundation ($1.62m). Conservative thinktank leads US opposition to climate change science.

        Cato Institute ($1.02m). Thinktank disputes science behind climate change and questions the rationale for taking action.

        Manhattan Institute ($800,000). This institute regularly publishes climate science denials.

        Washington legal foundation ($655,000) Published articles on the business threats posed by regulation of climate change.

        Federalist society for law ($542,000) advocates inaction on global warming

        National center for policy analysis ($130,000) NCPA disseminates climate science scepticism.

        American council on science and health ($113,800) Has published papers claiming that cutting greenhouse emissions would be detrimental to public health.

        http://mitigatingapathy.blogspot.com/

      • Are you from bizarre-o world?

      • Simple facts, which you cannot disprove.

      • http://www.citycaucus.com/2010/10/north-vancouver-mom-exposes-us-millions-for-oil-sands-activism
        Tides, and the U.S. foundations that fund it, have incredibly deep pockets. A large part of Tides’ funding comes from the Gordon & Betty Moore Foundation, the William & Flora Hewlett Foundation, the David & Lucile Packard Foundation, the Pew Charitable Trusts and the Rockefeller Brothers Fund. These are The Big Five. They give away about US$1.2-billion every year. If these foundations decide to undermine a foreign industry, they probably can.

        These Big Five have poured at least US$190-million into Canada’s environmental movement over the last decade, but their American logos are nowhere to be seen. Instead, we see a pageant of Canadian icons: dogwood, herds of caribou, wild salmon, First Nations and loons. U.S. tax returns show that the David Suzuki Foundation has been paid at least US$10-million from American foundations. This hasn’t exactly been out in the open.

      • You put these up to create the false impression that this is all the charitable or think tank money going into the climate issue, and you lie by omission.
        The amounts of money spent to promote your particular faith are orders of magnitude larger than what is spent on skeptics, and you know it.
        Go play your ignorant games where the readers are too stupid to think about it, like at RC or Rabett’s den or Romm’s.

      • Good point. Because all of the money that the Pew Charitable Trust spends is on promoting the myth of AGW.

      • Paul: how do you know that you are right and they are wrong? There is this Hungarian scientist, Ferenc Miskolczi, who worked at NASA and had access to NOAA database of weather balloon observations. Using this database that goes back to 1948 he determined that the transmittance of the atmosphere in the infrared where carbon dioxide absorbs had not changed for the last 61 years. And during these years the amount of carbon dioxide in the air had increased by 21.6 percent. Do you know what that means? It means that the added carbon dioxide made zero contribution to warming the atmosphere. Its greenhouse absorption signature is simply missing. No absorption, no greenhouse effect, case closed. If you don’t believe me check out Miskolczi’s work. Find out also what he has to say at the EGU meeting going on right now in Vienna.

      • You haven’t been in a lab in decades, have you, Harold. No one wears white coats except synthetic chemists because they have a tendency to spill nasty stuff all over themselves. The change came in the 1970s when clothing (jeans, chinos, shirts) became cheaper than lab coats (seriously).

      • Indeed, is the definition of the “climate establishment” similar in its lack of specificity as is a definition of the “Mainstream media” which: (1) assumes a vast left-wing conspiracy, (2) assumes that Fox News, the Washington Times, World News Daily, Matt Drudge, rightwing talk radio, etc., are either not media or not mainstream (even though entities such as MoveOn or Dailykos are considered to be inextricably linked to the “MSM”) and, (3) assumes that there are no significant distinctions of kind between The USA today, the NY Times, NPR, MSNBC, PBS, Time Magazine, CBS, CNN, NBC, ABC, etc.?

  2. a

    a poten-tial…..anthropo- genic

    And Judith, I’m curious about this comment:

    This paper arguably provides more fodder for skepticism of the AR4 conclusions than anything that I have seen from the climate establishment (the authors may not realize this).

    Do you think that the authors may not know that saying that “distinguishing between natural and externally forced variations is a difficult problem” would provide fodder for skepticism? Or are you referring to the amount of fodder this particular paper might supply (i.e., you think that they may be unaware that their paper could provide more fodder for skepticism than other papers published previously from the “climate establishment”).

    • Skepticism arises from more than a published paper.
      If one has read enough history, one realizes that climate has varied a lot (cathedrals, dark ages, Greenland, Scottish vineyards, frost fairs, the retreat from Moscow, Year without a Summer, dust bowl, etc.)
      Then, if one lives long enough, one experiences such periods as the Chicago winters of the late ’30s and mid-60’s.
      The recent multi-decadal warming is blamed on man-made CO2. Some effect, yes. Catastrophic?

      • Read Brian Fagan’s book “The Little Ice Age” or perhaps “The Long Summer”. They give a better view of climate variability than anything I’ve seen come out of the climate science community. Strange thing is that he’s a believer in AGW in spite of his own evidence. It is what it is.

      • Natural climate variability doesn’t invalidate AGW, and doesn’t hide it, either.

      • Pooh, Dixie

        The greater the range of natural variability, the less remains for AGW to explain.

      • When the natural variability exceeds EVERYTHING attributed to “anthropogenic” changes by a factor of anywhere from 10 to 100, as the Little Ice Age, the Medieval and Roman Warm periods – in fact, the entire Holocene record – belief in AGW becomes extremely difficult, if not impossible. The idea that three percent (the amount of CO2 added to the atmosphere by mankind) of four percent of a single greenhouse gas (water vapor makes up 95%, all others about 1%) can drastically change climate does not compute.

      • The paper concludes with skepticism about the ability to distinguish anthropogenic and naturally forced decadal climate variation, the predictability of such variation, and our ability to predict such variation. How could anyone in the “climate community,” not know that such conclusions would provide fodder for skepticism?

        How often, and for how long, have we heard the criticism that if we are limited in our ability to predict short-term climate variations, we can’t trust long-term predictions?

      • The paper concludes with skepticism about the ability to distinguish anthropogenic and naturally forced decadal climate variation, the predictability of such variation, and our ability to predict such variation

        Do you have reason to believe that scepticism unfounded?

      • No. I think the skepticism is very well-founded.

        I agree with what Peka says above. I don’t see some massive tectonic shift in the “climate community” with respect to decadal variation. It seems to me that the notion of naturally forced decadal variation is neither profound nor in contrast to the “consensus” climate science.

        I find it hard to believe that any climate scientist would have said 2, 10 or 20 years ago, that in any given long-term period, there might be decadal trends that would be in opposition to the longer-term trends. In fact, I don’t know the details, but I would imagine that paleo-climate analysis from the “climate community” shows multi-decadal trends that were in opposition to longer-term trends. Certainly, if we can call volcanic activity a “natural” forcing, then the “consensus” science clearly recognizes decadal deviation from long-term trends.

      • Sorry – that should be I find it hard to believe that any climate scientist would have said 2, 10, or 20 years ago, that in any given long-term period, there might not be decadal trends that would be in opposition to the longer-term trends.

      • I find it hard to believe that any climate scientist would have said 2, 10, or 20 years ago, that in any given long-term period, there might not be decadal trends that would be in opposition to the longer-term trends.

        But IIRC, they did. Mainly by denying the existence of any such thing and refusing discussion of the subject. I believe the common response to questions was {snort}. At least, that’s the general response that I got. :-)

      • Are you saying that no paleo-climate reconstructions done by the “climate science community” showed decadal or multi-decadal trends that diverged from (or were in opposition to) long-term trends (obviously as a result of natural forcings, since by definition there were no “unnatural forcings”)?

        Really? This goes to the statement above that the “climate science community” “ignored” natural forcings. That is ridiculous at face value; obviously the “climate science community” recognized that the Earth’s climate changed from natural forces over time. Saying that the natural forcings can’t explain recent warming is not the same thing as ignoring natural forcings. Saying that decadal variations don’t disprove longer-term trends is not the same thing as saying that decadal variations don’t exist or saying that they can be fully explained.

      • On the contrary Joshua, there is lots of paleo recognition of such short term trends, the most notable being the so-called abrupt events. But it is claimed by AGW that whatever caused those is not operating now, and has not operated in the last 100+ years. When skeptics say that the AGW-driven science is not considering natural variability, that is shorthand for something rather more complex, or it should be.

      • But it is claimed by AGW that whatever caused those is not operating now,

        As I see it, what is “claimed by AGW” is that the natural decadal forcings that are happening now are on a different scale (in the sense of having a shorter life-span) than the longer-term forcing attributable to CO2 (and the measured “natural” long-term forcings). That is different from saying that they aren’t happening now, or that they cannot impact global climate.

      • Joshua –
        I find it hard to believe that any climate scientist would have said 2, 10, or 20 years ago,

        You’re still not reading what’s being written, but rather what you think is being written. Notice the bold – from your statement – which was the specific subject of conversation here.

        Are you saying that no paleo-climate reconstructions done by the “climate science community” showed decadal or multi-decadal trends that diverged

        We weren’t talking about paleo, but if you can find a paleo construction that shows what you’re implying, we can talk about it. But you’d have to look back about 30-40 years because since at least 1988, I believe there’s been little or no mention of the possibility of divergent decadal or multi-decadal trends.

        As I see it, what is “claimed by AGW” is that the

        What is being claimed is that the 1910-1940 warming, for example, was due to natural causes (solar and lack of vulcanism, therefore natural variation) but that the 1970-2000 warming could not have been due to the same kind of natural variation even though there is little or no difference between the two in terms of slope or magnitude. And in spite of the solar build-up to a Grand Maximum in the late 20th C and the relative lack of vulcanism – among other natural factors, like PDO. Nor has the 60-70 year warming-cooling “natural variation cycle” apparently changed since 1880.

        The claim has been that ONLY GHG’s could have produced the latest (1970-2000) warming. Which leaves no room for natural variation. That claim has been made repeatedly on this blog during the time I’ve been here.

        The claim has also been made that the models (GCM’s) won’t work without CO2 as the main CC driver. I’ve been specifically told that several times by different people. And that’s horse puckey, too. That simply gets back to the need for IV&V on the models – and a review of the basic assumptions embedded in the software.

  3. Here’s the comment that caught my eye:

    “As the science of decadal prediction is in its infancy…”

    Really? I remember the climate establishment saying predictions 100 years into the future were easier and more certain than weather predictions two weeks in the future. This seems to be quite an admission to me… or perhaps it indicates these people are finally waking to reality. It is good to see because they have been living in a fantasy world since the 1980s.

    • steven mosher

      What they are saying is that predictions over the next 10 years, or 20 years, or 30 years, are tougher to make than predictions over 100 years, because of decadal variability.

      • Really? That’s what they are saying? So then uncertainty does not accumulate, it dissipates the more distant into the future the predictions? I’ve never seen uncertainty be resolved so quickly! This is a breakthrough for science!

      • steven mosher

        Not really a breakthrough. You can understand the effect quite simply for yourself.
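One way to see the effect for yourself: fit a trend to synthetic annual data (a fixed hypothetical warming rate plus white noise; the numbers below are purely illustrative, not from any dataset or the paper) and watch how the spread of the estimates depends on record length.

```python
import numpy as np

rng = np.random.default_rng(0)
TRUE_TREND = 0.02  # hypothetical warming rate, deg C per year
NOISE_SD = 0.15    # hypothetical year-to-year weather noise, deg C

def trend_spread(n_years, n_trials=2000):
    """Std dev across trials of the OLS trend fitted to noisy annual data."""
    t = np.arange(n_years)
    slopes = [np.polyfit(t, TRUE_TREND * t + rng.normal(0, NOISE_SD, n_years), 1)[0]
              for _ in range(n_trials)]
    return np.std(slopes)

# For white noise the spread shrinks roughly as n**-1.5: a 100-year trend
# is pinned down far more tightly than a 10-year one.
for n in (10, 30, 100):
    print(n, trend_spread(n))
```

The caveat, which is the whole subject of the paper under discussion, is that decadal persistence in the noise weakens this scaling, so the real-world gain from longer records is smaller than the white-noise case suggests.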

      • You did understand I was being facetious, right?

      • What is the scale of decadal changes versus the scale of changes over longer time periods? Can we not identify past climactic patterns over longer time periods in the past although we cannot determine decadal variability within those longer time periods?

      • er, climatic. Another interesting typo.

      • :-)

      • Ron,
        We can all bet that in 100 years everyone posting here will be dead.
        In the next decade or two or three, there is considerable variability.
        However, I think that an actuarial approach, if applied to climate, has a big fallacy: the climate is not going to die unless the sun turns off.

      • Not if you know how to average numbers to get a more precise result. The more numbers, the more precise the result. I learned that in third grade. Didn’t you?

      • ferd berple

        The law of large numbers / central limit theorem only holds for specific types of distributions, such as the “normal” distribution you learn in grade school. Fractal distributions such as temperature time series do not converge; thus, forecasts are no more likely to be accurate over 100 years than they are over 30 years. The error is in climate science assuming temperature obeys the law of large numbers. One of the criticisms of climate science is that, for a field so dependent on statistics, it has not made adequate use of statisticians.
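Whether temperature series are truly fractal is contested below, but the distinction being drawn can be made concrete with synthetic data: for independent samples the average settles down as the record lengthens, while for a strongly persistent series (a random walk, used here only as an extreme stand-in) it does not.

```python
import numpy as np

rng = np.random.default_rng(1)

def spread_of_mean(make_series, n, n_trials=2000):
    """Std dev across trials of the average of an n-point series."""
    return np.std([make_series(n).mean() for _ in range(n_trials)])

iid = lambda n: rng.normal(0.0, 1.0, n)              # independent samples
walk = lambda n: np.cumsum(rng.normal(0.0, 1.0, n))  # persistent: each value
                                                     # remembers all before it
for n in (100, 1000):
    print("iid ", n, spread_of_mean(iid, n))   # shrinks roughly as n**-0.5
    print("walk", n, spread_of_mean(walk, n))  # grows with n instead
```

Real temperature records show persistence somewhere between these two extremes, which is exactly why the statistical question stays open.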

      • There may be an error and there may not.

        It’s true that the central limit theorem does not apply to all kinds of distributions, but it applies extremely often. Real fractals are rather an exception than the rule. Reading the books of Mandelbrot, the main protagonist of fractals, one can observe that even his examples are often not real fractals. The central limit theorem would ultimately apply even to those distributions, albeit so slowly that saying that it does not apply gives a correct view of reality.

        Concerning temperature distributions both claims are conjectures; there is no strong direct evidence in either direction. Climate modelers can check whether the averages converge in their model calculations – and usually they do. Thus presenting the results makes sense, but that does not prove that the real climate behaves similarly.

        Additional stochastic forcings blur the picture even if there were a well-defined average without them, but this is not really a problem as long as these disturbances do not dominate the temperature development so strongly that the additional GH effect disappears in the noise.

        Whether forecasts are more accurate for 100-year averages than 30-year averages is also unknown at present. We have strong evidence on natural fluctuations on decadal time scales but not on the centennial time scale, but we also lack strong evidence in the opposite direction. Better accuracy on the centennial time scale is a plausible but unconfirmed conjecture.

        Whether forecasts are more accurate for 100-year averages than 30-year averages is also unknown at present. We have strong evidence on natural fluctuations on decadal time scales but not on the centennial time scale, but we also lack strong evidence in the opposite direction. Better accuracy on the centennial time scale is a plausible but unconfirmed conjecture.

        So the lack of acknowledgment of the seemingly strong interdecadal fluctuations in the models, and attributing all of the change to humans instead, might not impact the long-term forecasts significantly?

        Gee. I thought garbage in also means garbage out and vice versa.

      • Aren’t you assuming that there aren’t other periodic natural phenomena? What reason is there to assert that the decade is some special period, and that there aren’t all manner of other confounding frequencies?

      • steven mosher

        The real concern is whether there are longer cycles or LTP (long-term persistence)

      • maksimovich

        As anti-persistence is observed there is little concern, e.g. Carvalho et al. 2007

        Abstract. In this study, low-frequency variations in temperature anomaly are investigated by mapping temperature anomaly records onto random walks. We show evidence that global overturns in trends of temperature anomalies occur on decadal time-scales as part of the natural variability of the climate system. Paleoclimatic summer records in Europe and New Zealand provide further support for these findings as they indicate that anti-persistence of temperature anomalies on decadal time-scale have occurred in the last 226 yrs. Atmospheric processes in the subtropics and mid-latitudes of the SH and interactions with the Southern Oceans seem to play an important role to moderate global variations of temperature on decadal time-scales.

        http://www.nonlin-processes-geophys.net/14/723/2007/

      • Yes, that’s nice for the natural variability part of the climate system. Note that they had to make that qualification, as it does not hold true for the anthropogenic part of climate variation.

      • because of regional decadal variability

        a very different thing and a much larger thing that, yes, wait for it, everyone was always aware of.

      • ferd berple

        “What they are saying is that predictions over the next 10 years, or 20 years, or 30 years, are tougher to make than predictions over 100 years, because of decadal variability.”

        Don’t agree. Predictions 10, 20 or 30 years in the future are risky because they can be checked in one human lifetime and the authors occasionally held to account. This can affect one’s career.

        Predictions 100 years in the future are totally safe because it is pretty much guaranteed that everyone involved in Climate Science will long be dead. (Killed no doubt by climate change). Thus there is zero risk, which makes 100 year predictions extremely easy to make.

        As to whether 100 year predictions are more accurate than 30 year predictions, there is zero evidence this is true. No such prediction has stood the test of time, except as a curve fitting exercise using 20-20 hindsight.

        For example: Why did CET temperatures rise from 1750 to 1850? Or from 1850 to 1950? Climate science today could not have predicted this based on existing theories. The recovery from the LIA shows that there is variability well in excess of decadal variability.

        Spectral analysis shows variability on multi-decadal, century and millennial time-scales, which will defeat 100-year predictions just as readily as decadal variability defeats 10-, 20- and 30-year predictions. Climate is not a normal (Gaussian) distribution subject to the law of large numbers. It is a fractal distribution with scale invariance. Assumptions that longer time scales are more predictable do not hold.

        http://i49.tinypic.com/rc93fa.jpg
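        The scale-invariance point above can be illustrated numerically. The toy sketch below (my own construction, not taken from the linked chart) compares white noise, where averaging over longer windows shrinks variability like 1/m, against a strongly persistent AR(1) series standing in for long-memory behaviour:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2 ** 16

# White noise: no memory, so the law of large numbers applies
white = rng.standard_normal(n)

# Crude stand-in for a persistent, long-memory-like series:
# an AR(1) process with coefficient close to 1
phi = 0.99
shocks = rng.standard_normal(n)
persistent = np.zeros(n)
for t in range(1, n):
    persistent[t] = phi * persistent[t - 1] + shocks[t]

def agg_var(x, m):
    """Variance of non-overlapping m-point window means."""
    k = len(x) // m
    return x[:k * m].reshape(k, m).mean(axis=1).var()

for name, series in (("white", white), ("persistent", persistent)):
    ratio = agg_var(series, 16) / agg_var(series, 256)
    print(f"{name}: 16-pt / 256-pt window-mean variance ratio = {ratio:.1f}")
```

        For white noise the ratio comes out near 256/16 = 16, the law-of-large-numbers rate; for the persistent series it is far smaller, meaning longer averaging windows buy much less certainty. That is the behaviour the comment is pointing at, though whether climate actually has this structure is the question in dispute.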

  4. “Given that over the course of the next 10–30 years the magnitude of natural decadal variations may rival that of anthropogenically forced climate change on regional scales..”

    Is it possible they’re worried about the PDO having turned cold? If so, is this some sort of attempt to get out in front of the coming decline in world temps over the next few decades, a decline that’s already begun?

    • Certainly looks that way, pokerguy.

      • RobB

        http://www.woodfortrees.org/plot/hadcrut3vgl/from:1955/to:1974/plot/hadcrut3vgl/from:1962/to:1974/trend/plot/hadcrut3vgl/from:1961/to:1973/trend/plot/hadcrut3vgl/from:1960/to:1972/trend/plot/hadcrut3vgl/from:1959/to:1971/trend/plot/hadcrut3vgl/from:1958/to:1970/trend/plot/hadcrut3vgl/from:1957/to:1969/trend/plot/hadcrut3vgl/from:1956/to:1968/trend/plot/hadcrut3vgl/from:1955/to:1967/trend/plot/hadcrut3vgl/from:1955/to:1974/trend

        Keep in mind that everything that follows is statistically nonsensical; the significance levels, error bars, and so forth render what I say here utterly meaningless.

        I’ve been looking at the 12-year trend lines as there’s an interesting effect going from 11-year to 13-year trend lines on the various datasets on woodfortrees.org interactive tool.

        At 13-years and longer, the trends are generally ‘well behaved warmists’ and at 11-years or lower we see an explosion of trend lines that disagree.

        The 12-year lines I’ll call lukewarm lines. This tipping point has some fun properties.

        The well-behaved warmist lines appear by eye to become more frequent and more exaggerated with time overall.

        The shorter lines have quite random behavior.

        I’ve plotted the last global downturn as a series of lukewarm lines, from 1955-1974. Of course, this 20-year span is net increasing, though six of eight lukewarm lines definitely are downward trending.

        Even more surprising, even when looking at the trend for the span with all six downward trending lukewarm lines, the trend for those eighteen years is overall increasing.
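        The window-length effect described here is easy to reproduce offline. A minimal sketch (toy data with an assumed 0.015 C/yr trend plus noise, not the actual HadCRUT3 series) that computes the OLS trend over every span of a chosen length:

```python
import numpy as np

def rolling_trends(years, temps, window):
    """OLS slope (degrees per year) for every `window`-year span."""
    slopes = []
    for i in range(len(years) - window + 1):
        slopes.append(np.polyfit(years[i:i + window],
                                 temps[i:i + window], 1)[0])
    return np.array(slopes)

# Toy series: modest warming trend plus year-to-year noise
rng = np.random.default_rng(1)
years = np.arange(1955, 1975)
temps = 0.015 * (years - 1955) + 0.1 * rng.standard_normal(len(years))

# Short windows typically scatter far more than long ones, which is
# why 11-year and 13-year trend lines can look qualitatively different
for w in (8, 12, 15):
    print(w, rolling_trends(years, temps, w).round(3))
```

        On a series this noisy, the short-window slopes flip sign freely while the longer windows cluster near the underlying trend, consistent with the "explosion" of short trend lines noted above.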

      • Hi Bart
        I’m sorry but I’m not sure I understand the point you are making. We can all play games with trends by being selective about start/finish points? If the PDO has turned negative then we will soon see the effect.

      • RobB

        That’s my point.

        We may see an effect, but we won’t be able to well understand it for at least two decades.

        We may have been seeing the effect already since 1999.

        For all we know, the negative turn has come and gone.

        If the PDO has turned negative in the same way as the great majority of negative turns since 1840, then within six years we’ll see it turn positive again, which we won’t know to any degree of confidence for fourteen years at least, and within twenty years of the start of the downturn all decreases will be erased.

        The PDO appears much more reliable and significant on the upswing than on the down, but as I said at the beginning, all of this analysis is meaningless due to relatively large noise and error problems.

        Likewise claims of the PDO turning negative are at best meaningless.

        A nearly unverifiable six-year or so interruption every 50-70 years.

        A blip.

  5. Rob Starkey

    I find the presumptions of the paper almost laughable, except that they are seriously stating that they will be able to segregate the “natural” from the “human caused” climate changes, and at a regional level no less. It sounds like the paper is written as the basis to ascribe future significant weather events to additional human released CO2 so that policies can be justified as a result.

  6. Can I bring up comments that have been made before. Judith you say “JC comment: with all these uncertainties in the observations of ocean temperature, “unequivocal” and “very likely” in the AR4 seem overconfident”

    The word I key on is “seem”. It strikes me that in all your comments, when you are faced with solid science that supports the skeptical side of CAGW, you cannot go that last step and take a definite stand. Are you so certain that CAGW is a real danger that you cannot write “are” instead of “seem”? Isn’t it more accurate for you to say “with all these uncertainties in the observations of ocean temperature, “unequivocal” and “very likely” in the AR4 ARE overconfident”?

    • Rob Starkey

      Jim- I would guess she will not, since there is a small probability that the described event or condition may occur.

    • So, you want JC to be unequivocal in her criticism of IPCC being unequivocal???

      • Rod B writes “So, you want JC to be unequivocal in her criticism of IPCC being unequivocal???”

        Not quite. I cannot understand why, on many occasions, Judith provides what, to me, is solid scientific evidence that we skeptics are absolutely correct. Yet, even with this science, she still does not seem able to unequivocally support the skeptical side. For example, she emphasised the postings of Tomas Milankovic and Terry Oldberg with respect to no-feedback climate sensitivity, but still will not agree that no-feedback climate sensitivity is scientific nonsense.

      • “seem” is used to state an opinion, not the result of chaining through the uncertainties and arriving at a final characterization of the overall level of uncertainty. An open mind isn’t a bad thing, although you may disagree.

        As for me, my position is there is no problem until it has been proven to me there is one. So far I’m unconvinced.

      • Jim, I get your point. But an “unequivocal skeptic” is almost an oxymoron.

  7. The first sentence of the abstract finishes with “on regional scales”, so the comparison with the global record of the past 30-40 years that you draw seems odd.

    For the global scale, the year to year variability is of the same order as the change in avg temp over 10-15 years, so typically that’s how long it takes for the forced signal to emerge from the noise/variability.

    On regional scales, variability is greater than at global scales, so clearly it will take longer for the signal to emerge from the noise.

    Some sources of global variability in temperature are (incompletely) understood, and can be (imperfectly) accounted for. Accounting for such known sources of variability, the forced component of the temperature signal becomes clearer. See e.g. http://tamino.wordpress.com/2011/01/20/how-fast-is-earth-warming/ esp this graph: http://tamino.files.wordpress.com/2011/01/adj1yr.jpg
    where he corrected for the influence of ENSO, volcanoes and solar irradiance by regression analysis.
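    The regression adjustment described here can be sketched as follows, on purely synthetic data with made-up coefficients (this illustrates the general technique only; the indices and numbers are not Tamino's actual inputs):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 40  # years of annual data

# Stand-ins for an ENSO index, volcanic aerosol loading, and a
# solar-cycle anomaly (all invented for illustration)
enso = rng.standard_normal(n)
volc = np.abs(rng.standard_normal(n)) * (rng.random(n) < 0.2)
solar = np.sin(2 * np.pi * np.arange(n) / 11)

# Synthetic temperature: forced linear trend + known influences + noise
trend = 0.02 * np.arange(n)
temp = trend + 0.1 * enso - 0.3 * volc + 0.05 * solar \
       + 0.05 * rng.standard_normal(n)

# Regress temperature on a constant, time, and the known factors
X = np.column_stack([np.ones(n), np.arange(n), enso, volc, solar])
coef, *_ = np.linalg.lstsq(X, temp, rcond=None)

# "Adjusted" series: subtract the fitted factor contributions,
# leaving the forced trend plus residual noise
adjusted = temp - X[:, 2:] @ coef[2:]

print("recovered trend (true value 0.02):", round(coef[1], 4))
```

    Removing the fitted ENSO/volcanic/solar contributions shrinks the scatter around the trend, which is why the adjusted series in the linked graph looks so much cleaner than the raw one.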

    • Bart V,
      Come on. You are now in Black Knight mode.

    • Bart, the issue is this. They are talking about large oceanic oscillations such as the AMOC, PDO, that have very large scale responses.

    • John Carpenter

      Bart,

      “For the global scale, the year to year variability is of the same order as the change in avg temp over 10-15 years, so typically that’s how long it takes for the forced signal to emerge from the noise/variability.”

      I assume you are talking about the anthropogenic forcing? From 1975 to 2000 the signal was so clear… that remarkable 25 year unprecedented warming, like a staircase, up up up. Now the signal seems to be lost. Where did it go when there was such certainty 10 years ago that it was there?

      • andrew adams

        But it wasn’t up, up, up. There were clearly peaks and troughs during that period.

      • John Carpenter

        “like a staircase”, my analogy to the ups and downs… or like treads and risers… you get the drift.

  8. Jim,

    While I understand, or think I understand, the difficult position Dr. C. is in, I notice the same reluctance at times.

    • Thank you, pokerguy. Let us see whether Judith is prepared to comment. I must say I do not understand why Judith is in a “difficult position”. If there is solid science, what is the problem?

    • I don’t see that Dr. Curry is in a difficult position. There’s no reason to choose sides, since the science so far isn’t up to the task.

      • I agree.

        As has been stated at length before, this debate is not as simple as ‘for’ and ‘against’.

  9. It is in its infancy, but it is settled.
    As JFK was known to say when he and his pals were fooled by a good one,
    ‘We’ve been had’.
    Can we please have our money back?
    Can we please move on to spending our money on things that actually make a difference?

  10. Having read carefully the article by the Decadal Predictability Working Group, I see their conclusions as consistent with their focus. Decadal prediction, globally and even to a greater extent regionally, is to date less able than multidecadal prediction to disentangle anthropogenically forced variability from natural variability – particularly natural internal variability. Decadal prediction is also more likely to be affected by initialization than multidecadal predictions, and variables other than temperature (e.g., precipitation) may be more uncertain than temperature, at least globally. The article refers to past uncertainty about temperature data (particularly applicable to the earlier part of the twentieth century and less so in recent decades), but does not imply that this uncertainty invalidates conclusions that the recent several decades of temperature increase are driven primarily by anthropogenic contributions, and in fact suggests that model hindcasts have done fairly well in this regard. How well future predictions will separate future anthropogenic and natural contributions is conjectural.

    The article addresses methods to maximize signal to noise ratios through S/N-maximizing EOFs, and also assesses the use of specific anthropogenic “fingerprints” for evaluating observational data and predicting future trends. What is particularly interesting is the observation that particular signals other than global mean temperature trends may be more informative in this regard; an example cited was Atlantic Multidecadal Variability (the Ting et al reference). Along the same lines, Isaac Held in his recent blog discussion, has also addressed the concept that extrapolation from selected observational data sets may be more informative in describing long term anthropogenic trends than conclusions drawn from changes in mean global temperature alone – Why Focus So Much On Global Mean Temperature?.

    • Fred,
      Let me demonstrate how uncertainty accumulates. For the example I will use round numbers. Let’s say we can demonstrate the combination of natural and anthropogenic forcings can change the annual global surface temperature anomaly either up or down by 0.3C. Let’s say we start at 0.7C. This means in 2020, the global surface temp anomaly will be somewhere between 0.4C and 1.0C. Now it is 2020 and you want to make another prediction for the next 10 years. If the high prediction came true, then 2030 will be between 0.7C and 1.3C. If the low prediction came true, then 2030 will be between 0.1C and 0.7C. So now say the high, high prediction came true and you want to predict for 2040. Then the global temp would be between 1.0C and 1.6C. But if the low, low prediction came true, then you are looking at between negative 0.2C and 0.4C.

      Uncertainty accumulates over time. It is unscientific to claim otherwise.
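      Ron's round-number arithmetic can be written out mechanically. The sketch below assumes, as he does, that each decade can move the anomaly up or down by at most 0.3C and that the worst cases simply compound:

```python
def decade_bounds(start, per_decade, n_decades):
    """Worst-case interval after n decades if each decade can move
    the anomaly up or down by at most `per_decade` (Ron's premise)."""
    return (start - per_decade * n_decades,
            start + per_decade * n_decades)

# Starting at 0.7C with +/-0.3C per decade:
for decades, year in ((1, 2020), (2, 2030), (3, 2040)):
    lo, hi = decade_bounds(0.7, 0.3, decades)
    print(year, round(lo, 1), "to", round(hi, 1))
```

      The envelope widens by 0.3C per decade in each direction, which is the accumulation Ron describes; whether real decadal variability actually compounds this way, rather than partly cancelling, is of course the point under dispute.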

      • @Ron
        To play devil’s advocate for a moment: if the non-anthropogenic forcing is cyclical, it will tend to average to zero over a long period; i.e., it won’t accumulate. Thus, in your example, the 0.3 degree variation doesn’t double over the next ten years; it remains the same, assuming of course that the original model of AGW is accurate.
        Note: I am not arguing this position, merely trying to establish what the theoretical position might be.

      • Michael,
        It is fair to say the non-anthropogenic forcing is cyclical, but what is the time scale? Do you really expect it to cycle every 10 years? Every 100 years? Every 1,000 years? Every 10,000 years? Every 100,000 years? By the same token, what is the scale of the temperature anomaly? 0.3C? 3C? 13C?

        This is the reason geologists tend to be more skeptical of CAGW. They have a longer view of natural climate variability and they understand the range of natural climate variability is very great… and yet polar bears are still here.

      • @Ron

        Do you really expect it to cycle every 10 years?

        I don’t expect anything, I was just trying to explain what the other chap said as you were arguing against something different from what he was proposing.
        I thought I made it clear that it was not my view.

      • Michael,
        Sorry. I was rushing out the door and read it too quickly. But my point stands. The limits to natural variability are quite broad. People who think the limits are narrow have been fooled by the Hockey Stick.

      • PaulDunmore

        Ron, your example assumes a very specific type of temperature change, that it follows a random walk. If that is the case, the uncertainty does increase indefinitely (proportional to the square root of time, in fact). But there are many other types of random process where the cumulative noise eventually becomes less than the trend.
        What kind of process describes the various parts of the climate system is not something that I have seen discussed. The issues are well-known to economists, who have to deal all the time with data generated by processes with various forcings and feedbacks. They know how easily these processes can generate persistent fluctuations that appear to be trends until they fade away, and how to analyze data series to distinguish real trends from fluctuations. I have not seen much evidence that these techniques are understood, much less that they are a regular part of the toolkit, in the climate science community.
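        Paul's distinction can be shown in a few lines: for a pure random walk the spread of simulated paths grows like the square root of time, while for a mean-reverting AR(1) process it saturates. A minimal Monte Carlo sketch (illustrative parameters only, not a climate model):

```python
import numpy as np

rng = np.random.default_rng(42)
n_paths, n_steps = 2000, 400
shocks = rng.standard_normal((n_paths, n_steps))

# Random walk: x_t = x_{t-1} + e_t, so spread grows like sqrt(t)
walk = shocks.cumsum(axis=1)

# AR(1): x_t = 0.8*x_{t-1} + e_t, so spread saturates near
# 1/sqrt(1 - 0.8**2) ~ 1.67
phi = 0.8
ar = np.zeros((n_paths, n_steps))
for t in range(1, n_steps):
    ar[:, t] = phi * ar[:, t - 1] + shocks[:, t]

print("random walk spread, t=100 vs t=400:",
      round(walk[:, 99].std(), 1), round(walk[:, 399].std(), 1))
print("AR(1) spread, t=100 vs t=400:",
      round(ar[:, 99].std(), 2), round(ar[:, 399].std(), 2))
```

        The random walk's spread roughly doubles between t=100 and t=400, the sqrt-of-time growth Paul mentions, while the AR(1) spread barely changes. Which regime the climate system sits in, and over which timescales, is exactly the open question.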

      • Paul – You may be interested in this paper on Trends and Time Series, which addresses some of these issues. The only thing I would add is that climate dynamics must conform to the laws of thermodynamics, which ultimately places constraints on what might otherwise be mathematically plausible. Admittedly, those constraints might allow for considerable latitude in the short run.

      • Paul Dunmore

        Hi Fred
        Thanks for the reference. Unfortunately, it demonstrates exactly the point I was making. The authors apply the Hilbert-Huang method of identifying quasi-periodic components in a time series (a generalization of Fourier analysis, it seems). If a time series is not already monotonic, it must have one or more maxima and minima: the method first finds and removes the highest-frequency component, eliminating one or more extreme values. It is then repeated on what remains, and so on until what remains has no internal maxima or minima and is thus monotonic. The authors define this monotonic function to be the trend in the data, and apply the procedure to the standard global temperature anomaly series to extract the trend so defined.
        It appears that the trend is by construction a spline curve, and thus useless for forecasting or for gaining any understanding of the process being analyzed (the authors admit this). It is essentially an automated version of drawing a freehand trend curve through the data, although it may be optimal in some useful sense. But it does not address the prior question: how do we know that there is actually a trend in the data to begin with? Applied to any time series whatever (a pure random walk, for example), the method will remove the extreme values and eventually identify a residual “trend”. That is, it is a machine for producing trends both from time series that really do have trends and from those that do not.
        To explain why this bothers me, I have placed here a picture showing the CRUTEM3 series of global annual average land temperature anomalies for the 160 years available. The recent trend is quite clear, and the mainstream discussion assumes or asserts that it will continue or accelerate. There are physical arguments in support of this, but whether the greenhouse-gas forcings are actually strong enough to explain the trend or whether the association is purely coincidental seems to be a matter of some perplexity. In my day job, however, I see lots of stock price charts that look very much like the CRUTEM3 data series; I have put two of them on the same picture. (Of course, I cherry-picked the cases: these are real and occur often enough that they are not hard to find, but I did want something that looks like the CRUTEM3 data.) The temperature data is driven by real physical causes, but stock prices are driven by real causes too – economic conditions, profit changes, news about the company, the psychology of investors, and lots more. There is a whole tribe of stock analysts (they call themselves Technical Analysts, but detractors call them Chartists) who pore over pictures like these seeking clues about which trends will continue. They draw trend lines, outer bands, and zigzags of various kinds, and they have theories that tell them when a trend is about to continue and when it is about to end. And one of the most firmly evidenced facts about the stock market is that Chartists do not make money (they get lucky by chance, of course, about as often as anyone does). They simply cannot tell which trends will continue and which have run their course. I do not know how the next 40 years of CRUTEM3 data will look, but I do know how the next 40 trading days for these stocks looked: ExxonMobil stopped rising and IBM fell sharply.
        So before looking for a method to plot the trend for me, I first want to be convinced that the trend is something more than a reasonably short fluctuation with no interesting cause which is soon likely to go away again. Huang and his co-authors do not address this question. Certainly none of them is an econometrician; they cite Granger’s textbook on econometrics, although it is clear that they have not understood its importance to their proposal. We know that some autocorrelated time series are prone to large or even infinite fluctuations while others are not, and we know how to characterize which are which. Huang’s paper does not address or even recognize this question. The paper would not have been accepted in an econometrics journal without substantial revision, and it seems no accident that it passed peer review in a science journal, presumably with referees who are as uninformed of the econometric issues as the authors seem to be.
        So I am afraid that this paper has done nothing to relieve my concern that climate data sets are being analyzed by people who do not understand the risks of dealing carelessly with correlated time series. The paper by Scafetta being discussed on another thread has the same issue: he assumes that fluctuations in the temperature record require explanation, since the temperature will never fluctuate far from equilibrium. He may be right, but if correlation-induced fluctuations are larger and more persistent he may be just reading patterns in tea-leaves, proposing physical mechanisms for mere noise. You obviously read far more extensively in this literature than I am ever going to do: have you found it common for authors to study the issues of autocorrelation and possible non-stationarity in their observational series (or for that matter in the outputs of the models)?
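        A standard econometric demonstration of the hazard Paul describes: fit a naive OLS trend, with iid-error significance tests, to pure random walks, and count how often a trendless series nonetheless yields a 'significant' trend. This is a textbook spurious-trend exercise of my own, not anything from the Wu et al paper:

```python
import numpy as np

rng = np.random.default_rng(7)
n_series, n = 1000, 160  # 160 "years", matching the CRUTEM3 span
t = np.arange(n)

significant = 0
for _ in range(n_series):
    x = rng.standard_normal(n).cumsum()  # pure random walk, no trend
    slope, intercept = np.polyfit(t, x, 1)
    resid = x - (slope * t + intercept)
    # Naive slope standard error, treating residuals as iid
    se = resid.std(ddof=2) / (t.std() * np.sqrt(n))
    if abs(slope / se) > 2:  # naive "95% significant" threshold
        significant += 1

print(f"{significant / n_series:.0%} of trendless random walks show "
      "a 'significant' trend under iid assumptions")
```

        Because a random walk's residuals are strongly autocorrelated, the naive standard error is far too small, and most of these trendless series pass the naive significance test. That is precisely why unit-root and autocorrelation diagnostics matter before declaring a trend.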

      • Paul,
        Thanks for a very clear message and the related graphs.

        A related issue with relevance to climate science has also been studied widely in connection with technical analysis and more general econometrics: how to use a finite set of data to choose between alternative models and the values of their parameters, and then to perform statistical tests to determine whether there is evidence for the predictive power of the model. It has turned out that keeping the testing independent from the choice of the model is extremely difficult when historical data is being used. The knowledge of the actual history influences the modelers in so many different ways that the statistical analysis cannot be applied without breaking at least some rules that should be followed.

        Testing the predictive power of climate models is influenced in the same way, and the problems appear even more difficult and complex than in the study of models of stock prices or exchange rates in the currency market. We have very few and very limited predictions that have been compared with data from later periods. One forecast of Hansen doesn’t prove practically anything even in the most favorable interpretation of its correctness.

      • Paul – Thanks for your comments. I plead profound ignorance in this area, but I wonder whether the perspective in the PNAS article by Wu et al differs as greatly from your own as you imply. They appear to limit their definition of a trend to a timescale of interest, acknowledging that their method is not applicable to prediction, and that “one economist’s ‘trend’ is another’s cycle”.

        I found the application to climate data useful for two reasons. First, it identifies a timescale over which a trend, by their definition, can be identified, and which can be correlated with a physical mechanism involving greenhouse gas increases, other anthropogenic emissions, and changes in solar irradiance. Second, to my knowledge, historical data have identified periodic or quasi-periodic fluctuations on a variety of timescales, but they have either been clearly shorter than the centennial scale of these observations or much longer (often millennia). If this trend is actually part of a longer cycle that just happened to take an upturn around 1910, considerable coincidence would need to be invoked to explain its correlation with physical data and theory. That is certainly not an impossibility, but it lends some weight to explanations based on geophysical principles cited in its behalf.

        I don’t think the observed centennial changes can do more than provide correlative evidence; they certainly can’t predict that the future will resemble the past 100 years. They do, however, in my estimation, signify that something has been happening that does not duplicate similar variations in recent climate history. How to proceed further will be a task for the climatologists, and the evidence they unearth.

    • Fred,
      Bunk.
      You are in effect reporting assumption laced propaganda as if it were straight news.
      Junk leads to junk, and the empire of CO2 calamity is built high on shiny bright junk.

    • steven mosher

      Thanks for the link Fred. Held’s blog is a great resource for folks

    • “Decadal prediction, globally and even to a greater extent regionally, is to date less able than multidecadal prediction to disentangle anthropogenically forced variability from natural variability”

      That’s based on theory, but otherwise an assertion. Nobody has checked the multidecadal predictions against actuals yet.

  11. Fred,
    You write:
    “I see their conclusions as consistent with their focus. Decadal prediction, globally and even to a greater extent regionally, is to date less able than multidecadal prediction to disentangle anthropogenically forced variability from natural variability – particularly natural internal variability.”

    But this is a ridiculous and unscientific position. Climate prediction is an initial value problem, therefore uncertainty accumulates. If you don’t know where you will be in 10 years, you have less certainty in predicting where you will be in 20 years. In 20 years, you will have less certainty to predict where you will be in 30 years.

    • Ron – Empirically, that is not how climate appears to operate. Climate models (i.e., climate prediction) match observations better at multidecadal intervals than shorter intervals of a decade or less. This is inherent in the nature of climate trends, which emerge better when the shorter term fluctuations that models handle poorly are averaged out, and is consistent with the principle that climate model projections are more of a boundary value problem than an initial value problem. Of course, the shorter the interval predicted, the more important are initial states, which is why weather prediction is almost entirely dominated by initial conditions. The fact that climate behaves differently is a characteristic of climate dynamics, and is not necessarily true of long term predictions outside of the realm of climatology. It may also be the case that centennial predictions are less certain than multidecadal predictions, but we don’t yet have enough evidence to evaluate this quantitatively.

      • Fred,
        Nonsense. Predictions by climate models are coming apart at the seams. Special pleading for climate science is not allowed. Science is science.

        (Sorry. This comment was not nested properly. Please remove the one further down.)

      • Ron – Your vehemence in your recent comments suggests that it’s very important for you to have it come out a certain way. The problem is that if you get into an argument with Mother Nature, most people won’t place their bets on you to come out the winner. The increase in the accuracy of model predictions with increasing time (up to a point) is empirically confirmable, because that is the nature of how climate behaves.

        A more obvious example is the difference between climate and weather. I recently gave a talk on climate change to a college audience in which I emphasized that distinction, which is often misunderstood. In the U.S., on January 1, you can’t be very certain whether it will be warmer or colder ten days later. However, you can be much more certain that it will be warmer six months later, in July. That is because the seasonal trend has overwhelmed the day to day variability of weather patterns. From audience questions, I believe that helped them place the rest of my talk into an understandable perspective.

      • Fred,
        That is a rather specious example.
        Unless you are trying to demonstrate that climate models are no more predictive than betting the seasons will change?

      • Mr. Fred,
        The data is clear that the planet has seasons. Also, we as humans experience it. Basically, any lay person (me) can confirm it is true. Sorta like the sun rising. Now, you use an analogy, that climate predictions are like that? So your audience understands better how to believe GCM’s can predict far into the future, but short term noise, and NATURE, make it almost impossible to predict short term. Laughable dude. Here’s the problem for the lay person (me), nobody can feel 0.7 C temp rise over the last 150 years. The data is not clear. Some of us don’t trust the numbers coming from a guy that climbs up smoke stacks in protest.

      • Fred,
        You are very good at regurgitating the received wisdom of the climate community. The problem is you cannot tell the difference between science and the nonsense the climate community sometimes puts out (including this farce that it is easier to predict 100 years into the future than 10 years into the future).

        Climate predictions made by the models are not on track. This article is an attempt to explain away the fact the predictions are way too warm.

        My vehemence, as you term it, is purely disgust. My comments here are not going to change the climate community. I don’t know what will change the climate community because data does not seem to do it.

      • Fred M: I’ll go with hunter. That example is downright misleading. Climate has many cycles and the seasons are practically clockwork but to speak as though seasonal reliability extended to climate models in general verges on dishonesty.

        I understand the distinction between weather and climate and I understand why climate scientists believe they have a better shot at predicting climate in twenty years better than weather in twenty days.

        But that doesn’t make it so and I am weary of the orthodox hoisting the flag of that talking point as though it were a keen insight.

        Until climate scientists have a proven record for predicting climate decades out, I’ll take their claims with a grain of salt.

      • Fred Moolten,
        Your illustration of the difference between weather and climate is instructive. Weather is quite variable and difficult to predict accurately. Earth’s climate is a ponderous beast that has been warming for the last 15,000 years. We are as sure that it will continue to warm as we are that July will be warmer than January (N. Hemisphere). Our ability to predict the rate of climate warming is about the same as our ability to predict how much warmer next July will be than last January.

      • Willis Eschenbach

        Speed | April 8, 2011 at 6:54 am

        Fred Moolten,
        Your illustration of the difference between weather and climate is instructive. Weather is quite variable and difficult to predict accurately. Earth’s climate is a ponderous beast that has been warming for the last 15,000 years. …

        Umm … Speed … it’s generally accepted that the world has been slowly cooling for the last 15,000 years, see the Greenland Ice Cores for verification.

        So no, we are not “as sure that it will continue to warm as we are that July will be warmer than January”. We can’t “continue to warm” when we have been generally cooling for 15,000 years.

        But even that general 15,000 year cooling means nothing. Consider that it didn’t warm for ~100,000 years … then it warmed.

        Conclusion? We know July will be warmer than January … but despite 15,000 years of a generally cooling world, we haven’t a clue whether in 100 years we’ll be warmer or colder.

        w.

      • “The problem is that if you get into an argument with Mother Nature, most people won’t place their bets on you to come out the winner.”

        And your previous assumptions about the nature of climate, and it being close to a boundary value problem, are invalidated if it follows Tomas’ suggestion that it’s spatially and temporally chaotic. Then YOU lose the argument with Mother Nature.

      • Fred Moolten,

        “Climate models (i.e., climate prediction) match observations better at multidecadal intervals than shorter intervals of a decade or less.”

        Can we get a couple of examples of models that have been incapable of making decadal predictions, but were cracker jack at making multidecadal predictions?

        And I would welcome such examples from any field, just to see if that counter-intuitive principle has ever been shown to be possible, let alone likely or certain.

      • Gary – If you go through the multiple past threads in this blog on models, you’ll find many examples, including those I’ve cited. I’ve answered the question so many times I’m disinclined to go through the details again, although if you email me, I’ll send you some links. You can also check the references in AR4 Chapter 8.

        I don’t believe there is anything counter-intuitive about it, as my earlier example of weather vs climate illustrates. It’s an attribute of systems characterized by long term trends masked by short term fluctuations.

        I also wondered from the way you phrased your question whether you are really interested in understanding this phenomenon, or whether your question was instead less a question than an invitation to argue. Your use of the term “cracker jack” suggested the latter, but perhaps I’m jumping to premature conclusions. The phenomenon is now well recognized and accepted, so if someone doesn’t want to accept it, I will judge arguing to be futile.

      • Fred.

        “Cracker jack” was admittedly snarky, but the question was genuine.

        I’ve looked briefly through Chapter 8 of AR4 as you suggested, but haven’t yet found examples of models that made multi-decadal predictions that were vindicated. The chapter does indicate the reasons for confidence in the models, and they list 3.

        “One source of confidence in models comes from the fact that model fundamentals are based on established physical laws, such as conservation of mass, energy and momentum, along with a wealth of observations.”

        “A second source of confidence comes from the ability of models to simulate important aspects of the current climate. Models are routinely and extensively assessed by comparing their simulations with observations of the atmosphere, ocean, cryosphere and land surface.”

        “A third source of confidence comes from the ability of models to reproduce features of past climates and climate changes.”

        I don’t find a claim that the models are considered accurate on a multidecadal scale because they have been proved to be so in the past. Nor do I find a claim that X model in 1970 predicted Y temperature trend (or whatever) in 2000, and Y trend in fact materialized.

        I’m not saying there aren’t models out there that have been proven accurate on that scale, but I haven’t read of any. And it seems to me that would be rather a big plus on the CAGW side of the debate if there were such a successful model.

        Rather than email a list, how about just posting one? There seems to be a bit of a dispute over this generally accepted principle on this thread, which suggests it is not a principle which is “well recognized and accepted” by everybody. Which is why I asked for an example.

      • Here’s a fascinating series of articles on multi-ensemble climate modeling, with references, and in at least a few cases, examples (Hubener et al for example) – Model Ensembles

        Doesn’t AR4 chapter 8 also have figures with some examples?

        Those are all hindcasts. Hansen et al in 1988 described a GISS model whose predictions paralleled observations since then but overestimated the trend. The climate sensitivity value incorporated into their model input was 4.2 C/CO2 doubling, whereas if the current canonical model figure of about 3 C were used, the predictions and observations would match well overall, but do poorly in the most recent decade. What will happen in future decades remains to be seen – Model-Data Comparisons.
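Fred’s rescaling argument can be made explicit. Assuming, roughly, that a model’s projected warming trend scales linearly with its equilibrium climate sensitivity, Hansen’s 1988 projection can be scaled from the 4.2 C/doubling used then to the ~3 C canonical figure. (The 0.30 C/decade input below is a purely illustrative placeholder, not Hansen’s actual projected trend.)

```python
# Hypothetical rescaling sketch: projected trend assumed proportional
# to the model's equilibrium climate sensitivity.
hansen_sensitivity = 4.2     # C per CO2 doubling, as used in the 1988 GISS model
canonical_sensitivity = 3.0  # the roughly canonical modern figure
projected_trend = 0.30       # C/decade -- illustrative placeholder, not Hansen's number

rescaled_trend = projected_trend * canonical_sensitivity / hansen_sensitivity
# Rescaling shrinks the projection by a factor of 3.0/4.2 (about 0.71).
```

The point is only that a linear rescaling of this kind is what underlies the “would match well if 3 C were used” claim; whether the linearity assumption holds over a given period is a separate question.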

        It is not “cracker jack”, but it is not too soggy either.

      • Regarding long term vs short term predictions, how does the Stock Market perform? In which prediction can we have more confidence? I have an impression about this but I don’t really know the data.

        Yes, those are all hindcasts, which is why I said I didn’t find any that answered my question about the proven ability to make multidecadal predictions. “Predictions” would not of course apply to hindcasting.

        I read the abstracts of the first five papers on the link you included. None of them appear to describe past accurate predictions by climate models. The articles are about “new approaches” and “new methods” to improve the models.

        I didn’t think there had been any successful predictions on that scale, or I suspect we would have been hearing about it non-stop throughout the climate debate.

        This is a link to the Real Climate article regarding the Hansen 1988 predictions.

        http://www.realclimate.org/index.php/archives/2007/05/hansens-1988-projections/

        This is the link to figure 2 of the article, showing the predicted annual mean global temperature change.

        http://www.realclimate.org/images/Hansen06_fig2.jpg

        Despite the text of the article, the lack of warming since 1998 was not accurately predicted by this model.

        So I see no example of a model that was accurate on a multidecadal scale, at all, let alone what I asked for, a model that was incapable of forecasting decadal trends but did accurately predict multidecadal ones.

        I not only don’t find many examples, I don’t find even one.

      • Not “cracker jack” is an understatement. Why don’t you try doing that with climate predictions sometime? So Fred, you think it’s ok to input another value into Hansen’s model (hindsight is 20/20, you know), then have it match up to scenario B? Wait a minute, we had to do something about CO2 to get scenario B. We didn’t do anything. This is modeling’s success story? As you know, the true temp is under scenario C. Tell everyone what we had to do about CO2 to achieve scenario C, Fred.

      • It is much easier to predict things that have already happened.

      • Prediction is very hard, especially about the future – Yogi Berra

  12. Joe Sixpack

    ‘Scuse me for butting in and all among the clever folks, but can somebody explain in simple language why it should be harder to forecast ten years in advance than a hundred? Coz it sure isn’t obvious to me.

    (Apart from the fact that it’s possible to test ten-year predictions within a human career .. but not so a hundred – meaning that you never get found out if wrong :-) )

    • Easy, Joe. Take a straight line that slopes upward at say 10 degrees. Now make it real jagged. In the short run you don’t know if you will be going up or down, but in the long run you will go up. Simple enough?

      • John Carpenter

        David,

        Don’t you think there might be a leetle goal post shifting? Just a tiny bit? To account for the lack of “warming” the last decade. 10 years ago, wasn’t the confidence level projected a bit higher where we would be in the next 10 to 20 years based on the models… but now it’s a little fuzzier?

      • Yes of course, John. I was simply answering the question. The warmers have been forced to discover natural variability in order to save AGW. It’s very funny.

      • John Carpenter

        nature laughs at the human folly…

      • John Kannarr

        But you are assuming that you know the long-term trend of the straight line is up. The real issue is whether in fact the trend is up on a long-term basis. It’s a convenient fallacy to assume what it is that you want to prove. The problem is to prove that the long-term up trend you claim to see is really just that, rather than a short-term anomaly or some other varying cyclical effects among many, including solar, astronomical, PDO, etc.

      • I am not assuming anything you twit. I am just answering the question.

      • John Kannarr

        Let’s see, Joe Sixpack asked “why it should be harder to forecast ten years in advance than a hundred?”

        You responded with your analogy of “a straight line that slopes upward at say 10 degrees. Now make it real jagged. In the short run you don’t know if you will be going up or down, but in the long run you will go up.”

        That sounds to me like an assumption – that the straight line has a long-term slope that is upward at 10 degrees – from which you deduce that in the long run the line will go up.

        But the whole climate debate is about whether there is in fact any such known long-term “slope”, or whether the apparent long-term upward slope is merely part of an even longer cycle’s “upward jag”, and finally whether there is any confirmable, predictable anthropogenic component that slopes upward.

        Excuse me, but name-calling, even “you twit,” is out of place in this blog.

      • He’s pointing out the signal to noise issue. Small effects, as expected to be found in short time frames, can be swamped by higher frequency noise. Over time, the low frequency signal of interest has a larger effect, so it isn’t swamped by noise as easily.

        It’s a well known problem in many disciplines.
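The signal-to-noise point above can be sketched with a toy calculation (all numbers here are arbitrary illustrations, not climate data): a small constant trend buried in much larger random noise is undetectable over short windows but stands out over a long record.

```python
import random

def ols_slope(ys):
    """Ordinary least-squares slope of ys against the index 0..n-1."""
    n = len(ys)
    mx = (n - 1) / 2
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in enumerate(ys))
    den = sum((x - mx) ** 2 for x in range(n))
    return num / den

random.seed(42)
trend = 0.02       # small per-step "signal" (arbitrary)
noise_sd = 1.0     # fluctuations 50x larger than the per-step trend (arbitrary)
series = [trend * t + random.gauss(0, noise_sd) for t in range(1000)]

# Trend estimated over short 10-step windows: the sign flips freely,
# because the noise swamps the signal at this scale.
short = [ols_slope(series[i:i + 10]) for i in range(0, 1000, 10)]
frac_negative = sum(s < 0 for s in short) / len(short)

# Trend estimated over the full 1000-step record: close to the true 0.02,
# because the signal accumulates while the noise averages out.
long_slope = ols_slope(series)
```

A sizable fraction of the short windows slope “down” even though the underlying trend is up, while the full-record slope lands very near the true value. Whether climate behaves like this trend-plus-noise model is, of course, exactly what is being disputed in this thread.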

      • David,
        That does not answer his question at all. You can look at any stock chart and see a line go up in a jagged fashion. But if you have ever invested in stocks, you know the line can change direction and go the opposite direction right after you buy. “Past performance is no guarantee of future results” is a statement often seen in a prospectus. It is something climate scientists should remember.

      • Something for skeptics to remember – increasing CO2 in an atmosphere always enhances its trapping of infrared heat. It works every single time it’s tried. No exceptions whatsoever, none. Do you have evidence that we are now generating less CO2?
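For what it’s worth, the standard simplified expression for CO2 radiative forcing – the Myhre et al. (1998) fit quoted in IPCC reports – is ΔF = 5.35 ln(C/C0) W/m². A back-of-envelope sketch, not a line-by-line radiative transfer calculation:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing in W/m^2 (Myhre et al. 1998 fit):
    dF = 5.35 * ln(C/C0), with C0 the preindustrial concentration."""
    return 5.35 * math.log(c_ppm / c0_ppm)

f_doubling = co2_forcing(560.0)  # ~3.7 W/m^2 for a CO2 doubling
f_current = co2_forcing(390.0)   # ~1.8 W/m^2 for ~390 ppm (circa 2011)
```

The logarithmic form is why each successive increment of CO2 adds somewhat less forcing than the last, though the total always increases with concentration; how that forcing translates into surface temperature (the sensitivity and feedback question) is the contested part of the debate below.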

      • cmb, your cryptic little comments are not useful. The fact that CO2 is a GHG is not generally disputed here, with some notable exceptions. This is the starting point of the debate. The rest of the debate then follows. If you do not understand this I suggest you read some of the earlier posts, starting with the no-feedback sensitivity ones, not to mention the lack of warming for the last decade or so. This is actually a very complex issue.

        By the way, what atmosphere are you referring to? The Earth’s or some other planet’s? Gas in a bell jar is not an atmosphere.

      • Joe Sixpack

        But that assumes that you know the answer already… that the line is a general slope at 10 degrees. My question is: if you can’t forecast that it’s that sort of line for 10 years, how do you know that the next 90 will do what you expect, and say that 100 years is easier than 10?

      • Joe, I was not talking about climate. You asked a mathematical question so I gave you a mathematical answer. Whether it is applicable to climate is a scientific question. As Harold points out, there are well known cases where this model works. Climate may not be one of them, and I doubt it is. Sorry for the misunderstanding.

      • David L. Hagen

        Joe Sixpack

        why it should be harder to forecast ten years in advance than a hundred

        See Don Easterbrook’s pubs
        http://myweb.wwu.edu/dbunny/research/global/glopubs.htm
        The IPCC is proposing a steadily increasing warming trend.
        Easterbrook shows the 60 year Pacific Decadal oscillation strongly affecting temperatures (ignored by the IPCC).
        If you don’t understand what drives the climate, you can’t project 100 years, when there might be major natural 60 year oscillations.

    • Until recently it used to be that weather was chaotic but that it all evened out in an average climate except for the really long term where it became chaotic again. Now it is that weather and unforced natural variability on decadal timescales are chaotic – but it will all even out over the mid term of say the next 100 years. I notice that some have not quite caught up yet – and are still talking about weather and climate. Get with the program Fred.

      If this sounds like something dreamed up by Bruce and Bruce of the Philosophy Dept. of the University of Wooloomooloo – I can only agree. We don’t really give a rat’s … and are looking forward to the royalties on the big book of egregious rationalisation. See the chapter on post rationalisation of cognitive dissonance.

      Chaotic is not the usual dictionary meaning – but a physics term involving wild fluctuations, unpredictability, rapid changes in conditions and, most fun of all, Dragon Kings. The latter live in coral castles, are guarded by shrimp soldiers and crab generals and cause flooding and drought.

      If this is as clear as mud – Bruce has just turned up and I’ll be off now to chase down some ‘roos in the top paddock.

  13. I find this distinction problematic — “to what extent are regional changes in the current climate due to natural climate variations and thus transitory, and to what extent are they due to anthropogenic forcing and thus likely to continue.” Why assume that natural variations are transitory? Ice ages last for 100,000 years, which is hardly transitory. This seems like the old idea that natural variation is some kind of oscillation around a fixed state. There is no reason to assume this on the decadal scale.

    As for the rest , yes it is good to see the establishment embracing the possibility of serious natural variation. But then they have to because it is not warming, on the decadal scale. On the other hand this looks to be part of the new thrust, to predict local climate change and thus finally be useful, hence funded, using AGW of course. But if it forces them to finally recognize natural variability so much the better. AGW can fade into the background.

    • David,
      Good point. And why assume that natural, multidecadal “variations” ever “repeat” due to the exact same cause. The system is fundamentally unpredictable. And long term trends, although apparently similar, are not necessarily identical in their initial cause.

    • You got there first, thanks!
      This paragraph stuck in my craw as well.
      While it is good to see that natural variability isn’t a dirty word any longer, it is odd to assume from the get-go that they would be ‘transient’.
      Transient in relation to what? Anthropogenic forcings?
      As long as we don’t know how the earth got from the MWP to the LIA (yes, there are hypotheses, but do we have proof?) – a time during which, to my certain knowledge, people were about and burned fossil fuel in the form of coal, as well as renewables in the form of wood and dried dung – how can we now be certain that natural variabilities are transient, but anthropogenic ones are not?
      It is, after all, not about the exact or even estimated temperature in the case of the MWP or LIA, it is about a real climate change.

    • Let us know when AGW fades into the background. In the meantime, the spectrum of light reflected back from earth as measured by satellite after satellite shows ever increasing widening of the notches at the CO2 resonant frequencies. Each notch represents the heat being held back from escaping the atmosphere. You can use that as your yardstick. =)

  14. AR4 and earlier IPCC reports have little to say on regional and decadal issues, but the plan for AR5 contains the following two chapters:

    Chapter 10: Detection and Attribution of Climate Change: from Global to Regional
    · Executive Summary
    · Evaluation of methodologies
    · Atmospheric and surface changes
    · Changes in ocean properties
    · Cryosphere changes
    · Extreme events
    · Pre-instrumental perspective
    · Implications of attribution for projections
    · Frequently Asked Questions

    Chapter 11: Near-term Climate Change: Projections and Predictability
    · Executive Summary
    · Predictability of interannual to decadal climate variations and change
    · Projections for the next few decades
    · Regional climate change, variability and extremes
    · Atmospheric composition and air quality
    · Possible effects of geoengineering
    · Quantification of the range of climate change projections
    · Frequently Asked Questions

    http://ipcc.ch/pdf/ar5/ar5-outline-compilation.pdf

    It remains to be seen how successful they will be in writing these chapters.

    • I am sure they will have no trouble writing these chapters. One sided speculation is easy. This is, after all, the IPCC, not science.

    • So this paper is aimed straight at inclusion into Chapter 11.

      If I were cynical, I might say that this paper is also aimed at getting the necessary funding for the next decade or two set up …. but naw, they wouldn’t even think that, would they??

  15. That first sentence does seem striking at first. But then you realise that it is really just a desperate attempt to explain/excuse the lack of warming over the last decade, so that they can now pretend, retrospectively, that it was to be expected. It is this lack of warming that has forced them to admit that natural fluctuations must be at least as large as the ‘anthropogenically forced’ changes.

    The statement
    As the science of decadal prediction is in its infancy,
    makes an entertaining juxtaposition with the IPCC statement
    For the next two decades, a warming of about 0.2°C per decade is projected for a range of SRES emission scenarios.

    • (Note the line under statement 4 in the paper, not quoted by Judy:
      “As with the preceding decade, the climate evolution in the near term will be a combination of forced climate change and natural variability.”)

    • EXACTLY RIGHT

      The statement
      “As the science of decadal prediction is in its infancy, ”
      makes an entertaining juxtaposition with the IPCC statement
      “For the next two decades, a warming of about 0.2°C per decade is projected for a range of SRES emission scenarios.”

      • Only if you refuse to understand that those two decades are projected from a much, much longer dataset.

    • AKA the Texas Sharpshooter’s Fallacy.

      http://en.wikipedia.org/wiki/Texas_sharpshooter_fallacy

      It’s always easier to predict the future after it becomes the past.

    • I haven’t been looking in detail at the arguments about climate change as many folks at this site, but:

      Is it really a new phenomenon that climate scientists (who think that GW is probably A) say that decadal variations can overwhelm long-term trends for limited periods, but do not negate the existence of long-term trends and/or do not overwhelm longer-term forcings?

      Certainly, in 2009, it was well-publicized that Mojib Latif stated that decadal variability would hide the implications of a long-term trend. He didn’t seem “desperate” to me; he stated that he was concerned about how people might interpret the contrast between countervailing short- and long-term trends, but he didn’t seem “desperate” to understand their relationship. (Of course, his statements were widely taken out of context by “deniers/skeptics” to be a refutation of theories of AGW – a sign of “desperation” on their part, perhaps?)

      If 10 years ago, say, you had asked a climate scientist (who thought that GW is probably A) whether or not there might have been decadal variations that ran in contrast to long term trends during, say, the little ice age, would s/he have answered “No, that would be impossible?” Haven’t climate scientists (who think that GW is probably A) long said that short-term forcings (such as volcanic activity or an El Nino) can overwhelm the effect of long-term forcings for limited periods of time?

      It does not seem to me that this paper is saying – as you interpret it – that they need to “desperately” explain why decadal trends might stand in contrast to long-term trends, but is instead an analysis of how difficult it is to measure what causes variation at the decadal level.

      If the ability to measure decadal trends has been overstated previously, that is an issue worth addressing, but your interpretation of “desperation” may well be an overstatement as well. I can see some reasons for your speculation; for example, I recall Trenberth’s 2009 comment that it was a “travesty” that short term trends were running against long-term predictions. Taken out of context, that could be seen as a statement of “desperation.” But I also remember that the statement could easily be interpreted (more viably? more consistent with the context) to mean that it is a “travesty” because the countervailing trends would make the necessity of adapting to climate change that much more difficult to justify. Was the meaning of his statement twisted by “deniers/skeptics” – perhaps a sign of desperation?

  16. Judith,

    I can certainly understand why my posts that referred to typos were deleted.

    It is less clear to me why you deleted my post which questioned this statement of yours:

    This paper arguably provides more fodder for skepticism of the AR4 conclusions than anything that I have seen from the climate establishment (the authors may not realize this)

    On what basis do you question that the authors who wrote the following wouldn’t realize that their conclusions would arguably provide fodder for skepticism?

    The main conclusion drawn from the body of work reviewed in this paper is that distinguishing between natural and externally forced variations is a difficult problem that is nevertheless key to any assessment of decadal predictability and decadal prediction skill. Note that all the techniques are limited by some assumption intrinsic to their analysis, such as the spatial characteristics of the anthropo- genic signal, independence of noise from signal, or statistical stationarity.

    That statement, in and of itself, directly expresses skepticism about decadal predictability and decadal predictability skill.

    (btw, you still might want to fix the anthropo- genic)

    • apologies, comment restored, i thought it was another editing comment. note i greatly appreciate the editing suggestions, but remove them after they are fixed so as not to clutter up the thread.

  17. Willis Eschenbach

    Judith, my thanks for a most interesting peek into the minds of the modelers.

    First, we haven’t solved the problem in question (is change X natural, anthropogenic, or some mixture) at the global level. Since the “noise” is larger at the regional level, and the models are worse at the regional level, it’s not clear to me why they are even touching the regional level.

    I was also interested by their tacit assumption that there is no way to statistically overthrow the null hypothesis. Because obviously, the issue of “separating natural and anthropogenically-forced” climate is merely another statement of the null hypothesis. And all of their proposed approaches are described as “Identifies change in statistics due to external forcing by comparing forced and unforced runs.”

    I disagree entirely with this approach. If we cannot tell the difference between natural and anthropogenically-forced results by observing the climate, then that is a very important finding. Given that the models are tuned to produce realistic results only when using all of their forcings, comparing some climate model’s runs with and without anthropogenic forcing is laughable. Please, Judith, tell me that you understand that you can’t learn anything by pulling forcings out of a complex, tuned, parameterized, iterative GCM and comparing the results to the same runs with all forcings included …

    I certainly agree that we should begin to see how well the models do on a decadal timescale. The argument is often made that the climate models are useless for predicting next year’s weather, and that they totally missed the current fifteen years and counting of no statistically significant warming … but despite that they’re killer good at the 100 year mark.

    Now, I’ve always wondered … when does the changeover occur, and why? I mean, clearly the current models are junk at the decadal level … so at what time frame do they get accurate?

    And what is it that makes them accurate in the long run but not the short run? How does that work? As a man who has written more than one climate model of various systems, I don’t understand that part.

    w.

    • Willis Eschenbach

      Sorry, I meant “As a man who has written more than one model of various systems”, although I have written a couple of simple global radiation/convection/evaporation global climate models.

      w.

    • >I was also interested by their tacit assumption that there is no way to statistically overthrow the null hypothesis.<

      These people are attempting to destroy the various global economies, and if successful, they will not be able to tell the climatic difference

      I find this more than tacitly interesting :)

    • Willis, i find myself at a similar position to you on this: that the paper seems to support the null hypothesis more than the ‘theory’; though i must qualify this by saying i have only skim-read the paper at present (though i WILL be thoroughly digesting it soon).

      It’s an exceptionally important and interesting subject for me, as if the paper is suggesting what i think it is (that separating the natural from the anthropogenic signal is effectively impossible), despite how it’s dressed up, then i think this is an exceptionally significant paper.

      If, as this seems to suggest- we cannot discern the difference between the anthropogenic and natural cycles then we cannot quantify the anthropogenic signal. We can then not assign a degree of warming to co2 (glossing over the many issues with the assignation for a moment) and then the cAGW theory takes a major, possibly fatal hit.

      Am i wrong in this evaluation? It seems to me that this is an exceptionally significant piece- especially pre-AR5.

      • Three ‘exceptionally’s’ in one post. BINGO!

      • Since the anthro part is largely forced by CO2 and its concentration is detectable regionally from orbit, there should eventually be no problem at all distinguishing most AGW from natural variation, which has no visible CO2 signal. It will be lit up like a neon sign when seen from orbit.

  18. Jim C. wrote: Thank you, pokerguy. Let us see whether Judith is prepared to comment. I must say I do not understand why Judith is in a “difficult position”. If there is solid science, what is the problem?”

    Dr. C. has shown great courage in my opinion. Don’t forget that she, like most of her counterparts, was firmly in the AGW camp. Now she’s regarded as a traitor among many of the establishment climatologists. It’s not easy to be a professional skeptic these days. There are great pressures in academia to adhere to the party line.

    All that said, I share your frustration. We had a brief discussion in another thread about what I see as her very difficult to understand answer, given her stated current beliefs, to Rep Baird’s question about whether the climategate emails undercut the science. (Her answer was essentially “no.”)

    The AGW case gets shakier by the day. I’m hoping she’ll make the leap we’re all waiting for soon.

    • The AGW case gets shakier by the day. I’m hoping she’ll make the leap we’re all waiting for soon.

      Everyone will leap if the current cooling trend shown below continues for a couple of years!

      http://bit.ly/f42LBO

      • pokerguy writes “The AGW case gets shakier by the day. I’m hoping she’ll make the leap we’re all waiting for soon.”

        It does not look as if Dr. Curry is going to take any notice of this discussion. Pity.

  19. To understand the periods 1910-1945 (warming), 1945-1975 (cooling), 1980 – 2000 (warming) and 2000 – 2010 (cooling), it is important to consider the synchronization and upwelling regions of the oceans to which the polar and subtropical jet streams respond. Any change in the relevant ocean currents would result in climatic oscillations. These changes are therefore ‘drivers of the natural multi-decadal climate change’, as is clearly demonstrated by correlation with the major climatic indices.
    http://www.vukcevic.talktalk.net/PDO-ENSO-AMO.htm

    • To understand periods 1910-1945 (warming), 1945-1975 (cooling), 1980 – 2000 (warming) and 2000 – 2010 (cooling)

      20th century trends: http://bit.ly/9lp8q3

      Cooling since 2002: http://bit.ly/f42LBO

    • Paul Vaughan

      vukcevic, the oceans most certainly are not independent of spatiotemporal insolation patterns. Given a mere few months with reduced insolation, ocean surfaces cool quickly. The nature of decadal variability is tied directly to annual & semi-annual cycles.

  20. “There are some encouraging signs that the climate establishment is maturing in their consideration of this issue.”

    Why should I or the world have to care about the “climate establishment” or their “maturing”? This seems to follow the theme that their predictions have failed and their methods are subpar, but there is no reason to consider it because “science has moved on”, while at the same time Hockey Sticks and Tree Rings are sacred items that are still cited as valid.

    Since we can’t quantify the human impact, the discussion of mitigation should be denounced by every real scientist. Since there is no consensus on the quantity of AGW impact, the claim of “consensus” among the establishment was clearly contrived and overstated with a political purpose in mind. This is observed in the many fake polls and the watered-down questions like “is the earth warming” and “is CO2 a GHG”, leading to “do humans who produce CO2 impact warming”. Science deduction at a 3rd grade level, and I’m not being fair to 3rd graders. It has really always been crafted for leftist congressmen and U.N. officials who needed the fairytale of AGW to pass the agenda.

    Of course there is willful ignorance and silence among those on the AGW payroll, and there should be shame. I couldn’t care less about the establishment’s maturity; they need ruthless investigations and their funding eliminated.

    • The reason you should care about the climate establishment is because they are in charge.

      • If election trends continue, that won’t be true for much longer.

      • These are not elected officials, they are scientific program officers in the research agencies, such as NSF, NOAA, NASA, etc. The recent slew of USGCRP reports are even more extremely pro AGW than the IPCC reports. These folks are deeply embedded. Elections do not touch them. They are the ones with two billion dollars a year to spend. They are extremely powerful and no one even knows their names. Think about it.

      • When science/social movement guys are in charge, really bad things happen.

      • “When science/social movement guys are in charge, really bad things happen.”

        Isn’t that exactly what the IPCC and associated water carriers represent?

      • Yep.

    • Pooh, Dixie

      Your example shows the application of fallacy to polling. Here are a couple of references for self-protection. The Trivium is not for 3rd graders. Maybe high school or freshman college. The AGW argument is a target-rich environment.

      Downes, Stephen. “The Logical Fallacies, Stephen’s Guide to : Welcome.” Web. 28 June 2008.

      Joseph, Miriam. The Trivium: The Liberal Arts of Logic, Grammar, and Rhetoric: Understanding the Nature and Function of Language. Philadelphia, PA: Paul Dry Books, 2002. Print


    • Why should I or world have to care about the “climate establishment” or their “maturing”?

      Better science, the opportunity to learn something new, better evaluations of the science,…

  21. Morley Sutter

    From a systems-control point of view (at least in biology), to alter an outcome in any process one has to alter a control point or a rate-limiting step. Therefore if there are several types of “natural” variability plus
    anthropogenically-induced variability, altering the latter will have very little chance of altering the process in question, i.e., climate. We do not know the true control points of climate.
    To overly-simplify, if the house temperature usually is controlled by a thermostat, when the furnace is out, altering the setting on the thermostat does nothing.

  22. cwon1:

    That was beautifully expressed. I especially agree with: “Of course there is willful ignorance and silence among those on the AGW payroll, and there should be shame. I couldn’t care less about the establishment’s maturity; they need ruthless investigations and their funding eliminated.”

    But I believe this thing also has to come apart from the inside. I’ve got to get off the same pony I seem to be riding lately by constantly urging Dr. C. to take a more bold stand, and I’m sorry about it. But I keep reading in places like Scientific American about how she remains firmly in the AGW camp despite her quarrels with the alarmists, and this just isn’t true. Why not at least write them a clarifying letter for publication? Or why not a whole article? I’m certain it would cause quite the stir. The time certainly seems to be ripe.

    To borrow an effective image from the alarmists, I’m hoping for a kind of intellectual tipping point whereby enough establishment warmists simply can no longer live with themselves. I have to believe some of these guys will be desperate to jump ship when they see the jig is up. People generally take the “high road” only when it’s in their best self-interest.

    • This kind of militant skepticism is useful in general, but in the context of this blog it is just silly.

    • pokerguy: I think Dr. Curry has made a plenty bold stand and she has remained true to herself. However it’s also true that she is much more orthodox than most of the commenters here.

      Dr. Curry really is persuaded by the basics of climate change. As I read her, her efforts with this blog are to bring the orthodox and skeptics closer together — not to advocate the overthrow of orthodoxy or the dismissal of skepticism.

      • “As I read her, her efforts with this blog are to bring the orthodox and skeptics closer together ”

        She has been a supporter of improving the processes used by climate scientists. This is an important point, in my view, since the results of flawed processes can be assumed to be flawed themselves.

  23. 2. what physical processes or modes of variability are important for the decadal predictability and prediction problem

    http://bit.ly/hUSDD0

    The physical process is the PDO!

    • Richard Hill

      Is the PDO a physical process?

      • The PDO is related to the movement of the sun relative to the centre of mass of the solar system (CMSS) as shown in the following chart:

        http://bit.ly/ghvtRx

      • I’m with you all the way Girma. But we should think of the PDO and ENSO as a single system with a quasi-periodicity of 20 to 40 years. Indeed the modes of major climate variability all seem to be linked as standing waves in a dynamically complex spatio-temporal system with an exceedingly large phase space.

        There was a guy 100 years ago called Inigo Owen Jones who predicted rainfall in north eastern Australia based on the movements of the large outer planets. He was of course dismissed as a total crank.

        The motion of the large outer planets determines to a large degree the location of the centre of mass of the solar system – which is inside but not at the centre of the Sun. The movement of the centre of mass in the Sun generates fantastic currents within the Sun that are responsible for the heliomagnetic field that envelops the solar system.

        The heliomagnetic field waxes and wanes over the longer term and with the quasi-11-year cycle. It even reverses polarity in the quasi-22-year cycle. The heliomagnetic field influences cosmic rays – but also solar UV, and the latter has a more direct and obvious link to ocean/atmosphere couplings – including ENSO and therefore much of global hydrology.

        I love the power spectra of Scafetta – is there a reference? We should stick to our guns – Girma. My strategy is to find amusement in small things by messing with their heads a little as I go.

      • Chief

        Thanks for your kind words.

        Here is the reference:

        Climate Change and Its Causes: A Discussion about Some Key Issues
        Nicola Scafetta, Duke University

        http://1.usa.gov/haOuJV

      • Center of mass moves, but I have difficulty squaring Leif Svalgaard’s tiny tidal effects with your ‘fantastic currents within the Sun that generate the heliomagnetic field that envelops the solar system’. I’m not being critical so much as curious. What is what?

        I note the similarity with a Van de Graaff generator and how a tiny change centrally can result in a very different sparking pattern.

        I also note that the shape of the curve of maximum cosmic rays alternates from pointed to flat in alternate solar cycles, and that three such cycles fit approximately into one phase of the PDO. This would point to a mechanism to generate this oscillation, which includes approximately the correct timing and a cosmic ray connection. This is a low order effect and may have no consequence, but given your view of the picture, I’d like to see what you think of it.
        =================

      • The Sun spins in a magnetic field that generates electricity that creates a magnetic field that…

        See for instance – http://istp.gsfc.nasa.gov/earthmag/demagint.htm

        The centre of mass is just a proxy for the interactions and evolution of magnetic fields.

        Think interdecadal Pacific oscillation (IPO) rather than PDO. IPO = PDO + decadal ENSO changes.

        The difference between multi-decadal cool and warm IPO phases is the amount of cold water upwelling in the eastern Pacific – very obvious in biological productivity and, of course, in global hydrology. Phytoplankton in the central Pacific, sardines or anchovies in Monterey Bay, chinook salmon in North American streams, etc.

        Sardines or waldorf, pizzas or sardines,
        one brings feast and one brings hunger
        to the seal pups playing in Monterey Bay.
        All this under the cruel eye of the Sun

        I just don’t think that the climate physicists, chemists, statisticians, etc love the seal pups enough to understand. Oh why is the world so cruel?

        I am a bit dubious on cosmic rays – no one has shown a mechanism whereby they can influence upwelling in the eastern Pacific. Solar UV, on the other hand, varies with magnetic intensity – and therefore with cosmic rays – and so has the same temporal signature. It heats and cools ozone in the middle atmosphere and influences downwelling into the polar vortices especially.

        ‘These `top-down’ mechanisms would be effective alongside `bottom-up’ solar heating of the sea surface and the dynamically coupled air–sea interactions. Although differentiating between the effects of variations in the two will often be difficult, recent studies indicate that they are additive, producing amplified SST, precipitation and cloud responses, for example in the tropical Pacific, even for relatively small solar forcing changes.’ http://iopscience.iop.org/1748-9326/5/3/034008/fulltext

        Now do try to follow. The warming and cooling of ozone in the middle and upper atmosphere translates into changes in sea-level pressure in the Southern Annular Mode. With low pressure, the polar anti-cyclone is restrained to the polar region and the ocean current through Drake Passage intensifies. With high pressure, storms and cold polar water are pushed into lower latitudes.

        Here is a current SST anomaly thermally enhanced satellite image.

        http://www.osdpd.noaa.gov/data/sst/anomaly/2011/anomnight.4.7.2011.gif

        You can see the PDO in the north Pacific and the La Niña hanging on in the central Pacific. You can also see cold water being pushed up from the Southern Ocean and pushing up the western coast of South America in the area of the Humboldt Current. The region of the Humboldt Current is the most biologically productive area on Earth because the cold southern water is joined there by upwelling frigid water. The upwelling (or not) in turn determines the thermal evolution of ENSO.

        As an aside – the modellers in the paper this started with want to model ENSO as the result of Rossby waves. These undoubtedly exist and influence the ENSO states. ‘The ENSO cycle can also be explained through the movement of waves in the Pacific as mentioned above. The cycle starts with warm water traveling from the western Pacific to the eastern Pacific in the form of Kelvin waves. After roughly three to four months [Edward Laws, 1992] of traveling across the Pacific along the equator, the Kelvin waves reach the western coast of South America where they mix with the cool Peru Current system; therefore raising sea levels and sea-level temperatures in the region. Upon reaching the coast, the water forks to the north and south and causes El Niño conditions to the south. Because of the changes in sea level and sea temperature due to the Kelvin waves, an infinite amount of Rossby waves are formed and move back over the Pacific. The Rossby waves, as mentioned before, are much slower than the Kelvin waves and can take anywhere from nine months to four years [Edward Laws, 1992] to cross the Pacific. Waves move slower when the distance from the equator is increased. (This wave delay is key to the ENSO cycle.) When the Rossby waves arrive at the western Pacific they bounce off the coast and become Kelvin waves and again travel back across the Pacific towards South America. This time, however, the waves decrease the sea level and sea surface temperature, returning the area to normal or La Niña conditions.’

        http://library.thinkquest.org/3356/main/course/moreintro.html
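The Kelvin/Rossby cycle quoted above is often idealized as a ‘delayed oscillator’ (in the spirit of Suarez and Schopf’s 1988 model): dT/dt = a·T(t) − b·T(t − τ), where the delayed term is the returning Rossby-wave signal. A minimal sketch, with purely illustrative coefficients that are not fitted to any observations:

```python
# Delayed-oscillator sketch of the quoted Kelvin/Rossby ENSO cycle:
#     dT/dt = a*T(t) - b*T(t - tau)
# T             : east-Pacific SST anomaly (arbitrary units)
# a*T           : local growth (Kelvin-wave warming feedback)
# -b*T(t - tau) : delayed negative feedback (the Rossby-wave "echo")
# All coefficients below are illustrative only.

def delayed_oscillator(a=0.2, b=1.0, tau=2.0, dt=0.01, years=30.0, T0=0.1):
    """Euler-integrate the delay equation with a constant initial history."""
    lag = int(round(tau / dt))
    steps = int(round(years / dt))
    T = [T0] * (lag + 1)              # anomaly held constant over the delay
    for i in range(lag, lag + steps):
        T.append(T[i] + dt * (a * T[i] - b * T[i - lag]))
    return T[lag:]

T = delayed_oscillator()
# The delayed negative feedback makes the anomaly swing between warm and
# cold states instead of decaying monotonically:
sign_changes = sum(1 for x, y in zip(T, T[1:]) if x * y < 0)
print(f"{sign_changes} warm/cold transitions in 30 model years")
```

A linear toy like this produces perfectly regular oscillations, which is one reason the pure wave picture struggles with ENSO’s observed irregularity; real delayed-oscillator models add nonlinear damping and stochastic forcing.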

        The Rossby-wave theory fails to explain, first, the irregularity of the ENSO cycle and, second, the origins of the multidecadal variability, the centennial changes evidenced by Australian cyclone frequency, and the millennial changes. Evidence for millennial change exists in a South American lake: a particular red pigment washes into the lake, and the change in ENSO state can be seen in a shift in the amount of pigment. States changed from La Niña-dominant to El Niño-dominant 5000 years ago – something that dried the Sahel and changed the course of human cultural evolution.

        Oh why oh why don’t they understand? Instead we have the pissant tendentiousness of – I don’t know – say cmb.

        Now where was I? Oh yes – I want a blue pony.

      • I hope you know Erl Happ. If not, please do.

        & thanks.
        =====

      • Hey – cool blue pony

      • You are welcome – I love that picture. It hangs in Munich in the Lenbachhaus.

      • Thanks, but I almost destroyed my keyboard laughing at your graph AND source. lol

      • The PDO is certainly a physical process involving the upwelling of frigid deep ocean water. This water is nutrient rich and leads to good fishing in North American streams.

  24. I am not sure why this issue is raised again; decadal prediction is known to be difficult because of natural variability, as was discussed here several months ago. The concept people have trouble with is that natural variability averages out to zero in the long term, while climate forcing changes the mean about which natural variability oscillates. Here forcing includes anthropogenic, solar, and volcanic effects on the radiation budget. Oceans and land don’t add or subtract energy from the system, making them part of natural variability.

    • It seems a rather odd and tendentious claim that natural variability doesn’t add or subtract energy from the system. The potential range of change in albedo is about 25 to 50% – snowball Earth to blue/green planet – a difference of about 85 W/m-2 in reflected shortwave. Mighty impressive by anyone’s count. Far, far more than the puny 3.7 W/m-2 for a doubling of CO2.

      Decadal changes in cloud are known to occur – with seemingly far greater effect than any piffling change in greenhouse gases in the period. How do you know that natural variability is merely decadal in nature? Could we not have had a Little Ice Age or a Medieval Warm Period? A Younger Dryas or the mysterious ice ages of the Quaternary? All from natural causes?

      But indeed – you are quite right. In the long term – it all averages out to the 0 Kelvin of maximum entropy.

      • Bitter dust – Bart R – bitter dust

      • CH, you seem to have an idea that clouds change spontaneously and unforced on time scales that are longer than decades. Where does that idea come from? Could not their long-term changes be related to anthropogenic or solar forcings?

      • Who said anything about spontaneous and unforced? There was a post on decadal changes in cloud a little way back. The observed change in the eastern Pacific – that most fundamental region for global hydrological and surface temperature variability – is a reduction of low-level cloud after the late 1970s and an increase in the late 1990s (Clement et al 2009, Burgmann et al 2008). The latter change shows up in ERBE and ISCCP-FD satellite data. It seems to be negatively correlated with sea surface temperature – which on the surface (ha ha) appears to be plausible. SST varies with the Interdecadal Pacific Oscillation.

        Looking at ERBE and ISCCP-FD data – damned if I can see any anthropogenic forcing. Solar forcing? What can you mean?

        As I have answered your question – perhaps I can ask a question. What do you understand by the term spatio-temporal dynamical complexity?

      • If cloud variability is associated with natural ocean variability, there is no net gain or loss to climate in terms of energy, since it reverses itself, like the ocean does. Cooling now is warming later. Not relevant to the long term, only to decadal changes.

        Spatio-temporal dynamic complexity is a term used to make a simple subject seem more complicated. From observations, the magnitude of natural variability on decadal/global scales is plus or minus a couple of tenths of a degree, and its importance is blown out of proportion to its magnitude.

      • You obviously are oblivious to the nature of the variability under discussion – which is based on dynamically complex systems theory. You are blindly hoping, for some reason, that decadal ‘internal climate variability’ – which is chaotic according to the recent Royal Society summary – is in fact cyclical and that we will return in due course to a warming cycle. Most amusing.

        You were wrong in the past when you denied that decadal variability was a reality at all, wrong more recently when you declared confidently that ‘internal climate variability’ did not change the global energy dynamic, and you are wrong currently in denying the strength of cloud radiative forcing in the satellite record.

        Given that you have demonstrated comprehensively a complete ignorance of ‘internal climate variation’ – you’ll forgive me for questioning your credibility.

      • So-called abrupt events are thought to change global temps by several degrees in a few decades. It is speculated that they are due to ocean circulation changes. Of course none of this is well understood, but your claim that a few tenths is quickly over is not supportable. You are merely wishing skepticism away, wave wave.

      • Jim – I mostly agree with you, but I should point out that rapid, large-scale temperature shifts (more often warming than cooling) have occurred at times, typically separated by thousands of years rather than decades or even centuries. However, they have almost always been associated with glacial climates rather than an interglacial like the one we are in now, and most have been hemispherically weighted rather than global. In general, your reference to a “couple of tenths of a degree” is relevant to natural variations on decadal scales we are likely to encounter in the remainder of this century. (I made a similar point below in the thread, but I’m repeating it here because some responses to your comment have been inaccurate, insulting, or both).

      • I also agree with your impression that there has been a tendency by some to use complex terminology as a form of intimidation rather than edification.

      • In principle, changes in climate on a wide range of timescales can also arise from variations within the climate system due to, for example, interactions between the oceans and the atmosphere; in this document, this is referred to as “internal climate variability”. Such internal variability can occur because the climate is an example of a chaotic system: one that can exhibit complex unpredictable internal variations even in the absence of the climate forcings discussed in the previous paragraph.

        Climate and climate change: some background science
        The Royal Society Climate change: a summary of the science I September 2010

        Beware the Dragon King – he rules non-linearity, makes a nonsense of your certainty and mocks your pissant tendentiousness.

      • I have yet to see any internal variability that is more than a few tenths of a degree, though I will concede to Fred that the periods near Ice Ages were bi-stable and could oscillate between widely separated albedo states. We are not currently in such a state, and are moving away from it due to the diminished ice-albedo effect, though some effects would be noticed as Greenland greens up after melting a few centuries from now, and later Antarctica (if we get to 1000 ppm). Yes, there are tipping points, but these are responses to forcing changes, not to internal variability in the current semi-warm climate, and CO2 forcing is going to push us through some of these step-like changes.

      • Jim,
        The problem is that our empirical data and our theoretical understanding are not conclusive. It’s just impossible to reliably attribute past variations when too little is known about the most important variables (both the forcings and the state of the oceans and the atmosphere). There is enough freedom of choice to get reasonable agreement even with seriously flawed models.

        I agree (I think that you agree as well) that the complexity has been used as an excuse for belittling the value of much climate research. On this site we have seen many guest postings and comments that pick one formal argument and overemphasize its significance. It’s not correct to claim that those arguments show that we should give no value to the model results. Still, I’m not at all convinced that we can put strong limits on the possible role of natural variability on any time scale.

    • Not quite. It isn’t that the concept of natural variability integrating out to zero is hard to get intuitively; it’s that it’s not clear that it has to be that way. That assumes some sort of periodic behavior centered around some band of frequencies, i.e., pink noise. But what if the noise is white? Then that isn’t true. If the noise were white (at least out to the scale of ice-age cycles), then nothing would ever average out to zero. It would just wander aimlessly.
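The distinction can be illustrated numerically: a mean-reverting process (here an AR(1), standing in for ‘oscillation about a mean’) settles around zero, while integrated white noise is a random walk whose excursions grow without bound and never average out. A toy sketch using only the standard library – the series are synthetic, not climate data:

```python
import random

random.seed(0)

def random_walk(n):
    """Integrated white noise: a random walk that never 'averages out'."""
    x, path = 0.0, []
    for _ in range(n):
        x += random.gauss(0.0, 1.0)
        path.append(x)
    return path

def ar1(n, phi=0.9):
    """Mean-reverting noise: always pulled back toward its mean of zero."""
    x, path = 0.0, []
    for _ in range(n):
        x = phi * x + random.gauss(0.0, 1.0)
        path.append(x)
    return path

n = 20000
walk, rev = random_walk(n), ar1(n)

mean_rev = sum(rev) / n               # shrinks toward zero as n grows
max_walk = max(abs(x) for x in walk)  # grows roughly like sqrt(n)
print(f"mean of mean-reverting series: {mean_rev:+.3f}")
print(f"largest random-walk excursion: {max_walk:.1f}")
```

The long-run mean of the mean-reverting series shrinks toward zero as the record lengthens; the walk’s excursions keep growing, so a trend fitted to any finite stretch of it says nothing about a restoring ‘zero anomaly state’.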

      • The constraint is the energy budget, and the drivers are solar and albedo changes along with GHGs. If these change, the mean can change; if not, it can’t.

      • And since ALL of them change constantly………?

      • The connection between their changes and global temperature is understood, so it comes down to predicting those changes. CO2 is currently the biggest change we can foresee for this century. If someone knows a bigger one, they need to come forward with it, but nothing credible has been offered yet.

      • Are you telling me that you (or anyone) can predict solar changes – or albedo changes? AFAIK neither solar activity nor albedo has yet been quantified, although that could be done for a short time period – like 24 hours.

        But prediction? Really? So how did that sunspot prediction thingy work out for you over the last several years? Or haven’t you followed that particular prediction farce?

        CO2 changes might be reasonably guessed at, absent vulcanism over a given period, but even there ——–

      • No, I am not saying they are predictable, but that their effects on forcing are predictably smaller, based on past experience, than the effect CO2 is known to be about to have. No one has said albedo or solar effects will change more than they have in the past millennium, as far as I know, but CO2 will.

      • Are you redeemable? Some are not, but merely lurk about, shark-like, with tendentious (I am afraid it is my word of the week) menace.

        We have a period of surface warming between 1976 and 1998. 1976/77 and 1998/2001 were Dragon Kings – periods of high variance associated with bifurcation points. Most of the warming was associated with these events. The other warming – the satellites tell us – was largely the result of clouds. Now NASA might be wrong here – I freely admit that – but it behoves us to be less dogmatic in our allegiance to global warming ideologies.

        There is a risk in adding CO2 to the atmosphere – of provoking the unpredictable wrath of the Dragon King – certainly not something I would encourage.

        But the world is cooling for another decade or three – and this is a political quandary, as any future claim as to the necessity of decarbonisation will be sunk under gales of laughter unless the narrative is changed immediately and with great humility.

        The question is – are you part of the solution or part of the problem like Fred?

      • It is funnier than that – NASA is telling us that clouds caused changes in reflected SW of 5 or 6 W/m-2 between 1985 and 1998 in the tropics – in both ERBE and ISCCP-FD.

        http://isccp.giss.nasa.gov/zFD/an2020_SWup_toa.gif

        Now NASA might be wrong – I recognise that – it is the Government and it is rocket science after all.

  25. I was bored silly by this paper by p2 and stopped reading. It is a problem of models in themselves – they need to be 1000 times bigger to capture the micro-scale events – clouds, winds, sea-level pressure, ocean surface temperature, upwelling, downwelling, etc. – that have macro-scale climatic impacts, and thus be plausible. Even so, the models are themselves of course dynamically complex – using the same partial differential equations of fluid motion that Edward Lorenz used in his 1963 convection model to discover chaos theory. Although Tomas would insist that chaos was first observed by Poincaré quite some time before, in a three-body calculation. But who is to say that the models are dynamically complex in the same way as the vastly more dynamically complex climate system?

    More than that – we lack any mathematical understanding of major drivers of climate – so how can it be modelled? Clouds for one – but also solar UV. Lockwood and colleagues last year – http://iopscience.iop.org/1748-9326/5/3/034008/fulltext – showed a long-term drift in solar UV, and have linked this to changes in the troposphere. The role of solar UV in warming and cooling stratospheric ozone, and of these changes propagating to the troposphere, is an obvious mechanism.

    ‘These `top-down’ mechanisms would be effective alongside `bottom-up’ solar heating of the sea surface and the dynamically coupled air–sea interactions [20]. Although differentiating between the effects of variations in the two will often be difficult, recent studies indicate that they are additive, producing amplified SST, precipitation and cloud responses, for example in the tropical Pacific, even for relatively small solar forcing changes.’

    Models that include top-down forcing are appearing – but these people aren’t even in the game yet. Solar UV appears to drive the Southern Annular Mode, the Arctic Oscillation, the Interdecadal Pacific Oscillation, the Atlantic Multidecadal Oscillation… Add to that changes in SST in the Pacific, and therefore cloud changes, global cloud radiative changes and other global climate factors including much of the change in regional hydrology. These evolve as Girma rightfully says from upwelling in the eastern Pacific.

    The solar UV mechanism is in fact somewhat of an alternative explanation to cosmic rays – and a more direct and obvious impact on the modes of major climate variability. It would be very funny if UV retreats from a 1000-year high and surface temperature goes with it.

    Modelling of upwelling would require modelling of turbulent deep ocean jets on a global scale. Good luck with that. Lockwood also talks about regional effects – regional cooling and global warming – see p82 of the Big Book of Egregious Rationalisations. But how can global wide systems change without a global wide impact? How can low level cloud increase without global cooling? How can climate be non-linear and linear at the same time?

    The Dragon King rules over non-linearity – our enemies will drink bitter dust before this is done. The entire idea of climate equilibria, forcings and feedbacks, the whole box and dice, needs to be jettisoned – it cannot be saved.

    • Joe Lalonde

      Chief,

      I find hilarity in the fact that billions of dollars have been spent on climate science and they are no further ahead.

      Yet a single person looking at the evidence can find the answer to what is currently happening with this weather change.
      Definitely not a CO2 problem, but a heat problem that has changed the density of the air, which then changes the pressure.
      Everything from increased storm wind speeds to seismic activity can be traced to this one area of changing sea-level pressure.

      This accounts for the surface salinity changes and generates less friction for wind speeds to increase. Back pressure is what is exerted on the planet’s surface, and when that changes, the pressure under the planet’s crust does not.

      • I don’t mean to ignore you Joe – I just don’t understand and so don’t know what to say?

      • The sun is very sultry and we must avoid its ultry violet rays.

        H/t to Noel Coward and Erl Happ.
        =================

      • Joe Lalonde

        Chief,

        I study the planet and solar system differently.
        I study this area mechanically, through the clues left behind, and try to recreate what this planet was like in the past and what changes occurred.
        So far, I have come across many mistakes that have disrupted our current understanding.

      • Joe, you’re right. I don’t know why I didn’t see it before.

        As the earth’s atmosphere heats up, it becomes less dense and therefore the pressure of the atmosphere on the earth’s surface drops. Except … the total mass of all the gases in the atmosphere remains the same and … it is the pull of gravity on the mass of the atmosphere that determines the ground-level pressure.

        I must be wrong about this so please help me out.

      • In an unchanging atmosphere, pressure reflects the total weight of the air column above the measured point. During a change (e.g., upward convection due to heating), the pressure may drop, but once the atmosphere stabilizes, the pressure should return to its previous value. (This neglects minuscule alterations, such as those related to changes in gravitational strength with altitude).
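That hydrostatic point can be checked with a one-line estimate: mean surface pressure is just the weight of the atmosphere per unit area, p ≈ M·g/A, using the standard figure of roughly 5.1 × 10^18 kg for the atmosphere’s mass:

```python
import math

# Global-mean surface pressure from hydrostatic balance: p = M * g / A.
# Heating can redistribute the column (lows and highs) but cannot change
# this mean unless the atmosphere's mass or gravity changes.
M_ATM = 5.15e18      # total mass of the atmosphere, kg (standard estimate)
G = 9.81             # surface gravity, m/s^2
R_EARTH = 6.371e6    # mean Earth radius, m

area = 4.0 * math.pi * R_EARTH ** 2      # Earth's surface area, ~5.1e14 m^2
p_mean = M_ATM * G / area                # pressure in Pa

print(f"{p_mean / 100:.0f} hPa")         # ~990 hPa; standard sea-level
                                         # pressure is ~1013 hPa, the gap
                                         # reflecting land above sea level
```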

      • I would be inclined to doubt that hydrostatics is applicable in the atmosphere of any real world.

      • Hydrostatics are the basis of fundamental atmospheric properties, including lapse rates.

      • Oh really – dynamical complexity is the fundamental property of the atmosphere.

      • Concerning the pressure at sea level, the dynamical complexity can produce the lows and the highs but cannot influence the average. This is a case where the static calculation is really reliable.

      • You know that do you? The changing pattern of standing waves in the atmosphere, multi-decadal changes in the SAM and the AO, multi-decadal patterns of changing sea surface temperature, changes in temperature in the stratosphere from UV warming of ozone, changes in the NAO…?

        I always remind myself that the simple stories we tell ourselves are the grossest of approximations – I recommend to you and Fred a similar humbling exercise.

      • With arguments based on physical principles, the idea is to learn from them as much as one can without erring too often. Here I am safely on the right side, and not even close to the limit.

        It is good to be humble, but not too humble.

      • I was questioning the applicability of simple hydrostatic models to an exceedingly complex atmospheric system in order to make a dubious point on a blog.

        To insist on its correctness without the ability to apply the scientific method is an absurd procedure not likely to resolve into anything of much interest or value.

      • Joe Lalonde

        Speed,
        You’re not wrong.
        Except gravity is not the pull of objects by our core.
        The sheer complexity of interaction makes it hard to pick out what is happening.
        Especially when you’re told you have to follow these consensus laws.
        We cannot understand gravity and try to add magnetics, but that still does not answer questions of why some objects are heavier than others and it is not subject to magnetics.
        Read below, as I have finally understood what gravity actually is.

  26. Michael Larkin

    I think Judith is right. This is remarkable. Good God, that there should be even the faintest acknowledgement that, besides an anthropogenic component, there might be a natural one, let alone that this might be worth investigating, is a major crack in the ramparts.

    And as to it being easier to predict what will happen in 100 years as opposed to 10 or 20, that runs so counter to intuition that one can’t blame scepticism about it.

  27. Well, as I have read in all this, the contention is that it is easier to predict climate at some far distant date than at some nearer date. The specific meaning of distant is rather vague but appears to be more than several decades. The reason given is that over the long term, short-term climate variations average out to a zero trend line. I find that assumption to be unwarranted. There is nothing in the long-term temperature reconstructions that indicates some identifiable natural zero-anomaly state.

    So, anthropogenic forcing will be differentiated from natural variability by the difference in climate state at some future date from that our computer programs tell us would be the ‘natural’ state at that time? If there was no additional anthropogenic forcing, would future climate be exactly the same as today? How would we know?

  28. Paul Vaughan

    “[…] differences in model estimates […] limit our understanding for the potential of the decadal climate predictions.”

    Translation?
    Better get all those ducks in a row since once the differences between models are gone, understanding will be unlimited.

    Credit is, however, due here:
    “Note that all the techniques are limited by some assumption intrinsic to their analysis […]”

  29. Paul Vaughan

    JC wrote, “[…] separating out natural from forced climate change.”

    Work on understanding the nature of the variance has barely begun, so it seems almost tragically comical that gamblers hope a few naive decompositions might make them mean forecasting experts.

  30. Dr. Muller told the US Congress that only 0.1C of 0.7C warming from 1957 to present was natural.

    • And he could not possibly know that. The BEST project has nothing to do with attribution of warming.

    • Big Dave,
      If Muller said that as if it were a serious verifiable conclusion then he is pulling a SWAG (Scientific Wild Ass Guess) minus the science.

  31. Pooh, Dixie

    There is a problem inherent in this statement: “… an assessment of the physical processes through which anthropogenic forcing may interact with or project upon natural variability.” (Abstract, quoted by Dr. Curry) It focuses on the UNFCCC’s Objective (paraphrased as proving human activity is the cause of “Global Warming”).
    The problem: understanding the anthropogenic effects is inadequate as a standalone goal. Natural variability is dynamic, in the “continuous” sense. It appears to have lagged teleconnections between regions. One must understand both to distinguish between anthropogenic and natural variations.
    Another statement:
    “… regional changes in the current climate due to natural climate variations and thus transitory …” Or periodic (“I’ll be back” :-) )
    This was also noted by David Wojick | April 7, 2011 at 4:35 pm

    “I find this distinction problematic — “to what extent are regional changes in the current climate due to natural climate variations and thus transitory, and to what extent are they due to anthropogenic forcing and thus likely to continue.” Why assume that natural variations are transitory? Ice ages last for 100,000 years, which is hardly transitory. This seems like the old idea that natural variation is some kind of oscillation around a fixed state. There is no reason to assume this on the decadal scale.”

  32. Judith: I’m finishing up a post that compares the AR4 model mean to Reynolds OI.v2 SST data during the satellite era. With the exception of the North Atlantic, the model mean linear trends are far higher than the SST measurements.

    The zonal mean comparisons for the ocean basins are very revealing. I’ll leave a link when I’m done.

  33. To separate natural and anthropogenically-forced climate variability, start with GISS temperature calendar-year means with a 1-year lag, and then subtract ENSO calendar means. This logical method compares ENSO anomalies with ENSO effects (the 1997 El Nino year compared with the 1998 temperature residual).
    Linear GISS temperature trend (1950-2009) = 0.65 deg C

    GISS temperature calendar means with 1-year lag, then
    subtract ENSO calendar means

    = Residual linear trend = -0.17 deg C

    The residual trend is negative; therefore ENSO can account for 0.48 deg C of the linear GISS temperature trend (1950-2009).
    It's plausible that 0.17 deg C is anthropogenically-forced, but there is no proof. ENSO, on the other hand: "all your base are belong to us."
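
The lag-and-subtract recipe above can be sketched in a few lines of Python (a minimal illustration only: the series below are synthetic placeholders, the regression-scaled coefficient stands in for the commenter's direct subtraction of calendar means, and nothing here reproduces the 0.65 or -0.17 deg C figures):

```python
import numpy as np

def linear_trend(years, series):
    """Least-squares slope, scaled to total change (deg C) over the period."""
    slope = np.polyfit(years, series, 1)[0]
    return slope * (years[-1] - years[0])

# Hypothetical inputs: annual GISS-style anomalies and an annual ENSO index, 1950-2009.
rng = np.random.default_rng(0)
years = np.arange(1950, 2010)
enso = rng.standard_normal(years.size)   # stand-in for an ENSO index
giss = (0.01 * (years - 1950)            # background trend
        + 0.1 * np.roll(enso, 1)         # lagged ENSO imprint
        + 0.05 * rng.standard_normal(years.size))

# Pair ENSO in year t with temperature in year t+1 (1997 El Nino vs. 1998
# temperature), estimate the ENSO contribution by regression, and remove it.
lagged_enso = enso[:-1]
temp = giss[1:]
beta = np.polyfit(lagged_enso, temp, 1)[0]
residual = temp - beta * lagged_enso

total = linear_trend(years[1:], temp)
resid = linear_trend(years[1:], residual)
print(f"total trend {total:+.2f} deg C, ENSO-removed residual trend {resid:+.2f} deg C")
```

With real data one would substitute the actual GISS annual means and a published ENSO index for the placeholder arrays.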

  34. Huxley wrote: pokerguy: “I think Dr. Curry has made a plenty bold stand and she has remained true to herself. However it’s also true that she is much more orthodox than most of the commenters here.

    Dr. Curry really is persuaded by the basics of climate change. As I read her, her efforts with this blog are to bring the orthodox and skeptics closer together — not to advocate the overthrow of orthodoxy or the dismissal of skepticism.”

    Huxley, I don’t disagree with anything you’ve said. She’s a “skeptical believer” if I understand her general position properly. Moreover, she’s making a profoundly important contribution to opening up the debate.

    I’m not going to get into it all again, but it’s also true that there have been times when she seems reluctant to fully embrace her own beliefs. If you’re interested there’s a discussion in the previous “week in review” thread.

  35. I don’t see that this paper provides ” fodder for skepticism”.

    The paper seems to set some boundaries on the impact of natural variability which maintain CO2 as the main, long term driver of global temperatures. The two most important factors seem to be

    1) natural variability is redefined as decade-long noise. Call me cynical, but does this have anything to do with a decade of unimpressive global warming?
    2) natural variability is essentially a regional issue. Why would anybody use these methods to analyse the global impact of these phenomena?

    • HR,
      The issue is that while the paper discusses regional scales, everyone knows the data is the same on the global scale. The warmest year on record is still 1998 (with 2005 and 2010 as statistical ties).

    • The word ‘global’ only appears 3 times in the body of the paper, and in each case it seems to be related to these analyses as a method for producing a cleaner signal of external forcing. I can’t help thinking this paper is simply reinforcing (in a nuanced way) the approach of the IPCC.

    • Perhaps that is the intent of the authors, but it is still nice to see them admit any part of climate science is still in its infancy. I have not yet read the paper, only the post by Dr. Curry. I have to give an important presentation (important to me anyway) today so that is my excuse for not reading the paper yet. I do look forward to reading it.

      • Given that there is uniform agreement that modeling on the regional scale over a century or so is THE problem in climate science, Eli is shocked, shocked, that you were not on the distribution list.

      • Willis Eschenbach

        Eli, I love you guys, the world would be a much less funny place without you. Having been shot down with the idea of a “consensus”, you now come back with “uniform agreement”, as in:

        Given that there is uniform agreement that modeling on the regional scale over a century or so is THE problem in climate science, Eli is shocked, shocked, that you were not on the distribution list.

        What is with this craving of you guys for respectability with your repeated insistence that whatever you think, everyone thinks? We don’t all think like you, Eli, and I’m real happy that’s so.

        In my opinion, THE problem in climate science is the lack of an overarching theory about how the climate works.

        I hold that there are a number of homeostatic phenomena that tend to keep it around a certain temperature. Every other complex system has such homeostatic processes, it is inconceivable to me that the climate would not have them. I detail one of them here.

        Eli and others use a very different paradigm. They think the climate is like a pool ball on a level table, so if you strike the ball with 3.7 watts of forcing, it moves 3 degrees in that exact same direction. There are no preferred states, no temperatures that are more likely than others. Seems very improbable to me, given the planet’s billion year history of rapidly returning to the same approximate temperature after e.g. a meteor strike or a thousand years of volcanic traps, but that’s what they believe.

        Obviously, before we can make much headway, we need to decide which of these competing theories of the climate is correct. That is why to me, the development of a central theory of how the climate works is infinitely more important than developing a computer program that can reproduce the climate.

        Because to reproduce the climate, you have to understand it. And the trivial advances in computer modeling of the climate in the last two decades indicate that … well … we don’t. And now you are trying to convince us that the problem is on the level of the computer program, rather than on the level of our basic understanding of climate.

        Finally, I’m shocked, shocked that you weren’t on the distribution list for all of that, Eli. Because after all, it is THE important problem in climate science.

        w.

      • Actually, “uniform agreement” is stronger than “consensus.” Teh bunny moved teh goalposts down teh bunny hole.

      • :-)

        Martha is shocked, shocked, that Judith Curry makes this out to be something it is not.

      • As if you are a credible source of anything.
        Bitter much?

      • Given that there is uniform agreement that modeling on the regional scale over a century or so is THE problem in climate science

        There’s always that 10% that doesn’t get the memo. But you seem to have missed the superseding memo – that this is, at most, the SECOND problem in climate science. You must have missed that one. :-)

      • Now that climate scientists have wasted our time scaring us with huge global apocalypse tales, they are going to move on to waste more of our time on regional fables.
        Think of Greek regional mythologies vs. Mesopotamian regional mythologies.

  36. I’ve noticed that some readers have been dubious about the notion that model accuracy for global temperature trends could be greater at multidecadal intervals than over the shorter interval of a decade. Empirically, with regard to global temperature anomalies, this has proven to be the case both for hindcasts and for the only model to date whose forecasts beyond a decade can be compared with observations – Hansen et al. 1988 – see Model Updates. This is an attribute of systems in which short term fluctuations can mask long term trends.

    Whether this would apply to 100 year projections is conjectural, but it might. However, here it’s important to avoid confusion about what is meant by “accuracy”. Both modeled and observed trends will be characterized by a “trend line” – with a single slope for each if the trend is linear, but it might not be. Accuracy refers to the extent to which the slope(s) of the observed trend line match those of the model projection. It does not refer to the absolute magnitude of differences between the two lines. In the latter case, accuracy would be almost perfect after an interval of one day, even if the slopes diverged by 100 percent. For future projections, it is the trajectory of the trend whose accuracy is important, and which can improve over time, at least over certain intervals, depending on signal to noise ratios.
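
The claim that short-term fluctuations can mask long-term trends, so that fitted slopes become more accurate over longer windows, can be illustrated with a toy Monte Carlo sketch (purely synthetic numbers; the trend and noise amplitudes are arbitrary assumptions, not model output or observations):

```python
import numpy as np

# For a linear trend buried in interannual noise, the estimated slope
# converges to the true slope as the fitting window lengthens.
rng = np.random.default_rng(42)
true_slope = 0.02   # deg C / year (arbitrary)
noise_sd = 0.15     # interannual "weather" noise (arbitrary)

def slope_error(window_years, n_trials=2000):
    """Mean absolute error of the fitted slope over many noisy realizations."""
    t = np.arange(window_years)
    errs = []
    for _ in range(n_trials):
        series = true_slope * t + rng.normal(0, noise_sd, window_years)
        fitted = np.polyfit(t, series, 1)[0]
        errs.append(abs(fitted - true_slope))
    return float(np.mean(errs))

err10, err30 = slope_error(10), slope_error(30)
print(f"mean slope error: 10-yr window {err10:.4f}, 30-yr window {err30:.4f}")
```

For white noise the standard error of an OLS slope shrinks roughly as the window length to the -3/2 power, which is why the 30-year window wins by a wide margin here; whether real climate noise behaves this way is exactly what is in dispute in the thread.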

    • Fred Moolten

      For clearing up the folly of believing that long-term predictions will be more accurate and meaningful than short-term ones, I can recommend you read Nassim Taleb’s “The Black Swan”.

      Max

      • Max- From hundreds of millions of years of climate history, we know that abrupt climate shifts can occur. However, they are exceedingly rare within any centennial interval and are generally separated by thousands of years or more. They have more often been sudden warmings than sudden coolings, and more often hemispheric than global. Almost all major shifts have occurred during glacial intervals rather than interglacials such as the present.

        Despite the improbabilities inherent in that history, the observations alert us to the possibility of “tipping points”, which with current climate trends are more likely to be abrupt warmings than coolings, but they don’t contradict the principle that at present, multidecadal global temperature trends can be predicted more accurately than those on decadal scales.

        The Decadal Predictability Working Group that authored the paper is charged with identifying means to improve decadal predictions. This need is one that has been recognized for some time by the IPCC and others, and in that sense, the paper and the work of the group does not, in my view, represent a change in perspective but rather an attempt to advance the science beyond its earlier recognized limitations.

      • Fred

        I would agree with you that all weather (and, by extension, all climate) is regional rather than global. It is also seasonal and diurnal, so that the “globally and annually averaged land and sea surface temperature anomaly” remains an artificial construct, which has no real meaning, per se, quite apart from any uncertainties regarding its accuracy.

        But I would disagree with you that the recognition of natural variability (a.k.a. “natural forcing” in its time-extended version) as a significant factor in determining our climate has not represented a “change of perspective” by the “insider” climate group.

        You will recall that IPCC AR4 assigned all natural forcing factors from 1750 to 2005 an essentially insignificant role. The “driver” of our climate since pre-industrial times was anthropogenic forcing – basta, end of discussion.

        Now the “camel” of “natural variability” has gotten its nose under the tent and is even being recognized by the “insider” crowd (albeit still only on decadal time scales and with primarily regional impact).

        The next step is that the whole camel is inside the tent and we have the concession that natural climate forcing has played a major role in the climate changes from 1750 to today (obviously reducing the importance of anthropogenic forcing).

        How did this shift occur?

        The past decade of “lack of warming” despite CO2 increase to record levels demanded an explanation. First attempts by the “insider” crowd to deny its existence or simply pooh-pooh it as a “blip” were unsatisfactory.

        Vicky Pope opened Pandora’s Box with her “natural variability” press release, and this latest Solomon et al. paper is a first “insider” reaction to the questions raised by Pope.

        It does represent a sea change, Fred, even though it is still very cautiously worded and couched in the decadal time frame for now.

        It may be the first “insider” concession that natural variability can play a greater role than GHGs, but IMO it will not be the last one. And future studies will undoubtedly expand its relevance to multi-decadal and even centennial time frames.

        Max

        PS Interpretations of paleoclimate data are nice, but largely inconclusive IMO due to their large degree of subjectivity, so I would place far more importance on the “real-life” observations.

      • I believe the perceptions of the IPCC and the Decadal Predictability Working Group are congruent. IPCC assessments since 1750 attempt to evaluate long term trends, but do not contradict the principle that within that long interval, many individual decades have varied substantially in an up and down direction – see, for example, the fluctuations in Temperature Anomalies, which include many decades that deviate from trend averages. The Working Group is addressing the latter phenomenon. I am not a mind reader, but I am willing to guess that the authors of that paper do not believe that decadal fluctuations and unpredictability undermine conclusions about long term anthropogenic influences. I suppose we could research what these authors may have written elsewhere on the topic, but I would be surprised if they express a very new perspective.

      • You will recall that IPCC AR4 assigned all natural forcing factors from 1750 to 2005 an essentially insignificant role.

        Did it assign them an “essentially insignificant role,” or a role that long-term balances itself out? Did they say, for example, that solar activity is an “insignificant” forcing on climate, or that the degree change in that forcing was insignificant, and thus did not explain long-term changes in climate?

      • Joshua

        Look at Figure SPM.2. for the answer to your question.

        All natural forcing (including solar variability) was estimated to represent 0.12 W/m^2 (1750 to 2005), while all anthropogenic forcing over the same period was stated to have been 1.6 W/m^2 (1.66 W/m^2 for CO2 with all other anthropogenic factors, such as aerosols, other GHGs, land use changes, etc. essentially cancelling one another out).

        To its credit, IPCC did concede that its “level of scientific understanding” of natural forcing factors was “low”.

        It looks like they were “spot on” with that statement, based on the first decade of the 21st century.

        Max

      • Your earlier statement on the period since 1750 is still in strong contradiction with the IPCC report as IPCC attributes essentially all variation up to 1950 to natural causes.

      • I’m in over my head in terms of the specifics involved – but the way that I interpret your answer is that they were quantifying the degree of influence on change in a given time period, not the degree of influence on the climate as a whole.

        Would a climate scientist (who thinks that GW is probably A) say that if the sun burned itself out tomorrow, we could heat the earth by emitting CO2?

    • Not one of the models described in that Real Climate article accurately predicted the lack of warming for the last 10-12 years. And it is interesting to see that the practice of advocacy by graph is still alive and well in CAGW land.

      In the first figure in the RC article, the line used to show the composite of the model runs is black, and extends beyond the lines for the published temperature series. Block that line out for a moment and the recent lack of warming is clearly in evidence. With that darker line included, the overall optical impression is that everything is going up both before and after 2000.

      That first figure also includes 20 years of hindcasts. Which are of course irrelevant to the issue of whether models can accurately predict future climate. But 20 years of now relatively accurate (through tuning?) hindcasts tacked on to the last ten years of failed predictions visually minimizes the divergence between the models and the temps.

      The fourth graph, re: Hansen 1988, includes trend lines for GISTEMP and HadCRUT3 from 1984 to 2010. The explanation for using that time period for the superimposed trend lines: “The trends for the period 1984 to 2010 (the 1984 date chosen because that is when these projections started).” Right, it couldn’t be because trend lines for the disputed portion of the temperature records would show level or declining temps if shown for the period in question, could it?

      If you ignore the PR portion of those graphs, what they show is that Hansen’s scenario A vastly exaggerated warming for the entire period, scenario B only moderately exaggerated warming for the first 16 years, then significantly exaggerated warming for the period from 2000-2010.

      What is the one scenario of Hansen 1988 that tracks the reported temperature trends best? Scenario C, and scenario C was modeled based on significant reductions of CO2 beginning in 2000. Since we know that no such reductions have occurred, scenario C, while matching the temperature trends, simply proves that the model is wrong because it could only get the temps right by getting the CO2 emissions all wrong. So all three models have been proved wrong, and markedly so.

      But the underlying issue on this thread has been: are there models that are inaccurate on the decadal scale while more accurate over multi-decadal periods? Hansen 1988 is, at very best, an example of the opposite. If Hansen scenario B is considered “accurate” at all, it was for shortly more than a decade, while it diverges more from the temperature trends as it is compared to multi-decadal trends. Since Hansen 1988 covers 26 years, and is clearly wrong for the last 12, it is hardly an example of a model that is correct on a multi-decadal scale.

      The Real Climate article is very good PR though. It uses graphs to optically “hide the divergence” between model predictions and reported temperature. And it ends with the tag line: “So to conclude, global warming continues. Did you really think it wouldn’t?”

      Gotta love chutzpah.

      • Not one of the models described in that Real Climate article accurately predicted the lack of warming for the last 10-12 years.

        That is the point of establishing a Working Group to improve decadal predictions, which are still daunting, whereas most of the models have done a good job with multidecadal predictions of global temperature anomalies.

        Multidecadal model performance has been exhaustively discussed in many threads here and elsewhere, and it’s unlikely that further contention in this thread that is focused on decadal prediction will change our perceptions. I’ve made some suggestions above as to where readers can find informative articles on the subject, and other readers have also cited links that can be visited.

      • How many models have been around long enough to make multidecadal projections? The day my broker starts letting me buy stock at 1980 prices based on my projection the price will increase is the day I will call imitating the past a projection.

      • Fred Moolten

        Multidecadal model performance has been exhaustively discussed in many threads here and elsewhere, and it’s unlikely that further contention in this thread that is focused on decadal prediction will change our perceptions.

        Aw c’mon, Fred. Get serious.

        Hansen’s “multi-decadal” projections made back in 1988 were dismal failures, showing gross exaggeration of warming rates, as GaryM has pointed out. Since Hansen is still using the same models, it is clear that his projections for the future are worthless, as well.

        The myopic (IPCC and Hansen) fixation on anthropogenic GHGs alone (viz. CO2) is coming home to roost. That simply is not the way our planet’s climate works in the real world.

        Max

      • Current models are better than those of earlier decades. Readers should review upthread links, past threads, IPCC Chapter 8, and other sources to draw their own conclusions. I doubt this will be a useful discussion unless it introduces some new data. Otherwise, I will be happy to leave the last word to someone else who wants to claim it.

      • Fred Moolten

        Thanks for the tip, Fred. I have read all the information you suggest and am continuing to read new information as it is published.

        It is clear to me that IPCC AR4 was already hopelessly outdated when it hit the press in mid 2007.

        It is even more so today, four years later.

        And the new information that is coming out largely points to a much less sensitive climate and a much smaller impact from CO2 than was previously assumed by the IPCC models, with natural forcing factors playing a much more important role.

        Max

      • “Current models are better than those of earlier decades.”

        Oh really? How exactly would you know that? Crystal ball that lets you see that their 20 year performance is better than Hansen’s dismal failure?

        Hansen was as confident in his earlier prediction as you are now of his (and others’) later predictions, and for exactly the same reasons. Which is to say, none.

        Faith-based climatology recapitulates the experience of the Millerites in the 19th century. The old prediction didn’t pan out? The new prediction is sooooooo much better…. trust us.

      • Why are they better?
        Because they cost more?
        Certainly not because they inform us of reality any better.

      • “I’ve noticed that some readers have been dubious about the notion that model accuracy for global temperature trends could be greater at multidecadal intervals than over the shorter interval of a decade. Empirically in regard to global temperature anomalies, this has proven to be the case for both hindcasts and to date the only model for which forecasts beyond a decade are comparable with observations – Hansen et al 1988 – see Model Updates .”

        I would not have commented at all if you had not linked to an article that directly contradicts the point you have made repeatedly. I really didn’t know whether you had anything that actually supported your claim that it is generally accepted that models are better at multi-decadal forecasts than at short time spans. You wrote with such certainty I thought I might have missed something. But don’t worry, having seen what you consider evidence for that specious claim, when you make the same false claim again in the future, I will ignore it.

        Some people are just impervious to facts.

    • Fred,
      If the climate was a population of people, we could make some great statistical projections that work because we know everyone dies eventually.
      Climate is not close-ended. There is no reason to believe that long-distance projections will be any more useful than shorter-term predictions.

      • hunter

        “There is no reason to believe that long-distance projections will be any more useful than shorter-term predictions.”

        In fact, the very opposite is true.

        The longer the prediction period, the greater the likelihood that an unconsidered (or even unimagined) outlier (or black swan) will kick the whole prediction in the head.

        That’s when one hears rationalizations, such as:

        “My prediction was correct, except for…”

        The “except for” today is “decadal natural variability”.

        Who knows? In another 10 years it may be “multi-decadal natural variability”.

        Max

      • Consider Fred’s example of the stock market. People who invest on a short-term time scale are taking a much greater risk than those who invest on a long-term time scale. The longer the prediction period, the lesser the likelihood that the unconsidered (or even the unimagined) outlier will kick the whole prediction in the head.

        It seems to me that trying to abstract absolute rules, either way, that apply to the relationship between short- and long-term trends in climate is overreaching. Short-term deviations could multiply in effect, or they could balance out. Don’t data tell us that long term trends have prevailed over short term contrasting deviations?

      • Yes, almost by definition. That’s how they got detected and labelled, after all!

      • That’s a lot like “I’d have gotten away with it if it weren’t for those…”

      • Yes, Bernie Madoff’s famous philosophy.

      • Hunter – the evidence for better accuracy of multidecadal than decadal global temperature modeling is empirical – the data show that to be the case. We can’t extend that to centennial trends because we don’t have quantitative data. This thread is not about that topic, however, and recent comments illustrate, to me at least, the futility of persisting on the subject, because every statement in the comments has been made many times in the past, regardless of the viewpoint of the individual making it.

      • Fred Moolten

        Sorry to pick on you, Fred, but I will have to correct you as long as you make statements such as:

        the evidence for better accuracy of multidecadal than decadal global temperature modeling is empirical

        I have seen no empirical evidence that this is true. If you have such empirical evidence, please supply it.

        I would suggest, on the contrary, that a 1% inaccuracy in slope of a 10-year projection will, by definition, show less deviation than the same inaccuracy will in a 50-year or 100-year projection.

        Wouldn’t you agree?

        Max

      • See above comments and links for empirical evidence that models perform better at multidecadal than decadal intervals.

      • But it is meaningless for the purpose of projecting a climate catastrophe.
        http://www.geocraft.com/WVFossils/last_400k_yrs.html

      • Fred,
        I think you make stronger statements than either the modelers themselves or the IPCC. As long as it’s not possible to determine whether the climate sensitivity is less than 2 or more than 4, it’s not possible either to claim that the models are empirically confirmed.

        It’s certainly logically possible, and perhaps also plausible, that models are better at forecasting indicators for periods of 50-100 years than decadal effects, which are strongly dependent on initial conditions, but claiming that this has been proven is against my interpretation of what the accepted mainstream position is.

        What is certain is that empirical data accumulates faster the shorter the term over which models are tested and the more regional detail the models describe. This is one strong reason for the attempts to develop decadal regional models. Another, even stronger reason is the value of regional forecasts for developing adaptation strategies.

        As long as the empirical evidence on the validity of climate projections is weak, decision making remains very much dependent on improving the capabilities for acting wisely under large uncertainties. Climate science can provide only one rather unsatisfactory input to that process.

      • Rob Starkey

        Pekka – I generally agree with your summary.

        Wouldn’t you agree that reliably determining what weather/climate is due to natural vs. man-made conditions at a regional level is far from possible at this time, and that any conclusions in that regard should be examined closely before being accepted as accurate?

      • Rob,
        I think that the IPCC conclusion is justified: reproducing the warming since the 1970s without a significant anthropogenic contribution is very difficult, but that’s about as much as can be said with reasonable certainty. Now, with the present knowledge of the period since 2000, it seems equally difficult to reach agreement without a sizable contribution from natural variability over the same period.

        The fast temperature rise and the subsequent flattening are too much for either explanation alone, but telling whether it is 50-50, 70-30 or 30-70 seems to go beyond what the observations can tell – or the present models. For better determination we would need good decadal models.

      • I should have read my message before posting. The first sentence should tell that the IPCC conclusion is justified.

      • Considering that most of said warming may be artifacts of station exclusion and comprehensive fiddles (retrospective lowering of earlier measurements, etc.), that’s a quakey quagmire on which to build.

      • Pekka – I haven’t claimed that models are “empirically confirmed”, and I think it’s understood that all models are imperfect and certain to remain so. However, I interpret the evidence to date to show their greater accuracy at multidecadal than decadal intervals in terms of percent deviation from observations – see the links and references in my comments, including the GISS curve – and this item has been discussed enough for those who disagree to review the evidence for themselves for a final judgment.

        A point that deserves some emphasis in my view is that future climates with more or less natural variability than the recent past will not radically alter conclusions that anthropogenic greenhouse gases contribute a substantial warming influence. The current climate sensitivity range of 2 to 4.5 C for a 90-95% CI is fairly broad, with more uncertainty at the high than the low end, but it is not infinitely broad. Greater future natural variability will not change this, but rather change the proportion of future temperature change attributable to anthropogenic vs. natural variation. This is because the role of GHGs is not calculated by default – it is not what is “left over” after other factors are calculated, but rather based on a combination of physical principles and observational confirmation specific to the GHGs and their feedbacks. There is room for future modification, and future disagreements, but they should be based on those sources. I hesitate to bring this point up, because it may trigger a repeat of many previous discussions of climate sensitivity, but I believe interested readers should revisit those discussions to understand why the 2-4.5 C range has been estimated with confidence despite some outliers in sensitivity estimates beyond that range. This is one topic where quoting selected sources can only be misleading.

        I believe that the IPCC concluded that “most” of the warming since the 1950’s was due to anthropogenic greenhouse gas emissions, and I think their evidence supports that conclusion, although they may have underestimated the role of black carbon aerosols. These have both natural and anthropogenic origins, but increases have been mainly anthropogenic. The IPCC attribution was also a subject of probably three separate posts on this blog, and I believe it is too big a topic to profit by a brief rediscussion here unless new evidence is provided.

      • Fred,
        My view is roughly that I accept IPCC conclusions, when all reservations and caveats stated by IPCC are taken seriously. I am not convinced by people who claim that IPCC is underestimating the risks or excessively conservative when stating uncertainties.

        This makes it very difficult to determine how drastic the reductions in CO2 emissions should be. On the other hand, there is quite a lot of truth in the conclusions of Roger Pielke Jr. when he writes in “The Climate Fix” that political realities ultimately determine what is realistic, and that there is good reason to aim at the maximum that is politically possible. This is justified both to reduce the risks of climate change and to reduce dependence on dwindling fossil energy sources.

        I agree also with William Nordhaus who writes in his book “A Question of Balance”

        Neither extreme – either do nothing or stop global warming in its tracks – is a sensible course of action.

        The big problem is that present knowledge is in many different ways insufficient for determining the wise compromise between these insensible extremes.

      • Neither extreme – either do nothing or stop global warming in its tracks – is a sensible course of action.

        Which global warming?

        http://bit.ly/fELbNL

        The 0.2 deg C warming in 10 years of the IPCC didn’t materialize.

        How can you do something to global warming when it has disappeared?

      • Fred,
        You seem more patient than most true believers, and much better mannered, and that is greatly appreciated.
        Look, you can repeat yourself as patiently and kindly as you like, but that will not make your basic premise any less wrong.

    • Willis Eschenbach

      Fred, you say:

      Fred Moolten | April 8, 2011 at 12:18 pm

      I’ve noticed that some readers have been dubious about the notion that model accuracy for global temperature trends could be greater at multidecadal intervals than over the shorter interval of a decade. Empirically in regard to global temperature anomalies, this has proven to be the case for both hindcasts and to date the only model for which forecasts beyond a decade are comparable with observations – Hansen et al 1988 – see Model Updates . This is an attribute of systems in which short term fluctuations can mask long term trends.

      Hansen’s 1988 model actually worked pretty well for one decade, from 1988 to 1998. Since then, the fit with observations has grown worse with each succeeding year.

      This is the opposite of what both you and the modelers claim – that accuracy will improve over time.

      Now, it’s possible that this is a temporary situation, and that sometime in the future the GISS results will be accurate. And if that time ever comes, you can claim that it supports your (and the modelers’) idea about the models.

      But until then, it’s not a data point that supports your argument.

      w.

      • Although my examples were intended to include the numerous models whose results to date are primarily hindcasted, I believe the Hansen et al forecast model also illustrates the point, but that depends on how you divide up the intervals and/or smooth the data. If the intervals are 1984-1993, 1994-2003, and 2004-2011, and you compute standard errors for the slopes of the modeled and observed data, the accuracy of the model for the entire interval appears to exceed that for the averages of the individual intervals. A different choice of intervals might yield a different result.

      • Willis Eschenbach

        Fred, if you could post the data from which you are drawing your conclusions it would allow us to discuss this question.

        Many thanks,

        w.

      • For the three intervals I cited, I tried to read numbers off the graph in the RC link, and that might not be accurate. It’s an interesting question, though, how to compare the average performance over the three intervals with the decadal performances (actually only 7 years for the most recent interval).

        The deviation between the model slopes and Hadcrut slope is as follows:

        A. Multidecadal: 0.27 – 0.18 = 0.09 C
        B. Earliest interval: 0.48 – 0.18 = 0.30 C
        C. Middle interval 0.20 – 0.18 = 0.02 C
        D. Latest interval 0.42 – 0.18 = 0.24 C

        Clearly, two of the intervals (B and D) did worse than the multidecadal performance (the slope differences on the right hand side are greater than 0.09), and one did better (C, which differed from Hadcrut by less than 0.09). This is very qualitative and not very meaningful for such a small sample. To attempt better quantitation, I used the 0.09 C overall deviation as a baseline, subtracted that 0.09 from the right hand numbers above for B, C, and D, calculated mean square values for the intervals that did worse, and subtracted the mean square for the interval (C) that did better. The result was 0.0333 minus 0.0049 = 0.0284. This is a positive number, suggesting that worse performance outweighed better performance. Not very conclusive, and there may be more sophisticated methods using the residuals for each individual temperature measurement (I didn’t try to read each of these from the graph), but the result is consistent with better multidecadal accuracy, and at least does not contradict it.

        I’d welcome suggestions for doing this more quantitatively.
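        For what it’s worth, the calculation above is easy to reproduce and tweak. A minimal sketch in Python (the slopes are the approximate values read off the RC graph, so treat the output as rough):

```python
# Reproducing the interval comparison described above. Slopes are the
# approximate values read off the RC graph (deg C per decade), not exact data.
hadcrut_slope = 0.18                       # long-term observed trend
multidecadal_dev = 0.27 - hadcrut_slope    # model minus obs, full period (0.09)

# Model slopes for intervals B, C, D (1984-1993, 1994-2003, 2004-2011)
interval_slopes = [0.48, 0.20, 0.42]

# Excess deviation of each interval relative to the multidecadal baseline
excess = [s - hadcrut_slope - multidecadal_dev for s in interval_slopes]

worse = [e for e in excess if e > 0]       # intervals B and D
better = [e for e in excess if e <= 0]     # interval C

mean_sq_worse = sum(e ** 2 for e in worse) / len(worse)     # ~0.0333
mean_sq_better = sum(e ** 2 for e in better) / len(better)  # ~0.0049
print(mean_sq_worse - mean_sq_better)      # positive, ~0.0284
```

        With the actual per-measurement residuals in hand, the same comparison could be run on arrays of residuals rather than single slopes.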

      • Looking at the graph, I realize I should have divided the Hadcrut line into individual intervals to match against the modeled intervals, rather than simply matching the latter against the long term Hadcrut regression line. However, eyeballing the graph shows that this would have slightly magnified the effect I noted rather than diminishing it.

  37. Judith

    Thanks for posting an interesting new paper.

    This paper has been due from the minute, two years ago, that Vicky Pope, head of climate science advice at the UK Met Office, told us the observed lack of warming after 2000 was caused by “natural variability”.

    This sounded like a casual shrug-off at first, but it opened a can of worms, raising the question:

    If “natural variability” (a.k.a. “natural forcing” in its time-extended version) was strong enough to overwhelm record increases in CO2, which were estimated by the models to result in 0.2C per decade warming, how can it be that all natural forcing factors from 1750 to 2005 have played an essentially insignificant role, as IPCC has estimated (AR4)?

    Solomon et al. are now giving the climate “insider” community’s initial response to that question.

    I’m still going through the whole paper, but it looks like the authors are sticking with the (new) party line that overwhelming natural variability operates only on a decadal time frame (rather than a multi-decadal or – oh horror! – even centennial time frame), but with this concession they have stepped onto a slippery slope.

    Once they have conceded that NV can have an overwhelming impact on our climate on decadal time scales, it is only a matter of time until the time scales start getting stretched and we begin to talk about long-term natural forcing and the IPCC premise that this is insignificant becomes challenged.

    Max

  38. “Most notably, the eastern equatorial Pacific shows cooling in HadISST and warming in HadSST2 and ERSSTv3 (see also Vecchi et al. 2008). However, independently measured but related variables, such as nighttime marine air temperatures, provide some evidence that the eastern Pacific trends represented in the HadSST2 and ERSSTv3 datasets may be the more realistic ones (Deser et al. 2010b).”
    I wonder if there was also ‘some evidence’ that the trends represented in HadISST ‘may be the more realistic one’ …
    Wouldn’t it make more sense to investigate why this difference exists, and how come, rather than plump for the datasets that show warming?

    • Nebuchadnezzar

      There aren’t a whole lot of observations in the early twentieth century in the tropical Pacific (or in the mid 20th century come to think of it) so working out what happened is tricky. There are a bunch of projects working to get ship log books out of paper archives and into digital ones which will help fill some of those gaps and pin down the reconstructions.

      e.g. http://www.oldweather.org/ which is digitising first world war ship logbooks.

      Different data sets give different results for the trend in the tropical Pacific, but the trend represents only a small fraction of the total variance – it’s measured in tenths of a degree whereas the El Nino La Nina variations are up to a few degrees. If a particular reconstruction method misses an El Nino event early in the record it could easily get the wrong trend.

      ERSST and HadISST make particular statistical assumptions to fill gaps in the data. The ERSST group, I think, publish some kind of uncertainty analysis with their reconstruction. That ought to give some information about how reliable they think the early reconstructions are.

      One reason for suspecting that ERSST is more reliable is that in filling the gaps in the data it makes use of more local information and has some kind of temporal persistence, whereas HadISST uses large scale patterns to fill the gaps and doesn’t use information from later and earlier times to constrain the reconstruction. In some months, the tropical pacific reconstruction in HadISST might depend on only a tiny handful of observations from a geographically distant region. Contrariwise this could be a strength of HadISST because it can infer an El Nino or La Nina event from fewer more scattered observations.

      HadSST2 isn’t infilled at all so what you are seeing there is a trend fitted to the actual data with very little statistical jiggery-pokery although one has to be careful of data gaps.

      If I recall correctly there are a couple of papers looking at this problem. I think Deser et al. is one, and there was another by Karnauskas et al. in the Journal of Climate (the ref is in Deser et al.).

      Deser et al. is here:
      http://acacia.ucar.edu/cas/cdeser/Docs/deser.ssttrends.grl10.pdf

      It considers other physical changes that would accompany a change in the equatorial Pacific SSTs.

      Deser et al. point out that all SST analyses agree that most of the sea surface warmed during the twentieth century (and, globally-averaged, they all warm), it’s only in part of the tropical Pacific that they disagree. I’m not sure how Prof Curry thinks that casts doubt on the IPCC statement that warming of the climate system is unequivocal, since all the SST data sets say (unequivocally) that the world has warmed (as well as air temperature over land and oceans etc..)

  39. “Given that over the course of the next 10–30 years the magnitude of natural decadal variations may rival that of anthropogenically forced climate change on regional scales, …”

    WTF? What about the last 10-30 years? Or the last 30-100 years? Or the last 100-1,000 years? What is the justification for the assumption that the magnitude of natural decadal variation is increasing?

    Given that the alleged ‘anthropogenically forced climate change’ is supposed to be both dominant and increasing at an alarming rate, the relative magnitude of the natural variation vs that alleged forcing should be decreasing.

    Are there any scientists doing science these days?

    • WTF? What about the last 10-30 years? Or the last 30-100 years? Or the last 100-1,000 years? What is the justification for the assumption that the magnitude of natural decadal variation is increasing?

      No one stated natural variability would be increasing. It would just be operating in the opposite direction from the one it has taken during the last 30 years (the AMO and PDO have both been/turned positive).
      But “what about the last 100-1,000 years” is the right question, because we do not know how strong the variability might be on such timescales. A warm MWP would suggest it might be significant.

      Given that the alleged ‘anthropogenically forced climate change’ is supposed to be both dominant and increasing at an alarming rate, the relative magnitude of the natural variation vs that alleged forcing should be decreasing.

      Yeah, supposed to. But what if it isn’t? What if the majority of the temperature rise during the last 30 years is due to natural variability and not man-made CO2 emissions, and the climate sensitivity is much less than the IPCC has concluded? It is certainly plausible that the effect of humans is so small as to be nearly indistinguishable from the noise.

      Are there any scientists doing science these days?


  40. Separating natural and anthropogenically-forced decadal climate variability

    As the IPCC projection of 0.2 deg C per decade warming has been found (for about a decade) to be completely wrong, as shown in the following graph, we can reasonably separate the climate variability into 100% natural and 0% man-made.

    IPCC projections (GREEN) compared to observation (RED): http://bit.ly/ijZ6Ub

  41. It seems, based on what I’ve read here and on other threads, that weather can be predicted with reasonable skill out to about 2 weeks and that climate can be projected at the multi-decadal scale, albeit with significant loss of resolution. Now it seems, the intermediate scale is receiving some attention.

    From a very high level, it seems like there is something that introduces a “wobble” such that past that 2 week period, you can project the (very) general vicinity of where you’ll end up, but without much accuracy around the path you took to get there. The paper discussed initialization issues, but I’d think those would hurt the short term more. Any thoughts on how else they might attack the issue?

  42. Question: How many times does the Solomon et al. paper use the word “decadal”?

    Answer: 52 times (just to make sure we didn’t get the wrong impression that natural variability might have an effect on “multi-decadal” or even “centennial” climate trends).

    Max

    • And we will run up against a shifting definition of “multi-decadal”

      Is twenty years a “multi-decadal” period? Since it is two successive decadal periods, I suggest it is. But if the current period of non-warming extends another 5 years, the AGW definition of “multi-decadal” will have to shift to 40 or 50 years to maintain the devil-CO2 narrative. Watch this space …

      We also need a reliable inventory of natural variability factors. Obviously Nino/Nina/PDO etc. are candidates, but so also are volcanoes and large submarine tectonic shifts (since they may alter deep-ocean current patterns) – relatively rare as these phenomena are, they are not un-natural.

      I am also puzzled by the implicit assumption that Homo sapiens and its activities are un-natural. We are just as much a product of opportunistic evolutionary trajectories as any other species, so it follows that if we think we can terraform (and perhaps we may eventually be able to), this is not un-natural. This is a bridge too far for the moralists, however.

  43. Natural climate modulation is a function of the oceans’ heat capacity through interaction with the atmosphere. To understand climate it is essential to know what drives the long-term oceanic indices: PDO, ENSO, AMO, etc.
    The answer is in data, and the data is here:
    http://www.vukcevic.talktalk.net/PDO-ENSO-AMO.htm
    Climate modeling (climate crystal-ball sub-science) is a waste of time: neither of the above can be extrapolated forward, thus there is no predictive value in incorporating the PDO, AMO, etc.

    • Paul Vaughan

      Ask any northern sea-kayaker if the ocean surface is cooler in winter than in summer. The answer is well-known due to respect for a phenomenon known as death by hypothermia. Ocean heat capacity may be small compared to atmospheric, but it is swamped by something so simple as annual & semi-annual insolation/circulation cycles. I refer readers to Le Mouël, Blanter, Shnirman, & Courtillot (2010) [a paper presenting a fundamentally seminal, real-data-based observation] for insight.

  44. The most reliable future prediction of variable data is a straight line regression through all data. No attempt to do anything more complicated than that has proven more reliable in the long term. I think the NASA paleoclimatologists have it right when they say there is a 50% chance that climate change is anthropogenic.

    The biggest travesty of the climate establishment is making long term predictions based on 30 years of temperature data. Unscientific nonsense. Geology and paleoclimatology have billions of years of data.

    • >Geology and paleoclimatology have billions of years of data.<

      The CSIRO in Aus (a leading scientific establishment and shrill AGW advocate) has already declared such data to be irrelevant :)

    • You have to be kidding. Extrapolation is guaranteed to get you into trouble.

      You can’t go wrong buying real estate. Right?

      • Midwestern farm ground values are up 367% (from a personal farm sale): bought for $87,000 in September 1987, sold for $320,000 in December 2010.
        It is just the suburban overpriced housing bubble that popped.

        Just like the IPCC UEA bubble has popped, but the climate is still varying naturally.

  45. Gary Mirada

    Judy

    Not commented for a while – been studying at Basil Fawlty’s School of the Bleeding Obvious. Lesson one on climate change was enlightening: if we cannot understand natural variability, we cannot understand unnatural variability. I think Willis pointed this out somewhat more prosaically up-thread.
    Your conversion to true scepticism is encouraging.

    Kind regards

    Gary

  46. BlueIce2HotSea

    Natural cycles have been included in past global warming estimates. Check out this graph on page 3 (Science, 1975): Climatic Change: Are We on the Brink of a Pronounced Global Warming? It gives an estimate of .6C for the increase in global avg. temp from 1900 through year 2000. Not bad for a 1975 estimate.

    It assumed that a natural cooling cycle began in 1940 and would reverse around 1985. The reversal in combination with elevated CO2 would cause dramatic temperature increases. But the massive .4C increase estimated for the 1st decade of 21st century did not happen. An inescapable possibility, assuming the paper is sound, is that the natural warming cycle turned much colder, but was held in check by increasing GHG.

    Finishing off this speculative line of thinking: without AGW, the current global average temp. could be 1C lower, circa the early 1800s. Springtime temperatures in the Northern plains would be MUCH colder, with much shorter growing seasons due to cold, late springs – a devastating problem for Canadian wheat.

    • “It gives an estimate of .6C for the increase in global avg. temp from 1900 through year 2000. Not bad for a 1975 estimate.”

      Well, it was a good estimate, but they did have a bit of a “head start”.

      From around 1910 to 1944 there had been a strong warming cycle, with a linear warming over the period of 0.53C (HadCRUT and Delworth & Knutson 2000). This followed the tail end of a cooling trend, with a bit less than 0.1C cooling from 1901 to 1910, and was followed by a very slight cooling trend of only a few hundredths of a degree C from 1945 to 1975.

      So they already had around 0.45C of the 0.6C “in the pocket” for the first three-quarters of the century.

      Guessing that it would re-start warming again in 1975 was a good call (since it had not warmed since 1945), but getting to 0.6C for the century was pretty easy after that.

      Max

      • BlueIce2HotSea

        manacker

        What I like best about Broecker’s treatment of projected temperature trend due to GHG is the inclusion of the estimate for natural variability on the graph.

        What are we to make of the current practice of avoiding overt acknowledgement of natural variability on trend graphs… that science is confident that there is NO NATURAL VARIABILITY? Or that it has not been considered? More likely, natural variability is a can of worms best left unopened – that is, if one is inclined toward overstating certainty…

  47. General observation on this topic:

    I don’t think this blog has the statistical heavy hitters that Climate Audit does, and we seem to be stumbling in the dark on this issue compared to them. Those guys could probably do this topic a lot more justice, because it really does come down to statistics and data analysis. I know just enough about analysis to be dangerous, and have learned to keep my mouth shut and my ears open over there.

    • Actually some heavy stats hitters showed up here in the early days of this blog, haven’t seen them here lately.

  48. “I am also puzzled by the implicit assumption that homo sapiens and its’ activities are un-natural. We are just as much a product of opportunistic evolutionary trajectories as any other species, so it follows that if we think we can terraform (and perhaps we may be able to eventually), this is not un-natural. This is a bridge too far for the moralists, however.”

    It’s a bridge too far for me, I know that. Am I a moralist? If that means someone concerned with man’s impact on the environment, then I suppose I am. Murder is “natural.” Rape is “natural.” Obviously this does not excuse those who commit these crimes, just as it would not excuse someone for dumping toxic waste into the sea.

    In addition to being an environmentalist, I’m an AGW skeptic. The two things are in no way mutually exclusive.

    • >In addition to being an environmentalist, I’m an AGW skeptic. The two things are in no way mutually exclusive<

      Nor did my post suggest they are

      My point, made again, is that homo sapiens is an evolved species and consequently as natural as any other species, living or extinct

      As for "morals", I regard them as constructed but conflicting values. The Grey Nurse shark gives birth to one (generally) or at the most two pups – the remainder are eaten by the survivors whilst still developing in the womb. It's quite hard to find a more morally repugnant (to us) example of horrifying behaviour, yet this is precisely how this species has evolved. Homo sapiens is a very aggressive species, yet some natural behavioural traits are labelled as "immoral", rather than the simple truth of being utterly destructive of the tribe. Unconstrained murder is not "immoral", but it is completely destructive of communal society, so it is severely proscribed

      Pouring large volumes of HCl into common waterways is similarly destructive and is similarly proscribed. The desire to dump highly toxic waste as cheaply and quickly as possible is neither unnatural nor immoral, but doing it is destructive of the commune. Moralists have a lot of problems with evolutionary concepts, but as John Cleese remarked in Life of Brian: "It's symptomatic of their struggle with reality"

  49. Not to beat this topic to death, but it remains to be said that we have a very specific and falsifiable example of why multi-decadal temperature predictions are simply impossible to make, now or ever.

    The argument is made that climate cycles and “forcings” that are chaotic in the short run, making predictions impossible, average out over the long run, making “multi-decadal” predictions more reliable. Well, in the very long run, we know that the Holocene is overdue to flip back to the old freezerino, so where are the models predicting just when? OK, let’s be more “reasonable” and look at the last couple of thousand years – there have been warm periods and cold periods, especially those pesky “little ice ages” so tantalizingly associated with sunspot minima.

    The sunspot cycle is an excellent example of one of those natural periodic cycles that supposedly always average out over time. Whoops – the sunspots are, like all natural cycles, QUASI-periodic. Up until a couple of years ago, any GCM that incorporated solar forcing from sunspots (ignored at their peril, IMHO) would have predicted that 20 or 30 or 50 years from now would be just about like the sunspot cycles have been for the last, well, at least 100 years.

    Today we are face to face with a dramatically reduced (and unpredicted) sunspot cycle. How long will it last? Nobody knows. How deep will it go? Nobody knows. What we do know based on history, is that if it persists, things will get colder. Will it last two years? 20 years? 50 or 100 years? Nobody knows. So how is it possible to construct a GCM that accurately predicts the climate 20-100 years from now? It’s not. If and when the cold sets in, it will interact chaotically with all other climate cycles which in themselves are poorly understood. But since the period of persistence of the minimum is not predictable, then the climate outcomes in the “multi-decadal” range are also not predictable.

    This is only an example and thought experiment. All of the climate “cycles” are actually QUASI-cycles, riding on top of smaller quasi-cycles and so on down to the core of quantum mechanics, affected by everything from the orbit of the sun through the galaxy to the interaction of the planetary orbits around the solar system center of mass, to the mixing of ocean water at the source of the great currents, each affecting and interacting with the others in ways large and small.

    With all this chaos (and the negative examples of Mars and Venus) there is only one thing we know for sure about earth’s climate: for the last few billion years, there has not been one – not even one – single day when the climate was not endurable by life forms. Oh sure, millions of very bad days, but somehow the climate always bounced back. There is only one thing that I can see that makes this possible – all that water in the oceans and its attendant hydrologic cycles. Asteroid strikes, continental drift, thousands of years of volcanism like the Deccan Traps – the worst that the chaotic universe could throw at it failed to fundamentally disrupt the hydrological cycle and end life. We are all proof of that fundamental truth.

    If we want to understand more about what truly regulates this magnificent climatic system, we need to know a great deal more about the hydrologic cycle – where the clouds come from and how they behave for example. And to accept that in the long run, what we do or don’t do (or predict) will have little effect on the actual course of climate, no matter how well intentioned or passionate. The ancient Mayan priests convinced their general public that they had offended the weather gods, and they offered up their best virgins to be thrown in the volcano as appeasement. Are we not doing the same today?

    • Marcopanama –
      The ancient Mayan priests convinced their general public that they had offended the weather gods, and they offered up their best virgins to be thrown in the volcano as appeasement. Are we not doing the same today?

      You noticed, of course, that those Mayan priests didn’t offer what would have been the most effective sacrifices – their own lives. Instead they picked on “innocent” virgins.

      There’s definitely a valid analogy here. Has anyone else heard that, in the face of rising sea levels, Al Gore has bought a “cottage by the sea” in California?

  50. The current trick used by AGW advocates is to show the following graph and claim the world is still warming at 0.17 deg C per decade.

    http://bit.ly/g5fqcl

    However, if you divide the two decades into two periods, the following is the picture you get.

    http://bit.ly/fWxIYn

    This graph shows a huge global warming of 0.34 deg C in the period from 1992 to 2002 and a slight cooling since 2002. IMHO, to pinch a global warming rate from a previous decade and give it to the recent decade to “hide the decline” is not fair.
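    The effect described above – one trend line fitted across two decades hiding a warming decade followed by a flat one – is easy to demonstrate. A minimal sketch with a synthetic series (illustrative numbers only, not actual observational data):

```python
import numpy as np

# Illustrative only: a synthetic anomaly series that warms from 1992 to 2002
# and then flattens, roughly mimicking the shape described above. The numbers
# are made up for the sketch; this is not actual HadCRUT data.
rng = np.random.default_rng(0)
years = np.arange(1992, 2012)
signal = np.where(years <= 2002, 0.034 * (years - 1992), 0.34)
temp = signal + rng.normal(0.0, 0.03, years.size)   # add weather noise

def trend_per_decade(x, y):
    """Least-squares slope, expressed in deg C per decade."""
    return 10.0 * np.polyfit(x, y, 1)[0]

t_full = trend_per_decade(years, temp)              # one trend, both decades
t_early = trend_per_decade(years[:11], temp[:11])   # 1992-2002: warming
t_late = trend_per_decade(years[11:], temp[11:])    # 2003-2011: ~flat

# The single full-period trend sits between the two sub-period trends,
# so quoting it alone hides the recent flattening.
print(round(t_full, 2), round(t_early, 2), round(t_late, 2))
```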

    • They claim a current warming of 0.17 deg C per decade, but they don’t allow me to challenge their claim with the above post.

  51. “’Given that over the course of the next 10–30 years the magnitude of natural decadal variations may rival that of anthropogenically forced climate change on regional scales’… I don’t recall the climate establishment “giving” this one before.”

    Judith, on decadal time scales, this is what is expected. Something is wrong with your recall of current climate science. :-(

    And the focus of this particular reanalysis of past research (the physical basis, and analytics) is consistent with what was recognized by the IPCC in AR4 regarding questions still to be addressed in the science and research for AR5. That is to say, it is part of the anticipated research and progress in relation to decadal climate projections — as demanded by the IPCC.

    Is it that you really don’t know what you’re talking about, or is it that you are just pretending (silly!)?

    • on decadal time scales, this is what is expected. Something is wrong with your recall of current climate science

      You might want to give a specific reference for that – not just “AR4” – because it’s not what I’ve been listening to and reading for the last ten years. And I doubt anyone else has seen much of it either.

      Is it that you really don’t know what you’re talking about, or is it that you are just pretending (silly!)?

    • There was a series of threads here on Scenarios: 2010-2030, Parts 1-3, that discussed the difficulties at length in December 2010. Several papers on the difficulties were introduced back then, so I agree we are repeating ourselves a bit here, and this is nothing new.

      • What is new is that this is coming from the climate establishment (including IPCC lead authors), and not just people like Tsonis.

    • Martha, your statement,
      “Judith, on decadal time scales, this is what is expected. Something is wrong with your recall of current climate science,”
      is not only absurd, it is disrespectful. The ‘climate establishment’ has been carrying on for years that it is CO2 that has been the cause of the increasing temperature, and NEVER A WORD was given to ‘natural variability’. Not until recently, when after the past dozen years or so, temperatures have NOT been increasing. Why you can’t see that this is what Dr. Curry was referring to is beyond me.

      And this focus is consistent with what the IPCC AR4 recognized, with questions to be addressed in AR5? I don’t know what the focus for AR5 is, but since the ‘take home’ point of AR4 was that, beyond a doubt, CO2 produced by man was warming the planet, your statement is a joke.

      In my opinion, you owe Dr. Curry a HUGE apology.

      • As always, people hear something, remember something, and claim that nothing else has been said.

        Natural variability has always been known to exist, it has always been admitted to exist, and it has been assumed that everybody understands this fact. It has not been repeated in every sentence, but it has never been denied.

      • ‘That said there is a LOT of nonsense about the PDO. People like CPC are tracking PDO on a monthly basis but it is highly correlated with ENSO. Most of what they are seeing is the change in ENSO not real PDO. It surely isn’t decadal.’

        This shows that Trenberth is not on top of this field – but this was, and still is for many, the ruling paradigm. There was a bit of interannual variation – mostly to do with ENSO. These ENSO ‘oscillations’ were meant to even out and the climb continue.

        The difference is that it surely is multi-decadal and many people are scrambling to incorporate this into their world view – rewriting the past as they go. As they and you are clueless about the origins of multi-decadal variability – the fairy tales now emerging about how it will evolve are utterly worthless.

        I wrote about multidecadal variability in the Pacific as early as 2007 – and only because the IPCC totally missed it – and was placed for my efforts on every source watch list on the planet. To be arraigned in due course I take it for crimes against humanity. I am afraid that you have zero credibility in this – just one more example of pissant tendentiousness.

      • Pekka, your statement is ambiguous, and importantly false on one interpretation. Large-scale natural variability, like the ice ages, geologic periods, etc., has always been known to exist. Dec-Cen natural variability has been strongly resisted. There are two big NAS studies from the late 1990s where it was grudgingly admitted. AGW was based on the assumption of natural equilibrium and still is to a great extent. Its recent acceptance is revolutionary.

      • BlueIce2HotSea

        Check out this graph on page 3 (Science, 1975): Climatic Change: Are We on the Brink of a Pronounced Global Warming?

        Long ago, estimated natural variability was included on the graphs for projected temperature trend due to GHG.

      • Your URL does not bring up the article. The abstract is interesting in that the mid-century cooling was considered by this author to be natural, while the usual AGW argument is that it was due to anthro aerosols. To wit: “If man-made dust is unimportant as a major cause of climatic change, then a strong case can be made that the present cooling trend will, within a decade or so, give way to a pronounced warming induced by carbon dioxide. By analogy with similar events in the past, the natural climatic cooling which, since 1940, has more than compensated for the carbon dioxide effect, will soon bottom out. Once this happens, the exponential rise in the atmospheric carbon dioxide content will tend to become a significant factor and by early in the next century will have driven the mean planetary temperature beyond the limits experienced during the last 1000 years.”

        This makes Wallace Broecker an outlier, as he always has been, not a counterexample. There are in fact a bunch of people who have talked about natural variability, Lamb being the most obvious example. These are mostly skeptics. The fact is that the original climate models were equilibrium models, and the assumption of a stable climate upset by anthro emissions was pervasive. This is a statistical claim, not subject to simple counterexamples.

      • BlueIce2HotSea

        The 1975 graph which includes natural climate trend is at the bottom of Page 3 of the pdf. I repeat the link for those curious but discouraged by David Wojick’s ability to find only the abstract. The article, which immediately follows the abstract on page 2 continues to page 5.
        Climatic Change: Are We on the Brink of a Pronounced Global Warming?

        Broecker’s treatment ought to have been used before the 21st century flat-spot.

      • And what has been published since then (1975)? When in that last 20 years has anyone in the climate establishment even mentioned natural variability?

      • All the time, zillions of times every year.

        As one example, AR4 devotes several sections to natural variability. One of them is on observations, and the summary of that 11-page section starts with the sentence:

        “Decadal variations in teleconnections considerably complicate the interpretation of climate change.”

        Claims that natural variability is not considered are totally counter-factual and plainly silly.

      • I agree with Pekka. Natural variability has always been “mentioned.” But claiming that it is not mentioned is just the wrong claim. The fact is that it has not been seriously considered as part of the explanatory machinery until the last decade or so, and not much even then. Discussing it and then dismissing it, as the IPCC does, is not the same thing as seriously considering it.

        Where are the modeling attempts to derive the entire 20th century temperature profile from natural variation? Where is the model of the MWP-LIA-today natural cycle? That would be serious consideration, as opposed to merely trying to show how natural variation is temporarily masking AGW, which is all we are seeing from science today. This kind of honorable mention is a joke.

      • David,
        There are many reasons for the rather small weight that has been given to natural variability in attributing observed climate variations. I’m sure that in some cases it has been done purposefully to strengthen the anthropogenic signal. I believe, however, that this is a secondary factor, while the main reason is related to the difficulty of estimating its role in situations where it’s not due to easily specified factors like volcanic activity or ENSO.

        When the time dependence of the increased CO2 is known, but the effects of the natural oscillations are unknown, typical statistical methods attribute to AGW as much as the data allow and minimize the role of natural variability. This was particularly pronounced when the period of available data ended around 2000, i.e. before the flattening that we have seen since. The full present data is likely to lead to an increased role for natural variability in the attribution, as we have already seen in many papers by mainstream climatologists.
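Pekka’s point can be illustrated with a toy regression (a hypothetical sketch with made-up numbers, not any method actually used by the IPCC): if a temperature series contains both a forced trend and an unmodeled multidecadal oscillation that happens to be in its rising phase over the fitting window, an ordinary least-squares fit against a linear predictor alone absorbs the oscillation’s contribution into the “forced” slope.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1970, 2001)            # the 1970-2000 fitting window
t = years - years[0]

# All magnitudes below are illustrative assumptions, not estimates.
forced = 0.015 * t                                         # assumed forced trend, deg C/yr
natural = 0.1 * np.sin(2 * np.pi * (years - 1985) / 60)    # 60-yr oscillation, rising phase
temp = forced + natural + rng.normal(0.0, 0.05, t.size)    # plus weather noise

# OLS fit with only a linear (forced-like) predictor; the oscillation is unmodeled
slope, intercept = np.polyfit(t, temp, 1)
print(f"assumed forced trend: 0.015, fitted trend: {slope:.3f} deg C/yr")
```

Because the oscillation rises throughout the window, the fitted slope comes out well above the assumed forced value; fit over a window spanning a full oscillation period and the bias largely disappears.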

      • ‘The global atmospheric circulation has a number of preferred patterns of variability, all of which have expressions in surface climate… Regional climates in different locations may vary out of phase, owing to the action of such ‘teleconnections’, which modulate the location and strength of the storm tracks (Section 3.5.3) and poleward fluxes of heat, moisture and momentum…Understanding the nature of teleconnections and changes in their behaviour is central to understanding regional climate variability and change. Such seasonal and longer time-scale anomalies have direct impacts on humans, as they are often associated with droughts, floods, heat waves and cold waves and other changes that can severely disrupt agriculture, water supply and fisheries, and can modulate air quality, fire risk, energy demand and supply and human health.’ The quote is from the beginning of the AR4 section Pekka refers to.

        Just a simple point on terminology – decadal implies the potential for multi-decadal change, and the terms are used interchangeably, as in Pacific Decadal Oscillation.

        Now if only they had included decadal scale effects on the global surface temperature – Pekka might have a leg to stand on. This was missed entirely by just about everyone. To selectively quote AR4 out of context is well…

        The attribution of the modicum of warming between 1976 and 1998 to CO2 has been falsified. The end periods are examples of the extreme variance typical of Dragon Kings, in the very colourful terminology of Sornette (Swanson et al 2009). Using the intervening period of 1979 to 1996 yields a linear trend increase in SAT of about 0.1 degrees C – a value that is the new baseline for ‘recent warming’.

        I note that Pekka and others are trying on the meme that dynamical complexity is the latest ploy of skeptics to confuse the simple-minded. I am Australian – I will call BS when I see it. Dynamical complexity is rather at the core of a more correct understanding of weather and climate.

        The other aspect of ‘recent warming’ is more problematical. NASA/GISS are telling us that between 1985 and 1998 cloud cover change provided by far the largest change in the global energy balance – http://isccp.giss.nasa.gov/zFD/an2020_SWup_toa.gif.

        Now I know it is the government and it is rocket science – NASA might be wrong – but it is data and it is plausible.

      • Pekka Pirilä

        “Mentioning” natural variability in passing but relegating its longer-term version, natural forcing, to insignificance was one big mistake of AR4, along with the ridiculous estimates of strongly positive cloud feedback with anthropogenic warming.

        Sure, they put in “caveats” for both about large uncertainty and low level of scientific understanding, but the silly projections for 2100 and the “unequivocals and very likelys” (as Dr. Curry has called them) remained.

        One can only hope IPCC has learned a lesson from Climategate, the recent lack of warming, etc., and that this will be evident in AR5. Otherwise it is destined for the trash heap at publication.

        I am personally not optimistic that IPCC has changed its approach, which Dr. Curry has likened to “selling snake oil”.

        It seems that a total change of personnel (management plus insider group of climatologists) may be required first.

        Max

      • I have stated in several comments that dynamical complexity may be important, and I have raised the question of whether any stationary situation will ever form in which a long-term average temperature is well defined. That should be enough evidence that I give weight to these issues.

        While I consider such problems to be important and worth taking seriously, I don’t know how important they are quantitatively, or whether they make the whole concept of average temperature moot. What I have objected to is the certainty (justified by formal arguments) of some people that the dynamical complexity is so dominant that all present modeling activities are of zero value. My view is that model development is an important activity, but that there should be a lot of caution in using the models. I don’t believe that modelers have spent enough effort determining whether the real Earth system has the same stability properties as their models. It may be that not every modeler even knows the stability properties of his own model.

      • Pekka Pirilä

        You wrote:

        When the time dependence of the increased CO2 is known, but the effects of the natural oscillations are unknown, typical statistical methods attribute to AGW as much as the data allows and minimizes the role of natural variability.

        Pekka, this seems like a bit of “circular logic” (on the part of the IPCC) to me, as I am sure you would also acknowledge.

        Was “the time dependence of the increased CO2” [i.e. the time-related temperature impact of increased atmospheric CO2] REALLY “known”?

        Or was it just estimated based on model simulations with assumed sensitivity and “hidden in the pipeline” inputs?

        It was clearly the latter, which makes the second part of your statement even more compelling:

        typical statistical methods attribute to AGW as much as the data allows and minimizes the role of natural variability

        The problem is that this approach has now backfired, in view of the recent lack of warming of both atmosphere and ocean despite CO2 increase to record levels.

        So the Solomon et al. study is really an attempt to salvage the notion of high long-term CO2 climate sensitivity by relegating the natural variability, which overwhelmed it for a decade, to a “transitory” role.

        But, as many have already noted here, this assumption is not substantiated by any empirical data and risks backfiring if the current slight cooling trend continues and becomes a multi-decadal oscillation, similar to the others we have seen since the modern global record started in 1850.

        I think that there is a very good likelihood that this is where we are headed.

        But who knows?

        Max

      • Max,
        What I meant is that we know the history of CO2 emissions and we have some simple models that can be used as guesses of the time dependence of the resulting temperature rise. Over the period 1970-2000 both the actual temperature rise and the result of such a simple model show rapidly rising trends, while there are no external reasons to expect similar behavior over this period for the natural variations. In this situation, it is likely that comparisons with empirical data lead to the conclusion that CO2 dominates as the main reason for the temperature rise.

        That explanation has its problems in years before 1970, but the influence of aerosols provided an explanation or at least an excuse.

        Now we have the years 2000-10 and some other additional knowledge. The earlier fit doesn’t work any more. Many people (including mainstream climate scientists) understood already before this that the period 1970-2000 was not fully representative, but it nevertheless had a reassuring influence on those predisposed to think that AGW was already the most important factor. The present data emphasizes the importance of natural variability as a significant component in the overall development.

        I would not say that the conclusions should be reversed, but I do certainly think that the relative weights of the natural and anthropogenic components have changed in mainstream thinking as well.

      • I wouldn’t hold your breath while waiting for an apology (based on her past performance).

    • No Martha, the issue is this. I am very tolerant of people with an agenda coming to my site to insult me. I allow this since I think it is very illuminating in this debate to see how people on the fringes resort to insult rather than argument.

      The IPCC AR4 predicted decadal scale warmings of 0.2C per decade in the first half of 21st century. More than a decade has passed, and we haven’t seen this warming. The period 1970-2000 is also “decadal time scales.” Why was the warming during that period not attributed to natural internal decadal variations? The climate establishment has been prepared to say that regional variability on decadal time scales is heavily influenced by natural internal variability. But they have not been prepared to say that global average temperature is influenced in a major way by such variability. This paper, in its discussion of AMOC and PDO which have global temperature signals, begins to address this issue. Yes it seems that the AR5 is starting to pay attention to this. The AR4, with its unequivocals and very likelys, did not.

  52. ‘For the next two decades, a warming of about 0.2°C per decade is projected for a range of SRES emission scenarios.’ AR4

    AR4 gaps in decadal prediction? Regional hydrology has always been very inadequate – but you should be more specific or I will suspect you of post hoc rationalisation.

    There are gaps in knowledge identified here – http://www.cgd.ucar.edu/cas/Trenberth/trenberth.papers/DohertyTrenberth_etalBAMS09.pdf – and they include decadal projections.

    But this is informed by the persistence of the decadal climate problem – see for instance:

    ‘This index captures the 1976–1977 “El Niño–Southern Oscillation (ENSO)-like” warming shift of sea surface temperatures (SST) as well as a more recent transition of opposite sign in the 1990s. Utilizing measurements of water vapor, wind speed, precipitation, long-wave radiation, as well as surface observations, our analysis shows evidence of the atmospheric changes in the mid-1990s that accompanied the “ENSO like” interdecadal SST changes.’

    Burgman, R. J., A. C. Clement, C. M. Mitas, J. Chen, and K. Esslinger (2008), Evidence for atmospheric variability over the Pacific on decadal timescales, Geophys. Res. Lett., 35, L01704, doi:10.1029/2007GL031830.

    ‘Our results suggest that global surface temperature may not increase over the next decade, as natural climate variations in the North Atlantic and tropical Pacific temporarily offset the projected anthropogenic warming.’

    N. S. Keenlyside, M. Latif, J. Jungclaus, L. Kornblueh & E. Roeckner (2008), Advancing decadal-scale climate prediction in the North Atlantic sector, Nature Vol 453| 1 May 2008| doi:10.1038/nature06921

    ‘We find that in those cases where the synchronous state was followed by a steady increase in the coupling strength between the indices, the synchronous state was destroyed, after which a new climate state emerged. These shifts are associated with significant changes in global temperature trend and in ENSO variability. The latest such event is known as the great climate shift of the 1970s.’

    Anastasios A. Tsonis, Kyle Swanson, and Sergey Kravtsov (2007), A new dynamical mechanism for major climate shifts, Geophys. Res. Lett., 34, L13705, doi:10.1029/2007GL030288

    ‘If as suggested here, a dynamically driven climate shift has occurred, the duration of similar shifts during the 20th century suggests the new global mean temperature trend may persist for several decades. Of course, it is purely speculative to presume that the global mean temperature will remain near current levels for such an extended period of time. Moreover, we caution that the shifts described here are presumably superimposed upon a long term warming trend due to anthropogenic forcing. However, the nature of these past shifts in climate state suggests the possibility of near constant temperature lasting a decade or more into the future must at least be entertained. The apparent lack of a proximate cause behind the halt in warming post 2001/02 challenges our understanding of the climate system, specifically the physical reasoning and causal links between longer time-scale modes of internal climate variability and the impact of such modes upon global temperature.’

    Swanson, K. L., and A. A. Tsonis (2009), Has the climate recently shifted?, Geophys. Res. Lett., 36, L06711, doi:10.1029/2008GL037022.

    ‘A negative tendency of the predicted PDO phase in the coming decade will enhance the rising trend in surface air-temperature (SAT) over east Asia and over the KOE region, and suppress it along the west coasts of North and South America and over the equatorial Pacific. This suppression will contribute to a slowing down of the global-mean SAT rise.’

    Takashi Mochizuki, Masayoshi Ishii, Masahide Kimoto, Yoshimitsu Chikamoto, Masahiro Watanabe, Toru Nozawa, Takashi T. Sakamoto, Hideo Shiogama, Toshiyuki Awaji, Nozomi Sugiura, Takahiro Toyoda, Sayaka Yasunaka, Hiroaki Tatebe, and Masato Mori (2010), Pacific decadal oscillation hindcasts relevant to near-term climate prediction, PNAS, February 2, 2010, vol. 107, no. 5, doi:10.1073/pnas.0906531107

    I rang Bruce at the Philosophy Dept. of the University of Woolloomooloo in 2003, saying that the PDO has exactly the same periodicity as global surface temperature trajectories and Australian rainfall regimes. There seems to be a connection between north Pacific sea surface temperature, North American fisheries, global warming and Australian rainfall – what gives?

    But the IPCC and you, dear Martha, missed it entirely. Nah nah nah nah.

    You have nil credibility – go away and play with the other rude little kiddies until you have some depth of understanding and know how to play nicely.

  53. batheswithwhales

    The debate has focused on the decadal vs multidecadal vs centennial scales, but equally interesting is the assumption from the abstract that:

    “Abstract. Given that over the course of the next 10–30 years the magnitude of natural decadal variations may rival that of anthropogenically forced climate change on REGIONAL scales”

    I find it entertaining that they insist this is true only on the regional scale, since it obviously is happening on the global scale as we speak (no warming globally for 10 years +).

    It seems they are navigating into a position from where they will be able to explain the lack of warming on a global scale by natural variability on a regional scale.

    And it makes sense for them to do this.

    If they were to explain the lack of warming on a global scale by natural variability on a global scale, the question would emerge: could this also be the cause of the modern warming?

    And this question is something they would rather avoid.

  54. macropanama

    You have hit the nail on the head (April 8, 2011, 10:00pm) when you point out the serious shortcoming of the IPCC GCMs in their myopic fixation on anthropogenic factors almost to the exclusion of anything else. In its latest report, solar forcing was relegated to an insignificant role, and no other natural forcings were even considered (AR4 SPM.2.).

    Today (four years later) there is recognition of naturally forced climate change (hard to ignore after a decade of “unexplained” slight cooling despite CO2 rising to record levels).

    But the title of the report cited by Dr. Curry tells it all: “Distinguishing the Roles of Natural and Anthropogenically Forced Decadal Climate Variability: Implications for Prediction”.

    The recently observed “lack of warming” is acknowledged, but it is relegated to a “decadal” phenomenon, with the word “regional” tossed in occasionally in the report to downgrade it even further.

    “in terms of attribution, to what extent are regional changes in the current climate due to natural climate variations and thus transitory, and to what extent are they due to anthropogenic forcing and thus likely to continue”

    Anthropogenic factors are permanent, while natural factors are transitory?

    The absurdity of this thinking is mind-boggling! Yet the “defenders of the faith” try all sorts of ways to downplay natural climate forcing.

    So an “ambitious effort” is being suggested to enable the GCMs to be able to forecast these transitory natural factors? Hmmm… Lots of luck.

    Step 1 has to be to concede that anthropogenic forcing is likely to be much less important than previously assumed.

    Step 2 is to acknowledge that natural climate forcing is likely to be much more significant than simply the small change in measurable direct solar irradiance (as was previously assumed by IPCC in AR4).

    Step 3 is to get a better understanding of natural climate forcing, including (as you wrote) “of the hydrologic cycle – where the clouds come from and how they behave for example”. Why did we see a reduction in cloud cover and albedo from 1985 to 2000, followed by a reversal? How is this tied in with ENSO, PDO, AMOC, etc.? Is this in any way related to changes in solar activity? If so, how? How about the solar/cosmic-ray/cloud connection? What does the current low level of solar activity mean for us, and how long is this likely to continue? How much of the late 20th century warming cycle can be explained by non-anthropogenic forcing, taking all these factors into account?

    Step 4 is more basic: It involves open acknowledgement that we are unable to make any realistic projections of our planet’s climate with the limited knowledge we have today, therefore we should stop making ridiculous predictions for the future, period.

    Step 5 is even more painful: It involves openly recognizing that we are unable to change our climate, no matter how much money we throw at the “problem” – the only thing we can do is be prepared to adapt to any climate changes that are thrown at us, from whatever source.

    Dr. Curry is apparently optimistic that reports such as the one cited show a first step in the direction of spending more time getting to know natural climate forcing factors.

    I would hope that this is the case.

    Max

  55. batheswithwhales

    As you point out, the Solomon et al. study does emphasize the “regional” and “transitory” nature of natural climate variability (a.k.a. natural forcing).

    As you wrote, the past 10 years have shown that this “regional” phenomenon had “global” repercussions.

    Even before that, the many strong El Niños in the 1990s (culminating with the very strong one in 1997/98) certainly influenced the late 20th century global temperature record.

    Isn’t all weather (and climate) local or regional? And seasonal and diurnal? Isn’t the “globally and annually averaged land and sea surface temperature anomaly” (with all its warts and blemishes) simply an artificial construct of hopefully representative “local or regional” temperatures averaged over the year?

    It would be too painful for Solomon and her colleagues to formally acknowledge that natural climate forcing can have global implications and that these may even have impact over a longer time period, so (for now) they write about “regional” and “transitory” impacts of natural “variability”.

    Max

    • batheswithwhales

      They also state the following:

      “in terms of attribution, to what extent are regional changes in the current climate due to natural climate variations and thus transitory, and to what extent are they due to anthropogenic forcing and thus likely to continue.”

      The word “transitory” is of course interesting. On what timescale? Granted, some decades are warmer than others due to natural forcings, but some centuries are also warmer than others for the same reasons. Even millennia.

      Since the paper focuses on the decadal scale, “transitory” would mean that these changes are also likely to be reversed within the decadal timescale.

      If all natural variability is labeled decadal, regional and therefore transitory, the danger is that natural variability on a longer timescale (multidecadal, centennial) and/or of a wider geographical impact (hemispherical, global) will be attributed to anthropogenic activities.

      Which is not a very realistic assumption.

      It seems like an attempt to “contain” natural variability in such a way as to give the “anthropogenic signal” as much elbow room as possible.

      • Nice point. It sort of prefigures Trenberth’s insidious perversion of the Null.
        ===============

  56. When is the climate establishment going to “give” on the lack of warming in the last decade that they admit in private?


    1) I think we have been too readily explaining the slow changes over past decade as a result of variability–that explanation is wearing thin. I would just suggest, as a backup to your prediction, that you also do some checking on the sulfate issue, just so you might have a quantified explanation in case the prediction is wrong. Otherwise, the Skeptics will be all over us–the world is really cooling, the models are no good, etc. And all this just as the US is about ready to get serious on the issue.
…
We all, and you all in particular, need to be prepared.

    http://bit.ly/eIf8M5


    2) Yeah, it wasn’t so much 1998 and all that that I was concerned about, used to dealing with that, but the possibility that we might be going through a longer – 10 year – period [IT IS 13 YEARS NOW!] of relatively stable temperatures beyond what you might expect from La Nina etc. Speculation, but if I see this as a possibility then others might also. Anyway, I’ll maybe cut the last few points off the filtered curve before I give the talk again as that’s trending down as a result of the end effects and the recent cold-ish years.

    http://bit.ly/ajuqdN


    3) The scientific community would come down on me in no uncertain terms if I said the world had cooled from 1998. OK it has but it is only 7 years [IT IS 13 YEARS NOW!] of data and it isn’t statistically significant.

    http://bit.ly/6qYf9a

    • batheswithwhales

      I haven’t seen these emails before. Thanks for posting. Quite revealing, and these little exchanges must be going on even more feverishly these days.

      How to maintain the pet theory as it slowly unravels before their eyes?

      Of course – find one of the 10 000 model runs that predicted somewhat less warming than the 9 999 others and say: Hey – that’s what the models predicted. We knew! It is all under control! Scientifically!

      Hm…

    • You obviously work for the Koch brothers.
      ;^)

  57. Joe Lalonde

    Judith, Chief, Pekka, Fred and Speed,

    Just finally put another piece of this planetary puzzle in place.

    What is gravity? It IS NOT objects being pulled to the core.
    It is the incredible speed of our planet forward momentum. Quite closely related to throwing a balloon filled with water flattens out but the rotation stops the flattening out but it does hold objects close to a mass, especially since we are under atmospheric pressure.
    This is why we have had a hard time understanding the in mix of magnetics when in actual fact, the forward motion has generated this massive energy.

  58. “Moralists have a lot of problems with evolutionary concepts, but as John Cleese remarked in Life of Brian: “It’s symptomatic of their struggle with reality.”

    Well that’s quite a generalization. As I said, I consider myself a moralist (though I dislike the pious overtones inherent in the word), and I have no trouble with evolutionary concepts. And I doubt I struggle with reality any more than you do.

    In the end, it’s a choice. The natural world is entirely amoral of course, but we can decide as individuals not to be.

  59. BlueIce2HotSea

    There is a rather significant difference between nature lovers and environmentalists. Teddy Roosevelt, the former, created the national park system. The Nazis, the latter, developed a hatred for humanity in part because of concern over the “rape” and “murder” of the environment. They also considered themselves morally superior.

    I am not pointing a finger at you. But sometimes people transform a specific hatred into a general hatred. This can cause problems.

  60. Judith,

    You have no idea how much science falls into place when you replace the notion that our core pulls material with one where the solar system’s forward momentum generates gravity.
    In essence we are like bugs on a windshield enjoying the ride. It also conforms to compression and density on a rotating, pressurized planet.

  61. Solomon et al. acknowledge that natural climate variability can impact regional (and hence average global) temperatures.

    They were more or less forced to do so in view of the “unexplained lack of warming” we have observed in the atmosphere since 2001 (Trenberth’s “travesty”) and in the upper ocean since ARGO measurements replaced the old XBTs in 2003, despite CO2 increase to record levels.

    As several posters have remarked, they have taken great pains to put natural climate variability into the box of “transitory” (rather than long-term), without really having any substantiating evidence for this. (I can think of no other explanation for the fact that the report contains the word “decadal” over 50 times!)

    IPCC AR4 had relegated total “natural forcing” (the longer-term version of “natural variability”) to insignificance from 1750 to 2005 (less than 8% of the anthropogenic forcing from CO2 alone).

    So it was important for Solomon et al. to put the observed “natural climate variability” into a different box from “natural climate forcing”, and the way to achieve this is to simply say that it is “transitory” rather than long-term.

    Let’s see what they come up with if the current lack of warming lasts another decade or two. (I’m sure they’ll think of something).

    Max

  62. A useful perspective on this narrative comes from seeing its place in the climatology record of the past 100 years. A few examples are illustrative.

    One is the GISS Global Temperature Anomaly curve. The graph shows the temperature increase recorded over that 100 year interval. However, it is also clear that one can find more than half a dozen occasions when the line between two black dots ten years apart (or more) shows either no change or a cooling. Nothing about the past ten years is unique; it is rather an example of the up-and-down fluctuations that have accompanied the long-term rise in temperature.

    If one looks at climate models (e.g., from the Hansen et al 1988 data), we also see that control runs without anthropogenic forcing oscillate over the long term rather than exhibiting a flat line. Many of the models incorporate stochastic variations designed to emulate unpredictable bumps and dips due to volcanism, or to anticipate the frequency of ENSO variations. These too show bumpy control runs, although the model bumps and dips rarely coincide with the observed ones, illustrative of the mismatch between models and observations on decadal scales.
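The “bumpy control run” point is easy to reproduce with a toy stand-in. The sketch below uses an AR(1) red-noise series, a common minimal null model for unforced variability (not output from any actual GCM); the series has no trend by construction, yet roughly half of all ten-year windows in it still show a flat or negative fitted trend:

```python
import numpy as np

rng = np.random.default_rng(42)

# Red-noise (AR(1)) stand-in for an unforced control run: zero trend by construction.
n_years, phi, sigma = 1000, 0.7, 0.1
x = np.zeros(n_years)
for i in range(1, n_years):
    x[i] = phi * x[i - 1] + rng.normal(0.0, sigma)

# Fit a linear trend to every 10-year window and count the flat/cooling ones.
window = 10
trends = [np.polyfit(np.arange(window), x[i:i + window], 1)[0]
          for i in range(n_years - window)]
frac_cooling = np.mean(np.array(trends) <= 0)
print(f"fraction of 10-yr windows with zero or negative trend: {frac_cooling:.2f}")
```

A decade of flat temperatures is therefore weak evidence on its own; the substantive question is whether the flat spell is longer or more frequent than such unforced variability can deliver.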

    I’m not sure much of this information in and of itself is news to the climate community, but what seems more recent is a recognition of the need to address decadal variability because of its practical implications.

    One interesting new piece of information (for me), however, appeared recently on Isaac Held’s blog, involving GFDL model runs for the past century forced exclusively with greenhouse gases – i.e., with no input for volcanoes, ENSO, aerosols, etc. – see Climate Responses To Greenhouse Gases. What is striking is the considerable short term bumpiness of the runs simply due to chaotic internal climate fluctuations, with different runs differing in the timing of the fluctuations. All runs, however, yielded fairly similar long term warming trends, as might have been expected from the greenhouse gas input. Interestingly, this input was significant in the first half of the twentieth century, even though with a shallower slope than later. The result is consistent with an early contribution to observed warming involving not only solar brightening but also more GHG forcing than typically estimated, and not too dissimilar from later trends that involved a lesser solar effect, a greater CO2 effect, and cooling influences from anthropogenic aerosols, with further modifying influences from multidecadal PDO and AMO fluctuations. It’s instructive to realize that model simulations, like observational data, are not going to yield smooth lines even when relatively uncomplicated inputs are applied.

    • It is also interesting that the GFDL model runs deviate most from observational data from about 1960 onward, although the slopes are parallel. This presumably reflects the absence from the models of the added variability induced by mid-century anthropogenic aerosol cooling, which declined after the 1980s. Some return of increased aerosol cooling in the past few years has been reported, and may have contributed to the flat curve of the past decade.

    • Fred Moolten

      You’re right. The temperature record shows many “blips”. In addition to the two statistically indistinguishable multi-decadal 20th century warming cycles, there was a slightly less pronounced late 19th century warming cycle, all of around 30 years, with 30-year cycles of slight cooling in between.

      Since the last warming cycle started there have been no cooling “blips” lasting 10 or more years, up until the most recent decade. So this “blip” is unusual in length compared to the others. Whether or not this “blip” will become the start of a new multi-decadal cooling trend, as we have seen in the past, is a point of conjecture.

      So let’s wait and see if IPCC was right (and it starts warming now) or if the studies that tell us we are entering a prolonged period of slight cooling are right.

      Only time will tell, Fred.

      Max

      • Since the last warming cycle started there have been no cooling “blips” lasting 10 or more years, up until the most recent decade.

        http://bit.ly/g6G517

        The only slight decade-long cooling in the last century occurred during the period from 1940 to 1970.

  63. At the recent European Geosciences Union general assembly (held in Vienna last week), a talk was given by Antonis Christofides and Demetris Koutsoyiannis on “Causality in climate and hydrology”. (Meeting session details here)

    The paper includes an interesting discussion on attempts to ascribe causality to natural variation and anthropogenic forcing, and provides a discussion of some of the topics raised on this blog by Tomas Milanovic and others. The paper also includes a quote from Dr Curry!

    The paper does not present anything new or earth shattering, but fairly simple and obvious points about causality in chaotic systems, and a demonstration of how long term trends can be generated without forcing in complex nonlinear systems. This is absolutely fundamental to all of the “interesting” questions in climate science (including this post and discussion), yet remains the elephant in the room that is not discussed in climate science circles (or is discussed, but dismissed with ignorant arguments, as we have seen by advocates on this very blog).

    The paper can be found here: http://itia.ntua.gr/en/docinfo/1130/ (click on “full text”). It is short, and very pertinent to the type of question being asked here.

  64. ‘Evidently this model can generate internal fluctuations in global mean temperature that produce substantial departures from a smooth warming trend, but it does not come close to generating variability comparable to the 20th century trend itself.’

    The internal variations of the model are related to the intrinsic non-linearity of the model, and these are not the same as the intrinsic non-linearity of the climate system.

    ‘Although we may expect a chaotic AOS model to be structurally unstable, it is difficult to explicitly make this determination. The attractor cannot be fully visualized or measured because the phase space has such a high dimension (i.e., high order). Probability distribution functions (PDFs) (Fig. 1) give at least a rough view of an AOS attractor. There are many aspects to the equation set for a model, most notably in the choices of discrete algorithms, parameterizations, and coupling scope, and these are usually not systematically explored in AOS practices. To do so requires formulating multiple models for a given problem. Even systematic scans in the parameter values of a complicated AOS model are rarely published, although parameter variations are commonly made while tuning a model to improve its plausibility. ‘

    ‘AOS models are therefore to be judged by their degree of plausibility, not whether they are correct or best. This perspective extends to the component discrete algorithms, parameterizations, and coupling breadth: There are better or worse choices (some seemingly satisfactory for their purpose or others needing repair) but not correct or best ones. The bases for judging are a priori formulation, representing the relevant natural processes and choosing the discrete algorithms, and a posteriori solution behavior. Plausibility criteria are qualitative and loosely quantitative, because there are many relevant measures of plausibility that cannot all be specified or fit precisely. Results that are clearly discrepant with measurements or between different models provide a valid basis for model rejection or modification, but moderate levels of mismatch or misfit usually cannot disqualify a model. Often, a particular misfit can be tuned away by adjusting some model parameter, but this should not be viewed as certification of model correctness.’

    ‘Therefore, we should expect a degree of irreducible imprecision in quantitative correspondences with nature, even with plausibly formulated models and careful calibration (tuning) to several empirical measures. Where precision is an issue (e.g., in a climate forecast), only simulation ensembles made across systematically designed model families allow an estimate of the level of relevant irreducible imprecision.’

    McWilliams 2007: http://www.pnas.org/content/104/21/8709.full.pdf+html

    ‘Prediction of weather and climate are necessarily uncertain: our observations of weather and climate are uncertain, the models into which we assimilate this data and predict the future are uncertain, and external effects such as volcanoes and anthropogenic greenhouse emissions are also uncertain.’ (Predicting Weather and Climate – Palmer and Hagedorn eds – 2006)

    Uncertainties include solar irradiance (in models, always either constant or varying with the quasi-11-year Schwabe cycle), top-down solar UV forcing (not incorporated into models), clouds and albedo generally (represented as a constant), aerosols and black carbon (and the interactions thereof), cloud ‘feedbacks’ (which way they go in a particular model), how interannual-to-decadal ocean/atmosphere coupling is addressed (inaccurately, as we have limited data, minimal mathematical tools for such analysis and negligible theoretical understanding of its origins), the neglect of factors in longer-term climate change (we don’t understand in detail the origins of the Little Ice Age, the Medieval Warm Period, glacials and interglacials), and the problem of dynamical complexity in climate as opposed to models (they are not the same system at all, and the climate is theoretically determinate but practically incalculable).

    ‘Sensitive dependence and structural instability are humbling twin properties for chaotic dynamical systems, indicating limits about which kinds of questions are theoretically answerable.’ McWilliams 2007.

    We must at length realise that the climate problem is not susceptible to simple reductionism, especially reductionism based on examining the entrails of the GISS surface temperature record.

    The real problem is that we are in a cool mode of the Interdecadal Pacific Oscillation (it is more likely a chaotic bifurcation than a true oscillation – but we will leave that for the moment) – these tend to last for 20 to 40 years so we are looking down the barrel of global cooling for a decade or 3 more.

    In the very near future any claim that greenhouse emissions should be reduced will be met by gales of laughter – and the clouds are forming already with a certain confident (misplaced) triumphalism of skeptics. Unless the narrative changes dramatically.

    My views on how this narrative should be framed are well summed up by The 2010 Hartwell Paper on Environmental Policy: http://www2.lse.ac.uk/researchAndExpertise/units/mackinder/theHartwellPaper/Home.aspx

    For the benefit of Bart R – this is produced by the London School of Economics. Bitter dust – Bart R – bitter dust.

  65. IPCC’s projection of 0.2 deg C per decade warming (BLUE) compared to observed trend (RED) since 2002.

    http://bit.ly/gpr6jv

    Are we going to see a slight global cooling similar to 1940 to 1970?

    Let us wait and see.

    • Hey there Girma,

      ‘Average solar activity has declined rapidly since 1985 and cosmogenic isotopes suggest an 8% chance of a return to Maunder minimum conditions within the next 50 years (Lockwood 2010 Proc. R. Soc. A 466 303–29): the results presented here indicate that, despite hemispheric warming, the UK and Europe could experience more cold winters than during recent decades.’

      Lockwood et al 2010

      No warming? I’m wondering what the odds are of abruptly tipping over into the next glacial. I am also wondering about ‘hemispheric warming’? The usual cooling and warming at the same time?

      • Chief,

        I can only surmise from the evidence.
        I believe you are correct and we are most likely into the glacial age.

        The atmosphere has been stretched.
        Evidence?
        Growth up mountainsides where there has never been growth before.
        Centrifugal force will only allow this to come back slowly as now density has been exerted out.

        What happens when a cooling trend comes and the warm air that stretched the atmosphere now has extra space?
        Certainly makes room for heavier cloud cover.

      • “I am also wondering about ‘hemispheric warming’?”

        I don’t see why it should when the Maunder minimum didn’t. Also, I don’t think we should say things like this, or the next thing you know we will be fighting off those who would spread black carbon all over the poles.
        http://denisdutton.com/newsweek_coolingworld.pdf

      • Sorry, meant to copy the comment about wondering on glacials.

  66. note i inserted the links in your original long message

    • Thanks – :oops: – just making sure of the coding – first time I tried it.

    • Dear Dr. Curry:

      Thank you for this post. Yes, it is possible to distinguish between natural and anthropogenic variabilities. Please allow me to use the word transformation instead of variability, since climate variabilities are a result of thermodynamic transformations.

      1) Transformations resulting from the earth’s motion around the sun are characterized by linearity, constant change with time, and complete reversibility. This is due to the slow motion of the earth around the sun, conservation of the earth’s angular momentum, and invariability of revolution period, regardless of orbital shape and time. This is always true and Milankovitch cycles do not change these transformations. For more, please see Article-12, Earth’s Magic on my website http://www.global-heat.net.

      Observations are in agreement. The recorded average monthly surface temperatures show the following: a) when these temperatures are corrected for the water vapor effect, they are linear. Please see one example for Los Angeles, CA, Article-3, on my website; b) the changes in these monthly temperatures are proportional to the amount of heat exchanged with the surface in one month. If Ts(t) denotes the seasonal surface temperature change from a reference month and the letter t denotes time, the area below the curve Ts(t) is the same for the forward transformation, say seasonal warming, and the reverse transformation, say seasonal cooling. These areas are in fact proportional to the amount of heat exchanged with the surface, which cancels out at the completion of a full transformation cycle due to the cycle’s complete reversibility. I did this exercise for many cities and the conclusions are reasonably the same and in agreement.
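      The equal-areas test described above can be made concrete with a toy example: for an idealized, fully reversible seasonal cycle (a pure sinusoid; the amplitude and time grid below are chosen arbitrarily for illustration), the warming-half and cooling-half areas under Ts(t) cancel:

```python
import numpy as np

# Idealized seasonal anomaly over one year, in months: Ts(t) = A sin(2*pi*t/12)
t = np.linspace(0.0, 12.0, 1201)
dt = t[1] - t[0]
Ts = 10.0 * np.sin(2.0 * np.pi * t / 12.0)

warming_area = np.sum(np.clip(Ts, 0.0, None)) * dt   # forward (warming) half
cooling_area = np.sum(np.clip(Ts, None, 0.0)) * dt   # reverse (cooling) half, negative

net = warming_area + cooling_area  # ~0 for a completely reversible cycle
```

      An irreversible transformation of the kind claimed for CO2-driven change would instead leave a nonzero residual in `net`.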

      2) Transformations resulting from volcanic aerosols are characterized by complete reversibility as well. When aerosols are washed away at the completion of volcanic events, all of the heat removed from surface is added back in full, and the related curve Ts(t) has equal areas for the forward and reverse transformations. Please see references (1) and (2) of Article-13, related to Pinatubo volcanic eruption, in order to estimate the forward and reverse transformation times for this exercise.

      3) On the other hand, transformations caused by changes in the content of carbon dioxide in the atmosphere are distinguished by their irreversibility. Antarctic ice core data, reference (4) article -12, shows clearly that in the past 400,000 years, surface temperature was variable, and the area under the curve Ts(t) for past glacial periods and their succeeding warming periods were not equal, and this behavior is in contrast with item-1 and item-2 above. The difference in the areas is proportional to the sensible heat removed from the surface by these irreversible cycles in the form of ocean cooling and ice inventory.

      4) Transformations that occurred in the last 10,000 years yielded variable surface temperature curves whose Ts(t) have generally unequal areas for the forward and reverse events, suggesting that these events cannot belong to either item-1 or item-2 above. Antarctic ice core data suggest that the last 10,000 years have been a new experiment for the earth, and the variability in surface temperature during this period of time is a result of glacial/warming cycles being disabled by humans and the loss of the earth’s controller as a consequence. Please see article-6.

      5) The warming observed in Lake Superior, the Arabian Gulf, and the Mediterranean is considerably larger than the average warming observed in open oceans. This warming (and the few studies conducted in this regard agree) is mostly caused by waste heat rejected by humans, another new experiment for the earth. These water bodies have two things in common: a) they are isolated from open oceans, and b) they receive large amounts of waste heat rejected by humans. Surface water cannot radiate this waste heat away, and it accumulates in the surface with time. Please see Appendix A-1, page 65, book pdf and Article-12, Earth’s Magic.

      6) The trend in surface temperature since the industrial revolution does not exhibit linearity or constant change with time, and, therefore, cannot belong to item-1 discussed above. This trend can be scaled down and calculated from the last warming period, which is a solid proof that carbon dioxide is the common factor between the present warming trend and the last warming period. Please see Appendix A-3, Page-78, book pdf; Chapter-4, Page 14, book pdf; and Article-12, Earth’s Magic. We know well that carbon dioxide emissions are caused by humans, and, therefore, the present warming trend is caused by humans only.

      Based on mathematics and observations, it is possible to distinguish between natural and anthropogenic variabilities.

  67. Chief

    What perplexes me these days is: why has not the Climate Research Unit produced global mean temperature data that matches the 0.2 deg C per decade warming of the IPCC?

    http://bit.ly/gpr6jv

  68. Arfur Bryant

    Girma,

    I believe they have a very large computer working on that right now… :)

  69. From JC summary: “ The same issues and challenges raised for future projections remain also for the warming in the last few decades of the 20th century.“ That would be the decades of the eighties and the nineties. Hansen had no trouble with that in 1988: to him it was all natural. He extrapolated the warming out to 2019 and showed us how bad it would become if “business as usual” was allowed to continue. Business as usual did continue but the world temperature refused to follow any of his scenarios. Besides that “business as usual” he also defined two others, one with “moderate“ additions of CO2 to atmosphere and one where the addition of CO2 stopped in the year 2000. Moderate he defined as having the same fixed amount of CO2 as existed in 1988. In 2006 he pulled them all out again and said – “look – our present global temperature has followed my moderate prediction all the way to 2005.“ His idea of global temperature at that time was the “Land-Ocean“ curve from GISS. It shows the eighties and nineties as a period of rising temperature and his scenarios use this warming as a base for extrapolation into the next century. But since his moderate prediction was based on a fixed amount of CO2 which was frozen to 1988 levels and since the actual amount of CO2 had increased since then it is hard to make sense of his claim. I guess he was forced to gloss over the original premise of his curve in order to show something that approached reality because his “business as usual“ curve was wildly off target by then. I examined his temperature curve closely and found that it differs significantly from other temperature curves for the period. While in the eighties and nineties his curve and HadCRUT3 from the Met Office are very similar they separate in the next century. But when you compare them to satellite temperature curves both of them are way off. 
Satellites reveal the eighties and nineties to be a period of oscillating temperatures, not one of steady temperature rise as depicted by NASA, NOAA, and the Met Office. These oscillations belong to ENSO and depict a regular alternation of warm El Nino peaks with cool La Nina valleys. When you look at NASA and Met Office curves at higher resolution you see these same oscillations in their curves too. But where they differ from the satellite view is in the valleys in between the peaks that correspond to La Nina periods. Both NASA and the Met Office have made them shallow. This gives their curves an upward slope and we are told that this is what the late twentieth century warming is all about. It is that upward slope of late twentieth century warming that Hansen extrapolated to predict his twenty-first century warming. Had he used the satellite temperatures there would have been nothing for him to extrapolate because the curve is horizontal. Since this article is about attempting to separate natural and anthropogenically-forced decadal climate variability, we should now ask which of the differing curves for the decades of the eighties and nineties that were available to him is anthropogenically forced and which is natural. It is my contention that the satellite temperature curve is the natural one and that the two others are anthropogenically forced, by deliberately making the La Nina valleys between the peaks shallow. And Hansen’s predictions of warming to come were only possible because he extrapolated these anthropogenically-forced temperature curves. More details can be found in my book “What Warming?“

    • Another mole to whack. First of all, of Hansen’s 1988 scenarios A, B and C, he specifically called B the most likely, and indeed it was a pretty good guess as to what would happen (there are lots of blog posts on this, one of which Eli shall quote from):
      ————————
      Not bad for a first generation model, and they got a lot of other things right too. Now this does not say the model was perfect, just that it was useful, moreover there are good scientific reasons why it worked, including good physics, and a relatively short period between then and now.

      ”Close agreement of observed temperature change with simulations for the most realistic climate forcing (scenario B) is accidental, given the large unforced variability in both model and real world. Indeed, moderate overestimate of global warming is likely because the sensitivity of the model used (12), 4.2°C for doubled CO2, is larger than our current estimate for actual climate sensitivity, which is 3 ± 1°C for doubled CO2, based mainly on paleoclimate data (17). More complete analyses should include other climate forcings and cover longer periods. Nevertheless, it is apparent that the first transient climate simulations (12) proved to be quite accurate, certainly not ‘‘wrong by 300%’’ (14). The assertion of 300% error may have been based on an earlier arbitrary comparison of 1988–1997 observed temperature change with only scenario A (18). Observed warming was slight in that 9-year period, which is too brief for meaningful comparison.”

      14 is Michael Crichton, 18 is Pat Michaels.

      In the period between 1988 and 2005 CO2 concentrations increased about 30 ppm, or about 1/10th of the increase equivalent to a doubling. Thus, the higher assumed climate sensitivity (~40% high) would only result in a ~4% difference in the forcing if you assume a linear trend, less if you use the proper logarithmic dependence of CO2 forcing on CO2 mixing ratios. I think Hansen et al are being much too modest.
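      The logarithmic point can be checked in a couple of lines, using the standard simplified forcing expression F = 5.35 ln(C/C0); the 350 and 380 ppm values below are rough stand-ins for the 1988 and 2005 concentrations, not exact figures:

```python
import math

def co2_forcing(c_ppm, c0_ppm=350.0, alpha=5.35):
    """Simplified CO2 radiative forcing in W/m^2: F = alpha * ln(C/C0)."""
    return alpha * math.log(c_ppm / c0_ppm)

f_30ppm_rise = co2_forcing(380.0)     # ~30 ppm rise since 1988
f_doubling = co2_forcing(700.0)       # doubling from 350 ppm
fraction = f_30ppm_rise / f_doubling  # fraction of a doubling's forcing, ~0.12
```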

      • Scenario A indicates a growth rate of co2 of 1.5%/year. The growth rate of co2 emissions was 1.3%/year from 1990-1999 and 3.3%/year for 2000-2006: http://www.pnas.org/content/104/47/18866.full.pdf . The 1988 paper states scenarion A as the business as usual scenario. It also states it must eventually be on the high side of reality because eventually we will run out of fossil fuels or serious efforts will be made to reduce emissions. The paper I cited indicates this hasn’t happened. Scenario B is described as a linear forcing scenario. It is described as the most plausible without qualifiers after the statement. The most logical assumption to make at this time is that it is the most plausible for the same reasons that scenario A must be eventually on the high side of reality. The last point I would make is that to say one model was pretty good because, after you make all the required adjustments to make it match another newer unproved model, it matches those results rather well and therefor the current models are probably even better is just comical circular reasoning. This is my evaluation and I’m quite interested in knowing why my evaluation is in error.

      • There is a significant lag between changes in atmospheric CO2 and a full global temperature response, due mainly to the thermal inertia of the oceans. Temperature changes during the past decade therefore reflect the combined effects of CO2 changes extending back for well more than a decade, as well as more recent changes – e.g., back into the 1980s and 1990s. Scenario B is not unreasonable in that context. For similar reasons, among others, disparities between predicted and observed warming on timescales of a decade or less are likely to exceed (on a percent basis) disparities for somewhat longer intervals (e.g., a few decades).

      • Fred, you are arguing that scenario B was right for the wrong reasons. I don’t see a huge difference between that and random lines drawn on a graph. I would say that ocean lag time is a subject that interests me and would love to see it as a topic here. Not that I could contribute much but I would enjoy reading the opinions of those who could.

      • Steven – Scenario B can’t be called “right” because the slope of its regression line was 0.27 C/decade while the Hadcrut observational data yielded a slope of 0.18 C/decade, only 2/3 as steep. However, a few points are relevant:

        1. If the CO2 emissions in the 1980s to mid-1990s were slightly lower than Hansen’s estimate, this could contribute to reducing the observed warming subsequently.

        2. CO2 emissions are probably harder to estimate accurately than atmospheric concentrations, which have been rising at only a fraction of 1 percent per year.

        3. The variability within any single decade is likely to exceed multidecadal variability, and the long term Scenario A and B trend lines are not as different as the decadal difference of the most recent years.

        4. It is not considered legitimate to “retune” a model after it is run with a given set of parameters – the tuning must be limited to ensuring that it reproduces the climate at the start of the run, and the modelers must content themselves with what emerges after that, whether they like it or not. Therefore, one can’t rerun the earlier model to give better results. However, we do know that the Hansen model was based on a climate sensitivity of 4.2 C/CO2 doubling, while the modal value for current estimates is about 3 C (90% confidence interval 2-4.5 C), based on dozens of studies. If the input parameters had entailed that lower value, either one of the scenarios would have matched the observed data better.

        5. The oceans are the main repository of heat acquired by the climate system. The upper oceans approach equilibration with the atmosphere over an interval of a decade or two, but the deeper oceans respond over hundreds of years. Atmospheric and surface warming therefore tends to respond to changes in CO2, solar irradiance, etc. in large part over decades but with continued slower and shallower responses over centuries before a final equilibrium would ensue.

        For one informative article on this, see Persistence of Climate Changes.
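        For reference, the 0.27 vs 0.18 C/decade figures above are ordinary least-squares slopes; a minimal sketch of that computation (the anomaly series below is synthetic, not HadCRUT data):

```python
import numpy as np

def decadal_trend(years, anomalies):
    """Ordinary least-squares trend, returned in degrees C per decade."""
    slope_per_year = np.polyfit(years, anomalies, 1)[0]
    return 10.0 * slope_per_year

# Synthetic series rising at exactly 0.018 C/yr, i.e. 0.18 C/decade
years = np.arange(1988, 2006)
anomalies = 0.018 * (years - 1988) + 0.10
trend = decadal_trend(years, anomalies)
```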

      • Fred

        Please help me understand your points better.

        1. Why does it matter what emissions were between the 1980s and mid 1990s? What matters is whether the atmospheric concentrations were consistent with Hansen’s forecast during that period. Warming would be the result of the net increase in the atmospheric concentration of CO2. It would not really matter if the increase was the result of natural increases or was human caused.
        2. Based upon what were you able to reach the conclusion that the deep oceans reach equilibrium over hundreds of years?

      • Rob – Hansen et al didn’t attempt to forecast either emissions or concentrations, but asked what temperature changes might be expected from specified changes in CO2 concentration as well as changes in other anthropogenic emissions and natural climate variations. His estimates of emissions were simply a means of getting at concentrations, and if they were off, that was not a test of model success or failure. Scenario B was a reasonable approximation of what actually happened, even if not a perfect one, and the model yielded a result that would have been too warm even if the real world had exactly mimicked that scenario. For the model of his day, the result was not too bad, and current models use better input parameters (e.g., with climate sensitivity lower than Hansen’s 4.2 C); they don’t do this as a “lesson learned” from Hansen, but because the data support the new values. (Actually, it’s my impression that many climatologists thought Hansen’s sensitivity numbers were too high back in 1988.)

        Regarding the oceans, the PNAS paper I linked to and its references provide a source of relevant information.

      • Fred

        I am assuming that you are meaning the following point in the referenced paper:
        “The transfer of heat from the atmosphere to the ocean’s mixed layer (top 100 m or so) is thought to occur on timescales on the order of a decade or less (30), whereas multiple centuries are required to warm or cool the deep ocean”

        30. Hansen J, et al. (1984) Climate sensitivity: Analysis of feedback mechanisms. Climate Processes and Climate Sensitivity (Am Geophysical Union), pp 130–163.

        I was not able to read Hansen’s 1984 paper, but his brief summary states “are thought to occur on timescales” – it does not state “have been shown to occur” – which would seem to mean they have assumptions and no data to really support them.

      • I believe references 6-13 and 31 have relevant data. I would have to revisit these to know which are most helpful, but I would say that because the known heat capacity of the ocean is so much higher than any other facet of the climate system, and that the deep ocean is much vaster than the upper layers where heat storage has been better quantified and is known to require decades for nearly complete responses to atmospheric forcing, it is almost inconceivable that the ocean response time is anything less than centuries.

      • Fred, scenario B can’t be called right because the criteria set forth in the paper makes it the wrong scenario to compare to observations. The appropriate response to Hansen’s projections would be along the order of: scenario A overestimated the increase of temperature to a business as usual scenario and this is possibly due to the following reasons: (list your reasons).

        The lag time of oceans I find quite interesting. Since you believe the time scale to be on the order of hundreds of years perhaps you would care to take a guess at how much of the more recent warming is due to the oceans working towards equilibrium from previous forcings before co2 is hypothesized by some to have become the primary driver? I expect you will agree that this should not be included in the transient response to co2 and any response towards equilibrium from this eliminated transient response should also be eliminated in those calculations?

      • Steven – I believe scenario B is more reasonable than A for reasons I’ve given, but I won’t argue the point, since we currently have better data to apply to models.

        The deep oceans harbor more heat content than the upper layers, but they equilibrate very much more slowly. Here, the PNAS paper by Solomon et al is relevant. It points out that for a very transient forcing, most of the stored heat will only reach the upper layer, and can be equilibrated with the surface fairly rapidly (decades). For very persistent forcings such as those imposed by increased atmospheric CO2, the addition of heat lasts long enough for most to be stored in the deeper layers, so that even when the forcing ends (e.g., CO2 levels subside), centuries may be required for the stored heat to be fully realized in surface temperature change.

        The consequence of all this is that we are currently experiencing effects from both recent climate forcings and those that occurred in the distant past (e.g., solar changes). However, the further distant was a past forcing, the more dilute will be its current effect. Hence, recent temperature changes have been dominated almost entirely by recent forcings, of which the most potent has been change in CO2 and other greenhouse gas concentrations. Early in the twentieth century, solar intensity increased, and so we are still seeing (in dilute form) some positive temperature change from that, but in the past decade or two, solar intensity has actually diminished, and it is therefore entirely possible that the net effect of solar changes on current temperature trends is slightly negative. That is uncertain, but given the relative magnitudes of all the forcings, it can’t be strongly positive.
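        The way a distant forcing ends up “dilute” can be sketched by convolving a forcing history with a two-timescale response (a fast upper-ocean mode and a slow deep-ocean mode). The amplitudes and time constants below are invented for illustration only, not fitted to any model:

```python
import numpy as np

def temperature_response(forcing, dt=1.0, amps=(0.4, 0.3), taus=(4.0, 250.0)):
    """Convolve a forcing history (W/m^2, annual steps) with a hypothetical
    two-exponential impulse response: a fast ~4 yr mode and a slow ~250 yr
    mode, a common idealization of upper- vs deep-ocean adjustment."""
    n = len(forcing)
    t = np.arange(n) * dt
    kernel = sum(a / tau * np.exp(-t / tau) for a, tau in zip(amps, taus))
    return np.convolve(forcing, kernel, mode="full")[:n] * dt

forcing = np.zeros(150)
forcing[10:20] = 1.0  # a decade-long 1 W/m^2 pulse, early in the record
T = temperature_response(forcing)
# T peaks just after the pulse, then decays; a small positive remnant
# (the slow ocean mode) is still present at the end of the record.
```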

      • We will have a much better attribution of solar forcing soon enough I suppose, or at least a better attribution of natural variability since it will be difficult to determine how much to attribute to a weak solar cycle and how much to attribute to the PDO and AMO going negative. I, for one, am waiting with great interest to see how it pans out.

      • Most of the ocean heat is in the top 200m –

        http://en.wikipedia.org/wiki/File:Thermocline.jpg

        The ocean heat content has been taken to be almost all in the top 700m – although there could be a problem with that.
        http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT/

        Warm water is buoyant of course and moves to the surface by convection. Energy is gained and lost in the infrared in the top few microns of the surface – the net is up, of course. What determines change in ocean heat content is the net radiation – SW in and net IR out.

        The other idea is that heat floats about in the deep ocean for a few decades and then jumps out when you are not looking.

      • The problem is the missing energy problem of Trenberth – a net increase in the energy in the system, mostly from a decrease in reflected shortwave to 2010, but not showing in the surface or ocean temps to 700m.

        von Schuckmann et al 2009 – show a small increase in heat content to 2000m – someone showed me a reference to heat to 3000m – this may require a rethink about how energy moves into the deep ocean but shouldn’t (?) affect the energy dynamics at the surface.

        But hell – ‘The deep oceans harbor more heat content than the upper layers, but they equilibrate very much more slowly.’
        As I say it just hangs around for a hundred years lurking in the abyss and jumps out when you least expect it.

      • Fred gets the Nobel Prize for Confirmation Bias, the Chief the one for Fear.
        ============

      • “Most of the ocean heat is in the top 200m”

        Robert – That is incorrect. Most of the heat content is below 700 meters. The average depth of the oceans is about 3800 meters. At the relevant temperatures, heat content is roughly linearly proportional to temperature, since ocean water specific heat capacity varies only slightly with depth. Deep ocean temperature is about 277 K, while upper ocean temperatures average less than 300 K, and so approximately 4/5 of ocean heat is in the deep oceans.
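The 4/5 figure above can be checked with a rough back-of-the-envelope sketch. This is an illustration only: the 288 K mean for the upper 700 m is an assumed value (consistent with "less than 300 K"), and density and specific heat are treated as constant with depth, as the comment itself assumes.

```python
# Rough check of the claim that ~4/5 of ocean heat content lies below 700 m.
# Heat content is taken as proportional to layer mass * absolute temperature;
# depths and temperatures are illustrative round numbers from the comment.

mean_depth_m = 3800            # approximate average ocean depth
upper_depth_m = 700            # conventional "upper ocean" layer
deep_depth_m = mean_depth_m - upper_depth_m

T_deep = 277.0                 # K, typical deep-ocean temperature
T_upper = 288.0                # K, assumed mean for the top 700 m

# Per unit area, layer mass is proportional to thickness (density ~constant).
heat_deep = deep_depth_m * T_deep
heat_upper = upper_depth_m * T_upper

deep_fraction = heat_deep / (heat_deep + heat_upper)
print(round(deep_fraction, 2))   # ~0.81, i.e. roughly 4/5
```

With these numbers the deep fraction comes out near 0.81, consistent with the "approximately 4/5" claim.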

      • Funny how nearly everyone measures ocean heat content to 700m. Is there something hickey about using absolute temperature in water as a proxy for heat content?

        Let’s try a new tack Fred. Don’t take it amiss but your basic maths and physics is a bit hit and miss. Moreover, my impression is that once committed to a concept you will fight for it to the death. You will change the grounds on which the battle is fought, you will misdirect and confound the argument, appeal to authority, the mythical reasonable man and to common sense. Try not just to automatically respond with any old argument that seems to fit.

        The formula is Q = mC(ΔT), where Q is heat, m is the mass of the water (given), C is the specific heat capacity of the water (4184 J/kg-K) – a linear relationship so if T changes there must have been a linear change in Q.
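A minimal sketch of that formula in code; the 1 kg and 1 K figures are illustrative values, not from the thread:

```python
# Sensible heat change for liquid water: Q = m * C * dT (no phase change).
C_WATER = 4184.0   # J/(kg*K), specific heat capacity of liquid water

def heat_joules(mass_kg, delta_t_k):
    """Heat gained (or lost, if negative) by liquid water."""
    return mass_kg * C_WATER * delta_t_k

# Warming 1 kg of water by 1 K takes about 4.2 kJ.
print(heat_joules(1.0, 1.0))  # 4184.0
```

The linearity noted in the comment is visible here: doubling either the mass or the temperature change doubles Q.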

        The trouble is that water undergoes a phase change at around zero degrees Celsius. There is no change in heat below the freezing point. As the temperature drops past zero Celsius to absolute zero there is no further decrease in the temperature of the ice but there is a decrease in the internal energy of the system.

        So the formula breaks down around zero C and any actual heat content as temperature can only be observed above that. The oceans are thermally stratified with warm water on top an average of 100m deep – it varies with latitude deeper in the tropics and shallower towards the poles as the water cools. The surface water has many times greater heat content than the bottom water as can be seen in the temperature profile – although we should always regard a Wikipedia profile with suspicion it has the right shape.

        Surface currents sink as they move into higher latitudes carrying some warmth into the abyss. There is no thermal barrier to heat in the thermocline and it rises by convection as the water carrying it is less dense than the colder surrounding water.

        Heat is lost from the ocean in the infrared at the surface (<0.5mm) as the sum of downwelling IR, upwelling IR and latent heat in water vapour. And sure the ocean has potentially 1000 times the thermal mass of the atmosphere and has therefore some thermal inertia. But the control on the rate of cooling or heating depends always only on radiative flux at the surface. You'll see in the graph from the NODC that ocean heat content changes quite quickly and substantially on interannual timescales. Indeed the SST and surface temp are in temperature equilibrium on seasonal timescales.

        If the atmosphere cools a little or cloud clears a little, the ocean will cool a little from the surface. Heat hiding in the depth is merely an inaccurate analogy for thermal inertia.

        But let's have a little thought experiment. Let's say we stopped putting greenhouse gases into the atmosphere and the surface temperature mysteriously stabilised at a higher average temperature. The ocean temperature would stop rising (all other things being equal) – but it would not lose heat because that is determined by the net IR from the surface. It would not start to lose energy until some of these greenhouse gases were washed from the atmosphere and the atmospheric temperature started to decline – both the atmosphere and the ocean would cool at the same time without any lag at all – other than the time for turbulent mixing of warm water into the cooling surface. The ocean and the atmosphere at the surface are in temperature (but not energy) equilibrium.

        Quod erat demonstrandum – there is no heat in the deep ocean waiting to jump out and bite our bums.

      • The deep ocean heat content below 700 meters is about 4/5 of the total, and the heat content of the top 200 meters is a small fraction of the total, even taking into account the different heat capacity of ice and liquid water as well as the heat of fusion needed to convert ice to water.

      • I thought you would at least pick up the deliberate error Fred?

        The fact remains that we have water oceans because we have a Sun. If the oceans were ice there would be a totally different dynamic and I don’t know what they would be called.

        I am going to count the energy from zero C – otherwise it wouldn’t be heat content in liquid oceans.

      • Note that Brian H caught the error in an 11:17 post below. Nice, Chief.
        =======

  70. Paul Vaughan

    Climate Etc. readers are cordially invited to join the following WUWT discussion:

    Vaughan, P.L. (2011). Solar, terrestrial, & lunisolar components of rate of change of length of day.
    http://wattsupwiththat.com/2011/04/10/solar-terrestrial-lunisolar-components-of-rate-of-change-of-length-of-day/

    Best Regards.

  71. ferd berple says:
    April 11, 2011 at 10:46 am
    The classic bathtub curve

    The Bathtub Curve and Product Failure Behavior
    Part One – The Bathtub Curve, Infant Mortality and Burn-in

    http://www.weibull.com/hotwire/issue21/hottopics21.htm

    ferd berple says:
    April 11, 2011 at 12:13 pm
    Here is a comparison folks might find interesting.

    I took two graphs from wikipedia, comparing solar activity with temperature:

    http://commons.wikimedia.org/wiki/File:2000_Year_Temperature_Comparison.png
    http://en.wikipedia.org/wiki/File:Carbon14_with_activity_labels.svg

    I flipped the graphs and rescaled so they matched and then plotted them together. It is a bit hard to pick out, but solar activity appears in purple. What it shows is that climate appears to track solar activity for the past 1000 years very closely, and matches many of the details in the proxy records, including where the proxies turn up and down.

    This would appear to indicate that solar variability is a strong driver of average global temperature.

    • Sorry, something went wrong with that post. I meant to only get the last bit. Moderator please delete everything before “ferd berple says:
      April 11, 2011 at 12:13 pm”

  72. http://research.aerology.com/natural-processes/solar-system-dynamics/

    Objective observation of the planetary effects on the NV as seen as the interactive drivers of long term oscillations in the global ocean and atmospheric circulation patterns. Complete with derived long term daily forecast for USA for the past 3 years and the next 6.

    Currently extracting data for Canada, for the expansion of the forecast area to cover all of North America, with the addition of one more 6558 day long analog period ~1938–2011 with less smoothed map contours, and 1 degree temperature gradations instead of 10 degree temperature gradations.

    The consideration of the lunar solar inner planet interactions give a resultant ~85% accurate repeating pattern. With the inclusion of analogs and algorithms for the effects of the outer planets longer period interactions with the movement of the inner planets and sun around the SSB.

    I expect to account for most of the rest of the short term rapid extreme excursions in the weather, accounting for most of the severe weather hurricanes and tornadoes, as due to the outer planet synod conjunctions with Earth.

    Updated higher resolution and bigger forecast maps with the improved revised method due to be on line in a month or so.

    Richard Holle

  73. Correction!!
    Complete with derived long term daily forecast for USA for the past 3 years and the next [3] years.

  74. Will someone say something about regional climate variability which is the subject of the paper supposedly under discussion?

    • It ain’t easy? I believe Manabe and crew started modeling the US before being swept up in the global then coupled global models. The spatial resolution of global models is much too coarse for regional, but can provide rough boundary conditions (I guess that is the right term) for fairly large scale regional models. Manabe’s estimation of dry to moist soil conditions with water runoff seems pretty useful, but temperature variation with elevation, from what I’ve read, is still a problem.

      There will be a lot of job security in regional modeling. Initial conditions will have to be changed every time there is an ENSO shift, PDO shift, AMO shift etc. then it will take time to adjust for the next shifts that aren’t easily predicted (there is a list!). I am sure they will improve with time, but when will they be really useful?

    • Since you asked- this was previously posted

      Rob Starkey | April 7, 2011 at 3:14 pm | Reply
      I find the presumptions of the paper almost laughable, except that they are seriously stating that they will be able to segregate the “natural” from the “human caused” climate changes, and at a regional level no less. It sounds like the paper is written as the basis to ascribe future significant weather events to additional human released CO2 so that policies can be justified as a result.

    • The paper has already been properly eviscerated, Mr. Rabett.

  75. Regional, hemispheric or global? Shouldn’t there be some understanding of the processes in play in ‘internal climate variability’ before determining a sphere of influence?

    ‘Given that over the course of the next 10–30 years the magnitude of
    natural decadal variations may rival that of anthropogenically
    forced climate change on regional scales…’ Solomon et al

    ‘We stress that this is a regional and seasonal effect relating to European winters and not a global effect. Average solar activity has declined rapidly since 1985 and cosmogenic isotopes suggest an 8% chance of a return to Maunder minimum conditions within the next 50 years (Lockwood 2010 Proc. R. Soc. A 466 303–29): the results presented here indicate that, despite hemispheric warming, the UK and Europe could experience more cold winters than during recent decades.’ Lockwood et al 2010

    ‘This index captures the 1976–1977 “El Niño–Southern Oscillation (ENSO)-like” warming shift of sea surface temperatures (SST) as well as a more recent transition of opposite sign in the 1990s. Utilizing measurements of water vapor, wind speed, precipitation, long-wave radiation, as well as surface observations, our analysis shows evidence of the atmospheric changes in the mid-1990s that accompanied the “ENSO like” interdecadal SST changes.’

    Burgman, R. J., A. C. Clement, C. M. Mitas, J. Chen, and K. Esslinger (2008), Evidence for atmospheric variability over the Pacific on decadal timescales, Geophys. Res. Lett., 35, L01704, doi:10.1029/2007GL031830.

    ‘Our results suggest that global surface temperature may not increase over the next decade, as natural climate variations in the North Atlantic and tropical Pacific temporarily offset the projected anthropogenic warming.’

    N. S. Keenlyside, M. Latif, J. Jungclaus, L. Kornblueh & E. Roeckner (2008), Advancing decadal-scale climate prediction in the North Atlantic sector, Nature Vol 453| 1 May 2008| doi:10.1038/nature06921

    We find that in those cases where the synchronous state was followed by a steady increase in the coupling strength between the indices, the synchronous state was destroyed, after which a new climate state emerged. These shifts are associated with significant changes in global temperature trend and in ENSO variability. The latest such event is known as the great climate shift of the 1970s.

    Anastasios A. Tsonis,1 Kyle Swanson,1 and Sergey Kravtsov1 (2007), A new dynamical mechanism for major climate shifts, GEOPHYSICAL RESEARCH LETTERS, VOL. 34, L13705, doi:10.1029/2007GL030288

    If as suggested here, a dynamically driven climate shift has occurred, the duration of similar shifts during the 20th century suggests the new global mean temperature trend may persist for several decades. Of course, it is purely speculative to presume that the global mean temperature will remain near current levels for such an extended period of time. Moreover, we caution that the shifts described here are presumably superimposed upon a long term warming trend due to anthropogenic forcing. However, the nature of these past shifts in climate state suggests the possibility of near constant temperature lasting a decade or more into the future must at least be entertained. The apparent lack of a proximate cause behind the halt in warming post 2001/02 challenges our understanding of the climate system, specifically the physical reasoning and causal links between longer time-scale modes of internal climate variability and the impact of such modes upon global temperature.

    Swanson, K. L., and A. A. Tsonis (2009), Has the climate recently shifted?, Geophys. Res. Lett., 36, L06711, doi:10.1029/2008GL037022.

    ‘A negative tendency of the predicted PDO phase in the coming decade will enhance the rising trend in surface air-temperature (SAT) over east Asia and over the KOE region, and suppress it along the west coasts of North and South America and over the equatorial Pacific. This suppression will contribute to a slowing down of the global-mean SAT rise.’

    Takashi Mochizuki, Masayoshi Ishii, Masahide Kimoto, Yoshimitsu Chikamoto, Masahiro Watanabe, Toru Nozawa, Takashi T. Sakamoto, Hideo Shiogama, Toshiyuki Awaji, Nozomi Sugiura, Takahiro Toyoda, Sayaka Yasunaka, Hiroaki Tatebe, and Masato Mori (2010) , Pacific decadal oscillation hindcasts relevant to near-term climate prediction, doi:10.1073/pnas.0906531107PNAS February 2, 2010 vol. 107 no. 5

    • So let me expand that a little bit. The Solomon paper starts from well behind the 8 ball. It starts from a vague recognition that there is decadal variability and says that here is how we might start to think about it. ‘A reasonable starting point for these metrics is to focus on decadal variability due to ocean processes, as discussed earlier. This requires analyses that assess the spatial patterns and associated time scales of natural variations, and their potential change in structure and frequency due to external forcing. ‘ Have they a handle on the recognised decadal processes in SST? It seems relatively unlikely as they nominate Rossby waves as the physical system in the frame. Rossby waves are implicated in the ENSO but cannot explain decadal variability due to both duration and periodicity considerations. They have a long road to even get to a starting point.

      Lockwood et al is concerned with long term solar UV drift and an assumed connection between ozone in the middle atmosphere and sea level pressure creating blocking patterns influencing storm tracks in the Northern Hemisphere. But this is a process that happens in the Southern Hemisphere as well and has an influence on the evolution of ENSO and the longer term Pacific variability.

      Burgman et al show that this is connected to cloud change in the Pacific and therefore the global energy balance – in quite a big way if you believe ERBE and ISCCP-FD data.

      Tsonis and colleagues show that 4 major ocean and atmospheric indices are in fact globally linked as spatio-temporal standing waves in a chaotic dynamical complex system.

      To quibble about a word dredged out of a paper that purports to provide a start to thinking about analysis – is typical of the querulous rabett.

      • Chief

        I just spent a half hour swearing at you under my breath in either Japanese or Mandarin — I’m not sure which, as I speak neither at all — and realizing that you’re just not cut out for Economics with all this hard science knowledge and habit running around in your noggin.

        There are very few intellectually ambidextrous people (being bipolar doesn’t count, Chief, sorry) who can both excel in a hard science with a strong mathematical component and in the field of Economics.

        Plus, I’m afraid I’m ruining your analytic skills by trying to compress some Economics concepts into your too-crowded cerebellum.

        Look at the mess you’re making of straightforward, simple, basic Chaos now.

        You’ll have to give up the Economics, as the world needs all the qualified hydraulicists it can get.

      • You place me in a quandary. If I accuse you of the unspeakable things I attributed to Craig – it would be credible and therefore an actual ad hom. So I will refrain – also for fear of Judith. Although – in leather. Hmmm. I have been very, very naughty. Now look what you made me do. You are a very dangerous man Bart R.

      • Chief

        More than you can imagine.

        Just say nice things about us instead, and everyone will know you mean the opposite, but you’ll be moderation-proof.

        Which is just my way of saving Dr. Curry some work, as I imagine having to frequently moderate a problem child becomes tendentious.

        Why just look at how many nice things I say about you.

        Though in my case, I’m sincere.

      • Bart my Bart – I’m sorry if I offended. It was unintentional. It is too easy for the untravelled lout that I am to forget that Australian terms of endearment sometimes translate as deadly insult elsewhere. For example the blunt language found in The Australian – our most prestigious broadsheet – today. The first 3 pages were taken up in a discussion, disapproving, of an unnatural act with a horse in the TV series “Deadwood” and, approving, of the Sydney and Melbourne Theatre Companies and Opera Australia nuding up and making the beast with two backs. It puts a whole new spin on the fat lady singing.

        The other gem I gleaned from the paper this morning is that fully 1% of America’s electricity supply is used to grow hydroponic weed. Apparently there is 2.1 kg CO2 emitted/joint. Seemingly there is a whole lot of smoking weed and watching distasteful TV going on. Just say no Bart.

        Now I know that my feeble economics is on a par with your grasp of the science. Someday I will have to explain to you that making marks with a pencil on a sheet of paper, while undoubtedly an artistic expression and/or therapy, is not a sophisticated climate model. All I can say is that I was falsely encouraged in my incipient economics by finding agreement with Prof. Davidson of the Economics Dept. of RMIT.

        The Australian carbon tax is progressing mightily. Yesterday we found out that the ‘Fuel Equalisation Tax’ would be partially replaced with a carbon tax. That prompted the thought that we should simply rename the ‘Fuel Equalisation Tax’ a ‘Carbon Tax’ and claim the moral high ground from British Columbia. I’m no lawyer but it seems that it would only need some guy with a bottle of white out. Equally ineffective as the BC tax but that’s not the point.

        Just to finish up on an issue raised elsewhere – someone mentioned Skippy as an icon. Frankly, Skippy was a series of paws on a stick because ‘roos have a brain the size of a peanut and can be trained to sit there and look stupid or hop away and look stupid. ‘Roos do have a unique talent of being pregnant all the time and pumping them out at the smell of a green shoot.

        The best place to see ‘roo is on a plate. It can be thinly sliced, marinaded in chilli, soy, rice wine and honey and flash seared. ‘Roo bourguignon is also fairly tasty. Hardly any cholesterol or carbon involved.

        If you know anything of the “Man from Snowy River” you will know we love our wild bush brumbies – but not like that.

        ‘And one was there, a stripling on a small and weedy beast,
        He was something like a racehorse undersized,
        With a touch of Timor pony – three parts thoroughbred at least –
        And such as are by mountain horsemen prized.
        He was hard and tough and wiry – just the sort that won’t say die –
        There was courage in his quick impatient tread;
        And he bore the badge of gameness in his bright and fiery eye,
        And the proud and lofty carriage of his head.’

      • Chief

        You can’t offend me, deadly or otherwise, but by all means so long as you dodge moderation, continue to try if it amuses you.

        Your typing rate and my reading rate I’m sure will leave plenty of time for you to get to the useful stuff that uniquely talented brain of yours produces in sufficient quantities to amuse and pump it out at the smell of a green shoot for our edutainment.

        Wouldn’t it be amusing if it turned out in the fullness of time the only people willing to pay fair market carbon tax prices were drug suppliers, while the rest of us switched quietly and efficiently to those alternatives fueled only by your pocket plasma pinch personal fusion powerplants?

        By the way, you may wish to skip(py) the soy, or go for the low sodium version, considering the liver abuse you’ve engaged in your attempts to self-medicate.

        But I digress.

        You make the classic blunder of the hard scientist in dealing with the soft sciences.

        When faced with an economist, full agreement is always a sign of abject failure to understand what you’re being told. Economic thought is a process, not a commodity.

        The more economists you agree with, the more wrong you become.

        Only by disagreeing mightily with the most accomplished economists can you understand what they’re saying.

        Look at your proposal to simply rename your fuel tax to ‘carbon tax’.

        Brilliant, from the point of view of the straight-talking non-economist who wants to get the best possible economic outcome, and mathematically equivalent in every way, if you choose to do it that way to some of what BC has done, but simply wrong.

        Economically, you have to engage in the administrative process and frame the dynamic in a correct outlook. A bottle of whiteout applied to the screen of your Blackberry will not do.

        If you’re just a dress-up capitalist, still thinking you can subsidize corporate charities and commune your way to a better world by central planning committee politburo strategies, you’ve achieved nothing and are wasting your time and other people’s money. Again. Still.

        (See, at this point, if you weren’t a scientist, you’d know to disagree with me, if I were an Economist.)

        It’s like why you’re wrong when you frame your logic in some sort of Edwardian-era fugue mindset.

        Obtaining the correct result in the wrong reference set?

        You’re better than to be thinking so sloppily, Chief.

        You understand there’s more nuance to this matter, and less, too.

        For shame.

        You have the intellectual tools to do better. That’s just sloth.

        And toying with Fred by hiding little ice follies in your overlong scriptures?

        Puhleeze.

        From me, that sort of lame crap is almost acceptable; I’m no scientist, and I don’t post under a fancy title like Chief, or PhD, or P.Eng.

        From you?

        You have the options of resorting to brilliancy, thorough-going conscientious logic, openminded and diligent investigation, and reasoning from observation instead of from conclusion, because you are possessed of almost the IQ of a fictional child drawn and voiced by people who know Hank Azaria. So you’re practically American.

        Spare me the Snowy River.

      • Chief,
        Those whom the gods would destroy they first make angry.
        Where has our clever, funny and correct Chief got off to?

      • I am afraid you are drifting off into incoherence again Bart. I will have to speak more slowly.

        The simple proposition – merely a subset of economics – is that your proposal has no validity in terms of a Pigovian tax. It is instead a punitive tax – intended to punish a means of production such that another system, higher cost for now, is substituted. There are horrendous human costs associated with that if it were ever implemented widely in the world.

        There are 2 potential outcomes – and if I must repeat myself – it is unlikely to work. The cost of changing from coal and gas to nuclear is about $70/tonne CO2 equivalent in Australia. The starting point is $20 increasing at 4%/year. An obviously pointless exercise.

        If it was priced at something sufficiently high to encourage substitution – the tax revenue would dry up and the result would be higher costs and higher taxes for all. It is a very simple chain of logic – one I had arrived at independently before reading Prof Davidson in the paper this week.

        Your tax is about playing word games and feeling morally superior – not about being effective in the multiple goals and multiple objectives of action in the real world. A distraction at best and danger at worst.

        But this has all been said before. Is this the real danger of the blogosphere – riding our wooden horses on the merry-go-round until we get dizzy and fall off?

  76. I hate to contradict you, Chief, but “As the temperature drops past zero Celsius to absolute zero there is no further decrease in the temperature of the ice but there is a decrease in the internal energy of the system. ” is nonsense.

    From the Engineering Toolbox:

    # Specific heat capacity water – 4.187 kJ/kgK
    # Specific heat capacity ice – 2.108 kJ/kgK
    # Specific heat capacity water vapor – 1.996 kJ/kgK

    There are actually some very interesting forms of ice that can appear at very low temps, with different crystal structures than Ice I.
    But ice definitely can cool below 0°C, once it is entirely done with its phase change and no liquid is left in the mix.
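Those Engineering Toolbox values, together with the ~333 kJ/kg heat of fusion mentioned elsewhere in this thread, can be put in a quick sketch of the heat budget when water is cooled through the phase change. The 1 kg mass and 10 °C end-points are illustrative choices, not figures from the thread:

```python
# Heat removed cooling liquid water down through freezing to cold ice.
# Specific heats from the Engineering Toolbox; heat of fusion ~333 kJ/kg.
C_WATER = 4.187    # kJ/(kg*K), liquid water
C_ICE = 2.108      # kJ/(kg*K), ice (ice keeps cooling below 0 C)
L_FUSION = 333.0   # kJ/kg, latent heat of fusion (approximate)

def cool_water_to_ice_kj(mass_kg, t_start_c, t_end_c):
    """Heat removed taking liquid water at t_start_c to ice at t_end_c <= 0."""
    sensible_liquid = mass_kg * C_WATER * (t_start_c - 0.0)
    fusion = mass_kg * L_FUSION
    sensible_ice = mass_kg * C_ICE * (0.0 - t_end_c)
    return sensible_liquid + fusion + sensible_ice

# 1 kg of water at 10 C cooled to ice at -10 C:
q = cool_water_to_ice_kj(1.0, 10.0, -10.0)
print(round(q))  # ~396 kJ; the fusion term dominates the two sensible terms
```

The sketch makes the point of the comment explicit: the sensible-heat term for ice below 0 °C is nonzero (about 21 kJ here), so ice certainly does cool below 0 °C once the phase change is complete.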

    • Well you are right of course – I just didn’t want to admit it to Fred. What is normally discussed is anomalies. So most of the heat change is in the top 700m.

      I tend to think it is more natural to think of oceans as liquid water and not at absolute zero.

  77. Fred, the problem with the deep ocean warming and coming back to haunt us hypothesis is that there has been very little energy increase detected in the deep oceans. It is possible that because of the difficulty in measuring at those depths there is more energy than detected but it is equally likely there is less. Either way the numbers are unlikely to be impressive and the effects would be small, diluted as you would say. How do I know this? It isn’t that difficult to figure out. Just figure out how much the world has warmed since the last glacial, give that amount thousands of years to have a chance to change the temperature of the deep oceans, then take a look at the temperature of the deep oceans now and figure out how much it could have possibly warmed since we know it wasn’t below the freezing point there. Given all that time and all that change in the earth’s energy, why anyone would worry about this being a factor of concern to us is something that is beyond my reliance on logic and would probably require a very technical explanation that would be beyond my comprehension, but would you care to give it a try anyway?

    • Steven – The amount of heat stored in the deep ocean is a function of the duration of a forcing. For a transient forcing (e.g., a short term methane flux), most heat will be stored in the upper layers and is redistributed to the atmosphere as the forcing subsides, before much can distribute deeper. For persistent forcing such as long term elevated CO2 concentrations, sufficient time is available so that the eventual distribution will entail storage of much more heat in the deep oceans than the upper layers. By the same token, an eventual reduction in forcing will be accompanied by a very long term release of heat from the deep oceans as described in the PNAS paper I referenced above. Ultimately, the release will asymptote toward zero over many centuries, and it is doubtful we are experiencing any effects from the termination of the last glaciation 11,000 years ago.

      You are correct that we have less data for the deep oceans than for the upper layers, but we have some that suggests that heat storage and its rate of accumulation may be faster than previously thought – see, e.g., Purkey and Johnson.

      Given the enormity of the deep ocean, a huge amount of heat can be stored with rather little temperature change. The deep ocean temperature during the Last Glacial Maximum was undoubtedly cooler than today, but it needn’t have been dramatically cooler, and it was certainly well above freezing. In fact, though, there is probably no simple “freezing point” for water at that pressure and salinity – rather water begins to develop ice crystals that squeeze out salt, leaving the remaining water even less susceptible to freezing, until finally a slushy “brine” forms, all of this well below zero degrees C. Mostly, however, as ice forms at whatever might be the freezing point for a particular depth, its buoyancy vis-a-vis the saltier liquid water in its vicinity brings it to the surface, so that the surface becomes ice-covered while the depths remain ice-free. We know that ocean surface ice was very extensive during glaciations, exemplifying this process.

      From a quantitative perspective, therefore, the contribution of deep ocean heat storage to ongoing temperature change will depend on the magnitude of the total heat added to the ocean, the length of the interval during which heat is added, and the length of time since its accrual terminated that deep ocean heat has had to re-equilibrate with the upper layers, and for those to release it to the atmosphere.
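The earlier point that "a huge amount of heat can be stored with rather little temperature change" can be put in round numbers. Everything here is a ballpark assumption for illustration: the ocean mass, the deep-layer share, the surface area, and the 1 W/m² imbalance are not measurements from this thread.

```python
# How much heat does a 0.1 K warming of the deep ocean represent,
# expressed as years of a 1 W/m^2 imbalance over the ocean surface?
OCEAN_MASS_KG = 1.4e21     # total ocean mass, ballpark
DEEP_FRACTION = 0.8        # rough share of mass below 700 m
C_SEAWATER = 4000.0        # J/(kg*K), approximate for seawater

deep_mass = OCEAN_MASS_KG * DEEP_FRACTION
heat_for_0p1K = deep_mass * C_SEAWATER * 0.1   # joules for a 0.1 K rise

OCEAN_AREA_M2 = 3.6e14     # ocean surface area, ballpark
SECONDS_PER_YEAR = 3.156e7
one_year_joules = 1.0 * OCEAN_AREA_M2 * SECONDS_PER_YEAR  # 1 W/m^2 for a year

years_needed = heat_for_0p1K / one_year_joules
print(round(years_needed))   # on the order of 40 years
```

Even a barely detectable 0.1 K change in the deep layer thus corresponds to decades of a 1 W/m² surface imbalance, which is why small deep-ocean temperature signals matter for the heat budget.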

      • Obviously the forcing has been persistent or we would have gone back into glaciation. To say once the forcing ceases is meaningless in this context since it is obvious it hasn’t. The question still stands: if thousands of years of considerably higher energy input to the earth hasn’t had a dramatic effect on the temperature of the deep oceans, then why would we expect that to suddenly change? If the change in temperature is 1-2 C and it takes thousands of years to occur, in what way will that have a noticeable impact on us? Wouldn’t we be more concerned about the next glaciation by then?

      • Maybe I wasn’t clear enough. Our interglacial temperatures are still active in determining the Earth’s heat budget, but we have long since had time for the atmosphere, ocean surface, and deep ocean to equilibrate, so that the increase in temperature during the deglaciation is no longer responsible for an imbalance left over from 11,000 years ago. During the time for equilibration to occur, the impact was very large (probably close to 10 deg C).

      • I’m a bit confused. Are you saying the deep oceans are 10C warmer now than they were then, or are you saying they warmed and then cooled? If you are saying they warmed and then cooled, why would they cool once equilibrium had been reached?

      • Ocean temperatures vary by depth and latitude. During glacial intervals, much more extensive latitudes were covered by ice at sub-freezing temperatures. A warming climate brought those temperatures up to the freezing point – probably a considerable increase over large areas of the globe – and then further added the heat of fusion needed to melt the ice (about 333 kJ per kg). Further warming raised surface water temperature in tropical and mid-latitudes, perhaps by 25 deg C above freezing or more in some areas, although little in areas where ice is still abundant. This almost certainly changed the vertical profile of heat distribution and the pattern of ocean circulation (e.g., the meridional overturning circulation). I don’t know how much deep ocean temperature changed, but its contribution was probably more in the form of slight warming of an increasingly rapid volume flow of water sinking to the depths, warming, and rising to the surface than in the form of a large temperature rise in the deep ocean itself. This would allow much of the heat reaching the surface to come from the deeper ocean without greatly raising the temperature of the latter.

        I expect that additional data on circulation patterns as well as appropriate models could sort this out further, but it’s reasonable to state that equilibration did not require temperature changes to be the same at all depths, nor for layers with a lesser temperature increase to have contributed less heat to the overall process.
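A back-of-envelope sketch of the latent-heat term mentioned above. The heat of fusion (about 333 kJ per kg) is the figure from the comment; the ice mass is purely illustrative, not an estimate of actual deglacial melt:

```python
# Latent heat needed to melt a given ice mass (illustrative numbers only).
L_FUSION = 333e3         # J/kg, heat of fusion of ice, as quoted above
ice_mass_kg = 1.0e19     # hypothetical ice mass -- NOT a deglaciation estimate

melt_energy_J = L_FUSION * ice_mass_kg
print(f"Melting {ice_mass_kg:.1e} kg of ice takes {melt_energy_J:.2e} J")
```

Even before any temperature rise, the melt term alone is very large, which is the point being made about glacial-interglacial heat budgets.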

      • Regarding my foregoing comment, the details of mechanisms underlying paleoclimatic heat exchange between layers are still poorly known. I’m less convinced we know exactly how the exchange of heat between deep and upper oceans occurred during deglaciations than that exchanges are significantly altered by changes in ocean patterns, so that the change in a layer’s temperature does not necessarily reflect its contribution to total heating.

      • For some data on glacial/interglacial deep ocean temperature transitions, see Deep Ocean Transitions. Changes tend to average about 2 deg C in the sampled locations.

      • Fred

        You make statements regarding what is happening in the deep ocean and seem to present them as definitively proven information, when in fact there is not much actual data available and the conclusions are little more than hypotheses. The papers you have referenced have very little factual information about what is happening in the deep ocean. The 1st link you referenced (by Hansen et al.) clearly did not have data to support the conclusion as written—the paper acknowledged that “they believed,” not that they had data to demonstrate their conclusions were accurate.

        The latest paper you reference is clearly advocating a position. It opens with “warming climate is unequivocal, with the global top of the atmosphere radiative imbalance currently on the order of 1 W m−2, very likely due to anthropogenic greenhouse gases”. When you actually look at the data and how it was collected, there is little in the paper that involves actual measurements, and the quality of those measurements is somewhat suspect.

        It just appears you are becoming more of an advocate than an analyst.

      • Fred
        I have to agree with Rob. This sounds more like conjecture than anything else. I am aware that proof of deep ocean warming is important for explaining the missing heat, but it needs to be backed up with something a bit more concrete in my view. I mean, how exactly do you measure minute changes in ocean temperature when currents are continuously shifting both vertically and laterally? It’s just not feasible in my view.

      • Rob and Rob – I suspect you didn’t carefully read the three papers I referenced regarding the deep ocean contribution. They were Solomon et al, Purkey and Johnson, and Cutler et al (with multiple additional references cited in each). Hansen was not among the deep ocean papers I cited. I believe if you go back and look at the actual data, you will find the dominant role of deep ocean heat storage to be demonstrated unequivocally, even if the exact values lie within a range of variability that needs to be resolved more precisely. You can also review my calculations that demonstrate that the deep ocean has far greater heat storage capacity than the upper layers. The Cutler et al paper showing a 2 C difference between glacial and interglacial temperatures reinforces these points, because if you multiply that change by the mass of the deep oceans, and the specific heat capacity of ocean water, you’ll find the change in heat content to be enormous.

        I would invite others as well to review the comments and read the references. Deep ocean heat storage is not one of the climate phenomena that is in serious doubt, but some of the exchange mechanisms among the deeper and shallower layers still need to be resolved better.
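The multiplication described above (a 2 C change times the mass of the deep oceans times the specific heat of seawater) can be sketched numerically. The ocean mass, deep-layer fraction, and specific heat below are round assumed values, not figures taken from the cited papers:

```python
# Rough heat-content change for a 2 C deep-ocean warming (assumed round numbers).
OCEAN_MASS_KG = 1.4e21     # approximate total mass of the oceans
DEEP_FRACTION = 0.8        # assumed fraction of that mass below the upper layer
SPECIFIC_HEAT = 3990.0     # J/(kg*K), approximate for seawater
DELTA_T = 2.0              # glacial-interglacial deep-ocean change, deg C

delta_q = OCEAN_MASS_KG * DEEP_FRACTION * SPECIFIC_HEAT * DELTA_T
print(f"Heat-content change: {delta_q:.2e} J")
```

Whatever the exact inputs, the product comes out on the order of 10^24–10^25 J, which is the “enormous” change in heat content referred to above.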

      • This OHC paper is also interesting.

      • Fred

        The following two papers that you referenced are the ones I point to as having little actual reliable data on the deep ocean.
        Fred Moolten | April 12, 2011 at 4:55 pm |

        Regarding the oceans, the PNAS paper I linked to and its references provide a source of relevant information.

        For one informative article on this, see Persistence of Climate Changes. In spite of what you wrote, that paper was edited by James Hansen. (I do not think that makes it a bad paper, however; it just has zero data about the deep ocean, only speculation.)

        Fred Moolten | April 14, 2011 at 2:34 pm |
        For some data on glacial/interglacial deep ocean temperature transitions, see Deep Ocean Transitions. Changes tend to average about 2 deg C in the sampled locations.
        This paper actually had data, but when I read it and how the temperature information was taken, I was certainly not confident of the conclusions. Certainly not enough to be as definitive as you seem to be trying to be about what you believe is happening in the deep oceans.

      • JCH – I have not yet read the paper, but I went to their website to review how they collected the data and it looks good.

      • JCH – Thanks for the OHC reference.

        Rob Starkey – The papers I cited referred to other studies as well, including studies on rates of deep ocean heat transfer and temperature changes. It’s impossible to sample every location globally, but enough different sites have yielded roughly similar results for us to define a range of variation consistent with the principles I’ve tried to outline above.

      • Fred, I have no particular problem with the 2C estimate of deep ocean warming you are using for the change since the last glaciation. There are multiple papers that reinforce that number. What I am much more curious about is what it would mean for us if the deep oceans warmed by 2C. I would also note that a 2C rise would mean the deep ocean warming as much between now and some future date as it did between the last glaciation and now. It is much more likely, even using the IPCC numbers that I think are way too pessimistic, that the deep ocean warming would be closer to 1C. But I am curious: what difference do you think a 2C change in the temperature of the deep oceans would make to the surface?

      • Steven – I would answer your question by saying there are some things we know rather well and others we can only speculate about. We know that a 2 C change (or even a 1 C change) would signify a very large increase in ocean heat content. It is more speculative, but if a 2 C deep ocean change was associated in the past with a much larger surface and atmospheric change as was noted between glacial and interglacial conditions, that might be the expected outcome in the future as well. The only alternative, I suppose, would be a mechanism coupling the deep ocean with the atmosphere that is very different from anything we know happened in the past.

      • Regarding the paper you referenced, I have only had a chance to skim it so far but it seems to rely on SLR attribution to a considerable extent. I enjoy discussing the limitations of SLR attributions and wouldn’t place much confidence in temperatures derived in that manner. I consider the trend of deep ocean warming after an increase of forcing over thousands of years to be a more reliable indicator of the process we should expect.

      • Fred,
        Please give us the mechanism to move this forcing from the CO2 into the deep oceans.
        TIA,

  78. Fred, so would you agree that, to the best of our knowledge, the energy that goes to the deep oceans is effectively sequestered until such time as the water above it cools, and is unlikely to affect the surface prior to the next glaciation?

    • Not at all, Steven. Coupling between atmosphere, upper ocean, and deep ocean is constant – there is no sequestering, but rather a difference in rates of change. As an example, if forcing warms the upper ocean, it will initially transfer heat rapidly to the deep ocean, but as the latter gradually warms, the transfer will slow, although continuing in a warming direction. This is what takes a long time, but it is still in the warming direction as long as equilibration isn’t reached (e.g., for centuries, but at a declining rate that becomes very small with time). The reverse happens with cooling – the deep ocean moderates it, but at a rate that slows over a long interval. It is only with transient forcings that we might expect the upper and deeper layers to temporarily change in different directions. With persistent forcings, the upper layer will move to a point not too distant from equilibrium within a decade or two, while the deep ocean moves in the same direction over the course of centuries, so that final equilibration involving both layers is a very prolonged process.

      • Fred, if the energy is still there it doesn’t have to be exactly the same energy. The net transfer of energy will always be downward as long as the surface is warming, and only upward once the surface starts to cool, wouldn’t it? Thus it is sequestered in the sense that that amount of energy will not influence the surface until such time as we may be happy it is there.

      • Steven, the influence on temperature is always bidirectional, regardless of net flux (and more than one net flux is involved). As the deep ocean is warmed, the gradient between it and the upper ocean declines, so that a given influx of heat to the upper ocean will now raise its temperature more than otherwise in response to a warming atmosphere. In other words, if a flux imbalance exists at the top of the atmosphere due to CO2 increases, everything below will warm, but at different rates.

        Actually, for simplicity, and neglecting heterogeneities, these fluxes could all be written into a model as a series of differential equations relating warming (or cooling) rates at any instant to differences between the atmosphere, upper ocean, and deep ocean, but each will be changing until equilibrium is reached. There is a moderating and slowing influence of deep ocean time constants, but no sequestering.
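The differential-equation description alluded to above can be sketched as a minimal two-box model (upper ocean and deep ocean, forced from above). All coefficients below are illustrative, chosen only to show the qualitative behaviour, not tuned to the real climate system:

```python
# A minimal two-box sketch of the coupled upper/deep ocean system described
# above: the upper box relaxes toward radiative balance under the forcing,
# and the deep box relaxes toward the upper box. All constants are arbitrary.
def run_two_box(forcing, years, dt=0.1,
                c_upper=10.0, c_deep=100.0,   # heat capacities (arbitrary units)
                lam=1.0,                      # radiative restoring of upper box
                gamma=0.5):                   # upper-deep exchange coefficient
    t_u, t_d = 0.0, 0.0                       # temperature anomalies
    for _ in range(int(years / dt)):
        dtu = (forcing - lam * t_u - gamma * (t_u - t_d)) / c_upper
        dtd = gamma * (t_u - t_d) / c_deep
        t_u += dtu * dt
        t_d += dtd * dt
    return t_u, t_d

print(run_two_box(forcing=1.0, years=50))    # upper box well ahead of deep box
print(run_two_box(forcing=1.0, years=1000))  # both near equilibrium (forcing/lam)
```

With a persistent forcing both boxes warm toward the same equilibrium, the upper box within decades and the deep box over centuries, which is the “moderating and slowing influence of deep ocean time constants” described above.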

      • Fred, the SSTs went up more since glaciation than the deep ocean temperatures did. If we are to expect these to come closer in the amount of change they experience, then you are arguing the warming is still a process of deglaciation.

      • I believe I already addressed the notion that not all layers will warm equally at equilibration. Please see my earlier comments.

      • You said the gradient would become less. If the SSTs have warmed more than the deep ocean temperatures, then there is more of a gradient from deglaciation, isn’t there?

      • Let me say this a different way so you may understand my confusion. The gradient from deglaciation increased substantially. If this is what we would expect to happen then the warming deep oceans will not increase the warming of the surface at all. If the deep oceans should warm to where the gradient is roughly equal to where it was before the forcing was applied then there is considerable warming of the deep oceans yet to be realised from deglaciation. I don’t see how it can be both.

      • Steven – I understand your confusion. I think one of the reasons it’s hard (for me at least) to give an accurate verbal description is that all the relationships interact in a way that is best described mathematically by differential equations. The best I can do is a very general statement that in response to a forcing (e.g. an increase in CO2 or solar irradiance), the atmosphere, upper ocean, and deep ocean would all keep warming until a new equilibrium is reached at a higher temperature for each, with the deep ocean and its large heat capacity being the slow step that delays equilibrium for a long time, and the atmosphere responding fastest.

        Regarding gradients, I probably misled you by oversimplifying. Yes, it’s true that if the upper ocean starts heating at a rate that puts it into an imbalance with the deeper ocean, a gradient will ensue that would tend to reduce itself through net downward heat transfer if unperturbed, but even as that occurs, opposing tendencies begin to operate – in particular, convection and other forms of mixing will lead to a new vertical profile with the upper ocean temperature rising more than that of the lower ocean from heat that was acquired at depth but which moved upward (without convection to move heat upward, the ocean’s temperature profile would be radically different). Given the heterogeneity of processes in the ocean in different regions and at different latitudes, as well as the effects of ocean currents, I’m not sure the timing, location, and other details of the heat redistribution phenomena are as well understood as the final results, but in any case, a combination of observational data with fairly sophisticated models is needed to balance all the processes. (In geophysics, it’s not unusual for an initial state and final equilibrium state to be easier to estimate than the exact pathways in between).

        If that doesn’t satisfy you, it’s probably because our knowledge (or at least my knowledge) of many details remains unsatisfactory.

      • Fred, I do appreciate the time and effort you put towards our conversation. I am going to stick with being confused a while longer I think.

      • Fred, I now get the impression that you often reason from equilibrium modeling. I would argue that that is inappropriate for the climate, which is a far from equilibrium system, and is also subject to frequent perturbation. Even worse it is misleading, one of the primary sources of the “dangerous AGW” scare. I thought this was well known. Perhaps not.

      • So no sequestering means what to the pipeline analogy?

      • The word “pipeline” is misleading. If CO2 emissions ceased, with no other changes, CO2 concentrations would drop, and there would be almost no further warming. The “pipeline” refers to warming expected if CO2 concentrations were to be kept constant, so as to continue to exert a forcing at the top of the atmosphere. In that case, the atmosphere, upper ocean, and lower ocean would continue to warm until equilibrium was reached. “Pipeline” does not refer to some mysterious place where heat is being stored without yet changing any temperatures.

      • We must insist that warming or cooling, which can only occur as a result of an energy imbalance, is instantaneous. If greenhouse gases were constant, the temp would not increase.

        Ein/s − Eout/s = d(global heat content)/dt

      • That is – if you don’t change the radiative flux with constant gases the global temperature doesn’t change.
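The balance equation above can be written directly as a sketch: any top-of-atmosphere imbalance, integrated over time and the Earth’s surface area, is the change in global heat content, and zero imbalance means no change. The surface area and seconds-per-year values are standard approximations:

```python
# dH/dt = Ein - Eout, integrated over time (assumed constant imbalance).
def heat_content_change(imbalance_w_m2, years, area_m2=5.1e14):
    """Joules accumulated by a constant TOA imbalance over the given period."""
    seconds = years * 3.156e7          # seconds per year, approximate
    return imbalance_w_m2 * area_m2 * seconds

print(heat_content_change(0.0, 10))    # no imbalance -> no heat gain
print(heat_content_change(0.85, 10))   # ~0.85 W/m^2 for a decade, in joules
```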

      • Fred,
        Yet ‘pipeline’ is a standard AGW argument prop to explain why the things predicted decline to cooperate.

      • Fred Moolten

        You refer to the long equilibrium time, i.e. the “hidden in the pipeline” hypothesis.

        Let’s talk a bit more about this postulation.

        Hansen et al. describe this here:
        http://www.sciencemag.org/content/308/5727/1431.full.pdf

        We infer from the consistency of observed and modeled planetary energy gains that the forcing still driving climate change, i.e., the forcing not yet responded to, averaged ~0.75 W/m2 in the past decade and was ~0.85 ± 0.15 W/m2 in 2003 (Fig. 1C). This imbalance is consistent with the total forcing of ~1.8 W/m2 relative to that in 1880 and climate sensitivity of ~2/3°C per W/m2. The observed 1880 to 2003 global warming is 0.6° to 0.7°C, which is the full response to nearly 1 W/m2 of forcing. Of the 1.8 W/m2 forcing, 0.85 W/m2 remains, i.e., additional global warming of 0.85 x 0.67 ~ 0.6°C is “in the pipeline” and will occur in the future even if atmospheric composition and other climate forcings remain fixed at today’s values.

        But how did Hansen et al. arrive at this 0.85 W/m2?

        Summarizing Hansen’s determination of the 0.85 W/m2 figure: Total forcing (1880-2003) is assumed to be 1.8 W/m2 (incl. 0.2 from natural forcing = solar) and observed warming was 0.6-0.7 degC. Assumed climate response is 2/3degC per W/m2 (equivalent to an assumed 2xCO2 climate sensitivity of 3 degC), therefore 0.65 degC warming is the response to ~1W/m^2. But since theoretical forcing was 1.8 W/m2, this leaves 0.8 W/m2 still hidden “in the pipeline”.

        Checking Hansen’s logic, it is “circular”. He starts out with an assumed CO2 climate sensitivity, then calculates how much warming we should have seen 1880-2003 based on this assumed climate sensitivity. This calculates out at 1.2 degC. He then ascertains that the actual observed warming was only 0.65 degC. From this he does not conclude that his assumed climate sensitivity is exaggerated, but deduces that the difference of 0.55 degC is still hidden somewhere “in the pipeline”. Using his 2/3 degC per W/m2, he calculates a net “hidden” forcing = 0.82 W/m2, which he then rounds up to 0.85 W/m2.

        Checking Hansen’s arithmetic: The theoretical GH forcing from 1880-2003 is 5.35 * ln(378/285) = 1.5 W/m2; adding in 0.2 W/m2 for solar = 1.7 (not 1.8). Using Hansen’s figure of 2/3 degC per W/m2 puts theoretical warming at 1.1 degC. Observed warming was 0.65 degC, leaving 0.45 degC hidden “in the pipeline”. This equates to an “energy imbalance” of 0.45/0.6667 = 0.68 W/m2 (not 0.85), all things being equal.
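The arithmetic check above can be reproduced step by step, rounding where the comment rounds. The 5.35·ln(C/C0) forcing formula and the 2/3 degC per W/m2 sensitivity are the figures quoted in the comment; everything else follows from them:

```python
import math

# Reproducing the comment's arithmetic check of the "pipeline" figure.
co2_forcing = 5.35 * math.log(378 / 285)            # ~1.51 W/m^2 (1880-2003)
total_forcing = co2_forcing + 0.2                   # + 0.2 solar -> ~1.7 W/m^2
sensitivity = 2 / 3                                 # degC per W/m^2, as quoted
theoretical_warming = round(total_forcing * sensitivity, 1)   # ~1.1 degC
observed_warming = 0.65
in_pipeline = theoretical_warming - observed_warming          # ~0.45 degC
implied_imbalance = in_pipeline / sensitivity                 # ~0.68 W/m^2

print(f"{total_forcing:.1f} W/m^2 forcing -> {in_pipeline:.2f} degC "
      f"in the pipeline -> {implied_imbalance:.2f} W/m^2 imbalance")
```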

        But all things are not equal. Several solar studies (Geerts + Linacre 1997, Dietze 1999, Lockwood + Stamper 1999, Shaviv + Veizer 2003, Solanki et al. 2004, Scafetta + West 2006) show that 0.35 degC warming (on average) can be attributed to the unusually high level of solar activity over the 20th century (highest in several thousand years), although the exact mechanism for this empirically observed warming has not yet been determined. Let us assume that this covers the same 1880-2003 period cited by Hansen. Much of this occurred during the early 20th century warming period from around 1910 to around 1944, which cannot be explained by AGW alone. This leaves 0.3 degC observed non-solar warming (1880-2003). If we assume that the solar warming occurs without a long equilibrium time delay, but that 40% of the theoretical GH warming over this long period is still hidden “in the pipeline”, we have 0.3 + 0.2 = 0.5 degC equilibrium GH warming 1880-2003 with an “imbalance hidden in the pipeline” of 0.2/.6667 = 0.3 W/m2 (instead of 0.85).

        In addition to the solar studies, there are many observed natural factors that have caused warming. Notable among these are swings in the ENSO, which were partially responsible for many high temperatures in the 1990s, including most notably the all-time record high in 1998. The current cooling after 2000 is being attributed to these natural factors (called “natural variability” by Met Office), despite the fact that all models predicted record warming as a result of record increases in atmospheric CO2 concentration. Whether or not swings in ENSO, PDO, etc. are linked to changes in solar activity is unknown. But, in any case it is wrong to simply ignore these natural factors, as Hansen has done, and assume that essentially all warming 1880-2003 was caused by AGW.

        So much for the theory. Now let’s check the actual physical observations.

        Hansen’s assumed “pipeline” is the upper ocean. This is where the “hidden” energy is assumed to be “hiding”. When this study was published (data to around 2003) it appeared that the upper ocean was warming (Josh Willis, team leader of the ocean temperature monitoring group, is one of the co-authors). Since then the old, relatively unreliable XBT measurement devices, which introduced a warming bias, according to Willis, have been replaced by much more reliable Argo sensors. The record from 2003 to 2008 shows that the upper ocean has not warmed, but cooled instead.
        http://www.friendsofscience.org/assets/documents/OceanCoolingE&E.pdf

        This presents a real dilemma for the “hidden in the pipeline” postulation.

        Atmospheric temperature (HadCRUT) has not warmed after 2000.

        The amount of latent heat from melting ice or water evaporation is insignificant.

        So where is this “hidden energy”?

        It is obvious that if it cannot be found anywhere in our planet’s climate system, it has either been radiated out into space or just does not exist.

        Of course, it could have been transferred to the deep ocean by exchange with the upper ocean, but this is highly unlikely, and if it did go to the vast deep ocean causing a few thousandths of a degree warming it will never come back to haunt us.

        In either case it is not “hidden in the pipeline” waiting to cause even more global warming as postulated by Hansen et al., so we can safely forget about it.

        Max

      • Max,
        There seems to be some confusion on the nature of the pipeline.

        The pipeline discussed in the first excerpt from Hansen’s article refers to the delay in the warming. In that consideration no energy has entered the pipeline; the only thing we have in the pipeline is power. The difference given by Hansen as 0.85 W/m^2 remains because the Earth has not yet reached the equilibrium temperature that corresponds to the change in forcing.

        My observation doesn’t affect most of your calculations, as they are based on the same definition of the pipeline, but everything you write about the observations in the last part of your comment, after “So much for the theory. Now let’s check the actual physical observations.”, corresponds to a totally different issue. That discussion refers to some energy that might have gone to the oceans but is not visible in the data. The “pipeline” of Hansen doesn’t refer to such storage of energy. Actually the lack of such stored energy tells us rather that there is still room for further flow of energy into the oceans before the full warming potential is reached. In this sense the lack of energy is evidence for more warming in the pipeline, although the logic can also be inverted when some additional factors are included.

      • Pekka,

        I have been wondering about the so-called energy imbalance. Do we add cold CO2 to the atmosphere, which then absorbs more heat radiating from the surface and elsewhere and results in an energy imbalance at TOA? The delay in warming happens because of the delay in reaching the equilibrium temperature as the molecules heat up?

        It seems more physically plausible that CO2 is created in a combustion process mostly at ground level. These molecules start off as warm as they are ever likely to be in the atmosphere and lose heat after that. The energy in this case comes from a spontaneous decrease in enthalpy in the combination of oxygen and carbon. This of course follows an endothermic reaction in plants in the dissociation of oxygen and carbon. The energy input is sunlight of course.

        It is this aspect of energy ‘stored’ in organic material that provides the initial impetus to warming. A new equilibrium at a higher atmospheric temperature is then established – but how can there be a delay?

      • “Checking Hansen’s logic, it is “circular”. He starts out with an assumed CO2 climate sensitivity, then calculates how much warming we should have seen 1880-2003 based on this assumed climate sensitivity. This calculates out at 1.2 degC. He then ascertains that the actual observed warming was only 0.65 degC. From this he does not conclude that his assumed climate sensitivity is exaggerated, but deduces that the difference of 0.55 degC is still hidden somewhere “in the pipeline”. Using his 2/3 degC per W/m2, he calculates a net “hidden” forcing = 0.82 W/m2, which he then rounds up to 0.85 W/m2.”

        Max – I believe you misunderstood Hansen’s reasoning. He did not calculate how much warming “we should have seen”. He was aware that climate sensitivity refers to an equilibrium temperature response approached asymptotically over centuries, and that the observed warming could not have reached the equilibrium response, and so he was not comparing observations with expectations, but rather calculating how much further temperature would rise to reach equilibrium if nothing else changed, including atmospheric CO2 concentrations.

        I italicized this point because it illustrates the confusion about the nature of the “pipeline”. If CO2 did not change, Hansen estimated a 0.85 W/m^2 imbalance remaining from the unchanged forcing. The “pipeline” is that remaining imbalance. It is not heat hidden in the oceans or anywhere else in the climate system, but rather the future heat that would accumulate if the forcing remained unchanged. If the forcing were eliminated by halting all further CO2 emissions, the pipeline would quickly disappear, and temperature, after a brief rise, would actually begin to fall – none of this would be inconsistent with Hansen’s interpretation, nor does the comparison between observed and expected temperature change indicate exaggerations on his part.

      • Fred, don’t you think that Hansen’s view that there is 0.5 to 0.6C heating in the pipeline makes it very unlikely the heat accumulation in the oceans could stop even if all known natural variability factors were negative, which they aren’t? I suppose you could say it’s happened before but I thought that was a skeptics line.

      • Steven – Perhaps I misunderstand your question, but the “pipeline” refers to future heating to be expected if CO2 concentrations and thus CO2 forcing persisted unchanged – it does not refer to energy currently in the climate system. The climate warms or cools depending on the direction of the imbalance between incoming and outgoing energy. If the imbalance shifted so as to make outgoing energy exceed incoming, the climate would start to cool, including net loss of heat from the oceans. The process would begin immediately, but the thermal inertia of the oceans – particularly the deep ocean – would prolong the time needed to fully restore a balance. A cessation of CO2 emissions, by permitting atmospheric CO2 levels to drop rather than remain constant, would initiate such a process, because pre-industrial CO2 levels are not in balance with current climate temperatures.

      • Fred, I am not confused about what he means. I have always separated the ocean lag question from the heat-in-the-pipeline question. The fact remains that in order for a positive energy imbalance to not warm the oceans, it has to be offset by something. If you are going to claim the energy imbalance being offset is enough to warm the world by 0.5–0.6C, then you have a lot of offsetting to do.

      • Actually it is possible I was confused. It is quite possible the news release was including future CO2 emissions in the calculations while referring to it as the current energy imbalance. Not a mistake of wording I would expect from a NASA article, but one that would be reasonable within the realm of human error. http://earthobservatory.nasa.gov/Features/HeatBucket/heatbucket4.php

      • A significant drop in CO2 levels would convert a positive imbalance into a negative one capable of cooling the oceans. The cooling would start the instant the imbalance shifted to negative but would take a long time to restore a new balance. The net forcing change in W/m^2 need not be enormous for this to happen as long as the change is persistent.

        There’s an interesting small point regarding this, however. If the upper and deep oceans were in a state of warming such that the deep ocean was not yet in balance with the upper ocean (i.e., more deep ocean heating would be needed for the two to equilibrate), then an abrupt shift to a cooling state of the atmosphere could be followed transiently by a circumstance in which the upper ocean starts to cool but is still warm enough for a net transfer of heat downward, thereby delaying the start of deep ocean cooling. During this transient state, total ocean net heat loss would be occurring, but the deep ocean would be gaining heat.
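The transient described above (total ocean heat loss while the deep ocean is still gaining) can be demonstrated with a simple two-box sketch using illustrative constants, not a fitted model: warm under a constant forcing long enough for the deep box to lag well behind, then switch the forcing off:

```python
# One Euler step of a two-box upper/deep ocean sketch (arbitrary constants).
def step(t_u, t_d, forcing, dt=0.1, c_u=10.0, c_d=100.0, lam=1.0, g=0.5):
    t_u += (forcing - lam * t_u - g * (t_u - t_d)) / c_u * dt
    t_d += g * (t_u - t_d) / c_d * dt
    return t_u, t_d

t_u = t_d = 0.0
for _ in range(500):                 # 50 years with the forcing on
    t_u, t_d = step(t_u, t_d, 1.0)
u_before, d_before = t_u, t_d
for _ in range(50):                  # 5 years after the forcing is removed
    t_u, t_d = step(t_u, t_d, 0.0)

print(t_u < u_before)   # the upper box has begun to cool
print(t_d > d_before)   # the deep box is still gaining heat
```

Because the upper box is still warmer than the deep box when the forcing is removed, net downward transfer continues for a while even as the upper box cools.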

      • Steven – I think the statement by Willis in the NASA article is a bit misleading. An interesting article in Nature Geoscience last year provided a more accurate description. It’s behind a paywall, but RC reproduces their curve showing that the “pipeline”, which they call “climate commitment” is based on the assumption of a future with unchanging CO2, and that the climate is not actually committed to anything if we miraculously stopped emitting CO2 – see Climate Commitment. That extra heat is not yet in the system.

      • Yes, I remember that posting from when I still read real climate.

      • Fred

        I think you have misunderstood.

        I simply pointed out that Hansen’s estimated “hidden in the pipeline” warming estimate was based on circular logic.

        First, he assumes an equilibrium warming (1880-2003) based on an assumed 2xCO2 CS of 3C; second, he sees that this theoretical number has not yet been reached, so third, he concludes that the “missing” warming is “still in the pipeline”.

        Let’s say he had assumed a theoretical equilibrium warming (1880-2003) of 0.65C rather than 1.2C. He would then have concluded that “equilibrium” had been reached, since all the assumed warming to be reached at equilibrium had already been observed.

        Quite simple, actually.

        Max

      • Fred Moolten

        So that you can better understand Hansen’s “hidden in the pipeline” postulation, here are some excerpts from the 2005 paper he coauthored with several others, including Josh Willis.

        Our climate model, driven mainly by increasing human-made greenhouse gases and aerosols, among other forcings, calculates that Earth is now absorbing 0.85 ± 0.15 watts per square meter more energy from the Sun than it is emitting to space. This imbalance is confirmed by precise measurements of increasing ocean heat content over the past 10 years. Implications include (i) the expectation of additional global warming of about 0.6°C without further change of atmospheric composition…

        Improved ocean temperature measurements in the past decade, along with high-precision satellite altimetry measurements of the ocean surface, permit an indirect but precise quantification of Earth’s energy imbalance. We compare observed ocean heat storage with simulations of global climate change driven by estimated climate forcings, thus obtaining a check on the climate model’s ability to simulate the planetary energy imbalance.

        The lag in the climate response to a forcing is a sensitive function of equilibrium climate sensitivity, varying approximately as the square of the sensitivity (1), and it depends on the rate of heat exchange between the ocean’s surface mixed layer and the deeper ocean (2–4).

        The lag could be as short as a decade, if climate sensitivity is as small as 0.25°C per W/m2 of forcing, but it is a century or longer if climate sensitivity is 1°C per W/m2 or larger (1, 3). Evidence from Earth’s history (3–6) and climate models (7) suggests that climate sensitivity is 0.75° ± 0.25°C per W/m2, implying that 25 to 50 years are needed for Earth’s surface temperature to reach 60% of its equilibrium response (1).

        Ocean heat storage. Confirmation of the planetary energy imbalance can be obtained by measuring the heat content of the ocean, which must be the principal reservoir for excess energy…

        Hope this helps.

        Max

      • Max – I don’t think it helps, because it repeats comments I’ve already made and principles I’ve already alluded to. Regarding your mention of equilibrium above, it is inconceivable that Hansen, who is familiar with climate dynamics, would have concluded that “equilibrium had been reached”. I also discussed this in some detail in earlier comments.

      • Isaac Held’s latest blog item on the Recalcitrant Component of Global Warming includes relevant insights into the “pipeline” or “commitment” principle.

      • Fred

        Thanks for link to Held comments on “pipeline”.

        This is all computer model stuff. Doesn’t answer my point about Hansen’s circular logic in arriving at this postulation in the first place. Nor does it discuss the role of the upper ocean, to which H. makes reference in his original paper.

        Sorry.

        Max

      • Max – See above for my discussion of your original point, as well as the role of the upper ocean. Others can review the exchanges as well if they are interested.

        I believe Isaac Held’s blog is one of the more promising developments to emerge in the climate science blogosphere in recent months or even years. It is conducted at a high level of expertise, is not contentious, and avoids conclusions that go beyond what the evidence will support. You may disagree, but I would again suggest that interested readers visit the site to judge whether his use of model simulations, with relevant comparisons to observed data, is informative. I have found it so.

        I was particularly interested in his earlier item on Transient Responses to Greenhouse Gases, because in a world in which the concentration of CO2 is rising, we will not be at equilibrium, and so transient response calculations are often more useful than calculations of equilibrium temperatures.

    • Why would this be? There is a thermal transition – the thermocline. But warm water rises because of buoyancy. What is to stop warm water that has downwelled to the abyss from moving by convection again to the surface layer?

      Downwelling of warmth is one of those things we know little about – certainly not something we can pull out reliable numbers on. But assuming it happens – as in the von Schuckmann et al 2009 paper – there is a slight increase in temperature to 2000m in ARGO data, as opposed to no increase in the other analyses of ARGO data to 700m. Is there a mystery there? I think so.

      But regardless – the atmosphere has much less heat capacity than the oceans. Any heat gained from an energy imbalance is stored mostly in the ocean. The corollary is that any heat lost by the planet as a result of an energy imbalance is mostly lost from the oceans.

      It makes much more sense to think of the oceans and atmosphere as a single system. Ocean surface air temperatures are, for instance, taken as equivalent to the sea surface temperature. The oceans warm and cool at the same time as the atmosphere. Warm water floats on top and loses energy from less than the top 0.5mm. The ocean is thermally stratified – which is no barrier to warm water in the abyss floating to the surface layer.

      ‘Heating in the pipeline’ is a misconceived idea. If the net energy balance at TOA turns negative – planetary cooling – the atmosphere does not continue to warm from the oceans but both lose energy and cool.

      • Yes, it’s a mystery. Even Roger Pielke Sr. didn’t see it go down there!

        Unpublished material and Bob Tisdale – currently show ever so slight warming in the 0-to-700-meter layer, but the current La Nina will probably knock that flat to negative.

        I guess I’m simplifying it too much, but if additional GHGs slow the rate at which the oceans cool, then the Mermaids have more time to seduce innocent young hot guys to the depths below. I believe this is an age-old problem with the seas and it explains everything.

      • It must be somewhere according to the CERES data. The changes there were mostly in the SW and as a result of ENSO. The current La Nina will certainly change that.

        If I were looking for an explanation for the lack of ocean temperature increases to 700m and lack of surface warming – I would start with the increase in cloud in the Pacific in the late 1990s. Why a slight energy imbalance in the SW since 2000 should result in warming to 2000m is a mystery.

      • And well yes hot young guys and mermaids is a theory.

      • Chief,
        Until the AGW fanatics can explain how energy from CO2 in the infrared gets down below 700 meters, the deep ocean heating is more of a theological argument than a scientific explanation.

      • hunter

        Even more mysteriously: how did this heat get down below 700 meters without first being detected in the top 700 meters?

        Have we got a miracle here?

        Max

      • It is wrong to think of the ocean being warmed by CO2 in the atmosphere. The net IR is clearly from the ocean to the atmosphere. The flux of energy is from the Sun etc…

        There is a single reason for thinking that the ocean/atmosphere system has warmed a little since 2000. The CERES data that is shown here.

        It shows net warming in the period – mostly in the shortwave. So where is the missing energy? Apparently not in the top 700m of the oceans. But the von Schuckmann analysis, for instance, using ARGO data shows some warming.

        This is very good data and data always trumps simplistic rationalisation in a ‘superficially scientific idiom’ – :roll:

      • Water evaporates on its journey to the poles. Salty water carries some heat into the deep ocean theoretically. Although the amount is clearly variable – it seems to show up in ARGO data.

        The ocean’s layers are of course thermal in origin. Warm water floats on cold – no deep mystery. There is no barrier between the heat in the deep ocean and that at the surface. That we have heat floating about in the deep ocean for 1000 years seems a bit apochyphal.

      • “a bit apochyphal.” ??

        Arr, arr, arr!
        Right after the “aitch”. Then remove the “aitch”.
        “Apocryphal”.

        Thar! Oil fiksed.

        ;)

      • Fair enough – :oops: – but do you know what it means?

      • Yup; fabulation of End of Days mythology. Such as the Apocrypha, with monster angels jetting about on horses propelled with flaming flatulence, or SLT.

      • It could be argued that the deep ocean, whilst still colder than the surface layer, is slightly warmer than it would have been.
        That’s entirely plausible – the same amount of energy it would take to warm the top 500m by 1C would warm the bottom 5000m by only 0.1C, which may not be detectable.
        However, if that were true then that heat energy is effectively lost – it will stay in the depths and never find its way back to the surface.

      • Peter – Any imbalance would restore itself upward mainly through convection, and the huge heat capacity of the deep ocean tells us that even a 0.1 C imbalance has the potential to move a very large amount of heat to the upper ocean, from which atmospheric warming would ensue. You are right, though, in implying that the process would not run to completion for centuries, although the initial phases would be faster than the final ones.

      • Only if the upper layers are colder than the lower layers will convection take place.
        I was talking about a hypotyrtical scenarion where the bottom layers could be slightly warmer than they might have been, but still colder than the upper layers.

      • …even ‘hypothetical scenario’

      • My reply was accidentally posted below rather than as a response to your comment. Convection does not require the deeper layer to be warmer than the upper layer.

      • Peter317

        Just some quick “sanity check” numbers:

        All the optimistically estimated fossil fuels on this planet contain just enough carbon to get the atmospheric CO2 level almost to 1,000 ppmv when they have all been used up. That’s it. That’s all there is.

        C1 = today’s CO2 concentration = 390 ppmv
        C2 = absolute maximum CO2 concentration = 1000 ppmv
        C2/C1 = 2.564
        ln(C2/C1) = 0.9416
        ln 2 = 0.6931
        dT(2xCO2) = 3.2°C (according to IPCC)
        dT to 1000 ppmv = 3.2 * 0.9416 / 0.6931 = 4.3°C
        [Equals absolute maximum atmospheric warming we would theoretically ever reach from fossil fuel combustion, using the IPCC model-based estimate of 2xCO2 climate sensitivity of 3.2°C]

        If all of this energy went into the upper ocean (700 m), how much would it warm?

        Answer: approx. 0.02°C

        And if it all went to the deep ocean, how much would this warm?

        Answer: approx 0.005°C

        So I do not believe that we are going to see any perceptible warming of the ocean from anthropogenic greenhouse warming.

        Max

        PS The fish are safe!
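        The logarithmic scaling used in the comment above can be checked in a few lines. This is a sketch of the arithmetic only, not an endorsement of the inputs; the 3.2°C 2xCO2 sensitivity is the IPCC figure quoted there:

```python
import math

def warming_from_co2(c1_ppmv, c2_ppmv, sensitivity_2x=3.2):
    """Warming (deg C) for a CO2 rise from c1 to c2, assuming the
    standard logarithmic forcing response scaled by the 2xCO2 sensitivity."""
    return sensitivity_2x * math.log(c2_ppmv / c1_ppmv) / math.log(2)

# Reproduces the figures used in the thread:
print(round(warming_from_co2(390, 1000), 1))  # 4.3
print(round(warming_from_co2(390, 915), 1))   # 3.9
```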

      • The estimate that you present on the total amount of fossil fuels is controversial. I picked another from a 1997 paper by H-H. Rogner (Ann. Rev. Energy Environ. 1997, 22:217-262). Based on its values for fossil fuel resources the ultimate amount of carbon to be released is around 5300 Gt-C. The lowest estimates have usually been obtained using Hubbert type analysis, which I consider not applicable to the estimate of ultimate resources, when the resource base is as diverse as it is for coal and for unconventional oil and gas.

        As a quick test I used my simplistic model of persistence of CO2 in the atmosphere and let the emissions grow 2% annually until all fossil fuels have been used, which happens in 2135 in this model. At that point the CO2-concentration is 1700 ppm.

        This model is in many ways unrealistic. The most notable error concerns the growth of emissions, which could not continue exponentially even close to the moment, when the coal resources have been exhausted. This would make the maximum significantly lower. Another major error concerns the persistence model, which is valid only at relatively low concentrations. The removal of the CO2 from the atmosphere would slow down very significantly at higher concentrations as the pathways for removal get saturated. These two most important errors have opposite effects. I believe that my estimate of a maximum attainable concentration at 1700 ppm is closer to the truth than 1000 ppm.

        Whichever estimate is closer to the truth, it’s certain that emissions cannot grow forever, and that the warming through anthropogenic influence will also end at some point. Many processes are slow taking centuries, but my view is still that the AGW is a problem of next 100-200 years, after 200 y the warming has most probably turned to gradual cooling (unless some natural warming processes take over). At that time human societies have also adapted to the new conditions stopping the accrual of additional damage.
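        A toy version of the kind of model Pekka describes might look like the following. It is purely illustrative and not his actual persistence model: the base-year emissions (9 GtC/yr) and the flat 50% airborne fraction are assumptions standing in for his more detailed treatment; only the 2% annual growth rate and the ~5300 GtC ultimate resource come from his comment.

```python
# Exponential emissions growth until a fixed carbon budget is exhausted,
# with a constant airborne fraction as a crude stand-in for a persistence
# model. All labelled assumptions are mine, not Pekka's.
GTC_PER_PPMV = 2.13      # approx. atmospheric carbon (GtC) per ppmv of CO2
budget_gtc = 5300.0      # Rogner-based ultimate fossil carbon (from the comment)
emissions = 9.0          # GtC/yr, rough present-day value (assumption)
growth = 1.02            # 2% annual growth, as in the comment
airborne_fraction = 0.5  # crude stand-in for the persistence model (assumption)

year, cumulative, ppmv = 2011, 0.0, 390.0
while cumulative < budget_gtc:
    burned = min(emissions, budget_gtc - cumulative)  # last year burns the remainder
    cumulative += burned
    ppmv += airborne_fraction * burned / GTC_PER_PPMV
    emissions *= growth
    year += 1

print(year, round(ppmv))  # exhaustion year and peak concentration
```

Under these assumptions the budget runs out around 2140 at roughly 1600 ppm – in the same ballpark as Pekka's 2135 and 1700 ppm, which is all a sketch like this can show.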

      • “saturated”? Sez who? The “sinks” are hardly even agreed by type, much less quantified.

        If, e.g., much of it is biological, as diurnal and seasonal swings suggest, the %uptake might even increase as vegetation proliferated.
        Even physical mechanisms like uptake by agitated cold water on the fringes of the Arctic (which I’ve seen documented as a peak area for the globe’s CO2 seawater dissolution) might well be linear.

        In any case, 1,000 – 2,000 ppm is the ideal range for a true greenhouse Earth. Bring it on!

      • Pekka Pirilä

        You wrote:

        The estimate that you present on the total amount of fossil fuels is controversial.

        This may be so, in your mind. But much less controversial than the extremely high number you cite from the 15-year old Rogner study.

        Since then there have been many estimates, including the rather pessimistic Rutledge estimate.

        But here are some realistic numbers.

        To date (2005) we have consumed around:

        142.5 Gt oil
        294 Gt coal
        106 trillion cubic meters natural gas
        http://www.eoearth.org/article/Energy_from_Fossil_Fuels_(historical)

        Oil: 85% C, 25% non-combustion use: 142.5 * 0.85 * 0.75 * 44 / 12 = 333 GtCO2

        Coal: 91% C: 294 * 0.91 * 44 / 12 = 981 GtCO2

        Gas: 2.0 GtCO2 per trillion cubic meters, 20% non-combustion: 106 * 2 * 0.8 = 171 GtCO2

        Sub-total, fossil fuels: 1,485 GtCO2

        Estimate for deforestation + cement production = 250 GtCO2

        Total: 1,735 GtCO2

        [Rogner estimates only 230 GtC or 843 GtCO2 to date (excl. cement + deforestation)]

        280 ppmv = “pre-industrial” CO2 level (IPCC)
        379 ppmv = 2005 level (Mauna Loa)
        99 ppmv added

        Mass of atmosphere = 5,140,000 Gt
        Theoretical addition (1750-2005):
        1,735 * 1,000,000 / 5,140,000 = 338 ppm(mass) = 222 ppmv

        99 / 222 = 45% of human emissions “stayed” in atmosphere (remainder absorbed by biosphere, ocean, rocks, soil or dissipated)
        [Note: long-term “half-life” of CO2 in system has been estimated to be 100-120 years]

        How much CO2 is left in all of our planet’s fossil fuels?

        Let’s take optimistic estimates of future reserves:

        Oil:
        1,317 billion bbl “proven” 2007 (O+GJ)
        2,800 billion bbl worldwide shale (Wiki)
        500 billion bbl new discoveries (Arctic, ANWR, OCS, misc. offshore, new tar sands, etc.)
        4,617 billion bbl equals
        604 Gt

        At projected future consumption rate of 100 million bbl/day = 126 years reserves

        Natural gas:
        176 trillion cubic meters “proven” (O+GJ)
        180 trillion cubic meters world-wide new finds, incl. shale (Wiki)
        200 trillion cubic meters possible future accessible methane from clathrates (guess)
        556 trillion cubic meters

        At projected future consumption rate of 1.5x Wiki estimate or 4.8Tcm/year = 116 years reserves

        Coal:
        847 Gt “proven” (World Energy Council) [Note: Energy Watch Group thinks this is optimistic]
        700 Gt possible new resources (optimistic)
        1,547 Gt

        At projected future consumption rate of 2x current rate or 12.4 Gt/year = 125 years reserves

        Oil: 604 * 0.85 * 0.75 * 44 / 12 = 1,412 GtCO2
        Gas: 556 * 0.8 * 2 = 890 GtCO2
        Coal: 1,547 * .91 * 44 / 12 = 5,162 GtCO2
        Sub-total from fossil fuels = 7,463 GtCO2
        Deforestation + cement @ 10% = 746 GtCO2
        Total human CO2 = 8,209 GtCO2

        8,209 * 1,000,000 / 5,140,000 = 1,597 ppm(mass) = 1,051 ppmv
        Assume 50% “stays” in atmosphere = 525 ppmv

        Today’s concentration (2011) = 390 ppmv

        Total maximum ever CO2 concentration = 915 ppmv [That’s it, Pekka. There is no more out there.]

        C1 = 390 ppmv
        C2 = 915 ppmv
        C2/C1 = 2.347
        ln(C2/C1) = 0.8532
        ln2 = 0.6931
        dT(2xCO2) = 3.2K (IPCC)
        dT (max. ever from human CO2) = 3.2 * 0.8532 / 0.6931 = 3.9K [That’s all, Pekka!]

        Now, of course, if Spencer and Lindzen are correct on climate sensitivity (rather than IPCC), the maximum ever theoretical warming from human CO2 would be below 1K.

        Using the more pessimistic Rutledge estimate, gives a much lower maximum CO2 concentration and maximum warming from AGW.

        The 1996 Rogner study you cited is not based on actual estimates of proven and possible new reserves, but simply on a theoretical calculation of how much fossil fuel should be available at an assumed upper price of $100/boe. Many actual estimates made since then show that Rogner’s figures are far too high. Even the estimates I have listed are considered overly optimistic by most sources.

        So, Pekka, you see that we are unable to reach 1,000 ppmv CO2 in the atmosphere from human CO2 emissions.

        Max
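        The mass-to-volume conversion used in Max's tally (GtCO2 emitted → ppm by mass of the atmosphere → ppmv) can be verified with standard molar masses; a minimal sketch:

```python
# Converts cumulative GtCO2 to ppmv using the molar masses of dry air
# (~28.97 g/mol) and CO2 (44.01 g/mol) and the mass of the atmosphere.
M_AIR, M_CO2 = 28.97, 44.01
ATMOSPHERE_GT = 5.14e6  # mass of the atmosphere in Gt, as in the comment

def gtco2_to_ppmv(gt_co2):
    ppm_mass = gt_co2 * 1e6 / ATMOSPHERE_GT
    return ppm_mass * M_AIR / M_CO2

print(round(gtco2_to_ppmv(1735)))  # ~222 ppmv (historical emissions figure above)
print(round(gtco2_to_ppmv(8209)))  # ~1051 ppmv (estimated future total above)
```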

      • Pekka Pirilä

        Let’s do a quick “reality check” on the Rogner study you cited regarding fossil fuel reserves remaining and used to date.

        My estimates show that we have used 405 GtC equivalent (by 2005) and have 2035 GtC still remaining, so we have used: 405 / (405 + 2035) = 17% of all the fossil fuel reserves that ever existed on our planet (and still have 83% left to go).

        Rogner estimates that we have only consumed 230 GtC to date (1996), but still have 10,000 GtC equivalent left, so we have only used 2% of all the fossil fuel reserves that ever existed on our planet to date (and still have 98% left to go).

        This is obviously an absurd assumption.

        I think you’ll find that even my estimate is highly optimistic, and we have probably used up considerably more than just 17% of all the fossil fuels that ever existed. [Most estimates I have seen put this at 25 to 30%.]

        So forget about your “model” assumption of 1,700 ppmv. It is not possible to ever reach even 1,000 ppmv, as I showed you.

        Max

      • Max,
        The only real source of disagreement in these numbers concerns coal. One might also argue about oil shale, but almost all of that is excluded from the numbers that I used from Rogner.

        I remain unconvinced that lower quality coal resources would not provide a lot of additional coal, if and when it’s decided that they should be used. They are not the first choice, but they are not unusable either. Delays in developing those resources remain one reason why rapid growth in production volumes is impossible, a point I mentioned in my previous comment.

        The estimates for low quality coal and oil shales are very flexible and highly dependent on assumptions concerning future technologies and acceptable maximum costs.

      • Pekka Pirilä

        Yeah. Rogner did not have the shale oil and gas estimates in his total (as these were not known when he wrote his book in 1996).

        But he also did not make a real survey of fossil fuel reserves and potential resources, but simply made a hypothetical calculation of fossil fuels which should theoretically be exploitable at a price equivalent to $100/boe.

        15 years later, we now know that the $100/boe value has been reached, but that there are nowhere near the amount of fossil fuels (including all grades of coal) as he had guessed.

        As I pointed out, his estimate would mean that we have only tapped 2% of the available fossil fuel resources to date with 98% still in the ground, and there are no studies out there drawing this conclusion. My estimate (based on several sources, which I cited) shows that we have consumed 17% of the total to date, with 83% still in the ground.

        I’d say the total reserves figures I cited (including those for coal) are arguably on the high side, unless you can produce figures that show otherwise.

        Therefore, Pekka, it is not logically conceivable that human use of fossil fuels will result in atmospheric CO2 levels of 1,000 ppmv or more. That’s all there is out there, believe me (if you don’t believe me, check the published estimates and draw your own conclusion).

        Max

      • Max,
        Your arguments remain indirect and do not address the point of disagreement: The ultimate resources of coal (and oil shale).

        I picked the number from Rogner, as it happened to be one reference that I had immediately available. I have in the past spent a fair amount of effort finding out what is known about fossil resources.

        For a controversial issue any single reference is just one reference. Based on what I learned looking at a multitude of sources, the estimates of coal resources remain highly uncertain and very dependent on the choices made in deriving them. The ultimate production may remain within the limits given by you, but it may also reach much higher volumes. The environmental considerations (including climate change) lower the likely ultimate production, but nothing is known with high certainty.

      • Pekka Pirilä

        You wrote:

        Your arguments remain indirect and do not address the point of disagreement: The ultimate resources of coal (and oil shale).

        Please re-read my posts, Pekka.

        You will see that they directly address the point of disagreement, namely the remaining total resources of fossil fuels on our planet, as estimated by several independent sources. Oil and gas shale estimates are included, as are optimistically estimated future coal resources.

        I have cited the sources of these specific estimates.

        These estimates show that the combined carbon content of all remaining fossil fuels on our planet is not quite high enough to reach 1,000 ppmv CO2 in the atmosphere.

        You refer to a 1996 book which does not cite specific estimates of specific reserves, but simply makes a hypothetical calculation of how much fossil fuels should theoretically be available if a price of $100/boe were reached.

        This price has been reached, yet the total reserves have not miraculously appeared as a result.

        But we have beaten this dog to death, Pekka, and you have been unable to specifically refute the estimates I cited, so let’s let it lie, unless you can come up with something more specific.

        Max

      • Peter317

        The energy required to warm the entire atmosphere by 1°C, would warm the upper ocean (700m) by around 0.005°C, and the deep ocean by around 0.001°C.

        Max
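        A rough check of these heat-capacity figures, using standard round numbers for the masses and specific heats involved (a back-of-envelope sketch, not a climate calculation):

```python
# Energy to warm the whole atmosphere by 1 C, then the temperature rise
# the same energy would produce in ocean layers of different depths.
CP_AIR = 1004.0         # J/(kg K), specific heat of air at constant pressure
CP_SEAWATER = 3990.0    # J/(kg K)
RHO_SEAWATER = 1025.0   # kg/m^3
M_ATMOSPHERE = 5.14e18  # kg, total mass of the atmosphere
OCEAN_AREA = 3.6e14     # m^2, approximate global ocean area

energy_1K_atmos = M_ATMOSPHERE * CP_AIR  # joules for 1 C of atmospheric warming

def ocean_warming(depth_m, energy_j):
    """Temperature rise (deg C) of an ocean layer of given depth for an energy input."""
    mass = RHO_SEAWATER * OCEAN_AREA * depth_m
    return energy_j / (mass * CP_SEAWATER)

print(round(ocean_warming(700, energy_1K_atmos), 3))   # ~0.005 C for the upper ocean
print(round(ocean_warming(3700, energy_1K_atmos), 3))  # ~0.001 C for the full depth
```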

      • Exactly – which is why I am skeptical of all this talk of ocean warming. It’s undetectable.

      • Precisely.

        So there is no ‘missing heat’ hiding there.
        Even if there’s ‘missing energy’ down there, it’s never going to manifest itself as heat.

      • The statement is true, but is basically only a tautology describing the fact that the heat capacity of the atmosphere is miniscule compared with that of the oceans. It tells us nothing about how much the oceans would actually warm, which would in fact be far more than those amounts for a 1 C atmospheric warming.

      • Are you sure you don’t want to rephrase that?

      • I thought it was clear, but if it wasn’t, let me know, and I will try to state it more clearly. We already have observational data for many past decades of recorded long term ocean and atmospheric warming confirming the principle that heat added to the oceans greatly outweighs heat added to the atmosphere, reflecting the differences in heat capacity. Continued forcing from CO2 or from changes in solar irradiance will yield similar types of results in the future if the forcings are similar in magnitude.

      • Odd as that phrasing was, it may actually be better for these purposes to be thinking about energy than temperature. If you look at it that way, that statement actually makes sense.

        But everybody focuses on temperature as the be-all variable, when it has problems as such. Besides, could you imagine the IPCC justifying policy based on a scenario where the earth’s energy increases by xxx petajoules by 2100? People would be asking “so what?”.

      • For a bit more on the complex and probably multiple mechanisms that promote upper and lower ocean heat to exchange in an upper to lower direction here is a commentary in the April 15 issue of Science by Raffaele Ferrari on ocean fronts:

        ” The ocean surface is filled with a convoluted web of “fronts” that separate waters of different temperatures and salinities (see the figure). Just as thin ducts in the lung called alveoli facilitate the rapid exchange of gases when breathing, fronts are the ducts through which heat, carbon, oxygen, and other climatically important gases enter into the deep ocean…

        The convergence of waters at an ocean front can result in water masses sinking rapidly, at rates of 10 to 100 m/day, compared to typical rates of 1 to 10 m/day in the rest of the ocean. These rapid vertical velocities may be key in determining the exchange rate of heat, carbon dioxide, and other gases between the atmosphere and the deep ocean. Fronts, however, are an important element of the ocean-atmosphere coupled climate system, and in efforts to predict the response of climate to anthropogenic activities, such as the burning of fossil fuels.”

        Undoubtedly, there are other mechanisms as well to transport heat from upper to deeper ocean layers. Presumably they include overturning circulations (e.g., the MOC) that depend on surface temperature differences between lower and higher latitudes and would continue to promote upper and lower ocean exchange of heat and other moieties if both low and high latitude surface temperatures increased.

      • Fred Moolten

        It may be true that natural factors may some day in the future cause a detectable warming of the ocean.

        But as I showed Peter317, it will not come from the CO2 produced by human combustion of fossil fuels (a.k.a. AGW).

        Max

  79. CO2 starts in a combustion process – and rapidly loses heat in the atmosphere – warming the atmosphere. The molecules gain and lose heat in the atmosphere. Over the ocean there is a net flux in IR up plus a latent heat movement up through evaporation. As the atmosphere is warmer – there is a reduction in net flux from the surface and the ocean warms. The warming results in an increase in both latent heat and IR up – restoring the temperature equilibrium between the ocean and atmosphere.

    Heating and cooling of the oceans and atmosphere happens only as a result of power fluxes and happens practically instantaneously. The rate at which temperature equilibrium between atmosphere and ocean occurs is shown in the practice of taking surface temperature as ocean surface temperature – seasonal at most.

    There is no energy imbalance – the molecules start off as warm as they are going to get. There is no heating in the pipeline. It is all about radiative flux. Apart from the fact that it is all based on very questionable assumptions about what real world radiative balances are – how does such nonsense get legs?

    There is so little analysis of this – a few papers, a couple of questionable models, little in terms of data. Just the usual.

  80. It’s a travesty that the pipeline has neither beginning nor end. The missing heat will be found in Oakland.
    ===========

  81. ” Only if the upper layers are colder than the lower layers will convection take place.”

    Convection involves both downward and upward movement of bulk water. It is currently taking place on a large scale despite the fact that the upper layers are warmer than the deeper layers. This involves absorption of solar radiation and back radiation from the atmosphere (a warming influence), combined with cooling at high latitudes that causes the cold water to sink – e.g., as part of the meridional overturning circulation, with concomitant rising of deep water at lower latitudes. Any additional heat in the deeper ocean will increase the net upward transfer over what it was previously.

    • That’s hardly the same thing, is it?

      • If Fred is referring to the sinking of the deep water as convection he should have made that clear. Most of us do not use the term that way, although one might I suppose. Especially as it has as much to do with salinity changes as with temperature.

      • Convection is invariably two-way. The law of conservation of mass requires that upward movement in one place be matched by downward movement elsewhere.

      • That assumes that upward movement takes place in the first place – it in no way causes the upward movement.

      • Peter 3:17

        Also applies to magnets and this ‘North pole needs a South pole’ nonsense; ;)

        Why should a sheet of paper need a back, just because it has a front?

        If a top spins clockwise seen from above, why need it spin widdershins seen from below?

        Nice universe you and David Wojick live in Peter.

        Thanks for sharing with us from this universe an insight into the special rules of physics that apply across the dimensional rift.

      • What on earth are you on about????
        Did you actually read my posting????

      • Actually, ocean convection occurs upside down relative to atmospheric convection; this is the common framework for ocean convection. Ocean convection is associated with density changes, with both temperature and salinity playing a role.

      • I understand that. Instead of warmer air rising we get colder and/or denser water sinking. But, because this occurs in colder waters, it’s not a mechanism for transporting heat to the depths. Heat can only really get down to the depths by conduction from the warmer upper layers, but that’s a very slow process.

        Of course the sinking water must result in overturning, but that’s a largely mechanical process and so does not necessarily result in heat being transported to the surface.

      • Convective changes are initiated in both directions – for example, absorbed sunlight at meters of depth causes warming that induces upward transport, whereas cold water sinking at high latitudes drives convection from above downwards.

        In terms of net transport, heat absorbed by the upper layer of the ocean is transported downward over hundreds of years (see the Isaac Held link to the Recalcitrant Component) in a process that involves long term heat storage. The mechanisms are probably multiple and not fully understood, although their end result can be calculated. In any event, heat stored in the deep ocean will eventually find its way upward by convection if the heating of the deep ocean ceases and the upper ocean becomes a net receptacle for deep ocean heat rather than a source. This does not require the surface to be cooler than the deep ocean, but only for the previous steady state to be perturbed by either some extra heat gained at depth or lost at higher layers.

      • I should add that of the multiple mechanisms transferring heat from upper to lower layers, bidirectional convective mixing as well as mixing from the forces imposed by colliding ocean fronts are probably involved, but a contribution from conduction is almost certainly negligible considering the thermal conductance of sea water and the depths involved.

    • Where did this quote come from? Buoyancy is because the warmer water is less dense than the surrounding water. Heat will be carried towards the surface. The thermocline is thermal in origin of course – so there is no barrier to warm water rising to meet the warmer surface layer.

      Is there a real necessity for equal upwelling and downwelling if that is being suggested? Cold water sinks in high latitudes as it loses heat and H2O vapour. It rises in a couple of places on earth – notably on the west coast of the Americas – for an entirely different set of reasons. I suggest this is a misunderstanding of the processes.

      The ocean cools from the top <0.5mm as a result of a net upward IR emission plus heat loss as latent heat in evaporation.

  82. Pooh, Dixie

    I found the following interesting as observation and comment; y’all might also. They might be a clue as to where to look for mechanism.

    • Pooh, Dixie

      Pielke, Sr., Roger A. 2011. Guest Post “Atlantic Multidecadal Oscillation And Northern Hemisphere’s Climate Variability” By Marcia Glaze Wyatt, Sergey Kravtsov, And Anastasios A. Tsonis. Scientific. Climate Science: Roger Pielke Sr. April 21. http://pielkeclimatesci.wordpress.com/2011/04/21/guest-post-atlantic-multidecadal-oscillation-and-northern-hemisphere%E2%80%99s-climate-variability-by-marcia-glaze-wyatt-sergey-kravtsov-and-anastasios-a-tsonis/

      The “Stadium Wave”: Sum of years: 64

      -AMO → (7 years) → +AT → (2 years) → +NAO → (5 years) → +NINO3.4 → (3 years) → +NPO/PDO → (3 years) → +ALPI → (8 years) → +NHT → (4 years) → +AMO → (7 years) → -AT → (2 years) → -NAO → (5 years) → -NINO3.4 → (3 years) → -NPO/-PDO → (3 years) → -ALPI → (8 years) → -NHT → (4 years) → -AMO

      “Index Profile of the Stadium Wave:

      * Atlantic Multidecadal Oscillation (-AMO) – a monopolar pattern of sea-surface-temperature (SST) anomalies in the North Atlantic Ocean.
      * Atmospheric-Mass Transfer anomalies (AT) – characterizing direction of dominant wind patterns over the Eurasian continent.
      * North Atlantic Oscillation (NAO) – reflecting atmospheric-mass distribution between subpolar and subtropical latitudes over the North Atlantic basin.
      * NINO3.4 – a proxy for El Nino behavior in the tropical Pacific Ocean.
      * North Pacific Oscillation (NPO) – the Pacific analogue for the Atlantic’s NAO.
      * Pacific Decadal Oscillation (PDO) – an SST pattern in the North Pacific Ocean.
      * Aleutian Low Pressure Index (ALPI) – a measure of intensity of the Aleutian Low over the Pacific Ocean mid-latitudes.
      * Northern Hemisphere Temperature (NHT) – anomalies of temperature across the Northern Hemisphere.”

    • Pooh, Dixie

      Wyatt, Marcia Glaze, Sergey Kravtsov, and Anastasios A. Tsonis. 2011. “Atlantic Multidecadal Oscillation and Northern Hemisphere’s climate variability.” Climate Dynamics (April). doi:10.1007/s00382-011-1071-8. http://www.springerlink.com/content/p1275t4383874p65/

      Non-uniformity in the global warming trend is usually attributed to corresponding non-uniformities in the external forcing. An alternative hypothesis involves multi-decadal climate oscillations affecting the rate of global temperature change.