by Judith Curry
This article aims to portray and communicate the important role played by natural variability in our evolving climate. Understanding and acknowledging these variations is important for society and policymakers. Much of this variability is chaotic and unpredictable but some significant fraction is potentially predictable, providing an opportunity to narrow the uncertainty in climate predictions of the coming decade.
The above quote is from the following article:
Our evolving climate: communicating the effects of climate variability, by Ed Hawkins, published in the magazine Weather. Link to the complete article [here] and [here]. These two versions are slightly different; my quotes are from the actual article published in Weather.
And Ed has a blog: Ed’s Open Lab-Book. I very much like his blog and have added it to my blog roll. Check it out.
IMO this article is climate science communication at its best. It provides the appropriate perspective in terms of complexity and uncertainty, while at the same time relating to his audience, clearly explaining the issues, and creating useful analogies. The entire article is well worth reading. Some excerpts from the article:
However, when presenting this range of responses, the IPCC tends to show the average and spread of the GCM projections, indicating a relatively smooth increase in temperatures over the coming century (Fig. 1b, blue shading). Although this representation provides a range for the likely increase in future temperatures, it tends to disguise the natural variability of climate. One particular projection illustrating the impact of internal climate variability on EU temperatures is shown in Fig. 1c. To highlight the importance of the natural fluctuations in climate, a decade which shows a sharp decline in temperatures is then chosen from this particular projection (Fig. 1d), demonstrating how a climate trend may be misrepresented when considering a relatively short time period. Although this particular decade is chosen specifically, it is not unusual – there are also several periods of rapid warming and rapid cooling in the observational record.
So, what is the chance of one year being cooler than the last? For global mean temperatures in the future, there is roughly a 40% chance that one year will be cooler than the last (Fig. 2a). Equivalently, this could be expressed as '2 in every 5' years. For smaller regions, such as Europe or the UK, this chance is higher (around 47%). Although this may seem quite counter-intuitive, there is not much difference between the chances of heads or tails when tossing a coin, and whether the temperature in Europe one year will be warmer or cooler than the last.
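The coin-toss comparison can be checked with a toy Monte Carlo. The trend and noise values below are illustrative assumptions, not Hawkins's actual GCM numbers: a steady warming trend plus Gaussian interannual noise, small for the global mean and larger for a region like Europe or the UK.

```python
import random

def frac_cooler_years(trend=0.02, sigma=0.1, n_years=100_000, seed=0):
    """Fraction of years cooler than the previous year, for a linear
    warming trend (degC/yr) plus independent Gaussian interannual noise."""
    rng = random.Random(seed)
    temps = [trend * t + rng.gauss(0.0, sigma) for t in range(n_years)]
    cooler = sum(1 for a, b in zip(temps, temps[1:]) if b < a)
    return cooler / (n_years - 1)

# Larger regional variability pushes the chance toward the 50% coin toss.
p_global = frac_cooler_years(sigma=0.1)   # global-mean-like noise
p_region = frac_cooler_years(sigma=0.5)   # Europe/UK-like noise (assumed)
```

The exact percentages depend entirely on the assumed trend-to-noise ratio; the point is only that the regional probability sits much closer to a coin toss than the global one.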
However, for longer timescales the odds change because of the gradual upwards trend in temperatures. Decades which exhibit a cooling (or a negative temperature trend) are only expected occasionally in the future for the global mean (about 5% of decades), but these chances increase to 24% of future decades for Europe and 36% of future decades for the UK (Fig. 2b). Expressed as odds, there is roughly a 1-in-3 chance of a particular future decade exhibiting a cooling trend for the UK. This is a key point which is essential for society and policymakers to appreciate – temperatures are expected to (temporarily) go down as well as up, even in a warming climate.
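The decadal version of the same arithmetic: count how many 10-year windows of a trend-plus-noise series have a negative least-squares slope. Again, the trend and noise values are illustrative assumptions, chosen only to reproduce the flavour of Hawkins's numbers (cooling decades rare for the global mean, roughly 1-in-3 for a noisy regional series).

```python
import random

def frac_cooling_decades(trend=0.02, sigma=0.5, n_years=50_000, seed=1):
    """Fraction of 10-year windows whose least-squares slope is negative,
    for a linear warming trend plus Gaussian interannual noise."""
    rng = random.Random(seed)
    temps = [trend * t + rng.gauss(0.0, sigma) for t in range(n_years)]
    xs = list(range(10))
    x_mean = sum(xs) / 10
    denom = sum((x - x_mean) ** 2 for x in xs)   # = 82.5
    cooling = 0
    windows = n_years - 9
    for start in range(windows):
        w = temps[start:start + 10]
        w_mean = sum(w) / 10
        slope = sum((x - x_mean) * (y - w_mean)
                    for x, y in zip(xs, w)) / denom
        if slope < 0:
            cooling += 1
    return cooling / windows

p_global_dec = frac_cooling_decades(sigma=0.1)   # global-mean-like: rare
p_uk_dec = frac_cooling_decades(sigma=0.5)       # UK-like: roughly 1 in 3
```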
Another interesting question to ask is: how long might we have to wait before a warmer year occurs? Although we may be surprised that 1998 is still the warmest year recorded, the GCMs suggest that, for global mean temperature, it is possible that we could wait 17 years; and so far we have been waiting 12 years. For smaller regions, climate fluctuations are larger, and for UK temperatures we could wait nearly 50 years, although usually it would be under 5 years.
Although there are caveats to this simple analysis, it demonstrates how climate variability is likely to affect an individual’s interpretation of whether there are long term changes in climate or not. To determine whether the climate is truly changing, it is essential to consider long (multi-decadal) timescales and large spatial scales.
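The "how long until a warmer year" question can be explored with the same kind of toy model: track record-warm years and the gaps between them. All parameter values below are illustrative assumptions, not the paper's.

```python
import random

def record_gaps(trend=0.02, sigma=0.5, n_years=2000, seed=2):
    """Gaps (in years) between successive record-warm years for a
    linear warming trend plus Gaussian interannual noise."""
    rng = random.Random(seed)
    gaps, best, last_record = [], float("-inf"), 0
    for t in range(n_years):
        temp = trend * t + rng.gauss(0.0, sigma)
        if temp > best:                  # a new record-warm year
            best = temp
            if t > 0:
                gaps.append(t - last_record)
            last_record = t
    return gaps

gaps = record_gaps()                     # UK-like noise: occasional long waits
longest_wait = max(gaps)                 # a freak warm year can stand for decades
typical_wait = sorted(gaps)[len(gaps) // 2]   # but most records fall quickly
```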
I really like this one:
DEMONSTRATING CLIMATE VARIABILITY AT HOME
Take a shuffled pack of playing cards, with red cards representing ‘warm’ years and black cards ‘cool’ years. When dealing the pack there will be times when several warm or cool years appear together. Next, remove some black cards from the pack, and reshuffle. This pack now represents a changed ‘climate’ with less cool years. When dealing the pack for a second time, there will be more periods of warm years, but probably periods of cool years as well. Even though the climate has warmed, every year need not be hot!
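The card demonstration is easy to simulate. The deck sizes below (a standard 52-card deck, then the same deck with six 'cool' cards removed) follow Hawkins's setup in spirit; the exact number of cards removed is my assumption.

```python
import random

def deal_years(n_warm, n_cool, seed=None):
    """Shuffle a deck of 'warm' (red) and 'cool' (black) year-cards
    and deal the whole pack in order."""
    deck = ["warm"] * n_warm + ["cool"] * n_cool
    random.Random(seed).shuffle(deck)
    return deck

def longest_run(years, kind):
    """Length of the longest consecutive run of a given kind of year."""
    best = run = 0
    for y in years:
        run = run + 1 if y == kind else 0
        best = max(best, run)
    return best

old_climate = deal_years(26, 26, seed=42)   # full deck: even odds
new_climate = deal_years(26, 20, seed=42)   # 6 'cool' cards removed

# Even in the warmed deck, runs of cool years still turn up.
cool_run = longest_run(new_climate, "cool")
```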
JC comment: IMO this is much better than the “loaded dice” analogy that is often used.
Variability as an analogue for the future. As described above, our climate is changing relatively slowly compared to human memory. As significant climate changes tend to only appear after many years, it can be hard to appreciate how the climate has already changed, and hard to imagine the impacts that are projected to occur; the climate in the 2050s may seem quite a remote prospect. However, natural climate fluctuations can help us appreciate what living in a changed climate would be like by acting as an analogue for what the future holds.
Variability as uncertainty. Although climate scientists are confident in the direction of any long term change in temperatures, there is a range of estimates for the magnitude of the change, i.e. some uncertainty. However, there are several different sources of uncertainty in our projections of climate. As indicated previously, the climate we will experience in the coming decades is significantly influenced by natural fluctuations, giving rise to some uncertainty over the trajectory the climate will follow (the internal variability component of uncertainty). There is also uncertainty in future climate due to different GCMs having different responses to greenhouse gases etc. (termed model response uncertainty, for example, as shown by the spread in projections in Fig. 1a), and also uncertainty in the rate of future greenhouse gas emissions (termed scenario uncertainty). The potential to narrow uncertainty in climate projections depends on which source of uncertainty is dominant. For example, climate science can tackle the model response uncertainty, but can do little to reduce the uncertainty in future emissions because this depends on economic development and human behaviour.
Predicting a decade ahead. The inset in Fig. 5 shows individual forecasts started in 2005. Although the subsequent observations are generally within the range of predicted temperatures, they are at the lower end. This could have happened for at least three possible reasons: (i) by chance, or (ii) because the model is not able to fully predict the natural variability (such as the 2008 La Niña), or (iii) because there are other external forcings which are not yet fully accounted for in the model.
JC comment: there is also (iv) model inadequacies associated with treatment of physical processes, e.g. aerosol effects, solar indirect effects, formation of clouds, etc.
Numerical weather prediction has benefited from continually assessing the ability of the computer models to make forecasts. Further testing of our climate models in a similar way is vital to increase confidence in their use for longer term projections and potentially identify parts of the model which require improvements.
JC comment: This is a very important step in validating and improving climate models. In addition, arguably the most important potential applications of climate models are on regional spatial scales and decadal time scales. Focusing on decadal time scales with a good validation strategy should be top priority in climate modeling, IMO.
JC conclusions: IMO this article does an excellent job in explaining the challenges associated with understanding AGW against a background of interannual and multi-decadal climate variability and also regional variability.
The need for better understanding of climate models and how their simulations should be interpreted is illustrated by this article at the Yale Forum entitled “The Case of Missing Coverage of Models” (h/t Bill Collinge, I missed this when it first came out.)
I read “However, for longer timescales the odds change because of the gradual upwards trend in temperatures.”
With a sentence like that, IMHO, the rest of the paper is not worthwhile even glancing at.
Are you a person who wants to learn, or a person who wants to avoid learning because it might require changing your view on the matter?
Are you a skeptic or a true believer?
Pekka, what is there to learn here? The author assumes an upward trend, then explains why it might not show up clearly on shorter time scales. The issue is whether or not there is to be an upward trend in the future, so he has assumed an answer to the question at issue. This renders his analysis worthless, does it not? Even worse, his analysis suggests that the question cannot be answered scientifically in less than several decades. So he gets his assumption for free, as it were.
My post was a reaction to Jim’s comment, not to the paper.
There are all too many contributors here with the attitude expressed in that comment. All too many are very certain of their views, even when most others see clearly that those views are totally wrong.
Yes Pekka many of us are certain that others, such as the paper’s author, do not admit to uncertainty in their stated assumptions and many of those same others, such as yourself, see clearly that our views are totally wrong.
Actually, I should not have commented on what is obviously wrong, because the real issue is not that, but the willingness to learn rather than to search only for support for one's own prejudices.
Sorry, Pekka, but we now have documented evidence of manipulation of scientific data and the news media since about 1972, when Mr. Henry Kissinger took President Richard Nixon to China to find a solution to the threat of mutual nuclear annihilation if the nuclear arms race continued.
I personally know about the decision to terminate NASA grant, NASA NGR 26-003-057, after it produced experimental evidence that Earth’s heat source might not be a stable H-fusion reactor [1,2].
See the 1974 and 1975 news stories about the danger of another ice age in Time Magazine and Newsweek.
1. Nature 240, 99-101 (1972):
Embarrassing data: http://www.omatumr.com/Data/1972Data.htm
2. Proceedings of the Third Lunar and Planetary Science Conference, vol. 2, 1927-1945 (1972);
Embarrassing data: http://www.omatumr.com/Data/1972Data1.htm
3. Time Magazine US (June 24, 1974):
4. Newsweek (April 28, 1975):
With kind regards,
Oliver K. Manuel
Former NASA Principal
Investigator for Apollo
Pekka: I was reinforcing Jim’s comment, with which I agree. This paper is what I call “AGW science,” namely science that assumes AGW and continues from there. In the context of the debate it is worthless, because it assumes a position under debate.
The content of the paper is equally, or perhaps more, relevant to those who want to find evidence against mainstream views. Of course it's not of interest to those who don't care about evidence, because they know that the mainstream views are wrong even without any evidence.
“The content of the paper is equally, or perhaps more, relevant to those who want to find evidence against mainstream views.”
Unfortunately that is true. It should be equally relevant to those who believe unforced variations are trivial. The truth is in the middle somewhere.
Pekka: Assumptions are not evidence.
Your assumptions are not evidence either.
Neither are your other starting points for the use of evidence anything more than your personal opinions.
I disagree. He assumes a warming trend because, until recently (we need more data), there was one.
I don't see this as taking a side in the debate, merely making an observation. You'll note (unless I missed something) that he doesn't attribute the warming trend in this piece; he only mentions it.
Granted, the context surrounding his discussions of GCMs could be interpreted that way, but I think he's worth the benefit of the doubt in this read.
Also, model validation +1
Pekka. I have read just about everything that has been written on the observed data that, supposedly, supports the idea that CAGW exists. I have found absolutely no observed data whatsoever that supports the contention that in the future global temperatures will rise at some sort of steady rate. The only evidence that I have seen for CAGW is the output of non-validated models. Show me some observed data that supports CAGW, and I will seriously consider that it could be a possibility.
I say once more: My comment was a reaction to your comment. It was directed to you, because you had chosen to write that comment.
It applies to you insofar as the comment gives the right view of your approach.
I don’t see his analysis as worthless at all. As you said yourself, he explains that a long term warming trend might not show up in shorter time scales due to natural variability – from the evidence of some of the comments sometimes made here, that is clearly something which some people don’t grasp. Furthermore, there is an accusation often made that “warmists” dismiss the notion of natural variability; this article shows that this is not the case.
Okay, it is not worthless, only the assumption is worthless. That is, if we are entering a cooling trend we may not know for decades. What the heck, let’s make it centuries. Maybe we are in a long term cooling trend. It all depends on the nature of natural variability, which we do not know.
As can be seen in the linked gisstemp graph, we have had what looked like the starts of many cooling trends during the past century, but all were short lived. I wouldn’t pin my hopes on a lengthy one emerging anytime soon.
M. carey, that depends on your definition of soon. We are currently in an interglacial and the pattern of ice ages and interglacials seems to point to the odds that we will fairly “soon” be moving into an ice age. Odds are 1,000 years or so from now people will wish that CO2 actually did control climate and had prevented the ice age they are having to deal with.
I wouldn’t bet on that. The negative forcing as a result of Milankovitch cycles which leads to ice ages is pretty weak compared to the positive forcing from increased CO2 in the atmosphere. As long as CO2 levels are increasing then the next ice age is likely to be postponed.
Your prediction of a coming ice age is vague. What’s your forecast of average global temperature to 2100 or beyond and what’s the basis for your forecast?
The combination of changes in orbital ellipticity, obliquity, and precession needed for insolation at 65 N to decline sufficiently to trigger a new glaciation are estimated to require at least another 30,000 years. A new ice age sooner than that would require the operation of mechanisms not known to have been associated with previous ice ages.
Hansen has hinted that the ice ages of the past may be a thing of the past. Human beings know how to purposefully jam them.
One of the interesting issues of glaciation is that you need continuing snow to build the glaciers. Your conventional view allows for less heating of the ocean, which also decreases evaporation, limiting snow and precipitation. You need a mechanism to retain evaporation to continue the snow as the cooling continues.
Do you know of a solution to this conundrum?
Kuhnkat – I don’t have a precise answer to your question. However, we know from both paleoclimatologic records and basic principles that temperature changes in the tropical oceans (a major source of atmospheric water) are smaller than those at high latitudes during major temperature shifts. This, combined with the ability of a colder atmosphere to shift the balance of precipitation from rain to snow, should maintain snowfall at some level. If we add the reduced melting, plus the freezing of high latitude sea ice, the conditions for a glacial climate would probably be fulfilled.
It may be that we’ve been seeing some of this in reverse in Antarctica during recent decades, where the continent has been experiencing a loss of land ice from warming despite an increase in snowfall from the enhanced precipitation.
Do you know of any evidence which might lead us to conclude that we are entering a cooling trend?
Regarding the assumptions made by the authors, your reaction is based on your own assumptions which are not necessarily shared by the rest of us – neither of which means that their analysis is not valid or that you can’t make objections to it. I think it makes a useful contribution to the debate regardless of one’s position on the likelihood of continued warming. And any kind of research which investigates the consequences of a rise in global temperatures essentially makes the same assumption, does this mean that no such research should be carried out? Surely there are all kinds of areas, not just climate science, where the future is uncertain but we still feel it prudent to examine the likely consequences of possible events which are uncertain, or even unlikely.
In the late 1980s the subpolar gyre slowed down (= warming)
In 2007/8 speeding up started (= cooling)
The subpolar gyre is the vehicle for the ocean heat transport in the N. Atlantic, and in doing so drives the NAO, which has just entered a new negative phase, which means considerable cooling for western Europe and the US Atlantic seaboard. The gyre has a 20+ year cycle, and certain conditions have to be initiated to change from one mode to the other (fast – slow). Some relevant details are here.
I don't see how it renders his analysis useless. The assumption does not come for free. Absent any cogent theory that argues that increasing GHGs will lead to a cooler future, it's rational to assume that radiative physics will hold true tomorrow as it did yesterday.
It is not a question of increasing GHGs leading to a cooler future, it is a question of nature leading to a cooler future. The assumption that the future will be warmer is no more plausible than that it will be cooler.
I call myself a ‘lukewarming cooler’ because I believe in the radiative effect of carbon dioxide, which may warm, but believe we are entering a short, medium, and long term cooling trend from the sun. Some find the usage nonsensical, but the ‘lukewarming’ is for the CO2 heating effect, and ‘cooler’ is for climate predictions.
David, It’s a good thing you never said that decades ago. But that was then and this is now. Why go out on a limb? To cover all possibilities you could assume the future will be warmer, cooler, or stay the same. Not that that’s very useful, but nobody can say you are wrong.
The point is that climate research is dominated by the invalid assumption of significant future warming. The result is an endless stream of scary speculation presented as scientific fact. These studies dominate the climate change news. Money for nothing.
David, since average global temperature has been rising for a long time, it would not be invalid to assume it will continue to rise for a long time, even if man as a driver of warming didn’t exist.
Whatever cycles between warm periods and little cold ones on the centennial scale probably rules the recovery from the Little Ice Age; it's probably the sun, and it may have changed sign in the last decade.
Disregarding our philosophical discourse elsewhere, you nailed this one to the wall. All the nice words in the world, even if they possess a certain amount of internal consistency, don’t disguise how the convinced patronise the uncertain.
I haven’t read the paper – it’s printing now – but a priori that underlying assumption, whether or not it is correct, does not necessarily invalidate the author’s comments on uncertainty. When I read it, I’ll see whether making a different assumption would in any way change my assessment of the paper. I suspect not.
“So, what is the chance of one year being cooler than the last? ”
Did you know the coldest winter in US history was 1979 with an average temperature of 27.29F?
And the warmest was 2000 with an average temperature of 37.17F.
That sure explains why the satellite records look like it warmed dramatically from when they started – 1979.
Bruce, the record warmest and record coldest winters in the U.S. have little effect on average global temperature.
So how do we figure out how many cards were pulled from the deck?
Look at the dealer.
That sounds exactly like a problem for the statisticians. To examine the various runs in either direction, and from that deduce whether there is in fact an underlying trend (a random walk with bias). Not my strong suit, but I assume that that is right up somebody’s alley. I guess the problem for me is the question of what constitutes a long-enough data series to separate out cyclical variations of various frequencies, a la Steve McIntyre or Richard Muller. The same problem occurs in determining whether a data series is “truly random.”
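For what it's worth, one standard statistical tool for exactly this question is the Wald–Wolfowitz runs test: count the runs of values above/below the median and compare with what a random ordering would give. A minimal sketch follows; it is illustrative only, and ignores the autocorrelation that makes real climate series harder to test:

```python
def runs_test_z(series):
    """Wald-Wolfowitz runs test: z-score of the number of runs of
    values above/below the median.  A strongly negative z (too few
    runs) suggests trend or persistence; near zero is consistent
    with a random ordering."""
    med = sorted(series)[len(series) // 2]
    signs = [v > med for v in series if v != med]   # drop ties with median
    n1 = sum(signs)             # count above the median
    n2 = len(signs) - n1        # count below the median
    runs = 1 + sum(1 for a, b in zip(signs, signs[1:]) if a != b)
    n = n1 + n2
    mu = 2 * n1 * n2 / n + 1                                   # expected runs
    var = 2 * n1 * n2 * (2 * n1 * n2 - n) / (n ** 2 * (n - 1)) # its variance
    return (runs - mu) / var ** 0.5

# A steadily trending series has far fewer runs than chance would give.
trending = runs_test_z([0.01 * t for t in range(100)])
```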
You may find this interesting.
Taking 1960 as a starting point is an obvious ‘cherry-picked’ date, which casts doubt on the credibility of the argument.
If Mr. Hawkins wants to be convincing, then he should consider a much longer record. A quick look at any of these links might be helpful:
His conclusions match fairly closely those published by NOAA in BAMS 2008. Instead of warmest year, the BAMS paper selected 15 years without a warming trend as their falsification point, so now we have two: the warmest year doesn't change for 17 years, or no warming trend for 15 years. It's good to see the cards being put on the table, even if they think some are missing.
I remember that Gavin once said, in a response to a comment on RC, that if by 2013 the 1998 record anomaly has not been beaten in ALL the temperature indexes, he will accept that there is a problem with the theory and reconsider his views (something we should remember, and not forget to remind him of in 2013 if the record is not beaten in all indexes). So far it's 2011, and HadCrut, UAH and RSS still show 1998 as the warmest. So for Gavin there are still two more years to go, but compared to his statement this article has moved the goalposts two years?
They’ve been moving the goalposts since 2001.
Gotta keep that carrot out there for all those infidels and heretics.
That sounds like an interesting exchange. Have a link, or know what post it was under?
My view on the situation is fairly simple. We have a negative PDO, a weak solar cycle, and an AMO that will start to head south soon. If the skeptics are correct there should be a lack of warming that will force all but the most unreasonable of scientists to admit their forecasts were wrong. If it warms within the bounds of the models despite these factors it will cause all but the most unreasonable of skeptics to admit they were in error. I doubt seriously the goalposts can be moved by either side to any large extent.
I’m in the same boat with you, Steven. I’m also waiting for the “facts on the ground” to settle the debate.
Look at comment # 57 over here:
It seems my not-so-good memory made me overstate a bit what he said, but in principle it’s the same thing. He answered “yes” to this question: If 1998 is not exceeded in all global temperature indices by 2013, you’ll be worried about state of understanding?
To be clear – I’m not fixed on one outcome, I mean “settle the debate” one way or another…
Though I would really like to see the outcome be the falsification of the theory, for two reasons: 1) it would clearly be good not to live in anticipation of a catastrophe, and 2) if the facts show the warming continuing, there will be a continuation of another debate – the seriousness of the problem and the usefulness and cost of the remedies and so on.
I believe it might be only a few years from now when we'll know more, but the likely outcome will unfortunately most probably not be unequivocal. Just look at 2010. It's so much in the "gray area", with GISS and NCDC having it the warmest while the others don't. So it might be moving in this border area for quite some time, and both sides might get some ammo and keep on shooting.
Sorry, I made a mistake. GISS tied 2010 with 2005, and NCDC, if we look at 4 decimal points, as the data is presented, still has 2005 as the warmest. The rest of the indices still have 1998 as the warmest.
He actually did a nice post on model falsification in 2008 and his isn’t far off from what others have said. I would even venture so far as to say there seems to be a consensus.
The Gisstemp 1998 global temperature anomaly record of 0.56 has been topped four times: 0.63 in 2005, 0.58 in 2007, 0.57 in 2009, and 0.63 in 2010.
Also note Gavin’s answer to Daniel Klein’s third statement
(3) Any ten-year period or more with no increasing trend in global average temperature is reason for worry about state of understandings
3) No. There is no iron rule of climate that says that any ten year period must have a positive trend.
What exactly are you intending to say by that and what does the Gisstemp record alone mean in this context?
a) Gavin’s reply was in December 2007, when the GISS index had already beaten the 1998 record. The first question, to which Gavin replied “yes”, was “If 1998 is not exceeded in ALL global temperature indices by 2013, you’ll be worried about state of understanding?” Have you read what I wrote, and have you noticed the word ALL?
b) What does his reply to the third question have to do with what we are discussing here? Nowhere have I claimed that a ten year lack of positive trend should falsify the theory. So this is clearly a strawman.
How do you figure factual information related to the subject is a strawman?
So, you don’t even know what a straw man is…
A straw man is a component of an argument and is an informal fallacy based on misrepresentation of an opponent’s position. To “attack a straw man” is to create the illusion of having refuted a proposition by replacing it with a superficially similar yet unequivalent proposition (the “straw man”), and refuting it, without ever having actually refuted the original position.
My post was not intended to attack your position. It was additional related information. I’m sorry you misinterpreted it as a straw man.
Sorry, M. carey, I guess I misinterpreted why and what you said. My bad.
I believe the exchange was with Richard Wakefield. There was an article that indicated an unambiguous global record (greater than 0.1C above 1998 on all three temperature series) will occur within one of the ~20 years after 1998: 50/50 by ~2014. But don’t hold me to that, as the odds that I don’t understand odds are extremely good.
Actually it’s 18 years and is the link I posted above.
The card trick here is the assumption that the warming trend is AGW.
We’ve had a warming trend since 1690. It’s the recovery from the little ice age. This article claims to discuss natural variability, but then ignores the historical evidence for it.
According to my model projections this warming trend topped out around 2005 and it’s likely to be generally downhill temperature wise for the next few hundred years.
As tallbloke notes, Ed does address some of the variability, but appears to follow the IPCC conclusion of AGW. Ed notes “(iii) because there are other external forcings which are not yet fully accounted for in the model.”
However, it appears he seriously underestimates the magnitude of the uncertainties. e.g., see Fred Singer’s upcoming presentation:
NIPCC vs. IPCC Addressing the Disparity between Climate Models and Observations: Testing the Hypothesis of Anthropogenic Global Warming (AGW)
Large run-to-run variations: From the single published example of global climate model results for each of 5 runs, Singer shows the individual temperature trends vary by almost an order of magnitude.
Insufficient runs: Singer demonstrates that results of only 2-5 runs of global climate models are grossly inadequate to average out chaotic variations.
The climate models fail to include the major variations from the Medieval Warm Period to the Little Ice Age.
Solar-Droughts/Floods: The models do not include the impact of the 21 year Hale cycle on flood/drought as shown by WJR Alexander. see Alexander’s Crossroads, 19 July 2011. In 2008, Alexander predicted:
That drought is now being borne out, as the UN announced famine in Somalia.
Jo Nova reports in: Climate Money: The Climate Industry: $79 billion so far – trillions to come
“The climate industry is costing taxpayers $79 billion and counting”
Climate Science: follow the money
Our hard earned money that we are forced to pay in taxes is being poured into the black hole of “controlling climate change”. At $79 billion, we have already spent 100 times as much on “climate change” as this $800 million shortfall in funds promised to save the starving in Somalia.
With the myopic focus on catastrophic AGW, where is the real world attention to Alexander’s accurate prediction of drought and the urgent humanitarian need to address the massive danger from such natural climate variations?
Another deficiency of the models is that while the results of inter-model comparisons are always presented as ‘anomalies’, the variation between them in terms of the absolute average temperature they obtain for the Earth’s surface is as much as four degrees centigrade.
On costings, the US Government Accountability Office found that from 1993 to 2008, the US spent 120 billion dollars on ‘global warming’:
These AGWers have wasted $123 billion of our tax money, and our children will have to pay the debts. Stop funding them now.
David, Ed invites comments at http://www.met.reading.ac.uk/~ed/blog/our-evolving-climate/ . There aren’t any yet, perhaps you should start the ball rolling.
Thanks Faustino for starting that ball. Ed allowed a link back to my post above.
I realise the paper being discussed talks about the probability of hiatuses in a warming trend…but 200 years? In the CET record the trend from 1700 to 1900 is negative.
One need not attribute the trend to anything to understand the statistics involved.
On your prediction, what are the chances of an “up year”?
Attributing the warming trend to recovery from the Little Ice Age is like attributing it to magic. In either case, you are saying you don’t know what caused the warming.
Nothing wrong with not knowing. The problem is the things we “know” that ain’t so.
However, the warming trend from the LIA is very likely solar. Solar cycles generally got shorter (frequency of solar cycles increased), which is known to be associated with warming.
Shorter cycles tend to be higher amplitude cycles, and as Charvatova showed, tend to be associated with periods when the Sun moves in a harmonious trefoil pattern wrt the centre of mass of the solar system. When the Sun goes through a period of higher than average activity, the ocean accumulates heat-energy and raises the temperature of the atmosphere from below as it emits more energy. This effect lags the more instantaneous effect of the Sun warming the atmosphere from above. This is why air temperatures and sea surface temperatures exhibit a ~105 year cycle wrt each other. Climate science just hasn’t grasped the long timescales over which the ocean oscillates, and the fact that its internal oscillations range in frequency from diurnal to interglacial periodicities and beyond.
Yes, shorter cycles tend to be higher amplitude cycles.
The barycentric explanation is interesting and plausible.
What’s very interesting IMO, is that, despite so many factors possibly influencing global temperature, one variable (SCL) shows such a good correlation with the global temperatures.
Short (strong) cycles are associated with warming and long (weak) with cooling.
Well, it has to be admitted that the SCL-to-temperature correlation wasn’t as good after 1980, and this is one of the reasons for the CO2 scare.
However, what hasn’t been taken into account is the accumulation of solar energy in the ocean, which has introduced a non-linear response, largely manifesting in bigger-than-usual El Nino events such as those of 1987, 1998 and 2010. These release large amounts of solar-derived energy from the ocean into the atmosphere, producing the late-C20th warming we saw.
If you integrate the sunspot number as a departure from the ocean equilibrium value, the almost continuous ocean warming from ~1934 to ~2003 becomes obvious.
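The integration being described can be sketched in a few lines. To be clear, the “ocean equilibrium value” used here (40) is a hypothetical placeholder, not an established figure, and the input series is illustrative rather than real annual sunspot counts:

```python
# Sketch: running integral of (sunspot number - assumed equilibrium value).
# The equilibrium value and the sample series are illustrative assumptions,
# not data from the actual sunspot record.
def integrated_departure(sunspot_numbers, equilibrium=40.0):
    """Cumulative sum of the departure from an assumed equilibrium value."""
    total, out = 0.0, []
    for s in sunspot_numbers:
        total += s - equilibrium
        out.append(total)
    return out

# A series mostly above the assumed equilibrium accumulates steadily upward:
series = [60, 80, 100, 70, 50, 30, 55, 90]
print(integrated_departure(series))
```

The point of the exercise is that a quantity held above its equilibrium for decades integrates to a large accumulated departure even when the instantaneous values look unremarkable.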
The sun is very sultry and we must avoid its ultry-violet rays,
Else oscillate forever in uncountable wondrous ways.
Well, there is something wrong with someone saying he knows the cause of global warming is from something he doesn’t know. That’s ludicrous.
You are going to have trouble attributing the global warming of the past several decades to just changes in the sun.
I have no trouble.
SC 21 and SC 22 – short cycles (~10.5 years) – warming
SC 23 – long cycle (~12 years) – cooling
SC 24 – very long cycle? – more cooling.
SC 20 – long cycle – ~1970s cooling
and so on…
Average SC in the 20th century is shorter than average SC in 19th century – 19th-20th century warming. Add in some thermal inertia (ice, which needs to be melted) and almost all variation is explained by SCL.
I should have said you would have trouble attributing it in a way that makes sense. Starting about 1950, the trend in sunspot activity plateaued and then turned down slightly near the end of the century.
Thermal inertia doesn’t explain the recent upward trend in global temperature.
According to solar physicist Leif Svalgaard, Waldmeier overcounted the sunspots between 1945 and 1985 by around 20%. But more importantly, the average sunspot number was well above the long term average all the way up to 2003. Rather than look at the peak sunspot cycle amplitudes and draw the facile conclusion that solar falls away as temperature rises, you need to examine the steep up- and down-ramps and short minima, and look at the average sunspot number over the period. Once you realise that the ocean accumulates heat over long timescales, and that the sunspot number was well above the ocean equilibrium value for all of the latter half of the C20th, the light may dawn.
Is this that “ocean as a constipated heat-sink” theory?
Stop projecting and start making sense. No wonder consensus ended up with CO2 “knob” if they are such lazy thinkers.
“Starting about 1950, the trend in sunspot activity plateaued and then turned down slightly near the end of the century.”
According to your link, A SMOOTH of the sunspot number plateaued around 1950 and then turned down slightly near the end of the century, NOT the “trend in sunspot activity”. There are 1000s of such “trends in sunspots activity”. That’s wishy-washy.
Again, I claim that short solar cycles (~10.5 years and shorter, f > ~9.5 cycles per century) are associated with warming and longer solar cycles (~11.5 years and longer, f < ~8.6 cycles per century) with cooling. Considering the "multifactoriality" of global temperature, this correlation is remarkable.
In the graph you posted, the smooth plateaued around 1950, but SC20 was long (~1970s cooling). The smooth didn't catch that.
SC21 and SC22 were strong cycles (the warming near the end of the century), but the smooth turned down in the middle of the warming!
SC23, which started in ~1997, was weak (~12 years long) and that stopped the warming. Another weak cycle (or even weaker, like SC24 seems to be) means cooling.
Thermal inertia explains a lot (ocean, ice). A series of short (strong) cycles can warm at very different rates, depending on the climate "history".
On the contrary troll, I know exactly what caused the descent into the little ice age from the medieval warm period, and what caused its recovery from it.
The magicianship is that of the card sharps and data benders who keep their code a secret within the magic circle of the AGW wagons.
You know, but you are keeping it a secret?
M: Regarding your statement that “Attributing the warming trend to recovery from the Little Ice Age is like attributing it to magic. In either case, you are saying you don’t know what caused the warming.”
This is half true in a fundamental way. Emerging from the LIA is a major alternative hypothesis to AGW. The problem is we cannot explain the LIA, so we can neither confirm nor falsify this hypothesis. It is one of several “mystery mechanism” hypotheses, which are based on observation, not physics. So science is stuck, until we understand the LIA. But the climate community refuses to address this fundamental issue, preferring to explore AGW in pointless detail.
The climate community refuses, wow, strong words. My impression is, that my colleagues are rather keen on understanding the LIA, but that’s probably my biased estimate.
I think if you look at the USGCRP spending the LIA research is almost negligible, certainly vis a vis the carbon cycle research budget. I have seen very few papers on LIA but can easily check. Then too, trying to explain LIA from within AGW does not count.
“Refuse” here has the special meaning of Kuhn’s discussion of how anomalies are treated when a paradigm is dominant, as AGW is. Perhaps ignore is a better term.
“Attributing the warming trend to recovery from the Little Ice Age is like attributing it to magic. ”
Here you go M. carey – fill your boots with some peer reviewed science about the recovery from the little ice age.
What about the other side of the argument? Considering only natural variation, what is the probability of having two consecutive warming decades and a third one not warming? It doesn’t look very low at first glance, particularly since we have seen another case in the same century, without CO2 worth mentioning. So, what’s left there for “non natural” variability?
But I’ll have to read the article.
Did you know, according to Dr. Don J. Easterbrook, that 9,099 of the last 10,500 years were warmer than 2010?
The oceans and lakes that comprise the largest part of the Earth’s surface accumulate heat over time. Moreover, the oceans also can transmit heat and cool over time. And, the satellite data has been telling us that the Earth has been cooling for a decade. The Oceans are in a cooling trend. And, in a period when the oceans are cooling, there is no global warming during that period; and, there is no end to the cooling in sight.
We have had three successive warming decades as the 2000’s were clearly warmer than the 90’s. If there was no average warming, the odds would be 7/1 against having three successive warming decades because each decade would have a 50% chance of being warmer.
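The 7/1 odds quoted above follow from simple coin-toss arithmetic, and are easy to verify, under the stated assumption that with no average warming each decade has an independent 50% chance of being warmer than the last:

```python
from itertools import product

# Under the no-trend null hypothesis, each decade-to-decade comparison is
# a fair coin. Probability of three successive "warmer" outcomes:
p_all_warmer = 0.5 ** 3
print(p_all_warmer)  # 0.125, i.e. odds of 7 to 1 against

# Brute-force check over all 2**3 equally likely outcome sequences:
favourable = sum(all(seq) for seq in product([True, False], repeat=3))
print(favourable, "of", 2 ** 3)  # 1 of 8
```

Note the independence assumption is doing real work here: if decadal temperatures are autocorrelated, successive warmer decades are not independent coin tosses and the odds change.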
We have had three successive warming decades as the 2000′s were clearly warmer than the 90′s.
Not true. The 90’s warmed, but the 00’s did not. Ask tempterrain – I demonstrated that to him last night. :-)
If you take the average temperature of 2000-2009 and subtract the average of 1990-1999 you get at least 0.15 degrees, hence the 2000’s were warmer than the 90’s.
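The decade-minus-decade comparison described here is a two-line calculation once you have annual anomalies. The numbers below are placeholder values for illustration only, not actual GISTEMP or HadCRUT data:

```python
def decadal_mean(annual_anomalies, start_year, first_year_of_series):
    """Mean anomaly over the ten years beginning at start_year."""
    i = start_year - first_year_of_series
    decade = annual_anomalies[i:i + 10]
    return sum(decade) / len(decade)

# Placeholder anomalies for 1990-2009 (illustrative values, not real data):
anoms = [0.25, 0.20, 0.10, 0.15, 0.20, 0.30, 0.25, 0.40, 0.55, 0.30,
         0.35, 0.45, 0.55, 0.55, 0.50, 0.60, 0.55, 0.55, 0.45, 0.55]
diff = decadal_mean(anoms, 2000, 1990) - decadal_mean(anoms, 1990, 1990)
print(round(diff, 3))
```

Swapping in a published annual series would settle the “at least 0.15 degrees” figure one way or the other for any chosen dataset.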
The trouble with all these discussions is that the temperature record, as presented to the world, is not reliable. I won’t trust it until it is replicated by researchers who critically analyse every bit of available data, and who make their methodology freely available so that everybody can form an opinion on whether or not they will trust it. Even then, a global average temperature would only be truly reliable if it was created from data collected over a much bigger percentage of the planet than happens now.
the 2000′s were warmer than the 90′s.
Yes. And I knew that would be your answer.
BUT – you claimed a warming decade which is not true. The 00’s were warmer due to the warming of the 90’s – NOT because it warmed during the 00’s.
This is word play, Jim – semantics, if you will. It’s an imprecision that has the appearance of deception, and it is very often used as such. I don’t expect that you intended that, but you should know that this is one of the many small reasons I’ve learned to distrust statements by warmists. There’s always a “twist” to the words. tempterrain tried to run the same scam on me several days ago. I doubt that he understood what he was doing either – but I did. I think to a large degree it’s that you (and tt) are simply repeating what you’ve been taught. At least, I’m willing to give you that much slack. But someone taught both of you that particular mantra – and if it was a scientist, then I would believe it to be deception. Scientists are “supposed to be” careful and precise. And the statement, as used, lacks those qualities.
Yeah – I know – even scientists can get sloppy sometimes. But if they’re sloppy with their words – why would their science be different?
Warming successive decades is used in the same way as warming successive years. It doesn’t mean December is always the warmest.
A warming decade is one in which the temp increases during that decade. The 00’s fail to meet that criterion.
The 00’s ARE the warmest of the recent decades because of the warming in the 90’s, which was, without argument, a warming decade.
If you want to change the definition of warming decade then we’re not speaking the same language. I speak only American English, a little Spanish, a little less German, a little more French and a bare touch of Japanese. What language are you using?
How about successively warmer decades? We have had three successively warmer decades. Agreed?
warmer yes –
We’ll see what the next one is like – in 9 years. :-)
warmer sounds more effective too. I’ll use it in the future.
If you believe HADCRUT its been warming since 1910.
1998 was the El Nino jump.
Then it stopped warming.
Jim, I am embarrassed for you. In describing your woodfortrees exercise, you represent the 9-years of 1990-1998 and the 13-years of 1998-2010 as decades. Either you can’t count, or you think “decade” means whatever you want it to mean.
Global average temperature anomalies for the last three decades are presented below. As you can see (I hope), the 2000-2009 decade was warmer than the decade of the 1990’s, which in turn was warmer than the decade of the 1980’s.
These averages were computed from GISTEMP annual data available at
M. carey –
“Decades” are an artificial division of time that rarely has any real meaning in the natural world. Sun cycles are not “decades”, nor are the PDO, AMO, NAO. Why then, are you using “decades”?
Nature doesn’t do “decades”
And I don’t do “decades” – unless they fit.
And – your numbers don’t fit. The discussion was wrt the 00’s being a WARMING decade.
But the 00’s were NOT a WARMING decade.
Whether that decade was the Warmest is immaterial to the discussion because that warming took place in the 90’s and there was no warming in the 00’s.
So now you should be embarrassed for yourself.
Jim, if the discussion was about decades, why were you referring to the 9-years of 1990-1998 and the 13-years of 1998-2010 in your woodfortrees graph? Clearly, those aren’t decades. Go back to woodfortrees and do decades. Then readers won’t wonder why you don’t know what “decade” means.
Jim, if the discussion was about decades, why were you referring to the 9-years of 1990-1998 and the 13-years of 1998-2010 in your woodfortrees graph?
Because I wasn’t discussing decades – I was discussing warming vs not-warming. And for that discussion, “decades” is a meaningless parameter.
You ARE confused, aren’t you.
Jim, I am quoting your July 20 9:04 pm post below for reference. Note you said ” The 90’s warmed, but the 00’s did not.” To illustrate, you then provide a link to a graph that shows an OLS for 1998-2010 rather than 2000-2010. You should be able to count well enough to know 1998-2010 is not the 00’s.
Jim Owen | July 20, 2011 at 9:04 pm | Reply
We have had three successive warming decades as the 2000′s were clearly warmer than the 90′s.
Not true. The 90′s warmed, but the 00′s did not. Ask tempterrain – I demonstrated that to him last night.
Jim, BTW, if you plot an OLS for 2000-2010, you will see you are wrong about the “00’s not warming.”
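For anyone wanting to reproduce this kind of check without woodfortrees, an OLS slope over a chosen window is only a few lines. The data below are synthetic (a constructed linear series), not the actual temperature record:

```python
def ols_slope(x, y):
    """Ordinary least-squares slope of y against x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# Synthetic example: anomalies rising 0.01/yr on top of a fixed offset.
years = list(range(2000, 2011))
anoms = [0.3 + 0.01 * (y - 2000) for y in years]
print(ols_slope(years, anoms))  # recovers the imposed 0.01/yr slope
```

Feeding in the real annual anomalies for 2000-2010 from any of the surface datasets gives the slope the two sides of this exchange are arguing about; the sign and size of that slope is exactly what separates “warmer decade” from “warming decade.”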
Woops ! My previous post linked to a wrong plot. I hope this one works.
This could just as easily be an article on–e.g., why the scientists have decided to take reason mainstream and why those in the global warming fearmongering machine, ‘are now searching for a way to back out quietly without having their professional careers ruined’ (as James Peden observed). There is a reason why global warming hysteria has cooled and more people have decided to get their insight as to climate change from, The Old Farmers Almanac. These are the folks that actually predicted global cooling (not warming) to begin in the winter of 2008-9. And, of course, they also see cooler weather in our future and maybe over the next 50 years.
What amuses/bothers me is that all this current recognition of natural variability was nowhere to be found in the 90’s. When the natural cycle(s) was accentuating the temperature/heat content increase, CO2 was the sole reason I heard about. It made me very nervous.
Now, of course, there is natural variability and sulphur and solar effects, and trends do not move in straight lines, all true; but it was also true in the 90’s and early 00’s, just not mentioned by the media/scientists.
Similarly, what other damage has been done to the culture and to the society because of those for whom the ends justify the means? Global warming alarmism has never been anything more than a symptom of something far more alarming.
In addition to unprecedented academic dishonesty, the `official’ inquiries regarding the academic dishonesty demonstrate also a complete lack of institutional honor.
Anyone who may have thought a generation ago that the US, the UK — or, Western civilization in general for that matter — was invulnerable to an academic integrity crisis such as the one that has struck the secular, socialist Education Industrial Complex, must surely awaken from their slumber.
We all must look very carefully at the hard facts: the West is dying. The global warming hoax is only a symptom. The killing of morality by the Left is the cause and we cannot survive that. As Dostoevsky warned in his own manner, “the West has lost Christ and that is why it is dying; that is the only reason.”
The principle that long term global temperature trends are punctuated by shorter term up and down fluctuations is illustrated by the record of the past 150 years, at Temperature Record, or Temperature Index, which indicate that the warming trend observed starting around 1910 has been characterized by multiple downward or flat intervals as well as upward ones, even with smoothing.
Although this thread is an inadequate venue for citing (or arguing about) the physics-based evidence that much of the long term trend reflects anthropogenic forcing from CO2 and other greenhouse gases, it is interesting to look at what GHG forcing might yield over a comparable interval in the absence of intervening variables. By the latter, I refer to anthropogenic and volcanic aerosols, solar changes, internal climate dynamics including ENSO and longer term fluctuations such as the PDO or AMO – all encompassed within a framework of chaotic responses to initial changes in these variables.
The following chart from Isaac Held’s blog depicts a GFDL CM2.1 modeled climate response to GHG forcing without the other variables – Global GHG Response. A few salient points emerge. First, GHGs (mainly CO2 but also methane and other emissions) can yield a warming trend during the first half of the twentieth century; this conflicts with a widespread assumption that the early warming must have been attributable almost entirely to other factors, including observed increases in solar irradiance and possibly a reduction in volcanic activity. The second striking finding is that many of the ups and downs seen with the actual climate record are reproduced by the model runs despite the absence of short term perturbations by the variables mentioned above. These are the chaotic fluctuations that characterize even a trend whose physics have been modeled to vary in a relatively smooth fashion. Equally striking is the fact that the fluctuations appear in different places in different runs of the model – they are clearly not a response to known external perturbations. Finally, and consistent with the article from Ed Hawkins posted above, they tend to even out over the long term.
How much “evening out” is to be expected over the course of a century? In theory, it could be much or little, depending on the timescale of any fluctuations imposed on a centennial trend. In practice, it appears that most identifiable fluctuations tend to average out over intervals either shorter than 100 years or much longer (e.g., orbital forcing). This appears to apply to the chaotic responses as well as more predictable changes. This is not, in my view, something that can be derived from any set of first principles based exclusively on mathematics, but requires empirical evidence about how the climate system actually behaves. The latter must obey the laws of physics as well as mathematics, and it appears as though certain mathematical concepts that might simulate climate changes over certain arbitrarily chosen intervals can’t easily be reconciled with the principles of thermodynamics over the longer term. One example that comes to mind is a “random walk” interpretation of multidecadal temperature anomalies – it appears as though the climate system responds over such intervals with adjustments that tend toward equilibrium even though this is not required by the mathematics alone. In a different climate on a different planet of different atmospheric composition, the result might be different.
The radiative properties of CO2 and other GHGs, and the feedbacks consequent to their changing atmospheric concentrations are properly the subject of a different thread, except to mention that the statistics of temperature change alone are only one subset of evidence bearing on their role.
Thinking about this some more, and I have not even glanced at the details of the paper, I have a feeling it is even more trivial than I first thought. It seems to go back to things I learned as an undergraduate in physics, namely signal-to-noise ratio. According to CAGW, there is a temperature signature of CO2, in that the more CO2 we add to the atmosphere, the higher temperatures climb. We need to detect this signal in the presence of the noise of natural variations of temperature. The stronger the CO2 signature is, with respect to the noise, the easier it is to detect.
All the paper is, I suspect, is a qualitative treatment of standard signal-to-noise ratio physics.
Yes, it’s simple. If the signal of natural variation isn’t yet known, possibly never for sure, the signal of Anthro CO2 also can’t yet be distinguished. With the state of our knowledge, we haven’t detected it distinctly from natural variation, even though it may be large.
Thanks Kim. I am trying to dredge back what I learned 60+ years ago, and I will almost certainly get the details wrong. Noise is, by definition, random. It has a magnitude and what was called something like a “time constant”; that is, the time over which the average value of the noise has not yet settled to zero. If the time to integrate the signal is short compared with the time constant of the noise, then there is bound to be a bias, because the noise has not been integrated to zero.
The problem for the IPCC is that in the TAR, and to a lesser extent in the AR4, they tried to pretend that the natural noise was so small that it is negligible. Now we know that this is simply not true. But we also know that the time constants of the natural noise are long compared with the integration time over which we are trying to detect the CO2 signal. This presents the IPCC with a really major practical problem. For any reasonable time frame, the noise of the natural variations will never integrate to zero; there will always be a residual.
Detecting a CO2 signature in the presence of residuals from random noise will be difficult. Unless the CO2 signal is very strong indeed, or the noise values are very weak, it will probably be impossible to detect the CO2 signal over any sort of reasonable period of time; say the next 50 years.
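The signal-to-noise point in the last few comments can be illustrated with a toy simulation: a small linear trend buried in AR(1) (“red”) noise with a long time constant is hard to recover over short windows and easier over long ones. All parameter values here (phi, sigma, the trend size) are arbitrary illustrations, not estimates of real climate quantities:

```python
import random

random.seed(0)  # deterministic toy run

def ar1_series(n, phi=0.9, sigma=0.1):
    """AR(1) 'red' noise; phi near 1 corresponds to a long time constant."""
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + random.gauss(0.0, sigma)
        out.append(x)
    return out

def ols_slope(y):
    """OLS slope of y against its index 0..n-1."""
    n = len(y)
    mx = (n - 1) / 2
    my = sum(y) / n
    num = sum((i - mx) * (yi - my) for i, yi in enumerate(y))
    den = sum((i - mx) ** 2 for i in range(n))
    return num / den

trend = 0.002  # arbitrary small "signal" per time step
detect = {}
for window in (10, 50, 200):
    slopes = []
    for _ in range(500):
        series = [trend * i + e for i, e in enumerate(ar1_series(window))]
        slopes.append(ols_slope(series))
    # Fraction of runs in which even the sign of the trend is recovered:
    detect[window] = sum(s > 0 for s in slopes) / len(slopes)

print(detect)
```

The qualitative behaviour (short windows give near coin-flip sign recovery, long windows do much better) is the "integration time versus noise time constant" argument in numerical form.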
Try the 60 year Pacific Decadal Oscillation on top of the variation from the Medieval Warm Period to the Little Ice Age.
e.g. Syun-Ichi Akasofu, On the recovery from the Little Ice Age, Natural Science Vol.2, No.11, 1211-1224 (2010) doi:10.4236/ns.2010.211149
Similarly, see Loehle, Craig, and S. Fred Singer. 2010. Holocene temperature records show millennial-scale periodicity. Canadian Journal of Earth Sciences 47(10): 1327-1336.
David, the millennial scale periodicity is the 974 year cycle in the internal distribution of angular momentum within the solar system.
David – There is evidence for oscillations (or at least fluctuations) on timescales much longer than the centennial warming trend since 1910. In addition, the PDO and AMO have tended to average out over that interval. In essence, there appears to be no identifiable major global fluctuation that could account for the secular trend of the past 100 years. Some of this was also covered in the previous thread on Time Varying Trend In Global Mean Surface Temperature, which delved into the statistical isolation of a secular trend from superimposed fluctuations in more precise mathematical detail.
I don’t really think the trend is in doubt, but at the risk of belaboring the point, conclusions about anthropogenic contributions to the warming rest on considerably more than the trend statistics, and include fairly detailed understanding of the basic physics reinforced by observational data. Issues remaining in that area (e.g., the quantitative value of climate sensitivity to CO2) are the topic of a different thread – they’ve been discussed fairly extensively in the past, and presumably will be again.
There certainly isn’t space here to revisit all the physics, but it’s worth pointing out that it would be a mistake to interpret the role of CO2 as a “conclusion by default” – i.e., it isn’t simply what is left over after everything else is accounted for, but is grounded in very substantial theoretical and observational evidence. The previous Time Varying Trend thread also elaborated on the point that there is little conflict between natural variability and secular trends when they operate over different timescales. Understanding each of them better shouldn’t be seen as an attempt to discount the role of either one.
Fred , Our understanding of evolution is based upon a sound understanding of biological and natural forces at play, but that does not mean that we can make scientific predictions about the future evolutionary development of any given species.
I can’t help dusting off and re-reading Popper’s critique of Historicism when thinking about Climate Science, and how it tries to base itself upon “non-repeatable” data. In this sense, it fails the demarcation test.
Jim – We can only carry analogies so far. However, we can predict the evolution of drug resistance in bacteria exposed to an antibiotic, or insecticide resistance in insects exposed to particular agents. In other words, the evolutionary response to certain pressures is predictable.
The use of the word “evolve” is probably misleading here, because the response to CO2, for example, is based on observable and reproducible properties of this molecule – similarly for water vapor and other critical moieties – and is therefore as predictable as a variety of other physical phenomena – e.g., the reduction in the freezing point of water in response to added salt. This is not “evolution” in any literal sense of the word, but simply the application of established principles of how certain molecules behave. Other elements of climate change are less predictable, but each must be considered in its own right, and on the basis of detailed knowledge, rather than by analogy with biological evolution. I don’t know of any means by which that knowledge can be replaced by abstract reasoning in order to arrive at any accurate understanding of climate responses to CO2, solar irradiance, aerosols, or other climate drivers. It’s essential to be familiar with the physics, the methods, and the evidence in detail.
Fred, When you design a steel beam to span between two points, you will know exactly how much weight it can bear before it fails. When you flip a coin, however, you will not know if it’s going to land heads or tails. These two examples are fundamentally different from one another. One meets the demarcation of science, the other is a guess.
When you predict tomorrow’s temperature (or the global temperature anomaly 5 decades from now) you are guessing. Tomorrow’s temperature will be what it will be. You can’t go back in time and “guess” it again to confirm it. Just like you can’t go back and re-guess a coin toss.
There are things in the universe that we can NEVER know — we can only guess. However, guesses cannot form the basis of engineering, which is exactly what climate scientists are trying to do.
The “no guess” can in fact be a guess that nothing will happen.
As for coin tosses, think probability.
The scientific method is not interchangeable with statistics — they are two different things.
“conclusions about anthropogenic contributions to the warming rest on considerably more than the trend statistics, and include fairly detailed understanding of the basic physics reinforced by observational data.”
Agreed – but they also ignore, or do not understand, major areas of physics with high uncertainties, including indirect solar variations, cloud variations, Pacific Decadal Oscillations, etc., with major impact on the climate sensitivity. After tuning models without these factors, the resultant climate model “projections” still fail foundationally as an argument from ignorance.
Hagendl – I can’t agree that those items are ignored. They are factored into the analyses. Most operate on timescales different from the centennial warming trend we’ve observed, and so there is little or no conflict between interpretation of that trend and the deviations from it due to the PDO or other climate dynamics. Solar variations are of course part of forcing that goes into projections.
Except for clouds, none of the items you mention “impacts” climate sensitivity, which is defined in terms of response to a specified forcing – usually what is meant is the temperature elevation resulting from a doubling of atmospheric CO2, but the concept can be extended to other forcings as well. If you mean that sensitivity estimates based on forcings require us to know which forcings are operating and at what magnitude, I agree with you. However, climate sensitivity estimates are a topic of enormous breadth and there is insufficient space here to address it in any meaningful way. It has been addressed previously and probably deserves an occasional thread of its own every so often in the future.
Fred – the PDO time scale of 30 years warming, now 30 years cooling, directly impacts those trends. Furthermore, a growing number of publications show much lower climate sensitivities, e.g. Roy Spencer, Five Reasons Why Water Vapor Feedback Might Not Be Positive, September 14th, 2010. See also Lindzen, Idso etc. Such evidence indicates insufficient attention and modeling has been given to natural effects.
David – The PDO and AMO have tended to average out over the past century, consistent with my point that oscillations on timescales less than centennial don’t conflict with the centennial warming trend due mainly to anthropogenic factors over much of the century but with a pronounced solar component early on.
The climate sensitivity data of Lindzen, Spencer, and others are beset with enormous problems, but more importantly, this is a topic that can be distorted by citing cherry picked examples. My own interpretation of the entire range of climate sensitivity data involving dozens of studies and multiple approaches is that the canonical range of 2 to 4.5 C per CO2 doubling is probably very reasonable, but this is clearly not a topic that can be adequately addressed in a few comments here.
OK, I went to read this in full, but had to stop at the end of the first sentence:
“The Earth’s climate is changing primarily due to two factors – (i) the response to external forcings such as greenhouse gases, solar output and volcanoes, coupled with, (ii) the inherent internal (or natural) variability of the climate system.”
So the essay only considers internal (natural) variability, but not external (natural) variabilities which affect climate, because they are ‘forcings’.
This restricts the scope of the discussion to decadal scales, when a real treatment of natural variability would include longer term trends and longer term variations in solar forcing as evidenced by proxy records such as 10Be and C14.
Tallbloke commented earlier in this thread on warming from 1690
The natural variability and longer term cycles are well illustrated in the CET records. This is the one that the Met Office tends to use; it goes back to 1772. The continual upward trend from that date can be clearly seen, albeit with numerous advances and retreats.
This is the extension of the record back to 1660.
It’s especially interesting because of the dramatic warming trend from 1690 to 1730. We have sufficient diaries and observations to extend that tentatively back to what appears to have been the coldest year in the second phase of the LIA, 1607. So there appears to have been a gentle warming trend, with advances and setbacks, for some 400 years. I intend to explore this period in a future article.
Good idea. They may have been reading Dr. Curry’s mail. See:
Curry, Judith A. 2010. Hurricanes and global warming: 5 years post Katrina. Scientific. Climate Etc. September 13. http://judithcurry.com/2010/09/13/hurricanes-and-global-warming-5-years-post-katrina/
I neglected to reference:
Hawkins, Ed. 2011. Learning about past climate from ships logs. Scientific Blog. Ed’s Open Lab-book. June 17. http://www.met.reading.ac.uk/~ed/blog/2011/learning-about-past-climate-from-ships-logs/
Our Evolving Climate
Evolving from what to what?
I’m probably just confused here, but it seems to me that if CAGW is correct and it is a function of the radiative physics, as Steve Mosher says, then to postulate warmer and colder years in a world in which the total energy of the system is constantly increasing requires something like a spring mechanism, where the energy is in one place and then oscillates to another; i.e. it shows up as atmospheric temperature in one year, then moves to another form (ocean temperature?) in another. Yet I don’t remember anyone finding the lost energy in a cool year. Unless a physical basis for this energy oscillation can be shown, the analysis seems to me to be incorrect. Wasn’t it Trenberth who said it was a travesty that the energy can’t be found?
I don’t mean to imply that it must be a regular oscillation just that when, at random, it becomes cooler the energy must show up some where else in the system. If it doesn’t then the temperature is being reset and is not an upward trend.
The only independent variable is the sun, and nominally it explains both global warming and global cooling. We know that global warming is not related to CO2.
A study of the Earth’s albedo (project “Earthshine”) shows that the amount of reflected sunlight does not vary with increases in greenhouse gases. The “Earthshine” data shows that the Earth’s albedo fell up to 1997 and rose after 2001.
What was learned is that climate change is related to albedo, as a result of the change in the amount of energy from the sun that is absorbed by the Earth. For example, fewer clouds means less reflectivity which results in a warmer Earth. And, this happened through about 1998. Conversely, more clouds means greater reflectivity which results in a cooler Earth. And this happened after 1998.
It is logical to presume that changes in Earth’s albedo are due to increases and decreases in low cloud cover, which in turn is related to the climate change that we have observed during the 20th Century, including the present global cooling. However, we see that climate variability over the same period is not related to changes in atmospheric greenhouse gases.
Obviously, the amount of ‘climate forcing’ that may be due to changes in atmospheric greenhouse gases is either overstated or countervailing forces are at work that GCMs simply ignore. GCMs fail to account for changes in the Earth’s albedo. Accordingly, GCMs do not account for the effect that the Earth’s albedo has on the amount of solar energy that is absorbed by the Earth.
Yes, this is a clear winner in the “Occam’s Razor Fight”. It is potent, it is proven, and the “basic physics” of reflective clouds is a lot less iffy and controversial than that of back-radiative trace gas.
Not only do GCMs fail to account for it, they fail to justify their basic design, or even their existence, in the face of it.
So the proper response to the GHG, GCM, AGW-pushers is, “Earthshine. Reflect on that.”
GCMs fail to account for changes in the Earth’s albedo. Accordingly, GCMs do not account for the effect that the Earth’s albedo has on the amount of solar energy that is absorbed by the Earth
Are you sure? I mean I don’t pretend to be an expert on GCMs but that sounded rather unlikely and a quick Google brought up –
On the basis of these observations, we conclude that the albedo feedback from the Northern Hemisphere cryosphere falls between 0.3 and 1.1 W m−2 K−1, substantially larger than comparable estimates obtained from 18 climate models.
John – The upward trend is not in doubt, but it is a trend in surface temperatures, and the fluctuations reflect shorter term shifts between the surface and elsewhere, where elsewhere refers primarily to different ocean layers, although in some cases, enhanced escape of energy to space may also be involved. For example, during an El Nino year, the ocean surface is warmer while the temperature at a lower depth in the upper part of the ocean declines, and the reverse is true for La Nina years. In addition, shifts from the entire upper ocean to the larger, deeper layer below 700 meters account for some of the energy. Because of regional differences, different ocean basins respond differently. All this makes it difficult for a precise accounting on timescales of less than a decade. On larger time scales, the ocean’s heat content has been increasing in a direction consistent with the added energy and the surface warming trend.
Trenberth wasn’t claiming that the energy was entirely unaccounted for, but rather that some of the observed surface warming could not be entirely explained by the magnitude of observed changes in the ocean (as well as the atmosphere and the melting of ice). His speculation has been that the unaccounted for part was probably somewhere in the oceans – possibly the deep ocean – but he did not dismiss the possibility that more energy had escaped to space than was calculated from observations. This would be consistent with occasional years of true “global cooling” interspersed among the larger number of warming years. An example would be a transient loss of energy to space, via the atmosphere, during phases of El Nino when ocean surface temperatures are unrepresentatively warm relative to deeper layers. The reverse would be the case for La Nina. The more important principle is that these are fluctuations around a trend that is upward over the longer interval of the past century.
Above, I was referring primarily to naturally occurring oscillations. Fluctuations also reflect changes in other climate forces such as solar irradiance and aerosols that reflect sunlight and exert a net cooling effect. These have modified the trend of the past 100 years to varying extents, including periods of flattening, but without any long term reversal.
Here is a link to a 2009 Trenberth Paper on the earth’s energy budget, including attempts to fully reconcile measured changes in surface temperature or ocean heat uptake with increases in the energy content of the climate system. New measurements on energy imbalances at the top of the atmosphere since that time have not yet answered all the questions, to some extent because of instrument errors or sampling problems.
O’Really? What warming below 700 meters? Assumes facts not in evidence. Saying, “It just gots to be there!” begs the question.
And “enhanced escape to outer space” IS direct negative feedback, to (any) warming NOTWITHSTANDING CO2 increase.
“Begging the question” by assuming AGW is fundamental to the models. As for the physics – Alfred Schack (1972) showed that the radiative component of heat transfer by CO2, though relevant at the temperatures in combustion chambers, can be neglected at atmospheric temperatures. The influence of carbonic acid on the Earth’s climates is definitively unmeasurable.
You are a foolish and bombastic man – an awful combination.
Here indeed is evidence from ARGO of heat content increase to 2000m in the period to 2003 – http://archimer.ifremer.fr/doc/2009/publication-6802.pdf
I do indeed realise that the NODC to 700m shows no heat content increase.
How does that work? I am clueless. But if you look at CERES for the same period – it shows about the same decrease in reflected shortwave so I think that Schuckmann may be right.
But it is all relative – I am less clueless than you by far.
The easiest case to understand is the El Nino which leads to warmer internal variation periods. In this situation the deep warm water block in the West Pacific spreads eastwards along the surface and gets shallower, warming the east Pacific and exposing more area of the warm water block to the atmosphere. Hence the atmosphere gains heat and the ocean loses it faster in El Nino years. The effects of the warmer air are seen in the global average surface temperature.
Pacific Ocean climate states have an origin in the abyssal depths. Deep oceanic currents are driven by thermohaline circulation and by the rotation of the planet. The deep currents interact with a sun-warmed surface layer that is a hundred or more metres deep. Deep ocean currents occasionally push through the warm surface layer in the south-eastern Pacific in one of the major areas for upwelling on the planet. Upwelling subsurface water is both frigid and rich in nutrients, leading to booms and busts in biological activity affecting fisheries, mammals and birds off the Pacific coast of South America. This area, designated as Large Marine Ecosystem (LME) No. 13, is amongst the most productive environments in the world and is known as the Humboldt Current.
The thermal evolution of the Humboldt Current is best understood in terms of ENSO. ENSO is an oscillation between El Niño and La Niña states over a 2 to 7 year cycle. An El Niño is defined as sustained SST anomalies greater than 0.5°C (in the Niño 3 region) over the central Pacific. Conversely, a La Niña is defined as sustained SST anomalies less than -0.5°C. The oscillations (more correctly, chaotic bifurcations) are driven by complex interactions of cloud, wind, sea level pressure, sea surface temperature, planetary rotation and surface and subsurface currents. The short explanation is that the Pacific trade winds set up conditions for a La Niña. Trade winds, south-easterly in the Southern Hemisphere and north-easterly in the Northern Hemisphere, pile up warm surface water against Australia and Indonesia. Water vapour rises in the western Pacific, creating low pressure cells that strengthen the trade winds – piling yet more warm water up in the western Pacific. Cool subsurface water rises in the eastern Pacific and spreads westward. At some point the trade winds falter and warm water spreads out eastward across the Pacific.
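The threshold definition above can be sketched in a few lines of code. This is purely illustrative: the ±0.5°C threshold comes from the comment, but the five-month “sustained” window used here is an assumption for the sake of the example, not an official index definition.

```python
# Illustrative sketch of the ENSO threshold definition quoted above:
# sustained Nino-region SST anomalies above +0.5 C -> El Nino,
# below -0.5 C -> La Nina, otherwise neutral.
# The 5-month "sustained" window is an assumption for illustration.

def enso_state(anomalies, threshold=0.5, sustained=5):
    """Classify an ENSO state from a sequence of monthly SST anomalies (deg C)."""
    if len(anomalies) < sustained:
        return "neutral"
    recent = anomalies[-sustained:]
    if all(a > threshold for a in recent):
        return "El Nino"
    if all(a < -threshold for a in recent):
        return "La Nina"
    return "neutral"

print(enso_state([0.6, 0.8, 0.9, 1.1, 0.7]))       # El Nino
print(enso_state([-0.6, -0.7, -0.9, -0.6, -0.8]))  # La Nina
print(enso_state([0.6, 0.2, 0.9, 1.1, 0.7]))       # neutral
```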
We should not neglect ENSO cloud feedbacks – low-level cloud forms over the cool ocean in a La Niña and dissipates over the warm ocean in an El Niño. The associated radiative changes are the biggest factor in decadal global warming or cooling – which does not occur unless there is a change in the TOA radiative imbalance.
Let me repeat my little formula:
Ein/s – Eout/s = d(GES)/dt
Ein/s and Eout/s are the average unit energies in and out respectively. The energies are simply the average of radiative flux (W/m2) in a period for 1 second (unit energy) – to give Joules/s. The differential is the rate of change in energy (heat + enthalpy) stored in the climate system.
If energy in is greater than energy out – then the rate of change is positive – and the planet is warming. And vice versa. The formula perfectly describes global energy dynamics in three terms by the law of conservation of energy.
Solar UV for instance doesn’t change the energy equation by much directly – it may have an influence on THC and ENSO and thus an indirect influence.
Ein changes a little with the 11-year cycle and perhaps a very little more in the longer term. Eout varies with GHGs and albedo. But unless there is a change in radiative flux – there is no planetary (oceans and atmosphere) warming or cooling.
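The commenter’s conservation-of-energy identity can be made concrete with a minimal numerical sketch. The flux values below are invented for illustration only; the point is simply that a persistent imbalance between energy in and energy out integrates into a change in stored energy.

```python
# Minimal numerical sketch of the energy-balance identity above:
#   Ein/s - Eout/s = d(GES)/dt
# All flux values below are illustrative, not measured.

def delta_storage(e_in, e_out, seconds):
    """Change in stored energy (J/m^2) over a period, given average
    incoming and outgoing flux (W/m^2 = J/s per m^2)."""
    return (e_in - e_out) * seconds

SECONDS_PER_YEAR = 365.25 * 24 * 3600

# A purely illustrative 0.5 W/m^2 imbalance sustained for one year:
d_ges = delta_storage(e_in=240.5, e_out=240.0, seconds=SECONDS_PER_YEAR)
print(round(d_ges / 1e6, 2))  # ~15.78 MJ/m^2 gained: warming
```

A negative imbalance (energy out exceeding energy in) gives a negative rate of change, i.e. cooling, exactly as the comment states.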
Leftist government science authoritarian James Hansen is Western civilization’s Cassandra.
Cassandra was right, but of course you knew that and that was your point I guess.
You mean, Cassandra predicted a coming ice age in the 70s and then changed her prediction to runaway global warming and a 5m rise in sea level by 2100?
In view of the following conclusion:
The simulated temperature changes associated with thermohaline circulation (THC) are large enough to modify estimates of the rate of anthropogenic climate change. 
Is it now accepted by the climate science establishment that the following IPCC statement is incorrect?
For the next two decades, a warming of about 0.2 deg C per decade is projected for a range of SRES emission scenarios. Even if the concentrations of all greenhouse gases and aerosols had been kept constant at year 2000 levels, a further warming of about 0.1 deg C per decade would be expected. 
In the following global mean temperature graph, for the next decade or two, is it not reasonable to project a global warming rate less than the long term warming trend of 0.06 deg C per decade?
Knight, J. R., R. J. Allan, C. K. Folland, M. Vellinga, and M. E. Mann (2005), A signature of persistent natural thermohaline circulation cycles in observed climate, Geophys. Res. Lett., 32, L20708, doi:10.1029/2005GL024233.
IPCC Fourth Assessment Report: Climate Change 2007, Working Group I: The Physical Science Basis.
Climates don’t evolve, since we’re being all nitpicky here.
They don’t reproduce, don’t have codes to mutate.
Climate just follows physics.
Since you insist on picking nits: climates do evolve, they just don’t evolve by random variation in offspring and natural selection of the next parents.
It is our understanding of climate which is evolving.
Of course, humans evolved, but our niche disappeared, and now we’re building a niche for which we are not yet adapted, and may not succeed in adapting to without a huge drop in numbers. Of course, the rest of the critters here may breathe a sigh of relief, presuming they’re not on our accidental hit list.
“Of course, humans evolved, but our niche disappeared…”
Where did it go? We only had one… who do I speak to about that?
Our niche didn’t disappear. In fact, we’re just in the process of developing it, if the Luddites don’t manage to seize control.
Average – Schmaverage.
Global average? My plants only feel the weather in my garden and they feel it one day at a time. They don’t remember how warm or cold it was last week and they don’t care what happens in Bangladesh.
‘Most of the studies and debates on potential climate change have focused on the ongoing buildup of industrial greenhouse gases in the atmosphere and a gradual increase in global temperatures. But recent and rapidly advancing evidence demonstrates that Earth’s climate repeatedly has shifted dramatically and in time spans as short as a decade. And abrupt climate change may be more likely in the future.’ http://www.whoi.edu/page.do?pid=12455
Climate is non-linear – and here I am in good company: the National Academy of Sciences, the Royal Society, the Woods Hole Oceanographic Institution and many individual scientists, such as Tim Palmer – head of the European Centre for Medium-Range Weather Forecasts – and atmospheric physicist Anastasios Tsonis.
‘What defines a climate change as abrupt? Technically, an abrupt climate change occurs when the climate system is forced to cross some threshold, triggering a transition to a new state at a rate determined by the climate system itself and faster than the cause. Chaotic processes in the climate system may allow the cause of such an abrupt climate change to be undetectably small.’
‘Researchers first became intrigued by abrupt climate change when they discovered striking evidence of large, abrupt, and widespread changes preserved in paleoclimatic archives. Interpretation of such proxy records of climate—for example, using tree rings to judge occurrence of droughts or gas bubbles in ice cores to study the atmosphere at the time the bubbles were trapped—is a well-established science that has grown much in recent years. This chapter summarizes techniques for studying paleoclimate and highlights research results. The chapter concludes with examples of modern climate change and techniques for observing it. Modern climate records include abrupt changes that are smaller and briefer than in paleoclimate records but show that abrupt climate change is not restricted to the distant past.’ An example of recent abrupt climate they give is the shift in the Pacific to more frequent El Nino after 1976.
A cogent theory for GHG’s causing cooling?
‘Recent research, however, suggests that there is a possibility that this gradual global warming could lead to a relatively abrupt slowing of the ocean’s thermohaline conveyor, which could lead to harsher winter weather conditions, sharply reduced soil moisture, and more intense winds in certain regions that currently provide a significant fraction of the world’s food production. With inadequate preparation, the result could be a significant drop in the human carrying capacity of the Earth’s environment.’ http://www.s-e-i.org/pentagon_climate_change.pdf
One of the factors influencing the major sources of decadal variability is solar UV drift causing changes in the Arctic and North Atlantic Oscillations and the Southern Annular Mode – with implications for storm tracks, thermohaline circulation and ENSO. As solar UV is involved and is declining from a 1000-year high in the 1990s – another little ice age seems possible. If not a triggering of feedback mechanisms – cloud, ice, vegetation, snow cover and thermohaline circulation – that could result in dramatic cooling ‘in as little as a decade.’
‘During the descent into the recent `exceptionally’ low solar minimum, observations have revealed a larger change in solar UV emissions than seen at the same phase of previous solar cycles. This is particularly true at wavelengths responsible for stratospheric ozone production and heating. This implies that `top-down’ solar modulation could be a larger factor in long-term tropospheric change than previously believed, many climate models allowing only for the `bottom-up’ effect of the less-variable visible and infrared solar emissions. We present evidence for long-term drift in solar UV irradiance, which is not found in its commonly used proxies. In addition, we find that both stratospheric and tropospheric winds and temperatures show stronger regional variations with those solar indices that do show long-term trends. A top-down climate effect that shows long-term drift (and may also be out of phase with the bottom-up solar forcing) would change the spatial response patterns and would mean that climate-chemistry models that have sufficient resolution in the stratosphere would become very important for making accurate regional/seasonal climate predictions. Our results also provide a potential explanation of persistent palaeo results showing solar influence on regional or local climate indicators.’ http://iopscience.iop.org/1748-9326/5/3/034008/fulltext
‘During the past three decades a suite of space-based instruments has monitored the Sun’s brightness as well as the Earth’s surface and atmospheric temperatures. These datasets enable the separation of climate’s responses to solar activity from other sources of climate variability (anthropogenic gases, El Niño Southern Oscillation, volcanic aerosols). The empirical evidence indicates that the solar irradiance 11-year cycle increase of 0.1% produces a global surface temperature increase of about 0.1 K, with larger increases at higher altitudes. Historical solar brightness changes are estimated by modeling the contemporary irradiance changes in terms of their solar magnetic sources (dark sunspots and bright faculae) in conjunction with simulated long-term evolution of solar magnetism. In this way, the solar irradiance increase since the seventeenth century Maunder Minimum is estimated to be slightly larger than the increase in recent activity cycles, and smaller than early estimates that were based on variations in Sun-like stars and cosmogenic isotopes. Ongoing studies are beginning to decipher the empirical Sun–climate connections as a combination of responses to direct solar heating of the surface and lower atmosphere, and indirect heating via solar UV irradiance impacts on the ozone layer and middle atmosphere, with subsequent communication to the surface and climate. The associated physical pathways appear to involve the modulation of existing dynamical and circulation atmosphere–ocean couplings, including the ENSO and the Quasi-Biennial Oscillation. Comparisons of the empirical results with model simulations suggest that models are deficient in accounting for these pathways.’ Lean, J. (2008) How Variable Is the Sun, and What Are the Links Between This Variability and Climate?, Search and Discovery Article #110055
GHGs should be seen as a minor influence based on fundamental physics – and as simply one factor in a complex and dynamic equation of climate. We cannot assume that warming will occur. However – even if a small factor – climate must be more unstable as we are changing one of the factors, with unpredictable consequences. And it is changed in the midst of other changes – which include both TSI (including UV) and albedo as radiative changes – and which can fundamentally alter the climate trajectory. The observations showing an increase in cloud in the 1998/2001 climate shift, clearly visible in multiple records, are an example. I think for the card analogy to work – we must declare deuces and jokers wild.
Well, the recent paleo record shows CO2 rise trailing temperature rise by a little less than a millennium, and that rise always precedes the temperature drop which inevitably supervenes.
‘There was a place, below a lonely, hunched gum,
Skin, sun-aged bark, and flaking,
Where we dipped our feet in the quick-cool stream;
Where you and I built sandcastles.’
Nostalgia: Jamie Holden-King
So what is the take-away WRT production of GHGs (or any other “nudge” to the system? “It’s unpredictably unstable; don’t do anything that might irritate it!”??
Actually, the only important line in the above is this: “GHGs should be seen as a minor influence based on fundamental physics – and as simply one factor in a complex and dynamic equation of climate. We cannot assume that warming will occur.” The following “However” does not follow. Instability is not a predictable consequence of GHG increase. Given the multi-million-year long periods of Hot House Earth, with “average” temps stuck at 22°C notwithstanding huge order-of-magnitude swings in CO2 levels, the rational expectation is for stability regardless of GHG levels (over time periods relevant to any possible current societal or economic or industrial response).
It may be that a huge unpredictable “swing” is in our future, but nothing we can do will evit the in-evit-able. Not even “rolling back industrial civilization”, as Lindzen put it.
Please see my earlier comment to nonsense you posted upthread. I quote authoritative sources extensively – and you reply with a free form narrative ‘superficially in the language of science’ from your own imagination.
I am sorry to need to be stern – but you accuse me elsewhere of plotting world domination and genocide – and now of rolling back civilisation. This is not the mark of civilised discourse.
The policies you objected to as waffle are those of the Hartwell 2010 Paper. ‘The Paper therefore proposes that the organising principle of our effort should be the raising up of human dignity via three overarching objectives: ensuring energy access for all; ensuring that we develop in a manner that does not undermine the essential functioning of the Earth system; ensuring that our societies are adequately equipped to withstand the risks and dangers that come from all the vagaries of climate, whatever their cause may be.’
They were written by –
Professor Gwyn Prins, Mackinder Programme for the Study of Long Wave Events, London School of Economics & Political Science, England
Isabel Galiana, Department of Economics & GEC3, McGill University, Canada
Professor Christopher Green, Department of Economics, McGill University, Canada
Dr Reiner Grundmann, School of Languages & Social Sciences, Aston University, England
Professor Mike Hulme, School of Environmental Sciences, University of East Anglia, England
Professor Atte Korhola, Department of Environmental Sciences/ Division of Environmental Change and Policy, University of Helsinki, Finland
Professor Frank Laird, Josef Korbel School of International Studies, University of Denver, USA
Ted Nordhaus, The Breakthrough Institute, Oakland, California, USA
Professor Roger Pielke Jnr, Center for Science and Technology Policy Research, University of Colorado, USA
Professor Steve Rayner, Institute for Science, Innovation and Society, University of Oxford, England
Professor Daniel Sarewitz, Consortium for Science, Policy and Outcomes, Arizona State University, USA
Michael Shellenberger, The Breakthrough Institute, Oakland, California, USA
Professor Nico Stehr, Karl Mannheim Chair for Cultural Studies, Zeppelin University, Germany
Hiroyuki Tezuka, General Manager, Climate Change Policy Group, JFE Steel Corporation (on behalf of Japan Iron and Steel Federation), Japan
I suggest – if it is not too late – that you learn some dignity and humility
Good catch, Chief. Some other tidbits in the Hartwell Paper are:
(Marketing Plan:) Page 07: “Without a fundamental re-framing of the issue, new mandates will not be granted for any fresh courses of action, even good ones. So, to rebuild climate policy and to restore trust in expert organisations, the framing must change and change radically.”
(Justification:) Page 20: “As The Economist wrote in its special survey of climate science on 20th March 2010, “Action on climate is justified, not because the science is certain, but precisely because it is not.” Its view is close to ours.”
(Goal:) Page 20: “Our goal is broad-based support for radical acceleration in decarbonisation of the global energy economy. We believe that an indirect approach, which pulls on the twin levers of reducing the energy intensity of economies and the carbon intensity of energy, is more likely to win public assent than a frontal assault upon carbon emissions, especially one coming soon after the recent turbulences.”
“First, we do not mean that all or any action on the most ambitious goal of radical decarbonisation is postponed until previous steps – such as efficiency improvements – are successfully underway, let alone complete.”
(In spite of:) Pages 22–23: After appropriate statements about the role of CO2/GHGs as a driver of AGW, the authors introduce two other factors, each independent of CO2 emissions: land use and UHI.
The Hartwell Paper, to which another source responded:
Anonymous. 2010. Oblique strategies. The Economist – Green.view. May 11. http://www.economist.com/node/16099521?story_id=16099521
“The difficulty with the current approach to climate change, according to the Hartwell paper, is that climate change is not a problem.”
A reasonable article.
‘The difficulty with the current approach to climate change, according to the Hartwell paper, is that climate change is not a problem. A problem can be identified, isolated and, in principle, solved. Climate change isn’t like that: it is “better understood as a persistent condition that must be coped with and can only be partially managed more—or less—well.”
Nicely done, Chief!
Here is a graphic of decadal trends using the warm running Giss data.
The two largest trends, both negative and positive, occurred in the first half of the last century.
Any 10-year period can be a decade, but by choosing to start your decades with years 1 instead of years 0, you make it difficult to refer to decades. For example, the decade of the 1990’s generally is understood to mean 1990-1999, not 1991-2000.
Ten years isn’t much time for a trend. Have you tried to identify the sharpest increases and decreases over broader periods?
Perhaps there are larger implications in what you say?
If going from 1990 to 1991 can make a big difference, then could it be that what is being sought is much less significant than many believe?
I’m not sure about that – I tend to assume a decade starts with year ..1
I remember some fairly heated discussions on this point at the end of the last millennium and I was firmly in the camp that the new millennium started on 01.01.2001, although not to the extent that I held my own millennium party on 31.12.2000.
Of course it doesn’t really matter as a decade is a pretty arbitrary, but occasionally convenient, period anyway and I think hunter has it right – if moving one year either way makes much of a difference then you can’t really draw strong conclusions either way.
The Giss record runs from 1881-2010. If I start with any other year the new century data couldn’t even be included. You wouldn’t want me to ignore that would you? I only showed decadal trends because there was an entire section in the article on predicting a decade ahead and I was simply illustrating how that looks in the record. If you aren’t content with that, do it yourself.
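The decadal-trend exercise described above is easy to sketch: an ordinary least-squares slope over each consecutive 10-year window of an annual series. This is a hedged illustration of the method, not the commenter’s actual script, and it uses no real GISS data; shifting the window start by one year (decades beginning with years 0 versus years 1) simply shifts every window, which is why that choice can alter the computed trends.

```python
# Sketch of computing decadal trends from an annual temperature series.
# OLS slope per window, scaled to deg C per decade. Data not included;
# the commenter used the GISS record (1881-2010).

def ols_slope(ys):
    """Least-squares trend (units per year) for equally spaced annual values."""
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def decadal_trends(start_year, values, length=10):
    """Trend (deg C per decade) for each consecutive window of `length` years."""
    trends = {}
    for i in range(0, len(values) - length + 1, length):
        trends[start_year + i] = ols_slope(values[i:i + length]) * 10
    return trends
```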
Ed Hawkins starts off with:
We have seen an earlier study by Wu et al., which raises serious questions regarding the validity of this claim, for exactly the reason, which Hawkins wants to emphasize: a significant portion of the warming can be attributed to “natural variability” (in this case, the multi-decadal Atlantic Oscillation, to which the authors attribute 0.08C per decade of the observed recent warming).
In other words, both Hawkins and Wu et al. are really telling us that it is NOT “very likely that humans have caused most of the warming of the Earth’s climate since the mid-20th century.”
Hawkins points out that we are on a long-term warming trend. This is obviously correct: the HadCRUT3 record confirms that this has been around 0.04C per decade since 1850. He concludes that this underlying warming trend plus the observed multi-decadal warming and cooling cycles observed to date are likely to continue into the future.
As far as global temperature for the immediate future is concerned, Hawkins writes:
IOW we should have a year, which is warmer than 1998, within the next 5 years, according to the GCMs. [Let’s see if they are right.]
Hawkins has “bought in” to the hypothesis being promoted by IPCC and supported by the Met Office: i.e. that human CO2 emissions will dominate the long-term climate projections, but he does not provide any statistical reasoning to support this premise.
He mentions “climate inertia” (another way of calling the “hidden in the pipeline” postulation of James E. Hansen et al.), but also gives no justification for this premise based on actual observations.
Hawkins mentions that the actual temperature record did not show the warming, which was projected by the GCMs and concludes:
Strangely, he does not mention the most obvious reason:
The study shows the relative importance of three sources of uncertainty in future temperature projections: scenario uncertainty, model response uncertainty and internal variability uncertainty.
As I read Figure 4, out of a projected warming by 2100 of 2.0°C ± 1.2°C (or 0.8°C to 3.2°C), the largest source of uncertainty is apparently “scenario uncertainty” (how much CO2 increase will we have to year 2100?), the second is “model response uncertainty” (I read this to mean primarily, what is the real 2xCO2 climate sensitivity?) and the smallest is “internal variability uncertainty” (how large a role will natural variability and forcing really play?). [To me, the latter two are directly related.]
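If the three uncertainty sources named above are treated as independent, their variances add, which is one simple way to see how a total spread decomposes into fractional contributions. This is a generic statistical sketch with invented numbers, not the partitioning method Hawkins actually used.

```python
# Illustrative sketch: combining three independent uncertainty sources
# (scenario, model response, internal variability) in quadrature and
# reporting each source's share of the total variance. Numbers invented.
import math

def total_spread(scenario, model, internal):
    """Combine independent 1-sigma spreads (deg C) in quadrature."""
    return math.sqrt(scenario**2 + model**2 + internal**2)

def variance_fractions(scenario, model, internal):
    """Fraction of total variance contributed by each source."""
    var = scenario**2 + model**2 + internal**2
    return {"scenario": scenario**2 / var,
            "model": model**2 / var,
            "internal": internal**2 / var}
```

With illustrative spreads of, say, 0.8, 0.5 and 0.2°C, the scenario term dominates the variance, mirroring the ordering described in the comment.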
This is simply a fancier way of expressing what Vicky Pope of the UK Met Office concluded when asked why global temperatures had stopped warming despite CO2 increase to record levels (i.e. because of “natural variability”).
The study points out that natural variability has played a role in the past and will most likely continue to do so in the future. This makes good sense.
I personally have two problems with Hawkins’ study:
Hawkins starts his analysis in 1960, and so does not attempt to examine the several multi-decadal warming and cooling oscillations in the past record. This is the biggest problem, because it only enables Hawkins to examine one partial cycle in a record that has numerous multi-decadal oscillations with a total cycle time of ~60 years. It also gives Hawkins a warming trend line which is steeper than the actually observed long-term warming trend.
All in all, Hawkins is much too cautious in skirting around the obvious conclusion that the Met Office (and IPCC) forecasts of warming were too high because the GCMs have exaggerated the importance of anthropogenic forcing (i.e. primarily from CO2) at the same time understating the importance of natural forcing and variability.
Just my comments.
The published version of the paper (behind paywall unfortunately) does include “(iv) because the model is slightly too sensitive.”, which is missing from my pdf version.
In answer to your first comment – there is nothing in the article which assumes a 1960 start and nowhere do I estimate the trend in the observations, I just plot them starting in 1950, and Fig. 5 does show a longer record.
This suggests that the most obvious explanation did not occur to Ed Hawkins until it was pointed out by a reviewer. Perhaps Ed would benefit from talking further with people like Max.
Thanks very much for your post, which addressed my two main concerns.
I realize that anyone in the climate business today has to walk very gingerly around the consensus position on AGW, but papers such as yours are refreshing in that they show that there is still great uncertainty regarding natural-caused versus human-induced changes in our climate (as our host here has also expressed repeatedly), despite the official IPCC position that natural forcing has caused less than 7% of the observed warming. This uncertainty could greatly influence the GCM-based estimates on 2xCO2 climate sensitivity and, hence, the projections of future AGW-caused warming by year 2100.
One may argue about whether or not these projections lie in the 2.0°C ± 1.2°C range (as your chart showed), but one cannot reasonably argue that the uncertainty is “insignificant”.
Well done Max – typical highly literate clarity that others would do well to consider.
Just a point on Wu – they raised changes in THC as one possible source of interdecadal variation. There are of course others.
The method doesn’t exclude the contribution of factors other than CO2 to the 20th century trend. A centennial drift in solar UV, perhaps?
“Ed Hawkins starts off with: It is “very likely” that humans have caused most of the warming of the Earth’s climate since the mid-20th century; this was a key conclusion of the 4th Assessment Report (AR4; Solomon et al., 2007) of the Intergovernmental Panel on Climate Change (IPCC). We have seen an earlier study by Wu et al., which raises serious questions regarding the validity of this claim, for exactly the reason, which Hawkins wants to emphasize: a significant portion of the warming can be attributed to “natural variability”… In other words, both Hawkins and Wu et al. are really telling us that it is NOT ”very likely that humans have caused most of the warming of the Earth’s climate since the mid-20th century.”
Max – That is incorrect. The AR4 conclusion related to warming since about 1950, while Wu et al showed that warming from about 1980 to the present was too steep to exclude a larger fractional role for factors other than anthropogenic greenhouse gases (“global brightening” due to reduction in cooling aerosols may have been a major factor in the 1980-2010 trend). In other words, the AR4 conclusion for warming due to greenhouse gases and Wu et al are not in conflict.
The same misconceptions emerged during the earlier thread on time trends, prompting the authors to respond, indicating that their 25-year analysis did not conflict with AR4’s 50-year conclusions regarding the secular trend (ST), but rather showed that the acceleration during the later 25 years required additional non-anthropogenic factors. Their statement, reported by Dr. Curry, was as follows:
“Note from Mike Wallace and Zhaohua Wu added 7/18:
Thank you for your kind words about our article in Climate Dynamics. In our Table 2 and in the discussion on the page of the article that precedes it we draw a distinction between the partitioning of the trend in the past 25 years and in the past 50 years. Our statement, “Therefore, the estimated global warming due to human activities over the past 25 years ranges from about 0.10 K to about 0.15 K per decade, depending on the assumed partitioning of the MDV between natural and anthropogenic aerosol-forced variability” refers to the past 25 years, also referred to as “the past few decades” in our article. Our ST component accounts for a larger component of the 50-year trend. Our intent was not to contest the IPCC’s attribution of the “late 20th century” (i.e., 50-year) trend, but, rather, to question whether the acceleration in the rate of greenhouse warming was as pronounced as implied by results presented in AR-4″.
The confusion on this point highlights the principle that a strong trend and strong oscillations or other deviations from that trend are not necessarily mutually exclusive when they operate over different timescales – e.g., the “greenhouse warming” the authors refer to and the natural variations they see as adding to the steepness in the interval after 1980.
There is no confusion in my mind at all. It is clear that the relative roles of natural variability (and/or forcing) versus anthropogenic forcing in shaping our recent climate have been questioned by both Wu and Hawkins.
Hawkins has also pointed out that the published version of his paper did include the question on whether or not the GCMs were using too high a climate sensitivity.
While Hawkins has corrected my statement that his study only “started in 1960”, it appears that most of the graphical comparisons start with 1960. In addition, the repetitive 60-year oscillation in global temperature over the entire record was not studied.
The “steepness after 1980” may well simply be a repetition of earlier indistinguishable upward cycles of this oscillation, rather than evidence of an added “greenhouse warming” signal.
Max – Since Wu doesn’t agree with your statement about what he believes, you should probably explain to him why what he believes is different from what he states he believes. Alternatively, you could depart from your usual inability to admit mistakes and admit you made a big one in this instance.
You seem confused.
I am not.
“Further testing of our climate models in a similar way is vital to increase confidence in their use for longer term projections and potentially identify parts of the model which require improvements.”
What would have to happen for you to conclude that your computer program is wrong? How many decades might it take?
None of this seems very useful to me.
Increased predictability of internal variability will improve regional forecasts, and the role of internal variability is also important for decadal predictions? I think we’ve known that for a while, and I appreciate Ed Hawkins’ work. Shorter-term and regional prediction is a pretty new field, and most people agree its development is important, e.g. for disaster planning, infrastructure and food security, in a changing climate.
It’s fascinating how your special people react to advances, Judith.
From comments, it’s hard to tell where the ‘rational’ part of ‘rational skeptic’ is hiding.
David and quite a few others have really gone downhill. :-(
The Egyptians practised decadal scale prediction and disaster planning 3000 years ago. They had grain stores adequate for seven years of drought. Today, Oxfam say there is an 810 million dollar shortfall in the disaster relief fund for Ethiopia. This figure is 130 times less than what the US spent on ‘man made global warming’ since 1993.
This sceptic has been saying all along that climates vary in many ways, and that preparedness for all possibilities is the only sensible course, because predictions of decadal local climate change based on a CO2-driven model are doomed to failure.
They care more for the unborn than the living!
The opportunity costs of AGW are massive and carry a large number of casualties.
Ed’s study is useful for all the reasons you state. It’s one small piece of a very large jigsaw puzzle, most of whose pieces are still missing, but nevertheless still interesting.
The paper is not about global warming or the influence of CO2. It’s about the size and length of variations around a trend (or long oscillations with periods of more than about 100 years). It presents some results from model runs and shows that these model runs have similar variability at a level that may be stronger than the mainstream estimates for AGW. Thus the paper speaks only to the limits of using temperature time series to draw conclusions on the trend. The results apply in both directions. The paper doesn’t try to prove AGW, but it’s certainly reasonable to compare the fluctuations to the estimates of AGW. Assuming a certain level of AGW is needed for that, and it’s also present through the use of readily available model runs.
Ten years ago the available temperature data indicated strongly rising behavior over the previous 30 years. Taken alone, that allowed both lower and higher estimates for the underlying trend than directly observed. Some part was obviously due to ENSO, but even after the estimated effect of that was removed, the data still allowed for a very rapid increase in temperature. It’s not right to say that the uncertainty was neglected to get a higher estimate for the underlying trend than was supported by the data, because the uncertainty was present in both directions.
Since 1998 the temperature has been relatively flat. That excludes with high certainty the highest estimates for an underlying trend that were consistent with the data up to 2000, but it doesn’t change the fact that the observations continue to indicate warming from decade to decade. The best fits to the data still contain a significant underlying warming trend. There is at the same time more evidence than before for a sizable multidecadal (quasi-)oscillatory component. The present best estimate for the climate sensitivity is certainly lower than it would be if the warming had continued over the last 10 years at the speed of the preceding 30 years. Thus the upper limits on climate sensitivity provided by the measured temperatures should now be more restrictive. This limit is still not very restrictive, but it should be more restrictive than a similar limit was 10 years ago. (Conversely, a rapidly warming decade would have raised the lower bound on the climate sensitivity.)
The previous paragraph applies to the limits on climate sensitivity provided by the temperature measurements. Other information has been as important or more important in providing limits on climate sensitivity. Some of that may be indirectly influenced by the temperatures of the last decade, but most is totally unaffected by that.
‘Some part was obviously due to ENSO, but even after the estimated effect of that was removed, the data did allow also for a very rapid increase in temperature.’
It is a problem with being qualitative. From HadCRUT3 it is a 0.47 degrees C increase across 2 ENSO events.
As you know – I think these are ENSO dragon-kings – which are defined as extreme events at periods of chaotic bifurcation. These increases are a large part of the recent temperature rise and are most ‘obviously due to ENSO.’ This is a very simple observation – but quite rigorous.
If you care to check – all of the recent warming occurred between 1976 and 1998.
ERBS in the tropics and ISCCP in the tropics – show similar warming in SW (2.1 and 2.4 W/m2 decrease in reflected shortwave between the 1980’s and 1990’s) and significant decadal trend in the global data in the same direction – http://isccp.giss.nasa.gov/zFD/an9090_SWup_toa.gif – identified by NASA as a ‘slow decrease of upwelling SW flux from the mid-1980’s until the end of the 1990’s and subsequent increase from 2000 onwards appear to be caused, primarily, by changes in global cloud cover (although there is a small increase of cloud optical thickness after 2000) and is confirmed by the ERBS measurements.’ The same satellites show cooling in the IR.
There are COADS observations in the Pacific that show a decrease in cloud in the 1970’s, and satellite and ‘Earthshine’ measurements showing about a 2W/m2 increase in reflected SW after 1998 – the cause of the lack of warming since 1998. Consistency between observations from various methods and instruments provides greater confidence.
Most of the recent warming was caused by ENSO and most of the rest was caused by cloud cover changes. As ENSO has cloud feedbacks that are negatively correlated with SST – decadal variability in the Pacific is likely to have contributed to decadal variability of cloud cover.
As I have discussed in this thread by extensive reference to authoritative sources – climate is non-linear. ENSO is non-linear, non-stationary and non-Gaussian. I believe that climate is therefore hyper-sensitive in the region of a chaotic bifurcation but insensitive elsewhere.
“I believe that climate is therefore hyper-sensitive in the region of a chaotic bifurcation but insensitive elsewhere.”
Chief, maybe you can link to a few of your previous elaborations on this? And do you know of any work on seasonality bifurcations?
A lot of wasted time in reading Hawkins’ study and many of the posts here.
To understand the CET evolution since 1650’s, it is important to understand the North Atlantic, and it appears that very few commentators do.
Note the important point made by Tony B
“It’s especially interesting because of the dramatic warming trend from 1690 to 1730.”
No scientific hypothesis, let alone a theory, regarding the CET trends is complete without an explanation for this extraordinary period; it is no good avoiding it!
I would recommend the article from the WHOI stating:
One of the “pumps” that helps drive the ocean’s global circulation suddenly switched on again last winter for the first time this decade.
then a quick look at:
Going back to Tony B’s point about 1690-1730. The WHOI’s “pump” of thermohaline circulation (THC) in the North Atlantic is controlled by the subpolar gyre. Data related to these events clearly shows a direct correlation to one of the more puzzling periods (1690 to 1730) in the CET evolution.
I don’t know if you ever saw our study on ‘cooling’ trends where the variations over the longer term can be clearly seen.
The apparent oscillatory behavior is very clear in the instrumental data, but it has been observed well for less than two full periods, which leaves much space for alternative explanations. The overall period is so short that the last 10 years form an essential part of the evidence, and that tells clearly how insufficient the data is for concluding that it’s really of an oscillatory nature. I know that there is support for the oscillations also from the earlier history, but that evidence is much weaker, as is all paleoclimatological data.
The oscillations are strong enough to support the conclusion that the temperature change of the period 1970-2010 alone is not strong evidence for AGW. In the Bayesian approach the rising temperatures provide support to all hypotheses that predict a temperature increase of the right order of magnitude, but the support of any hypothesis is weak, if there are several hypotheses that lead to similar warming.
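Pekka’s Bayesian point can be made concrete with a two-hypothesis toy calculation: when competing hypotheses predict the observed warming about equally well, the observation barely moves the posterior away from the prior. The prior and likelihood numbers here are invented purely for illustration.

```python
def posterior(prior, likelihoods):
    """Bayes update for competing hypotheses given one observation.
    prior and likelihoods are dicts keyed by hypothesis name."""
    unnorm = {h: prior[h] * likelihoods[h] for h in prior}
    z = sum(unnorm.values())
    return {h: v / z for h, v in unnorm.items()}

prior = {"AGW": 0.5, "natural_oscillation": 0.5}

# Both hypotheses predict warming of roughly the observed magnitude:
similar = posterior(prior, {"AGW": 0.8, "natural_oscillation": 0.7})

# Only one hypothesis predicts the observation well:
distinct = posterior(prior, {"AGW": 0.8, "natural_oscillation": 0.1})
```

With similar likelihoods the posterior for “AGW” moves only from 0.50 to about 0.53; with very different likelihoods it jumps to about 0.89. The data discriminate only when the hypotheses make distinct predictions, which is exactly the point about several hypotheses leading to similar warming.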
We know, however, more than just the fact that the temperature rise over the period 1970-2000 was quite strong. We know that the starting and ending points were higher in the recent rise than they were in 1910-40. (Not everywhere, but on average. Some people may say that the data is suspect, but I believe that the evidence is strong enough.) In an oscillatory climate, significant warming is much less likely from the level of 1970 than from the level of 1910. That is one additional point, and it is indeed what all those who say that the oscillations are important, but AGW is also important over this period, have concluded. There are of course also other, less direct arguments that climatologists have produced in support of their thinking.
Right now we have evidence both for the existence of significant AGW and evidence against the highest estimates that some people have presented for the rate of AGW. The last 10 years have added to this second evidence, while they have provided both positive and negative evidence for the existence of significant AGW in general. One example of the positive evidence is the fact that temperatures have not fallen significantly (if at all) over the last 10 years, i.e. the temperatures of 1998-2002 were not a short-lived peak like the maximum of the early 1940s. (My general view is that the general evidence for significant AGW has not changed much, but the evidence against the extreme estimates has grown stronger.)
From the creation of this planet, the trend is cooling of temperatures through the natural process of distances from the sun.
Many factors make it challenging to understand the physical interaction of many processes at work at the same time on this planet and with the sun.
Changing the rotational speed of this planet interacts with many areas, due to the planet’s makeup of gases, liquids and solids, which have to change with the physical energy they are bonded to.
Let me give my interpretation for what happened.
Details that could throw doubt on your interpretation must be given, if you know them. – Richard Feynman
You assumed the current global warming trend of about 0.15 deg C per decade to continue into the future as shown in the graph below (cyan line), and that is why the recent global mean temperature (GMT) observations (red curve) are at the lower end, where the recent observed data touches the bottom GMT trend boundary (yellow line).
Instead, what is happening is that the globe is continuing its long-term global warming trend of only 0.06 deg C per decade (green line). In this case, around about year 2000 the GMT reached its maximum (top GMT boundary, blue line, like in the 1880s & 1940s) and then reversed, and is now moving towards its minimum (bottom GMT boundary, pink line) in the next couple of decades.
Also, you wrote,
Decades which exhibit a cooling (or a negative temperature trend) are only expected occasionally in the future for the global mean (about 5% of decades)
How exactly did you arrive at this 5% value?
I’m not sure how he derived 5%, but it is approximately the same value as derived by Pielke Jr from the NOAA State of the Climate report 2008, where he determined it to be 5.8% of the modeled decades:
So the non-warming that we are observing is indeed a rare ONE-IN-TWENTY event.
What would happen if it continues for about 20 years more until 2030? Would AGW collapse?
The same paper, State of the Climate 2008, stated 15 years of no warming would be inconsistent with the models.
Spades & clubs VS. hearts & diamonds:
“JC comment: IMO this is much better than the “loaded dice” analogy that is often used.”
Any assumption of randomness is patently absurd.
“Can you hear the wind blow?
And did you know?
Your stairway lies
on the whisperin’ wind.” — Led Zeppelin
One way in which the paper is quite misleading is that figure 1 shows only European temperatures. A curious choice which is rarely seen. This greatly exaggerates the variability, making it easy to find cooling decades (fig 1d). If fig 1 were to be plotted for global temps, the usual measure used, the fluctuation would be much smaller (about a factor of 3, eyeballing fig 2a and b), with only a 5% chance of a cooling decade.
Girma I think the answer to your question regarding the 5% is that you just run lots of climate model simulations and look at how often you get a cooling decade in the results.
Of course the climate scientists could easily increase this percentage by tuning the relevant parameters in the models as required.
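The counting exercise described above can be sketched in a few lines: generate many synthetic decades as an assumed warming trend plus year-to-year noise, and count how often the fitted decadal trend comes out negative. The trend and noise levels below are illustrative assumptions, not the values used by Hawkins or the Met Office, and real model output has correlated (not independent) annual noise.

```python
import random

def fraction_cooling_decades(trend_per_year=0.02, noise_sd=0.1,
                             n_decades=50000, seed=42):
    """Estimate how often a decade shows a negative linear trend when
    the underlying warming is trend_per_year (deg C/yr) and annual
    temperatures carry independent Gaussian noise of noise_sd (deg C)."""
    rng = random.Random(seed)
    years = list(range(10))
    xbar = sum(years) / 10.0
    sxx = sum((x - xbar) ** 2 for x in years)
    cooling = 0
    for _ in range(n_decades):
        temps = [trend_per_year * x + rng.gauss(0.0, noise_sd) for x in years]
        ybar = sum(temps) / 10.0
        # Ordinary least-squares slope over the 10 years
        slope = sum((x - xbar) * (y - ybar) for x, y in zip(years, temps)) / sxx
        if slope < 0:
            cooling += 1
    return cooling / n_decades

frac = fraction_cooling_decades()
```

With these made-up numbers a few per cent of decades show cooling; the actual 5% figure comes from counting cooling decades across real GCM simulations, exactly as described above.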
In the shuffled deck of cards scenario do the missing cards represent CO2 warming, Trenberth’s missing heat, somebody’s model, an analysis of energy balance based on insolation, back radiation, and earth’s albedo, or the work of a card shark? Is the assumption in the analogy that only black cards can go missing? If so, why? Do the cards explain anything or are they there for entertainment value? Are the cards a proxy for the scientific method? Where are the multipliers implicit in the message the cards deliver to you?
A closer look at the HadCRUT3 record since 1850 shows three such multi-decadal oscillations of roughly 60 years each, with the first cycle truncated in front, so we are actually talking about 2.5 full cycles. The latter two warming half-cycles (early and late 20th century) are statistically indistinguishable while the earlier late 19th century warming cycle is a bit less pronounced.
The argument that the higher “starting point” of the most recent warming cycle is significant is weak, Pekka, when one considers that the entire 161-year record is like a sine curve on a tilted axis of underlying warming of around 0.04C per decade (so obviously the later cycle has a higher “starting point”). Since this 0.04C per decade warming trend started long before there were any significant human CO2 emissions, it cannot be attributed to AGW.
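The “sine curve on a tilted axis” description corresponds to a simple functional form: a linear trend of about 0.04°C per decade plus a roughly 60-year oscillation. The amplitude and phase below are rough illustrative guesses, not values fitted to HadCRUT3.

```python
import math

def tilted_sine_anomaly(year, base_year=1850, trend_per_year=0.004,
                        amplitude=0.15, period=60.0, phase_year=1865):
    """Toy model of the shape described above: a steady underlying
    warming plus a multi-decadal oscillation.  amplitude, period and
    phase_year are illustrative assumptions."""
    trend = trend_per_year * (year - base_year)
    osc = amplitude * math.sin(2 * math.pi * (year - phase_year) / period)
    return trend + osc
```

Because the sine term repeats every 60 years, successive oscillation peaks (near 1880, 1940 and 2000 with this phase) each sit 0.004 × 60 = 0.24°C higher than the last, which is the “higher starting point” effect in a nutshell.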
But you are right in stating that even this 161-year record is rather short and thus “leaves much space for alternative explanations”.
Of course (as both Wu and Hawkins allude with their papers) the period cited by IPCC to demonstrate an anthropogenic cause is even “much shorter” and, therefore, even “less significant” (my conclusion).
IPCC tells us AR4 WG1 Ch.3 (p.240):
So we see that in AR4 IPCC is essentially limiting its analysis to the period after 1976. This could be very misleading, of course, if this is only an upward “blip” in a longer-term multi-decadal oscillation caused by as-yet poorly understood natural factors, which is being ignored.
So the whole IPCC premise (AR4 WG1 FAQ 9.2, p.702) that:
is based on questionable reasoning, a logical fallacy known as “argument from ignorance”.
As you wrote, the period is much too short to arrive at any such conclusions.
I think that this is the basic take-home message from both Wu and Hawkins (even if it is stated in a very veiled fashion, so as not to offend the consensus view).
An ‘inconvenient problem’
Posted this on Ed Hawkins ‘Our evolving climate’ blog http://www.met.reading.ac.uk/~ed/blog/our-evolving-climate/
– Scientists avoid the CET before 1772 not because data is unavailable (it is here: http://www.metoffice.gov.uk/hadobs/hadcet/cetml1659on.dat) but because of the inconvenient problem of the 1690-1730s period. Dr. Lockwood should also note that the sharp rise started some 15 years before the Maunder minimum ended.
Reply from the author:
Ed: Further unsubstantiated claims of ‘inconvenient problems’ will not be allowed
Hence a request: is anyone aware of a science paper (published by a reputable institution) or a known climate scientist that has provided a credible explanation for the sudden Central England temperature rise of 1690-1730? I would be tempted to ignore any anthropogenic CO2 effect.
Relevant data can be found at: http://www.metoffice.gov.uk/hadobs/hadcet/cetml1659on.dat
Why is that important?
Well, it should be obvious
Not that I would expect any of the CO2 protagonists to respond.
For sceptics it is a more than useful graph; please take note of it and, if you wish, you are free to use it.
‘We show that cold winter excursions from the hemispheric trend occur more commonly in the UK during low solar activity, consistent with the solar influence on the occurrence of persistent blocking events in the eastern Atlantic.’
‘During the descent into the recent `exceptionally’ low solar minimum, observations have revealed a larger change in solar UV emissions than seen at the same phase of previous solar cycles. This is particularly true at wavelengths responsible for stratospheric ozone production and heating. This implies that `top-down’ solar modulation could be a larger factor in long-term tropospheric change than previously believed, many climate models allowing only for the `bottom-up’ effect of the less-variable visible and infrared solar emissions.’
You will note from the first study the uptick in open solar flux (and thus solar UV) in the time period you are interested in. Both studies are by leaders in the field with Lockwood as the lead author in both.
We have UV warming and cooling in the stratosphere translating into changes in sea level pressure at both poles.
‘The current weather pattern during early winter 2010, exhibits several large-scale blocking features in the northern latitude. Initial signs would indicate that the current winter resembles that of winter 1962/1963. Both winters present large scale wave breaking events that are a key atmospheric signal that herald cold and snowy weather for Europe and parts of the USA.’
‘When analysing the mid-latitude flow, forecasters often identify a point of transition between cold polar air and the contrasting warm air from the tropics. That means we can visualise the deep cold plunge of air that is heading our way this week, but we might ask what is its root cause? The North Atlantic Oscillation is the egg that should be cracked here; it’s an index telling us what is happening rather than why.’
The effects can be seen (literally, in satellite images) as storms spinning off the polar fronts. High values of the Northern and Southern Annular Modes are, by definition, weak low-pressure systems over the polar regions. You may know the NAM better as the NAO.
‘The annular modes are coupled with annular variability in the stratospheric flow during the winter season in the Northern Hemisphere and the spring season in the Southern Hemisphere. For decades, the prevailing wisdom was that stratospheric processes respond to but do not impact the tropospheric flow. But recent observations and model results done in the context of the annular modes suggest the coupling is two-way.
The recently discovered two-way nature of stratosphere/troposphere coupling is important for a number of reasons, not least the fact that the timescale of variability in the stratosphere is generally longer than the ~10 day timescale of variability in the extratropical troposphere. Recent numerical simulations suggest the coupling between the stratospheric and tropospheric circulations has practical applications for weather forecasting and also implications for tropospheric climate change (see text on Climate Change, below). The mechanisms whereby changes in the stratospheric flow impact the troposphere are currently under investigation.’
‘One reason we care about the annular modes is that they impact climate throughout much of their respective hemispheres. To name just a few of the climate impacts of the annular modes: the NAM is associated with large anomalies in surface temperatures and precipitation across North American and Eurasia, in the distribution of sea-ice throughout the Arctic, in sea-surface temperatures over the North Atlantic, and in the spatial distribution of ozone in the lower stratosphere. Similarly, the SAM is linked to variations in temperatures over Antarctica, sea-surface temperatures throughout the Southern Ocean, and the distribution of sea-ice around the perimeter of Antarctica.’ http://www.atmos.colostate.edu/ao/introduction.html
Both annular modes peaked in the mid 1990’s – lower values lead to storms spinning off the polar fronts penetrating further into higher latitudes. In the Southern Hemisphere – SAM leads to changes in the circumpolar current and to the volume of cold Southern Ocean water moving through Drake Passage between the tip of South America and the Antarctic Peninsula – or accumulating off the western coast of South America. Thus influencing cold-water upwelling in the Humboldt Current – which is the thermal genesis of ENSO, with global temperature and cloud radiative implications.
Who did I see recently saying that they accepted climate change science – but that the science was very different from what is generally in the public domain?
Thank you for the extensive reply. It’s a bit late (11.30pm), so I shall study it in more detail with a refreshed mind.
I have some reservations regarding Dr. Lockwood’s assertions, but I agree with your remarks regarding the Atlantic oscillations (even did some research into it myself http://www.vukcevic.talktalk.net/NAOn.htm)
A very quick search found this one (in Science) which has a go (and am sure there are more):
The issue is not ignored as you repeatedly suggest.
But you need to appreciate that the observations are very sparse and uncertain in that period, which is why most climate scientists (who have dedicated years to examining all the raw data) only use CET after 1772. We cannot go back and take extra measurements, much as we might like to. This may mean we will never know for sure what caused every apparent observed change in the historical record, but that doesn’t matter for attributing the recent changes, which are far better observed. The problem is that you assume that scientists don’t use the early period because of the warming event. And that is simply not true.
Thanks for the note. My opinions are not particularly important, it is the facts that matter.
I have updated the comparison between the two 60-year periods:
with the trend lines; what is striking is the similarity between the last 3 decades and the 1710-40 period. It also shows that the current temperatures are only a fraction of a degree higher, which is well within the margin of error of the 1730s temperature measurements/estimates. The similarity in the sudden turnaround, then and now, is notable. This is in line with the observation from the
I strongly doubt that it is anything to do with the solar activity as Dr. Lockwood suggests.
Shikotsu (Japan) erupted in 1739, hence the sharp drop in 1740, but within a year or two the temps recovered to the pre-eruption level of ~9.5C. If the analogy holds and continues, we could expect the next few years to stabilise around 9.5-10C, which would be up to ~1C of cooling relative to the last decade.
I do not see much of a role for the CO2 effect.
But it must be there.
‘Their investigations turned up a myriad of interrelated, nuanced factors that make it difficult to predict future changes in ocean circulation and climate, concluded the research team.’
On “evolving climate” (but not this article), there’s some interesting (peer-reviewed) data refuting serious sea level change.
Coastal inundation? Sea-level rise in the 20th C in Aus and NZ is estimated at 12-22 cm. 21st C rise projected at 15 cm. The Australian has a front-page report on an article in the Journal of Coastal Research. Links:
The study, by NSW principal coastal specialist Phil Watson, uses century-long tide gauge records from Fremantle, WA, Auckland Harbour, NZ, Sydney Harbour, NSW, and Newcastle Pilot Station, NSW. It finds that there was a “consistent trend of weak deceleration [in sea level rise] from 1940 to 2000.”
Abstract: As an island nation with some 85% of the population residing within 50 km of the coast, Australia faces significant threats into the future from sea level rise. Further, with over 710,000 addresses within 3 km of the coast and below 6-m elevation, the implication of a projected global rise in mean sea level of up to 100 cm over the 21st century will have profound economic, social, environmental, and planning consequences. In this context, it is becoming increasingly important to monitor trends emerging from local (regional) records to augment global average measurements and future projections. The Australasian region has four very long, continuous tide gauge records, at Fremantle (1897), Auckland (1903), Fort Denison (1914), and Newcastle (1925), which are invaluable for considering whether there is evidence that the rise in mean sea level is accelerating over the longer term at these locations in line with various global average sea level time-series reconstructions. These long records have been converted to relative 20-year moving average water level time series and fitted to second-order polynomial functions to consider trends of acceleration in mean sea level over time. The analysis reveals a consistent trend of weak deceleration at each of these gauge sites throughout Australasia over the period from 1940 to 2000. Short period trends of acceleration in mean sea level after 1990 are evident at each site, although these are not abnormal or higher than other short-term rates measured throughout the historical record.
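The fitting procedure described in the abstract is easy to sketch: fit a second-order polynomial to the smoothed mean-sea-level series; twice the quadratic coefficient is the acceleration, and a negative value is the “weak deceleration” the study reports. The synthetic series below is purely illustrative, not the actual Fremantle record.

```python
import numpy as np

def sea_level_acceleration(years, levels_mm):
    """Fit a second-order polynomial to a mean-sea-level series and
    return (rate in mm/yr at the record midpoint, acceleration in mm/yr^2)."""
    t = np.asarray(years, dtype=float)
    y = np.asarray(levels_mm, dtype=float)
    # Centre the time axis to reduce collinearity between terms
    c2, c1, c0 = np.polyfit(t - t.mean(), y, 2)
    return c1, 2.0 * c2

# Illustrative series: a 1.5 mm/yr rise that decelerates slightly
yrs = np.arange(1940, 2001)
lvl = 1.5 * (yrs - 1940) - 0.005 * (yrs - 1940) ** 2
rate, accel = sea_level_acceleration(yrs, lvl)
```

For this synthetic input the fit recovers a negative acceleration (here −0.01 mm/yr² by construction), i.e. a decelerating rise of the kind the abstract describes for 1940-2000.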
Climate change researcher Dr Howard Brady of Macquarie University said that sea level rises accepted by the CSIRO [and used as a basis for Australian policy] are “already dead in the water as having no sound basis in probability.” Brady says that “the present trend would only produce sea level rise of around 15 cm for the 21st century.”
The first quote is an example of something that concerns me.
Variability seems only to be rolled out when trying to explain away something that on the surface looks contrary to AGW, such as a period of cooling or 12 years of no warming. It seems possible that you could use similar work to help explain the possible role of variability in the end-of-century warming period (mid-70s to 2000), but the majority of climate scientists don’t seem to want to do this.
It seems like an obvious bias, for obvious reasons, and it never fails to annoy me. Is there any fundamental scientific reason why variability should only be used in this way?
Exactly. Believers seem to swallow the idea that variability is in effect turned off and on somehow.
Just as another example here is an early sentence from Ed’s essay
” These natural fluctuations in climate can temporarily mask or enhance any long-term trends to the extent that one year, or even decade, will not necessarily be warmer (or wetter/drier) than the last. ”
While he concedes at the start of the sentence that variability can enhance or mask a trend, even before the end of that sentence he has dropped the idea of enhancing, and the focus is primarily on possible cooling. I think it’s fair to say that the rest of the essay continues in this fashion.
I’m trying to figure out what longterm trend has been altered by 12 years of no warming.
Start your trend line in 1850, not 1978. That way you really will have a long-term trend.
The linear equation for the long-term underlying trend is:
y = 0.0041x – 0.496
This means a decadal rate of increase of 0.041°C and an overall warming of 0.66°C over the 161-year period.
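The arithmetic behind these figures can be checked directly, assuming the slope is in degrees C per year and x counts years since the start of the record:

```python
# Quick arithmetic check of the trend figures quoted above,
# assuming the slope 0.0041 is in deg C per year.
slope_per_year = 0.0041               # from y = 0.0041x - 0.496
decadal_rate = slope_per_year * 10    # deg C per decade
total_warming = slope_per_year * 161  # over the 161-year record
# decadal_rate = 0.041; total_warming is approximately 0.66
```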
And your conclusion is right: the short-term "blips" are really just multi-decadal oscillations (including the one you show starting in 1978), so they are less meaningful than the observed long-term trend since 1850.
Whether the most recent period of slight cooling (since 2001) is the beginning of another such multi-decadal trend line is still too early to tell.
But Max, that must mean that the short-term trend from 1970-2001 was also influenced by multidecadal variability, so less of it can be assigned to external forcings. Why not start with an acknowledgement of that? It starts to throw off the physical explanations for the shape of the 160-year global temperature record. Multidecadal variability (whatever that is) has to be dealt with for the whole of the record, not just switched on to explain away 'cooling phases'.
There seems to be a disjunction between this sort of statistically based method for explaining the short-term movements in the temperature record and an approach based on physical processes. Hawkins' work seems to be a bit of a red herring unless you give it some meaning as to what multidecadal variability is in terms of energy movement around the system.
It is true that a "defined theoretical physical mechanism" is lacking today for the observed 60-year cycle in global temperature since the modern record started.
IMO this does not detract from the fact that it exists. We just do not yet know exactly what is causing it and how.
There is also the underlying linear warming trend of 0.04C per decade, which must be considered. We also are not sure what caused this, at least prior to significant human CO2 emissions around 1950.
The “Loehle and Scafetta” thread has some interesting observations on this question based on a new study by L+S.
It doesn’t make any difference.
The long-term trends, starting from the beginning of the record, get stronger every 30 years until 1970. The 30-year trend from 1980 is virtually the same as the 40-year trend. The 20-year trend is virtually the same as the 30-year. The 15-year trend is down, but then we're no longer talking about long-term trends.
Throughout that series I can make downward trends: bunches of them. They ended up stopping nothing. YADAYADAYADA, this one’s different. Maybe, maybe not.
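How the fitted trend depends on the window length can be sketched with synthetic data. The series below, a steady 0.04°C/decade rise plus an assumed 60-year oscillation, is an illustration only, not the actual HadCRUT record, so the numbers will not match the comment above:

```python
import numpy as np

def trend_per_decade(years, temps, window_years):
    """Least-squares linear trend (deg C/decade) over the last
    `window_years` of the series."""
    mask = years >= years[-1] - window_years + 1
    slope_per_year = np.polyfit(years[mask], temps[mask], 1)[0]
    return 10.0 * slope_per_year

# Synthetic anomaly series (an assumption, not real data):
# an underlying 0.004 C/yr trend plus a 60-year oscillation.
years = np.arange(1850, 2011, dtype=float)
temps = 0.004 * (years - 1850) + 0.08 * np.sin(2 * np.pi * (years - 1850) / 60.0)

trends = {w: trend_per_decade(years, temps, w) for w in (40, 30, 20, 15)}
for w, tr in trends.items():
    print(f"{w}-yr trend: {tr:+.3f} C/decade")
```

With this construction the 40-year trend stays positive while the 15-year trend goes negative, even though the underlying forcing trend never changes, which is the whole point about short windows.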
The difference between the current "cooling blip" and earlier ones in the record since around 1975 is that none of the earlier ones lasted as long as the current one (starting from January 2001).
But, other than that, whether or not this one is really statistically different remains to be seen.
Certainly, if it lasts another 5 to 10 years, it will be dramatically different, and if it lasts another 20 years, it will be a full-fledged cooling half-cycle such as the two others we have already seen.
But that’s speculation about the future (a dicey practice when it comes to climate, although IPCC has been known to fall into this trap).
This is a nice intro [here]:
Numerical weather prediction has benefited from continually assessing the ability of the computer models to make forecasts. Further testing of our climate models in a similar way is vital to increase confidence in their use for longer term projections and potentially identify parts of the model which require improvements.
The first non-trivial problem is that model error growth has increased, e.g. Nicolis and Nicolis 2007.
This is evident in the evolution of the forecast skill of the ECMWF weather model, a widely used model producing forecasts in the range of a few days to a number of weeks. The baseline product is an n-day forecast (n = 10) of the global atmospheric state.
In any forecast there is an error dependent on the initial conditions (due to arbitrary assumptions/estimates of unknown quantities), and with the ECMWF model over the last 20 or so years there is a paradox: the error growth rate has increased.
In a seminal 1982 paper, in which ECMWF data was first used to measure predictive ability, Edward Lorenz found that the doubling time of the initial error was about two days; it has presently dropped to about 1.2 days.
This suggests that there is a limit to the predictive capabilities of long-range weather forecasting, even with models of increasing sophistication, owing to the interconnected complexity of atmospheric dynamics.
Sensitivity to the initial conditions, the principal signature of deterministic chaos, is thus not an artifact arising when low-order models are used but is, rather, deeply rooted in the physics of the atmosphere.
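The consequence of a shorter error-doubling time can be sketched as follows. The initial-error and saturation values below are illustrative assumptions, not ECMWF figures; only the two doubling times (2 days and 1.2 days) come from the text above.

```python
import math

def predictability_horizon(initial_error, saturation, doubling_days):
    """Days until an exponentially growing error reaches saturation,
    i.e. solves initial_error * 2**(t / doubling_days) = saturation."""
    return doubling_days * math.log2(saturation / initial_error)

# Illustrative numbers (assumptions): an initial analysis error of
# 0.5 units saturating at 8 units, a factor-16 = 2**4 growth.
horizon_then = predictability_horizon(0.5, 8.0, 2.0)  # ~Lorenz's 1982 doubling time
horizon_now = predictability_horizon(0.5, 8.0, 1.2)   # present-day doubling time
# horizon_then = 8.0 days, horizon_now = 4.8 days
```

A shorter doubling time shrinks the usable forecast window, which is the paradox mentioned above: the model improves at short ranges while its sensitivity to initial conditions grows.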
This shows that AGW's CO2 claims are very doubtful.
I read that and thought, "Say what?" My thoughts were neatly echoed by your comment:
This to me is a perfect example of the modeler’s hubris. He lists a number of reasons that the output of the model might be wrong … but not one of them is that the model itself might be wrong.
Which, in turn, is why I am always skeptical of modelers. At some point they all seem to fall prey to the Circean illusion, that their model represents reality.
Finally, whoever the fellow was that wrote the linked paper says:
A simpler way to say this is “The climate models have never been tested in anything like a serious manner.” However, he’s a modeler. He thinks the idea is to “increase confidence” in his Tinkertoy climate models, where in fact many of us have no confidence in them at all.
Because as he points out, the models haven’t been tested. And if more modelers could bring themselves to actually point that out, we wouldn’t be knee-deep in bogus predictions. Oh, sorry, forecasts. I mean scenarios. Or whatever it is that the untested models keep churning out.
When earth is warm, it always gets cooler. When earth is cool, it always gets warmer. There is powerful negative feedback to temperature. The Climate Models do lack this powerful negative feedback. This powerful feedback is ice and water. When it is warmer as we are now, it snows more as it does now. When it is cooler, as we were coming out of the little ice age, it snows less. Put this feedback in the models and take the false feedback out.
This powerful negative feedback has a set point. Again, this can only be ice and water. None of the other parameters that do drive temperature have a fixed set point.
To calculate the long-term trend, I think we should start and end at a peak or a valley. The best is to do it from the 1880s peak to the 2000s peak. If you take a longer period, it slightly decreases the overall warming rate.
Which gives a persistent global warming rate of 0.5 deg C per century.
Another way of doing it is simply to draw a linear trend line over the entire 161-year record.
This shows a linear equation of:
y = 0.0041x – 0.496
or a linear warming rate of 0.041C per decade, which comes out close enough to your 0.5C per century.