by Judith Curry
These results support the notion that the enhanced wintertime warming over high northern latitudes from 1965 to 2000 was mainly a reflection of unforced variability of the coupled climate system. Some of the simulations exhibit an enhancement of the warming along the Arctic coast, suggestive of exaggerated feedbacks. – Wallace et al.
Simulated versus observed patterns of warming over the extratropical Northern Hemisphere continents during the cold season
John M. Wallace, Qiang Fu, Brian V. Smoliak, Pu Lin, and Celeste M. Johanson
A suite of the historical simulations run with the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4) models forced by greenhouse gases, aerosols, stratospheric ozone depletion, and volcanic eruptions and a second suite of simulations forced by increasing CO2 concentrations alone are compared with observations for the reference interval 1965–2000. Surface air temperature trends are disaggregated by boreal cold (November-April) versus warm (May-October) seasons and by high latitude northern (N: 40°–90 °N) versus southern (S: 60 °S–40 °N) domains. A dynamical adjustment is applied to remove the component of the cold season surface air temperature trends (over land areas poleward of 40 °N) that are attributable to changing atmospheric circulation patterns. The model simulations do not simulate the full extent of the wintertime warming over the high-latitude Northern Hemisphere continents during the later 20th century, much of which was dynamically induced. Expressed as fractions of the concurrent trend in global-mean sea surface temperature, the relative magnitude of the dynamically induced wintertime warming over domain N in the observations, the simulations with multiple forcings, and the runs forced by the buildup of greenhouse gases only is 7∶2∶1, and roughly comparable to the relative magnitude of the concurrent sea-level pressure trends. These results support the notion that the enhanced wintertime warming over high northern latitudes from 1965 to 2000 was mainly a reflection of unforced variability of the coupled climate system. Some of the simulations exhibit an enhancement of the warming along the Arctic coast, suggestive of exaggerated feedbacks.
Recently published in PNAS; link to abstract [here].
In the latest issue of PNAS, Jerry North has a commentary on the Wallace et al. article listed on the PNAS early edition site [link]. Since all this is behind a paywall, here are some excerpts:
Gerald R. North
One of the more intriguing mysteries of global warming is the fact that the high northern-latitude surfaces are warming faster than those averaged over the globe, a phenomenon known as [Arctic] amplification (AA).
Among the candidates for explaining AA are (i) unforced natural variability in the coupled ocean/atmosphere system; (ii) altered ocean heat transport into the region; (iii) local effects, such as thinning of sea ice and overall reduction of the area covered by sea ice ventilating heat from below into the atmosphere; and (iv) remote sea-surface temperature patterns that might correlate or “teleconnect” through long atmospheric wave patterns with polar conditions. It is possible that these and other effects overlap and combine in different ways in different seasons and that no one of them alone is dominant. An important fact that must be explained is that most of the AA is in boreal (Northern Hemisphere, NH) winter and over land. There is very little amplification in the SH.
The present study is about the trends in the mass and temperature distributions in the atmosphere and how one can parse out the natural fluctuations due to dynamical instabilities in the atmospheric flows from the purely thermodynamic signals. These empirical dynamical patterns are to be distinguished from those mainly associated with heating due to radiative imbalances in the entire atmospheric column (this latter can be thought of as the pure “greenhouse gas warming” signal, although aerosols are also involved).
The starting point of the present study is the observation that variations in the pressure mode patterns excite the dynamical patterns in the thermal fields, including the surface temperature. Large pressure fluctuations mean more heat flow toward the poles due to storminess in the upper midlatitudes. Pure radiative imbalances do not induce much pressure variability. The pressure variability patterns are always stronger in the winter hemisphere because of strong latitudinal thermal gradients (equator to pole). Temperature responses to heat fluxes from the tropics induced by the pressure fluctuations are largest over large land masses because the effective heat capacity is much less than over oceanic areas. Because there are virtually no large land masses in the high temperate latitudes in the SH, little surface temperature response to dynamical activity (pressure variability) is generated there. By contrast the NH contains the large Eurasian and the North American continents, with their extreme seasonal cycles in surface temperature (so-called extratropical continental climates).
After finding the dynamical component by least-squares regression with the pressure modes, the next step is to “remove” this component’s contribution to the trends, yielding the purely thermodynamic or radiative-imbalance signal, due mainly to human greenhouse gas and aerosol emissions.
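The adjustment North describes (regress the temperature field on the pressure-mode indices, then subtract the fitted part before computing trends) can be sketched on synthetic data. This is only an illustration of the least-squares step with made-up numbers, not the authors' actual procedure; the mode series, coefficients, and trend sizes below are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years = 36  # e.g. the 1965-2000 reference interval

# Two hypothetical "pressure mode" indices (stand-ins for SLP patterns);
# mode 1 is given a slow upward drift, mimicking a circulation trend.
modes = rng.standard_normal((n_years, 2))
modes[:, 0] += np.linspace(0.0, 1.5, n_years)

# Surface temperature = slow thermodynamic trend + dynamically induced
# part (proportional to the modes) + weather noise. All made-up numbers.
years = np.arange(n_years)
temp = (0.02 * years
        + 0.5 * modes[:, 0] + 0.3 * modes[:, 1]
        + 0.1 * rng.standard_normal(n_years))

# Least-squares regression of temperature on the pressure modes
X = np.column_stack([np.ones(n_years), modes])
coef, *_ = np.linalg.lstsq(X, temp, rcond=None)

dynamical = X[:, 1:] @ coef[1:]   # component attributable to circulation
adjusted = temp - dynamical       # dynamically adjusted residual

raw_trend = np.polyfit(years, temp, 1)[0]       # deg per year, raw
adj_trend = np.polyfit(years, adjusted, 1)[0]   # deg per year, adjusted
print(f"raw trend {raw_trend:.3f}/yr, adjusted trend {adj_trend:.3f}/yr")
```

Because the drifting mode carries part of the warming, the adjusted trend comes out smaller than the raw one, which is the qualitative result the paper reports for the NH winter continents.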
The authors choose the reference interval 1965–2000, a period when the global average surface air temperature was rising at a nearly uniform rate. They wanted to conduct their separation under these most optimal conditions and apply it to multidecadal trends. The dynamical contribution of the trend in temperature is virtually all in the NH winter during the period of study, and it is dominant over the large landmasses. The purely thermodynamic component is pretty boring. It does not have much dependence on season or hemisphere or even land vs. sea surface—it just keeps on warming linearly essentially everywhere in all seasons. These qualitative findings hold for both the observed surface temperature fields and those derived from published climate model simulations that cover the same interval. The study includes two types of model runs: (i) a multimodel average that includes the conventional drivers of climate: changing aerosols, increasing greenhouse gas concentrations, volcanic dust veils, and solar variability; and (ii) a second series of runs in which the aerosol, volcanic, and solar forcings are omitted but a pure CO2 forcing is included, with concentration increasing at 1.0%/y. In each case, data or model simulation, the dynamical effects were removed in the same way.
The dynamical trends were similar in the observations and the model simulations, except that the dynamical effect was much larger in the observations than in both model experiments. In other words, the dynamical effects are clearly present in the state-of-the-art global climate models, but they are much too small to match the data.
There are multiple lessons to be learned here. First, there are unresolved issues in computing the general circulation of the atmosphere. This could be plain old errors in the models, or it could be that some key effect or component is being omitted.
The article shows that a large portion of the climatic change over the period being examined is attributable to the dynamical component rather than direct radiative forcing. The former is the part that is associated with a change in the atmospheric circulation system, such as widening the giant tropical cells known as the Hadley Circulation, and causing the midlatitude storm belts in both hemispheres to gradually extend their centers and boundaries polewards. Satellite observations and other data indicate that these poleward shifts of the storm belts are larger than expected according to current climate models that include the conventional forcings. It seems that these shifts have had a larger effect during the study period than has the direct radiative imbalance due to increasing concentrations of greenhouse gases, although the latter is substantial.
What causes the two primary decadal empirical pressure patterns to ramp up their amplitudes and line up their phases in such a way as to encourage lots of warm air polewards onto the big continents during the study period and not at other times? Wallace et al. do not address this issue but list several contenders.
One possibility is that the anthropogenic factors are actually inducing this change in circulation, perhaps even through stratospheric connections as mentioned earlier. They seem to lean toward unforced natural variability at the decadal scales. For example, if there is a long-term unforced oscillation pattern in the coupled system, it might be setting and evolving sea surface temperatures world- wide, and these are in turn causing the conditions favorable for the high-latitude pressure variations.
The central contribution of the present article is that it provides a framework for seeing how the different factors combine in a few simple indices explaining most of the AA. There seems to be a large multidecadal variability in the complex ocean–atmosphere system that can superimpose itself atop the global warming signal. It seems to be identifiable with a few large-scale patterns in the temperature fields; when phases of the pressure modes match up one with another they can enhance the rate of warming, especially on the large, wintertime continents. It is not yet known whether cooling periods can happen as well in this scheme. Nor do we know exactly the origin of the multidecadal long-term variability that has been identified. It could be related to anthropogenic causes, or it could just be part of the natural system’s internal variability.
JC comment: I like the Wallace et al. paper, and Jerry North did a nice job with his commentary. However, none of this is news to me, since I have published two papers previously that came to the same conclusions:
- Recent Arctic sea ice variability: connections to the Arctic Oscillation and the ENSO
- Causes of the northern high-latitude land surface winter climate change
While there is a trend in the Arctic, the amplification is associated with natural internal variability. As per Google Scholar, the sea ice paper has 7 citations (minuscule) and the land paper has 41 citations (moderate). Neither paper was cited by Wallace et al.; presumably they are unfamiliar with these papers.
This brings to my mind the issue of ‘unknown knowns’, to expand upon Donald Rumsfeld’s vernacular. An ‘unknown known’ is something that somebody knows, but isn’t generally known (in this case it was known by myself and my coauthors and a handful of people that read the papers, but not sufficiently known to make it into, say, an IPCC assessment or to be referred to in more than a few scientific papers).
So how to make the knowns actually known? Sign up to be an IPCC author, so you can cite your own papers. Or issue a press release. Hmmm . . . There oughta be a better way.
I’m not being critical of colleagues like Wallace et al.; in fact I wouldn’t have known about the Wallace et al. and North articles if Ronald Hirsch hadn’t sent me an email. How to really mine the published literature for knowledge remains a challenge, although the internet is an enormous boon. Known unknowns and unknown unknowns are extremely challenging to deal with; it seems like dealing with the unknown knowns should be relatively tractable.
I am very appreciative of those of you who send me links to articles, this helps me deal with the unknown known issue in terms of my own understanding and in sharing these through Climate Etc.
However, none of this is news to me, since I have published two papers previously that came to the same conclusions:
- Recent Arctic sea ice variability: connections to the Arctic Oscillation and the ENSO
Connections to ENSO are obvious. One then has to figure out whether recent ENSO severity is natural and/or manmade.
You don’t mean to be critical of your colleagues for missing your paper…
Well let me do it for you then.
This is a pretty damn simple google scholar search for arctic sea ice enso and your paper is the second hit from the top. One might ask how a remotely competent search for prior art (excuse the engineering terminology) could have missed it fercrisakes.
Good find David. Burt Rutan’s no climate scientist but I do think your rather effortless find supports his criticisms of the current state of scientific back and forth vs. an engineering approach to problem solving. Someplace he says words to the effect of, “If you brought this stuff (i.e. AGW data) to a design review, you shouldn’t expect to have a fun meeting.”
I totally agree. It would be worse than that. You’d be fired, and your professional certification and standing would be at risk if you repeated it.
The quality of climate science documentation is nowhere near suitable for making huge investment decisions.
It’s occurred to me that the majority of the Climate Etc. contributors are really interested in the down-in-the-weeds science. But the work and interest seems to be undirected. That is, it is not directed at producing the information needed for informing policy decisions.
For policy decisions we need cost-benefit analyses of consequences and the various policy options.
The science that is relevant for policy decisions should be presented in a way that is suitable for due diligence.
This means all the information needed to support the inputs to economic cost-benefit models like Nordhaus’s DICE and RICE models must be fully documented to the standard appropriate to support proper due diligence (for decisions to commit to expenditures of many trillions of dollars).
Nordhaus (2008) (Table 7-2 http://nordhaus.econ.yale.edu/Balance_2nd_proofs.pdf ) shows that the two inputs that cause the greatest uncertainty in the cost-benefit analyses are the damage function (DamCoef) and the climate sensitivity (T2xCO2). The science needs to focus on documenting (to the standard required for due diligence) all the inputs, calculations, models, etc. that provide the inputs to models like the Nordhaus DICE and RICE models.
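As a rough illustration of why those two inputs dominate the uncertainty, here is a toy, DICE-shaped damage calculation. This is not the actual DICE or RICE model; the function form is only quadratic-in-warming in shape, and every coefficient below is invented for illustration.

```python
# Toy, DICE-shaped damage calculation -- NOT the actual DICE/RICE models.
# All coefficients below are invented for illustration only.
def damage_fraction(dam_coef, t2xco2, co2_doublings=1.5):
    """Damages as a fraction of output, quadratic in projected warming."""
    warming = t2xco2 * co2_doublings   # deg C warming at the horizon
    return dam_coef * warming ** 2

# Vary the damage coefficient (DamCoef) and climate sensitivity (T2xCO2)
# over plausible-looking ranges and watch the damage estimate swing.
base = damage_fraction(dam_coef=0.0028, t2xco2=3.0)
low = damage_fraction(dam_coef=0.0014, t2xco2=1.5)
high = damage_fraction(dam_coef=0.0056, t2xco2=4.5)
print(f"damage share of output: low {low:.3f}, base {base:.3f}, high {high:.3f}")
```

Even in this caricature, halving/doubling the two inputs moves the damage estimate by more than an order of magnitude, which is the sense in which those inputs need documentation to a due-diligence standard.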
Then our politicians could make informed decisions.
In order to perform due diligence, all computer code, data models, equations, data – utilized and unutilized, meta-data, and any other relevant information must be disclosed by the authors.
It’s natural. A major paper on this topic is coming out soon (I’ve seen it; it is winding its way through the review process).
“It’s natural” seems like “a very bold statement”, to paraphrase Vince in Pulp Fiction. I sincerely doubt it’s that cut & dried unless the 1998 El Nino was caused by a volcano instead of the usual cause of slower trades.
And there I thought there might be just a bit of uncertainty. Guess not, though. Well, you are the expert, but it is fascinating where uncertainty just up and walks out of the room.
Uncertainty is only in one direction……here at least.
It’s a bit of a shame that “skepticism” has been co-opted from being a fairly decent approach to being a justification for holding onto a particular view regardless of evidence.
Don’t like something? Simply apply “skepticism” to it until it goes away. It’s the modern day wonder cream!
It’s amusing to see the AWG zealots squirm when a climate scientist commits sacrilege. The 411 on that is this: That is the way science is supposed to work.
The uncertainty monster is on hold on line #2.
He’s angry and wants to speak to someone in customer service, but Judith won’t pick up his call.
Maybe you’d like to give it a shot?
I agree it is natural, well most anyway, but what is a realistic “average” temperature to use as a base line to separate natural from anthro?
Equilibrium base line temperature as a function of time in year:
No, but kinda. The average TSI for ocean is greater than the average TSI for land, so there is a time-of-year dependence. The average temperature of the oceans is greater than the average temperature of the land surface as well. So a global average is meaningless, to a degree.
Sounds kinda like “settled science” to me…
However, we would be well-advised to take Dr Curry’s word on this.
“It’s natural” is a rigorous scientific conclusion that comes from a paper that is “major”, so it must be completely correct.
Obviously, Dr Curry doesn’t need to wait for the completion of the review process – Just like Christy didn’t need to see a final version of ‘Watts et al’ before his testimony to congress that implied that Watts has falsified the surface-station record.
Funny how uncertainty is always something that the Other Guys do wrong, isn’t it?
“It’s natural” with respect to “Sensitivity” as it is defined. Because of the goofy standard used in climate science, other anthropogenic impacts don’t get proper attribution. Land use impacts on 6% or more of the surface of the Earth don’t impact “sensitivity” over the same time period as CO2 “forcing”, so most of the land use change would be “natural”.
It is likely that NH land use change reduced the odds of a new ice age. That impact would likely change the “normal” Arctic sea ice condition, but since that is not a direct radiant impact measurable during a 1900 to 2012 instrumental period it is “natural”. The terminology of climate science sucks :)
Judith: Who’s the lead author on the upcoming paper about ENSO?
Very little is known about the magnitude and effect of underwater volcanic activity in that area.
Warm Oceans and an Open Arctic cause more snow, which cools the earth. Cold Oceans and a Closed Arctic result in less snow, and the earth warms.
Ice is a good insulator. Ocean can’t dump heat through it anywhere near as easily as through open water. It’s pretty much exactly like a thermostat in an automotive cooling system i.e. an iris that opens wider in response to warmer water and closes in response to cooler water placed between the heat source (tropics/engine) and heat sink (space/radiator).
The polar aficionados at Neven’s Arctic Sea Ice weblog are directing particular attention to (1) incredibly warm water temperatures, and (2) incredibly low Northern snow cover for late spring of 2012.
Both factors act to make the ice-melt start earlier, run faster, and endure longer. Obviously! :) :) :)
Thus it may well be that the dramatic ice-loss this year results from a confluence of random factors, that moreover are acting to reinforce one-another. Makes sense, eh? :) :) :)
As James Hansen and his colleagues have been pointing out, on a warming planet, once-rare reinforcing confluences become dramatically more common.
The reason is simple: if a planet’s energy budget is in sustained imbalance (from CO2), then the extra energy must surface somewhere, and neither past history nor purely statistical models can provide much guidance as to where that energy-surfacing will happen.
There’s a lot of excess heat showing itself in the Arctic, that’s for sure. :eek: :!: :eek:
Can you show the thaw and re-freeze dates moving?
If not, this is just more sad comedy from you.
Those are anomalies. Since you have open water where once there was ice, you can expect anomalies of around 2-3 C. There are charts of actual temperature which give you the absolute figures. We’ve entered bottom-melt season, and winds/currents will drive compaction and exit via the Fram Strait. With the right weather pattern, this bad year could be much worse. Keep an eye on the drift predictions.
Steven Mosher | August 23, 2012 at 10:10 pm
Connecting temperature to the amount of ice on the polar caps is absurd. The amount of ice depends on the amount of raw material available for renewing the ice. Saying that colder = more ice on the Arctic is the same as saying that if it is warmer, the Sahara will have more rain. That raw material is demonized by the Warmists and some Fakes. Water freezes at zero C. There is enough cold to make another 10 km on top of the existing ice in one season.
1] More freshwater drains into the Arctic from Russian rivers -> it spreads on top of the heavier salty water = ice protected.
2] Less freshwater -> more warm salty water goes into the Arctic via Bering; that eats the ice on the Arctic ocean from below.
3] Higher evaporation in the Mediterranean, from increasingly dry Sahara winds -> the deficit comes from the Gulf of Mexico, water siphoned from the Arctic -> more salty warm water from the north Pacific into the Arctic. Mosher, confusing the ignorant about Arctic ice and seawater temperature will some day soon be treated as a crime: http://globalwarmingdenier.wordpress.com/midi-ice-age-can-be-avoided/
A fan of *MORE* discourse | August 23, 2012 at 8:19 pm
“There’s a lot of excess heat showing itself in the Arctic, that’s for sure.”
Not as much as you might think. Estimate the mass of the missing ice and the heat of fusion for H2O, and then you can get joules (personally I prefer BTUs, but to each his own).
Then calculate how great a volume of water it takes at 4 F warmer to contain the BTUs required to turn the missing ice mass at 32 F into liquid water at 32 F.
Then, after you’ve done that, estimate the volume of water that moves through the oceanic conveyor belt from the south Pacific to the Arctic ocean.
You will find a reasonable correlation between the energy in the 1998 El Nino, which was about 4 F warmer than normal, and the heat of fusion required to melt enough one-year (thin) Arctic sea ice to reduce the summer ice extent by 10%. Then, as a sanity check, take the velocity of the southern Pacific to Arctic conveyor belt and see if, after the transport time had expired following the 1998 El Nino Grande, Arctic sea ice started reducing at that time and stopped reducing in a step-change fashion about 10% lower than before the El Nino.
I did all the above. It takes an hour if you’re fairly quick at looking up the pertinent facts. Curry mentioned doing a paper on the connection between ENSO and Arctic sea ice. There are about a hundred such papers. It’s a fairly obvious connection.
However, I did not attribute the melt to natural causes because I got stuck at the point of trying to explain abnormally slow trade winds that created the 1998 El Nino as natural or anthropogenic.
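The back-of-envelope heat-of-fusion estimate described above can be written out in a few lines (in SI units rather than BTUs). The missing-ice mass below is a hypothetical placeholder and the constants are rounded; this shows only the shape of the arithmetic, not a validated figure.

```python
# Rounded physical constants
L_FUSION = 3.34e5     # J/kg, latent heat of fusion of ice
C_SEAWATER = 4.0e3    # J/(kg K), specific heat of seawater (approx.)

# Hypothetical missing-ice figure: 1e15 kg is about 1000 km^3 of ice,
# the order of a large summer anomaly. A placeholder, not a measurement.
missing_ice_kg = 1.0e15

energy_j = missing_ice_kg * L_FUSION        # heat needed to melt it all

delta_t_k = 4.0 * 5.0 / 9.0                 # "4 F warmer" in kelvin
water_kg = energy_j / (C_SEAWATER * delta_t_k)
water_km3 = water_kg / 1.0e12               # ~1e12 kg of seawater per km^3

print(f"{energy_j:.2e} J to melt the ice; "
      f"needs ~{water_km3:.0f} km^3 of water 4 F above normal")
```

Whether the conveyor actually delivers that volume of anomalously warm water on the right timetable is the part of the argument the sketch cannot settle.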
Dave Springer, your reply is exceptionally well-reasoned and clearly expressed … very much in the best traditions of the discourse on Neven’s Arctic Sea Ice weblog. Thank you, Dave Springer! :) :) :)
In seeking rational discourse relating to climate change:
• Neven’s Arctic Sea Ice weblog is exemplary of discourse that seeks to understand climate-change on short time-scales … a few weeks or months at a time, via in-depth study of data. And Neven’s crew is very good at this … they foresaw this year’s record ice-melt in June, for example.
• Hansen’s Earth’s Energy Imbalance and Implications is exemplary of discourse that seeks to understand climate-change on generational time-scales … say 20-30 years ahead, via in-depth energy balance analysis. And Hansen’s crew is very good at this … back in 1981 they predicted “the opening of the fabled Northwest Passage” for example.
• Here on Climate Etc folks would like to understand the toughest time-scale of all … decadal fluctuations.
When it comes to decadal time-scales, Neven’s methods aren’t much help … the decadal dynamics is too intricately nonlinear for “by hand” forecasts. And Hansen’s methods aren’t much help either … Hansen predicts that excess energy is circulating in the system, but does not predict precisely where it will surface. And purely statistical/formal methods *also* aren’t much help in understanding decadal fluctuations … there is rich structure in the decadal climate-change dynamics, that purely statistical models don’t “understand” (and plausibly never can).
Right now our best tool for understanding decadal fluctuations are the intricate climate simulation codes, and the satellite observation networks, that are slowly building a bridge between Neven-style understanding and Hansen-style understanding. Here progress in modeling is so erratic year-by-year, that we tend to overlook how impressive the progress is generation-by-generation. Fortunately! :)
Conclusion Neven’s short-term understanding of climate-change is pretty good … Hansen’s long-term understanding of climate-change is pretty good … decadal-scale understanding of climate-change needs substantial improving … and this improving is coming, by the orthodox path of more-and-better input data driving better-and-faster simulation codes.
Thank you for a substantial post, David Springer! :) :) :)
Yet another paper from the AGW cabal – relying on rigged models to explain observations.
Well, the results of this paper will obviously be rejected by “skeptics.”
Unfortunately Joshua you are right. Too many model bashers here comment as if models have no value at all. So I too will wait to see how many are ready to discount these results also because they were generated from a model.
“The dynamical trends were similar in the observations and the model simulations, except that the dynamical effect was much larger in the observations than in both model experiments. In other words, the dynamical effects are clearly present in the state-of-the-art global climate models, but they are much too small to match the data.”
Oh, here we are… though the dynamical effects are clearly present, they are too small to match the data, so… the models don’t really work after all.
That’s the out we’re looking for Joshua.
Perhaps that’s being a bit too cynical.
John Carpenter | August 23, 2012 at 9:41 pm: “Unfortunately Joshua you are right. Too many model bashers here comment as if models have no value at all.”
Actually, this use of a model is what models are for: helping to find unknowns rather than solidify conclusions.
All I’m asking for is a little consistency, John.
Is that really too much to ask for?
It’s not just any model either, it is the IPCC AR4 models. The very same ones the skeptics don’t like for their 21st century warming.
It’s called “arguing in the conditional.” If one can show that even using the favored model of a particular theory-advocate that her interpretation of the data is overly confident, that in no way constitutes a validation of the model in question. It simply shows that the evidence for the theory is even weaker than previously thought.
Example: Astrologer says that my last two weeks of behavior is explained by my being born in the House of Blues with Zevon rising. I use her own data and model to show that they admit the interpretation that my last two weeks of behavior were really caused by the retrograde influence of Lady Gaga. The exercise throws doubt on her suggestion that I sell all my worldly possessions and move to Belarus to meet the woman of my dreams, but it doesn’t represent a vote of confidence in either her data or methods.
Being born in the House of Blues with Zevon rising is some serious mojo :)
Since the models all tend to overestimate starting around 2005, you can use a 2002-2010 baseline and work backwards to get an idea that polar de-amplification would be required before polar amplification. Still a lot more of the puzzle left, though.
Nicely put Steve. I think this kind of exercise is unsupportable but it is central to the AGW scare so I am happy to see natural variability raising its ugly head in there. I am therefore conditionally happy. This is where the research needs to go.
If a model by a wide margin results in A, and in one test you use that model with a result of B, you can’t invalidate those proofs of A simply by pointing to outcome B. You would need to explicate a mechanistic difference that shows why result B was valid in comparison to many results of A.
If “skeptic” after “skeptic” says that the models are rigged, either through intentional bias or through invalid “adjustment,” then by their own arguments the models are invalid.
If you argue that a model, because of inherent or imposed structural flaws, produces invalid results, then you can’t point to the result of such a model and say that the results are valid, let alone that they prove that the model is invalid.
The only people who could be logically consistent in saying that these results are valid are those who say that the models are valid, and who then go on to explain, in an explicit way, why these results were valid even though they contrast with previous results from the use of these models.
There is no “wide margin.” The new paper says that the previous uses of the model were incorrect.
Under the maintained hypothesis that the model accurately captures climate dynamics, the Arctic warming cannot be attributed to CO2/aerosol forcing. OK. But if there were some other line of evidence strongly supporting such forcing-induced Arctic warming, I wouldn’t rely on this paper as proof that that evidence was wrong.
If your wristwatch tells me that I’m late to an appointment but I think your watch is inaccurate, a further argument that you are reading the watch incorrectly and hence I am not late even if it were accurate makes me less worried about being late. But it certainly doesn’t require me to now believe in the accuracy of your watch. And if I were to hear church bells informing me that it is indeed later than I thought, I would take little reassurance from the “corrected” reading of your watch.
That’s fine. Then you think that one approach uses a valid model and a valid methodology, as opposed to another approach that uses a valid model and an invalid methodology. Makes total sense.
Either way, then you are saying that the model is valid. You are explaining in detail why you interpret the previous methodology in how the models are being used to be invalid. Reasonable people could disagree and people can support their arguments about what is the proper methodology.
If all along you said that the models themselves are invalid, then if using them with a different methodology produces results you think are accurate, you’d have to conclude that it is a random coincidence.
An invalid model cannot produce a valid result.
A broken watch is right twice a day. You can’t point to the watch at one of those moments that it happens to be correct and say “See, I’ve proved that the watch is working.”
Joshua: You are knocking on an open door. As I said, I don’t believe the models are reliable enough to tell us how much we need to be worried about CO2. The new paper says EVEN IF the models were accurate, they have been falsely interpreted to make forcings more important than they really are. So 1) the model is probably not accurate and doesn’t tell me very much, even though Urgent Mitigationists say otherwise AND 2) the UMs have been misreading their own inaccurate simulator and overweighting the role of forcings in Arctic warming EVEN IF the model were accurate.
My confidence in the models has not increased. My confidence in the wrongness of UMism has increased by a small amount. New credible evidence of the need for UM from a different, more reliable source would lead me to update my beliefs.
You are too worried about gotchas and debates and it is leading you to obfuscate the dirt-simple points above.
If it was a natural fluctuation, the models would not all be predicting it to work in the same direction. Some would not be doing it. Clearly this is a deterministic, not stochastic, effect that can’t be separated from the GHG forcing change. Perhaps the models are underpredicting the strength of the amplification, mostly because the sea-ice models are not perfect, but they have all predicted a steady sea-ice loss in the 21st century with no reversal, somewhat like we are seeing now. If it was “natural”, the models would have to have an uncanny initialization that just happens to capture it.
If the initial value for the NH is lower than the models suspect, then that would explain the situation. That is why I have been looking at the southern hemisphere and tropical reconstructions compared with sea surface temperature. You can’t have a global average unless you know what the equilibrium condition is between the hemispheres.
That chart I show about SST approaching a stable temperature is likely what equilibrium should be, so to determine AGW, you have to start at that base line.
You can’t tell what’s abnormal until you know what normal is supposed to be.
Without forcing the models can keep a stable Arctic for centuries. It doesn’t just spontaneously melt unless they add forcing.
The problem is that volcanic forcing has a greater impact on the Northern Hemisphere. That is why it has much larger temperature swings: less thermal mass because of less ocean area.
That is RSS SH and NH plus Hadsst2 with the Best Volcanic forcing. There have been more NH volcanoes and even near equatorial volcanoes have a greater NH impact. It looks like the ice age recovery is mainly a NH impact, nearly twice the SH impact because of thermal mass.
So models appear to have a little glitch.
That logic doesn’t work. Do you think the IPCC projections reduced future volcano impacts to get warming? If they had the volcano impacts wrong all this time, why would they suddenly get it right for the future projections? That is what you are saying they did. Apart from which, the paper says nothing about volcanoes, so we are off on one of your tangents again.
Well, OK, it says something about volcanoes, but not that this is the changing effect leading to the Arctic amplification, to be more specific.
JimD, that has been my tangent: “sensitivity” is non-linear.
Here JimD, review this past tangent of mine, http://redneckphysics.blogspot.com/2012/07/strangely-attractive.html
This climate system is bi-stable. So instead of a neat plain-vanilla Gaussian distribution you would have a bi-modal distribution. If you think you have a Gaussian or “normal” distribution but actually have a bi-modal one, then you would think you have a log-normal distribution skewed toward some unpredictable limit, the wicked-looking fat tail. Depending on which mode you assume is “normal”, you would believe you have more or less sensitivity to a particular forcing. Fairly simple.
The fun part is you have to “Cherry Pick” base line values to find out which mode you are closest to :)
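A minimal sketch of the point above (my own illustration, not from the comment, with made-up mode locations and weights): if a bi-stable system spends time around two distinct states, the sample mean of a unimodal fit lands between the modes, where almost no data actually lie, and the apparent "anomaly" depends entirely on which mode you pick as the baseline.

```python
# Illustrative only: a bi-modal "climate state" variable misread as unimodal.
# Mode locations (0.0 and 1.5) and weights (70/30) are arbitrary assumptions.
import random
import statistics

random.seed(42)

# 70% of samples near mode A (0.0), 30% near mode B (+1.5), both sigma=0.2.
sample = [random.gauss(0.0, 0.2) for _ in range(700)] + \
         [random.gauss(1.5, 0.2) for _ in range(300)]

mean = statistics.mean(sample)

# The "anomaly" relative to each candidate baseline has opposite sign:
bias_from_a = mean - 0.0   # positive: looks like warming from mode A
bias_from_b = mean - 1.5   # negative: looks like cooling from mode B

# A unimodal fit centers on the mean, where few observations actually sit.
near_mean = sum(1 for x in sample if abs(x - mean) < 0.2) / len(sample)
print(round(mean, 2), round(near_mean, 2))
```

The fraction of observations within one nominal standard deviation of the overall mean is small, which is the tell that the single-mode assumption is wrong in this toy setup.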
It is at least tri-modal, because you have to remember the iceless hothouse state we had when CO2 levels were double what we have now just 100 million years ago, and possibly again in the future.
Nope, bi-modal. http://en.wikipedia.org/wiki/File:Five_Myr_Climate_Change.png
The magnitude of the polar extremes varies more with continental drift, but the equatorial temperature range has likely been consistently bi-modal. With the exception of major surface impacts, the range appears to be remarkably stable.
I didn’t read the whole paper since it agrees well with my model :)
capt. d., what was the CO2 level 5 million years ago? Less than now, in fact. You are being shortsighted in your paleoclimate view.
JimD, not short sighted, just unconvinced that CO2 is a major control variable. “Our results challenge the hypothesis of a gradual decrease in atmospheric carbon dioxide concentrations as a dominant trigger of the longer glacial cycles since 850,000 years ago.”
What looks like a bigger player to me is the ocean dynamics. ” Instead, we infer that the temperature contrast across the equatorial Pacific Ocean increased, which might have had a significant influence on the mid-Pleistocene climate transition.”
Which appears to agree with the paper being discussed in this thread and this chart. http://i122.photobucket.com/albums/o252/captdallas2/climate%20stuff/joethevolcano.png
And tends to explain this chart,
In order for CO2 not to be a major driver, H2O would need to be a better regulator, which appears to be the case.
capt. d., I can play the skeptic game and say that until they can explain why the Cretaceous period was so warm, I am not going to believe anything else they say. They are not even trying, because they know they can’t.
To state one obvious problem with dissemination of science information, both of those links are behind a paywall.
Scientific research is not free. You are not entitled to rent-seek what their study, labor, and writing have created. It would be nice if all scientific papers were free. It’d be nice if cars and houses, music and movies were free too.
I thought I’d paid for the research through my tax dollars.
That would be your mistake, then.
I suggest you go to the nearest Army base and demand that they give you a tank. Remember, you paid for it with your tax dollars!
You are comparing apples and oranges, Robert. It is entirely reasonable to expect to get free access to scientific research paid for by tax payer money. Not so for military products and tools.
Bad comparison. Our tax dollars don’t entitle we citizens to the military hardware for our personal use, but we do expect protection from outside invasion or other threats to our nation. Similarly, I would not expect to have personal use of research tools and equipment (bought and maintained by tax dollars) at a university, but I would expect access, via publications, to the scientific gains made through the use of such equipment.
Obviously it is a bit more complicated than that since the journals require income to exist and selling the rights to view the work is one way to get it.
A better comparison would have been scientific advancements made in the private sector where private money is used to make a product that is intended to be sold on the open market. In that case, science is not free and no one is entitled to know about it except for those who develop and exploit it for their own gain.
At some point, tax dollar generated scientific knowledge has to be turned over to the people as the return on investment. It should not become IP of the researcher or university.
Robert | August 24, 2012 at 5:53 am | Reply
“I suggest you go to the nearest Army base and demand that they give you a tank. Remember, you paid for it with your tax dollars!”
If the best you can do is equate a tank with an electronic copy of an unclassified climate research paper then your point is rather weak. Really Robert, did that comparison actually seem reasonable to you when you wrote it?
“Really Robert, did that comparison actually seem reasonable to you when you wrote it?”
Well… it IS Robert. Robert doesn’t like answering hard questions. ;)
Could I see all the classified military reports then that my tax money has paid for?
“Could I see all the classified military reports then that my tax money has paid for?”
No, that’s why they are ‘classified’ (unless you have clearance). There is plenty of sensitive material within the military we don’t want readily available, to protect the country, which is job #1 for the military and the government.
What kind of publicly funded university research would you expect to be classified? A comparison to ‘classified’ information is not equivalent to what was in question.
To Peter: you should go to the nearest university library or large city library; they can usually get electronic copies for institutional use.
The paywall benefits the publisher, however, not the researchers or the research itself.
“It’d be nice if cars and houses, music and movies were free too.”
It might be a bit like if cars, houses, music, and movies were produced using public funds but the catalogue that listed them cost $100, and that money went only to the publishers of the catalogue.
Yes. My comment was flippant.
But with the advent of the internet, the catalogue is much cheaper.
You can usually get a copy from the author. For that matter PNAS has an author pays, open access option. They did not take it.
Certainly, one should be aware of what others in his field have done and are doing. Otherwise, a lot of wheels get re-invented. I’m not sure, however, how much one should be influenced by the work of others. Independent thinking can open new doors. Although an inventor rather than a scientist, Gaston Glock, who revolutionized handgun design, is an example of the triumph of independent thinking. When asked how Glock “was able to get it right,” firearm authority Patrick Sweeny is reported to have said, “because he hadn’t done it before.”
Kuhn argued that many successful innovators followed the pattern of success in another field, followed by moving into a new one and seeing it with fresh eyes. Don’t know if there is any hard data to back up that supposition, but it makes a certain amount of sense.
Why does some or all of the sea ice loss have to have been caused by natural factors? Perhaps the natural factors would be favorable for more ice, and not less.
It’s a fundamental application of Curry’s Monopolar Uncertainty Principle.
Come on, Gary, keep up.
In Christy’s evidence to the US Senate, all of these models gave long-term results showing higher levels of temperature than the satellites gave. Explanations for this discrepancy have been advanced, but in the meantime we have to treat model results with skepticism. I would expect that somewhere over the North Atlantic a hot spot would hover, simply reflecting the urban heat centres on both sides and the dynamic plume of CO2 from both sides, somewhat distorted by the Coriolis effect.
But it has been shown that he lied to the US Senate…
Are you getting confused with James Hansen?
Lied is a strong word, Louise. Are you sure you want to go there?
Skeptical Science is not a reliable source. It’s a blog with a demonstrably dishonest owner. Try again.
I reviewed your link and while there were many characterizations of his testimony, none were using the word lie. Do you need a primer on the English language? Perhaps the preponderance of recent scientific studies is getting to you.
Australia and South Africa to decide where and how to spend the $100 billion ‘Green Fund’ which is to be contributed by rich nations to help poor nations mitigate climate change in their countries
Doesn’t this raise a little bit of concern? The rich countries to contribute $100 billion (per year?) to a Green slush fund, but how it will be spent / distributed (and used to support corruption) will be decided by others.
It will be spent on lavish trips to resort cities. We already know that.
The AGW sand castle, as shown in the graph below, was built by smoothing all the oscillations in GMST before 1970s and leaving the warming phase of the oscillation since then untouched and calling it man-made.
Presumably, that graph is supposed to show how volcanic eruptions cause a temporary reversal in the warming. But if you look at the list of eruptions in the 20th century, it is apparent that you could choose any volcanoes you like to explain cooling at certain points on the graph.
On the other hand, significant eruptions have been ignored, for instance:
“Novarupta 1912 Largest eruption of the 20th century.”
(Temperatures actually showed an increase after this one.)
The eruptions usually pointed out on global temperature graphs are the large tropical volcanic eruptions. From the tropics, sulphate aerosols can spread poleward into both hemispheres so there is a truly global effect. Novarupta was in Alaska, which limited its global reach. Nevertheless, you can see it in the time series of aerosol optical depth. Its effects are confined to the northern hemisphere. It’s called Katmai in this paper:
I was at the Met office library two weeks ago and you could have played football in the car park, so glad to see that you at least are manning the place. :)
Is there a general rule of thumb with regards to how the location of any volcano AND its intensity will affect the global climate?
Thanks John, I will read that paper later.
John, I have read the paper that you advised. What strikes me most is the lack of information about the stratospheric aerosols in the atmosphere of the tropics and the southern hemisphere before 1961. Sometimes these are assumed to be the same as the northern hemisphere, and at other times are just given a score of zero. Also, anecdotal reports (such as the tints of sunsets) have been converted into numbers. I can’t see how values for the southern hemisphere and the tropics can be used before 1961. This is obvious from a glance at table 2.
But estimated values have been used and these have then been averaged to make a global average. If you look at 1912, this year had the highest figure for any year of the 20th century in the northern hemisphere (above 30 degrees). But the 2 southern hemisphere zones have both been given a 0, consequently lowering the global average.
I have plotted a graph of the temperature data from Armagh (northern hemisphere), which seems to be the most reliable data that we have, and there seems to be nothing unusual about the temperature that year.
I have also looked at ice core data:
This looks like particularly good data as this statement has been tagged on to it:
“These data start in 1872 and are absolute dated by counting of annual layers”
I have read that 1816 was the 2nd coldest year for thousands of years (due to a huge eruption in 1815) and was called “the year without summer”. However the coldest year in the ice core record (which goes back to 1899 BC) was 1813 AD, which was 2 years before the volcano erupted.
There are some examples where it appears that there are correlations between volcanic activity and temperature falls, but there are also examples where there is no correlation. And this applies to sulphate aerosol levels against temperature, also.
So it is just my hunch that the people plotting the graph were subconsciously omitting volcanic activity when it didn’t fit with temperature falls on the graph.
I appreciate that you have far more knowledge than me on this subject and that I am frequently going to get things wrong, but my views are typical of ordinary members of the public when they see graphs and data like this.
Thanks again for your help,
Hi Dr. Kennedy
Your colleague Chris Folland published in 2009 reconstruction of the summer NAO. Recently I did reconstruction of the AMO based on the sun-earth geomagnetic data.
Either both of us are correct or both wrong; maybe Mann should have gone to Norway, not Yamal.
Thanks for an interesting post. I wonder if/how the mechanisms of internal variability in Wallace et al. and in your 2007 paper are consistent with the observed much greater increase in high-latitude daily low temperatures than in daily high temperatures (a la changes in the nighttime boundary layer stability). Does this seem consistent to you?
Actually, a colleague approached me recently with an idea to write a proposal on exactly that topic
I hope that happens. That sort of connection (changes in the winter nighttime boundary layer due to changes in internal processes like you and Wallace et al. identify) would seem very interesting, and especially interesting if a connection can’t be identified/verified.
Arctic amplification comes from below not from above.
Hi Judy – Our work discussing the importance of regional circulations has also been mostly ignored. See, for example, this post in 2005
and our paper [with respect to aerosols]
Matsui, T., and R.A. Pielke Sr., 2006: Measurement-based estimation of the spatial gradient of aerosol radiative forcing. Geophys. Res. Letts., 33, L11813, doi:10.1029/2006GL025974. http://pielkeclimatesci.wordpress.com/files/2009/10/r-312.pdf
where we wrote
“Unlike GHG, aerosols have much greater spatial heterogeneity in their radiative forcing. The heterogeneous diabatic heating can modulate the gradient in horizontal pressure field and atmospheric circulations, thus altering the regional climate.”
Marcia Wyatt’s new research that is posted on at
is also directly relevant. Even the NRC report
National Research Council, 2005: Radiative forcing of climate change: Expanding the concept and addressing uncertainties. Committee on Radiative Forcing Effects on Climate Change, Climate Research Committee, Board on Atmospheric Sciences and Climate, Division on Earth and Life Studies, The National Academies Press, Washington, D.C., 208 pp. http://www.nap.edu/openbook.php?isbn=0309095069
“Despite all these advantages, the traditional global mean TOA radiative forcing concept has some important limitations, which have come increasingly to light over the past decade. The concept is inadequate for some forcing agents, such as absorbing aerosols and land-use changes, that may have regional climate impacts much greater than would be predicted from TOA radiative forcing. Also, it diagnoses only one measure of climate change—global mean surface temperature response—while offering little information on regional climate change or precipitation. These limitations can be addressed by expanding the radiative forcing concept and through the introduction of additional forcing metrics. In particular, the concept needs to be extended to account for (1) the vertical structure of radiative forcing, (2) regional variability in radiative forcing, and (3) nonradiative forcing. ”
It is clear that the new Wallace et al 2012 paper is moving into the direction of assessing changes in regional circulation patterns, but there is, as you have also noted, relevant literature already published that support this view which has been mostly ignored (including in the IPCC WG1 report).
In the new assessment, hopefully, they will finally provide the proper evaluation of this issue.
Roger, I have been meaning to do a post on Marcia Wyatt’s thesis, i didn’t realize that you had posted on this last may. I will contact Marcia about this. I agree that this is a hugely important line of research.
I was especially intrigued by this paper, bearing in mind my interest in historical climatology:
“Northern Hemisphere Multidecadal Climate Variability: dynamics and history of climate-signal hemispheric propagation: 1700 to 2000 [to be submitted] Marcia Glaze Wyatt. ”
Has it been submitted and available to read online?
The AMO is not always a good indicator of temperature, especially in reconstructions.
Grey and Mann reconstructions appear to be OK, done on tree rings. I did one on geomagnetic signal, if you have patience and good eyesight you may be able to work out how they compare:
It is natural to inquire why this class of climate-change studies has been mostly ignored. :) :) :)
One explanation is that discourse on decadal-scale climate fluctuations is sandwiched between (very successful) Neven-style short-term narratives and Hansen-style long-term narratives … these narratives are appealing by virtue of their relative simplicity and robust grounding in observational data (Neven-style narratives) and fundamental physics (Hansen-style narratives).
The great challenge in studying decadal fluctuations is to establish solid grounding for decadal-scale narratives. Big computer codes are not entirely satisfactory in this regard … the codes may understand the climate, but we humans perhaps don’t. Perhaps we humans will always struggle to understand decadal fluctuations as well as our computer codes do. Statistical analyses are similarly unsatisfactory … they lack natural explanatory power, no matter how compelling the confidence level asserted.
Conclusion: It may be that decadal-scale understanding of climate change will always lag behind Neven-style near-term observational understanding and Hansen-style long-term energy balance understanding. And this will happen not for personal or political reasons, but for purely structural reasons, eh? :) :) :)
I would sooner point the finger at fashion/funding and the resultant groundswell of interest in aspects of science at any one time. For example, Dr Mann’s hockey stick would have been roundly condemned as inaccurate at one time. It then passed through a period when it was widely accepted.
However, with my own study, ably assisted by BEST :) and the recent study on the Antarctic, it appears that we can revert to our earlier ideas that the earth has been warming for the last 400-plus years. This renders the later part of Dr Mann’s hockey stick as of dubious value. It also puts Dr Hansen’s GISS project into its rightful perspective. That is to say, he merely monitored an ongoing warming trend and did not discover the start of it.
Why do you think the earth has been warming for 400 plus years Fan? :)
Could it be that perhaps co2 does not have much of an effect above 280ppm?
Climatereason, when we juxtapose Neven’s startling ice-melt observations (updated today) with Hansen’s startling ice-melt predictions (from 1981) … your work appears to be late for a party that’s already underway, eh? ;) ;) ;)
It is ignored because you kooks are too obsessed with your climate hype to pay attention.
a fan of *MORE* discourse: It may be that decadal-scale understanding of climate change will always lag behind Neven-style near-term observational understanding, and Hansen-style long-term energy balance understanding.
That’s an interesting observation. “It may be” fits with my own slogan that “maybe” is the key concept in the climate change debate. However, long-term energy balance “understanding” is incomplete (at least as presented in Pierrehumbert’s “Principles of Planetary Climate”), and Hansen’s understanding in particular hasn’t been shown to make very accurate predictions.
A fan of *MORE* Discourse: On explanation is, discourse on decadal-scale climate fluctuations is sandwiched between (very successful) Neven-style short-term narratives and Hansen-style long-term narratives …
Another explanation is that climate change hysteria (warming in the 30s, cooling in the 70s, warming in the 80s) is based on decadal-scale climate fluctuations, and the understanding of them represented in decadal-scale models tends to undercut the basis for the hysteria.
Quite the opposite, surely? We seem to have had a natural warming trend for at least 400 years rather than a man-made one since 1979. Little we can do about it, so we can save lots of time and money by closing down the excesses of the climate science industry and doing something more worthwhile with the money saved.
Why do you think it’s been warming for all these years, Fan? How come Dr Hansen didn’t realise he hadn’t identified its start, and what has caused it? At a constant 280 ppm it doesn’t appear to be CO2, does it?
Fan, please take a step back from your advocacy of a political thesis and position and look to a restrained evaluation of the data. Tonyb tries to put you on the path time and again, but your persistent myopic bias keeps you veering to an obnoxious pattern of belief blather. There is a lot of science on this blog, and one can learn a lot by not filling up the space with silly faces and ad nauseam arguments. The earth abides and has been changing the climate for a long time. These are complex modeling problems, but data and comparison of data to models have to drive the evaluations. Ice ages seem to be driven by orbital changes and we may be overdue for one of them.
Scott, with respect, observation and theory both plainly say otherwise, eh? :) :) :)
Climatereason’s novel assertions lack solid observational *OR* theoretical foundations, and in their present largely-unpublished form, are understood solely by him, eh?
Whereas Neven’s near-term observations accord perfectly with Hansen’s long-ago long-term predictions, eh?
So consensus climate-change science is looking pretty solid, eh? :) :) :)
Observations? Tens of thousands of them. Published? Lots of material ready for my book ‘The Anecdotal Viking’
Long-term warming (MWP), a cold period in the middle (LIA), modern long-term warming. Long predating satellites and James Hansen. Why has it been happening for so long, Fan? What causes the changeover from warm to cold and back again with CO2 at a constant 280 ppm?
Is CET, which seems to adequately represent the temperature trend (if not the precision), showing us the need for a plan B to deal with cooling?
Sharing some of Dr Hansen’s budget to explore the answers to these questions would be great. Could you have a word with him when you point out that GISS was a staging post and not the starting post for the warming trend?
You are asking a kook obsessed with smiley faces, who echoes climate crisis talking points like beans in an oil drum, to be restrained or reasonable?
fan, you are so wrong but I will go back to skipping over your posts. Try to listen to tonyb and Pielke sr and move to something more professional.
A key event for a better understanding of the AA is the dramatic Arctic warming at the end of World War I in winter 1918/19, lasting for two decades, which is discussed in the book “Arctic Heats Up. Spitsbergen 1919 to 1939”, Bloomington/IN/USA, 2009 (iUniverse), 106 pages, available online at: http://www.arctic-heats-up.com/ .
I fully agree with:
vukcevic | August 24, 2012 at 8:11 am (above) saying that
“Arctic amplification comes from below not from above.”
Here’s a couple of recent papers from the same group I enjoyed. About heat transport in the Atlantic, doesn’t mention Arctic but how could this not affect the Arctic over the past few decades?
Nice links, “In particular, the observed large, rapid rise in SPG heat content in the mid-1990s is successfully predicted in the ensemble initialized in January of 1991. A budget of SPG heat content from the CORE-IA experiment sheds light on the origins of the 1990s regime shift, and it demonstrates the extent to which low-frequency changes in ocean heat advection related to the Atlantic meridional overturning circulation dominate temperature tendencies in this region”
Thanks for the links.
climatereason – I will ask Marcia
thank you, i would be most interested to read it
So how to make the knowns actually known? Sign up to be an IPCC author, so you can cite your own papers. Or issue a press release. Hmmm . . . There oughta be a better way.
This is the most interesting topic mentioned in a long time here, and there’ve been several not-bad ideas lately.
I’ve read tens of thousands of academic papers, mainly in the sciences. I’ve observed the careers of hundreds of academicians and scientists well-known and obscure both first hand personally and professionally and from a distance. My impression is that there are better and worse ways to get cited, to get one’s knowledge known.
The best way, in my opinion, to get cited more is to be a good citizen of one’s field; it takes being a good neighbor to make a good neighborhood.
Speak well of others’ science where their science is well done by your own standards and lights. Do it openly and unprompted. Do it in letters to editors or blogs, or personally, or in public talks. There is no need to go overboard, but demonstrating currency and interest in the publications of others encourages exchange of ideas and promotes scholasticism associated with one’s own name and own circles. A Google+ circle is neither a bad model for this, nor a bad forum, nor is Facebook, nor is twitter, nor is face-to-face or private email to the authors or publishers, most especially if they are not well-known to one.
Speak critically but positively and with greatest civility (just observe how I do it, and do the opposite) of points you disagree with only after asking clarifying questions and making best efforts to obtain fullest understanding of what is being claimed. There is nothing that undermines the credibility of a citizen more than slamming something wrongly, and credibility is the greatest gauge of future citation in the long run. The very same forums and methods as speaking well applies, however it is far better to speak well much more often and speak ill much less.
The worst thing in my opinion a citizen can do is to remain utterly silent in one’s field about publications. This weakens the whole field, undercuts every publication and leaves the greatest possible uncertainty. Do not be afraid of epithets like busybody, relentless responder, or attention-seeking gloryhound — most especially by speaking your truth quietly and clearly and not being a gossip or putting personalities ahead of ideas, and of course actively listening* at least twice for every time you speak.
*Active listening is the art of intelligently drawing the most and clearest information from a source given the constraints of the exchange.
When you are curious about why your own work was not cited in the work of another, it’s perfectly valid to ask them directly; it’s also valid to collect data by observation, form hypotheses and use inference. What citations did they use? What qualities do those citations have that differ from your own? Would citing your work have strengthened their paper, or weakened it? Strengthened or weakened the field?
Sometimes — most times — many independent papers are published in fields with near simultaneity sharing similar independently arrived-at ideas. This was seen with what is now called the Higgs Boson, in that no fewer than six such papers appeared within a handful of years around half a century ago, but that is merely one single instance.
Is your work really that directly related? Published before they submitted their work? Of sufficient importance? Adds credibility? Adds information? Adds parsimony? Adds universality (generalizability)? Subtracts feigned, spurious or false hypotheses? Is it readable? Timely? Simplifies without oversimplifying? Does your work cite _only_ and expressly works itself with these qualities?
Do you self-cite or clique-cite too often? Do you cite those with poor credibility, or do you publish with too much of a chip on your shoulder or other agenda that detracts from your Science?
What would Isaac Newton say about your body of work if he read it today?
And what would they say of your correspondences on the published works of others, as a citizen of Science?
Michael | August 24, 2012 at 9:27 pm |
Well, half-said. I left out tons about putting images near the top of the paper and mathematics in appendices, vetting closely for fallacy and propaganda and excising them ruthlessly, taking out anything but Science from a Science paper unless one is a qualified expert in the other topic with utmost authority to speak on it, and a few other niceties, but those are details of citizenship, not its purpose.
It is illuminating that some model results conflict with other model results. It would be helpful to know whether either model is accurate enough for us to know which is false.
Looks like the professionals are, step by step, getting close to my New Climate Model which deals with just such matters as those mentioned in the paper and brings it all together into a broadly plausible narrative.
No, you are a crackpot. All the signs are there.
Such as the narrative fitting the observations?
Stephen Wilde | August 25, 2012 at 11:26 am | Reply
“Looks like the professionals are, step by step, getting close to my New Climate Model which deals with just such matters as those mentioned in the paper and brings it all together into a broadly plausible narrative.”
What we need are fewer plausible narratives. So few that only one remains. We’re probably past the point of most people recognizing that the “settled science” claim of only one plausible hypothesis is wrong. If there were only one plausible explanatory process, the matter would be settled by now. When a few decades of handwaving about how the science is settled doesn’t work, then it’s pretty safe to conclude the settlement claim was vacuous from start to finish.
If anyone is interested see here:
as regards the actual mechanism
as regards the basic system energy content
as to how it all fits together.
Not yet perfect but progressing nicely.
why is the Northern Hemisphere (NH) temperature rising at four times the rate (0.28 °C per decade) of the tropics and the Southern Hemisphere?
That is where the ice was that melted and retreated that brought us from the Little Ice Age into the Modern Warm Period. That is where the snow is going to fall that takes us back into another cool period.
More and less ice on the ground in the Northern Hemisphere does regulate the temperature of earth. The temperature at which Arctic sea ice melts and freezes is the switch that turns the snow monster on and off. The cold season of 2012-2013 will demonstrate that the snow monster is still turned on. It will stay on as long as the oceans are warm and the Arctic is open. We are in the warm cycle, and that is when it snows more. From the Little Ice Age until about 1998, the snow monster was more off than on. Since 1998, the snow monster has been more on than off, and this will continue because the oceans are warm and they will not cool quickly.
How do we apportion components in Climate Policy?
In law, one apportions costs by several principles.
Among these is the Deepest Pockets principle — a wronged party may pursue payment in full for all of the wrongs suffered from the easiest party to prosecute, or the one with pockets deep enough to cover the damages, and the law will uphold this as there is no good reason to encumber the wronged parties with additional hindrances to prosecution of legitimate claims.
So if even 1% of the harm of weather is due to even a 1% increase in the probability of that weather under a changed climate, and the change is through negligence, or with knowledge of and indifference to foreseeable suffering, then it’s arguable that that’s all the cause needed for 100% apportionment.
Can benefits of changes decrease apportionment? No. Not without explicit consent to and agreement by the damaged parties, as in the form of a contract where the benefits represent compensation.
Are there other schemes of apportionment less onerous to the guilty parties? No; while there are other schemes of apportionment, it is not for the guilty to determine apportionment, but for the injured party to claim.
Is it for Science to determine apportionment? That’s silly. There’s no inference that can settle these questions objectively, as there will be multiple mutually-incompatible schemes of apportionment each subjectively equally supportable by some rationale.
Unfortunately there is no tort law that applies to one sovereign nation vs. another. Much to your distress I’m sure.
Bart R | August 24, 2012 at 8:53 pm | Reply
“The best way, in my opinion, to get cited more is to be a good citizen of one’s field; it takes being a good neighbor to make a good neighborhood.”
The best way must not mean the most efficient way. Notoriety is far more easily achieved by getting caught in deceit. Who on earth would ever have heard of the Climategate cast of stars if not for the public disclosure of the outrageous email admissions they made privately to each other? Who was Gleick before getting caught with his hand in the cookie jar? Getting recognition for important work first requires doing important work. Hansen et al. are wool gatherers. There’s not a damn thing of actual import they do. No one would know Hansen from Adam if he didn’t get out and get his picture in the newspapers wearing handcuffs. That’s negative attention, and it’s really easy to get.
David Springer | August 26, 2012 at 4:25 pm |
There are a number of cold fusion adherents who would agree with you; however, they couldn’t get cited if they were dating Lindsay Lohan and Rihanna at the same time. Climategate doesn’t seem to have elevated the citation rate of any of the principals very much, and indeed many of them took a hit in credibility so profound as to have delayed some elements of the field by years, and left much of the rest of the field reeling in a shock that a half dozen inquiries have not fully staunched.
Gleick was according to some one of the most important moderate voices of reason and ethics before his foot-shootingly bizarre incident, and now has been booted from or withdrawn from several august and influential posts. Notoriety’s done him no good.
Hansen spent the mid-1980s to the late 1990s on the outside, as a heretic who was flatly and roundly dismissed by the authorities as well as by fabulists. NASA tried to shut him down, Gray took repeated potshots at him, and UAH was hostile from the start. If anything, the remarkable story is the changing of minds among figures in positions of authority by Hansen et al.’s quiet and persistent scholarship, swimming against the dominant beliefs of the old guard for almost a quarter century before he became very much a public figure by public protest.
Al Gore was practically the first public celebrity dedicated to Climate Change as an issue, but he wasn’t actually the first nor the only.
http://www.leonardodicaprio.org was founded in 1998, for example. Is he notorious enough for you? Did you even know he was involved?
Arnold Schwarzenegger is probably ten times more effective than Al Gore in moving public policy on AGW. Notorious enough for you?
Larry David’s wife, Laurie, is far more famous for being Mrs. Larry David, until you find out she co-produced An Inconvenient Truth.
And yet, when I look for science on AGW, I pay zero attention to the notoriety of the author; I research the way I research topics like surface plasmonics or the fossil record of elbows: by going to the literature and methodically surveying it. I don’t get my science from Rupert Murdoch.
You should try my approach. As Murdoch’s has failed you.
Interesting two papers. You were just unlucky with the second paper because Kaufman’s work was still two years in the future. You did what you could with what was available, and even got the rate of warming from 1958-2004 (0.4 degrees/decade) pretty close to what I pulled out of NOAA’s Arctic Report Card (0.5 degrees/decade). But then you also calculated global warming over the total interval, which should not be done, because different physical processes govern different periods within that time frame and they must not be averaged together.

For example, the period from 1958 to 1976 had no warming whatsoever and should not be made part of the average. This was followed by an apparent step rise associated with the beginning of the warm phase of the PDO. It is not well documented but apparently raised global temperature by 0.2 degrees in the late seventies. Then there was another period of no warming, from 1979 to 1997, in which nothing but ENSO oscillations existed and mean temperature was constant. And that in turn was followed by a complete outlier: the super El Niño of 1998. It brought a step warming that raised global temperature by a third of a degree in only four years and then stopped. It cannot be greenhouse warming, because you cannot turn the greenhouse on and off like that. Its most likely cause is warm water carried across the ocean by the super El Niño.

It is this step warming that is responsible for the 2000s being the warmest decade on record, not some imaginary greenhouse effect that Hansen is pushing. It is a true global warming incident that stands alone in almost two centuries. It should be intensively studied instead of hidden by falsified temperature curves. Temperature change is not its only sign. It has ecological effects, causing population migrations, and is also noticeable in Arctic regions. In Natural History there is an article about a visit to Yakutia in the Siberian Arctic. Their climate has changed and their land is melting. They get so much rain that they have trouble grazing the cattle on grassland. In a 2008 meeting, several elders remarked that this was a recent phenomenon, only noticeable for ten years or so. Do the arithmetic: their climate changed when the super El Niño arrived.
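The statistical point being argued above, that a single trend fitted across intervals governed by different regimes smears discrete steps into an apparent steady warming, can be illustrated with a minimal sketch. The anomaly series here is synthetic, step-like data invented purely for illustration (loosely following the step sizes and years named in the comment), not observed temperatures:

```python
import numpy as np

# Synthetic annual anomalies (°C): flat 1958-1976, a 0.2 °C step in the
# late 1970s, flat 1979-1997, then a further 0.3 °C step from 1998 on.
# Illustrative values only, not an observational record.
years = np.arange(1958, 2005)
temps = np.where(years < 1977, 0.0, 0.2)
temps = np.where(years >= 1998, temps + 0.3, temps)

def decadal_trend(yrs, vals):
    """Least-squares slope, converted from °C/year to °C/decade."""
    slope = np.polyfit(yrs, vals, 1)[0]
    return slope * 10.0

# One trend across the whole interval: the two steps show up as an
# apparently steady warming rate.
full = decadal_trend(years, temps)

# Trends within each flat sub-period are essentially zero.
m1 = (years >= 1958) & (years <= 1976)
m2 = (years >= 1979) & (years <= 1997)
seg1 = decadal_trend(years[m1], temps[m1])
seg2 = decadal_trend(years[m2], temps[m2])

print(f"full-interval trend: {full:.3f} °C/decade")  # clearly positive
print(f"1958-1976 trend:     {seg1:.3f} °C/decade")  # ~0
print(f"1979-1997 trend:     {seg2:.3f} °C/decade")  # ~0
```

The full-interval fit reports warming even though, by construction, no warming occurs inside either flat sub-period; whether the real record is actually step-like is of course the contested claim, not something this toy demonstrates.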