by Judith Curry
Juoakola spotted an interesting paper that I missed when it was originally published:
NONLINEARITIES, FEEDBACKS AND CRITICAL THRESHOLDS WITHIN THE EARTH’S CLIMATE SYSTEM
JOSÉ A. RIAL, ROGER A. PIELKE SR., MARTIN BENISTON, MARTIN CLAUSSEN, JOSEP CANADELL, PETER COX, HERMANN HELD, NATHALIE DE NOBLET-DUCOUDRÉ, RONALD PRINN, JAMES F. REYNOLDS and JOSÉ D. SALAS
Abstract. The Earth’s climate system is highly nonlinear: inputs and outputs are not proportional, change is often episodic and abrupt, rather than slow and gradual, and multiple equilibria are the norm. While this is widely accepted, there is a relatively poor understanding of the different types of nonlinearities, how they manifest under various conditions, and whether they reflect a climate system driven by astronomical forcings, by internal feedbacks, or by a combination of both. In this paper, after a brief tutorial on the basics of climate nonlinearity, we provide a number of illustrative examples and highlight key mechanisms that give rise to nonlinear behavior, address scale and methodological issues, suggest a robust alternative to prediction that is based on using integrated assessments within the framework of vulnerability studies and, lastly, recommend a number of research priorities and the establishment of education programs in Earth Systems Science. It is imperative that the Earth’s climate system research community embraces this nonlinear paradigm if we are to move forward in the assessment of the human influence on climate.
This is a really provocative paper that deals with many topics that have been discussed on previous threads, including chaos and complexity, prediction of emergent behavior, feedbacks and thresholds, natural internal modes and multidecadal ocean oscillations. Numerous examples are given of past abrupt climate changes.
The paper concludes with:
Therefore, we have agreed on a list of desirable research strategies – some of which are specific, employing integrated assessments within the framework of a vulnerability approach, and some of which are general. The list is not intended to be exhaustive but hopefully illustrative of the many challenges (and opportunities) facing the Earth’s climate system research community. Accordingly, we recommend to
• Explore the limits to climate predictability and search for switches and choke points (or hot spots) of environmental change and variability.
• Construct models to explain the nonlinear response of the climate system to changes in insolation forcing due to orbital parameter changes, an objective best approached from the paleoclimate perspective.
• Improve our vision of the climate’s future through a better understanding of its history. Paleoclimate and hydroclimate records exhibit abrupt changes in the form of rapid warming events, the irregular oscillations of ENSO, catastrophic floods, sustained droughts, and many other nonlinear response characteristics. Extracting, identifying, categorizing, modeling and understanding these nonlinearities will greatly help our ability to understand the present and future state of the climate.
• Develop GCMs coupled to low-dimensional energy balance ice sheet/lithosphere hybrid models (e.g., Deconto and Pollard, 2003) that can simulate the interaction between hydrosphere, atmosphere and land over a wide range of spatial (continental to global) and temporal (centennial, millennia) scales.
• Understand the global connectivity and variability of ocean-atmosphere coupled phenomena, such as the North Pacific Oscillation (NPO), the Pacific Decadal Oscillation (PDO), the Arctic Oscillation (AO), the North Atlantic Oscillation (NAO), and the El Niño/Southern Oscillation (ENSO).
• Promote research to improve techniques that measure directly or indirectly the spectral variability of the Sun’s irradiance output at decadal and millennial scales.
• Understand the physics of the ocean thermohaline circulation (THC), whose collapse may be one important cause of major climatic change in Western Europe and North America (Rahmstorf, 2000).
• Perform sensitivity experiments with global climate models to evaluate the response of the climate system to biospheric interactions (including vegetation dynamics, and the effect associated with the anthropogenic input of carbon dioxide and nitrogen compounds), the microphysical effects on clouds and precipitation due to anthropogenic aerosol emissions, and land-use change including fragmentation of ecosystems. Existing experiments to explore these effects include Cox et al. (2000), Eastman et al. (2001b), and Pielke (2001a,b).
• Investigate the benefits and risks of large-scale deliberate human intervention in the climate system. For example, carbon sequestration, associated with land-management practices, could be a strategy to remove CO2 from the atmosphere. This should include the concurrent effect on water vapor fluxes into the atmosphere and the net irradiance received at the Earth’s surface (e.g., Betts, 2000, Claussen, 2001). Another example is the effect of the construction of large-scale water systems and the control of large lakes such as Lake Victoria and the Great Lakes on regional climate systems.
• Identify locations or regions that are particularly sensitive to or easily impacted by the planetary climate system. The Amazon rain forest and its fluvial regime (Cox et al., 2000), Southeast Asia (Chase et al., 2000), the North Atlantic Ocean (Rahmstorf, 2000), the Arctic Ocean (Foley et al., 1994), the boreal forest (Bonan et al., 1992), and the Nile River system are examples of such sensitive locations.
• Investigate in increasing detail nonlinear interactions involving changes in biospheric emissions of chemically and radiatively important trace gases, changes in atmospheric chemistry affecting the lifetimes of these gases, and resultant changes in radiative forcing. Examples of such investigations using simplified models include Homes and Ellis (1999) and Prinn et al. (1999).
To conclude, we recommend the development of new educational initiatives on environmental/climate science. The complexity of the climate system, its myriad of parts, interactions, feedbacks and unsolved mysteries needs researchers able to transcend their own specialties, jump over and build bridges across artificial disciplinary boundaries. Hence, a fundamental requirement for the future environmentalist/climatologist is a firm grasp of the mathematics and physics of nonlinearity and of the methods and goals of interdisciplinary climate science. We enthusiastically endorse John Lawton’s (2001) call for establishing specific programs on ‘Earth System Science’ (ESS) at various institutions and universities, in order to provide upcoming generations of scientists with insight into the complexity, the interdisciplinary nature and the crucial importance of these themes for the future of humanity. The greatest challenge is to build a strong research infrastructure that defines ESS, and as Lawton notes, the greatest barrier at present is the lack of organizations ready to nurture this new discipline.
Sounds really really good to me.
Moderation note: this is a technical thread. Off topic comments will be deleted.
Thank you, thank you, Professor Curry, for continuing to pursue the causes of climate change.
Hang in there!
Oliver K. Manuel
Nowhere do I see any specific reference to urgent study of the three basic conundrums facing CS and the planet:
a) Is CO2 forcing?
b) Is warming (A or non-A) beneficial within probable limits (historical)?
c) Is CO2 beneficial to the biosphere?
Non-linear tipping points aside, of course. Understanding those is going to be a medium/long-term process. Meanwhile, the AGW/PNS/UN/EU/EPA machine roars on, working to cripple the planet’s economy (which is the only way “carbon control” can occur.)
There doesn’t need to be. Dismissing the 9th point as an ‘it’s OK to publish this’ compromise, the rest seems exceptionally promising.
They not only admit the complex and (so far) unpredictable nature of the climate but suggest ways to evaluate it.
The paper seems to actually WANT to investigate the mechanisms that are driving the climate through, it would appear from these excerpts, actual scientific methodology.
CO2 is irrelevant until you can quantify the natural effects; this paper seems to recognise this and further suggests a ‘foundation course’ for future climate scientists.
Finally, it’s always good to see someone calling for people to ‘jump the fences’ between academic arenas to better a particular field’s knowledge.
I’m, for once, genuinely impressed.
That seems like a weird paper to get into a journal. It sounds more like a government/professional organization assessment report of some kind… and then I saw Pielke Sr.’s name. It makes a little more sense now.
I think first things first. The community needs to figure out which relationships between climatic parameters are super-linear (a stronger-than-linear relationship) and which are sub-linear (a weaker-than-linear relationship). From there, I think threat, response and adaptation assessments will be somewhat more straightforward.
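As a toy illustration of that distinction (a sketch of my own; the response functions below are hypothetical stand-ins, not real climate relationships), a simple "doubling test" separates the two cases:

```python
import math

def classify_response(f, x, tol=1e-9):
    """Crude doubling test: compare f(2x) against 2*f(x) at a point x > 0."""
    ratio = f(2 * x) / (2 * f(x))
    if ratio > 1 + tol:
        return "super-linear"   # response grows faster than the input
    if ratio < 1 - tol:
        return "sub-linear"     # response grows slower than the input
    return "linear"

quadratic = lambda x: x ** 2             # hypothetical super-linear response
logarithmic = lambda x: math.log(1 + x)  # hypothetical sub-linear response

print(classify_response(quadratic, 3.0))    # super-linear
print(classify_response(logarithmic, 3.0))  # sub-linear
```

A real assessment would of course probe a whole range of inputs rather than a single point, but the check conveys the idea.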
Actually, this kind of piece is not unusual for the journal Climatic Change (this is the journal that Steve Schneider launched.) Note, this is the journal that is hosting the special issue on Framing and Communicating Uncertainty for the IPCC (for which I am submitting a paper).
Dr. Curry,
Is it safe for a layman to assume that the authors’ case for non-linearity and threshold behavior does not constitute an argument for or against the drastic, irreversible tipping points postulated by some climate scientists?
“Accordingly, we recommend to”
I recommend we cure cancer by finding a thing that kills cancer but not people.
Here I show a nonlinear North Atlantic climate-regulating ‘input’ (square law), apparently driven by solar activity, in relatively good agreement with the CETs.
Correlation appears to be highest during the last 100+ years, the period of good data availability.
http://www.vukcevic.talktalk.net/CDr.htm
What’s the physical mechanism?
For anyone from an academic or research institution interested in a further elaboration; my email address is on the graph.
There is none.
It’s in the lower right-hand corner of the chart.
vukcevic:
You made an interesting point. Have you taken a look at this recent paper published in Science:
Enhanced Modern Heat Transfer to the Arctic by Warm Atlantic Water
Robert Spielhagen et al.
This research and the unprecedented nature of the Atlantic warmth were highlighted in many blogs and news reports very recently. When I opened the original paper, I noticed that the warmth in the North Atlantic, especially in the 19th century, seems to follow solar activity from Solanki et al:
http://img269.imageshack.us/img269/3581/natemps.jpg
http://climateaudit.files.wordpress.com/2007/01/solar_141.jpg
maxwell asked for the physical mechanism, I could guess cloud changes due to GCR? According to Laken et al. there is a connection between solar activity and cosmic rays. But as I said, I am just guessing. But this seems to be one possible explanation.
P.S. Judith, you misspelled my nickname. And thank you for this topic! My nick comes from my real name which is Juho Akola.
What I meant to say, Laken et al confirmed a connection between cosmic rays and clouds. This is also a very recent study.
Surface salt changes have inhibited the solar penetration of heat into the oceans, which has left the equatorial ocean waters colder and shifted ocean heat to the Arctic regions, where vast amounts of evaporation are occurring.
This is another angle – solar UV drift and top down forcing by ozone warming
Lockwood, M., Harrison, R., Woollings, T. and Solanki, S. (2010), ‘Are cold winters in Europe associated with low solar activity?’, Environ. Res. Lett. 5, 024001 (7pp).
This paper is very interesting and seems to recommend a significant shift in emphasis away from what I would call the traditional IPCC type of climate research (with its emphasis on future projections and effects) towards something that accepts a far greater role for unpredictable natural variability and uncertainty.
The following quote is interesting:
“….since the climate system is complex, occasionally chaotic, dominated by abrupt changes and driven by competing feedbacks with largely unknown thresholds, climate prediction is difficult, if not impracticable.”
It would be a big change, and it will be interesting to see how it goes down with the likes of Trenberth, Schmidt etc, and I particularly look forward to hearing the views of Andy Lacis if he swings by.
While I don’t dispute any particular or general aspect of the paper, could anyone clarify whether the goal “..search for switches and choke points (or hot spots) of environmental change and variability..” is not a chaos-theoretical impossibility?
(By which, one guesses the authors mean to find Attractors within complex systems undergoing strong external inputs.)
It’s been a long time since I’ve done this sort of analysis myself, so would appreciate any clarification.
Translated: Follow Global Warming and Cooling Road to Mazzarella’s holistic hideaway.
The AGW True Believers obviously have nothing to oppose actual science-based reasoning. In doing so they are forced to argue against a holistic view of the world. That argument is anathema to the hippie scientific-illiterates that flesh out the membership of the AGW Climate-Man Cult.
A part of the AGW model-makers’ erroneous preconceptions obviously is a belief, “that anthropogenic CO2 is producing an enhanced greenhouse effect and that the Earth’s mean temperature will increase in the next 100 years.”
Another part of the AGW True Believers’ misconceptions is the conviction that reliable forecasting is possible based on, “starting from ever smaller boxes containing portions of atmosphere and ocean and where non-linear algorithms are forced to operate.”
Going down these roads, before the first prediction, the GCM model-makers already have become, “convinced disciples of the mathematical reductionism.” They also wish to be counted among the converted who eschew the point that an ‘average global temperature’ is meaningless because temperature is an intensive variable.
Additionally, since the AGW model-makers’ GCMs fail grandly when comparing predictions to reality on both a ‘zonal’ and ‘seasonal scale,’ the AGW model-makers must indulge the fiction that their GCMs will be reliable when predicting long-range global climate change; again, a preconception but more than that, a belief that has never been validated, nor can it be!
“This reductionist approach is misguided since the model will never be able to be correctly evaluated. To overcome such a paradox, we followed a holistic approach that analyses the Sun, atmospheric circulation, Earth’s rotation and sea temperature as a single unit (ut unum sint): the arrival on the Earth of fronts of hydrodynamic shock waves during epochs of strong ejection of particles from Sun gives rise to a squeezing of the Earth’s magnetosphere and to a deceleration of zonal atmospheric circulation which, like a torque, causes the Earth’s rotation to decelerate which, in turn, causes a decrease in sea temperature. Under this holistic approach, the turbulence of solar wind and the zonal atmospheric wind behave cumulatively rather than instantaneously, where energy inputs are first conveniently accumulated and then transmitted…”
(Adriano Mazzarella, Solar Forcing of Changes in Atmospheric Circulation, Earth’s Rotation and Climate, The Open Atmospheric Science Journal, 2008, 2, 181-184)
Hit the button too soon above. Posting again.
Non-linear climate responses are arguably best defined in the Dansgaard-Oeschger events seen in the Greenland ice core records. Interestingly, however, there was what appears to have been a small-scale DO event in Greenland during the 20th century. It consisted of a rapid surface temperature increase of 3-4C between about 1920 and 1930 followed by a gradual decrease back to 1920 levels by 1990. The 1920-30 warming appears to have been confined to Greenland. Temperatures in the Hudson Bay area immediately to the west in fact fell by a degree or two between 1920 and 1930.
As far as I know this event remains unexplained, quite possibly because I have been derelict in my literature searches. But if not it might be a good place to start investigating non-linear DO-type climate responses.
One prominent apparent cycle mentioned above is the Dansgaard-Oeschger 1500-year cycle evident in the Greenland ice cores. In a recent paper we analyzed these for regularity and evaluated several nonlinear dynamics (threshold) theories of their origin.
Loehle, C. and Singer, S. F. (2010), ‘Holocene temperature records show millennial-scale periodicity’, Canadian Journal of Earth Sciences 47(10), 1327–1336.
Thanks Judith.
This is an especially interesting topic for me.
On a first quick reading it seems a bit too general for me, but I will read it tomorrow in detail and comment if I have something relevant to say.
But clearly these issues are the “poor relation” of climate science, while they reign in the information sciences and in biology (brain and heart processes).
Tomas, I look forward to your remarks.
CO2 and tree rings are tinkertoys! I’m talking about Earth Systems Science!
This is such a rehash. Does chaos apply in the modern era? Well – yes – probably. But they ignore the most germane analyses of the modern era – and can offer no alternate explanation of modern warming. And it is still only the vaguest of natural science descriptions – there is certainly no math involved.
They offer no warming of the sensitivity of chaotic systems to small initial changes – such as those of greenhouse gas emissions. Such as we have been hearing for a decade or more from people such as Wally Broecker.
While I have no time for the conventional theories – http://www.earthandocean.robertellison.com.au/ –
This offers nothing but the vaguest of platitudes.
They offer no alternate explanation of modern warming, because that is what we are seeking – or rather, we are seeking a complete explanation that properly includes natural variability.
Robert,
It is only chaotic in observed science, or when just following temperature models. If the parameters of the different interactions are really understood, then future predictions are not difficult.
But currently, following temperature patterns on a changed planet will only show many anomalies in the lower range of temperature.
Massive precipitation anomalies are occurring all over the planet, but no one is following this.
People will certainly notice when food shortages hit and world prices skyrocket, as they are currently starting to.
Chaotic systems are not sensitive to changes on the order of GHG emissions, rather they are sensitive to infinitesimal changes. It is math, not causality, discovered by Poincare 100 years ago.
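That sensitivity is easy to demonstrate numerically. The sketch below (my own illustration, using the standard Lorenz-63 toy system rather than anything from the paper under discussion) starts two trajectories 1e-10 apart and integrates them forward; the separation grows to macroscopic size:

```python
# Sensitive dependence on initial conditions in the Lorenz-63 system.
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance one 4th-order Runge-Kutta step of the Lorenz-63 equations."""
    def deriv(s):
        x, y, z = s
        return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)
    def add(s, d, h):
        return tuple(si + h * di for si, di in zip(s, d))
    k1 = deriv(state)
    k2 = deriv(add(state, k1, dt / 2))
    k3 = deriv(add(state, k2, dt / 2))
    k4 = deriv(add(state, k3, dt))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

a = (1.0, 1.0, 1.0)
b = (1.0, 1.0, 1.0 + 1e-10)    # "infinitesimal" perturbation
for _ in range(3000):           # integrate to t = 30
    a, b = lorenz_step(a), lorenz_step(b)

separation = sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
print(separation)  # macroscopic, despite the 1e-10 starting difference
```

The same qualitative behavior appears for any perturbation size, however small, which is the mathematical point being made above.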
The Rial et al paper offers an informative perspective on the role of different elements underlying climate change. The importance of non-linearity is, I believe, already acknowledged as a contributor to prediction uncertainty, but the use of paleoclimatologic examples with particular relevance to abrupt climate shifts reinforces the point.
Still, the perspective is just that. A balanced understanding requires an awareness of trends based on theoretical and empirical evidence that can emerge even in the face of unpredicted variability, and are modified rather than obliterated by the variability. The warming of the past century and its likely continuation in the coming century are an example.
Regarding sudden climate changes themselves, the paleoclimatologic record suggests that they are more frequent and/or severe during glaciations than interglacials, at least for those involving temperature change. In part, this may reflect the importance of ice/albedo feedbacks as climate change amplifiers. The greater the ice extent, the greater the potential change in that extent and its consequences. Conversely, a positive feedback on warming would be impossible in an ice-free world. Other shifts, such as those involving vegetation and hydrology, may be less affected by glaciation/interglacial differences.
How might the glacial/interglacial cycles themselves be impacted by trends, including anthropogenic trends? Based on orbital parameters alone that correlate gradual descent into glaciation (over millennia) with persistent summer ice in high Northern Hemisphere latitudes as indexed by reduced summer insolation at 65 N, a new glaciation would be unexpected for at least another 25,000 years or probably far longer – Berger and Loutre 2002. However, anthropogenic trends appear to be driving Arctic summer ice in the opposite direction – toward a major reduction or disappearance. Would this forestall the natural events anticipated from the orbital forcing, or are current trends irrelevant that far into the future? (Of course, a different type of anthropogenic influence – global thermonuclear war – could trigger a glaciation much sooner, as could a natural catastrophe such as a major asteroid impact).
In terms of extent, it is important to recognize that D/O events and similar abrupt changes may often have been hemispheric rather than global, with changes in one hemisphere accompanied by reciprocal changes in the other – the so-called “bipolar seesaw” that perhaps reflects teleconnections involving the southern ocean (Interhemispheric Seesaw). The likely effect of current trends on the operation of this type of reciprocity is difficult to gauge but may be significant.
The implications for human action of the phenomena described in the paper are likely to be vexing. While the possibility of unanticipated abrupt change reduces the level of confidence possible in the accuracy of future predictions, it does not render prediction useless. Within interglacials, major abrupt changes of a global nature appear to have been infrequent. Anticipation of changes likely within intervals limited to multiple decades or a century may well be possible within a reasonable level of accuracy. Most of the abrupt changes described by Rial et al involved abrupt warmings followed by more gradual coolings. If our current trend also entails warming, future abrupt changes may exacerbate the trend. Conversely, abrupt cooling such as occurred during the Younger Dryas would reverse a warming trend, although the trend might ameliorate the extent of cooling.
Fred:
In your post relative to Rial et al. on the Annan and Hargreaves thread you made the following observation: “It’s clear that non-linear processes can lead to climate effects difficult to predict, although I believe the paper overstates the case by neglecting the role of trends predictable on the basis of known phenomena and principles”. I assume that you were referring to anthropogenic forcings, and Rial et al. do indeed ignore them, presumably because there were none over the period of the paleoclimatic record. On the other hand, linear anthropogenic forcings are the ONLY thing current climate models allow for. The GISS Ocean-Atmosphere model surface temperature hindcasts for the 20th century can in fact be replicated almost exactly (R=0.99) simply by multiplying atmospheric CO2 concentrations by 0.0096 and adding 10.9. It doesn’t get much more linear than that.
So as I see it the problem is not whether Rial et al. are overestimating the impact of non-linear natural forcings but that climate models do not allow for them at all. This probably explains why they are unable to replicate a number of features of the 20th century temperature record, including the apparent small-scale Dansgaard-Oeschger event in Greenland described in my earlier post.
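That kind of replication is straightforward to check for oneself. Below is a sketch of the procedure with made-up illustrative numbers (not the actual GISS output or CO2 record); for data constructed around a linear relation, a least-squares fit recovers a slope and intercept of roughly the magnitudes quoted:

```python
# Sketch: fitting T = a*CO2 + b and checking the correlation.
# The arrays below are fabricated for illustration only.
import numpy as np

co2 = np.array([296.0, 300.0, 306.0, 311.0, 316.0, 325.0, 338.0, 354.0, 369.0])
temp = np.array([13.75, 13.78, 13.85, 13.88, 13.95, 14.02, 14.15, 14.30, 14.44])

a, b = np.polyfit(co2, temp, 1)    # least-squares slope and intercept
r = np.corrcoef(co2, temp)[0, 1]   # Pearson correlation

print(a, b, r)
```

A high R against a near-monotonic CO2 series is, of course, exactly why a simple linear fit can shadow a far more complex model’s output over intervals like this.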
Roger – Although the modelers can speak to this better than I can, I believe it’s necessary to identify which models you’re referring to. GCMs addressing long term trends tend to ignore short term variations, but many models attempt to address them, including the effects of ENSO, volcanism, and other variables. The fact that they don’t do as good a job as they do with long term trends is not for lack of interest.
Regarding GCM hindcasts and forecasts, the linearity does a good job representing the way the climate actually behaved over the multidecadal intervals evaluated, so it’s hard to fault the models. The main point has been that the models matched observations when forced with anthropogenic factors (including aerosols, not just CO2), but could not match observations, no matter how tweaked, when the anthropogenic factors were omitted. That is very compelling evidence for the substantial role of anthropogenic factors over the longer term. Combined with paleoclimatologic data, the anthropogenic role appears to be demonstrated conclusively, although the magnitude of the effects is clearly surrounded by a margin of uncertainty.
Fred, the reasoning in your second paragraph is backwards. That models without anthropogenic influences could not hindcast no matter how tweaked could simply mean that the modelers didn’t understand the natural influences. Ockham might prefer this explanation to the kludges and epicycles of aerosols.
====================
Kim,
You are right.
Current models can only predict warmer, warmer, warmer due to following only the temperature records to predict a future outcome.
Models that respond only to linear CO2 forcings are bound to show temperature increases in the 21st century because CO2 is projected to increase in the 21st century.
The natural influences were based on observational data (volcanoes, ENSO, solar changes, etc.), and one would have had to invent unobserved influences to bring natural influences into line with model projections. The conclusion that anthropogenic influences are required to explain the observations appears to be robust against any plausible alternative explanations.
Fred
I agree that we can’t presently simulate the 20th century record without using linear anthropogenic forcings. I’ve tried to explain it myself using natural forcings only and found that I could in fact do so, but only by cherry-picking my input data. I further agree that this indicates a significant anthropogenic impact. But I wouldn’t go so far as to say that it conclusively demonstrates a significant impact. I don’t think we know enough about how the earth’s climate works to demonstrate anything conclusively at this point.
I’m also not faulting the models. In fact I think it’s a credit to the modelers that they come as close as they do, given the massive complexity of what they are trying to simulate. However, the best models we have still do not do a “good job (of) representing the way the climate actually behaved over the multidecadal intervals evaluated”. Because they allow for greenhouse gas forcings and effectively nothing else, they are unable to hindcast some important features of the 20th-century record, such as the 1940-1970 cooling and the warming differential between the hemispheres – features that could well be a result of non-linear forcings.
Fred,
I agree with some of what you say, but I am curious why you say: “The warming of the past century and its likely continuation in the coming century are an example.” I see no supporting evidence that the continuation would not just as likely be a cooling in the coming century. What do you know that I have missed? The present forecast by many (including some supporters of AGW) is a likely (average) temperature drop for the next couple of decades, and then possibly an increase for a while, and then who knows what.
I haven’t seen predictions of a temperature drop in the journals – do you have specific references? Of course, a drop is possible, but I perceive it to be unlikely based on the past 100 years. In particular, the 1940’s to late 1970s were characterized by a brief bump and dip, followed by a long interval of flat but not declining temperatures. The lack of an actual long term cooling trend is interesting in that the interval was associated with a strong negative shift of the PDO plus a multidecadal rise in negative aerosol forcing demonstrated by a reduction in the transmission of solar irradiance to the ground as measured under clear sky conditions.
It is likely that CO2 will continue to rise, but aerosols will not, or will be reduced via pollution control measures. Given the lack of twentieth century cooling under circumstances that might have favored it, I judge it unlikely that we will see cooling over the next few decades, although the warming may be shallow.
Could this be wrong? Of course.
He could be referring to Latif-Keenlyside’s work, and the work of Tsonis-Swanson. Both of those groups think warming will resume.
a new glaciation would be unexpected for at least another 25,000 years or probably far longer – Berger and Loutre 2002.
That was a linear model and makes little sense when variables and phase inversions are nonlinear and the passage between the two climatic states T+ and T- is random, i.e. with no preferred periodicity, e.g. Nicolis and Nicolis 2007.
There is a good paper on the transitions, i.e. the phase slip mechanism, by Berger’s pupil and others: Crucifix 2010.
The second conclusion is related: as long as the forcing is not too large, different synchronised solutions may co-exist and moderate fluctuations may effectively induce jumps between these different solutions, equivalent to phase slips described for periodic forcing. These jumps effectively restrict the prediction horizon of the exact course of climate evolution at these time scales.
Throughout this study appeared a tension between the theory (concepts and theorems are valid at the asymptotic limit) and the practical needs of palaeoclimate theory. Indeed, it does not make much sense to consider prediction horizons much beyond 1,000 ka in this context because the system can no longer be assumed to be stationary.
Glaciation means massive evaporation and precipitation, and a change in the global climate’s weather patterns.
Record-breaking precipitation would be a good clue.
Maksimovitch – A few points.
First, as a matter of courtesy, could I ask you, when you cite references, to provide either a link to the journal article, or a complete citation – Authors, title (not essential), journal, volume number, page numbers, and year? (A simple link would obviate the need for the exact reference.) It is inconvenient to be informed that a journal reference makes a particular claim and then find it necessary to try to track down the journal and the article. Indeed, some of the claims might refer to blog articles, meeting abstracts, and other sources not found in the journals.
Regarding the specific points, the 25,000-year interval before the start of a new glaciation is a minimum estimate – most estimates put the interval much higher. It is also important to note that the time between the start of a descent into glaciation and a very substantial cooling toward a glacial maximum consumes many millennia. On this very specific issue of global glaciation, and not climate cycles in general, I have not seen published data conflicting with the conclusion that, despite uncertainties about the determination of glacial/interglacial cycles (including those in Rial et al.), there is no realistic possibility of sufficient orbital perturbation to trigger the start of a new glaciation for many thousands of years.
All of this assumes that the reduction in Arctic ice resulting from anthropogenic warming would not delay or prevent the reduced summer insolation at 65 N and thereby avert a glaciation that would otherwise occur. However, given the long time frame, and the likelihood that fossil fuels will have been exhausted many thousands of years earlier, this may not be an important consideration.
The paper referred to is ‘Synchronisation on Astronomical Forcing’. It constructs an interesting mathematical model for evaluating synchronisation processes between very slow events, such as glaciations, and more rapid processes, such as changes in the North Atlantic deepwater circulation. To me, it does not suggest or even imply that a new glaciation beginning as early as 1,000 years from now is within a reasonable range of plausibility (although if you disagree, you might wish to contact the authors for their view), nor do I interpret it to claim that its principles can reproduce past glacial/interglacial cycles, but rather that they are simply compatible with those past events.
In any case, it’s an interesting read, and may be useful in understanding why the most simple type of Milankovitch calculations fail to account fully for past climate changes in response to orbital forcings.
Firstly it finds there is a temporal horizon of 1kya for predictions , beyond this the positive lyaponov exponents are an irreducible constraint.This is well understood ie posiitve lyaponov exponents appear at around 0.22 time units a physical constraint for all forecasting communities.
Second in non linear systems away from thermodynamic equilibrium,with competing historicity multiple futures are available and well known in complex systems
http://www.scholarpedia.org/article/Complex_systems
What we need to understand is why does a small decrease in energy of 0.1 percent lower the potential barrier for a transition to glacial states eg nicolis and nicolis foundations in complex systems 2007 pg229.
That should be warning of course – chaotic systems exhibit sensitive dependence on initial conditions. As do the models I might add – they use the same equations of fluid motion used by Edward Lorenz in his 1960’s convection model – to discover chaos theory. It is nonsense to think that models can give numeric as opposed to probabilistic estimates of risk – but the worst case climate risk in a chaotic system is that changes could devastatingly large and abrupt.
So the bottom is still that the carbon emissions must be reduced. How to do that sensibly is the question. I’d lay odds on thin solar and 4th generation nuclear in the next decade – but who really knows. The idea that it can be solved it with taxes is nonsense.
If you think that chaos means that the IPCC is wrong (it is) and that means business as usual doesn’t bring any risk – you are living in a fools paradise.
Nope, Lorenz devised extremely simple systems of ODEs to model, grossly / roughly, some limited aspects of the properties of the atmospheric circulations. The equation systems are known to not be representative of any flows that can be realized in the physical world. The GCMs use more nearly complete representations of the equations of motion. Importantly, the simple ODE systems do not contain accounting of the terms in the complete representations that are responsible for turbulence.
Edward N. Lorenz, Maximum Simplification of the Dynamic Equations, Tellus, Vol. 12, pp. 243-254, 1960.
S. Lakshmivarahan, Michael E. Baldwin, and Tao Zheng, Further Analysis of Lorenz’s Maximum Simplification Equations, Journal of Atmospheric Sciences, Vol. 63, No. 11, pp. 2673-2699, 2006.
Edward N. Lorenz, Deterministic Nonperiodic Flow, Journal of Atmospheric Sciences, Vol. 20, No. 2, pp. 130-141, 1963.
E. Lorenz, Irregularity: a fundamental property of the atmosphere, Tellus, 36A (1984), pp. 98–110.
E. N. Lorenz, Can chaos and intransitivity lead to interannual variability?”, Tellus, Vol. 42A, pp. 378-389, 1990.
I am very pissed this morning. Not a mention of a satellite in this paper? Energy is everything – you do not get global warming or cooling without it showing up in radiative flux at the top of atmosphere. So they miss the biggest factor by a country mile in the Pacific influence on global energy dynamics. They are fools and charlatans in ivory towers for my money.
I don’t give a rats arse how complex things are – energy at TOA is everything in climate change. Climate change can be spotted at the speed of light. For the rest we have the hydrological and oceanographic networks built up over generations. And we don’t need models to predict ENSO – all we need is the bloody SOI – models are in fact no better than randomn three months hence. This is self serving academic bullshit with only a passing relationship with the real world.
I disagree, the piece seems to be trying to fix some very fundamental issues with climate science. I think its very encouraging.
And I don’t give a rat’s… Actually I’m feeling a lot better this morning. We are all a bit stressed this summer in northern Queensland.
I thought it was a very boring rehash – but the 2004 date places it in context. The 2002 NAS report – Abrupt Climate Change: Inevitable Surprises, does a better job of backgrounding. There have been many developments since this 2004. I think the biggest change is we are starting to recognise some of the mechanisms, feedbacks and changes in initial conditions. Clouds and solar uv in particular. There is a nice review (if I say so myself) of, inter alia, Pacific Ocean Science here – http://www.earthandocean.robertellison.com.au/
I am calling chaos the new climate consensus – one that makes a hell of a lot more sense that the old one.
Dear Dr Curry,
But there seems little room for argument that the current prevailing climatic conditions are due to natural variability alone.
As a student pointed out to me today, the changes are too rapid to be attributable to natural variability.
There is some very extreme weather events happening at the moment that need to be taken into consideration.
As we are all well aware, the `argument´ is not in the scientific arena, but in the political.
The extreme weather events happening are I assume the weather in Australia, which, it may be pointed out, fits a pattern of extremes of rainfall there for the past century. Ryan Maue has shown that total global cyclone activity is at a 30 year low. If you are implying that the extreme snows and cold we are experiencing in USA are the result of global warming, you don’t do your case any good with such a claim as such claims that warming makes it colder are a source of great hilarity to the public.
Leaving aside the question of whether Sarah’s argument is correct (although it is worth pointing out that some places have experienced uncommonly warm spells this, and last, winter) there are many things in science which may seem counter-intuitive to the public. To say that we should not make an argument because the public will have trouble believing it is absurd.
I should have put this reply here – if I may?
Cyclones in Australia are much more frequent and intense in La Niña years than otherwise and there scant and conflicting evidence than ENSO has changed at all. There is a certainty that we have returned to a La Niña dominated cool Pacific multi-decadal pattern since 1998. This is a period of frequent and intense La Niña over 20 to 40 years – and cold water rising in the north east Pacific. La Niña bring storms and flooding to Australia, Indonesia, Africa, India and China and drought to America. The cool periods are also biologically very rich – with cold and nutrient laden water rising in the eastern Pacific.
There is a ‘multi-variate’ ENSO index here – http://www.esrl.noaa.gov/psd/people/klaus.wolter/MEI/ – you can quite clearly see the blue La Niña phase to 1976, the abrupt shift to a red El Niño phase in 1976/1977 and a change to a blue phase again after 1998. The latter is a little harder to see – but certain from biological indicators.
These are globally cooler periods as cloud forms over cold water in the eastern Pacific and reflect more of the Sun’s energy back into space. But good records are only available after 1950 – severely limiting the conclusions one can reasonably draw. And I am a little tired of people drawing hard and fast rules from limited records – let alone from a single storm.
Proxy records in coral and tree rings don’t provide the level of detail required for detailed analysis – although perhaps Craig Loehle might disagree. Professor Nott from the James Cook University in Townsville, on the other hand, sorts through storm wrack from cyclones past – and suggests that we have in fact had fewer cyclones in the past century than previously. That must mean that there were more La Niña before the 20th century – an interesting indication of centennial variation in the Pacific.
But climate systems are chaotic, nonlinear and globally linked – there are tremendous energies cascading through powerful systems driving abrupt and violent changes in climate. And really it always has worked like this.
I do disagree with Craig Loehle however. I don’t use the term global warming because I think it is a misunderstanding of the fundamental nature of climate. BUT chaotic systems are sensitive to initial conditions – small changes such as those involved with anthropogenic greenhouse emissions can accumulate until they precipitate a change that is wildly out of proportion to the initial impetus. I would put it at a small risk – hell they’re the ones you need to watch – but a cold, cold, cold day on planet earth is possible within a matter of months to decades.
Climate science failed in following evaporation and precipitation patterns and also the shifting currents that can produce the vast amounts of water vapor that is currently covering almost all of the northern hemisphere land mass.
Just to say the warm currents have moved and not investigate it further is bad science. Ops, oscillation changes.
I am really surprised by any discussion of this as a ‘new’ paper. This Rial et. al. paper is not new or undiscussed. It is 7 years old and at this point familiar to many undergraduate students, Judith. It is a reference in AR4.
Some of the conceptual issues and arguments in this paper, pushed by Pielke, have been discussed in the past on many science sites – including RealClimate.
Of course the climate system is nonlinear and has surprises. However, some of the other claims (in terms of both conceptual science and pragmatics) are questionable.
The strength of the paper is perhaps that it highlights the extreme policy limitations imposed on public discussion, by those scientists who still want to continue to keep a focus on questions of prediction and projection.
I have elsewhere encouraged policy formation as a process in democratic decision-making. Scientists can continue to inform and economists can continue to do cost-benefit analyses, but citizens will have to decide climate policy on the basis of what kind of risks they wish to take and what kind of society they want to live in, because the uncertainties are not likely to be reduced any time soon.
The IPCC AR4 incorporated a correct understanding of this paper and what it does and does not add to the discussion, in their recommendations.
Steve Rayner uses an argument from comparison to help citizens understand that scientists focused on reducing uncertainties, at this point, can be reasonably viewed as leading inaction. :-(
Agreed. Something about this paper made me check the acknowledgements immediately: “This paper resulted from a Workshop…”
Not that its value as such is diminished, but one’s expectations should be adjusted accordingly. It is no doubt useful in describing what attendees agreed upon (at the time) was important regarding current understanding and potentially fruitful approaches for future research, but new and/or surprising insights will probably not be found here.
(Received 11 March 2002; in revised form 1 October 2003)
I hadn’t been aware of this paper although I was familiar with Rial’s previous interest in orbital forcing. In general, when a paper of interest is linked to here, I hope to know the exact reference so that I can cite it later, particularly if there’s a question as to whether it was published in a legitimate journal as opposed to someone’s blog. Some sites don’t provide that information, which is frustrating, but it’s usually possible to track it down via the Web (if the paper has actually been published).
It’s also interesting to compare prepublished drafts with final versions, which are typically toned down to eliminate the most egregious of the exaggerated claims sometimes found in the drafts (and touted by their authors in the blogosphere).
its Rial 2004 (109 citations)
IMO, climate projections/predictions are the achilles heel of the cAGW proponents. If we start to get a cooling phase, as many are predicting, then watch carefully for all sorts of reasons why predictions are no longer necessary but political action is. Martha seems to be hinting at this already, but I doubt it will wash with many.
Martha, this is the first time I encountered the paper and its a good overview for a number of topics i want to raise in the future.
It is your last sentence that I take issue with:
“Steve Rayner uses an argument from comparison to help citizens understand that scientists focused on reducing uncertainties, at this point, can be reasonably viewed as leading inaction. ”
I go further than that. I’m saying that many of the uncertainties are irreducible, and that for every uncertainty that we reduce, two others will pop up. See my uncertainty threads, especially uncertainty monster
Uncertainty doesn’t necessary lead to inaction, but misinterpreting the uncertainties and level of ignorance can lead to costly errors and/or inadequate solutions. Uncertainty is critical information that needs to be incorporated into the decision making process:
1) if the problem is well characterized by statistical risk, then an optimal solution strategy such as setting an emissions target makes sense. If the problem is not well characterized, setting an emissions target runs the risk on one hand of costly mitigation strategies that turn out not to be needed, or on the other hand inadequate mitigation strategy for a problem that is more severe than originally thought
2) if there is scenario uncertainty (and i speak more broadly here of scenario uncertainty than emission scenarios), then a robust approach should be used where strategies are useful across the full range of possible scenarios. Energy conservation, adaptation approaches are examples here, as well as any energy policies that otherwise make sense in context of economics, security, etc.
3) If there is ignorance (and the critical threshold issue and abrupt climate change qualifies as ignorance), the strategy is to increase overall economic and societal resilience so that societies are better armed to deal with the unexpected. this is the libertarian strategy of promoting economic development as the best way to deal with threats.
If people could get off that particular dime about uncertainty leading to inaction, we could actually get some where in terms of the scientific and policy debates on this topic.
Curryja: “If there is ignorance (and the critical threshold issue and abrupt climate change qualifies as ignorance), the strategy is to increase overall economic and societal resilience so that societies are better armed to deal with the unexpected. this is the libertarian strategy of promoting economic development as the best way to deal with threats.”
The first sentence makes sense, but the second is a non-sequitur. How does the libertarian growth strategy (in relation to environmental uncertainty) increase resilience rather than decrease resilience via unsustainable bubble growth?
Good question, and why is libertarianism invoked? To me, libertarianism smacks of utopianism – the belief that an unfettered free market will magically result in making everything right.
Corporations exist to return value to shareholders – if in the short term this means unsustainable exploitation, tough luck. Those in charge will still have attractive parachutes.
There is a blind spot in this from climate warriors from the sceptic (I decline to adopt the American spelling) side – it actually provides some certainty of a quantifiable small risk of extreme and rapid change as one attractor – or possible climate state.
So change is an imperative – although there are so many people at the limit of survival that my heartfelt preference is to avoid negative economic growth. The argument against continued economic growth always involves grain and a hypothetical chessboard. In the real world the board is always swept clean daily to feed hungry mouths and no grain accumulates. Cap and trade and taxes have little utility in doing other than reducing economic activity unless there is an alternative supply of energy. So if that is what you mean by action – well let’s duke it out in the arena of popular opinion.
I can think there is one technology that has both significant promise and a depth of technological development over 50 years – 4th generation nuclear. I, as a young hydrologist, marched against nuclear power in the 1970’s. I have not changed my mind. As human technologies grow more powerful there is a need to systematically evaluate these over as much as 50 years before broad adoption. 4th generation nuclear technology has evolved to the point where it is completely safe and practical. The half life problem for instance is, as a result of sophisticated fuels processing, a matter of 100’s of years as opposed to 100’s of thousands.
At this point indeed – the climate science community has comprehensively stuffed the science and there are many people who have used this to opportunistically push an elitist anti enlightenment agenda. You may well pull a sad face.
Thank you, Chief. I’m with you 100%. The molten salt reactors have a lot of promise and we should be building them instead of watching the rest of the world pull ahead of us.
Greetings from North Queensland.
Wally Broecker famously compared (ah it seems so long ago) anthropogenic greenhouse gas emissions to poking a stick at a wild and angry beast.
A quick observation – I do not think this paper would have been published prior to Climategate. I am thankful to the leaker inside CRU (or the hacker).
It was published in 2004.
I stand corrected!
Multidisciplinary initiatives & programs run into substantial political obstacles in academia. Departments jealously guard their (pure) funding & program structures. Administrative incompatibility between departments (for joint admin) can be truly unworkable (but you’ll never find an administrator who will admit it). The needs of students end up being peripheral despite eloquent lip service & colorful recruitment webpages.
The multidisciplinary road needs to be traveled, but enrolment in multidisciplinary academic programs is (at least presently) an inefficient way to pursue the journey.
Similar comments apply to research funding. Even if the funding is earmarked “multidisciplinary”, departments squabble for their “piece” (which many privately hope to channel into “pure” research).
The best way to avoid all that political nonsense is to independently pursue education & research from outside of academia. This affords the freedom needed to be efficient & to avoid suffocation.
We can always hope the system will clean up its messy politics…
This is an interesting paper purely in the sense that it opens up the conversation. Doing more research, in wider areas, makes a great deal of sense. The fixation with CO2 as thermostat has hampered climate science for decades. The lovely argument that all the stuff we can measure (well, er, but clouds) does not add up to the observed warming and therefore it must be CO2 is simply sad.
Better arguments, based on better measurements and better models, need to be made. But they will take time. And they may very well undermine the CO2 principle.
The science needed to make these arguments needs to be funded. But that is the very limit of what should be done. For the moment, the null policy of doing resolutely nothing needs to be followed unless and until there are engineering quality numbers pointing in a particular direction.
The lovely argument that all the stuff we can measure (well, er, but clouds) does not add up to the observed warming and therefore it must be CO2 is simply sad.
But the argument that the observed warming is largely caused by CO2 stands on its own; it is not dependent on the lack of plausible alternatives (although that does bolster the case for CO2).
If it were found that the warming could be explained entirely by natural factors that would still leave a genuine question of what has happened to the extra energy in the climate system caused by increased CO2 levels.
I suggest we consult Tim Palmer’s ‘Lorenzian Meteorological Office’. (Predictability of Weather and Climate)
Chaos is the new consensus and this puts severe limits on predictability – indeed predictability is not the right word. Climate futures are knowable only as a probability density function.
Perhaps if I didn’t think it was just math – the collapse of the climate probability function into a single reality might result in many climates in alternate universes? No? Never mind.
Climate is chaotic (as in theoretical physics) and so there must be some risk of abrupt and extreme change as a result of anthropogenic greenhouse gas emissions. I’m an engineer – trust me. There is a lot to be skeptical of – some of it here – http://www.earthandocean.robertellison.com.au/ – but it is time we moved on to sensible action.
To my mind – it is not about putting a lot more money into dickheads (can I say that here?) with a pH meter, or into taxes – but into efficiency, soil and cropping management, technology etc.
And how many more years will this research take?
Currently there are a great many weather anomalies that pose a direct threat to the current food supply system. The supply and demand system will only sharply escalate food prices beyond the reach of millions of people. Do you not think there will be a great deal of rioting and discontent as a result?
Thinking of the climate as governed by a variety of variables in a nonlinear way, certainly seems to open things up a bit. In particular, it relates to something that has worried me about the AGW debate.
While the original idea that increasing CO2 concentrations would raise the mean temperature of the earth had some scientific logic, the newer idea that increasing CO2 is already causing climatic disturbance seems far less logical to me. This is because, from the point of view of a hypothetical true climate model, there is nothing special about the current CO2 concentration – there is no reason (I think) to assume that incremental changes in CO2 or temperature are going to produce anything other than linear changes in the climate, whether or not there are tipping points, etc., lurking somewhere in parameter space.
Is this a fair argument Judith?
Oh my, it really is from 2004. Someone else suggested this paper to me and I instantly thought it was new and didn’t spot the year. My bad. But still, worth a discussion?
I didn’t catch the publication date either. But I really like the paper; I hadn’t encountered it before either. So this paper is a springboard for a discussion we need to have!
We should be discussing this paper on its merits, not its provenance.
“We are all Broecker now…”
Poking a stick at an angry and dangerous beast?
I don’t think they all see the implications that clearly yet.
I am pleased that one of our papers is being discussed. This paper resulted from a Workshop entitled ‘Nonlinear Responses to Global Environmental Change: Critical Thresholds and Feedbacks – IGBP Nonlinear Initiative’, organized by the International Biosphere-Geosphere Program (IGBP) May 26–27th, 2001, Duke University, Durham, North Carolina.
I initiated the paper and Professor Rial took the lead to complete it as first author. Steve Schneider, to his credit, encouraged us to see this paper through to publication.
The topic of the paper, and our conclusions, are as timely now as they were when the paper was published. Indeed, the recent extreme cold weather that has been experienced, as well as the rapid cool-down of the global average lower troposphere, emphasizes that the climate system is much more complex than has been reported in assessments such as the 2007 IPCC WG1 report.
You are so correct.
But the complexity is not beyond understanding.
The question is: who will listen?
The current cooling from my research is much deeper than what science has ever experienced before due to the physical planetary changes.
“The current cooling from my research is much deeper than what science has ever experienced before” – yeah, it’s down to the 30 year average on UAH. Science has never experienced that before…..
Oh wait.
The climate is the continuation of the ocean by other means.
H/t A. Bernaerts
=======
At the policy level, this proposal makes a good start on a redirected US Global Change Research Program. It contains a bit too much AGW directed content for my skeptical taste, especially GCMs and aerosols, but that just makes it a good compromise between AGW and skeptical scientific interests. Natural nonlinearity and oscillation must be the core issue.
I am less enthusiastic about creating a new Earth System Science department. At the present time this would likely just be a school for AGW proponents. Let’s develop the new science before we start teaching it.
“It contains a bit too much AGW directed content for my skeptical taste”
Interesting you think scepticism is an issue of taste. Any other vignettes for us?
So why, even with Schneider’s input, did this paper not provoke a discussion seven years ago?
A metric, that the corruption of the CO2 equals CAGW meme causes a lag in science of seven years.
===================
Wait’ll someone shows that the corruption of the science and policy making has created a bubble of financial expectation about decarbonization and ‘greening’ that will make the housing bubble pale in comparison.
====================
I hope this is all very amusing to Pielke Pere. It sure is to me.
===========================
I think that self-similarity (or scale invariance), which was mentioned briefly in the paper (e.g., figure 2b), deserves more attention, because it might have quite large impacts on what kind of statistics can be used. So far it seems that “mainstream” climate scientists try to avoid this question just by claiming that “climate is not weather”, and thus that simple high-school level methods are just fine, linear trends are meaningful, errors cancel each other out, etc. However, I haven’t found any reference to work that really gives a proof of any of those assumptions.
I think that rejecting the “climate is not weather” paradigm would have some quite important implications. E.g., is it really possible to extract an assumed linear forcing from temperature records or proxies – that is, is it possible to estimate climate sensitivity with any usable accuracy? Another question is whether the internal variability of climate is badly misunderstood. There is at least one paper claiming that DO events are not caused by any forcing:
Ditlevsen & Johnsen, “Tipping points: Early warning and wishful thinking,” Geophysical Research Letters, Vol. 37, L19703, 2010. http://www.leif.org/EOS/2010GL044486.pdf
From conclusions: “In conclusion, the early warning of climate changes or structural change in any dynamical system driven through a bifurcation, can only be obtained if increase in both variance and autocorrelation is observed. Conclusions drawn based solely on one of the signals and not the other are invalid. Furthermore, detecting increased autocorrelation, or critical slow down, with statistical significance is difficult. For the DO climate transitions, increased variance and autocorrelation are not observed. These shifts are thus noise induced with very limited predictability, and early detection of them in the future might be wishful thinking.”
PS. One comment on the claim made by an earlier commenter that we should be very worried if climate is chaotic, because small perturbations could cause large changes. That is true, but not the whole picture. A chaotic system can also be insensitive to large disturbances, and its robustness to different perturbations changes over time. Furthermore, a large change in a chaotic system does not (usually) mean that it goes haywire, but that it just moves from one attractor pool to another.
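The “attractor pool” picture can be made concrete with a toy stochastic model. This double-well sketch is purely illustrative – the potential, the noise level, and every parameter value are invented, and it is not the Ditlevsen & Johnsen model – but it shows noise-induced jumps between two stable states, and how a lower barrier between the pools makes transitions far more frequent at the same noise level:

```python
# Toy illustration (not a climate model): noise-induced transitions between two
# "attractor pools" of the double-well potential V(x) = x**4/4 - barrier*x**2/2,
# simulated with overdamped Langevin (Euler-Maruyama) dynamics.

import random

def count_transitions(barrier, sigma=0.35, dt=0.01, steps=200_000, seed=1):
    """Count well-to-well jumps over the run for a given barrier parameter."""
    rng = random.Random(seed)
    x, side, jumps = 1.0, 1, 0           # start in the right-hand well
    for _ in range(steps):
        drift = -(x ** 3 - barrier * x)  # -V'(x)
        x += drift * dt + sigma * dt ** 0.5 * rng.gauss(0.0, 1.0)
        if x > 0.5 and side == -1:       # hysteresis thresholds at +/-0.5
            side, jumps = 1, jumps + 1   # suppress chatter right at x = 0
        elif x < -0.5 and side == 1:
            side, jumps = -1, jumps + 1
    return jumps

deep = count_transitions(barrier=1.2)     # higher barrier between the wells
shallow = count_transitions(barrier=0.6)  # lower barrier, same noise level
print(f"jumps with high barrier: {deep}, with low barrier: {shallow}")
```

The same noise that almost never kicks the system over the high barrier produces frequent jumps over the low one, which is one way to read the earlier point about a small lowering of a potential barrier mattering a great deal; between jumps the system looks stable, and the transitions themselves carry very limited predictability.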
I think my exact words were, I would put it at a small risk…
Climate risk (in a chaotic system) is predictable only as a probability density function. We can’t predict with any certainty which attractor we will fall into – merely, to some degree, the topology of the phase space. To do that requires a systematic evaluation of phase space in a climate model, using the range of plausible initial and boundary conditions. This is in stark contrast to the ensemble, and quite unsupportable, approach taken to date in the modeling community. Models are themselves chaotic of course – using the Navier-Stokes equations that Lorenz used in his early convection model.
It goes beyond the point of ludicrousness that the arbitrarily chosen best runs from diverse models are sent to the IPCC, where they are assembled into a graph and the mean calculated – it cracks me up every time.
So saying that there are multiple possible states is only a very vague description of the phase space – and emphasising the states where shifts are minor falsely minimises the perceived risk. I think we have to accept that there is a small risk of abrupt and extreme change – and act accordingly.
The collapse of the probability function creates many climates in alternate universes. No – only joking.
Not to belabor this, Judith, but to be honest with you, one would have to be unaware of some major discussions in the relevant literature for the past decade, maybe two, to not recognize the vintage.
You will need to sort out how you thought it was new in terms of what has been discussed for years, and also why you believed it was new in terms of communicating uncertainty to the IPCC, when in fact this 7-year-old paper and the related science are clearly referenced in AR4.
First as tragedy, then as farce.
H/t 18th Brumaire.
========
And does referencing make their conclusions any more robust, and what was the paper’s implication for them? I can see this paper is in direct contradiction with the confidence the IPCC has given to the climate models and their results.
As far as I can see, the IPCC referencing it only means that it has been cited and discussed. One can cite a ton of papers and still claim that the moon is cheese with 95% probability.
Juakola,
“I can see this paper is in direct contradiction with the confidence the IPCC has given to the climate models and their results”.
Yesterday, you thought you had discovered a new paper. Maybe you don’t see as well as you think. Tell me about why you assume various aspects have not been properly considered by others, including the IPCC. Give me specifics to think about.
Yesterday you didn’t even know Rial et al. 2004 has been around and discussed in the literature, so I imagine you have not had sufficient time to do much additional reading, yet. There are a number of papers that take up your question. Among other things, while improved ability to model the distant future is going to be important as we go along, it is not looking as if models projecting future climate change over the next few decades are seriously limited by a current lack of knowledge about what drives abrupt climate change.
But do take some time to read. And next time, do the most cursory checks. It takes as much time to get it completely wrong, as to get it a little closer to right. Quite a bit has been written, since Rial et al, about the capacity of climate models to capture these processes and make projections of future climate; and also, whether risk and vulnerability assessments can and should be so narrowly focused.
Take care
With regards to your last sentence: yes, much has been written, including AR4. The overconfidence in the AR4 assessment, and its neglect of many issues, not to mention the presence of ignorance in many areas, is a topic on which I have written tens of thousands of words at Climate Etc. on earlier threads, perhaps before you checked in here.
Thank you Judith.
Martha wrote:
“Yesterday, you thought you had discovered a new paper. Maybe you don’t see as well as you think. Tell me about why you assume various aspects have not been properly considered by others, including the IPCC. Give me specifics to think about.”
Please read what R Pielke wrote on this topic, about IPCC referencing the paper. It was another paper by Rial et al according to him. Correct him if he was wrong.
Personally, I don’t understand the UN IPCC process and its ability to exclude bias. The bias is evident when comparing the level of understanding in the physical-basis report against the Summary for Policymakers (very low or low understanding of many things becomes a very high probability of global-warming armageddon in the SPM).
It also seems as if almost nothing has been learned from this paper. If it had been new, that would have been better for the climate science community than its being already six years old. But since it’s not new, shouldn’t something have been taken into serious consideration?
Has it?
Some examples: Dessler et al. He makes a linear regression fit (R^2 is 2%) to determine sensitivity when it has already been pointed out that climate is complex and nonlinear (Rial et al), and more advanced analysis methods for the feedbacks are already presented in the scientific literature (Spencer). However, what I am certain of is that this study will be cited by the IPCC AR5 as evidence of a positive cloud feedback.
Models are still inadequate to mimic the 1940 hump, and we still don’t know what causes the NAO, ENSO, etc., or what their cause-and-effect relation to global warming is. Instead, at least by the impression I got, modelers are trying to somehow dodge these questions. What complexity and nonlinearity might actually mean is that making useful climate models is currently out of our reach. That would make the modelers jobless, wouldn’t it?
I know I wear a tiny foil hat and my opinions might sound irritating/uneducated to some people, but that’s what my opinion is and that’s what opinions are for.
I love people in foil hats – do you have any photos?
I think you are actually at an intellectual tipping point – it is so hard being so far out in front that no-one understands.
have a look at – http://www.earthandocean.robertellison.com.au/ – and we can swap ideas
Dessler actually
Martha, I do not read every paper that is published, particularly at the time it is published. In 2004 especially, I was not interested in the kind of broader issues that I am currently interested in.
I don’t care whether the paper is new or not. Much of what I discuss here is about previously published papers, things that I am currently reading or topics that I am interested in.
I’ve changed the first sentence of the post to reflect that the paper isn’t “new.”
Judith,
It’s about being up to speed on basics, not reading every paper.
Great, let me know which basics I’m not up to speed on. None of the ideas presented in the paper are new to me (I’ve written extensive posts at Climate Etc. on many of these topics). I found the paper to be a useful integration, with interesting recommendations.
I strongly believe this should be at the top of the list!
You said:
“Improve our vision of the climate’s future through a better understanding of its history. ”
I say:
History is the best model to predict the future.
How can anyone look at the stable temperature range the earth has been blessed with for the past ten thousand years and still say it is anything other than extremely stable? Every time it gets warm, it then gets cold. Every time it gets cold it then gets warm. Some powerful forcing is always in the right direction and it has a set point close to where we are. The only powerful thing on earth with a set point near our temperature is Ice and Water. Most of these discussions are about trace stuff in the atmosphere. You never mention Ice and Water. You are on the wrong track. Look at the Mass of the Ice and Water compared to the Mass of the atmosphere. The trace gases that are featured in the debates would make almost no difference if they were not there. Something powerful is making the earth temperature stable. Small changes in Solar would put our temperatures off the charts without Ice and Water. You are all looking in the wrong place. The atmosphere is a tiny fraction of the Ice and Water on the earth and Ice and Water is not even in your discussions.
A 10,000 year interglacial? A fleeting moment in the passage of eons.
But you’re right. Water vapour – and hence the oceans – is one of the things we can’t do without. Hell, I’d be out of a job for a start.
Here is how the 2007 IPCC WG1 report commented on the Rial et al (2004) paper
“Abrupt changes in biogeochemical systems of relevance to our capacity to simulate the climate of the 21st century are not well understood (Friedlingstein et al., 2003). The potential for major abrupt change exists in the uptake and storage of carbon by terrestrial systems. While abrupt change within the climate system is beginning to be seriously considered (Rial et al., 2004; Schneider, 2004), the potential for abrupt change in terrestrial systems, such as loss of soil carbon (Cox et al., 2000) or die back of the Amazon forests (Cox et al., 2004) remains uncertain. In part this is due to lack of understanding of processes (see Friedlingstein et al., 2003; Chapter 7) and in part it results from the impact of differences in the projected climate sensitivities in the host climate models (Joos et al., 2001; Govindasamy et al., 2005; Chapter 10) where changes in the physical climate system affect the biological response”
But the wrong citation was listed, or the text citing Rial et al (2004) is incorrect.
The IPCC WG1 Chapter 8 lists the Rial et al (2004) paper as
“Rial, J.A., 2004: Abrupt climate change: chaos and order at orbital and millennial scales. Global Planet. Change, 41, 95–109.”
[from http://www.ipcc.ch/pdf/assessment-report/ar4/wg1/ar4-wg1-chapter8.pdf]
This is not the same paper as
Rial, J., R.A. Pielke Sr., M. Beniston, M. Claussen, J. Canadell, P. Cox, H. Held, N. de Noblet-Ducoudre, R. Prinn, J. Reynolds, and J.D. Salas, 2004: Nonlinearities, feedbacks and critical thresholds within the Earth’s climate system. Climatic Change, 65, 11-38.
http://pielkeclimatesci.wordpress.com/files/2009/10/r-260.pdf
Citation and referencing of your shared paper is however clear in a couple of chapters of AR4 WG2.
Oh, apologies, you actually already had corrected him.
In reply to another comment, Andrew Adams made an important point above that I believe deserves an entire comment of its own. The warming consequences of increasing CO2 are derivable from basic physics and are not the result of subtracting other variables from total climate behavior and attributing the difference to CO2.
The physical principles, established from a combination of theory and observational confirmation in both the laboratory and the atmosphere, involve the radiative absorption/emission responses of CO2 to infrared radiation, and the consequent effects on water, whose IR absorption/emission characteristics are also well documented. From this, it has been possible to conclude that observed increases in CO2 will yield a warming response that can be quantified within reasonable limits. It has been possible as well to conclude that a further consequence will be increases in atmospheric water vapor with similar warming effects, a reduction in snow/ice albedo that has been observed and is also a warming factor, and effects on clouds that are subject of some uncertainty but probably operate in a warming direction.
The entire concept of climate sensitivity encompassed by these phenomena is of course a topic of debate and uncertainty. I recommend that interested readers visit AR4 WG1 chapters 8 and 9 for the text, and even more importantly, the references, as well as relevant literature references since AR4, or earlier references that AR4 may have omitted. To date, the evidence supports the canonical estimated range of sensitivity between 2 and 4.5 C per CO2 doubling. Attempts to demonstrate a lower sensitivity have been made, but in general, low sensitivity estimates have been derived almost entirely from analysis of short term climate changes, while evaluation based on multidecadal climate change supports the estimates within the canonical range. Recent examples of this disparity include estimates based on SST changes that mainly involve ENSO variation – from Lindzen/Choi, Spencer/Braswell, and Dessler. The first two are flawed for a variety of reasons. Dessler’s estimates of a positive cloud feedback that would support a moderately high sensitivity appear more plausible, but none of these three studies based on short term changes originating in the ocean and imposed on an unwarmed atmosphere is legitimately extrapolable to long term changes due to warming originating in the atmosphere and imposed on a previously unwarmed ocean. In essence, it is reasonable to conclude with certainty that CO2 exerts substantial warming effects, and with reasonably high probability that these are magnified rather than diminished by feedbacks.
What are the consequences of these basic geophysical phenomena in relationship to natural climate variation? Probably the best way to express this is to state that natural variations are significant phenomena that need to be characterized and quantified, and whose effects are likely to be substantial. However, their role must be seen as an addition to rather than a substitute for the role of CO2 and other anthropogenic influences in accounting for observed climate change. Appropriate emphasis on natural variation cannot cause the anthropogenic influences to disappear or dwindle into insignificance.
Unlimited pomposity struggling with a deficit of imagination. Such as “while evaluation based on multidecadal climate change supports the estimates within the canonical range.” Speak English instead of a garbled faux academia.
I have laughed especially at the ‘canonical range’ elsewhere.
Models are themselves chaotic of course – using the Navier-Stokes equations that Lorenz used in his early convection model. It goes beyond the point of ludicrousness that the arbitrarily chosen best run from diverse models is sent to the IPCC, where the runs are assembled into a graph and the mean calculated – it cracks me up every time. Modelers have themselves suggested an alternative, probabilistic, approach.
We accept – well most of us – the very basic CO2 physics. In a chaotic universe this produces a risk of abrupt climate change. I have just got the latest figures from Tim Palmer’s Lorenzian Meteorological Office – there is a 0.00000000000000000000000001% chance of an ice age in the next year. They advise putting off the igloo build.
“Models are themselves chaotic of course “
That’s not correct, or at least it’s a serious exaggeration. Models themselves are mainly a collection of differential equations. Model projections may be characterized by a large or small contribution from chaotic elements depending on timescale. Weather predictions are subject to severe chaotic uncertainty, and are very susceptible to initial conditions. Conversely, GCMs projecting multidecadal trends are little affected by initial conditions. When initiated under a multitude of different conditions (typical for model ensembles), they tend to converge over time to very similar outcomes, and this property has been validated, at least for global temperature anomalies, by the good match between model outputs and observations.
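The distinction being claimed here – individual trajectories diverge, but long-run statistics from many initial conditions converge – can be illustrated with a toy sketch. This is again the Lorenz-63 system, not a GCM: starting from widely scattered initial states, the long-run ("climatological") mean of one coordinate comes out nearly the same for every member, even though the trajectories themselves are completely different.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system."""
    x, y, z = state
    return np.array([x + dt * sigma * (y - x),
                     y + dt * (x * (rho - z) - y),
                     z + dt * (x * y - beta * z)])

def long_run_mean_z(ic, n_steps=20000, burn_in=2000):
    """Time-mean of z after discarding an initial transient."""
    state = np.asarray(ic, dtype=float)
    zs = []
    for i in range(n_steps):
        state = lorenz_step(state)
        if i >= burn_in:
            zs.append(state[2])
    return float(np.mean(zs))

# Five ensemble members from very different initial conditions.
rng = np.random.default_rng(42)
ics = rng.uniform(-10, 10, size=(5, 3))
means = [long_run_mean_z(ic) for ic in ics]
# The long-run means cluster tightly despite very different start points.
print(means)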
I’m not arguing against the significance of unpredicted climate variation – it’s important. My main point is that even in the face of uncertainty, the strong role of temperature responses to CO2 and other greenhouse gases persists. Claims made by some that all of the past century’s climate warming might be due to natural variations can be shown to be false, based on the evidence. Obviously, that evidence can’t be distilled into the contents of a single thread like this one, but if readers visit the sources I’ve mentioned and the literature in general, they can judge for themselves, and I expect they will reach reasonable conclusions.
the reply is about 4 comments below
Fred:
You say “(natural variations) must be seen as an addition to rather than a substitute for the role of CO2 and other anthropogenic influences in accounting for observed climate change.” This puts the cart before the horse. Natural variations have been around far longer than anthropogenic influences, and they are responsible for shaping the present climate. Anthropogenic influences are therefore best seen as a recent perturbation superimposed on a continuum of natural variability. And in this case adding or subtracting anthropogenic and natural effects as if they were entries on a balance sheet will not necessarily give the correct answer.
The two main subjects of this thread are nonlinearities and critical thresholds, or “tipping points”. Your above post makes it clear that mainstream AGW theory does not allow for either. This is an omission that needs rectifying. Ocean-cycle tipping points had a significant impact on climate in the 20th century, with the best examples being the abrupt positive-to-negative change in the PDO that began in 1940 and the equally abrupt change in the opposite direction that began in 1975. But we still don’t know what caused the PDO to tip over on these two occasions, and the changes certainly can’t be replicated by climate models that consider only linear forcings.
I partly agree, but not entirely. Tipping points are very much a concern for climate science, although none has been realized yet (20th century changes were not tipping points in the usual sense of the term) – Tipping Elements in the Climate System . Still, you’re right that we don’t know how to model them very well if at all, which is probably all the more reason for concern, since they involve abrupt climate shifts that exacerbate the more gradual effects of the warming trend. Examples include massive methane release, an Amazon forest dieback, an abrupt loss of Antarctic or Greenland ice, and so on.
Regarding the PDO, it does indeed remain an unpredictable phenomenon, but you may recall my earlier citation in the “probabilistic” thread of the Meehl paper suggesting that it may be partially driven by anthropogenic forcings imposed on an underlying natural variation.
Without doubt, natural variations will continue to play out in our current climate as they have in the past. So will a continued emission of greenhouse gases. We have already poured more than one trillion tons of CO2 into the air, and that, in a sense, has become part of “nature”. Like all other natural phenomena, it will have consequences.
Fred:
According to the Lenton et al. paper you cite above, a “tipping element” describes “subsystems of the Earth system that are at least subcontinental in scale and can be switched—under certain circumstances—into a qualitatively different state by small perturbations.” The 20th-century switches in the PDO clearly fit this definition.
The broader question, however, is whether Lenton et al. even qualifies as a scientific study. The authors acknowledge that it was politically-motivated (“increasing political demand to define and justify binding temperature targets …. makes it timely to review potential tipping elements in the climate system under anthropogenic forcing”) and they leave us in no doubt as to their personal convictions by making reference to such things as “the world we are leaving for our grandchildren” and to “tipping elements” that might trigger “a rapid societal transition toward sustainability”. Their tipping-point predictions are also based on models that they admit have no skill in hindcasting tipping-point events. It’s hard to escape the conclusion that Lenton et al. isn’t science at all, but advocacy.
But the following specific statement in Lenton did catch my attention – “… greening of the Sahara/Sahel is a rare example of a beneficial potential tipping element.” You also mention “abrupt climate shifts that exacerbate the more gradual effects of the warming trend.” My question here is, what is the basis for the assumption that abrupt climate shifts will always make things worse and never better?
Roger – I don’t assume that abrupt changes will always make things worse. It appears from the work of Rial et al and others that many of the abrupt changes (at least during glaciations) involve rapid warming followed by more gradual cooling. In a climate that is already warming, that might make things worse, but certainly, rapid cooling is possible and might mitigate the consequences of a warming trend (I’ll forego arguments about whether civilization will benefit from continued warming and assume that over the long run, most parts of the world will do worse rather than better).
“Tipping points”, however, as the term is conventionally used in the literature, refer to abrupt transitions triggered by an underlying anthropogenic trend. In other words, if CO2 continues to rise, and temperature follows, will a smooth rise be interrupted by a much steeper rise (e.g., due to methane release) than might have been expected from the CO2/temperature relationship alone? The same principle might be applied to a cooling trend – would it trigger an even more abrupt cooling – or for that matter, non-anthropogenic as well as anthropogenic forcings. The reason for concern currently is that the underlying trend direction is a warming one, and so any sudden steepening of the curve would magnify the consequences of warming. I’m not aware of physical principles or paleoclimatologic evidence suggesting that a temperature trend in one direction tends to trigger an abrupt shift in the other direction. Such compensations do occur, but tend to consume many thousands of years at a minimum (e.g. CO2 increases resulting from an ice-covered planet incapable of absorbing CO2 into the oceans).
Fred:
I was going to write something entirely different, but then a sudden thought occurred to me. Here we are discussing whether further warming will or will not cause some climatic variable to exceed a potentially dangerous tipping point in the 21st century and – I think – agreeing that we don’t really know. But the policymakers do. They have established this tipping point at 2C above pre-industrial levels, or at about 1C above where we are now. According to the Copenhagen Accord this is the “scientific view”.
I was wondering if you knew of any studies that offer any support for this “scientific view”. The papers I reviewed a year or two ago didn’t, but maybe I missed something.
Roger – Without reviewing all the quantitation that led to the 2C figure, I would say that it has little to do with any one specific tipping point, although the probability of reaching a tipping point among the many potential ones would indeed increase as temperature rises. It has been estimated that even the current warming has had adverse effects in some regions (and beneficial effects elsewhere). The balance between harm and benefit will continue to shift toward harm as temperature increases further, but to some extent, a specific threshold is arbitrary. Rather, the 2C figure is simply an attempt to reconcile the reality that climate will continue to warm toward that point with the evidence indicating that it would avert considerable harm if the increase did not proceed even further.
Google will lead you to many papers on dangerous anthropogenic interference with the climate system. Here is one reference of interest –
Dangerous Anthropogenic Interference
Fred
Thank you. I had seen the Ramanathan and Feng paper. It’s based on tipping point estimates taken from Lenton et al. It also notes that the 2C threshold was adopted over 20 years ago (R&F reference 1) and hasn’t changed since – a fact I had forgotten.
As you mention there are many papers on dangerous anthropogenic interference, but I have still to find one that isn’t based on crystal-ball gazing.
I think we can dispense with the notion that the 2C threshold has anything to do with science.
Roger Andrews –> “I’ve tried to explain it myself using natural forcings only and found that I could in fact do so, but only by cherry-picking my input data. I further agree that this indicates a significant anthropogenic impact…”
I think we all are well beyond that kind of summary of reality. Judith Curry and all of the global warming ‘heretics’ have learned that the IPCC’s claims, and the mumbo jumbo of AGW True Believers groupthink, and the EPA government science authoritarians’ statement that, “Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic GHG [greenhouse gas] concentrations,” IS NOT SUPPORTED.
“The fact is that the ‘null hypothesis’ of global warming has never been rejected: That natural climate variability can explain everything we see in the climate system.” ~Dr. Roy Spencer
The concept that a global average temperature can have any meaning in the real world, which some academics continue to defend, is a complete fraud. MBH98/99/08 (aka, the ‘hockey stick’) is a predatory anti-humanism that is only practiced now by witchdoctors of voodoo climatology that, according to the boffins of Japan, is equivalent to the study of ancient astrology.
Wagathon:
You’re right. I did write that. However, you failed to include the two succeeding sentences, which said: “But I wouldn’t go so far as to say that it conclusively demonstrates a significant impact. I don’t think we know enough about how the earth’s climate works to demonstrate anything conclusively at this point.”
That is a really great read. Thanks for the link. The parts of the analysis including Figure 3a caused me to wonder about the impact of the cyclic collapse of glacial lakes. Being in Washington State, Lake Missoula comes to mind as an example of a recurring glacial lake that results in catastrophic and sudden change in the local geography. I’m curious enough to look for some correlation between short term climate signatures in the climate record and the cyclic collapse of the larger glacial lakes.
Fred says things that sound plausible but have in fact no basis in the scientific method. Post modern as opposed to post normal science.
The differential equations are in fact the partial differential Navier-Stokes equations. These are the same equations used by Edward Lorenz in his 1960s convection model – to discover chaos theory. The equations in three dimensions are sufficiently complex to show chaotic bifurcation as a result of sensitive dependence on initial conditions. The models also have something called structural instability. Small changes in boundary conditions can also initiate chaotic bifurcation.
Irreducible imprecision in atmospheric and oceanic simulations
James C. McWilliams
Department of Atmospheric and Oceanic Sciences and Institute of Geophysics and Planetary Physics, University of California, Los Angeles, CA 90095-1565
‘Sensitive dependence and structural instability are humbling twin properties for chaotic dynamical systems, indicating limits about which kinds of questions are theoretically answerable. They echo other famous limitations on scientist’s expectations, namely the undecidability of some propositions within axiomatic mathematical systems (Gödel’s theorem) and the uncomputability of some algorithms due to excessive size of the calculation.’
And tuning is not convergence.
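Structural instability in the McWilliams sense – sensitivity to the equations and parameters themselves, not just to initial conditions – can also be caricatured with the Lorenz-63 system. Here nothing changes except one control parameter, yet the long-run behavior is qualitatively different: below the Hopf threshold (about rho = 24.74 for the standard sigma and beta) the system settles onto a steady state; above it, it wanders on the strange attractor indefinitely.

```python
import numpy as np

def simulate(rho, n_steps=20000, dt=0.005, sigma=10.0, beta=8.0 / 3.0):
    """Forward-Euler integration of Lorenz-63; returns the x time series."""
    x, y, z = 1.0, 1.0, 1.0
    xs = np.empty(n_steps)
    for i in range(n_steps):
        x, y, z = (x + dt * sigma * (y - x),
                   y + dt * (x * (rho - z) - y),
                   z + dt * (x * y - beta * z))
        xs[i] = x
    return xs

late = slice(15000, None)  # look only at the last quarter of the run
quiet = simulate(rho=15.0)    # below the Hopf threshold: settles down
chaotic = simulate(rho=28.0)  # above it: stays on the strange attractor
print(np.std(quiet[late]), np.std(chaotic[late]))
```

Same equations, same initial state, one parameter nudged: the late-time variability is nearly zero in one regime and large in the other, which is the kind of qualitative regime dependence the McWilliams quote is pointing at.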
I never know which data Fred is referring to – certainly not the satellite data – http://www.earthandocean.robertellison.com.au/ – judge for yourself
Fred simply has no conception of chaos theory – he should attempt some understanding – http://www.nap.edu/openbook.php?isbn=0309074347 before sharing an uneducated comment.
Again, you are incorrect. Models converge to common outputs over time that match observations well – without any “tuning”. GCMs are tuned to match initial climates, but once forced with CO2 increases or other inputs, they are not retuned to make the outputs “come out right”.
Part of the problem is in referring to climate as chaotic, when a more accurate characterization would be that the climate system harbors chaotic elements that are more or less important depending on timescale. The data speak for themselves in terms of model projections of multidecadal temperature change as matched with observations.
Again, your comment is unbelievably convoluted and does not address the issue that you raised first. The fallacy of shifting ground. Your original comment said that I was incorrect in saying that climate models were chaotic. Instead of acknowledging the paper I supplied by James McWilliams and the obvious connection of models to the very roots of chaos theory – you claim that I am wrong about something else.
The models are tuned, as you say, to match 20th century temps. And this makes a nonsense of your claim that matching models with reality means anything at all – they are already tuned. But in the first real test over the last 10 years it all falls over horribly because of chaotic climate shifts. Tell me again about the ‘canonical range’ (laughs evilly). You make things up, and anyone who doesn’t know might take you seriously.
After that there are many plausible differences in boundary and initial conditions and many different outcomes possible. Irreducible imprecision as in the PNAS manuscript by McWilliams. They take one possible run and send it off to the IPCC.
Here is another paper at random – http://www.nws.noaa.gov/ost/climate/STIP/FY09CTBSeminars/shukla_021009.pdf Although really, I despair that I am adding to your education at all.
The scientific proof that climate is a complex and dynamic system – and that this has effects in the modern era – is relatively new. I date it from a 2007 study by Professor Anastasios Tsonis and colleagues: ‘A new dynamical mechanism for major climate shifts’. A numerical network model was constructed for the study from 4 observed ocean and climate indices – the El Niño Southern Oscillation (ENSO), the Pacific Decadal Oscillation (PDO), the North Atlantic Oscillation (NAO) and the Pacific Northwest Anomaly (PNA) – thus capturing most of the major modes of climate variability in the period 1900–2000. This network synchronized around 1909, the mid 1940’s and 1976/77 – after which the climate state shifted. A later study (Swanson and Tsonis 2009) found a similar shift in 1998/2001. They found that where a ‘synchronous state was followed by a steady increase in the coupling strength between the indices, the synchronous state was destroyed, after which a new climate state emerged. These shifts are associated with significant changes in global temperature trend and in ENSO variability.’ Amongst the implications of these studies is that these indices of climate are not independent but interact in a shifting climate. There are tremendous energies cascading through powerful systems – so yes the system is all of one piece and chaotic to the core. So no it is not a little bit chaotic.
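Tsonis and colleagues detect synchronization with a network/coupling-strength measure built from the actual ENSO, PDO, NAO and PNA indices; the following is only a loose toy analogue on synthetic data. A sliding-window mean pairwise correlation among several noisy series rises sharply during the interval where the series share a common signal, which is the flavor of signal their method looks for in the observed indices.

```python
import numpy as np

def synchronization(series, window=120):
    """Mean pairwise correlation of index series in a sliding window.

    `series` is an (n_indices, n_samples) array. Values near 1 mark
    intervals where the indices move together - a crude stand-in for
    the network synchronization measure of Tsonis et al.
    """
    n, t = series.shape
    out = np.full(t - window, np.nan)
    for start in range(t - window):
        c = np.corrcoef(series[:, start:start + window])
        out[start] = (c.sum() - n) / (n * (n - 1))  # mean off-diagonal entry
    return out

# Synthetic example: four noisy "indices" that share a common signal
# only during the middle third of the record.
rng = np.random.default_rng(1)
t = 1200
common = np.sin(np.linspace(0, 120, t))
mask = np.zeros(t)
mask[400:800] = 1.0
series = np.stack([mask * common + 0.5 * rng.standard_normal(t)
                   for _ in range(4)])
sync = synchronization(series)
```

The synchronization curve sits near zero outside the coupled interval and rises well above it inside, so a changepoint in this statistic flags when the indices have locked together.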
These are paradigm changing studies because now we know how it works in the physical indices of climate variability we have monitored for up to 100 years.
I think you show bad faith and are slyly mischievous in your argumentative stratagem. You’re not my first wife are you? If not – can you go away now – I’m a bit tired.
That’s again incorrect. Models are not tuned to match 20th century temperatures. They are tuned only to the conditions at the start of the interval they are evaluating, but once the input is fed in (e.g., a rising CO2 concentration), the models are left to yield whatever output they may, and the modeler can’t go back and correct it to make the temperatures come out the way he or she wants.
Rather than for me to address all your other statements, Robert, I think it’s probably better, considering the repetition involved, for interested readers to review what we each wrote to make their own judgments.
What you say is true, but only 1/1,000,000 of the truth, since there will be literally millions of models which are all tuned slightly differently. Then the modelers pick their favourite poster boy, which matches best with the observations.
Correct me if I am wrong.
I’ll ask you to demonstrate that millions of models exist, but that only those that produce the desired outcome are “picked”. Developing and implementing a GCM is an arduous, expensive, and time-consuming task. Those that can be implemented are added to the dataset, rather than discarded because they fail to yield hoped-for results. If the latter were true, model performances would be far better than they actually are. They do fairly well for long-term global temperature, but worse for short term or regional trends, or for ENSO, volcanoes, etc.
At least in some simpler models I’ve seen, there are “knobs” which are first randomized (each randomized setting = a model); then each of those models gets a run and the modeler compares the different outputs.
No Fred…any simple google search will reveal more than you need to know about tuning variables.
http://pielkeclimatesci.wordpress.com/2008/11/28/real-climate-misunderstanding-of-climate-models/
Shall I refer you to the IPCC – http://www.ipcc.ch/publications_and_data/ar4/wg1/en/ch8s8-1-3.html
The models are far from physically realistic systems that evolve in the same way that climate does. Such deliberate misinformation will not pass unremarked.
‘Global climate model simulations of the 20th century are usually compared in terms of their ability to reproduce the 20th century temperature record. This is now almost an established test for global climate models. One curious aspect of this result is that it is also well known that the same models that agree in simulating the 20th century temperature record differ significantly in their climate sensitivity. The question therefore remains: If climate models differ in their climate sensitivity, how can they all simulate the global temperature record with a reasonable degree of accuracy?
The answer to this question is discussed by Kiehl (2007). While there exist established data sets for the 20th century evolution of well-mixed greenhouse gases, this is not the case for ozone, aerosols or different natural forcing factors. The only way that the different models (with respect to their sensitivity to changes in greenhouse gasses) all can reproduce the 20th century temperature record is by assuming different 20th century data series for the unknown factors. In essence, the unknown factors in the 20th century used to drive the IPCC climate simulations were chosen to fit the observed temperature trend. This is a classical example of curve fitting or tuning.’
Professor Ole Humlum – http://www.climate4you.com/ClimateModels.htm
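The compensation described in the Kiehl (2007) passage above can be made concrete with toy numbers. The forcing and warming figures below are rough illustrative values chosen for the sketch, not numbers from Kiehl’s paper: two models with very different sensitivities each “reproduce” the same observed warming by assuming whichever aerosol forcing closes the gap.

```python
# Toy numbers (illustrative only): observed 20th-century warming and a
# roughly known well-mixed greenhouse-gas forcing.
observed_warming = 0.7   # C
ghg_forcing = 2.6        # W/m^2

# Two "models" with very different sensitivity parameters (C per W/m^2).
for lam in (0.4, 0.8):
    # Each can match observations by assuming whatever aerosol forcing
    # closes the gap:  F_aer = dT/lambda - F_ghg
    aerosol_forcing = observed_warming / lam - ghg_forcing
    warming = lam * (ghg_forcing + aerosol_forcing)
    print(f"lambda={lam}: assumed aerosol forcing={aerosol_forcing:+.2f} "
          f"W/m^2 -> simulated warming={warming:.2f} C")
```

Both loop iterations print 0.70 C of simulated warming, by construction, despite a factor-of-two difference in sensitivity; that is the curve-fitting degree of freedom the quote is pointing at.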
I keep referencing authoritative sources – you keep talking through your hat I’m afraid. The only actual reference provided was to ‘tipping points’ – things that might change in future as a result of global warming. The climate shifts reported by Professor Tsonis – with effects on global temperature – happened 4 times in the 20th century. Again an authoritative peer reviewed source which you pass over in complete silence.
Now I might be an arrogant prick – but I was unfailingly patient and polite until you were rudely dismissive and said you didn’t want to engage with me.
Now you are doing it again. Are people really following this discussion? For their sake I hope not.
I’ll respond, but the point was made earlier in my responses to you and juakola. Models are tuned to match the behavior of a climate before it is subjected to any perturbations such as a rise in CO2. They must simulate seasons, latitudinal differences, wind and ocean currents, etc. In the absence of a perturbation, they will then tend to oscillate among these states without exiting from them. Some of this tuning is described in the links you cite.
They are then “forced” (subjected to a perturbation) – with CO2, solar changes, etc., and allowed to run. Whatever the outcome, they are not retuned to make the outcome match real world observations. If they perform well, fine. If not, there is nothing the modeler can do about it. However, even the poorer performances are averaged together with results from better performing models to indicate how models address a particular perturbation.
Over the course of years, new models are developed to better address the multiple factors affecting climate, based on more accurate observational input data. These may perform better than older models, but the results of the older models remain available for judgment. They have done well with some perturbations, such as the long term temperature response to CO2. An example is Hansen’s 1984 model, which simulated subsequent observations fairly well, yielding a temperature curve that was somewhat but not egregiously higher than observed. The input into that model yielded a climate sensitivity value of 4.2 C per CO2 doubling, whereas if the currently accepted most probable value of about 3 C per doubling had been used, the model would have simulated observations almost perfectly.
Other perturbations such as short term and regional changes in response to forcings or natural variability are handled poorly.
The confusion between tuning a model and retuning it to make it match observations is common, but the latter is not part of GCM performance.
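The Hansen-model point above amounts to a simple linear rescaling. Assuming simulated warming scales roughly in proportion to equilibrium climate sensitivity (a crude approximation), and using a purely hypothetical simulated trend of 0.8 C for illustration:

```python
# Back-of-envelope version of the argument above. The 0.8 C trend is a
# hypothetical placeholder, not Hansen's actual model output.
model_ecs = 4.2        # C per CO2 doubling used in the mid-1980s model
preferred_ecs = 3.0    # currently favoured most-probable value
model_trend = 0.8      # C, hypothetical simulated warming over the period

# If warming scales ~linearly with sensitivity, rescale the trend:
rescaled_trend = model_trend * (preferred_ecs / model_ecs)
print(f"model trend {model_trend} C -> rescaled {rescaled_trend:.2f} C "
      f"(factor {preferred_ecs / model_ecs:.2f})")
```

The rescaled trend is about 71% of the original, which is the sense in which a 4.2 C/doubling model would run “somewhat but not egregiously” warm relative to a 3 C/doubling world.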
Fred
This is where I get confused about the models.
You say: “The input into that model yielded a climate sensitivity value of 4.2 C per CO2 doubling, whereas if the currently accepted most probable value of about 3 C per doubling had been used, the model would have simulated observations almost perfectly.”
In that statement, the climate sensitivity is both an input and seemingly an output. How can that be? I understood that it was the models that yielded the climate sensitivity, but if they require an assumption about the climate sensitivity as an input… Would you mind explaining? Thanks.
Perhaps a climate modeler will provide a response here, but as I understand it, climate sensitivity is only an emergent property (an output) of GCMs used in the manner I described (for their use in paleoclimatology, see AR4 WG1, chapter 9). However, the input values and results calculated from them (temperature, MODTRAN-derived radiative transfer codes, CO2 concentrations, water vapor changes, etc.), yield outputs (temperature change as a function of CO2) that can be translated into a climate sensitivity value. I believe that both improved parametrization of radiative transfer and better observational data on factors affecting feedback (e.g., relative humidity), among other improvements, have caused newer models to produce results equivalent to a lower climate sensitivity than in Hansen et al’s model.
‘I’ll respond, but the point was made earlier in my responses to you and juakola. Models are tuned to match the behavior of a climate before it is subjected to any perturbations such as a rise in CO2. ‘
OMG – these models are tuned to match climate observations before any climate processes are included in the model. Magic.
It might be useful to link to Ole Humlum again for the benefit of everyone but Fred – http://www.climate4you.com/ClimateModels.htm
‘Global climate model simulations of the 20th century are usually compared in terms of their ability to reproduce the 20th century temperature record. This is now almost an established test for global climate models. One curious aspect of this result is that it is also well known that the same models that agree in simulating the 20th century temperature record differ significantly in their climate sensitivity. The question therefore remains: If climate models differ in their climate sensitivity, how can they all simulate the global temperature record with a reasonable degree of accuracy? ‘
You could read on if inclined at Professor Humlum’s site.
See my above comment, Robert. I think readers would be able to get a good sense of how models are developed and used by reviewing what each of us has been saying.
In addition, the climate sensitivity thread is also informative in understanding model performance, and so I won’t repeat here the discussion of how models that are parametrized differently can arrive at similarly accurate projections of temperature trends – it’s related to the fact that models using higher than average responses to variables do this for both positive and negative influences.
I know they all arrive at, and are tuned to, 20th century temps by different paths as in the quote from Professor Ole Humlum.
Professor Ole Humlum – http://www.climate4you.com/ClimateModels.htm
But this discussion started with my obvious claim that models were chaotic systems in their own right. So tuning of intrinsically chaotic systems is somewhat of a distraction – something in which you specialise. I linked to the James McWilliams paper in PNAS – Irreducible Imprecision in Atmospheric and Oceanic Simulations. I mentioned the Tim Palmer book – Predicting Weather and Climate – from which I purloined the idea of the Lorenzian Meteorological Office. I discussed the partial differential equations used in GCMs and in the early convection model used by Edward Lorenz to discover chaos theory.
There is no doubt at all that GCMs are intrinsically chaotic, and negligible doubt that the real world is chaotic as well.
You seem immune to these ideas – which are the very essence of this thread.
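For readers who want to see the sensitive dependence on initial conditions that Lorenz found in his convection model, here is a minimal sketch integrating the Lorenz (1963) equations with a simple forward-Euler step. The step size and run length are illustrative choices; two runs differing by one part in 10^8 end up far apart on the attractor.

```python
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz (1963) convection equations."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-8, 1.0, 1.0)   # perturb one coordinate by 1e-8

for _ in range(5000):        # 50 model time units of crude Euler stepping
    a, b = lorenz_step(a), lorenz_step(b)

# Euclidean distance between the two trajectories
separation = sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
print(f"separation after 5000 steps: {separation:.3f}")
```

The tiny initial difference grows until it saturates at the size of the attractor itself – the point of the Lorenzian Meteorological Office thought experiment: beyond some horizon, the trajectories are unrelated.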
Fred Moolten did mention ice/albedo in two of his posts. Albedo is, I believe, a major factor in climate temperature regulation, yet it almost never gets mentioned in the various threads. There was a record Low Sea Ice Extent last year and coming into this year. The result was record Arctic Ocean Effect Snow and Cold around the northern latitudes, in the face of rising CO2.

Dr Curry, I believe that Albedo should have a thread of its own. When the Arctic Ocean is frozen, water is not exposed and it does not snow much. Ice retreats, albedo decreases, and the earth warms. That melts Arctic Ice and exposes Arctic water; Arctic Ocean Effect Snow then increases albedo, and that cools the earth. This is a strong stabilizing force, and a major factor that helped stabilize the temperature for the past ten thousand years. Before ten thousand years ago, this same factor stabilized the temperature in a wider range, allowing more Arctic Thaw with more major Arctic Ocean Effect Snow that put earth into one hundred thousand year ice ages. Dr. Curry, you talk about being open to different ideas. Please open up a thread and let us kick this around.
Look at the work of Judah Cohen and see the evidence that Thawed Arctic Water due to warming oceans did cause the Snow and Cold that caught NOAA and most Climate Scientists off guard.
http://www.nsf.gov/news/special_reports/autumnwinter/model.jsp
I do have a Theory of my own, based on the work by Maurice Ewing and William Donn, that I would like for you to consider using to start a thread on this topic. It does have about 12 links. I would like to email it to you for your comments and consideration, but I cannot find your email address.
Chief Hydrologist | February 5, 2011 at 1:32 pm | Reply
“The question therefore remains: If climate models differ in their climate sensitivity, how can they all simulate the global temperature record with a reasonable degree of accuracy? “
This is a fundamental problem, and one which, if it has been addressed in any fashion, I am unaware of: are they observable?
The basic problem is, do you have enough observations to determine the parameters of your model uniquely? Because, if your system is not observable given your ensemble of measurements, there are an infinite number of parameterizations which will give you precisely the same results.
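The observability point can be made concrete with a deliberately trivial toy model: if the measured output depends on two parameters only through their product, then no number of observations can distinguish parameter pairs with the same product. The functions and numbers below are invented for the sketch.

```python
def observe(a, b, times):
    """Observable output depends on the parameters only through a*b."""
    return [a * b * t for t in times]

times = list(range(10))
run1 = observe(2.0, 3.0, times)   # a=2.0, b=3.0
run2 = observe(1.5, 4.0, times)   # a=1.5, b=4.0 -- same product, 6.0

# Identical outputs from distinct parameterizations: the parameters are
# not individually identifiable from these measurements, however many
# time points we collect.
assert run1 == run2
print("two distinct parameterizations, identical observations:", run1[:4])
```

In this situation there is a whole curve of parameter pairs (every `a*b == 6.0`) that fits the data exactly, which is the “infinite number of parameterizations” problem in miniature.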