by Judith Curry
On possibilities, known neglecteds, and the vicious positive feedback loop between scientific assessment and policy making that has created a climate Frankenstein.
I have prepared a new talk that I presented yesterday at Rand Corp. My contact at Rand is Rob Lempert, of deepuncertainty.org fame. Very nice visit and interesting discussion.
My complete presentation can be downloaded [Rand uncertainty]. This post focuses on the new material.
Scientists are saying the 1.5 degree climate report pulled punches, downplaying real risks facing humanity in the next few decades, including feedback loops that could cause ‘chaos’ beyond human control.
To my mind, if the scientists really wanted to communicate the risk from future climate change, they should at least articulate the worst possible case (heck, was anyone scared by that 4″ of extra sea level rise?). Emphasis on POSSIBLE. The possible worst case puts upper bounds on what could happen, based upon our current background knowledge. The exercise of trying to articulate the worst case illuminates many things about our understanding (or lack thereof) and the uncertainties. A side effect of such an exercise would be to lop off the ‘fat tails’ that economists/statisticians are so fond of manufacturing. And finally the worst case does have a role in policy making (but NOT as the expected case).
My recent paper Climate uncertainty and risk assessed the epistemic status of climate models, and described their role in generating possible future scenarios. I introduced the possibilistic approach to scenario generation, including the value of scientific speculation on policy-relevant aspects of plausible, high-impact scenarios, even though we can neither model them realistically nor provide a precise estimate of their probability.
How are we to evaluate whether a scenario is possible or impossible? A series of papers by Gregor Betz provides some insights, below is my take on how to approach this for future climate scenarios based upon my reading of Betz and other philosophers working on this problem.
I categorize climate models here as (un)verified possibilities; there is a debate in the philosophy of science literature on this topic. The argument is that some climate models may be regarded as producing verified possibilities for some variables (e.g. temperature).
Maybe I’ll accept that a few models produce useful temperature forecasts, provided that they also produce accurate ocean oscillations when initialized. But that is about as far as I would go towards claiming that climate model simulations are ‘verified’.
An interesting aside regarding the ‘tribes’ in the climate debate, in context of possibility verification:
- Lukewarmers: focus on the verified possibilities
- Consensus/IPCC types: focus on the unverified possibilities generated by climate models.
- Alarmists: focus on impossible and/or borderline impossible scenarios, treating them as ‘expected’ scenarios or as justifying precautionary avoidance of emitting CO2.
This diagram provides a visual that distinguishes the various classes of possibilities, including the impossible and irrelevant. While verified possibilities have higher epistemic status than the unverified possibilities, all of these possibilities are potentially important for decision makers.
The orange triangle illustrates a specific vulnerability assessment, whereby only a fraction of the scenarios are relevant to the decision at hand, and the most relevant ones are unverified possibilities and even borderline impossible ones. Clarifying what is impossible versus what is not is important to decision makers, and the classification provides important information about uncertainty.
Let’s apply these ideas to interpreting the various estimates of equilibrium climate sensitivity. The AR5 likely range is 1.5 to 4.5 C, which hasn’t really budged since the 1979 Charney report. The most significant statement in the AR5 is included in a footnote in the SPM: “No best estimate for equilibrium climate sensitivity can now be given because of lack of agreement on values across assessed lines of evidence and studies.”
The big disagreement is between the CMIP5 model range (values between 2.1 and 4.7 C) and the historical observations using an energy balance model. While Lewis and Curry (2015) was not included in the AR5, it provides the most objective comparison of this approach with the CMIP5 models since it used the same forcing and time period.
The Lewis/Curry estimates are arguably corroborated possibilities, since they are based directly on historical observational data, linked together by a simple energy balance model. It has been argued that LC underestimate values on the high end, and neglect the very slow feedbacks. True, but the same holds for the CMIP5 models, so this remains a valid comparison.
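The energy-balance approach behind Lewis/Curry can be sketched in a few lines: ECS is estimated as the forcing from doubled CO2, scaled by the observed temperature change relative to the change in forcing minus the change in the planetary heat uptake. The numbers below are illustrative round values of my own choosing, not the paper's exact inputs.

```python
# Minimal sketch of an energy-balance ECS estimate (the Lewis & Curry style
# approach): ECS = F_2x * dT / (dF - dN). Input values are illustrative
# assumptions, not the published numbers.

F_2X = 3.7   # W/m^2, forcing from a doubling of CO2 (assumed round value)
dT = 0.8     # C, change in global mean surface temperature between base and final periods
dF = 2.3     # W/m^2, change in total radiative forcing
dN = 0.5     # W/m^2, change in top-of-atmosphere imbalance (mostly ocean heat uptake)

ecs = F_2X * dT / (dF - dN)
print(f"ECS estimate: {ecs:.2f} C")  # roughly 1.6 C with these inputs
```

With plausible observational inputs this yields a best estimate in the 1.5 to 1.7 C neighborhood, which is why the observational estimates sit well below the CMIP5 model range.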
Where to set the borderline impossible range? The IPCC AR5 put a 90% limit at 6 C. None of the ECS values cited in the AR5 extend much beyond 6 C, although in the AR4 many long tails were cited, apparently extending beyond 10 C. Hence in my diagram I put a range of 6-10 C as borderline impossible based on information from the AR4/AR5.
Now for JC’s perspective. We have an anchor on the lower bound — the no-feedback climate sensitivity, which is nominally ~1 C (sorry, skydragons). The latest Lewis/Curry values are reported here over the very likely range (5-95%). I regard this as our current best estimate of observationally based ECS values, and regard these as corroborated possibilities.
I accept the possibility that Lewis/Curry is too low on the upper range, and agree that it could be as high as 3.5 C. And I’ll even bow to peer/consensus pressure and put an upper limit of the very likely range at 4.5 C. I think values of 6-10 C are impossible, and I would personally define the borderline impossible region as 4.5 – 6 C. Yes we can disagree on this one, and I would like to see lots more consideration of this upper bound issue. But the defenders of the high ECS values are more focused on trying to convince that ECS can’t be below 2 C.
But can we shake hands and agree that values above 10 C are impossible?
Now consider the perspective of economists on equilibrium climate sensitivity. The IPCC AR5 WGIII report based all of its calculations on the assumption that ECS = 3 C, based on the IPCC AR4 WGI Report. Seems like the AR5 WGI folks forgot to give WGIII the memo that there was no longer a preferred ECS value.
Subsequent to the AR5 Report, economists became more sophisticated and began using the ensemble of CMIP5 simulations. One problem is that the CMIP5 models don’t cover the bottom 30% of the IPCC AR5 likely range for ECS.
The situation didn’t get really bad until economists started creating PDFs of ECS. Based on the AR4 assessment, the US Interagency Working Group on the Social Cost of Carbon fitted a distribution that had 5% of the values greater than 7.16 C. Weitzman (2008) fitted a distribution with 0.05% of values > 11 C, and 0.01% > 20 C. While these probabilities seem small, they happen to dominate the calculation of the social cost of carbon (low probability, high impact events). [see Worst case scenario versus fat tail]. These large values of ECS (nominally beyond 6 C and certainly beyond 10 C) are arguably impossible based upon our background knowledge.
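A toy calculation makes the tail-dominance point concrete. This is not Weitzman's model; the lognormal ECS distribution and the convex damage function below are assumptions chosen purely for illustration of how a few percent of probability mass in the tail can carry most of the expected damage.

```python
# Toy illustration (NOT Weitzman's actual model) of a fat upper tail of an
# assumed ECS distribution dominating an expected-damage calculation when
# damages are strongly convex in warming.
import numpy as np

rng = np.random.default_rng(0)
ecs = rng.lognormal(mean=np.log(3.0), sigma=0.5, size=1_000_000)  # median 3 C (assumed)

damage = ecs ** 4        # stand-in convex damage function (assumed)
tail = ecs > 6.0         # the region the post argues is borderline impossible

p_tail = tail.mean()                              # probability mass above 6 C
damage_share = damage[tail].sum() / damage.sum()  # share of expected damage from that tail

print(f"P(ECS > 6 C):           {p_tail:.1%}")
print(f"Damage share from tail: {damage_share:.1%}")
```

With these assumptions, well under a tenth of the probability mass contributes well over half of the expected damage, which is exactly why lopping off (or retaining) the tail swings the social cost of carbon so dramatically.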
For equilibrium climate sensitivity, we have no basis for developing a PDF — no mean, and a weakly defended upper bound. Statistically-manufactured ‘fat tails’, with arguably impossible values of climate sensitivity are driving the social cost of carbon. Instead, effort should be focused on identifying the possible or plausible worst case, that can’t be falsified based on our background knowledge. [see also Climate sensitivity: lopping off the fat tail]
The issue of sea level rise provides a good illustration of how to assess the various scenarios and the challenges of identifying the possible worst case scenario. This slide summarizes expert assessments from the IPCC AR4 (2007), IPCC AR5 (2013), the US Climate Science Special Report (CSSR 2017), and the NOAA Sea Level Rise Scenarios Report (2017). Also included is a range of worst case estimates (from sea level rise acceleration or not).
With all these expert assessments, the issue becomes ‘which experts?’ We have the international and national assessments, with a limited number of experts for each that were selected by whatever mechanism. Then we have expert testimony from individual witnesses that were selected by politicians or lawyers having an agenda.
In this context, the expert elicitation reported by Horton et al. (2014) is significant, which considered expert judgement from 90 scientists publishing on the topic of sea level rise. Also, a warming of 4.5 C is arguably the worst case for 21st century temperature increase (actually I suspect this is an impossible amount of warming for the 21st century, but let’s keep it for the sake of argument here). So should we regard Horton’s ‘likely’ SLR of 0.7 to 1.2 m for 4.5 C warming as the ‘likely’ worst case scenario? The Horton paper gives 0.5 to 1.5 m as the very likely range (5 to 95%). These values are much lower than the range 1.6 to 3 m (and don’t even overlap).
There is obviously some fuzziness and different ways of thinking about the worst case scenario for SLR by 2100. Different perspectives are good, but 0.7 to 3 m is a heck of a range for the borderline worst case.
And now for JC’s perspective on sea level rise circa 2100. The corroborated possibilities, from rates of sea level rise in the historical record, are 0.3 m and less.
The values from the IPCC AR4, which were widely criticized for NOT including glacier dynamics, are actually verified possibilities (contingent on a specified temperature change) — focused on what we know, based on straightforward theoretical considerations (e.g. thermal expansion) and processes for which we have defensible empirical relations.
Once you start including ice dynamics and the potential collapse of ice sheets, we are in the land of unverified possibilities.
I regard anything beyond 3 m as impossible, with the territory between 1.6 m and 3.0 m as the disputed borderline impossible region. I would like to see another expert elicitation study along the lines of Horton that focused on the worst case scenario. I would also like to see more analysis of the different types of reasoning that are used in creation of a worst case scenario.
The worst case scenario for sea level rise is having very tangible applications NOW in adaptation planning, siting of power plants, and in lawsuits. This is a hot and timely topic, not to mention important. A key topic in the discussion at Rand was how decision makers perceive and use ‘worst case’ scenario information. One challenge is to avoid having the worst case become anchored as the ‘expected’ case.
Are we framing the issue of 21st century climate change and sea level rise correctly?
I don’t think Donald Rumsfeld, in his famous unknown taxonomy, included the category of ‘unknown knowns’. Unknown knowns, sometimes referred to as ‘known neglecteds,’ refer to known processes or effects that are neglected for some reason.
Climate science has made a massive framing error, in terms of framing future climate change as being solely driven by CO2 emissions. The known neglecteds listed below are colored blue for an expected cooling effect over the 21st century, and red for an expected warming effect.
Much effort has been expended in imagining future black swan events associated with human caused climate change. At this point, human caused climate change and its dire possible impacts are so ubiquitous in the literature and public discussion that I now regard human-caused climate change as a ‘white swan.’ The white swan is frankly a bit of a ‘rubber ducky’, but so many alarming scenarios have been tossed out there that it is hard to imagine a climate surprise caused by CO2 emissions that has not already been imagined.
The black swans related to climate change are associated with natural climate variability. There is much room for the unexpected to occur, especially for the ‘CO2 as climate control knob’ crowd.
Existing climate models do not allow exploration of all possibilities that are compatible with our knowledge of the basic way the climate system actually behaves. Some of these unexplored possibilities may turn out to be real ones.
Scientific speculation on plausible, high-impact scenarios is needed, particularly including the known neglecteds.
Is all this categorization of uncertainty merely academic, the equivalent of angels dancing on the head of a pin? The level of uncertainty, and the relevant physical processes (controllable or uncontrollable), are key elements in selecting the appropriate decision-analytic framework.
Controllability of the climate (the CO2 control knob) is something that has been implicitly assumed in all this. Perhaps on millennial time scales climate is controlled by CO2 (but on those time scales CO2 is a feedback as well as a forcing). On the time scale of the 21st century, anything feasible that we do to reduce CO2 emissions is unlikely to have much of an impact on the climate, even if you believe the climate model simulations (see Lomborg).
Optimal control and cost/benefit analysis, which are used in evaluating the social cost of carbon, assume statistical uncertainty and that the climate is controllable — two seriously unsupported assumptions.
Scenario planning, adaptive management and robustness/resilience/antifragility strategies are much better suited to conditions of scenario/deep uncertainty and a climate that is uncontrollable.
How did we land in this situation of such a serious science-policy mismatch? Well, in the early days (late 1980s to early 1990s) international policy makers put the policy cart before the scientific horse, with a focus on CO2 and dangerous climate change. This focus led climate scientists to make a serious framing error, by focusing only on CO2-driven climate change. In a drive to remain relevant to the policy process, the scientists focused on building consensus and reducing uncertainties. They also began providing probabilities: even though these were unjustified by the scientific knowledge base, there was a perception that policy makers wanted this. And this led to fat tails and cost-benefit analyses that are all but meaningless (no matter who they give Nobel prizes to).
The end result is oversimplification of both the science and policies, with positive feedback between the two that has created a climate alarm monster.
This Frankenstein has been created from framing errors, characterization of deep uncertainty with probabilities, and the statistical manufacture of fat tails.
“Monster creation” triggered a memory of a post I wrote in 2010 Heresy and the Creation of Monsters. Yikes I was feisty back then (getting mellow in my old age).
How can anyone trust any climate model that is not able to reproduce regional historical climate variability? If you eliminate all external forcings, you get a flat-line climate over centuries and millennia, something that never happened in this dynamic natural system.
If analysts are going to calculate the worst case for global warming, then the worst case for remediation efforts also needs to be calculated. One could theoretically see huge increases in poverty or deaths from inadequate medical care if fossil fuels were eliminated and the economy collapsed. If unlikely but very bad consequences are to be theoretically considered, then they need to be considered from all angles.
The warmist/left community has blinders on, which leads them to look at problems from a very narrow perspective, which, not surprisingly, is consistent with their world-view and political goals.
As I read things, the whole debate makes the assumption that by belching carbon dioxide into the atmosphere, we have excluded the entry into a new Ice Age.
Clearly, when no humans were around to belch carbon dioxide, oceanic temperatures were the primary drivers regulating CO2 levels, and when cooling reduced carbon dioxide to just about 200 ppm, something triggered a descent into cold. That is no longer necessarily the sole case.
It is clear that the IPCC is solely involved in developing radiator technology, namely focused on warming: how fast and how high. There is no focus on how fast and how low things could go, mainly because scientists do not have a robust understanding of what triggers a descent into cold, even if theories are now being evaluated about how escape into an interglacial may occur.
There is also no focus on the US military heating up the ionosphere using HAARP. SSW frequency has recently increased radically. As a skeptical due diligence practitioner, I would ask them to help humanity with their enquiries, as it would be criminality of the highest order for the military of one nation to control global climate and everything downstream through unaccountable military experimentation. It might also impact on climate modelling, knowing that solar mimicry was being beamed out from Alaska or wherever….it would certainly have impacts on commodity harvests, knowledge which could make insider-traders billions on Wall Street.
While we are wishing – I would like to see a best case, a most likely case and a worst case.
It is also a little frustrating that the existing technology of nuclear power is rejected out of hand for solving the worst case scenario.
In the USA, we have 100 nuclear power plants and they produce 20% of the electricity. It would be easy (relatively) to have each state (or perhaps a region for very small states) build 2 plants each (for another 100 plants) and double the share of nuclear to 40%. Do that every five years and in 15 years 80% of the power is being provided by nuclear power.
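The build-out arithmetic above can be checked on the back of an envelope, under the comment's own simplifying assumptions (flat total demand, identical output per plant):

```python
# Back-of-envelope check of the comment's nuclear build-out arithmetic:
# 100 plants currently supply 20% of US electricity; add 100 identical
# plants every five years. Assumes flat demand and equal plant output.

plants = 100
share_per_plant = 0.20 / 100  # fraction of total demand per plant

for year in (5, 10, 15):
    plants += 100
    print(f"Year {year}: {plants} plants -> {plants * share_per_plant:.0%} of demand")
```

Under those assumptions the numbers work out: 400 plants after 15 years supplying 80% of demand, as the comment claims; in practice growing demand and plant retirements would slow the share gain.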
No one is even looking at the nuclear option, as far as I can see.
Pick a passive cooling design and build 100 of the exact same design. How hard can this be?
Instead we are screwing around with renewables, which cannot provide power when it is dark and not windy (intermittent). I would leave renewables for about 20%-30% of the solution and do the rest with nuclear power.
The people who sell power have been looking at it for decades. It’s an economic loser.
Cheap renewables and a closure of expensive nuclear plants caused California electricity prices to skyrocket. An economic winner?
“Great” thinker stifled by paper bag.
IEA/NEA Report on Projected Costs of Electricity Generation 2015
“The report also reveals that nuclear energy costs remain in line with the cost of other baseload technologies, despite persistent reports to the contrary.”
A ‘solution’ to 25% of the ‘problem’? (AGW – SMH)
Close to 40% if you accept (as I do) that electric cars and trucks are inevitable. That moves a big chunk of “transportation,” as well as some agriculture and land use (no reason you can’t have electric equipment) and “other energy” (which I assume includes a lot of ICE engines for generators, boats, lawnmowers, etc.).
And, of course, you realize you just made the world’s strongest case for why we should all be ticked off at all the taxpayer cash dumped into renewables, which can’t handle the electricity sector, much less any part of the others.
Judy, thanks. Let me work out another aspect in the light of two recent papers. My thesis is: only lukewarmers can set realistic aims for the prevention of a climate disaster and motivate people to take action against it. When we consider a TCR of about 1.3 (just as L/C18 do), we get a critical threshold for GMST (1.8 deg C) at about 2100. Up to that time mankind has to manage that emissions go near zero to hold this limit; it’s some kind of sci-fi to hope that it could be earlier. This paper https://www.tandfonline.com/doi/abs/10.1080/14693062.2018.1532872 states that it’s counterproductive to make people fatalistic with doom and gloom; this would produce some kind of “it’s all over” mood and not the wanted technological boost to overcome fossil fuel as the No. 1 energy source. The TCR of 1.3 is the realistic value from observations, and the only argument against it is: hidden patterns boosting the ECS to near 3 (TCR about 2). If it were so, mankind would have no chance! Doom and gloom, and there’s no chance to do anything about it.
The other paper, describing the social effects of climate change, is this one: https://www.sciencedirect.com/science/article/pii/S1631071318301159
Grundmann states that climate change is a wicked problem and stresses the uncertainty in the decisions to be made. If we had certainty in the doom and gloom predictions of some models, we should stop every action and enjoy the last years of mankind! Is this the aim of activists? No seems to be the right answer. I’m optimistic about solving this problem. But this will only work with scientific confidence when we stop following the Frankenstein adorers.
What we need now are expert engineers and physicists to take that ECS chart and duplicate it for “alternative energy sources in developed and rapidly developing nations”
Where the “corroborated possibilities” would be hydro, nuclear, natural gas
The “unverified possibilities” would be CCS, any renewable penetration over about 20-30%,
“impossible” = 100% renewable, “radical lifestyle and economic system changes,” verifiable International treaties absent breakthrough technologies.
Geo-engineering being one that straddles “unverified” possibilities and “impossible.”
At least we’d focus the international and political debate, international investment and R&D, and remove the political activists’ motive to keep absurd alarmist scenarios alive in the media.
good one :)
Here’s a graph from Wikipedia made from empirical data on CO2-free electricity generation:
What you have done for WG1 should also be performed for WG2 and WG3 issues.
You refer to CBAs, but I think arguments referencing the social cost of carbon need to also take into account the very real social costs of reducing carbon. Those costs need to include factors like early retirement of capital investment, foregone development in non-OECD countries, etc.
We could (of course) convert to a mix of nuclear, hydro, solar, wind and biofuels in a fairly short time frame. This problem is eminently solvable. But the costs of telescoping this energy transition are roughly double those of allowing market forces to lead us to this promised land, if indeed it is such. I did back of the envelope calculations four or five years ago and they need to be adjusted for inflation, but it sure looked like a rush to green (including a dash to gas as a bridge fuel) would cost around $23 trillion, compared to $12 trillion for natural evolution of our fuel portfolio.
Tom, you did really good work showing that the world will need (and therefore produce) much more energy than it does now.
Early retirement costs of coal plants are real but not the whole story. Gas went from ~20% of electricity generation in the US to ~34% from 2006 to 2018, adding 20 gigawatts of gas generating capacity in 2018 alone. Renewables went from ~4% to ~9%.
If wind/solar were as cost-competitive and effective as the advocates claim, transition to renewables would be well under way in developing nations that need new capacity and in the US.
Interesting link- look at the mix of energy sources by region of the US.
The American midwest is the only place where coal reigns supreme. It’s also the region that pipeline protests would cut off from gas. It’s also the region that would be hit hardest by carbon pricing initiatives (especially if the only allowed option is non-performing renewables).
It’s also where the Democrats lost the election for president in 2016.
Sadly, you continue to maintain the fiction that climate change is fraught with uncertainties, when in fact, it is very simple.
It has two components:
1. Natural recovery from the Little Ice Age cooling: approx. 0.05 deg C/decade (the warming rate from 1900 to circa 1975). Thereafter, it increased to about 0.16 deg C/decade because of reduced anthropogenic SO2 aerosol emissions, due to Clean Air efforts, which cleansed the air.
2. The amount of anthropogenic SO2 aerosol emissions in the atmosphere: approx. 0.02 deg C of change for each net Megaton of change in global SO2 aerosol emissions, anthropogenic or volcanic.
All of the anomalous warming since 1975 has been due to cleansing of the atmosphere caused by Clean Air reductions in dimming SO2 aerosol emissions, which HAS to occur, with no hint of any additional warming due to “greenhouse gasses”.
Superimposed upon this trend are temporary temperature increases or decreases because of VEI4, or larger, volcanic eruptions: La Ninas, caused by increased atmospheric SO2 levels, typically form about 15 months after the date of an eruption, and El Ninos, due to settling out of the aerosols, appear around 24 months after an eruption.
Anthropogenic SO2 levels can be adjusted, but we are at the mercy of random volcanic eruptions, making it impossible to predict future temperatures beyond a few years.
Currently, our Clean Air efforts to reduce SO2 emissions are solely responsible for the higher temperatures that are responsible for the many climate-related disasters around the world.
I post this with certainty, but if anyone can provide clear evidence that i am wrong, let’s have a discussion.
Wow. Thanks. There are few examples of model uncertainty explored at the level of a perturbed-physics ensemble: the famous Murphy 2009 and Rowlands et al 2012. There may be others?
This shows the growth in uncertainty in a set of solutions of a model, from slightly different starting points. Even after constraining the solutions, they get a broader range of future climates than the IPCC. The unconstrained solution space is broader still.
Starting with uncertain and imprecise sets of initial conditions, model uncertainty increases until saturating at a level intrinsic to initial-condition precision and model structure.
The thick, black, linear extrapolation is mine.
Reminiscent of when Curry made basic mistakes in her previous discussion of uncertainty (and her supposed “positive feedback loop”), and other scientists had to come in and correct the nonsense. It’s amazing to me how people are still falling for this manufacture of false doubt (the tobacco strategy).
“Comment on “Climate science and the uncertainty monster” JA Curry and PJ Webster”
“It seems [Judith Curry] is far too busy to deal with this minor issue (which underpins, or rather undermines, every quantitative statement she has made regarding the purported failings of the IPCC analysis).”
“Many of the strategies used by the opponents of both evolution and global warming are based on sowing misinformation and doubt. This approach is often called the “tobacco strategy”, because tobacco companies used it effectively to delay health warnings and regulation of smoking.”
““Corporations use a range of strategies to dispute their role in causing public health harms and to limit the scope of effective public health interventions. […] these industries argue that aetiology is complex, so individual products cannot be blamed […].
Arguments about the complex, multifactorial aetiology of CHD and cancer have long been used by the tobacco industry to dispute the epidemiological and other evidence. […] Demands for perfect evidence, while misrepresenting the existing evidence, can also be observed in climate change denialism.”
Some people are really boring…
A response that perfectly illustrates James Annan’s point. And Gavin Schmidt’s point. And Victor Venema’s point. And…
I’m still yawning.
Maybe you can yawn long enough for the stadium wave to show up.
I’m watching the variations in regional ice extent and AMO. Stay tuned.
Judy: very interesting field! Here I show the September extent (area, the lower graph) with a linear trend and a 10-year Loess smoothing:
Some of the “best friends” should be alarmed? :-)
Yes Atomski, some alarmists like to call Judith a denier [sic] when it’s an obvious lie. Some disagree with her science and resort to personal attacks on her. Kind of reminds some of us of certain “people’s paradise” countries. I would suggest you might need to revisit what “diversity” means and try to focus on the issue at hand rather than clumsy ad hominems. You benefit immensely from Judith’s commitment to providing an open forum. I suggest you show a little appreciation of her hospitality and try to provide something that adds value.
Well said !!
Please forgive me for not being as polite and professional as the majority of those posting comments.
I am so sick and tired of jokers like you nit picking over stuff when every single solitary prediction which has been made has failed to come to pass. When hurricanes, forest fires, floods, droughts, and tornadoes all are doing the exact opposite of what we were told that they would do. When for the past 20 to 30 years we have been told that within the next 10 to 12 years we will reach a tipping point. When we watch CO2 concentrations steadily increase and temperatures do not increase by any significant amount. When we are told that we have just experienced the hottest year ever but you fail to tell us that it was only by 0.04 degrees. When the only demonstrable results of increased levels of CO2 in the atmosphere have been beneficial. Have you no shame sir? More and more of the people that I talk to just laugh whenever they hear people like you speak. If you can point to any prediction which has actually been accurate then please tell me.
Re: “I am so sick and tired of jokers like you nit picking over stuff when every single solitary prediction which has been made has failed to come to pass”
If you think every prediction has failed to pass, then you don’t understand the science at all. Or you’re making stuff up. Take your pick.
You’d basically have to be unaware of stratospheric cooling, thermospheric cooling, mesospheric cooling, positive feedback from water vapor, positive feedback from clouds, the negative lapse rate feedback, Arctic amplification of warming, decreasing Antarctic land ice, decreasing Arctic land ice, decreasing Arctic sea ice, sea level rise acceleration, and so on.
Really, it’s on par with a creationist saying there’s no evidence of human evolution, and that none of the predictions of evolutionary biology have come to pass.
Re: “When we watch CO2 concentrations in steadily increase and temperatures do not by any significant amount”
Oh, you’re one of those “a pause means CO2 doesn’t cause significant warming” people. That’s like saying Earth’s axial tilt relative to the Sun doesn’t cause a multi-month warming trend in Canada from mid-winter to mid-summer, since this week was the same temperature as last week. That’s just end-point bias on your part:
“Unusually cold winters, a slowing in upward global temperatures, or an increase in Arctic sea ice extent are often falsely cast as here-and-now disconfirmation of the scientific consensus on climate change. Such conclusions are examples of “end point bias,” the well documented psychological tendency to interpret a recent short-term fluctuation as a reversal of a long-term trend.”
[from: “Recent United Kingdom and global temperature variations”]
crowcane: every single solitary prediction which has been made has failed to come to pass.
That is not a correct assertion. There have been a lot of incorrect predictions, such as [the end of snow as we know it], but not every single solitary prediction … .
atomsk’s Sanakan: “Unusually cold winters, a slowing in upward global temperatures, or an increase in Arctic sea ice extent are often falsely cast as here-and-now disconfirmation of the scientific consensus on climate change. Such conclusions are examples of “end point bias,” the well documented psychological tendency to interpret a recent short-term fluctuation as a reversal of a long-term trend.”
There was also the Arctic ice death spiral, which reverted to the general trend and did not produce an ice-free Arctic by 2013; and children will grow up not knowing what snow is; and the recent El Niño proves that the “pause” is over.
The pause clearly showed that the quantitative forecasts were unreliable, no matter how confidently expressed. Whether the pause will continue after the effect of the recent El Niño has passed is not yet known.
And, of course, James Hansen’s grandchildren have nothing to fear from climate change; possibly irrelevant because the “scientific consensus” has not addressed their particular case.
If only we could get scientists not to stray so far from the consensus! Which is that human activities have contributed to global warming, not that temperatures would rise strictly monotonically toward disastrous levels, and not that modest efforts directed against CO2 (e.g. the Paris Accords) will prevent future warming due to other causes such as land use changes.
Seriously? Those are the things you are grabbing and scraping for in order not to admit that all the predictions which actually have an impactful effect on the climate I will experience have utterly and completely failed to come to pass. That has got to be one of the lamest responses I have ever run across. This factor will increase and that factor will decrease and this other one will change in this or that manner, and when it is all said and done, neither I nor many of the other people populating this planet will know the difference. You, sir, just identified yourself as someone I should completely ignore. I haven’t a clue who you are, or what awards you may have received, or what papers you may have written, and I really do not care. With such a sorry excuse of a response, absolutely none of that matters.
Re: “Those are the things you are grabbing and scraping for in order not to admit that all the predictions which actually have an impactful effect on the climate I will experience have utterly and completely failed to come to pass.”
You don’t understand the science, which is why you think what I said was irrelevant.
The “stratospheric cooling, thermospheric cooling, mesospheric cooling” points to CO2 (and other greenhouse gases) being the predominant cause of the troposphere and surface warming, not increased solar output. That’s relevant for attribution.
The “positive feedback from water vapor, positive feedback from clouds, the negative lapse rate feedback” affects how much warming increased CO2 causes.
The “decreasing Antarctic land ice, decreasing Arctic land ice, […] sea level rise acceleration” are relevant impacts.
So those are several accurate predictions. Thus you were wrong when you claimed:
“I am so sick and tired of jokers like you nit picking over stuff when every single solitary prediction which has been made has failed to come to pass.”
Re: “This factor will increase and that factor will decrease and this other one will change in this or that manner, and when it is all said and done, neither I nor many of the other people populating this planet will know the difference.”
I don’t care if you notice a difference, any more than I care if anti-vaxxers notice the effects of vaccination. I care what the scientific evidence shows on attribution, detection, impact, etc. What you happen to notice (or choose to notice) in your everyday life is not the barometer for what actually happened. What non-experts like you are aware of is dwarfed by what the published peer-reviewed evidence shows.
Atomsk: “You don’t understand the science, which is why you think what I said was irrelevant.”
You are without any flaw; your only problem is your far too great modesty! :-)
Atomsk’s Sanakan: “Many of the strategies used by the opponents of both evolution and global warming are based on sowing misinformation and doubt. This approach is often called the “tobacco strategy”, because tobacco companies used it effectively to delay health warnings and regulation of smoking.”
““Corporations use a range of strategies to dispute their role in causing public health harms and to limit the scope of effective public health interventions. […] these industries argue that aetiology is complex, so individual products cannot be blamed […].
Arguments about the complex, multifactorial aetiology of CHD and cancer have long been used by the tobacco industry to dispute the epidemiological and other evidence. […] Demands for perfect evidence, while misrepresenting the existing evidence, can also be observed in climate change denialism.”
IMO, you’d be better off to stick with the science of CO2 and climate. To start with, nobody here [denies climate change]. Besides that, the case that CO2 has played a major role in climate warming has serious liabilities, a few of which I have pointed out. Everybody is wrong sometimes, and previous errors do not imply that they are wrong this time.
This approach is often called the “tobacco strategy”
So in your world anyone who questions the alarmist point of view is invoking the “tobacco strategy”? What a convenient way to dismiss other points of view. I’m surprised you didn’t use the other often-overused non-argument, which states: questioning human-caused climate change is like questioning Einstein’s theory of relativity. How dare we invoke the scientific method.
Unfortunately the possibility of negative feedback (what almost always occurs to thermodynamic systems when one component is changed, and the system readjusts from direct response to the component change alone) is left out. The most likely mechanism for this on Earth (a mainly water covered planet) is cloud adjustment to modify solar reflection. Also, using a starting point at the end of the little ice age is suspect, as this temperature is lower than the temperature level over most of the Holocene.
The ozone hole and CFC precedent.
Re: “The ozone hole and CFC precedent.”
That was a nice precedent, since it actually worked. I actually take it to be a fairly good prediction of what will happen:
1) political conservatives will continue to lag behind on accepting the science, mostly for ideological reasons [same as they did in the case of smoking and second-hand smoking]
2) industries will support regulations on the topic, for the sake of profit
3) what everyday conservatives think won’t matter as much, since the industries will introduce enough funding to push regulations through.
Is that what I want to happen? I don’t really care. But it’s what I think will likely happen, as per this:
“The ozone story: A model for addressing climate change?
The ozone hole case suggests that nothing will likely happen until a dominant energy producer—or a consortium of dominant energy producers—develops profitable proprietary alternatives to fossil fuels and other greenhouse gas–emitting processes. Then, they will change positions to support regulations to drastically reduce greenhouse gas emissions, with the purpose of giving themselves a competitive advantage and significant profits. Public awareness and energy conservation are important, and the public’s and government representatives’ opinions can put pressure on corporate profits. But it really doesn’t matter what the public or government representatives want or believe; short-term and long-term corporate profits will drive any decision to greatly limit greenhouse gas emissions.”
Oh, and please don’t pretend the Montreal Protocol didn’t work. There’s abundant scientific evidence showing that it did. Ozone depletion was mitigated. Let me know when you can address some of the research on that:
“Quantifying the ozone and ultraviolet benefits already achieved by the Montreal Protocol”
“Evidence for the effectiveness of the Montreal Protocol to protect the ozone layer”
“Emergence of healing in the Antarctic ozone layer”
“Antarctic ozone loss in 1979–2010: First sign of ozone recovery”
“Depletion of the ozone layer in the 21st Century”
“The Antarctic ozone hole: An update”
Let’s just say that a social alarm was successfully created over a poorly studied phenomenon, a group of compounds was blamed, and a group of countries passed regulation banning them. Then victory was declared and everybody was happy.
Politicians thought global warming and CO2 was the same only bigger and better, after all you don’t get the chance to tax the air very often. And then the agenda started to grow with wealth redistribution and energy transition until the present conundrum.
Now politicians don’t know how to get out of this mess, so it is all empty declarations and no action. They’ll make scientists pay for it. Look for the backlash.
Scenario planning, adaptive management and robustness/resilience/antifragility strategies are much better suited to conditions of scenario/deep uncertainty and a climate that is uncontrollable. 👍
Dynamical risk is best seen in data – and by the nature of these dynamical systems gets larger with time. With enough time we might revisit any of the ergodic states of the Earth system.
But strength and resilience is a celebration brought about by care of Earth systems – catchments, rivers and oceans. Using modern notions of polycentric policy and local and regional management – of the global commons especially. The fundament is economic development – free trade, fair laws and robust and peaceful democracies.
Ms. Curry, have you read anything I have sent you, or am I wasting your time and mine?
Dr. Curry ==> Great Presentation to Rand.
I have been calling one type of “unknown knowns” the impossible act of “unknowing known uncertainty”: using statistics to create over-averaged temperature anomalies that produce nearly certain (errorless) global values.
Can you supply a paper title (or link) for Cheng et al. 2017, the source of the sea level graph used in the PowerPoint? (I can’t seem to find the right Cheng 2017 — …)
Thank you for sharing your Rand presentation.
Can anyone identify the research paper Cheng et al. 2017 as the source for the sea level graph (page 27) in Dr. Curry’s presentation?
oops it is Chen
Dr. Curry ==> Thank you!
— I think the “surname et al. year” citation standard needs to change — with so many papers being published, too many surnames get duplicated in a single year, even within one topic — and not just Chinese names either — lots of Joneses and Hansens and Smiths, etc.
I usually try to provide online links to anything I cite, but not enough time on this one.
Dr. Curry ==> No worries, can’t quite imagine how many hats you are wearing simultaneously.
There is no necessity for carbon action because the climate drives CO2:
The worst case scenario is people never learn this en masse and fall for anything… are we there yet?
The best case scenario is people learn there is no such thing as a CO2 driven climate and therefore ECS estimates, CO2 mitigation proposals, and carbon taxes are irrelevant.
This debate appears to be based on the assumption that the temperature trend is as described by the BEST data, which were published in a very questionable journal and are opaque unless you are able to access the original files directly. I believe the closest one can come to reality lies in some national met agencies, such as Iceland’s, and in the GHCN archives freely available at KNMI. Only that way can one avoid the manipulations done at Goddard and the CRU to obtain a desired curve.
There has been no warming that cannot reasonably be attributed to natural causes.
we use ghcn.
all of our data is open
goddard uses ushcn. defunct since 2014
Amen! What makes the BEST time-series more than just “opaque” is the total distortion of low-frequency spectral content by the artificial assembly of mere snippets of record via their geophysically naive “empirical break-detection” and “homogenization” algorithms. Thus their long-term “data” are products more of academic imagination than of direct, model-free measurement.
JC’s post says:
Given that any realistically achievable amount of global warming we could get this century would be beneficial (total of all impact sectors), then the worst case climate change and policy scenarios are:
1. Abrupt global cooling – this could kill millions, or perhaps even billions this century
2. Spending huge amounts of funds on policies to reduce global warming, thereby retarding global economic growth by a) reducing the benefits of global warming, and b) wasting funds that could be spent on doing real good (see for example Bjorn Lomborg’s analyses and publications). The global climate change industry was estimated at $1.4 trillion in 2013 (~1.9% of global GDP) and is increasing at 2 to 10 times the average global GDP growth rate.
If you don’t like models, you can use observations instead.
These give a relatively sharp effective TCR near 2.3 C per doubling. It is effective because it takes into account other proportionate anthropogenic effects like methane, black carbon and aerosols, as well as feedbacks to the net (not just CO2) forcing. It turns out to be highly correlated to CO2 and log CO2 for the last 60 years (0.935) because that is the dominant component of the net forcing. From this, an effective ECS near 3 C is very likely, and there is much less uncertainty in the observations than in the rather too cautious IPCC range.
Bottom line: don’t believe the uncertainty range. Sixty years of observations say otherwise and give numbers sharply consistent with an effective ECS of 3 C per doubling.
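For what it’s worth, the regression being described here (temperature against log2 of CO2, with the slope read off as an “effective TCR”) is easy to sketch. The data below are synthetic, constructed with a 2.3 C per doubling built in; this illustrates the method only, not any observational result.

```python
import numpy as np

# SYNTHETIC stand-ins for observations, built with a known 2.3 C/doubling
years = np.arange(1958, 2018)
co2 = 315.0 * np.exp(0.004 * (years - years[0]))   # hypothetical CO2 curve, ppm
noise = np.random.default_rng(1).normal(0, 0.1, years.size)
temp = 2.3 * np.log2(co2 / co2[0]) + noise         # anomaly, deg C

# Effective TCR = least-squares slope of temperature vs log2(CO2)
x = np.log2(co2)
tcr_eff, _ = np.polyfit(x, temp, 1)
r = np.corrcoef(x, temp)[0, 1]
print(f"effective TCR ~ {tcr_eff:.2f} C per doubling (r = {r:.2f})")
```

Note the caveat in the thread: because CO2 forcing is highly correlated with the other anthropogenic forcings, a slope estimated this way absorbs those other forcings too, which is why it is an “effective” rather than a pure CO2 sensitivity.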
I get 1.8C, not 2.3C, but yes, observed trends would seem to be a good starting forecast.
Not sure what NCDC is like as a climate dataset, but I use BEST, GISTEMP and HADCRUT4 with higher values than that.
Also my data covers the last 60 years, while yours doesn’t, and you are not using the CO2 forcing which maximizes at 2 W/m2, but some estimate that is rather above the high end of IPCC values. To get warming as a function of CO2, use CO2 for the forcing. It fits with a high correlation.
Also my data covers the last 60 years, while yours doesn’t
Think about what that would imply – decreasing response and decelerating warming since 1979.
But the evidence you presented, temperature and concentration of CO2, is not temperature and assumed radiative forcing anyway.
60 years! A 2-mile-long time machine, Greenland and Arctic ice bore holes, should be used. Around 800,000 years.
The problem for CAGW activists is that at this resolution no changes are identifiable within 60 years!
Science is really damaged by these political players.
You have this backwards. CO2 is sensitive to the climate, not the other way around as you presume.
There is no TCR or ECS for carbon dioxide because CO2 lags ocean warming by 10 months, meaning CO2 doesn’t warm the climate, therefore any discussion about the climate being sensitive to CO2 is science fiction.
Do you actually believe the future can control the past?
As far as I am concerned comments like yours enforce the BIG LIE.
I’m asking you all to stop giving credence to an impossibility.
This looks like the cranky Salby stuff.
This looks like the cranky Salby stuff.
The problem here is you wish to deflect with a typical school-yard smear attempt instead of arguing the point.
Would you say the same thing about “The phase relation between atmospheric carbon dioxide and global temperature”, Humlum et al. 2013?
If you are unwilling to acknowledge the fact that CO2 change lags ocean temperature change, and the ocean is driving CO2 via Henry’s Law, and that this fact alone invalidates the entire AGW theory, then what you believe is science is not based on reality – it’s science fiction.
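Henry’s law itself is uncontroversial; its temperature dependence can be sketched with the standard van ’t Hoff form. The constants below are common textbook values for CO2 in water, used here purely to illustrate the direction and rough size of the solubility effect, not to support any attribution claim on either side.

```python
import math

KH_298 = 0.034   # Henry's law constant for CO2, mol/(L*atm), at 298.15 K
C = 2400.0       # temperature-dependence parameter, K (textbook value)

def kH(T_kelvin):
    # Van 't Hoff scaling of the solubility constant to temperature T
    return KH_298 * math.exp(C * (1.0 / T_kelvin - 1.0 / 298.15))

# Compare sea-surface-like temperatures of 15 C and 16 C
cold, warm = kH(288.15), kH(289.15)
pct_drop = 100.0 * (cold - warm) / cold
print(f"1 C of warming lowers CO2 solubility by ~{pct_drop:.1f}%")
```

So warmer water does hold less CO2; the dispute in this thread is over whether that few-percent-per-degree effect can account for the observed rise, which is what the carbon-budget replies below address.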
This is old stuff. The ocean sink of CO2 is expected to be weaker in warmer years, which is why you see a correlation, but it is always a sink because the source is us, and you see that because it is acidifying too meaning it is gaining carbon.
The old stuff you referred to wasn’t my topic per se. Whether or not the ocean is becoming more acidic neither proves nor disproves whether the CO2 increase is from us.
There is a strong lagged correlation because the ocean temperature is driving the CO2 outgassing/uptake, which has nothing to do with us, it’s natural, as always.
The human contribution to ocean-driven atm CO2 is lost in the noise.
Even if man-made emissions impossibly were the whole CO2 signal you haven’t addressed the 10 month lag and the apparent reverse causality.
The temperature affects the sink, so this explains the lag. It is a sink because you can’t just dismiss that the ocean is gaining carbon, and that the atmosphere is gaining only half as much as we emit, because the rest is going into the ocean and land. These crank ideas are completely devoid of any concept of a carbon budget.
Handwaving about a carbon cycle doesn’t get you out of the logical fail.
The temperature change precedes both the sourcing and sinking of CO2.
Thanks for the “projection”, but its your ideas that are the “crank stuff” as you haven’t really addressed the lag issue at all. Everyone can see for themselves that CO2 follows temperature – admit it or look foolish.
Your side is full of illogical cranks who cannot even reckon with the logical fail I’ve presented you, seemingly willing to accept reverse causality.
BTW, I made my graph yesterday, and I didn’t use Salby or Humlum et al. It’s called independent verification, and I didn’t need your permission to know what I found out is true.
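As an aside on method: the lag claimed in this sub-thread is the kind of number one gets from a lagged cross-correlation. A minimal sketch, on a synthetic monthly series with a 10-month lag deliberately built in (so the peak here is by construction, and says nothing about the real carbon cycle):

```python
import numpy as np

rng = np.random.default_rng(2)
n, true_lag = 480, 10            # 40 years of monthly data; lag built in below

# ENSO-like stationary "temperature" signal: a simple AR(1) process
temp = np.zeros(n)
for t in range(1, n):
    temp[t] = 0.8 * temp[t - 1] + rng.normal(0.0, 0.1)

# Monthly CO2 growth rate constructed to track temperature 10 months earlier
dco2 = np.roll(temp, true_lag) + rng.normal(0.0, 0.05, n)

def corr_at_lag(k):
    # Correlation of temperature with the CO2 growth rate k months later
    return np.corrcoef(temp[:n - k], dco2[k:])[0, 1]

best = max(range(25), key=corr_at_lag)
print(f"cross-correlation peaks at a lag of {best} months")
```

The published criticisms quoted further down turn on exactly this point: a lag in the differentiated (growth-rate) series constrains short-term variability, not the long-term trend.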
When you can just ignore the carbon budget it makes it easier for you to come up with these wacko theories, right? Follow the carbon.
“The temperature change precedes both the sourcing and sinking of CO2.”
Oh, the Humlum et al paper – ideologically motivated crap “science” …
“Humlum et al., 2013 conclude that the change in atmospheric CO2 from January 1980 is natural, rather than human induced. However, their use of differentiated time series removes long term trends such that the presented results cannot support this conclusion. Using the same data sources it is shown that this conclusion violates conservation of mass. Furthermore it is determined that human emissions explain the entire observed long term trend with a residual that is indistinguishable from zero, and that the natural temperature-dependent effect identified by Humlum et al. is an important contributor to the variability, but does not explain any of the observed long term trend of + 1.62 ppm yr− 1.”
“The paper by Humlum et al. (2013) suggests that much of the increase in atmospheric CO2 concentration since 1980 results from changes in ocean temperatures, rather than from the burning of fossil fuels. We show that these conclusions stem from methodological errors and from not recognizing the impact of the El Niño–Southern Oscillation on inter-annual variations in atmospheric CO2.”
“A recent study relying purely on statistical analysis of relatively short time series suggested substantial re-thinking of the traditional view about causality explaining the detected rising trend of atmospheric CO2 (atmCO2) concentrations. If these results are well-justified then they should surely compel a fundamental scientific shift in paradigms regarding both atmospheric greenhouse warming mechanism and global carbon cycle. However, the presented work suffers from serious logical deficiencies such as, 1) what could be the sink for fossil fuel CO2 emissions, if neither the atmosphere nor the ocean – as suggested by the authors – plays a role? 2) What is the alternative explanation for ocean acidification if the ocean is a net source of CO2 to the atmosphere? Probably the most provocative point of the commented study is that anthropogenic emissions have little influence on atmCO2 concentrations. The authors have obviously ignored the reconstructed and directly measured carbon isotopic trends of atmCO2 (both δ13C, and radiocarbon dilution) and the declining O2/N2 ratio, although these parameters provide solid evidence that fossil fuel combustion is the major source of atmCO2 increase throughout the Industrial Era.”
My graph shows it is an upward curve of both CO2 forcing and the temperature. In fact CO2 forcing is more highly correlated with the temperature than the year is because these changes are nonlinear. Given the 93% correlation it is a good proxy to total forcing and can be used as a direct predictor. This high correlation is not surprising because CO2 forcing is 70-100% (most likely 85%) of the total. As such, when CO2 is doubled, a 2.3 C per doubling is what can be used for a predicted net effect. This is easier than doing it your way where you have to add back in the other ~50% of correlated forcing increase to get a final temperature change.
Hi Jim, I’m wondering what you think of the Lewis and Curry ECS estimate. I’m guessing you feel it’s a soft estimate. Can you explain why?
You don’t need models.
It has happened before.
When CO2 levels were 400 ppm in the past, what was the sea level?
If you base your scales of uncertainty on categorical variables (“it has happened before”) you will very quickly run into cases where people quibble about the meaning of “it has happened before”.
we know that at 400 ppm in the past we have had warmer conditions than predicted by models, and higher sea levels.
Let the special pleading begin.
But yes we can agree that 10C ECS is off the table.
More importantly, will we continue to subsidize people to rebuild in places that are prone to risks like hurricanes?
Economic Losses, Poverty & Disasters 1998-2017
“In the period 1998-2017, disaster-hit countries reported direct economic losses of US$2,908 billion of which climate-related disasters accounted for US$2,245 billion or 77% of the total.
This compares with total reported losses for the period 1978-1997 of US$1,313 billion of which climate-related disasters accounted for US$895 billion or 68%.
In terms of occurrences, climate-related disasters also dominate the picture, accounting for 91% of all 7,255 major recorded events between 1998 and 2017. Floods, 43.4%, and storms, 28.2%, are the two most frequently occurring disasters.
The greatest economic losses have been experienced by the USA, US$ 944.8 billion; China, US$492.2 billion; Japan, US$376.3 billion; India, US$ 79.5 billion; and Puerto Rico, US$ 71.7 billion.”
“This report highlights the protection gap between rich and poor. Those who are suffering the most from climate change are those who are contributing least to greenhouse gas emissions. The economic losses suffered by low and lower-middle income countries have drastic consequences for their future development.
“Clearly there is great room for improvement in data collection on economic losses but we know from our analysis of the available data using georeferencing that people in low-income countries are six times more likely to lose all their worldly possessions or suffer injury in a disaster than people in high-income countries.”
The report concludes that climate change is increasing the frequency and severity of extreme weather events, and that disasters will continue to be major impediments to sustainable development so long as the economic incentives to build and develop hazard-prone locations outweigh the perceived disaster risks.
“Integrating disaster risk reduction into investment decisions is the most cost-effective way to reduce these risks; investing in disaster risk reduction is therefore a pre-condition for developing sustainably in a changing climate,” the report states.
If history could teach us anything we would stop building in vulnerable areas like low lying coastal zones and flood plains.
See Fig. 1 of the Foster and Rohling paper. CO2 varied from 200 to 280 ppm, without SUVs, in the past 550,000 years. The correlation with sea level is best explained by glacial-interglacial periods. In glacial periods, sea level decreased and the cooler oceans absorbed more CO2. In interglacial periods, sea level increased and the warmer oceans released more CO2.
If you apply the same correlation today, you have to assume the 400 ppm CO2 came from the oceans. What about the SUVs?
Pretty sure this is a Groundhog Day argument. In the past when sea levels were 9 meters higher, temperature was driving CO2. I hope I’ve indicated before now, it’s now CO2 driving temperature more than the other way around. We have 3 variables in the discussion: CO2, GMST and Sea Level.
CO2 > GMST > SL
SL > GMST > CO2
Now and Then.
Ice Sheets > SL > GMST > CO2
How can SL drive the GMST? How can the system be bistable? I’d bet on ice sheet mass driving rather than the GMST driving. The GMST is a result of various masses. Let’s try the opposite, Various masses are the result of the GMST. The glacial/interglacial cycle has a good link to the GMST and I’d suggest the average ocean temperature.
Paleo is going to suffer to some extent if CO2 is now the driver and before it wasn’t. Suffer in that, I don’t care what used to happen if the system is now in another mode. You can’t change the system and then use the past to make your points. To the extent it is the fastest mostest ever, unless you say, we didn’t change that part of it. All SLR is limited by mass. If ocean warming is not off the charts yet, it’s not going to go off the charts. If ocean warming is suppressing the GMST warming, how much more can it suck in in terms of joules per year? These background masses are going to determine the fastest mostest ever. It’s been a fun experiment. Look what we can do, look what mass can do. I think all my chains above are linear. Which of course is wrong.
Not Grant Foster and Stefan Rahmstorf, but
Gavin L. Foster and Eelco J. Rohling.
Phew, you had me going for a moment there.
“we know that at 400 ppm in the past that we have had warmer conditions than predicted by models and higher sea levels.”
Warming seas and massive plant growth fostered by warmer conditions will cause CO2 to go over 400. You know this, though. So why claim a cart-before-the-horse privilege?
Rising CO2 can put temps up a little, which is your concern, but who can claim precisely which came first a million or 100 million years ago?
Only Foster number 1 would claim that.
Is it correct that ECS increases as GMST decreases, as per Ghil Figure 1:
On this basis, if ECS = 1.5 to 3.0°C when GMST =15°C, we should expect ECS will decrease as GMST increases.
Ghil, 2013, A mathematical theory of climate sensitivity
A question. It appears that a primary concern with our projections has to do with the amount of precipitable water in the upper troposphere. The empirical evidence that I have seen, one set from satellite studies between 1988 and 2001, and a second from radiosonde data from 1948 to 2012, shows water vapor declining, in direct contradiction to the theory set out in the Charney Report of 1979. Is there other empirical evidence that shows the theoretically expected increase in upper tropospheric water vapor, or are we simply relying on modelled expectations over reality?
If there is no additional empirical evidence, it would appear that modelled projections of temperature increase are starting from a base that is triple what is reasonable.
Thanks; awaiting a response from the moderator or a fellow reader.
Try this paper:-
Thanks, I will read it thoroughly and add it to the collection I am compiling. It does concern me that this and virtually all of the other papers involve “modeling” and “re-analysis” of earlier work that sometimes reached contradictory conclusions. Perhaps it is too much to ask for, but I was hoping for straightforward empirical evidence, as opposed to modeling efforts.
Just had to think of an asylum where the inmates are investigating the probabilities of their delusions and visions. A major question is of course who will qualify as an expert. This is just the “climate” department. Wait until the other departments join the fun and start to cross-compare. At which moment the whole thing begins to somehow look so real ….
Perhaps someone should remind us that this is all about subjective probabilities. No way to give them objective meaning. There is just one event in the sample.
How did we land in this situation of such a serious science-policy mismatch?
Isn’t it because policies were predetermined in a doctrinal way (no science involved)?
When the UNFCCC was agreed in 1992, “climate change” was defined once and for all: “‘Climate change’ means a change of climate which is attributed directly or indirectly to human activity that alters…”.
Thus the troops were sent to take this hill, no other ones.
The whole matter is a delirious distraction from our prime task of unraveling how the Sun drives our weather pattern and ocean cycles, the stuff we need to know about most.
And the whole show rests upon unproven and irrational net positive feedbacks to increases in climate forcing.
When economists create models there can be no science involved; it’s bookmaking. It fully meets Feynman’s definition of a pseudo-science. There are no laws and it cannot be proven.
The same thing happened with renewable energy. In the case of renewable energy, they just made up their own laws, so that subsidising renewable energy made the known inadequacy and intermittency of its energy sources far more intense and continuous, by law.
This creates problems in engineering delivery of what engineers know is impossible. It does not stop economists subsidising it by law and hoping nobody notices it failing, as the laws of physics say it must.
As for climate change: what do you think stops the relentless rise of the interglacial stone dead while CO2 is still burgeoning? The answer is evaporation forming clouds, currently at 140 W/m^2. So why should 1 W/m^2 of AGW not be returned to equilibrium by a small change in this effect, which occurs naturally as the ocean warms?
Who seriously claims, and can show, how water vapour amplifies CO2 warming to a catastrophic tipping-point runaway, rather than creating the dominant clouds that cool things down?
Go figure. This applies to any other 2 or 3 W/m^2 of heat increase, which will be transferred to space along with increased albedo from greater cloud formation, simply maintaining the equilibrium. That is why the planet has such a stable ice age cycle between very close limiting temperatures, and doesn’t freeze over or boil off the oceans. It’s the clouds, stupid!
Now that’s what I call a climate model: it uses the facts and explains what happens in basic physics terms. No belief is required to grasp the massive forces at work relative to human impact. There is no need to believe mumbo-jumbo models that ignore a chunk of the environment completely (oceans and lithosphere are not passive or constant in this) and that assume pre-agreed causes and effects their models are engineered to track, per the assumptions of the climate priests who programme them, with many wholly presumptive assumptions that are the antithesis of the reality that physics does understand. One key example is the dominant negative feedback control of water vapour on temperature during interglacials: cloud control. Why does any real physicist accept the science-denying nonsense that asserts otherwise? Show me the physics!
The simple yin and yang of it is that, the network of feedback loops (respect for life, liberty, property and acceptance of personal responsibility) also can ’cause’ greatness beyond human imagination.
The Sun will continue to heat the oceans or not, without our help, and there’s nothing humans can do about it and that’s not a theory: it is a fact, even if we eagerly sacrifice the constitution, the country, capitalism, liberty and the scientific method on the altar of global warming alarmism because we read in the NYT that humanity’s CO2 is a dangerous pollutant that causes global warming [i.e., AGW theory]. Nevertheless, when the oceans begin to cool there will be no global warming, irrespective of Westerners’ theorizing about AGW.
“Climate science has made a massive framing error, in terms of framing future climate change as being solely driven by CO2 emissions”.
It’s framed that way because CO2 is the thing that human beings can change. We are not going to derive future policy over the possibility of a huge volcanic eruption.
There are no rational grounds for predicting chaos-causing, continually increasing average global warming. Whether the atmosphere as a whole is warming or cooling, a pragmatist can only conclude that global warming has been good for humanity. No one is clamoring for a colder Earth.
Odd that someone so aware that energy is the line integral of force per distance leaves out of consideration the initiation of force–and power– politicians can practice with impunity in a mixed or socialist economy.
I see no problem. If the earth warms 10 degrees, that means the entire Russian continent becomes a paradise. Putin opens up immigration to Orthodox Christians around the world, we go settle our 40 acres and the West can burn in the godless Hell that it has already become.
The CO2 control knob is rather simplistic and too coarse for decadal time scales. A better analogy would be an airplane:
> CO2 is like the trim wheel.
> Water vapor and aerosols are like the stick and rudder pedals.
> Science is like the instruments. It’s probably a good idea to reset the gyros once in a while.
> The methane time bomb is like the mountain off on the horizon that we’re trying to avoid crashing into.
Methane feedbacks are not entirely positive; there is a studied biological response to Arctic methane release.
Curry presents a laudable breakdown of risk assessment for politicians and other laymen. The portrayal of the “CO2 control-knob” as rubber ducky is particularly apt. Natural variability, however, is by no means any black swan, but as commonplace as mallards in Canada. Only irretrievably politicized minds can argue otherwise.
John, this is what I find so frustrating. Dr. Curry rightly explains the wide margins in ECS estimates, and she is crowned the queen of “uncertainty”. Why is it such a cardinal sin to demonstrate the disagreement?
It is really strange that the diagram “Classifying Possibilities” shows possibilities going in all directions from the “Strongly Verified”, yet in both “Equilibrium Climate Sensitivity to Doubled CO2” diagrams the possibilities go in only one direction from the “Strongly Verified”, i.e. nothing less than 1 C.
Murphy’s Law is “Anything that can go wrong (in a scientific theory) will go wrong (in a scientific theory).” Especially when billions of dollars have been spent proving the theory and everyone has been told they are idiots if they don’t accept it.
My guess is the sensitivity to a doubling of CO2 is less than 1 degree C.
Cheers and have fun.
“More striking is that the [AGW] hypothesis is not testable. It cannot be falsified. The alternate hypothesis, the network hypothesis, is rooted in observation…” ~Marcia Wyatt
Rainfall in the Mediterranean Basin is influenced by ocean surface temperatures in the tropical Pacific and the north Atlantic. The variability in ocean surface temperature year to year, decade to decade, century to century result in persistent regimes of droughts and floods. Because of the importance of Nile River flows to the Egyptian civilisation water levels have been measured for 5,000 years and recorded for more than 1,300. The ‘Nilometer’ – known as al-Miqyas in Arabic – in Cairo dates back to the Arab conquest of Egypt. The Cairo Nilometer has an inner stilling well connected to the river and a central stone pillar on which levels were observed. The exterior of the stilling well can be seen in the photo below. The Nilometer remained useful until the 20th century when major dams changed the Nile River flow regimes. – http://www.waterhistory.org/histories/cairo/
In a field in which observation and an abductive mode of inference are the primary guide, such a record is an amazing resource, all the more so because it’s even more chaotic than random.
In mentioning the Egyptian Nilometer it is wise to remember Plato saying that, according to the Egyptian priests, Egypt was not affected by events [in the Holocene]. The priests were talking about the sun changing its place of rising, among other things.
That said, events in the Mediterranean were far wilder than they are made out to be. Consider the recent earthquakes around Etna and the mention of a possible flank slump: this article, https://agupubs.onlinelibrary.wiley.com/doi/pdf/10.1029/2006GL027790 [at 28], refers to an earlier event around 7.6 ka ago (Holocene maximum). That date marks events elsewhere as well, making the Etna event one of several concurrent events. Note from the link here, https://melitamegalithic.wordpress.com/2018/07/24/searching-evidence-update/, that around 7.7 ka BP (5.7 ka BCE) a peak in the Eddy cycle correlates with abrupt polar and equatorial temperature trend changes. Those dates also correlate with a whole series of seismic geological events that have left their record in archaeology and in a host of proxies from around the world.
Then note also the temperature anomaly level. Those figures were taken from Wikipedia; taken at face value, the chronological correlation was the important aspect here. Now we are heading for another Eddy peak. (They say forewarned is forearmed.)
I have lately been getting a little into megalithic ‘dragon kings’ (event outliers, especially dynamical ones).
“Solar activity during the current sunspot minimum has fallen to levels unknown since the start of the 20th century. The Maunder minimum (about 1650–1700) was a prolonged episode of low solar activity which coincided with more severe winters in the United Kingdom and continental Europe.” http://iopscience.iop.org/article/10.1088/1748-9326/5/2/024001/meta
But I was thinking of primary operators on major Earth subsystems. Solar intensity has been linked to surface pressure, and therefore wind fields, in the polar annular modes. These are systems with internal feedback loops, but nonetheless solar-modulated to some degree.
A future more stormy and colder for some northern climes, cooler by degrees C in Central England, reduced AMOC and increased ice sheet growth – for centuries. More meridional winds in a cooler sun drive great ocean gyres and set the scene – in both hemispheres – for enhanced upwelling on the western margins of continents. Cool water at the surface of the eastern and central Pacific enhances closed cell, marine strato-cumulus cloud cover – increasing albedo from that of the dark ocean below to bright cloud over much of the global tropics. And thus with more upwelling an internal feedback loop with a negative energy storage dynamic and a cooling planet?
Your earlier comment on ‘dragon kings’ has got me intrigued. However, note that the only solid evidence (the real dragon) to work on is in my second link, at the top of the chart. IMO some of the rest of DK theory is ‘looking for elephants with a microscope’.
Didier Sornette – https://arxiv.org/abs/0907.4290
According to Bjorn Lomborg in The Weekend Australian:
• Paris commitments will cost $1-2 trillion p.a. from 2030 by increasing energy costs, thereby reducing global GDP growth.
• This will do nothing to change the climate.
• Climate scientists say keeping global warming to less than 2C will require reducing emissions by 6000 Gt CO2.
• But Paris commitments, if achieved, will cut emissions by just 56 Gt CO2 by 2030, i.e. leaves 99% in place.
• WHO estimates that since the 1970s climate change has caused 140,000 deaths p.a., and projects 250,000 p.a. by 2050.
• For $5 billion per year (i.e. 0.5% of the Paris commitment’s increased cost of energy) more than 1 million deaths from TB could be avoided per year.
• More than $30 billion of OECD aid is climate related.
• That is more than three times what would be needed to eradicate the world’s worst infectious diseases.
• Nearly 10 million people responded to a survey in which they were asked their policy priorities. The answer:
• Education and better healthcare ranked top priorities.
• Climate polices were bottom of the list.
A recent Dutch report on future sea level rise relied entirely on the scenario that half of the Antarctic ice cap could melt this century. Should Dutch policymakers take notice of this report?
Err, no …..
The WAIS is not “half of the Antarctic Icecap”.
And it was up to 2130.
“A possible effect of the ongoing change in the climate system would be the disintegration of the West-Antarctic Ice Sheet (WAIS). Such event would lead to a 5 to 6 m sea level rise. Although chances are small, the consequences are enormous. Obviously, communities in low elevation coastal areas would be threatened. However, what more is there to say about this risk? What could actually happen under such 5 m SLR? This question – the topic of the present paper – is relevant, for instance when considering policies to avoid and adapt to climate change. “
What if the Sun does not rise tomorrow? Although chances are small, the consequences are enormous. Let’s better sacrifice another human.
And then a handful of Spaniards conquered the Aztec empire.
Hi Tony, can’t access your c-drive :-) you are referring to https://www.researchgate.net/publication/24130188_Neo-Atlantis_Dutch_Responses_to_Five_Meter_Sea_Level_Rise
Recently there was a report that under RCP8.5 a three metre sea level rise was possible.
But how likely is it, and what are the boundary conditions? Quite important for the Dutch taxpayer.
We need to wait for Zwally to publish
Regarding worst case scenarios, policies to reduce global warming are based on what may be a false premise – i.e. that a few degrees of global warming from current icehouse conditions is dangerous. Various lines of evidence suggest global warming would be beneficial, not damaging and not dangerous. One of them is empirical paleo evidence.
Below is a summary of some key points from a discussion on the previous thread, in response to this comment https://judithcurry.com/2018/10/08/1-5-degrees/#comment-881915. The temperatures quoted here are from Scotese 2018, Phanerozoic Temperature chart, p.3 here: https://www.researchgate.net/publication/324017003_Phanerozoic_Temperatures_Tropical_Mean_Annual_Temperature_TMAT_Polar_Mean_Annual_Temperature_PMAT_and_Global_Mean_Annual_Temperature_GMAT_for_the_last_540_million_years
• Optimum GMST for life on Earth over the past 500 million years is centred on about 22°C, which is about 7°C warmer than now (GMST is now about 15°C).
• Life thrived in the Jurassic, Cretaceous, Paleogene and Miocene times when GMST was between 16°C and 28°C.
• Earth is currently in a rare Icehouse. The last Icehouse that was as cold as this one lasted about 30 Ma and ended about 260 Ma ago.
• The last time GMST warmed from 15.2°C to 18.7°C was during the Early to Mid Miocene, 20–14 Ma ago.
• There was no mass extinction during warming through this GMST range.
• The deep ocean temperatures in the tropics are currently about 7°C colder than they were 20–14 Ma ago, when GMST warmed from 15.2°C to 18.7°C. Given this, and the continual supply of cold water from the polar ice caps, a substantial GMST increase over a century is probably impossible.
1. Realistically, by how much can GMST increase this century?
• ECS = 1.5-2°C, or 3°C (central estimates)
• RCP4.5 to RCP6 (likely)
• Past the peak of the current interglacial and cooling towards the next glacial. Cooling started around 8 ka ago.
• Tropical deep sea temperatures have decreased by around 7°C over the last 20 Ma – due to cold water from ice caps (Scotese 2018, p.6)
• Cold water from ice caps will continue to increase the volume of cold water in the deep oceans for as long as the polar ice caps remain.
• The heat capacity of the atmosphere is the same as the heat capacity of the top 2 m of the oceans. So, is it realistic to believe GMST can increase much while deep-sea temperatures are so cold and the volume of cold deep water continues to increase?
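The heat-capacity comparison in the last bullet is easy to check on the back of an envelope. A minimal sketch (editor’s illustration; the constants are standard textbook values and the result is approximate):

```python
# Rough check of the claim that the atmosphere's heat capacity matches
# only the top couple of metres of ocean. All constants are standard
# textbook values; this is an illustrative estimate, not a precise one.
g = 9.81                # m/s^2
p_surface = 1.013e5     # Pa; atmospheric mass per m^2 is p_surface / g
cp_air = 1004.0         # J/(kg K), dry air at constant pressure

rho_sw = 1025.0         # kg/m^3, seawater
cp_sw = 3985.0          # J/(kg K), seawater

atm_heat_capacity = (p_surface / g) * cp_air        # J/(m^2 K)
equiv_depth = atm_heat_capacity / (rho_sw * cp_sw)  # metres of ocean
print(f"atmosphere ~ top {equiv_depth:.1f} m of ocean")
```

This comes out at about 2.5 m, the same order as the 2 m quoted in the bullet, so the qualitative point stands either way.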
2. Would GMST increase this century be beneficial or damaging?
• If GMST increases 3°C, average tropical temperature would increase 2°C, and the polar regions by about 10°C
• Extratropical temperatures would not exceed current tropical temperatures.
• Most warming occurs at high latitudes, during winter and at night (due to the increased insulation effect of increasing GHG concentrations). This is beneficial for life, increases growing seasons, and expands the area of land available for food production
• Humans operate in temperatures from -50°C to +50°C.
• Empirical evidence shows the planet greens as CO2 concentration increases and as it warms.
• Empirical evidence shows that life thrived much better than now at GMST of 16–28°C, with the optimum probably about the centre of the range, i.e. about 22°C (about 7°C warmer than now).
• So, what is to fear from any GMST increase that might occur this century?
• There is clear evidence that global cooling would be seriously damaging, perhaps catastrophic.
• What is the valid demonstrable justification for the presumption that the planet is currently at an optimum temperature, such that both cooling and warming would be dangerous?
• This presumption may be akin to the belief, held long ago, that Earth is the centre of the universe because we are here.
Best case scenario
The planet will warm at the optimum rate this century
As a consequence, the volatility of short-term climate changes will decrease (climate volatility is less when the climate is warmer, e.g. during interglacials compared with glacial periods)
Quality of life and human wellbeing of the world’s population will improve more rapidly
The majority of the population will stop trusting and believing the alarmists
We’ll stop wasting $ trillions per year on climate change policies
We’ll focus on maximising global GDP growth and, consequently, improving the health and well-being of the world’s population.
It is my contention, demonstrated through experiment, that since BB radiation from the surface is not possible, the 396 W/m^2 of upwelling LWIR is a theoretical “what if” calculation and does not in fact exist.
With no source of energy the 333 W/m^2 up/down/”back” GHG energy loop also cannot and does not exist.
There is no energy for the GHG molecules to absorb/trap and re-emit nor “warm” the atmosphere, earth or water molecules.
There is no man-caused global warming or climate change.
No, no, no. This is all wrong.
The IPCC has stated its AR5 summary of ECS. It gives a range.
In an ideal world, there should be one value of ECS – the correct one.
Values that are different have a problem with the definition of ECS, or a measurement problem, or some other problem.
To get a better ECS, you search for these problems, then fix them.
You do not pick a winner from a range of values, because all but one pick, possibly all such picks, will be wrong.
ECS looks like a case where the combined errors of the co-factors are so large that the outcome, after applying corrections for them, has so large an error as to be useless.
The bigger problem is to stop these theoretical, but probably wrong, figures being used to guide public policy. In the good old days there was no way they would have been, because of the outcry from informed sources. These days the outcry volume is at the lower end of the probability scale, like ECS itself might be. That seems to be a consequence of thinking infected by the post-modernism virus. Physicists, heal thyselves.
(Incidentally, I have yet to see a lucid argument that ECS cannot be zero, for the whole atmosphere case as opposed to in vitro or lab studies.)
More on correlations. There is a very sharp value of TCR when using observations. Take decadal averages. I use this data: 3 temperature records and CO2. I also include a measure of decadal solar activity for reference because it may explain some of the smaller variations.
Even though there are only 6 independent points in 60 years, temperature and CO2 covary, and the temperatures correlate with CO2 at 0.987. Taking the last 50 years, a period when CO2’s forcing started rising at 0.2 W/m2 per decade, the average correlation with the 3 temperature records goes up to 0.993. This gives 0.01 C per ppm as the best fit. Similar correlations (0.985 and 0.995) are obtained with log CO2, giving a best fit of 2.4 C per doubling for 60 years and 2.6 C per doubling for 50 years.
To have 5 or 6 points lined up so much is an unlikely coincidence. If other things are affecting the decadal trend they are also highly correlated to CO2.
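The decadal-regression procedure described above can be sketched in a few lines. This is an editor’s illustration: the decadal CO2 and temperature values below are approximate round numbers in the style of the Mauna Loa and GISS records, not the commenter’s actual data, so the fitted figures are only indicative.

```python
import numpy as np

# Approximate decadal means for the 1960s through the 2010s
# (illustrative round numbers, not an authoritative dataset):
co2 = np.array([320.0, 331.0, 345.0, 360.0, 379.0, 399.0])   # ppm
temp = np.array([-0.01, 0.00, 0.18, 0.31, 0.53, 0.78])       # C anomaly

x = np.log2(co2)                        # doublings of CO2
slope, intercept = np.polyfit(x, temp, 1)   # slope = effective TCR
r = np.corrcoef(x, temp)[0, 1]
print(f"effective TCR ~ {slope:.1f} C per doubling, r = {r:.3f}")
```

With these inputs the slope comes out around 2.5–2.6 C per doubling with a correlation near 0.99, consistent with the figures quoted in the comment.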
Then please explain how you know that the global temperature change in 1915–1945 was different from the change in 1980–2010.
The first period was before CO2 is said to be a significant climate factor.
The second period is much alike, except that CO2 increased 15% or so.
What specific, accepted difference allows you to claim that CO2 had an influence on temperatures in the later period, but not in the first?
I think the skeptics have too easily dismissed or overlooked a possible solar influence from 1910-1940 when the sun was coming out of its lull in 1910.
The cyan line here shows a proxy for solar activity which is the 10-year average of sunspots.
In that period CO2 forcing increased by about 0.2 W/m2 in 30 years. By 1970, CO2 was increasing at 0.2 W/m2 every ten years and became the dominant trend in the forcing which is why it became more sharply correlated in the last 50 years. Today it is increasing by 0.3 W/m2 per decade while the sun has been returning to a relative lull in the last few decades. The relative role of the sun, volcanoes, aerosols and CO2 have been shifting as CO2 has become gradually more dominant especially in the last 50 years where the sharp correlation shows this.
Then please provide us with two sensitivity calculations, one based on the early time period, another on the later. If you can’t do that because it gets too complicated, too much in the realm of unknowns and uncertainties, then we will have to conclude that the response you just gave me is qualitative armwaving, which anyone can do. Geoff
I would assume the sensitivity stays the same over time. Is there any reason why not? What we don’t know is how much the increased solar activity contributed. It looks like CO2 and solar activity would have contributed about a half each to the 0.4 C warming between 1910 and 1940. As far as I know this is an unobserved unknown for that period, while in the last 50-60 years the solar activity has net decreased some, so the estimates of sensitivity I gave would be underestimated if just based on anthropogenic forcing. My guess working backwards for the increase in solar activity between 1910 and 1940 would be a few tenths of a W/m2. How much do you think it increased, if at all? The later period is much easier because the CO2 forcing increased over 1 W/m2 in just 50 years, so it has come out of the noise which is why the correlation to CO2 alone is so high. The sharp value of 2.6 W/m2 per doubling is for a 0.995 correlation in the last 50 years. This makes it clear to observe. Above you said there should be one correct value for sensitivity, and for strong enough forcing changes, >~1 W/m2 in 50 years, we can see just that. This should not surprise you as it confirms what you said.
Clearly, solar activity did influence climate in 1910-1940, but not any longer. Hockey stick mentality again.
If there is a solar difference between the 1950’s and now, it is negative. Plus, the CO2 forcing change since 1950 is 3/4 of the total since pre-industrial and nearly 4 times larger than that between 1900 and 1950. The signal has become that much stronger, and so is therefore the correlation because it has turned into the dominant control variable on the temperature from what was a more mixed set of forcings prior to 1950.
Jim what values are you using for aerosol forcing
owenv…: He never uses forcing data and has kept bothering the community here with his nonsense for many years. And he never makes even a rough estimate of whether his TCR of around 2.5°C per doubling of CO2 can be true. In other words: we have 1.5*CO2 vs. pre-industrial and 0.9°C of GMST warming. NOT 1.25! We have tried so often to educate him, without any success. I recommend not trying one more time; it’s wasted time!
Thank you Frank. I have obviously missed the back and forth. More than anything I’m curious as to why a person would choose one ECS estimate over another. Judith has explained her reasoning well, but I would like to hear from others.
Now that’s funny.
It may be a funny question; I’m not sure why. If we are drawing direct correlations between temperature records and atmospheric CO2 concentrations then we need to subtract other significant forcings first.
I am using an effective sensitivity just based on CO2 and temperature, other effects (aerosols and other GHGs) being proportional as evidenced by the high correlation to CO2 alone. If you take Lewis and Curry’s TCR along with the CO2 change you only get half the actual warming since 1950. Their TCR is not fit for direct use with CO2 changes because of proportionate anthropogenic factors that they factored out with their assumptions. If they’re going to use their number they need to tell policymakers to add back in the proportionate anthropogenic factors that they have removed because otherwise it is an underestimate by far. Mine is the net anthropogenic number (effectively 2.6 C per CO2 doubling) and theirs is only part of the story which can fool the unwary.
Lewis and Curry also tell you that other anthropogenic forcings are highly correlated to CO2, which is why T is so correlated to CO2 alone for the last 50-60 years (not the sun, also shown here in cyan).
Lovejoy also gets 2.3 C per doubling from a fit since 1750. It is a robust gradient.
This is misleading because it ignores radiative forcing from Methane, Ozone and the other GHGs.
It also assumes knowledge of temperatures from very few stations.
There are probably zero stations with a complete record since 1750.
Jim D: Lewis and Curry also tell you that other anthropogenic forcings are highly correlated to CO2, which is why T is so correlated to CO2 alone for the last 50-60 years
Lovejoy also gets 2.3 C per doubling from a fit since 1750. It is a robust gradient.
Because of the correlation that you refer to, Lovejoy’s estimate is biased upward from the effect of CO2 alone.
But taking Lovejoy’s estimate as accurate, for the sake of argument, it implies a 2.3 C increase in temp as the atmospheric concentration doubles from 400 ppm to 800 ppm. That is not going to happen soon, is it. Since the rate of change has not been damaging to date, there is no evidence that it will be damaging during the next doubling of the atmospheric CO2 concentration. Is there.
matthew: It’s no use arguing vs. the BS from JimD. As long as he doesn’t admit that the development of the GMST has something to do with the forcing (and its data), one should overlook his writings, because there has been no progress in them for many years IMO. One could see it as a special kind of d….l !!
Use Lewis and Curry’s number for TCR, which is 1.2 C per doubling and look at the Lovejoy picture. Their gradient is half of the green observation line reaching 0.6 C instead of over 1 C for the current period. Furthermore adding in aerosols only makes their line worse. Their trend is clearly not suitable for purpose if it can’t even get close for the past 250 years. They also fall 50% short when compared to post-1958 data where 2.3-2.6 C per doubling (or 100 ppm per degree) fits that warming rate with a correlation near 1.0, and this is a period when the CO2 forcing has grown three times being easily the dominant change agent. If anyone tells you not to believe lines that fit 60 or 250 years of CO2 data and asks you to believe half the gradient instead, you need to ask them how exactly they intend their number to be used. The numbers I use, and Lovejoy, have CO2 as a proxy for net anthropogenic forcing. The high correlation to temperature in the last 50-60 years demonstrates that net forcing has remained proportional to the CO2 change, which is not surprising when you consider CO2 is 85% (AR5) to 100% (AR4) of the net forcing change.
Jim D: The numbers I use, and Lovejoy, have CO2 as a proxy for net anthropogenic forcing.
That is a restatement of the estimation problem: CO2 change is confounded with other changes, so an unbiased estimate of the effect of CO2 alone is harder to arrive at, if even possible.
Using Lovejoy’s number and some estimate of time to double from 400 to 800 ppm, where is the evidence of harm from warming occurring too rapidly?
Use Lewis and Curry’s number for TCR, which is 1.2 C per doubling and look at the Lovejoy picture.
Why? Use Lovejoy’s number, which you report favorably.
I say use Lewis and Curry’s 1.2 C TCR and see what you get, which is a severely flat line predicting 0.6 C of warming instead of the more than 1 C that we have had. This thread is not about warming too rapidly, only about getting numbers from the observations over the past 60 years when we have had 75% of the forcing change and 75% of the warming too. The effective TCR of 2.6 C since 1968 comes with a high correlation of CO2 to the five independent decadal temperatures in this period giving a lot of certainty as to the gradient. We should not ignore correlations like this when doing projections. They should provide the central estimate.
You say getting the effect of CO2 alone is too confounded with other factors, but that is exactly what LC have tried to do. They make numerous assumptions including with endpoints to extract a CO2-alone sensitivity that to even be usable you have to know what they assumed during this extraction and fitting. The direct approach is plot CO2 and T and see how they have covaried. No assumptions needed. If there’s no correlation, move on. If the correlation is near 1.0, it is useful, and also happens to back AGW expectations with TCRs in the 2 C range.
Jim D: This thread is not about warming too rapidly, only about getting numbers from the observations over the past 60 years when we have had 75% of the forcing change and 75% of the warming too.
Why confine yourself to this thread? You have warned that the problem is the rate of warming, so complete the thought. Why else did you present Lovejoy’s numbers if not to regard their implications?
I use Lovejoy’s number as an example of a direct approach using CO2 and T which is what I have done for the post-Keeling period. Both give similar effective TCRs. The TCRs would support ECSs in the 3 C range, so we can use those kinds of warmings when translating emissions to future effects, and then we can calculate 1 C per 2000 GtCO2 emitted between now and 2100.
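The “1 C per 2000 GtCO2” figure above can be sanity-checked with back-of-envelope conversions. Editor’s sketch: the airborne fraction, the GtCO2-per-ppm conversion, and the 410 ppm starting point are all assumed round numbers, not values taken from the comment.

```python
import math

gtco2_per_ppm = 7.8    # assumed: GtCO2 needed to raise atmospheric CO2 by 1 ppm
airborne_frac = 0.45   # assumed: share of emissions that stays airborne
tcr = 2.3              # C per doubling, the effective TCR quoted above
c0 = 410.0             # assumed current concentration, ppm

emitted = 2000.0                                  # GtCO2
dppm = emitted * airborne_frac / gtco2_per_ppm    # concentration increase, ppm
warming = tcr * math.log2((c0 + dppm) / c0)       # logarithmic forcing response
print(f"+{dppm:.0f} ppm, +{warming:.1f} C")
```

This lands near +0.8 C for 2000 GtCO2, the same order as the 1 C rule of thumb; the exact figure moves with the assumed airborne fraction.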
Jim D: The TCRs would support ECSs in the 3 C range, so we can use those kinds of warmings when translating emissions to future effects, and then we can calculate 1 C per 2000 GtCO2 emitted between now and 2100.
So you are expecting a 1C global mean temp increase over the next 82 years, and that rate of increase is too rapid for natural populations of plants and animals to adapt?
MM, 1 C in 82 years brings us to 2 C total, which can only be achieved with a lot of mitigation that keeps us below 450 ppm. Some would have said at the time of AR5, that 2 C would be a good goal to have, but now that is changing to 1.5 C. If you look at the 1.5 C report, they argue against 2 C, and the reason is natural habitats and adaptation challenges due to being too reliant on past climate for our food and water systems, and increasingly extreme events. More likely changes without mitigation are in the 3-5 C range (larger for land areas and the Arctic), so if 2 C is bad, those are worse.
Jim D: MM, 1 C in 82 years brings us to 2 C total,
I see you have gotten away from the “rate of change” threat to the “total change” threat. And you still consider the 1C change to date to have been harmful on the whole.
MM, in saying that 1 C, which makes a net 2 C by 2100, is harmful I just go by the 1.5 degree report. Even 2 C requires a lot of mitigation, a reduction in emission rates despite population growth. What we are headed for without mitigation is reaching 3-5 C by 2100, rapid climate change by any standards, and another level of harmful. So mitigation gets us from very bad to still quite bad, and that is our choice at present.
Jim D: What we are headed for without mitigation is reaching 3-5 C by 2100, r
I am not seeing how you get that from Lovejoy’s estimate of climate sensitivity.
I think you ignore outright the evidence that the climate at 400 ppm is more beneficial to natural life than the climate was at 280 ppm.
I think that the implications of the current state, the rate of CO2 increase, and Lovejoy’s estimate of climate sensitivity are that climate change in the rest of the 21st century will be benign or beneficial.
Perhaps you think that Lovejoy’s is a significant underestimate.
Lovejoy’s direct transient sensitivity to CO2 together with continued emission growth leads to 3-5 C. It’s what happens when you add 4000-8000 GtCO2 between now and 2100. Even the lower end of that represents a per capita CO2 the same as today with a population growth to 10 billion which would be difficult without some energy transition in advanced nations.
If you like 400 ppm, that doesn’t mean you will like 600-900 ppm that remain possible by 2100. At 400 ppm there is already a risk that Greenland and some of Antarctica are not sustainable glaciers. If you’re choosing favorite climates, there is a case to be made for 350 ppm (see 350.org), which is about 1 degree above pre-industrial, and maintains today’s climate without melting too many glaciers.
Jim, do you believe that both Greenland and Antarctica are at risk? Should we not wait for Zwally to publish before making that claim?
Most of the melting starting around 2000 is from Greenland and Zwally doesn’t seem to have much to say about that. Between Greenland and Antarctica, they contribute 1 mm per year and rising to sea levels, and 75% of that is Greenland. Skeptics are undoubtedly looking at this very closely.
Zwally will only shed more light on East Antarctica. But the results will likely affect the global outlook, as it’s expected that the earth has gained ice there, offsetting Greenland losses.
Sea levels don’t back that up at all. Sea-level rise rates have accelerated by 1 mm/yr since 1990 and the glacier melt rates have supported that. Skeptics are in damage control mode.
It will affect attribution
Zwally’s last estimate didn’t balance with sea level data either, so why should this one?
Zwally isn’t in the business of making east Antarctic ice match the estimated changes in SL. His purpose is to zoom in on a better estimate for east Ant.
He certainly doesn’t match anyone else or sea level. He tries to explain a sea level reduction when the evidence has it rising instead. A bit off-kilter there.
Well actually, Zwally has already provided an estimated range for East Antarctica. The discrepancy between his and other studies arises solely from selecting the low end. There is really no reason to do this. The high level of uncertainty concerning global ice mass is more of a regional problem with global impacts. We can shuffle around our attributions after Zwally has updated the data.
Greenland is the elephant in the room there.
Yes it is possible that Zwally’s numbers will balance the Antarctic ice mass loss vs gains; which still leaves us a net loss due to Greenland losses. I will wait for the paper.
Sea level is the arbiter of what matters.
SL has to be understood relative to causation. As well we need to have sobering discussions on both positive and negative effects of rising SL. To focus on either end exclusively provides an incomplete picture.
This is why Greenland’s accelerating melt rate over the past 20 years matters so much. Skeptics have to figure out how to downplay this.
Jim D: Lovejoy’s direct transient sensitivity to CO2 together with continued emission growth leads to 3-5 C. It’s what happens when you add 4000-8000 GtCO2 between now and 2100. Even the lower end of that represents a per capita CO2 the same as today with a population growth to 10 billion which would be difficult without some energy transition in advanced nations.
If you like 400 ppm, that doesn’t mean you will like 600-900 ppm that remain possible by 2100.
How does 600-900 ppm by 2100 produce 3-5 C increase using Lovejoy’s estimate of sensitivity?
Do you accept the claim that 350 ppm produced a better climate than 280 ppm? 700 would be double that, and Lovejoy’s estimate does not forecast 3-5C increase above the climate for 350, does it?
If temperature rises according to Lovejoy’s estimate of sensitivity as CO2 increases from 400 to 800 ppm, how fast will Greenland ice and Antarctic Ice melt? Too rapidly for natural biological populations to adapt?
Here is more information on recent CO2 concentration.
The rate of increase is under 0.4% per year, and the climate of 350 ppm occurred in about 1995. Those numbers give a doubling time of more than 200 years, and a ca 1.2 C increase above the 1995 climate by 2100. If 350 ppm represented a good climate, and if Lovejoy’s estimate is accurate (recall, I think his procedure has an upward bias), there is not much threat of a CO2-induced disaster/calamity/catastrophe by 2100.
Maybe the part you’re missing is that a TCR of 2.3 C implies an ECS near 3 C. For an ECS of 3 C, 700 ppm gives 4 C above pre-industrial which is in the range I quoted.
The relevant rate of increase is that of the anthropogenic component which is 400-280=120 ppm. This is increasing at 2% per year for a doubling time of 33 years in just the manmade part. That’s the rate that gets us over 900 ppm by 2100 which gives us 5 C above preindustrial.
Jim D: ECS near 3 C.
But the equilibration to a doubling of the CO2 concentration occurs long after the doubling has occurred, and we have been focusing (so far) on the threat through 2100.
The relevant rate of increase is that of the anthropogenic component which is 400-280=120 ppm.
I disagree. The relevant rate of increase is the rate of increase in the total atmospheric concentration. The radiative effects of the CO2 molecules do not depend upon their source.
Yes, but 3-5 C is the committed warming based on the 2100 CO2 level. If you use 2.3 C and 900 ppm, it is still about 4 C. These are the ranges I have been talking about.
Also 2% is the relevant rate that describes CO2 growth mathematically. In fact it fits well to the whole industrial era. So if you’re modeling CO2 levels, that is the way to calculate its growth rate. It also happens to be the emission growth rate. Projecting 2% out to 2100 gives over 900 ppm, a CO2 forcing in excess of 6 W/m2, and a warming of 4-5 C.
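The 2% compounding arithmetic described above can be sketched in a few lines. This is only an illustration of the claim as stated in the comment: the 2018 concentration of roughly 408 ppm is an assumed approximate figure, and the 280 ppm preindustrial baseline is the one used throughout this thread.

```python
# Sketch of the 2%/yr "business as usual" projection described above.
# Assumes the anthropogenic component (the concentration above the
# 280 ppm preindustrial baseline) compounds at 2% per year.
PREINDUSTRIAL = 280.0                        # ppm, baseline used in the thread
anthropogenic_2018 = 408.0 - PREINDUSTRIAL   # ~128 ppm above baseline (approx.)
years = 2100 - 2018

projected = PREINDUSTRIAL + anthropogenic_2018 * 1.02 ** years
print(f"Projected 2100 CO2 under 2%/yr BAU: {projected:.0f} ppm")
```

With these assumptions the projection lands a little above 900 ppm by 2100, consistent with the figure quoted in the comment; the exact number depends on the starting concentration and start year chosen.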
In response to CO2 concentrations hitting 900ppm by 2100: It’s fair to point out that trajectory is not the same as trend.
Nor is a projection a prediction.
It’s a scenario where the future growth rate matches that of the past century which would be one type of business as usual.
That’s a sensible take. However, the trajectory tells a different story. Even as China and India ramp up coal, the rest of the world, including the US, shows a net decline in coal use. As well, the coal plants of today are not the same as those of yesterday. Vehicle emissions standards are increasingly strict. There are examples of advancements, too numerous to list here, which disrupt the “business as usual” scenario, and therefore its narrative.
The 2% growth rate has been maintained despite changes in the fossil fuels being used. Creative ways can be found for maintaining future carbon growth in much the same way as Moore’s law has worked for decades across several computer technologies. The 2% BAU does however assume a global dumbness in which there is no incentive to move away from carbon, which few would advocate for. The consequences of BAU are becoming obvious to the policymakers, which is what drives alternatives. However, even with efforts starting as far back as Kyoto in 1997, the manmade CO2 growth rate from 2000-2018 is still 2%.
Jim, the combustion engine has been very good to us. It is an example of an innovation that has transformed our world, largely for the better. However, like smartphones, all ubiquitous technologies are due to change at some point, regardless of this “dumbness”. I can’t imagine it a likely scenario that combustion engines will still be emitting pollution, or be in use at all, 30 years from now.
The US car industry has no incentive to go in that direction and will fall behind the world if this keeps up. This is dumbness in action.
Jim, again look at the data. Vehicle emissions are down trending globally.
Most countries have plans to increase gas mileage substantially as a matter of policy. The US just abandoned theirs. It’s just part of the cluelessness going on with the government who are also doing their best to revive coal while reducing pollution regulations surrounding it to make the coal baron buddies/funders happy. Gubmint at work.
Interesting point Jim, but let’s expand our reasoning to manufacturing. With increased US-based manufacturing servicing the North American continent, we ought to see a net reduction in the fuel used transporting goods from overseas. As well, we should expect better environmental controls on the processing of raw materials as compared to China. We should expect a net increase in product quality, usability, and life cycle. Generally speaking.
That would be fine, if it wasn’t for deregulation that reduces pollution controls, worker rights and safety, and consumer protection at the same time. Making America more like Mexico or China to compete with them isn’t the way to do this.
The end user appreciates a quiet non polluting car. And a better economic outcome produces an environment where consumers can purchase quality.
The way things are going, those are more likely to be imports or made by foreign companies.
That was the outlook heading into 2008. I’m not sure it applies now.
Jim D: Yes, but 3-5 C is the committed warming based on the 2100 CO2 level. If you use 2.3 C and 900 ppm, it is still about 4 C. These are the ranges I have been talking about.
Also 2% is the relevant rate that describes CO2 growth mathematically.
I recalculated the rate of growth of atmospheric CO2, and got a little under 0.5% for the most recent 25 years. It will likely be higher in future, but it is unlikely to reach 0.8% on the evidence to date. As the concentration increases, the processes that sequester the CO2 increase in rate, so the atmospheric growth rate does not come close to the anthropogenic growth rate.
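That growth-rate calculation can be reproduced from two endpoint readings. The concentrations below are approximate Mauna Loa annual means, assumed here purely for illustration; the exact percentage depends on which years are chosen as endpoints.

```python
import math

# Compound annual growth rate of the TOTAL atmospheric CO2 concentration
# between two readings (approximate Mauna Loa annual means, illustrative).
c_1993, c_2018 = 357.0, 408.0   # ppm
years = 2018 - 1993

rate = (c_2018 / c_1993) ** (1 / years) - 1
doubling_time = math.log(2) / math.log(1 + rate)
print(f"Total-concentration growth: {rate * 100:.2f}%/yr")
print(f"Implied doubling time: {doubling_time:.0f} years")
```

With these endpoints the rate comes out near 0.5%/yr. The dispute in the surrounding comments is essentially over which series to compound: the total concentration (roughly 0.5%/yr, as here) or only the component above the 280 ppm baseline (roughly 2%/yr).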
Even if the “committed warming” as of 2100 is greater than the warming as of 2100, that does not imply that the rate of warming through 2100 prevents the adaptation of natural populations. If CO2 increase in the atmosphere ever looks like 900 ppm is a reasonable projection for 2100, that still does not project a 4C warming from the climate at 350 ppm.
MM, to be clear, the baseline for the 4 C warming is the preindustrial temperature (as it is for the 1.5 C and 2 C everyone talks about), and the baseline for the percentage increase is 280 ppm. This is where the 2% growth rate comes from. So 350 ppm corresponds to 1 C warming that we already have. 400 ppm adds about another 0.5 C to make it 1.5 C committed, and 450 ppm makes it 2 C committed. 900 ppm makes it 5 C committed.
Your opinion seems to be that it is better to adapt to 4 C than to try to prevent it at all. Most would disagree and would prefer a 400 ppm climate over a 900 ppm one if there was a choice.
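The ppm-to-committed-warming mapping used in this exchange follows from the standard logarithmic relation between CO2 concentration and equilibrium warming, ΔT ≈ ECS × log2(C/280). A minimal sketch, assuming the ECS of 3 C quoted earlier in the thread:

```python
import math

# Committed equilibrium warming relative to preindustrial (280 ppm),
# using the logarithmic relation dT = ECS * log2(C / 280).
ECS = 3.0          # degrees C per doubling of CO2 (assumed, per the thread)
BASELINE = 280.0   # ppm, preindustrial

def committed_warming(ppm):
    return ECS * math.log2(ppm / BASELINE)

for ppm in (350, 400, 450, 700, 900):
    print(f"{ppm} ppm -> {committed_warming(ppm):.1f} C committed")
```

This reproduces the round numbers in the comment: roughly 1 C at 350 ppm, 1.5 C at 400 ppm, 2 C at 450 ppm, and 5 C at 900 ppm. Note this is equilibrium (committed) warming, not the warming realized by 2100, which is the distinction raised in the replies.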
Jim D: MM, to be clear, the baseline for the 4 C warming is the preindustrial temperature (as it is for the 1.5 C and 2 C everyone talks about), and the baseline for the percentage increase is 280 ppm.
To be clear, I am recommending that people think differently than that. Taking the climate at 350 ppm or (as I recommend) 400 ppm as better than it was at 280; and taking seriously Lovejoy’s sensitivity estimate (even though it is likely an overestimate); and taking seriously the observed rates of increase in atmospheric CO2 concentration, there is no case for alarming change in climate by 2100.
Alarm comes from: treating 400 ppm as clearly worse than 280 ppm; treating Lovejoy’s estimate as though the implied equilibrium temp increase will occur soon after CO2 concentration doubles; and exaggerating the rate at which atmospheric CO2 concentration is increasing.
MM, so when the choice is between 400 ppm and stable (translating to 1.5 C at equilibrium, the current upper limit in the latest report), and 700 ppm and rising at 2100 (translating to 4 C of warming and rising), your choice is which? Maybe you would accept 400 ppm, even 450 ppm and stable (the old IPCC limit), but beyond 500 ppm, and still rising, it becomes a little less comfortable, right? For some context, those are levels at which Greenland last had no glacier. As we get to 700 ppm, Antarctica’s glacier also becomes challenged by new conditions, and this could be where we are at 2100, with both polar glaciers unstable.
About the continent of Antarctica: it produces its own climatic systems, and it may be less sensitive to warming than we expect. As well, we are residing within an interglacial with unremarkable ocean warmth. A few thousand years of rising SI and we might have a situation where the Antarctic continent begins to lose mass… or we might have a situation where, due to a wetter atmosphere, more precipitation falls onto the continent and the ice mass increases. These scenarios highlight how seemingly simple processes can yield results which seem counterintuitive.
The Antarctic is already losing mass according to most estimates. The warming water around it is destabilizing ice shelves, some of which are breaking off, which in turn increases glacier flows.
Jim D: MM, so when the choice is between 400 ppm and stable (translating to 1.5 C at equilibrium, the current upper limit in the latest report), and 700 ppm and rising at 2100 (translating to 4 C of warming and rising), your choice is which?
My choice is to use parameter estimates and other information as well as possible. If 400 ppm is a good climate (as shown in publications that we reviewed here), or if 350 ppm was a good climate (as you wrote); if CO2 concentration is not increasing at 0.6% per year; and if Lovejoy’s estimate of sensitivity is accurate (I think Lewis and Curry’s estimate is less of an overestimate); then there is no serious expectation of disasters requiring urgent preventative measures to “control” climate change before 2100.
Any warming we get this century is likely to be beneficial. Further, any reduction in risk of next cooling event would be massively beneficial.
And stopping the massive cost of the climate industry would be massively beneficial.
MM, the mathematical fit is that the manmade part (that above 280 ppm) has been increasing steadily at 2% per year since preindustrial times (even since 2000). Carried forwards that gives us over 900 ppm at 2100. That’s 5 degrees above preindustrial for 2% BAU.
Some recently published reading for Peter Lang.
The Permian-Triassic extinction event was nothing like the present conditions – unless we get some bolide strikes, a few million years apart, and GMST soars to 36.3C and tropical average to 43C!
For a better example of the benefits of a warmer planet for life look at the period from PETM to present. Warmer periods and warming were better for life. Cold periods were more difficult. Extinctions were during cooling periods, not warming.
“The end of the Eocene was marked by the Eocene–Oligocene extinction event,” – a cooling event.
Peter, I was going to respond with something similar. But then I thought, “maybe it’s just a waste of time.”
The headline of the article states “We’re doing to the atmosphere today what cataclysmic volcanoes did 250 million years ago.”
I just quote it because I agree. Releasing deeply stored carbon and fast is not a good thing to do to any environment that has life in it. Check out the article and the work it refers to.
Jim this is just plain false equivalence. We need to consider the scale and the context
The context is extinction and rapid warming. The scale is global. What do you have as an argument?
Jim, “the scale is global”? The water is wet: a nebulous, meaningless statement wanting for context.
“The context is extinction”? There is really no way to respond to this throwaway statement. It really sounds like somebody has been watching Guy McPherson videos. How do I respond to a comparison where “reality” presents none of the preconditions which existed leading into the Permian extinction? It’s a ludicrous argument, and you can do much better.
I gave you a news item to read and so far you have given me nothing to read to refute it. Extinction is just one symptom of unprecedented change rates. Others are new extremes for events and consequent damage from those. The speed of change matters. We don’t tend to adapt in advance, just in response to damage. Planning for the cost of worst cases has to allow for what rapid change does which doesn’t only affect ecosystems but also the human reliance on a stable environment.
Ok. It’s just that I don’t think anybody is studying the similarities between now and the Permian.
Here’s the thing, I can’t even concede to the notion of unprecedented change. Because if I did that I would be forced to unread a paper which compared rates of change between the Holocene and the previous interglacial. And that paper outlined periods of wild changes over very short time frames.
And on adaptation; there is no metric for what we have become in terms of adaptation. My house keeps me cozy and warm during minus 52C winter days (which are rare).
The changes now and expected by 2100 are more important because we have relied on a stable climate. If you think 4 C warmer in the next century is a good thing, you need to make the case why you think that. Rapid change has not normally been considered optimal before, and in fact the contrary is accepted. With rapid change towards unprecedented warming the problem becomes more akin to surviving it than looking forwards to more of it. Why does no one write about how good a warming of 4 C by 2100 is? Are they afraid to appear foolish? Maybe you know the answer to this.
Jim, the climate has not been stable on geological
Sorry, that was sent too early.
Jim, the earth has not been stable since we started roaming about. 1C warming per century is a rather quick pace. However, it is a pace we have seen before.
I don’t think 4C is a likely outcome. Time will tell.
It’s not likely only because we can avoid it, and there is an effort to. However, anything significantly less requires a reduction in global per capita CO2 emissions, which doesn’t happen automatically. It needs international agreements that work, and technological advances that should be encouraged too for everyone’s good. Not much to ask for, but a reality that is meeting surprisingly vicious resistance in some corners.
Jim D: The context is extinction and rapid warming. The scale is global. What do you have as an argument?
In the context of the “rapid” climate change since about 1880, there is much more evidence of benefit to the natural and farmed vegetation than there is of harm.
For a thorough review of the effects of the “rapid” increase in CO2 concentrations and some experiments with even more “rapid” doubling of CO2 concentrations in natural foliage, start here:
MM, I don’t know how you extrapolate from those types of studies to preferring a CO2-doubled climate over today’s climate. There’s a gap in your reasoning there. Why would you prefer 700 ppm to 350 ppm for example? What are the pluses and minuses?
Jim D: I don’t know how you extrapolate from those types of studies to preferring a CO2-doubled climate over today’s climate.
First let me remind you that, in the context of “extinction”, there is no evidence to support a worry that doubling the CO2 concentration to 800 ppm would cause extinction. The plants and animal species today are descended from the survivors of the earlier extinctions that you have worried about; and there hasn’t been evidence of harm to vegetation from increased CO2 concentrations since the late 19th century.
Second, some of the CO2 enrichment studies achieve quite high concentrations, e.g. > 550 ppm, double the pre-industrial estimate.
Third, my “preference” is probably irrelevant. There isn’t evidence to date that recent CO2 increases have been too extensive or too fast for the extant biota.
First, acknowledge that life on Earth thrives better at higher GMST than present – for example, when GMST was 26.5 C there were tropical forests from pole to pole, average tropical temperature was 32 C, GMST was 11.5 C higher than present, and higher than the present average tropical temperature. (more here: https://judithcurry.com/2018/10/11/climate-uncertainty-monster-whats-the-worst-case/#comment-882521 )
After you have acknowledged that this point is correct then we can discuss rate of temperature change, whether higher CO2 concentrations are better or worse for life, and causes of extinction events.
there is no evidence to support a worry that doubling the CO2 concentration to 800 ppm would cause extinction
On the contrary.
IPCC AR5 WG2 SPM A-1
MM, what we have since 1880 is 1 C in a century, mostly since 1950. This is a fraction of the change expected by 2100 under BAU and we already see the changes in extremes and other effects of warming (e.g. Zika, beetle-kill) that are only the start. If you really think 800 ppm is something to look forwards to and not mitigate against, that is an interesting perspective that we don’t hear much on these blogs. Someone should write a main post in favor of 800 ppm.
Rapid warming does cause extinctions and threatens life and livelihoods. Humans do better in temperate parts of the earth, both in lifespan and productivity. There is a negative correlation between mean temperature and both health and wealth. It is not a direction that favors humans, however good it is for citrus fruits and cacti.
I worked in forestry as a beetle surveyor. You are absolutely mistaken when attributing beetle kill to climate change. The number one cause of beetle mass attack is fire control.
Maybe you can get the Forest Service to correct their page then.
“Extended droughts, warm winters, and old, dense forests have enabled this epidemic to become vast.”
Yes, those factors weaken the trees, which have natural defences against attacks. However, what you don’t know is that beetles have a fly radius of about 50 meters unless they fly on a day with very high winds. Fires, which love beetle-killed trees, typically control the beetle populations by limiting their supply of weakened trees.
And the forestry doesn’t need to tell you these things for you to realize that it is extremely difficult to attribute these events to climate change. Localized forest management and weather patterns can easily cause the conditions needed to allow beetles free rein.
Here’s a hint: if you don’t know what you’re talking about, try to listen more.
It took about a minute of Googling to find the official Forest Service word on the subject. You need to be arguing with them.
Jim, I sat in a class and with my own two eyes looked at the data. If you want
Sorry, that comment was not meant to be cut short. I’ll try again. As I have looked at the historical beetle data (MPB specifically), I can tell you that the mass attacks we have experienced in the last 15 or so years have occurred before, and within the last century. It’s a misguided business making attribution claims without making the reader understand that both forest management and natural variability in short-term precipitation cycles are THE causes of the MPB issue. The forestry website will have no disagreement with what I’ve written here. Any attribution study which makes the type of claim Jim is making had better explain the challenges with asserting confidence.
The picture I get when reading some of the comments on this and other sites, is that many people want the climate system to be perfectly linear. They want cause and effect in simple to understand terms. Well it doesn’t work that way.
Jim, the very idea that you believe a weak SI should translate into instant cooling? It boggles the mind. We have to ignore the ocean altogether to make that leap.
I was listening to Professor Lindzen a few days ago and he made this clear: the earth’s climate is variable without any forcing whatsoever. The sun and the earth are all that’s needed to produce cooling and warming and natural disasters. So if we are experiencing an event which, outside of anthropogenic forcing, has family in the records, it becomes very difficult to make attribution claims. And these claims ought to be understood as speculative.
It’s not linear, but it’s long-term trends with noise added. Tomorrow is not warmer than today, and maybe not even next year, but decade on decade there is warming. This has impacts, one of which is milder winters that kill fewer pests. So, no, it is not linear, but what we have now will only get more frequent, making it worse, not better, in the long term regarding those forests. Same with storms, droughts, heatwaves, floods, coastal surges. These are long-term trends in frequencies, and not towards the better.
You did not respond to this; https://judithcurry.com/2018/10/11/climate-uncertainty-monster-whats-the-worst-case/#comment-882570 , so it seems fair to assume you recognise it is correct. Please acknowledge that you accept this is correct, so we can get that off the table and move on to the next issue.
verytallguy: IPCC AR5 WG2 SPM A-1
Well, maybe SPM qualifies as at least a little evidence, though it goes beyond what is supported by the technical sections. I think it is overwhelmed by the evidence in the list of Science articles that I provided, as well as recent reviews in Nature.
Peter Lang, it is incorrect for human life, which is best adapted to a temperate climate. Trees may be fine. Dinosaurs too, in your ideal world, but humans and mammals get very uncomfortable, unless you want to give them a million years to evolve adaptations to different climates the way nature does.
Jim, this statement, “humans and mammals get very uncomfortable…” This statement is unsupportable. You are talking about a human race with unparalleled adaptive technology. Amend your statement please.
Humans used to temperate climates would not want to live in the steamy Amazon jungle (or Eocene) climate by choice. Given many generations to physically adapt, maybe, but those first generations would be very uncomfortable and prefer to go back to a temperate climate. There are people living there, but they adapted over thousands of years, and that population is low because it is far from an ideal existence, and not one that is envied.
A warming world does not deliver the “steamy Amazon” to all life. The province I live in would see slightly less horrific winters. Where the heck is that steamy Amazon when we need it?
Peter Lang is talking more about the Cretaceous era if you check. The question I ask is whether today’s Holocene life and plants would do better in Cretaceous conditions if you immersed it there. What do you say?
Jim and Peter, I think I missed that comparison. I’m not really qualified to have an opinion on that question, but here is what I think; stress. A great many species would be stressed and many would not survive.
Saying that, I’m not sure why the question is asked. This scenario is not reasonable.
I don’t think you’re giving Peter the answer he wants.
Jim, giving Peter the answer he wants is irrelevant. Obviously life on this planet is not easily adaptive to 8 and 10C of warming. Did life thrive in those conditions? Yes.
We are talking about the viability of a small increase in surface temp. Which it seems a stretch to invite panic over.
4 C in a century is no small change, and that’s not even considering what happens to sea levels.
If 4C occurs, I will tip my hat to you. But you should at least try to understand that the oceans will take a bit longer to warm, on the order of one, two, or three thousand years.
For the last 30 or so years, the land has been warming twice as fast as the ocean which is almost twice as fast as the global average. So 4 C for a global average is much more for the land at this rate.
Just answer the question I asked without the continual diversions. We can discuss your diversions and qualifications after you answer the questions I asked.
Peter Lang, the question is far too vague. Define “life”. Do you mean bacteria, humans, trees, potatoes, ants, dinosaurs, mammoths? What exactly?
Multi-cell flora and fauna.
Still too vague, but now we’re getting somewhere. Today’s Holocene-adapted flora and fauna or some other form that doesn’t exist today?
It’s a simple question. Don’t try to complicate it. If you prefer, the question is about all life: at what temperatures does life on Earth thrive best? It’s a very simple and obvious answer. The paleoevidence over the Phanerozoic Eon is clear.
Peter Lang, I think you’ve figured out by now that life as we know it thrives in the Holocene climate and not in any vastly different one.
Stop trying to divert from the question I asked you. Just answer the question I asked. I didn’t ask about the Holocene; I asked about life over the Phanerozoic Eon. I asked at what GMST life has thrived best over that period. Answer that question! As I said, we’ll get to discussing all of your other points, one at a time, in an appropriate order. But you need to answer each question asked, not answer a different question than the one I asked, and not try to change the topic.
I answered, life as we know it does not thrive. I don’t know what point you want to make, but I have made the point I want to, so we can close it at that, and you lost your chance.
You have not answered my question. You keep throwing in diversion and caveats that are not included in my question. Answer my question. I’ll rephrase it to make it clearer for you:
Do you agree that life on Earth thrived much better when GMST was up to 13C warmer than now?
I am trying to have a rational discussion. That means we resolve each issue before discussing a new one.
BTW, Jim D, you have not made any point. I have not responded to your many diversions and I won’t until you have answered my question.
What you have done by continually avoiding the question and changing the subject is demonstrated denial in spades.
I will give you two answers.
1. Holocene life as we know it would not survive too well under those conditions.
2. Given a million years to adapt/evolve, life of some sort can thrive under a range of conditions on earth.
Does life thrive in the ocean? Yes. Does that mean we would want to live there? No. It’s about adaptation and evolution. A point completely missed in your over-simple question.
Jim D: Why would you prefer 700 ppm to 350 ppm for example?
Interesting question. On evidence to date, why would anyone prefer 280 ppm to 400 ppm? Do you?
Experiments with CO2 enrichment show that natural and artificially bred vegetation grows better at the higher CO2 concentrations. Evidence to date shows little harm and much benefit in the change from 280 to 400 ppm.
MM, I may prefer 350 ppm to 280 ppm, but >400 ppm pushes tipping points like Greenland’s glacier (as we see with its melt rate becoming significant for sea level rise), and the further beyond that the worse it gets. However 400 ppm corresponds to the 1.5 degree criterion. Perhaps an upper limit for stability and what we would call normal.
Jim D: MM, I may prefer 350 ppm to 280 ppm,
Do you? You reference all changes to the 280 ppm climate instead of the 350 ppm climate.
Yes, the reference is the pre-anthropogenic climate because that is easy to define objectively. Any later reference is a moving target and doesn’t refer to an equilibrium situation. Everyone would have their own either based on temperature or CO2. So 350 ppm corresponds to ~1 C at equilibrium, 400 ppm is ~1.5 C, 450 ppm is ~2 C. Take your pick.
Peter Lang, your question is far too vague and introduces a whole range of questions for clarification which you don’t want to answer. What exactly do you mean by “life” and what do you mean by “thrives”?
Do you mean Holocene life as we know it now? The answer is definitely not.
Do you mean after millions of years of evolution and adaptation in a steady climate? The answer is yes, and not just that climate but a wide range of cooler ones too including the current one. Life adapts to any climate given a million years and reasonable steadiness. Fast changes introduce calamities for life, aka extinctions.
Just answer the question. It’s perfectly clear and I’ve already addressed all your attempts to divert and change the question. As I’ve already said in reply to your numerous attempts to divert and change the question, it’s not about the Holocene, it’s about the Phanerozoic; and it’s not about CO2, nor rate of temperature change, nor adaptation or evolution.
Did life on Earth thrive better at higher GMST than now? (Compare the Cambrian Explosion, Jurassic, Cretaceous, Eocene thermal maximum, and Miocene warm period versus periods much colder than now, e.g. the Silurian ice age, Permian-Carboniferous ice ages, Little Ice Age, and LGM.)
No, it did not thrive better than now. Life is adapted to the current climate and thriving just fine thank you. How can something thrive better? What do you even mean by that? Either it lives or dies. The Triassic species would not like our climate at all, and we would not like theirs. Species thrive when they are adapted and to some extent evolved to their climate.
Jim, you are using circular arguments. It is obvious to most people on our planet that warmer locations are more habitable. Mann will disagree, but he always disagrees.
You mean like India, the Amazon, Sahara, and central Africa? Do you prefer those climates to Europe and the US? Is immigration in that direction?
That’s contrary to all the evidence.
There is 50% more carbon tied up in the biosphere now than at the LGM.
The planet has been greening over the past 100 years and past 20 years. I.e. life thrives more as GMST increases.
There were tropical forests from pole to pole at the Eocene Thermal Maximum (GMST 13C warmer than now). Clearly there would have been much more C tied up in the biosphere then than now.
Life thrived much better in each warm period than each cold period.
Your assertion is a clear case of denial of the evidence.
Thanks for clearly demonstrating your agenda is to push climate alarmist ideology, not participate in rational discussion, never admit you are wrong, and ignore the relevant facts.
Peter Lang, you have your own personal definition of “thriving” that does not include survivability of our current species in those conditions. You prefer Triassic life to Holocene life and maybe would prefer to live there yourself. Fine, that’s your opinion. I don’t think I can change your mind. As we head towards a reduced number of species with the forthcoming fast climate change, things are not going to be as good as we have had in the Holocene. This is one of the more cited works on the coming extinction. Thriving – not so much.
Still not answering the question, and trying to divert back to extinctions again.
Biomass density (tC/ha) ~10 times higher in tropical rainforests than extratropical.
Total carbon tied up in the biomass during the Eocene Thermal maximum would have been around 5 times more than now.
Life thrives better when warmer, and struggles when colder. The direction of the trend is clear.
Your peculiar definition of thriving=biomass that you just seem to have invented is not very useful. Do you prefer to live where there is more biomass like the middle of tropical forests? What is so wrong with croplands and grasslands? Here on earth, the tropics are the least healthy places for humans.
Still haven’t answered the question. Anything but. Denial on steroids.
“Gillman et al. (2015) Latitude, productivity and species richness
Excerpt from Abstract:
The answer to my question is clearly yes. Life thrives better when GMST is higher.
My answer was no and you saw it. The Holocene is better for humans to thrive in than the Triassic. You seem to care more about the number of tree varieties than the health of our species in your rather plant-centric definition of thriving, a definition that has no relevance to actual people.
You keep trying to put words in my mouth that I did not say. Please stop that.
I accept that you believe warmer is not better, and any warming that occurs this century will be damaging.
Do you therefore believe that cooling would be beneficial?
Or do you believe the current GMST is perfect for life on Earth?
Your reasoning is equivalent to believing we are the centre of the universe because we are here.
Once again, I believe a stable climate is good for life and a rapidly changing one is not.
As I told you repeatedly, the question I asked is not about rates of change. Re-read the question and my answers to all your attempts to divert and avoid answering the question I asked.
Above you said:
So, contrary to the unequivocal evidence that life on Earth thrives better (i.e. more productive and more tC/ha) at higher GMST than now, you do not accept the evidence and prefer to believe otherwise, right?
Is that a clear sign of denial?
Thriving and extinction are opposites, and what extinctions have in common is a rapidly changing climate. Not talking about conditions that lead to extinction is missing that whole side of the coin, and makes any definition of what it takes to thrive incomplete to say the least. You prefer conditions with more tree varieties and that’s fine as far as it goes, which is not very far when we talk about humans and climate change.
As I’ve said repeatedly, we need to resolve one point at a time. We’ll discuss all your other points, but not all at once. First you need to acknowledge that life thrives better when the planet is warmer (thrives means more biosphere productivity and more C tied up in the biosphere).
Please accept this point as correct, before we move on.
I think we have settled that we mean very different things by life thrives, and I am not accepting your simplistic definition so it makes it pointless to answer.
It’s not a simplistic definition and you understand perfectly well what it means. In contrast your comments are loaded with undefined terms, ambiguous meanings, diversions, and dodges – anything to avoid admitting you are wrong. Dead wrong!
BTW, your comment that thriving and extinctions are opposites is incorrect. Thriving (i.e. high biosphere productivity) occurs during warm periods, whereas low biosphere productivity occurs during cold periods, such as the icehouse the world is in now.
Just answer the question, Jim D, then we can get to your other points.
Biodiversity only occurs when you have a stable climate for long enough. But, biodiversity such as occurs even today in tropical forests should not be conflated with conditions that favor humans. In fact it is the opposite, because humans replace biodiversity with monocultures at every opportunity.
Your comment is irrelevant to the question I asked. Again you have demonstrated your inability to admit when you are wrong, your habit of changing the subject instead, and your denial of relevant facts that don’t support your beliefs.
The evidence clearly shows that biosphere productivity increases as temperature increases – from snowball earth, to icehouse to hothouse conditions (GMST approx. 28C). That’s the point you keep dodging.
The mass of carbon tied up in the biosphere increased by about 50% from the LGM to pre-industrial times, and was about 5 times higher than pre-industrial during the Eocene Optimum (50 Ma ago). That is, carbon mass in the biosphere increased about 50% as GMST increased by 5.5C, and was about 5 times higher when GMST was about 12.5C higher than now.
Do you accept or deny these facts, Jim D?
You are suggesting that I am denying that there was a lot of biomass in distant past hothouse conditions, when I am only saying “so what” in various ways. Do you prefer to live in a hothouse? What is the point of all this?
As I’ve said repeatedly, you need to accept or refute (with evidence) that life thrives better (i.e. more C tied up in the biosphere) in warmer times than now (GMSTs previously stated), before we move on to discussing your other points.
Equating life thrives to more biomass is your assumption. No one else equates those things. Life thrives now too. What do you think is the thrivingest period? It’s subjective. If you’re a dinosaur it may indeed be the hothouse Jurassic. As a person it is the relatively temperate Holocene where we thrive most.
I disagree that you have to have more biomass to thrive more, which is why the Holocene thrives too. There are forests in the tropics in the Holocene, just a smaller area. Our forests have birds when the Jurassic forests didn’t. We have grasslands and savannahs that allowed for human evolution, when the Jurassic had no such landscape. How do you judge which one is thriving more? Just quantity of biomass? Really? Too simplistic as I have said before. Doesn’t the Holocene meet your definition of life thriving?
My answer is that more carbon tied up does not equate to more thriving. You can see that because life thrives now too with less carbon taken up in biomass. The very idea that life thrives now too, which you don’t dispute, defeats your whole concept.
Holocene life thrives best in the Holocene climate, as I keep telling you.
Do you agree that biosphere productivity was higher when GMST was higher than now and lower when GMST was less than now (GMST’s previously stated); and also, on Earth now, biosphere productivity is higher where temperatures are higher (i.e. in tropics compared with extra-tropics)?
In accordance with rules of rational discussion, we need to resolve this point before we move on to discussing other points.
“biosphere productivity” is not a common term. Reference what you are talking about.
I referenced it in previous comments. Go back and read them and the links.
It seems it would be largely determined by the global mass/area of forests regardless of other life. Is that your take on it too?
You didn’t even know what the term meant until half an hour ago. So you’ve been making off topic replies to me all through this thread without a clue what you are talking about. What a waste of time!
Your reference before spoke about the importance of pole-to-pole trees in the biomass measure, so your optimal climate is essentially the optimal climate for trees, right? They like hothouse climates more. When you take biomass as a measure it skews to trees for obvious reasons. The Holocene has fewer trees because of grasslands and tundras. Is it any worse for it? No.
People might be interested in my latest post at astroclimateconnection.blogpost.com:
My prediction is for an El Nino starting sometime around July 2019, followed by a La Nina in 2021. I realize that I am sticking my neck out here, but someone has to step forward.
Why have you not brought your graphs up through 2017?
Climate Frankenstein: what’s the worst case?
Climate cools despite continued increase in atmospheric CO2. People realize the GW catastrophe is a hoax. GW funding decreases by 90%. Green politicians lose in elections. Al Gore and company are regarded as charlatans. The deniers play the violin while Climate Frankenstein wreaks havoc worldwide (or is it the best case?)
Judith: You start your thinking with the no-feedbacks climate sensitivity and add positive feedbacks to get likely ECS, worst-case ECS and impossible ECS. I preferred to think in terms of a climate feedback parameter (how many W/m2/K more heat is emitted (OLR) and reflected (OSR) as the planet warms) and use 0 W/m2/K as my starting place. 0 W/m2/K means no change in OLR+OSR with temperature, which would permit a runaway greenhouse or icehouse. Since our planet hasn’t recently experienced such events, we know that our planet’s climate feedback parameter is not 0.
However, since we currently are oscillating between glacials and interglacials without any significant external forcing, it may be that our planet has a climate feedback parameter fairly near zero between the present temperature and about 6 K colder at the LGM, and a negative climate feedback parameter outside this range.
The climate feedback parameter is the sum of the Planck feedback (-3.2 W/m2/K) and all other feedbacks. However, on the multi-millennial time scale there are two additional feedbacks that we normally don’t consider: out-gassing of CO2 from the ocean and slow surface albedo feedbacks. They contribute to climate instability on the multi-millennial time scale, but act too slowly to contribute much to ECS. Therefore we usually don’t call them feedbacks. However, they are critical for the long-term stability of climate.
Outgassing of CO2 from the ocean resulted in a net increase from about 180 ppm to 280 ppm, about 64% of a doubling, or a forcing of 2.3 W/m2. If we express this as a very slow feedback response to 6 K of warming (which it is), that is about +0.4 W/m2/K of feedback. If our planet is going to avoid a runaway greenhouse/icehouse on the millennial time scale, the sum of the fast feedbacks including Planck feedback must be more negative than -0.40 W/m2/K.
If I remember Hansen’s work correctly, surface albedo changes due to the melting of ice caps produce an even larger forcing than the outgassing of CO2 from the ocean. As the planet warms and ice caps retreat to higher latitudes with less land surface area, the change in reflection of SWR by these ice caps decreases. IIRC, the total change is on the order of 3 W/m2, or an average change of +0.5 W/m2/K. But this change in surface albedo will be larger when it is colder, perhaps +0.8 W/m2/K. Now the fast feedbacks that contribute to ECS must be more negative than -2.0 W/m2/K (ECS = 1.8 K). Otherwise the combination of these very slow feedbacks plus the faster feedbacks that contribute to ECS will produce an unstable climate in the long run.
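The slow-feedback arithmetic in the two paragraphs above can be sketched as follows. The simplified CO2 forcing formula F = 5.35*ln(C/C0) is my assumption (a standard approximation, not stated in the comment); the ~6 K glacial-interglacial warming and ~3 W/m2 albedo change are the comment's numbers.

```python
import math

# Sketch: slow feedbacks from CO2 outgassing and ice-cap albedo, expressed
# per K of glacial-interglacial warming. Illustrative, not a model.

F_co2 = 5.35 * math.log(280.0 / 180.0)   # forcing from 180 -> 280 ppm, ~2.4 W/m2
frac_doubling = math.log(280.0 / 180.0) / math.log(2.0)  # ~64% of a doubling
co2_feedback = F_co2 / 6.0               # spread over ~6 K: ~0.4 W/m2/K
albedo_feedback = 3.0 / 6.0              # ice-cap albedo change: ~0.5 W/m2/K

print(round(F_co2, 2), round(frac_doubling, 2),
      round(co2_feedback, 2), round(albedo_feedback, 2))
```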
Climate scientists typically call the very slow changes positive forcing rather than positive feedback, because the radiative imbalance can approach zero and the temperature stabilize in a few centuries, long before these processes are complete. If a worst case postulates that ECS is 7 K per doubling, and the sum of Planck and fast feedbacks is therefore -0.5 W/m2/K (on the century time scale), I’m going to say that slow outgassing of CO2 and melting of ice caps would produce an unstable climate. A climate feedback parameter of -1.0 W/m2/K (ECS 3.5 K/doubling) seems like a reasonable upper limit.
Since climate has been oscillating between similar glacial cold and similar interglacial warmth for many cycles, it seems to me that the climate feedback parameter likely becomes more negative outside of this range, making it harder for temperature to change. If it gets too cold and too much CO2 disappears into the ocean, plants may stop thriving and dust (a negative feedback) will cover the ice caps.
If we look at the Eemian, we see high temperatures over the early interglacial followed by an increase in CO2. Then we see a relatively sharp decrease in temperature, with ice area still retreating over centuries. There must be many more unknown “loops” or feedbacks in the dynamic climate system that cannot be resolved with some W/m² calculations.
Frank, I agree that the long-term feedbacks aren’t included in ECS. At the end of the day, it has to be the three phases of water and CO2 that are stabilizing the climate within certain limits. In some ways, the long-term water-related feedbacks (mainly albedo) are easier to understand and quantify than the short-term climate feedbacks associated with water vapor, clouds, and sea ice.
I would think that atmospheric circulation also plays a role in “stabilizing the climate within certain limits” although I am not sure what that expression means. Do you mean keeping the climate within certain limits?
This reminds me of a favorite climate joke:
The stranger asks the farmer “Think it will rain?”
The farmer replies “It always has.”
But given that climate is chaotic the term “stabilizing” might be questionable. On the other hand, chaos is a powerful form of stability, the price of which is unpredictable oscillation. Sounds like climate.
Judith: Thanks for the reply. In case my point was lost in too many words, I’ll restate it. The climate feedback parameter including slow feedbacks from melting ice caps and CO2 outgassing must be at least slightly negative, but could be near zero between glacial and interglacial states. These slow feedbacks total about +1 W/m2/K. The climate feedback parameter without these slow feedbacks determines ECS. It must be more negative than about -1 W/m2/K, meaning ECS can’t be greater than about 3.5 K.
The fact that glacials and interglacials oscillate between the current temperature and about 6 K colder suggests that the climate feedback parameter could be very low when these very slow feedbacks are included. Outside this temperature range, the climate feedback parameter likely becomes more negative (stabilizing). Otherwise glacial terminations would tend to overshoot today’s temperature, and glacial inceptions would eventually create a snowball Earth. The fact that we avoid these extremes implies that the climate feedback parameter including these slow feedbacks is not near zero when the temperature is warmer than present.
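The feedback-parameter-to-ECS conversion used in this exchange can be sketched as follows. This assumes the standard linear relation ECS = F_2x/|lambda|, with F_2x ≈ 3.7 W/m2 for doubled CO2 (my assumed value, not stated in the comments, which is why it gives ~3.7 K rather than the comment's ~3.5 K for -1 W/m2/K).

```python
# Sketch: convert a climate feedback parameter (lambda, W/m2/K) to ECS
# (K per CO2 doubling), assuming ECS = F_2x / |lambda|.

F_2X = 3.7  # W/m2, assumed radiative forcing from doubling CO2

def ecs_from_feedback(lam):
    """ECS in K per doubling for climate feedback parameter lam (W/m2/K)."""
    if lam >= 0:
        raise ValueError("non-negative feedback parameter implies no stable response")
    return F_2X / abs(lam)

# Planck-only feedback (-3.2 W/m2/K) gives the no-feedbacks sensitivity;
# -1 W/m2/K is the upper limit argued above.
for lam in (-3.2, -2.0, -1.0):
    print(lam, round(ecs_from_feedback(lam), 2))
```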
Dr. Curry, there is real science backing the greenhouse gas effect, and the IPCC covers none of it. The question needs to be asked: why are the models so wrong? If I were going to model weight loss, and the variables I included in the model were the number of pens in the home, the area of the driveway, and the distance one lives from Florida, I would expect my model to produce results like the IPCC models. If you don’t include and weight the correct factors, you will never develop an accurate model.

Everything regarding CO2 and its impact on global climate must revert back to CO2’s one and only mechanism for affecting climate change: the thermalization of LWIR between 13 and 18 microns. That is the only defined mechanism for CO2, and therefore every observation must be explained in that context. LWIR between 13 and 18 microns won’t melt ice or warm water; in fact ice emits LWIR of around 10 microns, and very cold ice will emit LWIR between 13 and 18 microns. There is also only a limited amount of LWIR between 13 and 18 microns emitted by the earth, and 100% of it is “trapped” by CO2 and H2O by an altitude of 5 feet. More CO2 would simply lower that level slightly towards the surface, but with the convection of the lower atmosphere, that becomes meaningless. Simply use MODTRAN, double CO2 in the atmosphere, and measure the impact on the lowest 0.1 km of the atmosphere. It has no impact. None, nada, zip.

Focus on the significant factors. The oceans clearly dominate the climate, the oceans are warming, and the oceans aren’t warmed by LWIR between 13 and 18 microns. What is warming the oceans is warming the climate, and what is warming the oceans is more incoming visible light, especially on the blue end of the spectrum. To warm the oceans you need fewer clouds over the oceans, or a hotter sun, or both. The IPCC doesn’t focus on those factors, so they will never have an accurate model.
CO2islife also incorrectly writes: “There is also only a limited amount of LWIR between 13 and 18 microns emitted by the earth and 100% of it is “trapped” by CO2 and H2O by the altitude of 5 feet.”
“Trapping” is an invalid term foisted by alarmists to convince a naive public they understand the GHE and enhanced GHE. It is 100% incorrect. GHGs both emit and absorb thermal IR. When the density of GHGs is high (near the surface), absorption and emission are in equilibrium at wavelengths that are “trapped” within “5 feet” or even 500 feet. Planck proved that radiation in equilibrium with quantized oscillators will have blackbody intensity given by B(T). The vibrational and rotational states of GHGs are quantized oscillators. So, after traveling 5 feet upward, all of the photons absorbed by GHGs have been replaced by new photons emitted by GHGs. The intensity remains that of a blackbody at the local temperature. After traveling upward 5,000 feet, the local temperature is about 10 degC colder, so the blackbody intensity has dropped by a factor of (278/288)^4, to 87% of what it was. And the amount of water vapor is dropping rapidly. By the time upward radiation reaches the tropopause, there is almost no water vapor (3 ppm instead of thousands of ppm), the density of CO2 has dropped by 95%, and absorption lines have narrowed (less Doppler and collision broadening). Absorption and emission are no longer in equilibrium, and photons are escaping to space through formerly opaque wavelengths. And the remaining GHGs at the tropopause are 50-100 degrees colder than at the surface and emit about half as much thermal IR as they did at the surface. That is why you see a notch in OLR at the wavelengths strongly absorbed by CO2, and emission of OLR at wavelengths strongly absorbed by H2O (which is depleted far below the tropopause).
If you have used a laboratory IR spectrophotometer, you may feel that samples (like CO2) only absorb incoming IR and do not emit it. This is an illusion created by using a hot filament at several thousand degK as a source of incoming radiation. The incoming intensity of light used by the spectrophotometer is perhaps 10^4 times greater than the emission from the much cooler sample, so the sample’s emission is negligible under these circumstances.
Many of us learn about Planck’s Law – which only applies when absorption and emission are equal – and Beer’s Law – which only applies when emission is negligible. In the atmosphere, emission is never negligible, and we often don’t have equilibrium between absorption and emission – because radiation travels to different temperatures before absorption and emission come into equilibrium. In the atmospheric window, radiation passes through the atmosphere unchanged and has blackbody intensity appropriate for the surface temperature where it was emitted. In a non-scattering troposphere and lower stratosphere, Schwarzschild’s equation – which is rarely taught – is valid. The incremental change in radiation intensity at a particular wavelength, dI, when passing an incremental distance, ds, through the atmosphere is given by:
dI = n*o*B(T)*ds – n*o*I*ds = n*o*[B(T) – I]*ds
where n is the density of GHG, o is its absorption cross-section at a given wavelength, B(T) is the Planck function at that wavelength and local temperature and I is the intensity of the radiation entering the increment ds. The first term is emission from the increment and the second is absorption by the increment. When absorption equals emission, I = B(T); Planck’s Law. When the incoming radiation is much greater than the local B(T), neglecting the B(T) term gives Beer’s Law.
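The behavior this equation implies – that after a few optical depths the beam "forgets" its initial intensity and carries blackbody intensity B(T) – can be sketched with a simple numerical integration. This is my illustrative sketch of the equation above, not a radiative-transfer model; the 15 um wavelength and the unit optical-depth coefficient are assumptions.

```python
import math

# Sketch: integrate Schwarzschild's equation dI/ds = n*o*[B(T) - I]
# upward through an isothermal, non-scattering layer by forward Euler.

def planck(wavelength_m, T):
    """Planck spectral radiance B(T) at one wavelength (W m-2 sr-1 m-1)."""
    h, c, k = 6.626e-34, 2.998e8, 1.381e-23
    return (2 * h * c**2 / wavelength_m**5) / math.expm1(h * c / (wavelength_m * k * T))

def integrate_up(I0, T, n_sigma, total_s, steps=10000):
    """Forward-Euler integration of dI/ds = n_sigma * (B(T) - I)."""
    wl = 15e-6  # 15 um, center of the CO2 bending band (assumed)
    B = planck(wl, T)
    I, ds = I0, total_s / steps
    for _ in range(steps):
        I += n_sigma * (B - I) * ds
    return I, B

# Start with zero upward radiance; after optical depth n_sigma*s = 10,
# emission has replaced every absorbed photon and I is essentially B(T).
I, B = integrate_up(I0=0.0, T=288.0, n_sigma=1.0, total_s=10.0)
print(I / B)  # very close to 1
```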
Please learn the correct physics that describes how thermal IR interacts with the atmosphere. “Trapping” within “5 feet” isn’t appropriate for this blog.
That is why I put “trapping” in quotations. CO2 absorbs LWIR between 13 to 15 Microns. This absorption of a photon puts the molecule in a higher energy vibrational state. The electron then falls back to a lower energy state releasing the photon. Those dynamics are played out by an altitude of 5 ft. Doubling CO2 marginally decreases the level at which 100% is absorbed. At least that is what a gas cell demonstrates. Are you disagreeing with That? If you are, please tell me how, when ice emits LWIR of 10 micron, 13 to 18 micron can melt ice?
CO2islife incorrectly writes: “CO2 absorbs LWIR between 13 to 15 Microns. This absorption of a photon puts the molecule in a higher energy vibrational state. The electron then falls back to a lower energy state releasing the photon.”
The 13-15 um absorption for CO2 does not involve an electron or excited electronic state. It involves the first excited vibrational (bending) state of CO2 coupled with many different rotational states. 99+% of these excited states are produced by molecular collisions, and 99+% of them are relaxed (returned to the ground vibrational state) by collisions, NOT by emitting a photon. For this reason, the number of CO2 molecules in this excited state capable of emitting a photon – and therefore their emission rate – depends only on the local temperature via the Boltzmann distribution. (IIRC, it takes an average of 1 second for this excited state to emit a photon, and collisions occur about once a nanosecond near the surface.) When collisions redistribute energy among kinetic energy and molecular excited states (rotational, vibrational and electronic) much faster than radiation (or any other process) transfers energy in and out, we say that a local thermodynamic equilibrium (LTE) exists and use the Boltzmann distribution to predict the fraction of molecules in an excited state. Derivation of Planck’s Law and the Schwarzschild equation begins with this assumption.
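The Boltzmann fraction mentioned above can be sketched as follows. The ~667 cm^-1 wavenumber for the CO2 bending mode is my assumed value (it corresponds to the 15 um band); degeneracy and rotational structure are ignored, so the numbers are only illustrative of the temperature dependence.

```python
import math

# Sketch: Boltzmann population factor exp(-E/kT) for the first excited
# CO2 bending state (~667 cm^-1) under LTE. Illustrative only.

def boltzmann_fraction(wavenumber_cm, T):
    """exp(-E/kT) for a vibrational level at the given wavenumber (cm^-1)."""
    h, c, k = 6.626e-34, 2.998e10, 1.381e-23  # c in cm/s to match cm^-1
    E = h * c * wavenumber_cm
    return math.exp(-E / (k * T))

print(boltzmann_fraction(667.0, 288.0))  # a few percent near the surface
print(boltzmann_fraction(667.0, 220.0))  # smaller near the cold tropopause
```

The point of the comment survives the simplification: the emitting population, and hence the emission rate, tracks the local temperature, not the photon flux being absorbed.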
If you want to know the truth about how radiation interacts with the atmosphere, I recommend Grant Petty’s “A First Course in Atmospheric Radiation” for about $40. It is intended for meteorologists and says nothing about GHG-mediated climate change.
CO2islife writes: “Those dynamics are played out by an altitude of 5 ft. Doubling CO2 marginally decreases the level at which 100% is absorbed. At least that is what a gas cell demonstrates.”
Absolutely wrong. A gas cell means you are thinking in terms of a laboratory spectrophotometer, which uses a very strong source of light to overwhelm the thermal IR that fills the atmosphere: W = eoT^4. A filament at several thousand degK puts out so much radiation that it overwhelms the thermal IR being emitted by everything else in the laboratory, including your sample.
The surface and the atmosphere near the surface are approximately the same temperature. CO2 in the atmosphere “shines” or emits just as strongly as the surface at 13-18 um. After radiation from the surface has traveled 5 feet or 500 feet upward through 1X or 2X CO2 in the atmosphere, the intensity in the 13-18 um band is UNCHANGED: For every photon absorbed another one has been emitted. To understand, go to this website which performs calculations using Schwarzschild’s equation for radiative transfer of energy in our atmosphere:
Choose the 1976 US Standard Atmosphere, no clouds or rain, and “looking up” from 0 km. The intensity from 13-18 um has the appropriate intensity for a blackbody at about 288 K. Now look up from 3 km above the surface, where the temperature is about 20 degK colder. The radiation in this band has the appropriate intensity for a blackbody at 268 K. Look up from 6 km; intensity for 248 K. Look up from 9 km; intensity for 228 K. But notice that the band has now narrowed substantially. With atmospheric pressure and CO2 down by a factor of about ten, the weaker lines on the sides are no longer emitting at blackbody intensity. Continue the process. Note that the temperature in the US Standard Atmosphere stops dropping at 11 km and starts rising at 20 km. Change the amount of CO2 in the atmosphere if you want.
Repeat looking at water vapor absorption from 5-8 um and above 20 um. They change more rapidly because water vapor drops from about 1000 ppm near the surface to 3 ppm at the tropopause.
There is relatively little energy arriving in the atmospheric window from 8-13 um. (At these wavelengths, you should be looking straight through the atmosphere to empty space at 3 K, but scattering or some other phenomenon is providing a little radiation.)
You can also look down from various altitudes, but looking down, you can’t distinguish between radiation emitted by the surface that is 100% transmitted, and radiation that is 100% absorbed and completely replaced with new photons emitted by the atmosphere. When you look up, you know that what you are seeing was emitted by the atmosphere, not the empty space above.
(Please don’t give me any BS about the 2LoT, which applies to the net flux of radiation in both directions. The downward flux from the colder upper atmosphere has no way of “knowing” it is traveling through a warmer atmosphere or heading for cold space.)
CO2islife asks: “please tell me how, when ice emits LWIR of 10 micron, 13 to 18 micron can melt ice?”
Whenever radiation of any wavelength is absorbed, the resulting excited state is relaxed by collisions with neighboring molecules and becomes part of the local kinetic energy. The average kinetic energy of a group of molecules is proportional to its temperature. Many call this process “thermalization”. In almost all cases, excited states are relaxed by collisions much faster than they emit a photon. For CO2 in the atmosphere near the surface, a CO2 molecule collides with another molecule about once a nanosecond (but not all collisions cause relaxation). The 15 um excited state takes an average of 1 second to emit a photon. 99+% of excited CO2 molecules are created by collisional excitation, not by absorbing a photon. In the troposphere and stratosphere, re-emission of an absorbed photon without collisional relaxation and collisional re-excitation is negligible. Emission is determined by the local temperature, not the number of photons being absorbed. If iron is heated until it is glowing red hot, it emits the same amount of visible radiation whether you observe it at night (few visible photons) or in daylight.
Once radiation has been absorbed, it quickly interconverts by collisions between translational (kinetic) energy and other forms of internal energy (quantized rotational, vibrational and electronic excited states). That energy no longer knows whether it arrived from a 10 um or 15 um photon, or by conduction (collisions) or convection. If more energy arrives by ALL routes than leaves by ALL routes, then temperature rises. If we are talking about ice at 0 degC, the extra energy goes into melting. Absorptivity is always equal to emissivity at a given wavelength. For ice, both are high for thermal infrared radiation and low for visible radiation. (Nothing in the environment is hot enough to emit a significant amount of blackbody radiation at visible wavelengths, but if ice could be heated to several thousand degK and still remain ice, its emissivity would be low.)
Consider the top 10 um of the ocean: It is losing energy by emitting thermal IR (on average 390 W/m2), by evaporation (80 W/m2), and by sensible heat/collisions with molecules in the air (20? W/m2), and gaining energy by absorbing DLR (333 W/m2). During daytime, a small fraction of incoming SWR is absorbed by this layer, but most of it ends up deeper in the ocean. The heat from SWR absorbed deeper in the ocean is conducted and convected to the top 10 um. The surface of the ocean cools only about 1 degC at night, so heat transfer by conduction and convection is very efficient. If no SWR is arriving (the current situation near the North Pole), areas of open water are cooling and freezing, because absorption of DLR is not enough to compensate for the loss by emission and evaporation.
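The surface-layer budget described above can be tallied directly; the flux values are the comment's global-average numbers (positive = energy gained by the layer):

```python
# Sketch: non-solar energy budget of the ocean's top ~10 um skin layer,
# using the flux values quoted in the comment (W/m2).

fluxes = {
    "thermal IR emitted":  -390.0,
    "evaporation":          -80.0,
    "sensible heat":        -20.0,
    "DLR absorbed":        +333.0,
}
net_nonsolar = sum(fluxes.values())
print(net_nonsolar)  # -157: the average deficit that absorbed SWR must make up
```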
Although coherent light exhibits constructive and destructive interference under special circumstances (a double-slit experiment, for example), the radiation traveling through the atmosphere is not coherent. If we think of light as photons, streams of photons easily pass through each other without annihilating each other. (When massive particles collide and annihilate, they produce photons, but massless photons don’t collide and turn into something different in our atmosphere.) In order to satisfy a perverted form of the 2LoT, some skeptics have devised a universe where radiation traveling through the atmosphere in opposite directions cancels, but predict the same net energy transfer as without cancellation. It turns out that photons and other fundamental particles obey the laws of quantum mechanics, not the 2LoT. A thermodynamic concept of temperature doesn’t even exist in the absence of a large group of rapidly colliding molecules. We can talk about faster- and slower-moving molecules, but there is no such thing as a hotter or colder single molecule. With a Boltzmann distribution of molecular speeds, the kinetic energy one particular molecule has for a nanosecond between collisions has no direct bearing on the local temperature of the group of molecules it is colliding with. And slower-moving molecules collide with and transfer kinetic energy to faster-moving molecules all the time – otherwise a Boltzmann distribution of molecular speeds would not exist. Collisions do transfer kinetic energy from faster to slower more often than slower to faster. A photon emitted by one molecule and absorbed by another has no way of knowing whether it is traveling from hot to cold or vice versa, when the instantaneous kinetic energy of both molecules does not depend on the local temperature. The 2LoT is a CONSEQUENCE of the behavior of a large group of rapidly colliding molecules following the laws of quantum mechanics.
This subject is taught in a branch of physics and chemistry called statistical mechanics.
I am still 100% positive about my solar/geomagnetic/climate connections, but the problem is: what are the threshold levels needed? What magnitude of change, and how long a duration?
I think as far as the solar component, solar readings which occur during a typical solar minimum are adequate as long as the duration is much longer.
As far as the geomagnetic field, I think it has to fall more from here, and the magnetic poles have to move more toward the equator. I guess this can all happen sooner rather than later if the rate of weakening of the field is maintained or accelerates.
The thing that is disturbing is how the overall oceanic sea surface temperatures recently rose, in a matter of weeks, from a deviation of around +0.15C above the 1981-2010 means to almost +0.40C. I think this is a blip, because I cannot see it being sustained in the face of weakening UV light and increasing galactic cosmic rays, which should translate to an increase in global cloud coverage, more major geological activity, and lower overall oceanic sea surface temperatures. It has to happen; it is just a question of when. It looked like it was happening until this recent blip, which is a complete surprise to me.
In any event what I am looking for is overall oceanic sea surface temperatures to start a downward trend again, an increase in global snow and cloud coverage, a more meridional atmospheric circulation , and an increase in major geological activity. If this happens global temperatures continue the trend down.
I have maintained this year is the transitional year, but maybe I am early. I still think it will happen this year. I do know that if these climatic factors I mentioned above do not materialize into global cooling (now to the next few years), the global cooling is not going to occur, and this will keep the AGW agenda alive.
The only way to end AGW theory is for global temperatures to continue to drop, and this should happen in the next few years in the face of both weakening solar and geomagnetic fields. If not now, then when?
So we will know very soon.
As far as El Nino, it is less likely today than it was a week or two ago.
The atmosphere above the eastern and central Pacific is a window for changes in planetary energy dynamics: more high-albedo closed-cell cloud over cool upwelling water, and less over warm.
“Unlike El Niño and La Niña, which may occur every 3 to 7 years and last from 6 to 18 months, the PDO can remain in the same phase for 20 to 30 years. The shift in the PDO can have significant implications for global climate, affecting Pacific and Atlantic hurricane activity, droughts and flooding around the Pacific basin, the productivity of marine ecosystems, and global land temperature patterns. This multi-year Pacific Decadal Oscillation ‘cool’ trend can intensify La Niña or diminish El Niño impacts around the Pacific basin,” said Bill Patzert, an oceanographer and climatologist at NASA’s Jet Propulsion Laboratory, Pasadena, Calif. “The persistence of this large-scale pattern [in 2008] tells us there is much more than an isolated La Niña occurring in the Pacific Ocean.”
Natural, large-scale climate patterns like the PDO and El Niño-La Niña are superimposed on global warming caused by increasing concentrations of greenhouse gases and landscape changes like deforestation. According to Josh Willis, JPL oceanographer and climate scientist, “These natural climate phenomena can sometimes hide global warming caused by human activities. Or they can have the opposite effect of accentuating it.”
While I quote Josh Willis and the recently retired Bill Patzert – it is not a startling revelation. The window – and its reflective blinds of low-level marine strato-cumulus cloud – modulates energy flows and stores over years to millennia.
The process involves Rayleigh-Bénard convection, seen in the atmosphere in the space age.
January 31, 2008:
From your article:
Unlike El Niño and La Niña, which may occur every 3 to 7 years and last from 6 to 18 months, the PDO can remain in the same phase for 20 to 30 years.
So 20 to 30 years of this?
October 11, 2018:
Centuries at least.
What happened, inevitably, was more cloud in the central and eastern Pacific in 2008 than in 2018. This changes – of course – so what two points signify is sfa.
The extreme uncertainty cuts both ways. As Javier has pointed out, we don’t really know enough to rule out the beginning of an irreversible descent into LGM conditions over the next couple of millennia, particularly after a hundred or so years when fossil fuel becomes largely uneconomic and CO2 begins to decline. And we don’t know how well humanity will make that energy transition, or what kind of political/economic conditions will dominate the future — the decisions made by the next few generations could turn out to be critical to staving off the next glaciation and preserving human civilization.
How Tamino proved himself wrong.
Tamino has made it clear that he is a slowdown, pause, and hiatus denier.
But in a recent post, Tamino has made a stupid mistake.
In his eagerness to show how bad global warming is, Tamino has accidentally acknowledged that the recent slowdown exists.
Pingback: Weekly Climate and Energy News Roundup #331 |
The worst possible case? How about a model in which all incoming energy, e.g. 240 W/m2, is dissipated in the atmosphere creating weather, chaos, etc., and only dissipated energy, unable to do any further work within the atmosphere, escapes. The steady-state solution for this nonlinear, 3-dimensional system is a basic thermodynamics problem. Were this the case, CO2 doubling parameters would increase the temperature differential across the atmosphere by at most 1.5 K, and more likely less than 1 K.
SUBJECT: THE FLAWED MODELS AND AGW THEORY
The models are all flawed because they do not incorporate solar/geomagnetic effects. The only reason this is not more apparent yet is that thus far the solar/geomagnetic fields have yet to reach threshold values of weakness that would result in a major rather than a minor climatic impact.
So if you think the climate models are off now just wait a few more years. This is going to become more and more apparent as the climate models forecast a continued warming trend while reality is likely a continued cooling trend.
AGW is flawed for the same reasons and, in addition, has hijacked natural climatic variation to somehow verify its theory, which is now falling flat on its face.
The cooling trend has been in place over the past few years and should continue as we move forward from here.
What is different about your current prediction and reasons when compared to your cooling predictions of a decade ago? CO2 seems to have taken over the natural forcings and doesn’t appear to be letting go.
Yet everything says that large intrinsic variability remains. Despite their best efforts.
Do you really believe in hockey sticks?
“Believe” in hockey sticks? You mean understand them? You don’t “believe” in science, but is it possible to deny that every science organization in the world has come to the same graph?
Do you deny that the hockey stick has been proven to be correct many times over and by several disciplines?
There are considerable leads and lags through the Earth system. The stadium wave at long timescales? Polar anti-phasing?
Believing it can be represented in one graph is a hell of a leap of faith.
To learn at least something about the hockey stick graph, please watch the following video (with an inevitable warning by Google):
Muller was skeptical and led the Koch-funded BEST project. Some time after this video, based on observations of the warming, he reversed his position and agreed it really was happening that fast and that the hockey stick was real after all, which remains his current view.
Quick, on C-SPAN now… “Global Climate Action Summit.” The info on the guide tells us it’s Gov. Jerry Brown (D-Calif.) and former NYC Mayor Michael Bloomberg hosting a conference in San Francisco on lowering emissions by 2020. However, go there and it’s–
No signal in the data is nothing new when it comes to Leftist politicians pontificating on Climate Change. What happened to Al Gore? No private jet available?
I’ve been skipping around the video. At 2:20 Nancy Pelosi introduces Michael Bloomberg. As he begins, a bunch of protesters storm the stage with some kind of anti-capitalism banner.
At 3:00, the mayor of Houston says his city gets 80% of its energy from wind and 10% from solar
Harrison Ford has really gone off the deep end! Hey man, calm down, get in your plane and go find somewhere to have a cheeseburger.
I can’t seem to find Moonbeam anywhere, but there’s still a critical mass of self importance.
Check your cable. It might have come loose.
They have the video on this page:
Bonfire of the Virtue Signalling Vanities
The worst case?
A rapid 2018 refreeze underway in the Arctic's Beaufort and Chukchi seas, offset by tropical warming.
The best case: the rest of the Arctic refreezes quickly and the tropical heat abates.
In the Beaufort/Chukchi area a small island of ice is rapidly extending, forming a future extent edge that will fill in rapidly on the polar side and help the ice reach the southern land edge much more quickly than normal regrowth would. Check it out.
Was this supposed to mean something?
The climate worst case is biosphere extinction resulting from CO2 starvation due to silicate weathering. This long-term secular process will eventually choke to death all life on Earth, long before the Sun turns into a red giant:
Sentient creatures can at times reverse CO2 starvation by recycling fossil-deposited carbon into atmospheric CO2. However, sentient life forms have the curious suicidal tendency to voluntarily cease such recycling, removing the impediment to CO2-starvation extinction.
Have you seen anything that Mr. Corbett has been releasing over the years? This is his latest with many references for further research. What do you think?
How would you mathematically prove that a world with positive net system feedbacks is unstable?
T = (−F − PRE)/(λ0 + λ)
T is surface temperature – it is all computer derived. F is anthropogenic forcing. PRE is something I am calling the planetary radiative effect – it changes with temperature, water vapor, cloud, ice, dust, vegetation and biology as the planetary system evolves, and biology is very important. λ0 is the Planck feedback and λ is the sum of other feedbacks; λ0 plus λ is probably negative. Whatever the method of decomposition, the net of the feedback terms is some −2 W/m2/K.
Before AGW cloud feedbacks.
Klein et al. (2017) suggest a ‘consensus’ of 0.25 W/m2/K.
The formula is valid for a small range of T as the planet evolves dynamically with feedbacks on feedbacks.
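As a rough numerical illustration of the energy-balance formula above (a minimal sketch only: PRE is set to zero, and the forcing and feedback values are illustrative assumptions, not figures from the comment), take F ≈ 3.7 W/m2 for doubled CO2, a Planck feedback λ0 ≈ −3.2 W/m2/K, and other feedbacks summing to λ ≈ +1.2 W/m2/K, so the net is about −2 W/m2/K:

```python
# Sketch of the equilibrium response implied by T = (-F - PRE)/(lam0 + lam),
# here with PRE = 0 for simplicity. All numbers are illustrative assumptions.

def equilibrium_response(F, lam0=-3.2, lam=1.2):
    """Equilibrium temperature change (K) for a forcing F (W/m^2).

    lam0: Planck feedback (W/m^2/K), the fundamental stabilizer.
    lam:  sum of other feedbacks (W/m^2/K).
    The denominator (lam0 + lam) must be negative for a stable climate.
    """
    return -F / (lam0 + lam)

print(equilibrium_response(3.7))  # ~1.85 K for a CO2 doubling, given these values
```

With these assumed values the response is about 1.85 K per doubling; with no amplifying feedbacks at all (λ = 0) it drops to about 1.16 K, showing how the net feedback sum controls the answer.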
T = T0 + ΔT1 + …. + ΔTn
With net positive feedbacks, future ΔTn are positive and T grows inexorably from the first disturbance. If feedbacks are net negative, increases in T are damped and reduce to zero at maximum planetary entropy. The Planck feedback is the fundamental planetary stabilizer.
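The damping-versus-runaway distinction can be sketched as a geometric feedback series (a toy illustration; the gain values are assumptions): each increment obeys ΔT(n+1) = g·ΔT(n), so the total converges to ΔT1/(1 − g) when the gain |g| < 1 and grows without bound when g ≥ 1.

```python
def feedback_series(dT1, gain, steps):
    """Sum the temperature increments dT1 + dT1*gain + dT1*gain**2 + ...

    |gain| < 1: increments are damped and the sum converges (stable).
    gain >= 1:  each increment is at least as large as the last (runaway).
    """
    total, dT = 0.0, dT1
    for _ in range(steps):
        total += dT
        dT *= gain
    return total

print(feedback_series(1.0, 0.5, 60))   # converges toward 1/(1 - 0.5) = 2.0
print(feedback_series(1.0, 1.2, 100))  # grows without bound
```

This is why a net feedback that stays below unity gain merely amplifies a disturbance by a finite factor, while a gain at or above unity is the mathematical signature of instability.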
As positive feedback drives temperature up, negative feedbacks grow in response until the temperature falls again, and an oscillation sets in – like a predator-prey curve. One of the purest examples of this regular procession of positive then reactive negative feedbacks is the periodic oscillation in brightness of Cepheid variable stars:
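The predator-prey analogy can be made concrete with a minimal Lotka-Volterra sketch (illustrative parameters only, with no connection to any actual climate quantity): a growing "prey" variable drives up a "predator" variable, which then pulls the prey back down, producing a sustained oscillation rather than runaway growth.

```python
def lotka_volterra(x0=2.0, y0=1.0, alpha=1.0, beta=1.0,
                   delta=1.0, gamma=1.0, dt=0.001, steps=40000):
    """Forward-Euler integration of the Lotka-Volterra equations
    dx/dt = x*(alpha - beta*y), dy/dt = y*(delta*x - gamma).
    Returns the prey trajectory x(t)."""
    x, y = x0, y0
    xs = []
    for _ in range(steps):
        dx = x * (alpha - beta * y)
        dy = y * (delta * x - gamma)
        x += dt * dx
        y += dt * dy
        xs.append(x)
    return xs

xs = lotka_volterra()
# The prey repeatedly overshoots and undershoots its equilibrium (x = gamma/delta = 1),
# tracing out the cyclic positive-then-negative feedback pattern described above.
```

A forward-Euler step is crude (it slowly inflates the orbit), but over this short integration the qualitative cycling survives, which is all the analogy requires.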
Pingback: Panicking about climate change? See the rest of the story. - Fabius Maximus website
Pingback: Panicking about climate change? See the rest of the story. – Climate Collections
“(CNN)You may be aware that a plant-based diet can make you healthier by lowering your risk for obesity, heart disease and Type 2 diabetes. Now, a study suggests there’s another good reason to regularly eat meatless meals. By filling your plate with plant foods instead of animal foods, you can help save the planet.”
” As noted in the previous post, exceptional winter snow accumulation and heavy, summer snowfall, drove the net snow input mass to 130 billion tons above the 1981 to 2010 average. This was followed by a near-average melt and runoff period, resulting in a large net mass gain for the ice sheet in 2018 of 150 billion tons. This is the largest net gain from snowfall since 1996, and the highest snowfall since 1972.”
on the verge of big ice regrowth or not? One big refrigerator of ice in Greenland.
Beaufort did its bit. PIOMAS did not. As the last of the El Niño heat moving north has dissipated, it is time for a rapid refreeze on both sides – or not. The Kara Sea looks very promising as the next domino to fall.
JCH this might be the last expected global temp fall of 0.04 C for a while, amazing how much it has come down this year. Expecting a plateau for 6 months but hope it will fall further. Fourth to 6th warmest year? Lower next year again.
Oct 30, sky high temperature anomaly:
Oct 31: sky high temperature anomaly:
d a r n.
Pingback: Will Climate Change Bring Back the Black Plague? | sixdaysblog
Pingback: New University Of Exeter Study Finds Climate Models Skewed, Overhype CO2 …”Uncertainties Rigorously Concealed”!
Pingback: Climate Change And The Uncertainty Monster – Truth is difficult but essential; to find, to understand, to accept