Does a new paper really reconcile instrumental and model-based climate sensitivity estimates?

by Nic Lewis

A new paper in Science Advances by Cristian Proistosescu and Peter Huybers (hereafter PH17) claims that accounting for the decline in feedback strength over time that occurs in most CMIP5 coupled global climate models (GCMs) brings observationally-based climate sensitivity estimates from historical records into line with model-derived estimates.

A longer version of this post is at ClimateAudit, with additional technical details.

PH17 is not the first paper to attempt to bring observationally-based climate sensitivity estimates from historical records into line with model-derived estimates, but it makes a rather bold claim and, partly because Science Advances seeks press coverage for its articles, has been attracting considerable attention.

Some of the methodology the paper uses is complicated, with its references to eigenmode decomposition and full Bayesian inference. However, the underlying point it makes is simple. The paper addresses equilibrium climate sensitivity (ECS)[i] of GCMs as estimated from information corresponding to that available during the industrial period. PH17 terms such an estimate ICS; it is usually called effective climate sensitivity. Specifically, PH17 estimates ICS for GCMs by emulating their global surface temperature (GST) and top-of-atmosphere radiative flux imbalance responses under a 1750–2011 radiative forcing history matching the IPCC AR5 best estimates.

In a nutshell, PH17 claims that for the current generation (CMIP5) GCMs, the median ICS estimate is only 2.5°C, well short of their 3.4°C median ECS and centred on the range of observationally-based climate sensitivity estimates, which they take as 1.6–3.0°C. My analysis shows that their methodology and conclusion are incorrect, for several reasons that I shall explain. My analysis of their data shows that the median ICS estimate for GCMs is 3.0°C, compared with a median in the 1.6–2.0°C range for sound observationally-based climate sensitivity estimates. To justify my conclusion, I first need to explain how ECS and ICS are estimated in GCMs, and what PH17 did.

For most GCMs, ICS is smaller than ECS, where ECS is estimated from ‘abrupt4xCO2’ simulation data,[ii] on the basis that their behaviour in the later part of the simulation will continue until equilibrium. When CO2 concentration – and hence forcing, denoted by F – is increased abruptly, most GCMs display a decreasing-over-time response slope of TOA flux (denoted by H in the paper, but normally by N) to changes in GST (denoted by T). That is, the GCM climate feedback parameter λ decreases with time after forcing is applied.[iii] Over any finite time period, ICS will fall short of ECS in the GCM simulation. Most but not all CMIP5 coupled GCMs behave like this, for reasons that are not completely understood. However, there is to date relatively little evidence that the real climate system does so.

Figure 1, an annotated reproduction of Fig. 1 of PH17, illustrates the point. The red dots show annual mean T (x-coordinate) and H (y-coordinate) values during the 150-year long abrupt4xCO2 simulation by the NorESM1-M GCM.[iv] The curved red line shows a parameterised ‘eigenmode decomposition’ fit to the annual data. The ECS estimate for NorESM1-M based thereon is 3.2°C, the x-axis intercept of the red line. The estimated forcing in the GCM for a doubling of CO2 concentration (F2×) is 4.0 Wm−2, the y-axis intercept of the red line. The ICS estimate used, per the paper’s methods section, is represented by the x-axis intercept of the straight blue line, being ~2.3°C. That line starts from the estimated F2× value and crosses the red line at a point corresponding approximately to the same ratio of TOA flux to F2× as currently exists in the real climate system. If λ were constant, then the red dots would all fall on a straight line with slope −λ and ICS would equal ECS; if ECS (and ICS) were 2.3°C the red dots would all fall on the blue line, and if ECS were 3.2°C they would all fall on the dashed black line. The standard method of estimating ECS for a GCM from its abrupt4xCO2 simulation data, as used in IPCC AR5, has been to regress H on T over all 150 years of the simulation and take the x-axis intercept. For NorESM1-M, this gives an ECS estimate of 2.8°C, below the 3.2°C estimate based on the eigenmode decomposition fit. Regressing over years 21–150, a more recent and arguably more appropriate approach, also gives an ECS estimate of 3.2°C.
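To make the two regression approaches concrete, here is a minimal sketch on synthetic data from a two-timescale eigenmode response, loosely in the spirit of Figure 1 (all parameter values are hypothetical, not NorESM1-M output):

```python
import numpy as np

# Synthetic abrupt4xCO2-style response from a two-mode eigenmode model
# (hypothetical parameters). True ECS = 2.0 + 1.2 = 3.2 C; the fast mode
# has the stronger feedback, so the fitted slope flattens over time.
t = np.arange(1, 151)                      # years 1..150
tau = np.array([4.0, 250.0])               # mode time constants (years)
T_eq = np.array([2.0, 1.2])                # equilibrium warming per mode (C)
lam = np.array([1.4, 1.0])                 # feedback per mode (W/m2/C)

modes = 1 - np.exp(-t[:, None] / tau)      # (150, 2) fractional responses
T = (T_eq * modes).sum(axis=1)             # GST change
H = (lam * T_eq).sum() - (lam * T_eq * modes).sum(axis=1)  # TOA imbalance

def ecs_by_regression(T, H, first_year=1):
    """Gregory-style ECS estimate: regress H on T, take the x-axis intercept."""
    sl = slice(first_year - 1, None)
    slope, intercept = np.polyfit(T[sl], H[sl], 1)
    return -intercept / slope

ecs_all = ecs_by_regression(T, H, first_year=1)    # all 150 years: biased low
ecs_late = ecs_by_regression(T, H, first_year=21)  # years 21-150: near 3.2
```

Because the fast mode has the stronger feedback, the early years pull the full-period regression line steeper, so its x-axis intercept understates the ECS recovered from the later, slow-mode-dominated years.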

[i] ECS is defined as the increase in global surface temperature (GST) resulting from a doubling of atmospheric CO2 concentration once the ocean has fully equilibrated.

[ii] The abrupt4xCO2 simulations involve abruptly quadrupling CO2 concentration from an equilibrated preindustrial climate state; most such CMIP5 simulations were run for 150 years, but a few for up to 300 years. The use of abrupt4xCO2 simulation data to estimate the ECS of GCMs, most often by regression of TOA flux against GST change, is standard. Most GCMs have not been run to equilibrium with doubled CO2 concentration. Even where they have, any change in their energy leakage over time or with climate state would bias the resulting ECS value.

[iii] The authors define λ as −ΔH(t)/ΔT(t), corresponding to the negative of the slope for the overall changes in H and T at time t after a forcing is imposed, rather than as −dH/dT|t, the negative of the instantaneous slope at time t.

[iv] Values are changes from those in the equilibrated control simulation from which the abrupt4xCO2 simulation was branched, adjusted for drift and halved to restate for doubled CO2 concentration, making the assumption that forcing from CO2 is exactly proportional to log(concentration).

Fig. 1. Reproduction of Fig. 1 of PH17, with added brown and blue lines illustrating ICS estimates

Observationally-based climate sensitivity estimates derived from instrumental data are determined as ICS, since the climate system is currently in disequilibrium, with a positive TOA flux imbalance.

The most robust observational estimates of climate sensitivity based on instrumental data use an “energy budget” approach, described in IPCC AR5. That is, they estimate the ratio of the change in GST to the change in total forcing net of the TOA flux imbalance, and scale the result by F2× to convert it to ICS, as an approximation to ECS. To minimise the impact of measurement errors and internal climate system variability, these changes are usually taken between decadal or longer base and final intervals early and late in the instrumental period. The intervals chosen should be well matched in terms of volcanic activity (which has different effects from other forcing agents) and multidecadal Atlantic variability. Both Otto et al 2013 (estimate based on 2000s data) and Lewis and Curry 2015 satisfied these requirements. Otto et al used a GCM-derived forcing time series adjusted to match the overall change per IPCC AR5; Lewis & Curry used forcing time series from AR5 itself. Their observationally-based ICS median estimates were respectively 2.0°C and 1.6°C.
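The energy budget calculation reduces to a one-line formula; here is a sketch with illustrative inputs (hypothetical round numbers of the same order as those in AR5-based studies, not the exact values used in any published estimate):

```python
def energy_budget_ics(dT, dF, dN, F2x=3.71):
    """Energy-budget ICS estimate: the ratio of the GST change dT (C) to the
    change dF in total forcing net of the change dN in TOA flux imbalance
    (both W/m2), scaled by the forcing F2x for a doubling of CO2."""
    return F2x * dT / (dF - dN)

# Illustrative inputs (hypothetical, not taken from any published study):
ics = energy_budget_ics(dT=0.71, dF=1.98, dN=0.36)  # ~1.6 C
```

The result is an effective sensitivity, since part of the applied forcing remains unrealised as the TOA imbalance dN rather than as warming.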

PH17’s statement: “A recent review of observationally based estimates of ICS shows a median of 2°C and an 80% range of 1.6° to 3°C” is based on a sample of 8 studies that included outdated and/or unsound ones. A number of other sound observationally-based ICS estimates not included in the sample used by PH17 fall within the 1.6–2.0°C range spanned by the Otto et al and Lewis & Curry estimates (Ring et al 2012 1.8°C; Aldrin et al 2012 1.76°C; Lewis 2013 1.64°C; Skeie et al 2014 1.67°C; Lewis 2016 1.67°C). I consider 1.6–2.0°C more representative than 1.6–3.0°C of the range of median observationally-based ICS estimates from high-quality recent studies.

PH17 uses an energy budget method to estimate ICS. If the energy-budget method is applied, based on the evolution of forcing over the historical period, to a GCM in which λ decreases with time, as in Figure 1, the resulting ICS estimate will obviously be lower than the GCM’s estimated ECS. However, contrary to what PH17 claims, if ICS is estimated using sound methods then the underestimation relative to ECS is typically modest, and the median CMIP5 model ICS estimate is still well above ICS for the real climate system as estimated by the best quality instrumental studies.

[See the technical post at Climate Audit for description of the eigenmode decomposition fitting method used in PH17].

ICS calculation

In PH17, ICS was inferred by applying total historical forcing F (per AR5 median estimate time series) over 1750–2011 to the estimated eigenmode fits for each GCM, thus deriving emulated time series of its H and T values. This was done 5,000 times for each GCM, sampling from the derived posterior probability distribution for the eigenmode fit parameter values. The 2.5°C estimate for GCM-derived ICS is the median across the 24 GCMs of all the sample ICS estimates – 120,000 in all.[i] This approach seems very reasonable in principle, but the devil is in its detailed application.

PH17 states that ICS is obtained as F2×/λ(t), where λ(t) = (F − H)/T, with F, H and T being departures in 2011 from preindustrial conditions. Each of F, H and T is taken to have zero value in preindustrial conditions; total 1750 forcing was zero in the AR5 time series, and the initial simulated values of H and T are zero.

PH17 also states that, as the F2× value associated with each posterior draw could vary from the 3.7 Wm−2 assumed in the AR5 estimate of historical forcing, they multiplied the forcing series by F2×/3.7 for each draw before obtaining the values of H and T. While doing so is logical, it actually has no effect on the derived value of λ, since the multiplier scales the numerator and denominator of the fraction representing λ equally. What is critical to correct estimation of ICS for a GCM, however, is that the value into which the estimated λ is divided is, as implied by PH17, the estimated F2× for that particular GCM (which will vary between samples), and not some other value, such as the 3.7 Wm−2 used in AR5. Per PH17 Table S1, the median estimated GCM F2× values range from 2.9 to 5.8 Wm−2.

Error in ICS calculation

Cristian Proistosescu has very helpfully provided me with a copy of his data and Matlab code, so I have been able to check how the PH17 ICS values were actually calculated. Unfortunately, it turns out that the calculation in PH17 is wrong. Although for each GCM and each set of its sample eigenmode parameters, PH17’s code scales the AR5 forcing time series by the F2× value corresponding to its sampled eigenmode parameters (and thus also scales the related simulated H and T time series), it then divides the resulting λ estimate into 3.7 Wm−2 rather than into the F2× value applicable to that sample. Essentially, PH17 correctly estimated the slope of the blue line but, instead of deriving ICS directly from its x-axis intercept, shifted the blue line down so that its y-axis intercept was 3.7 Wm−2. In the case shown in Figure 1, doing so reduces the ICS estimate from 2.3°C to 2.1°C.

I have rerun the PH17 code with the ICS calculation corrected, applying the F2× value applicable to each sample when computing the ICS estimate for that sample. The resulting overall median ICS estimate increases from 2.5°C to 2.8°C. The 2.5°C value found by PH17 is quite clearly incorrect.
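A minimal sketch of the error and the correction, using hypothetical numbers chosen to mimic the Figure 1 case rather than the actual PH17 data:

```python
def ics_for_draw(F2x_draw, F_2011, H_2011, T_2011, F2x_numerator=None):
    """ICS for one posterior draw. F_2011, H_2011, T_2011 are 2011 departures
    from preindustrial in the emulation driven by the AR5 forcing series
    rescaled by F2x_draw/3.7. The correct numerator is F2x_draw itself;
    passing F2x_numerator=3.7 reproduces the error in the PH17 code."""
    lam = (F_2011 - H_2011) / T_2011
    return (F2x_draw if F2x_numerator is None else F2x_numerator) / lam

# Hypothetical values chosen to mimic the Figure 1 case (F2x = 4.0 W/m2,
# correct ICS about 2.3 C); not taken from the PH17 data.
ics_correct = ics_for_draw(4.0, F_2011=4.0, H_2011=2.26, T_2011=1.0)
ics_ph17 = ics_for_draw(4.0, F_2011=4.0, H_2011=2.26, T_2011=1.0,
                        F2x_numerator=3.7)
# The error scales the estimate by 3.7/F2x_draw: here ~2.3 C falls to ~2.1 C.
```

Note that the rescaling of the forcing series cancels in λ, so the only place the draw's F2× matters is the numerator, which is exactly where the PH17 code substitutes 3.7.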

Volcanic Forcing

The corrected median ICS estimate for GCMs of 2.8°C, based on changes over the entire 1750–2011 period, is still a little below the value I would have expected from previous work of mine using rather similar methods. The reason is PH17’s incorrect treatment of volcanic forcing. The points involved are quite subtle.

The problem is that PH17 did not adjust the AR5 forcing time series to make average volcanic forcing zero. Not doing so implies that preindustrial (natural-only) forcing was on average negative relative to that in 1750 (when all forcings, including volcanic forcing, are set to zero in the AR5 time series). In that case the climate system in 1750, which is assumed to be in equilibrium with pre-1750 average forcing, would not be in equilibrium with 1750 forcing, which is higher by the negative of average pre-1750 natural forcing. That would invalidate the PH17 derivation of (F − H)/T and hence of ICS. Although average pre-1750 natural forcing values are not given in AR5, it is reasonable to estimate them from the average over 1750–2011. That average is negligible for solar forcing, but material for volcanic forcing, at −0.40 Wm–2.

The need to account for preindustrial volcanic forcing when computing subsequent warming is known,[ii] although it appears to have been overlooked by many GCM modellers. A simple solution is to adjust the AR5 forcing time series so that volcanic forcing has a zero mean over 1750–2011. This is essentially the same approach as was used when the RCP scenario forcing time series were produced. The volcanic forcing in 1750 then becomes +0.4 Wm–2, reflecting unusually low volcanism in that year.
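The adjustment amounts to subtracting the period-mean volcanic forcing; a toy sketch (illustrative series, not the actual AR5 data):

```python
import numpy as np

# Toy volcanic forcing series for 1750-2011 (hypothetical, not AR5 data):
# zero in most years, strongly negative in occasional eruption years.
years = np.arange(1750, 2012)
volcanic = np.where(years % 10 == 0, -4.0, 0.0)

# Adjust so volcanic forcing has zero mean over the period. Years without
# eruptions then carry a small positive volcanic forcing, just as 1750
# does (+0.4 W/m2) in the adjusted AR5 series.
volcanic_adj = volcanic - volcanic.mean()
```

With this adjustment, the 1750 baseline state is consistent with equilibrium under the long-term average of natural forcing, which is what the (F − H)/T derivation assumes.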

When I adjusted the AR5 forcing time series by subtracting the average volcanic forcing over 1750–2011, the median ICS estimate over 1750–2011 rose to 2.92°C.

IRF versus ERF

There is a third reason why the PH17 estimate of ICS for GCMs is too low.

When CO2 concentration is abruptly doubled, it initially produces what is termed instantaneous radiative forcing (IRF). However, for estimating the response of the climate system it is best to use effective radiative forcing (ERF), which is the forcing that applies after the atmosphere has adjusted, and after surface adjustments that do not involve any change in GST have taken place; see IPCC AR5 Box 8.1. Such adjustments take up to a year, perhaps more, to complete. The IPCC AR5 forcing series are for ERF, and adopt an F2× value of 3.71 Wm–2. ERF for CO2 is believed to be some way below IRF.

However, in PH17, F2× is estimated by projecting back to time zero using, primarily, mean values for the first and second years of the abrupt4xCO2 simulations. Since during year one the atmosphere and surface are still adjusting (independently of GST change) to the quadrupling of CO2 concentration, doing so produces an F2× value in excess of ERF. Thus PH17 derives a median GCM F2× of ~4 Wm–2 (the median values for λ and for the contribution to inferred equilibrium warming given in Table 1 imply, in conjunction with the median GCM ECS given in Table S1, an F2× value of 4.0 Wm–2).

It is difficult to estimate ERF F2× for CO2 very accurately from abrupt4xCO2 simulation data. A reasonable method is to regress over years 1–20 of the abrupt4xCO2 simulation,[iii] which is consistent with the recommendation in Hansen et al (2005)[iv] of regressing over the first 10 to 30 years. The ensemble median F2× obtained by doing so is almost 10% lower than per PH17, although the ratio for individual GCM medians varies between 0.72 and 1.20. To obtain an apples-to-apples comparison, the F2× values implicit in the fitted eigenmode parameters must represent ERF, as for observationally-based estimates, not something between ERF and IRF. The brown line in Figure 1 illustrates the issue. The intersection of the blue and brown lines corresponds to where we are now, in terms of how long the climate system has had on average to adjust to forcing increments during the historical period (scaled to a doubling of CO2 concentration). The brown line corresponds to estimating ICS using the same data for the current climate system state as for the blue line, but with the F2× estimate reduced from PH17’s 4.0 Wm–2 to 3.6 Wm–2. The result is to increase the ICS estimate by nearly 0.2°C – the difference between the x-intercepts of the brown and blue lines. I cannot accurately estimate the depressing effect on ICS estimation of using F2× estimates that exceed those corresponding to ERF, as doing so would require refitting the statistical model and obtaining fresh sets of 5,000 sample eigenmode fits for each GCM.[v] However, based on my previous work I estimate the effect to be ~0.1°C. Adding this to the 2.94°C median ICS estimate that results, after correcting the two problems dealt with above, for the time periods used in instrumental-observation studies, the median GCM-based ICS estimate would slightly exceed 3.0°C.
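The difference between the two F2× estimation methods can be illustrated on a toy single-feedback response with a fast, temperature-independent adjustment flux (all parameter values hypothetical):

```python
import numpy as np

# Toy abrupt-CO2 response with a single feedback (lambda = 1 W/m2/C, true
# ERF = 3.6 W/m2) plus a fast adjustment flux, independent of T, that
# decays within about a year. All parameter values are hypothetical.
dt = 0.001
t = np.arange(0.0, 20.0, dt)
T = 3.6 * (1 - np.exp(-t / 5.0))              # surface warming
H = 3.6 - 1.0 * T + 0.8 * np.exp(-t / 0.4)    # TOA imbalance + adjustment

# Annual means for years 1..20
Tbar = T.reshape(20, -1).mean(axis=1)
Hbar = H.reshape(20, -1).mean(axis=1)

# (a) PH17-style: project back to time zero through the year-1 and year-2
# means. The residual adjustment flux in year 1 pushes the intercept up
# towards IRF, giving about 4.0 W/m2 here.
slope12 = (Hbar[1] - Hbar[0]) / (Tbar[1] - Tbar[0])
f_back = Hbar[0] - slope12 * Tbar[0]

# (b) Regress H on T over years 1-20 (as in Andrews et al. 2015): the
# y-axis intercept lands much nearer the true ERF of 3.6 W/m2.
slope, erf_est = np.polyfit(Tbar, Hbar, 1)
```

In this toy setup the back-projected value overstates ERF by roughly the 0.4 Wm–2 margin discussed above; the size of the bias in real abrupt4xCO2 data varies between GCMs.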

Other issues

There are a few other points relevant to appraisal of PH17.

The PH17 calculations of T for CMIP5 GCMs using AR5 forcing time series reveal that, for the median fitted eigenmode parameters, simulated warming between 1860–79 and 2000–09 was 1.10°C.[vi] That exceeds recorded warming (using a globally-complete GST dataset)[vii] of 0.84°C by almost a third, supporting the conclusion that the median GCM is substantially too sensitive.

It is also worth noting that, although of considerable interest in relation to understanding climate system behaviour, any difference between ICS and ECS is of relatively little importance when estimating warming over the next few centuries on scenarios involving continuing growth of emissions and CO2 concentrations, as the slow mode will contribute only a small part of the total warming.

Conclusions

When correctly calculated, the median ICS estimate for CMIP5 GCMs, based on the evolution of forcing over the historical period, is 3.0°C, not 2.5°C as claimed in PH17. Although 3.0°C is below the median ECS estimate for the GCMs of 3.4°C, it is well above a median estimate in the 1.6–2.0°C range for good quality observationally-based climate sensitivity estimates. PH17’s headline claim that it reconciles historical and model-based estimates of climate sensitivity is wrong.

End Notes

[i] The total sample size is slightly lower, since for eight of the GCMs the simulation method fails in a number of cases, due to use of an approximation that breaks down for samples with a very small fitted short time constant.

[ii] Gregory et al 2013 doi:10.1002/grl.50339; Meinshausen et al 2011 DOI 10.1007/s10584-011-0156-z Appendix 2

[iii] As in Andrews et al (2015, DOI: 10.1175/JCLI-D-14-00545.1)

[iv] Efficacy of climate forcings, doi:10.1029/2005JD005776

[v] Probably requiring setting λ1 = λ2; it is impossible to estimate a separate λ1 if one seeks to estimate ERF, as the relevant time constant, τ1, is too short – typically less than a year.

[vi] With volcanic forcing adjusted to zero mean over 1750-2011

[vii] Cowtan and Way v2 kriged HadCRUT4v5: http://www-users.york.ac.uk/%7Ekdc3/papers/coverage2013/series.html


Moderation note:  As with all guest posts, please keep your comments civil and relevant.

122 responses to “Does a new paper really reconcile instrumental and model-based climate sensitivity estimates?”

  1. All computer models are hypotheses. The credibility of an hypothesis is based on the assumptions made. If the assumptions are incorrect then the output of the model will be incorrect and require creative “tuning software”. Do all the computer models out there give a comprehensive list of assumptions made? If not why not?

    • climate541 | July 8, 2017 at 8:32 am |
      “Do all the computer models out there give a comprehensive list of assumptions made? If not why not?”

      One might also ask if they are falsifiable? “Popper saw falsifiability as a black and white definition, that if a theory is falsifiable, it is science, and if not, then it is unscientific.” Ref explorable.com.

      • They all ignore the cooling from shedding ice in both hemispheres.
        When too warm, it snows more to increase the ice volume available for shedding. When too cold, it snows less to allow the ice volume to deplete and make less available for shedding.

      • David Springer

        Go Trump!

        USA! USA! USA!

      • Popper also says you can’t discount a hypothesis until it is falsified.

      • Jim D | July 8, 2017 at 7:13 pm |
        Popper also says you can’t discount a hypothesis until it is falsified.

        Please source your statement “can’t discount a hypothesis until it is falsified.” If I take that statement at face value we are in the unscientific world of having a hypothesis that can’t be discounted because there is no method of falsifying it. Sounds like that could be said about ET flying saucers and we should spend $100T studying them. I assume nothing has changed since page 640 of IPCC AR4, last sentence section 8.6 where it states, “a set of model metrics that might be used to narrow the range of plausible climate change feedbacks and climate sensitivity has yet to be developed.”

      • His theory of induction from Wikipedia –
        “Among his contributions to philosophy is his claim to have solved the philosophical problem of induction. He states that while there is no way to prove that the sun will rise, it is possible to formulate the theory that every day the sun will rise; if it does not rise on some particular day, the theory will be falsified and will have to be replaced by a different one. Until that day, there is no need to reject the assumption that the theory is true.”
        So with AGW we hypothesize that when CO2 is doubled the earth’s temperature will rise 2-4 C from this effect alone. Until we reach a doubling it can’t be rejected, but it is falsifiable at that time. Meanwhile it is on track to verify.

      • “So with AGW we hypothesize that when CO2 is doubled the earth’s temperature will rise 2-4 C”

        2-4C? That’s not a theory. That’s a wild ass guess.

      • It is falsifiable making it a hypothesis in the Popper sense. There is the theory of radiative transfer behind it plus thermodynamics for the water vapor feedback plus a lot of paleo evidence. It only seems like a guess to those that don’t quite yet understand the background for how climate sensitivity is quantified. Doubling CO2 is equivalent to increase the solar forcing by 1%. Most would expect a lot of warming from both of these. And, as I said, it is on track too. To scientists it is as obvious as the sun coming up that adding CO2 warms significantly. Popper says you can’t reject it until it is falsified.

    • To return to my go to source for selective quotes on modeling… sans an estimate of irreducible imprecision it all seems a bit of an academic question.

      “Where precision is an issue (e.g., in a climate forecast), only simulation ensembles made across systematically designed model families allow an estimate of the level of relevant irreducible imprecision.” James. C. McWilliams, 2007, Irreducible imprecision in Ocean and Atospheric Simulation.

  2. Peter Lang

Nic Lewis,

    Thank you. Your posts are invaluable. Possibly as significant as Steve McIntyre’s exposure of the Hockey Stick misrepresentation.

    Although 3.0°C is below the median ECS estimate for the GCMs of 3.4°C, it is well above a median estimate in the 1.6–2.0°C range for good quality observationally-based climate sensitivity estimates. PH17’s headline claim that it reconciles historical and model-based estimates of climate sensitivity is wrong.

Why is it taking so long to reconcile the ECS estimates from historical data with the estimates from the models?

Why is it taking so long to reconcile the ECS estimates from historical data with the estimates from the models?

When people do not understand natural cycles and use a trace gas to control the temperature of earth, it will never be reconciled. It is flawed theory to ignore the self regulation by using ice and water. They pretend nothing else changed to cause the changes of the past. Past warming occurred as ice retreated after the little ice age. They ignore the actual facts. Temperature stopped going up when snowfall increased enough to halt or pause the ice retreat.

  3. What did Cristian say when you pointed out his error in the ICS calculation?

    • He has claimed it wasn’t an error. His reasoning is faulty.

I think it’s now a very complicated situation, not only for the authors of this study. They shouted very loudly that they “resolved a major conflict in estimates of how much the Earth will warm in response to a doubling of carbon dioxide in the atmosphere.” (source: https://www.eurekalert.org/multimedia/pub/144866.php ) Now it’s clear that they didn’t resolve it, but they admit that there is a major conflict between models and obs. This will stand in the end beyond all efforts to minimize this conflict. Some kind of backfire…

    • niclewis: He has claimed it wasn’t an error. His reasoning is faulty.

      Did you submit your comment to the journal?

Water, in all of its states is abundant. Water, in all of its states causes temperature to be self regulating. When oceans are warmer, it snows more until it gets colder. When oceans are colder, it does not snow enough until it gets warmer. This easily overpowers anything CO2 can do. This is very clear in ice core data, which are records of ocean temperatures, where the water came from to create the snow. The idea that a trace gas is the control knob for the earth temperature is really not something that I could ever believe. If earth needs more greenhouse gas, earth would just crank up water vapor. If earth needs less greenhouse gas, earth would just crank water vapor down. Most of the cooling of earth comes from IR in the tropics. The fine tuning of temperature in both hemispheres is done with ice and water in the polar regions, separately, but using the same method. It snows more when the oceans are thawed and it snows less when oceans are frozen. The two hemispheres are hugely different, but the temperatures are self regulated in the same bounds. Almost 40 watts per meter squared left the Northern Hemisphere and entered the Southern Hemisphere, over the past ten thousand years, and it did not change the temperature bounds in either hemisphere. CO2 did not adjust differently in the two hemispheres to counter this. Ice and water took care of this huge change. The southern hemisphere grew more ice so more ice could shed and cool the southern oceans. The northern hemisphere lost ice because it needed to shed less to cool the northern oceans.

    These climate ice cycles are robust, resilient, self-regulating, and self-correcting.


In reality, there essentially has been no significant global warming in the US since the 1940s. Warming before 1940 accounts for 70% of the warming that took place after the Little Ice Age ended in 1850. However, only 15% of the greenhouse gases that global warming alarmists ascribe to human emissions came before 1940.

How do we even begin to adjust for land-based data that has been corrupted by the Urban Heat Island effect, and for the many instances where the raw data has been manipulated so badly that examples exist where Antarctic temperature readings were changed from minus to plus signs?

Do not believe the models, because they do not account for solar effects. With solar conditions becoming extreme, in that the sun is becoming very quiet, the models will not pick up on any solar secondary effects upon the climate, ranging from an increase in albedo (due to an increase in cloud coverage/snow coverage) to lower sea surface temperatures, atmospheric circulation changes and even major volcanic activity.

When solar is not extreme, models can get away without incorporating solar because the solar effects are too small, which occurs when the sun is in its regular rhythmic 11-year sunspot cycle.

    The above is in reference to long term and short term climatic predictions.

  8. https://wattsupwiththat.files.wordpress.com/2017/07/modsvsobs.png

models versus reality – conclusion: the models are way off and useless

PH17 essentially postulates that climate sensitivities are not linear to CO2 loading, hypothesizes that we are currently in a particularly insensitive part of the curve, and cobbles together an equation using “inflections” (thinly disguised parameters with no physical basis) that crudely reconciles the modelled and observed discrepancies.

    The implication is that at future higher concentrations, sensitivities will increase.

    Difficult to falsify; contrary to experimental evidence of a negative response curve to increased concentration.

  10. Nic Lewis is ruthless at finding errors.

  11. Don Monfort

    Starting pitcher and batting cleanup for the Red Team: Nic Lewis!

  12. If Steve M. were willing, he would be a great player on the Red team. Maybe he could just play part time coach? Willis E. would be good. These guys have a proven track record. Of course, J. Curry and some of the other professional scientists we all know well would be needed.

  13. We have already had one degree of warming from half a doubling giving an effective transient rate of 2 C per doubling. Clearly if ICS is less than this, it is not telling the full story of the warming we have had, and should not be used for policy relating to CO2 levels unless care is taken to add in the net of other proportionate anthropogenic forcings that they assumed to boost its effect by 30% in LC14. The effective transient rate for the last 60 years is 2.3 C per doubling. Lovejoy gets a similar number going back to 1750. This comes from taking the temperature change over the CO2 forcing change used as a proportional proxy for total anthropogenic forcing, two numbers with a lot more certainty than the uncertain ones going into ICS that has a range of 1-4 C per doubling. The ICS uncertainties include other GHGs, aerosols, volcanoes, solar effect and majorly the ocean heat content change over the considered period, and no wonder it has a large uncertainty. The relation between CO2 and temperature is inescapable and shows it to be a good proxy for total forcing.
    http://woodfortrees.org/plot/gistemp/from:1950/mean:12/plot/esrl-co2/scale:0.01/offset:-3.2

    • Jim D, the premise in your first sentence is false. From there your whole argument is circular and simply falls apart.

      • It’s the same premise as Lewis and Curry that all the warming comes from forcing. Move on.

      • No Jim D. No scientist has ever said that all warming comes from forcing and certainly no scientist has said that all warming comes from CO2 levels. That is a different premise to your first statement and is plainly at odds with the warming in the first half of the twentieth century. You can’t expect a free pass when you write things like that.

      • The premise of LC14 is that all the dT comes from the dF with dQ being the only other term, which is the unrealized imbalance and also contributes positively to the ICS. Read it.

      • Jim

        So what date do you estimate that we will reach the warmth experienced during the MWP, the Roman warm period and the Minoan warm period?

        Tonyb

      • We already blasted through 1 C above preindustrial on the way to 2 C by 2050, 3-4 C by 2100 and more after that. What is your estimate for the temperatures of those periods? My understanding is that no one really knows the global average, so if you have it, it would be great. Before you answer, there was a paper that the earth has cooled less from those periods until preindustrial than was first thought. It corrects Marcott’s cooling trend into something that could be flatter to even warming. It’s related to a bias towards NH summer trends in previous proxy datasets.
        http://www.readcube.com/articles/10.1038/ngeo2953
        It resolves the conundrum that the models want to warm during this period due to increasing CO2, and the jury is out on how relatively warm those periods really were. Holocene temperatures are still a moving target.

      • Jim

        I go by records of observations, crop dates and harvests, glacier lengths, types of trees and the altitudes they can grow at, together with sea levels. None of those indicate today is as warm as some periods in the past.

        Your link did not lead anywhere as it said it was not available for a mobile device and I am using my iPad

        Tonyb

      • The paper is by Baker et al. in Nature Geoscience from May 2017. The NH summer bias in proxies has led to a misleading emphasis of the summer cooling Milankovitch trend. If the proxies were more balanced, you would get a different story. The paper identifies Urals cave data that shows warming winters. When you add that to your picture it becomes less certain, but it seems to agree with CO2 and sea levels rising slowly since the last Ice Age.

      • That study you link is 1 location I think. Marcott is many more.

      • It has implications for Marcott that it says is more weighted to NH summer. The NH summer has been cooling due to precession and now occurs at almost the furthest distance from the sun. Meanwhile the NH winter and SH summer have been warming for the same reason, but proxies don’t sample those so well. Also given the CO2 increase and sea-level rise it is consistent with those other pieces of information which had been a conundrum with the cooling. Generally makes more sense to me.

      • @ Jim D – 1 time series in 1 location (Ural Mountains) isn’t enough to reverse the trend of Marcott et al. Though you do have a bit of a point about a potential bias in the Marcott et al. reconstruction.

      • stevefitzpatrick

        Jim D,
        While I normally only find your comments humorous, this:

        “We already blasted through 1 C above preindustrial on the way to 2 C by 2050, 3-4 C by 2100 and more after that.”

        Is worth noting. So, 1C warming by 2050…. that is 0.3C per decade, or about twice the measured rate of warming for the last 4 decades. If the warming over the next decade is way less than 0.3C, will you admit the sensitivity to forcing is lower than you have claimed? (I suspect not.)

      • The rate of warming per CO2 increase is about 1 C per 100 ppm. So 3 ppm per year averaged over the next few decades which would be a BAU scenario would lead to the next 1 C in the vicinity of 2050 along with the 500 ppm levels by then.

      • Cut it out Jim D. CO2 didn’t magically replace natural variation in the 1980s. No scientist worthy of the name would be so silly as to claim 100% attribution the way you do.

      • About 50% of the effect has been since 1980 where it dominates because of its rate of change. The other 50% is spread out over the century before that, a much smaller rate of change against the background things going on. So it doesn’t surprise me that the rate is more noticeable now.

      • JimD:

        “The time series on the left shows that the global annual temperature for combined land and ocean surfaces in 2005 was 0.58°C (1.04°F) above average, ranking 2nd. The time series from the improved Smith & Reynolds data set on the right provides a global temperature of 0.62°C (1.12°F) above the 1880-2004 mean, while 1998 was 0.59°C (1.06°F) above average.”

        From https://www.ncdc.noaa.gov/sotc/global/200513

        We hit 380 in about 2005, so 100 ppm (280 to 380) is about .6 ish C, not 1C. Obviously, cutting your number by 40% would make a huge difference on forward projections.

        Lets be generous and say that warming has actually been .8C for 100 ppm (taking out the recent El Nino), that is still 20% less than your 1C number.

        Where do you get 1C per 100 ppm? Just curious what your source is for this number, since it seems a bit high to me.

      • Richard I get it from GISTEMP and the CO2 levels over the last 60 years. I linked this at the top, but here it is again. Note that the scaling is exactly 100 ppm per degree.
        http://woodfortrees.org/plot/gistemp/from:1950/mean:12/plot/esrl-co2/scale:0.01/offset:-3.2

      • Jim D, every scientist worthy of the name knows that correlation is not causation. Not even the worst of the data fiddlers claim that there is a linear relationship the way you do.

      • It’s evidence.

      • No Jim D. Every scientist worthy of the name knows that correlation MAY OR MAY NOT be evidence of causation.

      • Yes, some deny that there is even remotely a chance it could be evidence of a connection.

      • Well Jim D, you’ve reached the point once again where you need to say exactly who is denying exactly what. Good luck.

      • … and of course it is right about this stage that our little conversations are forcefully ended and erased forever. All the best.

      • stevefitzpatrick

        Jim D,
        1C per 100 PPM is it? Well, aside from the forcing function being roughly logarithmic, not linear, aside from the shading of the temperature data (as already pointed out), and aside from the fact that CO2 is only about 65% of the overall forcing, you did not answer a simple question: If warming is well below 0.3C over the next decade, will you then admit that climate sensitivity is lower than you currently think? I mean, you made a specific projection: 1C increase by 2050. That is far higher than the average CMIP5 projection, and wildly out of line with the rate of warming for the last few decades. I would like to know what it would take to convince you your projection is mistaken.

      • Steve, depending on the uncertain aerosol effect, CO2 is 75-100% of the net forcing. If CO2 grows by 30 ppm per decade, a moderate BAU scenario, yes, 500 ppm and another 1 C by 2050, 35 years after we crossed the first 1 C, is possible and within the error bounds. The land is already warming by 0.3 C per decade since about 1980, so it is not an unheard of warming rate.

    • JIM D there is no warming due to CO2. That is what is inescapable.

    • https://twitter.com/cohodashoward/status/883694280715489281

      More reality and exposing the fake co2 Arctic warming scam..

    • He doesn’t mention the greenhouse effect. Does he know about it, and that the largest forcing change this last century by far has been from increasing greenhouse gases? If he doesn’t even want to talk about the biggest factor of the century, it is a waste of time.
      http://www.realclimate.org/images/ipcc_rad_forc_ar5.jpg

      • That is 100% fake news Jim D

      • And because of that fake data the climate models are way off .

        As was shown by my earlier post, and this is just the beginning. Next year at this time, instead of being off by 100%, the models will probably be off by 200%.

        The fake CO2/global warming myth is coming to an end.

      • JIM D – you say the global temperatures will go up from here. I say the global temperatures will be at or below the 30 year means within a year or so.

        We will see who is correct you have your reasons and I have mine. We shall see.

  14. Let me start by saying global cooling has started this year and will be the rule going forward.

    Solar is now reaching the criteria which should produce this.

    My contention is very low solar will increase the albedo of the earth and lower sea surface temperatures which will result in global cooling.

    To me this is a very reasonable theory and why some would dismiss this I can’t understand especially when one looks at the historical climatic record and sees how the climate has responded to very low prolonged solar conditions in the past.

    Not to mention all the possible solar/ terrestrial ties which could move the climate into another mode ranging from an increase in volcanic activity , to more global cloudiness ,snow cover, atmospheric circulation changes ,lower sea surface temperatures etc.

    The test is on now and we will know well before 2020.

    • Well thank you Salvatore:
      from a sociopolitical point of view, should we not be trumpeting this simple data from the global rooftops? Here in the UK, where politically it’s now a priority to attract the now-important under-25 vote, even right-of-centre politicians want to seriously ratchet up political commitment to the Paris 0.2 degree reduction nonsense and also fervently increase measures to reduce fossil fuel use. Or do we wait for 2018, with cooling average mean global temperature and 200%-wrong IPCC computer models? I find it so hard to see how the ‘religion’ of climate change will effectively collapse. Doctors were still bleeding patients 200 years after William Harvey proved that blood circulates around the body, and that bleeding simply weakens it.
      Kevin

  15. http://models.weatherbell.com/climate/cdas_v2_hemisphere_2017.png

    Encouraging if one is calling for global cooling which I have been doing for years when solar conditions reach low average value parameters following 10+ years of sub solar activity in general which is finally being realized in year 2017.

    I expect temperatures to be at or below 30-year means not 5 years from now, not 10 years from now, but within a year or so.

    They are almost there even now.

  16. Pingback: Does a new paper really reconcile instrumental and model-based climate sensitivity estimates? — Climate Etc. – NZ Conservative Coalition

  17. With PH17, hasn’t Blue conceded that there is a ‘pause.’
    Why stand in their way?

    • Simply for the sake of the best work product possible.

      • I can accept that.
        Guide them and then let them pretend it was their idea.
        It was clear to me who the good guys were from the very beginning.

  18. None of this will matter when Pruitt terminates all funding for federal data collection. No data = stable climate. Simple and something trump can understand.

    • I think this to be highly unlikely. And you have no evidence it is being discussed by the Trump administration.

      • For NASA, Trump wants to reduce budgets for instruments pointed at earth, and increase funding for space telescopes and missions to Mars and Europa. This is a clear slap at getting data relevant for climate change. He doesn’t want to know.
        https://spaceflightnow.com/2017/05/23/trumps-nasa-budget-request-reduces-earth-science-eliminates-education-office/

      • Jim D:
        Your link:
        “The Trump administration has made no secret of its skepticism about global warming and its presumed causes and impacts and as expected, the budget eliminates funding for five Earth science missions and instruments. Earth science would receive $1.8 billion overall, reflecting a reduction of nearly $170 million.”

        ““The hard choices are still there, and we can’t do everything,” Lightfoot said. But the budget “still includes significant Earth science efforts, including 18 Earth observing missions in space as well as airborne missions.””

        Why is it that Japan or some other country can’t do these needed missions? The developed nations hate us. They can pay for this stuff. Don’t call us names and expect us to pay for all of it. If Germany thinks it’s so important, judging by all their wind turbines, they can pay for instruments pointed at Earth.

      • It’s just another area where Trump wants to give up the US lead. They are withdrawing from the world. At least they’ll go to Mars and Europa with that money. Great.

      • Don Monfort

        We are 20 trillion$ in debt. Why should we continue to pour billions and billions into a settled science? Isn’t a 97% consensus good enough? All we will find out from the usual suspects is that it’s worse than we thought.

        We have been informed by our BEST scientist, Steven Mosher, that the settled science blue team is done with its work. Thank you, Steven. Fine and dandy. All funding under Trump Rules goes to the Red Team. Elections have consequences. Just ask the previous POTUS, whose legacy is being flushed down the toilet.

      • Jim D:
        People didn’t want to go to the Moon as we had enough problems on Earth. That kind of argument has been around for maybe 50 years.

        It is symbolic that instead of looking outwards to space, we turned around and looked at Earth.

        I have long been a fan of NASA. My son hears the story of the Saturn V launch vehicle. The Dawn mission. How to catch something in orbit, hit the brakes and take the inside track. I told him you can’t reuse boosters (landing fuel has to be launched) as it doesn’t make sense. Guess I missed on that one.

      • Don Monfort

        Trump isn’t giving anything up, yimmy. He is leading in another direction. Away from silly globalist no borders socialism. We didn’t get to be the most successful nation in history and savior of the planet from totalitarian domination on that dumb crap.

      • We have a rapidly changing planet. There is no more important time than now to keep track of the changes. Satellites are the best way to do that and create a long-term continuous record, which is just what is needed for climate. NASA’s Mission to Planet Earth was one of the most worthwhile things they’ve been doing with taxpayer money. We now have satellites that trace CO2 and trace gases in the atmosphere, albedo changes, top-of-atmosphere radiation budgets, gravitational changes due to glacier melting and water motion, etc. Important stuff.

      • Don Monfort

        What about healthcare, yimmy? We need many, many billions to do something about the failure of Obamacare. Money doesn’t grow on trees. We will still have satellites looking at the earth. Stop the ridiculous overwrought pearl clutching. Trump Rules! Get over it.

      • jim2 – You really don’t know how the government works. Oh to be so innocent.

      • David Springer

        Looking inward from space instead of outward is the ultimate form of navel-gazing by the loony left moonbat brigade.

      • NASA shouldn’t be collecting data from Earth. We have a few Earth-centered agencies for that. NOAA, for instance. And I second the idea that some other countries can pick up the study of Earth. It’s time for the rest of the world to do their part.

        And then there’s the fact that nothing has happened yet. Trump has shifted positions from before to after being elected due to the more complete knowledge that comes with being President. You still have no clue what might happen.

        Finally, what I have found WRT intentions after the election is this:
        “On the chopping block are the Plankton, Aerosol, Cloud, ocean Ecosystem (PACE) satellite; the Orbiting Carbon Observatory-3 (OCO-3) experiment; the Climate Absolute Radiance and Refractivity Observatory (CLARREO) Pathfinder; and the Deep Space Climate Observatory (DSCOVR).”

        Not exactly gutting the entire Earth program, is it?

        https://www.space.com/36112-trump-budget-cancels-nasa-earth-science-missions.html

      • We have a rapidly changing planet.

        No, we don’t. If we did have a RAPIDLY changing planet it would be possible to measure those changes. As it is, the changes are so slow and minuscule that they fall within both the natural variability of the climate and the measuring error of our instruments.

        Without adjustments, the global temperature record doesn’t show today being warmer than 60 years ago. The Climate Faithful literally needed to reduce what they claimed the global temp was 15 to 20 years ago to ‘bust’ the Pause. ‘Rapid’ changes aren’t ones you have to accept on faith.

    • This hallucination that Trump is stupid is really funny to me. Some real cognitive dissonance going on.

  19. Nic Lewis, thank you for the essay.

  20. Thank you Nic, please keep up the work for a good cause. JIM2, The reduction in Federal budget proposal for temperature recording is in line with a stable source. We do not need to increase such measurements, we just need to keep track of the parameters now known. Other governments could contribute to the funding, don’t you think? I think the data set from UAH is the best overall data set. Keep that one funded.

    The USA is leading the world, or maybe you have just not noticed.

    • A real leader doesn’t do the work, tek. Others need to pitch in.

      NOAA runs the microwave sat program, not NASA.

  21. I used to think the Argo floating sensors were good. Then I read that they “adjusted” them to match the temps from ships’ engine-room logs – evidently, the ship records were deemed better than the Argo temps. Really? When I was sailing (early ’70s) you could not read the thermometer with greater accuracy than +/- 2 degrees. I was amazed at how stable the temps were. The Captain used them to determine when we sailed into a warm current.

    Anyway, unless we use the “raw” Argo temps, I don’t trust the results.

    • Tek Jeff Chap – Thanks for the headsup! Flabbergasted.

      I have been unable to find a source for this; it seems to fly in the face of common sense with impunity. Am I to presume that the raw intake temperatures (which do not vary predictably, as vessel type, weather, instrument, etc. aren’t really considered) are adjusted (homogenised, cut, interpolated, kriged, etc.) to be used as a control for the Argo floats (which of course then reconcile with the shipboard data perfectly)?

      That makes no sense to me at all, especially given the new roll out of the deepwater program as tacit endorsement of the 0-700m and 0-2km work. Ideally also a comprehensive 0-bottom depth to better understand the other 48% Vol of the oceans.

      I too ply the SH Indian and SH Pacific and occasionally tropics. I’m with you Tek Jeff!

  22. “No data = stable climate”

    Well, since climate is not scientifically determined, we can safely dismiss this as poetry.

    Andrew

  23. Nice analysis, Nic. Important to show this paper wrong.
    IMO there is no way to reconcile observational with modeled sensitivity, because the models are fatally flawed. Essential processes like convection cells need grids less than 4 km a side. That is computationally intractable by about 6 orders of magnitude. So they have to be parameterized, and the parameters tuned to best hindcast. That drags in the attribution problem. The warming ~1920-1945 is indistinguishable from the warming ~1975-2000. AR4 WG1 fig SPM.4 says the former warming is mostly natural. Natural variability did not stop in 1975, yet the CMIP5 assumption is explicitly that the later period is all GHE, and the tuned CMIP5 hindcasts are explicitly from YE 2005 back 3 decades to 1975. See models and attribution guest posts at WUWT for details.

  24. Physics is a social arrangement joined by a common set of equations.

    Climate science is a social arrangement joined by a common set of predictions.

  25. So true.

    Let me see if I understand this

    Global warming will make the world warmer, unless it doesn’t, then it won’t. If it doesn’t and it won’t, global warming will make the world colder, even though you counted on it getting warmer and windier, and invested a very large amount of money based on the idea that you could use the effects caused by warming to make electricity with giant windmills, that now aren’t getting enough wind because global warming really made it colder instead of warmer the way all the computer models predicted.

    And though all the global warming prophets of doom were wrong about it getting warmer and windier, they’re really right, because cooling is caused by warming, even though it’s not windier, and you’re stuck with outrageous electric bills, but it’s not their fault.

    Or something like that

    • aporiac1960

      Salvatore del Prete: “And though all the global warming prophets of doom were wrong […..] they’re really right”

      I’m pretty sure the explanations will be along the lines you describe. Look forward to the mainstream media headlines when none of the ‘incontrovertible’ doom and gloom actually comes about : –

      “New paper vindicates climate scientists: Why they were right in every important sense despite being totally wrong!”

      The fact is, doublethink is no problem for these people – it’s their natural mental habitat.

  26. Nic: Very useful. However, every time you carefully illustrate how ECS is derived from 4XCO2 experiments, I rebel at how irrelevant these experiments appear to be, because they involve conditions different from those our planet will ever encounter. (We have discussed increased stratification of the ocean. There are also transient effects that cause some to ignore the data point for the first year.)

    There are a lot of different definitions for radiative forcing. Does extrapolation from a 4XCO2 experiment provide us the number we really need to know (or is 3.7 W/m2 a better answer)? This may be the fundamental disagreement between you and the authors of PH17.

    From my self-taught perspective, we have a climate feedback parameter (lambda, in W/m2/K) that tells us how the sum of LWR cooling to space plus reflection of SWR varies with surface temperature. This is an innate property of our planet that determines climate sensitivity in units of K/(W/m2). It may not be linear. F_2x is the factor that allows me to convert K/(W/m2) to K/doubling. F_2x is supposed to be settled science (3.7 W/m2) – at least until one looks at the wide range of ERF_2x and associated uncertainty produced by extrapolation to t = 0 in 4XCO2 experiments with AOGCMs.

    However, one doesn’t appear to need an F_2x to get ECS from a Gregory plot. ECS is obvious from extrapolation to the x-intercept, which is already measured in units of temperature. Gregory plots don’t use F_2x to convert W/m2 to doublings – or do they? Extrapolation to the x-axis involves a slope, and that slope is “a” climate feedback parameter (W/m2/K). F_2x appears to be about 2 W/m2 based on that slope when the modeled planet is about 4 K warmer. However, this is based on interpreting the output from a 4XCO2 run in terms of a two-compartment model.
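
    A minimal numerical sketch of the Gregory-plot procedure described here may clarify how the slope and x-intercept interact. The data are idealized and made up (exactly linear, not from any GCM); with real model output the points scatter and curve, which is the whole issue.

```python
import numpy as np

# Idealized (made-up) annual means from an abrupt-4xCO2 run:
dT = np.array([2.0, 3.0, 3.6, 4.0, 4.3, 4.6, 4.8])   # warming, K
N = np.array([5.4, 4.1, 3.32, 2.8, 2.41, 2.02, 1.76])  # TOA imbalance, W/m2

# Fit N = F_4x - lambda * dT.  The slope is minus the climate feedback
# parameter; the y-intercept is the regression-based 4xCO2 forcing.
slope, intercept = np.polyfit(dT, N, 1)
lam = -slope       # feedback parameter, W/m2/K
F4x = intercept    # effective 4xCO2 forcing, W/m2
ECS = F4x / lam / 2.0  # x-intercept, halved to go from 4x to 2x CO2
print(round(ECS, 2))   # ~3.08 K for this idealized data
```

    Note that ECS comes out of the plot without ever quoting F_2x in W/m2: the x-intercept is already in temperature units, and F_2x enters only implicitly through the regression intercept.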

    In my dreams, I’d like to take some or all time points in a 1% pa experiment, artificially change the concentration of CO2, and then calculate the instantaneous radiative forcing, Fi, at each time point. (Do we care that the stratosphere and atmosphere have not responded? No. They are busy responding to seasonal changes and weather. The planet is never truly in equilibrium. I’d also raise and lower the temperature by 0.5 K to get instantaneous Planck feedback at each point. I think these radiation transfer calculations are relatively cheap computationally.) The average of these Fi’s is the value for F_2x that I most want to know – at least for converting lambda into ECS. If this average Fi is changing as the planet warms, that is something we need to know. Perhaps Fi on a planet 4 K warmer than today really is only 2 W/m2 because changes in clouds and humidity influence this number.

  27. I don’t want to sound stupid, but if volcanic forcing was an average of -0.4 watts per m2, and the recent forcing is about 0.7 watts per m2…and we don’t have much volcanic forcing at this time, and average surface temperature is 0.9 degrees over pre industrial….does it all fit? What am I missing?

    • Recent anthropogenic forcing was 2.3 W/m2 in 2011 per AR5 vs 1750, or 2.2 W/m2 vs 1860-80, which like the last 20 years was a low-volcanism period. Solar forcing changes are negligible when averaged out over solar cycles, in TSI terms at least. And recent volcanic forcing is ~-0.1 W/m2 unadjusted, or +0.3 W/m2 vs the average. So for the mean over the last decade or so we have a total forcing change of 2.2 W/m2 from 1860-80 and a GST change of 0.8 C (globally-infilled HadCRUT4v5), implying a transient (not equilibrium) climate sensitivity of ~1.35 C (F2xCO2 is 3.7 W/m2). No idea where your 0.7 W/m2 for recent forcing came from.
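
      The arithmetic in the reply above, spelled out; the inputs are the values as quoted, and this is just the energy-budget scaling, not a full sensitivity estimate with uncertainties.

```python
F2x = 3.7  # forcing for a doubling of CO2, W/m2 (AR5 value)
dF = 2.2   # total forcing change vs 1860-80, W/m2
dT = 0.8   # GST change vs 1860-80, C (globally-infilled HadCRUT4v5)

# Effective (transient) sensitivity scaled to one CO2 doubling:
S = F2x * dT / dF
print(round(S, 2))  # 1.35 C per doubling
```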

  28. Pingback: Nieuws over de klimaatgevoeligheid, maar geen spectaculair nieuws | Klimaatverandering

  29. Nic Lewis,

    I expect you may have thought about how the discrepancy between the model and observational estimates of ECS could be resolved. Could you share your thoughts on this? Do you have any pragmatic suggestions as to how it could be done, what additional data is needed, how could it be collected, how long would it take and how much funding would be required?

  30. Peter: Nic has pointed me to two papers that may provide a start to an answer.

    J. M. Gregory,T. Andrews. (2016) Variation in climate sensitivity and feedback parameters during the historical period. http://dx.doi.org/10.1002/2016GL068406

    “We investigate the climate feedback parameter α (W m−2 K−1) during the historical period (since 1871) in experiments using the HadGEM2 and HadCM3 atmosphere general circulation models (AGCMs) with constant preindustrial atmospheric composition and time-dependent observational sea surface temperature (SST) and sea ice boundary conditions. In both AGCMs, for the historical period as a whole, the effective climate sensitivity is ∼2 K (α≃1.7 W m−2 K−1), and α shows substantial decadal variation caused by the patterns of SST change. Both models agree with the AGCMs of the latest Coupled Model Intercomparison Project in showing a considerably smaller effective climate sensitivity of ∼1.5 K (α = 2.3 ± 0.7 W m−2 K−1), given the time-dependent changes in sea surface conditions observed during 1979–2008, than the corresponding coupled atmosphere-ocean general circulation models (AOGCMs) give under constant quadrupled CO2 concentration. These findings help to relieve the apparent contradiction between the larger values of effective climate sensitivity diagnosed from AOGCMs and the smaller values inferred from historical climate change.”

    If I understand correctly, models show an increase in radiative cooling to space with rising Ts that is consistent with an ECS of 1.5-2.0 K. Some transport heat more slowly into the ocean, and therefore show a higher TCS. (ECS is independent of the rate of ocean heat uptake.)

    Andrews (2015) http://journals.ametsoc.org/doi/full/10.1175/JCLI-D-14-00545.1

    “Experiments with CO2 instantaneously quadrupled and then held constant are used to show that the relationship between the global-mean net heat input to the climate system and the global-mean surface air temperature change is nonlinear in phase 5 of the Coupled Model Intercomparison Project (CMIP5) atmosphere–ocean general circulation models (AOGCMs). The nonlinearity is shown to arise from a change in strength of climate feedbacks driven by an evolving pattern of surface warming. In 23 out of the 27 AOGCMs examined, the climate feedback parameter becomes significantly (95% confidence) less negative (i.e., the effective climate sensitivity increases) as time passes. Cloud feedback parameters show the largest changes. In the AOGCM mean, approximately 60% of the change in feedback parameter comes from the tropics (30°N–30°S). An important region involved is the tropical Pacific, where the surface warming intensifies in the east after a few decades. The dependence of climate feedbacks on an evolving pattern of surface warming is confirmed using the HadGEM2 and HadCM3 atmosphere GCMs (AGCMs). With monthly evolving sea surface temperatures and sea ice prescribed from its AOGCM counterpart, each AGCM reproduces the time-varying feedbacks, but when a fixed pattern of warming is prescribed the radiative response is linear with global temperature change or nearly so. It is also demonstrated that the regression and fixed-SST methods for evaluating effective radiative forcing are in principle different, because rapid SST adjustment when CO2 is changed can produce a pattern of surface temperature change with zero global mean but nonzero change in net radiation at the top of the atmosphere (~−0.5 W m−2 in HadCM3).”

    If I understand this correctly, cloud feedback in 4XCO2 runs changes sign and becomes strongly positive after 3-4 K of warming. This raises ECS dramatically. When model ECS (and feedbacks) are non-linear with temperature, then there isn’t a single “correct” value for ECS. We haven’t experienced enough warming to be impacted by this putative transition to a higher ECS. AOGCMs and EBMs could agree in 1% pa runs (but don’t, due to ocean heat uptake) and still have a much higher ECS extrapolated from 4XCO2 runs. One obvious thing to check is how ECS varies between 2XCO2 and 4XCO2 runs and at various time points in 1% pa experiments.

    The increase in cloud feedback produces what might be called “a runaway GHE” in the equatorial Pacific. The radiative imbalance in that region is increased, not decreased, by surface warming.

    • Just to add to that, the change in feedback strength seems to be linked to the time after a forcing is imposed, not to the temperature change that has occurred (which is a function of both forcing and time).
      Although the local feedback strength in the equatorial Pacific may indeed be very small, maybe of opposite sign to elsewhere, there is of course no runaway warming there, as the atmosphere and ocean can transport excess heat away from the region to areas where there is lesser warming.

      • Nic: Could you please clarify your last statement? Suppose I spun up an AOGCM with 2% more sunlight (=5 W/m2 solar forcing?) and t_0 was about 4 K higher than today. Then I perform an abrupt 4XCO2 experiment. Will cloud feedback start low and become increasingly positive? Or will it start as positive as it is at about +4 K in an ordinary 4XCO2 run and stay high?

        Is cloud feedback a function of Ts? Or a function of time since quadrupling?

      • franktoo,
        “Suppose I spun up an AOGCM with 2% more sunlight (=5 W/m2 solar forcing?) and t_0 was about 4 K higher than today. Then I perform an abrupt 4XCO2 experiment. Will cloud feedback start low and become increasing positive?”

        Interesting question. I think the answer is yes. Certainly it would be yes if the spin up was with 2x PI CO2 and then, after reaching equilibrium, CO2 was doubled again. GCM response to CO2 forcing is generally linearly additive, at least up to 4x CO2. (When T gets more than ~8 K higher than PI the response may increase as WV feedback becomes very high.)

  31. Pingback: Reconciling climate sensitivity estimates – part III, or IV? | …and Then There's Physics

  32. There’s a much easier way to model / predict what the climate is going to do. Look at what the ocean is doing, and add a few years lag:

    https://www.nature.com/articles/ncomms15875#t1

  33. stevefitzpatrick

    Jim D,
    I asked twice, and you have not answered. This is why I seldom bother to engage with your comments. It seems to be a political game for you, not a discussion with the possibility of progress. You can ‘project’ any level of warming you like, no matter how silly, but it is clear that you will not revise your projections in light of contrary evidence, which means it is not worth discussing your ‘projections’ with you. Ciao.

    • I gave you the numbers, but let’s be more explicit. Let’s take 35 years between 1 C and 2 C (e.g. 2015-2050). Let’s take a conservative-moderate BAU averaging 3 ppm per year. This gives 105 ppm, a CO2 level of 505 ppm. Let’s take 100 ppm per degree like the average of the last 60 years has been. That’s the 1.05 degrees where I got 2050 from. It goes as a log so maybe it will be a little less, but not much. Accounting for that, I get nearly 0.8 C by 2050, still going through 1 C in the 2050s decade. But these are all ballpark figures, and not a factor of two off as you imply from what looks like a simple extrapolation that does not account for emission increases.

      • stevefitzpatrick

        Bizarre. I ask if actual data… that is, measured warming much less than you project over the next decade… would convince you that your projections were mistaken. You reply with something about how sensible your projections are, and continue to ignore the question. Talk about obtuse. I guess that means your opinions are never subject to change by data.

      • The warming scales at 1 C per 100 ppm, which works out as effectively 2.3 C per doubling for the last 60 years. We can work out from this that for 2.5 ppm/yr as a current rate, we should get 0.02 C/yr which is about right. As the rate increases to average 3.5 ppm/yr, we can get 1 C by 2050 which requires an average of 0.03 C per year. The factor you forget, and I keep telling you all along, is the increasing emission rate under BAU. So far CO2 increase rates have doubled every 33 years, so using 3.5 is normal growth relative to history.

      • Don Monfort

        Little jimmy dee does not have opinions. He has beliefs and faith. You will only get incessant, unwavering recitation of the dogma out of little jimmy dee. Day after day, year after year.

      • stevefitzpatrick

        I must conclude that you are in fact a robot or something similar. No request to actually address the issue (reality versus projections) seems to enter your sphere of awareness. Fair enough, you are either a robot or profoundly dumb. I can’t choose between these, but which it is does not matter. I will not waste time on you again. Vai para o inferno e cama-se, filho da puta.

      • This was your question “If the warming over the next decade is way less than 0.3C, will you admit the sensitivity to forcing is lower than you have claimed?”
        My number is based on 60 years, so one decade won’t change it much, so there’s my answer. I will recalculate at that time, but by then we will have 70 years. Decades by themselves are not worth much because they are all over the place. This is what decadal trends look like. The red line is degrees per decade for a 10-yr smoothed temperature which is the green line.
        http://woodfortrees.org/plot/gistemp/mean:80/mean:40/derivative/scale:120/plot/gistemp/mean:80/mean:40

      • Don Monfort

        Would you like me to translate the Portuguese for you, yimmy?
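The back-of-envelope extrapolation debated in this thread can be reproduced in a few lines. This sketch assumes a 2015 CO2 level of 400 ppm (a round-number assumption, not stated in the thread) and uses the 2.3 C-per-doubling and 100-ppm-per-degree figures quoted above:

```python
import math

c_2015 = 400.0            # assumed starting CO2 in ppm (not stated above)
c_2050 = c_2015 + 35 * 3  # 35 years at 3 ppm/yr -> 505 ppm

# 1 C per 100 ppm rule of thumb (linear form):
linear = (c_2050 - c_2015) / 100.0
# Same projection using the logarithmic form at 2.3 C per doubling:
log_form = 2.3 * math.log(c_2050 / c_2015) / math.log(2.0)

print(round(linear, 2), round(log_form, 2))  # 1.05 and ~0.77
```

The log form gives about 0.77 C rather than 1.05 C, consistent with the comment’s “nearly 0.8 C” after accounting for the logarithmic dependence; whether either scaling holds forward in time is exactly what the thread disputes.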

  34. Jim D: Let’s take 35 years between 1 C and 2 C (e.g. 2015-2050). Let’s take a conservative-moderate BAU averaging 3 ppm per year. This gives 105 ppm, a CO2 level of 505 ppm. Let’s take 100 ppm per degree like the average of the last 60 years has been. That’s the 1.05 degrees where I got 2050 from. It goes as a log so maybe it will be a little less, but not much. Accounting for that, I get nearly 0.8 C by 2050, still going through 1 C in the 2050s decade.

    We’ll know soon enough how accurate that is.

    I am glad that you wrote it out. I may not be around to assess the accuracy of your forecast (projection, prediction, estimate, etc) but someone will.

    • Matt: You need to put a confidence interval around your projection. We know warming has averaged about 0.17 K/decade, but predictions made using that observed rate could be off in any one year by about +/-0.2 K due to El Nino or La Nina, or another decade-long hiatus or period of rapid warming. Or you might want to narrow your range by averaging 2045-2055. Are you talking about global temperature? What happens to your prediction if CO2 isn’t 505 ppm then? How about a prediction for 450-550 ppm CO2 in 2050, assuming linear interpolation? If CO2 is 5% higher than you thought, does your projected range go up by 5%?

      • He was quoting mine. I have many predictions, one of which is 505 ppm by 2050. Depends on the assumptions. The largest uncertainty is emission rates.
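A minimal illustration of the interval asked for above, using the 0.17 K/decade trend and the rough +/-0.2 K single-year variability figure from the comment (the 2015 baseline year and an unchanged trend are assumptions for the sketch):

```python
# Back-of-envelope interval for further warming from 2015 to 2050,
# attaching a crude uncertainty band to a trend extrapolation.
trend_per_year = 0.17 / 10.0      # K/yr, from the ~0.17 K/decade figure above
years = 2050 - 2015               # 35 years
central = trend_per_year * years  # central estimate of further warming, ~0.6 K
variability = 0.2                 # K, rough single-year ENSO-scale spread
low, high = central - variability, central + variability
print(low, central, high)
```

This is only the interannual-noise part of the uncertainty; as the comment notes, emission-rate and sensitivity uncertainties would widen the band further.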

  35. Pingback: Climate Sensitivity Estimates and Corrections – Enjeux énergies et environnement

  36. Pingback: Climate Sensitivity Estimates and Corrections – Climate, Forests & Woodlands

  37. Pingback: Sensible Questions on Climate Sensitivity – Climate, Forests & Woodlands