Week in review – science edition

by Judith Curry

A few things that caught my eye this past week.

Sun’s impact on climate quantified for the first time [link]

North Pacific 20th century decadal-scale variability is unique for the past 342 years [link]

A reconstructed South Atlantic Meridional Overturning Circulation time series since 1870 [link]

A robust empirical seasonal prediction of winter NAO and surface climate [link]

An interannual link between Arctic sea-ice cover and the North Atlantic Oscillation [link]

Cosmic Rays Increase Cloud Cover, Earth’s Surface Cools [link]

“New Studies Confirm Solar Activity Plays Important Role On Driving Climate” [link]

Removing from models affects amplitude, frequency & regularity.  [link]

Melting sea ice may lead to more life in the sea [link]

Lovejoy: How accurately do we know the temperature of the surface of the earth? [link]

Russia identifies 200 lakes “bubbling like jacuzzis” with methane [link]

Deep-water masses in the Subpolar North Atlantic, where do they occur & what are the physics? [link]

New paper: “eastern Medit. experienced wetter-than-present summer conditions during the early–late Holocene.” [link]

“Influences of temperature & precipitation on historical & future snowpack variability over N Hemisphere in…Model” [link]

“The relationship between wintertime extreme temperature events & large-scale atmospheric circulations” [link]

“Higher Southern Oscillation Index & Pacific Decadal Oscillation trigger increase in frequency of heavy precipitation” [link]

Greenland’s Coastal Ice Passed a Climate Tipping Point 20 Years Ago [link]

“Climate seesaw at the end of last glacial phase” finds regional warming in Europe caused COOLING & snow in E Asia [link]

Sensitivity of attribution of anthropogenic near-surface warming to observational uncertainty [link]

Paper finds glaciers have been melting at the same rate since 1850 

Prediction of may help risk forecasts, especially near coasts.  [link]

Theory: oceanic feedback across Indian Ocean in 70 days triggers new MJO convection. [link]

The main outcomes of the Fourth International Workshop on the Advances in the Use of Historical Marine Climate Data. [link]

The many flavors of are traced by model test to coupling w/Pacific westerly wind bursts. [link]

Paper on East Asia summer monsoon  finds extreme rainfall more intense due to climate change [link]

Amplification of AMO by AGW [link]

“Severe testing of climate change hypotheses”, Joel K. Katzav, [link]

Weather/land model of 2012 US shows locally 2-3 deg. cooler air near . [link]

NH midlats haven’t been warming as fast recently due in part to decadal trends in strat polar vortex. [link]

Molecular liquid storage of solar energy [link]

New mesoscale convective prediction study shows errors matter a lot: [link]

How Will Earth Respond to Plans for Carbon Dioxide Removal? First Workshop of CDR Model Intercomparison Project [link]

Regional Greenland accumulation variability from Operation IceBridge airborne accumulation radar [link]

US varied over 15 yrs: frequency up, base heights down, especially in Eastern winters. [link]  

Dynamical reconstruction of the global ocean state during the Last Glacial Maximum [link]

Transient atmospheric response to a reduction of sea-ice cover in the Barents and Kara seas [link]

Precipitation-driven glacier changes in the Pamir and Hindu Kush mountains [link]

Enormous volcanoes vomited lava over the ancient Earth much more often than geologists had suspected. [link]

New approach treats missing parameterizations of organized in contemporary GCMs.  [link]

If climate models have trouble w/internal low freq var (underest amp) their use in attribution studies is limited. [link]

Climate change combines with fishing & nutrients to threaten world’s coral reefs [link]

Comparison of Arctic sea ice thickness and snow depth estimates from CFSR with observations [link]

Automated parameter tuning applied to sea ice in a global climate model [link]

Russian Scientists Predict Global Cooling In The Next Few Decades [link]

Variations of Northern Hemisphere Storm Track & Extratropical Cyclone Activity Associated w/Madden-Julian Oscillation [link]

How does SST variability over western N Atlantic control Arctic warming over Barents–Kara Seas? [link]

Skill possible for US precip & temp predictions out 3-4 weeks, using signal, etc.  [link]

Ocean State Report [link]

Contribution of natural variability to Arctic sea ice loss quantified [link]

Hypothesis testing in hydrology – theory and practice [link]

The effect of climate–carbon cycle feedbacks on emission metrics [link]

Soils could release much more carbon than expected as climate warms: [link]

New modeling study in demonstrates global effects on of the Atlantic Multidecadal Variability: [link]

New study identifies 2 processes that contribute to enhanced Saharan & increased Sahel : [link]

Flavors of ENSO and stratospheric polar vortex response: [link]

Role of external forcing and internal variability in regulating global mean surface Temperature [link]

Decision making under uncertainty

Turning uncertainty into useful information for conservation decisions [link]

Robust decision making in data scarce contexts: addressing data and model limitations for infrastructure planning [link]

Consensus? No, Good Decisions Require “Respectful Disagreement” [link]

Why most reasoning for policy interventions is (possibly) wrong [link]

About science

Comment: Research needs more competence, less ‘excellence’ – Nature

On being female in science [link]

The suicide of expertise [link]

This is as good an article as you’re going to read on science and innovation policy. [link]

“Expert reviewers spend a lot of time allocating grant money… But the truth is that they’re not very good at it” [link]

The Problem Is Epistemology, Not Statistics: Replace Significance Tests by Confidence Intervals [link]


169 responses to “Week in review – science edition”

  1. Pingback: Week in review – science edition – Enjeux énergies et environnement

  2. Dear Judith,

    Living on the other side of the Atlantic, in the mild climate of the west coast of Europe (Cascais, Portugal), I wonder why all the buzz about Climate Change…. In fact, we know from historical records carved in the landscape and in the territory that this was, is and will be a natural event.
    In Sahara caves we observe the same ancient pictures we can visit in caves in Spain and Portugal, yet they are under tons of sand and we aren’t, and that change took only a few thousand years to occur. Today I crossed the small Sierra de Sintra to visit some friends and the temperature dropped 5 ºC from the south bank to the north bank of the sierra, and that’s natural; the only thing man-made in the process is the highway that allowed me to do the crossing in just 10 minutes.
    Again, we have to put everything in perspective: today we move faster and we see and hear globally. Those are man-made achievements that facilitate our quality of life and our understanding of our environment, and they must be perceived just as they are: a very positive evolution in the human condition that we need to extend to all humanity.
    Some say this is not a good idea, that we are messing with the natural course of events and need to transition back to an earlier human condition; these are described as the Naturalists.
    Myself, I’m a Humanist, as most engineers and scientists are: driven by the curiosity of finding new solutions to enhance the human condition and eager to help solve actual and future problems. Nonetheless, there are some, a few, among us who prefer to put curiosity aside and deliver plausible explanations for the dogmas and myths that enforce religions and political currents; those are not Scientists, those are Sophists. And please beware of them: they will always keep deceiving their students, because that’s their ancient way of ensuring they’re paid!!!

    All the best,
    Alexandre Guedes da Silva
    Naval Architect and Ocean Engineer

  3. Really nice array of topics…

    The most recent warming hiatus apparent in observations occurred
    largely through cooling from a negative IPO extreme that overwhelmed the warming from external forcing. An important implication of this work is that when the phase of the IPO turns positive, as it did in 2014, the combination of external forcing and internal variability should lead to accelerated global warming. This accelerated warming appears to be underway, with record high GMST in 2014, 2015, and 2016.

    Call me names. Call me a cheerleader. Don’t care… I called this. Climate (ACO2 is the control knob) skepticism is a Cargo Cult. They got all hung up in personalities and political beliefs and the PAWS completely fooled them… despite the fact they read all about Feynman warning them not to fall for the fake physics of the spoon benders as they look like science, talk like science, act like science, make graphs like scientists, even calculate like scientists… and ain’t got no science. And assembling these fakes into a red team ain’t gonna change this.

    The stadium is waving a HEATWAVE… for how long, nobody knows. But for now, long enough to make 2017 a record warmest year, and maybe even 2018.

    Surely somebody told congress.

    • David Wojick

      Conjecture stated as fact is not science.

    • “We propose that there is negligible in situ atmospheric warming and that almost all of the added heat trapped by anthropogenic greenhouse gases is absorbed by and stored in the ocean. It is subsequently released through the action of oscillatory mechanisms associated with regime shifts.”

      https://wattsupwiththat.com/2017/03/31/a-ground-breaking-new-paper-putting-climate-models-to-the-test-yields-an-unexpected-result-steps-and-pauses-in-the-climate-signal/

      Still with the one variable control taking on the oceans. And since CO2 is so powerful, the warming should stay there in the oceans. But wait, El Ninos. CO2 ought to be able to stop those.

      “Given that the atmosphere interacts with the top 70 m of ocean over an annual cycle (Hartmann, 1994), there is ample opportunity for the majority of available heat trapped over land that is not absorbed by land, lakes and ice to be absorbed by the ocean.”

      Give it time. The oceans will be pulling the GMST in the downwards direction for decades.

      • JCH:
        The atmosphere is agile. Capable of back flips. The oceans are where the joules are. Massive. Slow to materially gain or lose joules.
        “Below the sea surface, historical measurements of temperature are far sparser, and the warming is more gradual, about 0.01°C per decade at 1,000 meters.”

        https://scripps.ucsd.edu/news/voyager-how-long-until-ocean-temperature-goes-few-more-degrees

        Everything will be Okay.
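        The “oceans are where the joules are” point can be checked with a back-of-envelope sketch using standard rounded reference values (ocean mass ≈ 1.4 × 10²¹ kg, atmosphere ≈ 5.1 × 10¹⁸ kg); the ocean’s total heat capacity comes out roughly a thousand times the atmosphere’s:

```python
# Rounded standard reference values.
OCEAN_MASS = 1.4e21   # kg, total mass of the ocean
OCEAN_CP = 3990.0     # J/(kg K), specific heat of seawater
ATMOS_MASS = 5.1e18   # kg, total mass of the atmosphere
ATMOS_CP = 1004.0     # J/(kg K), dry air at constant pressure

# Joules required to warm each reservoir by 1 K.
ocean_capacity = OCEAN_MASS * OCEAN_CP
atmos_capacity = ATMOS_MASS * ATMOS_CP

# The ratio is on the order of a thousand, which is why the ocean is
# "slow to materially gain or lose joules" relative to the atmosphere.
print(f"ocean/atmosphere heat-capacity ratio: ~{ocean_capacity / atmos_capacity:.0f}")
```

        Even a 0.01 °C-per-decade warming at depth therefore represents a vastly larger number of joules than the same warming of the air.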

      • Minnett… why greenhouse gases warm the oceans. Skeptics… because we’re pizzed off at Mann, greenhouse gases cannot warm the oceans:

      • JCH, lots of heat was lost from the oceans, which fueled ’15 & ’16, in addition to the anomalously low albedos.

        Predictions:
        1. 2017 global mean surface temperature anomalies will be lower than those of 2016, and likely lower than those of 2014.

        2. Global CO2 emissions will decline for a third straight year.

        Stand by for results.
        ( I will gladly accept your praise and awards after the fact ).

      • JCH:
        At my WUWT link above, in a comment I suggest again that CO2 and GHGs can warm the oceans. The author says something like a warm air parcel moves from land to the ocean that has a cooler surface. It cools. Where did the warmth go?

        With your ocean joules plot, half the ocean has hardly budged. As it has done so, it has dragged joules downward and will continue to do so, trying to remove the imbalance from the water above it. This is the anchor that will bound the GMST for a long time.

      • Most of the OHC lost when the El Niño ended and transitioned to La Niña was gained during the 2015-2016 El Niño itself. OHC rebounded nicely in the last quarter of 2016, and is likely back in 2015 territory by now. There is not much upwelling now, so OHC is zooming upward.

        The thermal skin layer is a bottleneck. The more GHGs pumped into the atmosphere, the hotter the TSL gets (especially during El Niño/positive PDO), and the hotter it gets, the less energy can pass through it. The only means of undoing this, on net, is a persistent negative imbalance at the TOA, and there isn’t going to be one of those for a very, very long time.

      • The Surface Layer bottleneck means the water moves sideways. Ocean water warmth transfer and storage improves. The polar regions only have this Surface Layer bottleneck if covered with sea ice. The suggested improved transfer of water is to the polar regions where, without sea ice, there is no bottleneck. The GMST goes up. This is a sign of the system operating to moderate change. When the GMST goes down long term, this storage and transfer is reduced.

      • Where did the warmth go?

        To space, where it all goes. It’s warmed from below, and by condensing water vapor late at night, and very little directly during the day.

    • We show evidence that the slowdown in surface temperature warming is also evident in the troposphere. The slowdown is a phase transition of internal climate variability with an evident atmospheric footprint, including interactions between large-scale circulation and cloud distribution. New evidence shows that the global warming trend resumed in 2014 because the IPO transitioned from negative to positive, and 2014 and 2015 are now the warmest two years on record. Future projections of climate need a better understanding of the combined effects of both external forcings and internal variability. Regardless of the variability of the external forcings, we believe that there is a probability that the atmospheric footprint of future decades resembles that of the preceding warming period (1983–1998), although the time span may depend on this ongoing positive IPO phase.
      http://www.nature.com/articles/srep40947

      It is driven by the intensity of westerlies in the polar annular modes – which are modulated by solar UV/ozone chemistry.

      Has the Hiatus ended? Difficult to say against a hyper-dynamic background of ENSO variability. ENSO and PDO have 20 to 30 year regimes that are phase locked. I keep making the point that this is not global warming – it is an Earth dynamic process that reorganises global ocean/atmospheric circulation and modulates climate over decades to millennia. The next shift is due in a 2018-2028 window – so JCH may be somewhat premature in declaring the end of a regime that – at any rate – has no bearing on AGW except to show that greenhouse gases are not the sole cause of global warming – at least during warm phases of the IPO.


      http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-12-00003.1

      More salt is La Nina and more rain in Australia. Coming off a 1000 year high – frankly I am expecting a super-Hiatus this century.

      • 1000 year El Nino high….

      • Well, what you are getting is an aggressive warming period described by Knutson, of GFDL, as a spring-back warming. Ironic given the paper was about prolonging the PAWS, which was completely dead before the ink dried.

        Before the current heatwave is done, GCMs will be running too cool; there will be a significant acceleration in SLR; the satellite surface temperature series will either be fixed or junked, and the shameful era of their misrepresentation to the American public as a gold standard will be over.

        38 straight months of positive PDO. I believe a record for JISAO.

      • So what is the alternative to the hypothesis-testing framework in which theories (hypotheses) are evaluated for mirroring nature, and data are substituted for phenomena? The alternative involves inference-laden signification and a world-directed point of view. It is semiotic in that it understands the world, not in a detached manner as a mere source of data, but as a complex interpretive structure, in which the investigator is immersed. This is a world mediated and sustained by signs that exist in a continuous, connected flow, a semiosis, in which the signs are things that stand for something else (their object) in relation to something else (their interpretant). A scientifically fruitful aspect of this view is the recognition of indexical signs in which the relationship to objects is one of causation. Although the world contains, or is composed of a semiotic structure (a semiosis) of indexical signs, the interpretant aspect of these signs is what is triggered in the investigator, whose thoughts, in turn, become new signs, constituting a continuity of the signs in human thought with those in the world [e.g., Baker, 1999]. Thus, it is through this semiosis, or action of signs, that the world “speaks” to the investigator [Baker, 2000].

        There is a vast difference between science-as-seeker and examining the entrails of a single indexical sign.

        It is hypothesized that where the PDO goes is caused by solar mediated Northern Annular Mode – which is more generally negative since the solar downturn in the 1990’s. You have little but sweetmeats and rather silly sounding rants.

        The WRF-simulated temperature field, when including soil moisture and LAI modification within the model, is shown to be most consistent with ground and satellite observations, all indicating a 2-3 K decrease of temperature in irrigated areas compared to the control run.

        Drought, as well as irrigation, changes the balance between latent and sensible heat. What should be scrapped are the surface temperature compilations.

        The two quotes – btw – are in papers linked in Judith’s post.

      • When they fix the obviously broken satellite series, then you can drop the thermometers.

      • When they fix the obviously broken satellite series

        lol, you’re right, they aren’t the junk that’s published as surface series.

  4. David Wojick

    The Lovejoy paper is a bit silly in that it looks at the differences among the surface statistical models and treats those differences as the accuracy, ignoring the many other sources of error, which affect all the models.
    https://link.springer.com/article/10.1007%2Fs00382-017-3561-9?utm_content=buffer3a609&utm_medium=social&utm_source=twitter.com&utm_campaign=buffer

    Still, they seem to get about 0.1 degree C, which is pretty big. As listed below, there are something like 10 different major error sources. If each is around 0.1 degree then, combined, they may well exceed a whole degree. This would explain the difference between the surface statistical models and the satellite readings, which seem to show no GHG warming at all.

    Here is my (previously posted) draft list of potential errors, which is still not complete by any means. This should be a major research area.

    Reforming NOAA research

    In addition to budget cuts we need to refocus climate research. Here is my proposal for NOAA global and regional temperature estimates. The first step is a white paper elaborating on these needs in some detail.

    A needed NOAA temperature research program

    NOAA’s global and US temperature estimates have become highly controversial. The core issue is accuracy. These estimates are sensitive to a number of factors, but the magnitude of sensitivity for each factor is unknown. NOAA’s present practice of stating temperatures to a hundredth of a degree is clearly untenable, because it ignores these significant uncertainties.

    Thus we need a focused research program to try to determine the accuracy range of these temperature estimates. Here is a brief outline of the factors to be explored. The goal is to attempt to estimate the uncertainty each contributes to the temperature estimates.

    Research question: How much uncertainty does each of the following factors contribute to specific global and regional temperature estimates? Each can be explored independently.

    1. The urban heat island effect (UHI). This is known to exist but its specific effect on the temperature recording stations at any given time and place is uncertain.

    2. Local heat contamination or cooling of temperature readings. Extensive investigation has shown that this is a widespread problem. Its overall extent and effect is highly uncertain.

    3. Other temperature recording station factors, to be identified and explored.

    4. Adjustments to temperature data, to be identified and explored. There are numerous adjustments made to the raw temperature data. These need to be cataloged, then analyzed for uncertainty.

    5. Homogenization, which assumes that temperature change is uniform over large areas, is a particularly troubling adjustment deserving of special attention.

    6. The use of sea surface temperature (SST) proxies in global temperature estimates. Proxies always add significant uncertainty. In the global case the majority of the surface is oceanic.

    7. The use of an availability sample rather than a random sample. It is a canon of statistical sampling theory that availability samples are unreliable. How much uncertainty this creates in the temperature estimates is a major issue.

    8. Area averaging. This is the basic method used in the surface temperature estimating model and it is a nonstandard statistical method, which creates its own uncertainties.

    9. Interpolation or in-fill. Many of the area averaging grid cells do not have good temperature data, so interpolation is used to fill them in. This can be done in many different ways, which creates another major uncertainty.

    10. Other factors, to be identified and explored.

    To the extent that the uncertainty range contributed by each factor can be quantified, these ranges can then be combined and added into the statistical temperature model. How to do this is itself a research need.

    Note that it is not a matter of adjusting the estimate, which is what is presently done. One cannot adjust away an uncertainty. The resulting temperature estimates will at best be in the form of a likely range, not a specific value as is now done.

    Most of this research will also be applicable to the other surface temperature estimation models, such as HadCRU, GISS and BEST.
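    The combination step in the final point can be sketched numerically. As a purely illustrative example (the 0.1 °C per-factor values are placeholders, not measured results), ten such uncertainty ranges combine very differently depending on whether they are treated as independent random errors or as correlated biases:

```python
import math

# Hypothetical per-factor uncertainty ranges in deg C for the ten
# factors listed above. Placeholder values for illustration only --
# estimating the real ones is the proposed research program.
factor_uncertainties = [0.1] * 10

# Independent, random errors combine in quadrature (root-sum-square).
rss = math.sqrt(sum(u ** 2 for u in factor_uncertainties))

# Correlated biases that can all push the estimate the same way
# combine, in the worst case, as a straight linear sum.
linear = sum(factor_uncertainties)

print(f"independent (quadrature): +/- {rss:.2f} C")    # +/- 0.32 C
print(f"correlated (linear sum):  +/- {linear:.2f} C")  # +/- 1.00 C
```

    Whether the combined range is closer to 0.3 °C or to a full degree therefore hinges on how correlated the error sources are, which is itself one of the research questions above.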

    • Steven Mosher

      Your points were all addressed.

      You need to REVISIT your claims based on red team results.

      You need to spend more time questioning your own assumptions
      and your own ability to understand things.

      Feynman would suggest you are fooling yourself.
      try not to do that.

      Start by listing every objection to your 10 points.
      Start with self-criticism.

      • Mosshher the great and powerful, you talk about somebody else fooling themselves with no apparent sense of irony. And you issue a set of demands about how somebody else can improve their ideas again with no apparent sense of irony.

        And there’s your ever present unsupported suggestion that everything of any importance has already been looked at and you know all the answers.

        Whatever happened to you?

      • David Springer

    • David, your list is worthwhile, but it assumes the manufacture of a single figure to represent the temperature of the earth.

      The problem with that idea is the amount of information which is thrown away in the process. It is why alarmists say that the reversal of the trend in Rutherglen makes no difference. And they are right. When the rest of the data fiddling is complete the figure comes out the same regardless of what has been done in a particular location.

      Better in my view is to reverse the process and look at the individual trends in actual locations first and then see how it all looks when those individual trends are aggregated over different scales.

  5. “Sun’s impact on climate change quantified for the first time”

    Another useless model. Global warming is caused by life on earth, it is not solar in origin.

    • David Wojick

      Does this include the ends of the various ice ages? How about the end of the recent little ice age?

      • nabilswedan

        David,

        Without life on earth, the climate would be like those of Mars or Venus: unchanged. Life on earth is the cause of fluctuations in surface temperature, and yes, of ice ages. This conclusion is not new: Vernadsky and Lovelock concluded as much. The mathematical proof has already begun to appear and more will be published soon.

      • David Springer

        “Without life on earth, the climate would be like those of Mars or Venus irrelevant”

        Fixed that for ya!

    • Good to know we can turn off the sun and remain comfortable.

      • nabilswedan

        kellermfk,
        You are way off. The sun is required, and the average energy balance of the earth is caused and maintained by the sun. However, the VARIATION in surface temperature is caused by life, definitely.

      • Excuse me, but you are the one claiming man is warming the planet. I merely followed the logical conclusion to your theory.

      • Also nice to know the sun has no influence on days getting warmer and colder (Variation).
        To claim man is the only element influencing the chaotic climate is nonsensical.

      • Daily and annual variations in surface temperature are caused by the sun, of course, due to variation in the distance between the sun and the earth. They are repeatable changes, with no net change in the long run. Milankovitch 100-kyr and similar cycles are weaker than the annual seasonal cycle (0.5 degrees or less). The observed long-term change in surface temperature (glaciations and deglaciations) is caused by life on earth, not humans alone.

  6. Judith,
    Item missing link: Hypothesis testing in hydrology – theory and practice [link]

    • link fixed. that is a REALLY interesting paper

        If a person chews on “Hypothesis testing in Hydrology” and “The Problem Is Epistemology, Not Statistics” and throws in a little Nassim Taleb for dessert, you have a feast for thought.

        You also have an even stronger case for common sense and humility in science.

    • David L. Hagen

      Debates—Hypothesis testing in hydrology: Pursuing certainty versus pursuing uberty Victor R. Baker PDF

      Abstract: Modern hydrology places nearly all its emphasis on science-as-knowledge, the hypotheses of which are increasingly expressed as physical models, whose predictions are tested by correspondence to quantitative data sets. Though arguably appropriate for applications of theory to engineering and applied science, the associated emphases on truth and degrees of certainty are not optimal for the productive and creative processes that facilitate the fundamental advancement of science as a process of discovery. The latter requires an investigative approach, where the goal is uberty, a kind of fruitfulness of inquiry, in which the abductive mode of inference adds to the much more commonly acknowledged modes of deduction and induction. The resulting world-directed approach to hydrology provides a valuable complement to the prevailing hypothesis- (theory-) directed paradigm.
      Plain Language Summary: This commentary suggests that a world-directed, investigative approach to hydrology may serve as a productive complement to the prevailing hypothesis- (theory-)directed, approaches. The emphasis of the former on discovery has the potential to be transformative for investigative hydrology.

  7. Only 3 items (out of 60+) that caught the ex-professor’s eye were related to the health of the biosphere. This is more important than measuring global climate statistics.

    More algae might help stabilize the upper levels of the Arctic food chain if we are lucky? Why not link to the original source instead of the filtered version on WUWT?

    I think this is the Copernicus Marine Environment Monitoring Report.
    “Since phytoplankton represent the first link between the marine biota and their energy source (sunlight), it is to be expected that changes in the marine ecosystems would first manifest themselves through changes in the phytoplankton concentration, their species composition and their phenology (timings of important events in the phytoplankton calendar). It is, therefore, of utmost importance to monitor phytoplankton concentrations at multiple time and space scales. …
    The 2015–2016 ENSO event was the strongest observed since 1997, and in parallel a large reduction in phytoplankton chlorophyll concentration occurs in the Equatorial Pacific Ocean (Figure 17), which has not been seen since 1997”
    Conclusions: It’s getting warmer, sea levels rising, and the biosphere is reacting at accelerated rates.

    A research paper that finds that if icebergs contain enough iron (iron dissolved from aeolian dust) then it can increase the local algae populations.

    • alanlonghurst

      The reduction in chlorophyll during the recent Niño is nothing unusual and has nothing directly to do with greenhouse ‘warming’. It is due to the relative lack of wind-stress at the surface during these events.
      Consequently, Coriolis-induced upwelling of cool, nutrient-rich subsurface water ceases on each side of the equator. Reduced nutrient replacement from below means reduced phytoplankton growth.
      The same process and sequence occurs in many regions, especially off ‘upwelling coasts’ like California. It has nothing directly to do with air temperature, but is treated as if it did in discussions of the global temperature record. Only if the frequency and strength of the Southern Oscillation (and of coastal wind regimes) is affected by greenhouse warming would this be true.
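      The wind-stress mechanism described here can be sketched with the standard Ekman relation V = −τx/(ρf); the stress values below are purely illustrative, but they show how a weakening of the easterly trades collapses the off-equator transport that drives the upwelling:

```python
import math

OMEGA = 7.292e-5  # Earth's rotation rate, rad/s
RHO = 1025.0      # seawater density, kg/m^3

def meridional_ekman_transport(tau_x, lat_deg):
    """Meridional Ekman volume transport per unit width (m^2/s) driven
    by a zonal wind stress tau_x (N/m^2): V = -tau_x / (rho * f).
    Easterly trades (tau_x < 0) push surface water away from the
    equator on both sides, so the surface layer diverges and cool,
    nutrient-rich water wells up in between."""
    f = 2.0 * OMEGA * math.sin(math.radians(lat_deg))  # Coriolis parameter
    return -tau_x / (RHO * f)

# Illustrative zonal stresses: normal trades vs the weak-wind phase of
# an El Nino event (values assumed for the sketch, not observations).
for label, tau_x in [("normal trades", -0.05), ("El Nino lull", -0.01)]:
    v = meridional_ekman_transport(tau_x, 2.0)  # just north of the equator
    print(f"{label}: ~{v:.1f} m^2/s poleward")
```

      The relation breaks down at the equator itself (f → 0); equatorial upwelling arises from the divergence of these transports on either side, which is why weaker winds mean less nutrient supply and less phytoplankton growth.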

    • ENSO was discovered by fishermen – the PDO was first described by biologists. Both originate in the upwelling of cold, nutrient-rich water in the eastern Pacific. Ultimately the 20th-century high in El Niño frequency and intensity will give way to the La Niña normal.

      There are far more serious biological issues – balancing the human ecology with prosperous communities in vibrant landscapes. This does include carbon dioxide, as changing abiotic conditions in ocean ecologies and hydrology in terrestrial ecosystems have consequences – minor or major, who knows. In these complex and dynamic systems the effects can be far-reaching, if unpredictable.

      The role of global warming in population declines is probably overstated as these things go. But the consequences of declines in the population of tens of thousands of species seems inevitable. There is a point where populations crash to extinction and ecologies collapse. That is not a world any of us want.

      Carbon dioxide seems the least of our problems. The problem there is to nurture economic growth with the cheapest energy sources until there is a transition to 21st century energy driven by actual market forces. This will happen with astonishing rapidity as the creative destruction of capitalism is unleashed.

      Reclaiming deserts and restoring soils, forests, aquatic ecosystems and grassland has major benefits in addition to offsetting anthropogenic emissions from fossil fuel combustion, land use conversion, soil cultivation, continuous grazing and cement manufacturing. Restoring soil carbon stores increases agronomic productivity and enhances global food security. Increasing the soil organic content enhances water holding capacity and creates a more drought tolerant agriculture – with less downstream flooding.
      There is a critical level of soil carbon that is essential to maximising the effectiveness of water and nutrient inputs. Global food security, especially for countries with fragile soils and harsh climate such as in sub-Saharan Africa and South Asia, cannot be achieved without improving soil quality through an increase in soil organic content. Wildlife flourishes on restored grazing land helping to halt biodiversity loss. Reversing soil carbon loss in agricultural soils is a new green revolution where conventional agriculture is hitting a productivity barrier with exhausted soils and increasingly expensive inputs.

      Our smart move is to restore soils and ecosystems to build prosperous, sustainable and resilient communities globally this century. Some 100 billion tonnes of carbon can be restored to grassland and woodland globally in 30 or 40 years. By then the transition to 21st century energy will be underway.

      • Do you think that ENSO is a recent phenomenon and was much weaker than today in the distant past? If so do you have a study or research to back up this conclusion?

      • Depends on how far back you want to go – I believe the modern ENSO emerged after the shoaling of the Isthmus of Panama.

        “On the Pacific side, weather patterns changed too, with deep waters along the western coasts of both continents continually welling up, and the domination of the cyclic El Niño pattern, in which the eastern ocean surface alternately warms and cools. El Niño now directly or indirectly drives rainfall, and thus agriculture, on scales of decades across much of Asia, and both Americas.” http://blogs.ei.columbia.edu/2014/03/31/the-isthmus-of-panama-out-of-the-deep-earth/

        Over the Holocene – immense variability is evident in a high-resolution sediment record. Christopher Moy and colleagues (2002) examined a sediment core from Laguna Pallcacocha in southern Ecuador. More rainfall and runoff from a warmer sea surface in the eastern Pacific washes more red sediment into the lake. So we know it was pretty rainy in South America 1,000 years ago. Some 5,000 years ago there was a change from more upwelling to less – that dried the Sahel. Just 3,500 years ago there was a long series of warm Pacific events with red intensity greater than 200 – and civilisations fell. For comparison – red intensity in the ‘monster’ 1997/1998 El Niño event was 99. Extremes in the Holocene put those of the 20th century to shame.

        Note the mid-Holocene transition from La Niña to El Niño dominated conditions.

        This is a fun one – http://www.clim-past.net/6/525/2010/

      • nabilswedan

        Thank you for the information, Robert. What about tree rings in the same area or on the western coasts of the Americas? Do the tree rings show a similar impression of ENSO?

  8. Dr. Curry,
    After reading through the article on being female in science I had some difficulty with it.
    It seems to me that social sexism pales in comparison to the intellectual discrimination you endure.
    The same people who are obsessed with identity politics attack you without the slightest effort to understand your actual position.

    Sexist bigotry is a minor problem compared to intellectual bigotry.
    Improve the latter and the former will follow.

    Thanks so much for week in review.

  9. oops sorry double post

  10. The molecular solar energy storage article is a puzzler. They’ve gone to a lot of expense and effort to come up with something that could, in the best case, store 300 kJ/kg (their number). Paraffin wax stores 200 kJ/kg in the form of heat of fusion. It’s already used to store solar energy, as in ThermalCORE phase-change material, a drywall made by National Gypsum. Microballoons filled with pure paraffin wax are mixed in the gypsum, absorbing heat during the day, and releasing it into the home at night. There are numerous other examples. It’s old hat.

    Of course, coal is also stored solar energy, but it contains 20,000 kJ/kg.
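    For perspective, the quoted densities line up like this – a trivial back-of-envelope script using only the numbers cited above, not independent measurements:

```python
# Energy storage densities quoted in the comment above, in kJ/kg.
storage_kj_per_kg = {
    "molecular solar storage (claimed best case)": 300,
    "paraffin wax (heat of fusion)": 200,
    "coal (chemical energy)": 20_000,
}

molecular = storage_kj_per_kg["molecular solar storage (claimed best case)"]
for name, density in storage_kj_per_kg.items():
    # Ratio relative to the molecular scheme's claimed best case.
    print(f"{name}: {density} kJ/kg ({density / molecular:.1f}x the molecular scheme)")
```

    Coal comes out roughly 67 times denser than the claimed best case for the molecular scheme, and 100 times denser than paraffin wax.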

  11. Roger Knights

    Judy: The link is inactive (unclickable) in: “Ocean State Report [link]”

  12. A reconstructed South Atlantic Meridional Overturning Circulation time series since 1870
    http://onlinelibrary.wiley.com/doi/10.1002/2017GL073227/abstract
    Paywalled
    Key Points:
    –A century-long reconstructed South Atlantic Meridional Overturning Circulation index from sea surface temperature is presented.
    –The Interdecadal Pacific Oscillation and Atlantic Niño are the main contributors of interannual/decadal covariability of SST and SAMOC.
    –SAMOC is currently in an anomalous positive phase.

    They use HadSST, ERSSTv3, ERSSTv4, and COBE-SST to reconstruct SAMOC. They find all reconstructions very coherent, which means the temperature databases are very coherent. They find that recently (since about 2012, from the figure) SAMOC has turned anomalously positive, indicating a larger-than-normal volume transported northward. This is attributed to internal variability, since the series is detrended and the effect of global warming is thereby removed.
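    The detrending step they rely on can be sketched in a few lines. This is my illustration of the general technique (an ordinary least-squares linear detrend), not the paper’s actual code:

```python
import math

def detrend(series):
    """Remove the best-fit straight line (and the mean) from a series,
    leaving the variability around the trend."""
    n = len(series)
    t = list(range(n))
    t_mean = sum(t) / n
    y_mean = sum(series) / n
    slope = (sum((ti - t_mean) * (yi - y_mean) for ti, yi in zip(t, series))
             / sum((ti - t_mean) ** 2 for ti in t))
    return [yi - y_mean - slope * (ti - t_mean) for ti, yi in zip(t, series)]

# A linear "warming" trend plus an oscillation: detrending leaves
# essentially just the oscillation.
series = [0.01 * i + 0.5 * math.sin(i / 5.0) for i in range(100)]
residual = detrend(series)
```

    Whether the residual then deserves the label “internal variability” is of course the point under dispute – the detrend only removes a straight line, not any particular physical forcing.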

    Not particularly interesting to me.

    A robust empirical seasonal prediction of winter NAO and surface climate
    http://www.nature.com/articles/s41598-017-00353-y
    Open access

    They find that they can better predict the next winter’s NAO using sea-ice, SST and stratospheric circulation from September to November.

    Besides being enormously useful for predicting the NAO, this article is important because it links the NAO to SST, sea ice and the stratosphere. It is also worth noting that one of the leading solar-climate theories links stratospheric circulation changes, driven by ozone changes from variations in solar radiation, to persistent NAO– conditions during the winters of solar minima.

    An interannual link between Arctic sea-ice cover and the North Atlantic Oscillation
    http://link.springer.com/article/10.1007%2Fs00382-017-3618-9
    Open access

    More on the same. They confirm the relationship between Arctic sea ice cover and NAO. This all fits nicely with Dr. Curry’s article with Marcia Wyatt on the Stadium wave hypothesis.

    Cosmic Rays Increase Cloud Cover, Earth’s Surface Cools
    https://nextgrandminimum.wordpress.com/2017/03/24/cosmic-rays-increase-cloud-cover-earth-surface-cools/

    More on the cosmic cloud theory from Russian scientists.

    Not particularly interesting to me.

    The sun drives climate: Spain and Portugal
    http://notrickszone.com/2017/03/25/fascinating-new-studies-confirm-solar-activity-plays-important-role-on-driving-climate/#sthash.8kFidqBp.dpbs
    Temperature variability in the Iberian Range since 1602 inferred from tree-ring records
    Open access

    Latest article from Tejedor et al. on multi-centennial climate change in this part of the Iberian peninsula, which does not fit at all with the CO2 hypothesis (their 2015 paper on precipitation was also very good). A lot of regions, like the US or Spain, do not show the same trends as the globe. One wonders why the places we can check with some reliability do not support the global warming tale. Is global warming an Arctic feature?

    How accurately do we know the temperature of the surface of the earth?
    http://link.springer.com/article/10.1007%2Fs00382-017-3561-9
    Paywalled

    Lovejoy takes on the temperature databases to show that their 90% confidence range is wider than assumed. Something that many of us already knew, since adjustments very often take the changes in the databases outside the pre-adjustment 90% confidence range. The hidden uncertainty in temperature databases is shameful.

    Precipitation changes in the Mediterranean basin during the Holocene from terrestrial and marine pollen records: a model–data comparison
    http://www.clim-past.net/13/249/2017/
    Open access

    Latest take on Holocene precipitation changes in the Mediterranean from a French-German team with the participation of Michel Magny, the leading expert in the field. Their results support an important climate shift at the Mid-Holocene, and a previously unknown precipitation difference between Eastern and Western Mediterranean.

    The relationship between wintertime extreme temperature events north of 60°N and large-scale atmospheric circulations
    http://onlinelibrary.wiley.com/doi/10.1002/joc.5024/abstract
    Paywalled

    More on the AO/NAO, this time for the Arctic/Subarctic region for the instrumental era. They find a correlation between AO/NAO and winter extreme warm and extreme cold temperatures, which to me is unsurprising. Still it is nice to have the data to show it.

    Climate seesaw at the end of the last glacial phase
    https://www.sciencedaily.com/releases/2017/03/170331120340.htm
    Evidence for a bi-partition of the Younger Dryas Stadial in East Asia associated with inversed climate characteristics compared to Europe
    http://www.nature.com/articles/srep44983
    Open access

    The first half of the Younger Dryas was colder and drier than the second half in the North Atlantic region. This article shows evidence that in East Asia it was the opposite. The rest is just one assumption stacked on another.

    The evidence is weak and the assumptions are, well, assumptions, and therefore of little value. One of the figures shows a shameless line adjustment trying to match events that according to the records took place at different times. Talking about teleconnections between the North Atlantic and East Asia is just… unsupported.

    New paper finds glaciers have been melting naturally at the same rate since 1850, no acceleration predicted
    http://hockeyschtick.blogspot.com.es/2014/01/new-paper-finds-glaciers-have-been.html
    Feedbacks and mechanisms affecting the global sensitivity of glaciers to climate change
    http://www.the-cryosphere.net/8/59/2014/tc-8-59-2014.html

    This was new in 2014, three years ago, and is about something that is quite well known to people reading the glacier literature. Glacier melting does not follow CO2 levels and does not follow global temperature changes. Glacier dynamics are very complex and require glaciers to be classified according to their nature, steepness, and whether they end in the sea or on land. Some glaciologists, including the authors, believe that glaciers are now melting due to past warming, which explains why they continued melting during the Pause, and therefore predict that they are committed to further loss regardless of what the climate does. I am not convinced. I think we just don’t understand glaciers that well.

    “Severe testing of climate change hypotheses”, Joel K. Katzav, [link]
    https://pure.tue.nl/ws/files/3935074/375419940988534.pdf
    dead link
    http://www.sciencedirect.com/science/article/pii/S1355219813000774
    Paywalled

    Also an old paper, from 2013. An interesting epistemological exploration of the strength of the IPCC AR4 claim that climate change is OUR FAULT. Enough to respond to Steven Mosher when he criticizes the skeptical claim that it does not pass the null hypothesis test. Obviously the conclusion is that the claim that it is OUR FAULT doesn’t pass the severe testing.

    Perhaps somebody can provide an alternative link.

    Stratospheric variability contributed to and sustained the recent hiatus in Eurasian winter warming
    http://onlinelibrary.wiley.com/doi/10.1002/2016GL072035/full
    Open access

    Perhaps another of the many papers showing a solar signature in the climate through changes in the stratosphere, which is known to respond much more to solar variability. They try to make a connection between the Hiatus, Eurasian winter cooling, and the weakening of the stratospheric polar vortices. Although they carefully avoid any mention that the weakening of the stratospheric vortices could be due to solar forcing (they assign it to internal variability), they provide evidence for downward coupling from the stratosphere to the surface, one of the theory’s fundamental requirements.

    Nature, not humans, could be cause of up to half of Arctic sea ice loss, study claims
    http://www.independent.co.uk/environment/arctic-sea-loss-ice-melting-nature-not-humans-responsible-up-to-half-study-claims-a7627616.html

    Terrible news for Arctic sea ice alarmists. There are a number of papers from some time ago saying the same, but people are not paying attention. Our host Judith Curry already said so in one of her publications:
    “In recent decades, rapid changes in the Arctic have been documented (e.g. Alkire et al. 2007). Most interpretations of the recent decline in Arctic sea ice extent have focused on the role of anthropogenic forcing (e.g. Johannessen et al. 2004), with some allowance for natural variability… But according to stadium-wave projections, and according to our interpretation of stadium-wave evolution, this trend should reverse, under the condition that the stadium-wave hypothesis captures 20th century dynamics correctly. Rebound in West Ice Extent, followed by Arctic Seas of Siberia should occur after the estimated 2006 minimum of West Ice Extent and maximum of AMO.”

    Well done Judith. You are way ahead of the pack. I hope you keep publishing from time to time.

    Improved estimates of ocean heat content from 1960 to 2015
    http://advances.sciencemag.org/content/3/3/e1601545.full
    Open access

    Latest take from Trenberth and Fasullo on ocean heat content. I don’t know how they manage to find increases in OHC in all oceans when Argo doesn’t. I’ll have to read it, but I am not looking forward to it.

    Due to the number of links I would not be surprised if this post went to moderation. I hope some people find my take on the articles that interested me most useful.

    • “The potentially large contribution of internal variability to sea-ice loss over the next 40 years reinforces the importance of natural contributions to sea-ice trends over the past several decades,” they said.

      Dr Nathanial Johnson, another member of the team, told The Independent that he hoped climate science deniers would not seize on their research in an attempt to suggest what is happening in the Arctic is purely natural.

      “It would be unfortunate if this gets spun into a way that really sort of downplays the importance of anthropogenic forcing in this sea ice decline,” he said.

      They were ready for you.

      • “They were ready for you.”

        Like I care what they think. It is clear that anthropogenic forcing is just one factor among several. According to them it could be as low as 50%. It might be even less. They just demonstrated that more research into natural factors affecting sea-ice is needed.

    • =={ This was new in 2014, three years ago. }==

      More recently, from a 2016 paper on which the first author of the 2014 paper is second author:

      In this paper we review and update detection and attribution studies in sea level and its major contributors during the past decades. Tide gauge records reveal that the observed twentieth-century global and regional sea level rise is out of the bounds of its natural variability, evidencing thus a human fingerprint in the reported trends. The signal varies regionally, and it partly depends on the magnitude of the background variability. The human fingerprint is also manifested in the contributors of sea level for which observations are available, namely ocean thermal expansion and glaciers’ mass loss, which dominated the global sea level rise over the twentieth century. Attribution studies provide evidence that the trends in both components are clearly dominated by anthropogenic forcing over the second half of the twentieth century. In the earlier decades, there is a lack of observations hampering an improved attribution of causes to the observed sea level rise. At certain locations along the coast, the human influence is exacerbated by local coastal activities that induce land subsidence and increase the risk of sea level-related hazards.

      • “global and regional sea level rise is out of the bounds of its natural variability”

        They must surely be kidding. Global sea level natural variability goes from −120 m to +6 m.

      • Javier –

        =={ They must surely be kidding. }==

        Argument from incredulity isn’t terribly impressive, IMO.

        I doubt they are kidding.

        Perhaps you could write to the authors to inform them of your critique and see what kind of response you get. It would seem that based on your earlier comments, you initially thought that the lead author’s earlier work had some value. Perhaps not, but Judith found it “interesting” and certainly the Notrickster thought it quite insightful for supporting his views.

        You wouldn’t want to leave the impression that you reverse engineer your assessment of the quality of someone’s work simply on the basis of whether you agree with the conclusions.

    • Steven Mosher

      “One wonders why the places we can check with some reliability do not support the global warming tale. Is global warming an Arctic feature?”

      The “tale” is NOT that all places will exhibit the warming pattern in the same way. The Majority of warming will happen in the Arctic.

      In general, the REGIONAL (read continental-scale) patterns are the hardest to predict. Getting the global scale correct (to within 10%) is about the current state of the art. Regional? Not there yet. And yes, you can get the regional wrong and the global correct.

      • “And yes, you can get the regional wrong and global correct.”

        Or you can get both wrong, or get them right for the wrong reason. How would we know other than wait and see? The track record is not impressive, so the “trust us, this time we are getting it right” isn’t going to work. Skepticism looks like the only reasonable approach.

      • Mosher

        “And yes, you can get the regional wrong and global correct.”

        Which part of the general circulation were we getting correct then?

      • Steven Mosher

        “Which part of the general circulation were we getting correct then?”

        What makes you assume only part is correct or that the lack of regional skill is down to that cause?

        Go get a climate Model. Run it. Make changes. Observe. Read the code. Do more model experiments.

        I have never in my entire life seen people who have never worked with a model
        (much less 102 of them) try to diagnose what was right or wrong merely by looking at data someone else presented to them.

      • Mosher

        It’s called knowing about first principles.

        How can models that get regional circulation wrong claim that in aggregate it all comes out in the wash? If that is your argument then you are arguing either that the global results don’t depend on the actual circulation, or that there is a class of regional circulations that all deliver the same result.

        Rather you than me arguing one of those propositions. One is tempted to ask why bother with GCMs for multi-decade modelling. If they don’t work regionally, this is telling us simpler models will do as well.

    • David L. Hagen

      The epistemology of climate models and some of its implications for climate science and the philosophy of science, Katzav, J. 2014, Preprint

      My question here is what the primary target claims of useful climate model assessment are when such assessment aims to produce new knowledge about the climate system.

      The Future of Climate Modeling, Katzav & Parker 2015

      since resource limitations make it unlikely that all three will be pursued, we offer some reflections on more limited changes in climate modeling that seem well within reach and that can be expected to yield substantial benefits. . . . how resources can best be directed to advance climate science. It may be that alternatives to climate modeling – such as theorizing that is not model-driven, efforts to expand or update observing systems, more careful empirical investigation of poorly represented (or omitted) feedbacks, or development of much more detailed and careful process models – are at least as important.

    • Javier – The latest data (and model output!) from the Arctic:

      Facts About the Arctic in April 2017

      What do you reckon? As an added bonus a robust “debate” about the alleged “natural variability” of Arctic sea ice cover with some of the authors:

      Is Arctic Ice Loss Driven by Natural Swings?

      “A further examination of this question will require a modelling framework that reproduces the tropics–high-latitude linkage faithfully and efficiently.”

      • How much Arctic Sea ice melted in place, and how much migrated out of the Arctic ( to melt elsewhere )? How weak has the Beaufort Gyre become? Is that because of CO2 induced Stratospheric cooling? How does this compare to the much less measured decline from 1910 through 1945? All bears watching.

        But Arctic sea ice decline may be one of the climate benefits of global warming. By warming winters much more than summers, Arctic amplification reduces the pole-to-equator temperature gradient. Reduced kinetic energy (fewer severe storms) and reduced temperature variability (fewer extreme temperatures) are thought to result.

      • Since when has ice volume become data? In science, empirical data is any magnitude that is actually measured. I guess in computing, data is any assortment of numbers, usually with little connection to the real world.

        If your models use temperature as an input, given the unusually high temperatures in the Arctic this winter they will likely underestimate sea ice. Let’s just wait till the summer. I bet you this year’s September average extent is higher than last year’s. That will make ten years of effectively no Arctic sea ice loss.

      • Javier – PIOMAS Arctic sea ice volume is modelled. Hence my “(and model output!)”. However there are “measurements” of sea ice thickness too.

        Whilst it seems we agree that there have been “unusually high temperatures in the Arctic this winter” I’m afraid I don’t follow your suggestion that “[PIOMAS] will likely underestimate sea ice” as a consequence. Perhaps you could elucidate?

        I don’t take short term bets. “Natural variability” don’t you know? However please rest assured that I look forward to comparing notes with you once again in October.

        TE – Currently there seem to be more “severe storms” reaching the Arctic than previously. Just “natural variability” again?

  13. Ulric Lyons

    A robust empirical seasonal prediction of winter NAO and surface climate:
    “The NAO varies on time scales ranging from daily to seasonal to decadal. On daily time scales, NAO variability is largely related to internal atmospheric dynamics involving fluctuations of the intensity and position of the North Atlantic westerly jet stream. On monthly to seasonal time scales, the NAO variability is additionally influenced by fluctuations in boundary fields external to the atmosphere, such as sea-surface temperatures (SST) and sea ice concentrations (SIC)”

    Seems to me that it’s a lot to do with the solar wind speed. There was plenty of deep negative NAO e.g. 1980, 1997/98, 2009/10.

  14. Ulric Lyons

    “Amplification of the Atlantic Multidecadal Oscillation associated with the onset of the industrial-era warming”

    It would be rather a struggle for a negative-NAO-driven warm phase of the AMO to be amplified by rising CO2, which is modeled to increase positive NAO. If anything, all else being equal, rising CO2 should have inhibited the rate of the recent AMO and Arctic warming compared to the 1925–1945 AMO warming rate.

    http://www.ipcc.ch/publications_and_data/ar4/wg1/en/ch10s10-3-5-6.html

  15. Steven Mosher

    About the claim that we don’t fund study of natural variability,
    the claim that questions about natural variability are never asked,
    the claim that we think it’s ALL CO2….

    Mann Spearfishing some skeptical red herrings.

    oh wait, don’t trust Mann, except, you know, when it’s to your liking

    Go figure… folks do care about natural variability… and yes they do ask the questions… they actually do science… not just blog comments.

    http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-16-0712.1

    • That models don’t model natural variation? This should not come as a surprise for a number of reasons – but it comes under the heading of crossing the i’s and dotting the t’s of global warming – not investigating Earth dynamics. And only after these dynamic processes have become impossible to overlook.

      Models cannot model the dynamics of Earth systems for two reasons at least.

      First – there is little enough understanding of the fundamentals – and no confidence in translating that into computerese.

      Finally, Lorenz’s theory of the atmosphere (and ocean) as a chaotic system raises fundamental, but unanswered questions about how much the uncertainties in climate-change projections can be reduced. In 1969, Lorenz [30] wrote: ‘Perhaps we can visualize the day when all of the relevant physical principles will be perfectly known. It may then still not be possible to express these principles as mathematical equations which can be solved by digital computers. We may believe, for example, that the motion of the unsaturated portion of the atmosphere is governed by the Navier–Stokes equations, but to use these equations properly we should have to describe each turbulent eddy—a task far beyond the capacity of the largest computer. We must therefore express the pertinent statistical properties of turbulent eddies as functions of the larger-scale motions. We do not yet know how to do this, nor have we proven that the desired functions exist’. Thirty years later, this problem remains unsolved, and may possibly be unsolvable. Uncertainty in weather and climate prediction (2011), Julia Slingo, Tim Palmer

      Second – models do not model fundamental physical processes. Solutions evolve according to the chaotic dynamics of the nonlinear set of equations at the core of the model itself. Starting at slightly different points – within measurement error – gives exponentially diverging solution paths. The very same problem Lorenz found in his convection model. The end result is always this.

      It is a schematic of different solutions in the same model with slightly different initial and boundary conditions. If there is a known outcome – the surface record – inputs can be tuned to produce a result with a vague resemblance to the record. For future forecasting we are in the dark of course. Which one of the model projections might most closely resemble the evolution of real world climate? A solution from many plausible solutions may be chosen qualitatively – but it is not one we need trust as science.

      • Curious George

        Dame Slingo’s highly symbolic “forecast uncertainty” graph is going round and round. Notice that we don’t know what the x-coordinate is, nor what the y-coordinate is. It is supposed to be a projection (a top view) from a multi-dimensional phase space. Add one more dimension as a z-axis, and in a front view you may well see blobs of different colors totally separated vertically, or barely overlapping. Models may or may not approximate reality.

      • It is a schematic as I said. Here’s a real one.


        http://www.nature.com/ngeo/journal/v5/n4/abs/ngeo1430.html

        This is a single model – and each of the thousands of solutions is equally plausible. They have constrained the solutions to those that resemble the observed temperature record over 50 years.

      • Curious George

        “They have constrained the solutions to those that resemble the observed temperature record over 50 years.” So we don’t see a vast majority of equally plausible solutions. Tomorrow, as new observations become available, the selection of “solutions” will quietly change. Long live climatology!

        The schematic shows that plausible solutions may or may not bear any resemblance to how climate actually evolves. They may contain equations that are intended to simulate nature – but the solutions are very quickly dominated by the dynamics of the nonlinear set of equations of fluid motion at the core of all climate models. After a time, reality is guaranteed to diverge from the models. That is why weather predictions are limited to a week or so.

        The IPCC uses single solutions from a number of models. Each of these is arbitrarily chosen on the basis that it seems about right to the modeler.

        The bottom line is that it is not possible to predict the future of climate using climate models. And this is known without a doubt.

        “In sum, a strategy must recognise what is possible. In climate research and modelling, we should recognise that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible.” IPCC. TAR, s14.2.2.2

        The starting point is that these ‘projections’ have zero scientific validity.

      • Steven Mosher

        “That models don’t model natural variation? This should not come as a surprise for a number of reasons – but it comes under the heading of crossing the i’s and dotting the t’s of global warming – not investigating Earth dynamics. And only after these dynamic processes have become impossible to overlook.

        Models cannot model the dynamics of Earth systems for two reasons at least.”

        #####################

        Except models DO model the dynamics

      • Whatever the equations intended to represent physical system – after a short lead time the solutions are dominated by the dynamics of the nonlinear set of equations of fluid transport at the model core. This is math and not a subject for debate.

        As Julia Slingo and Tim Palmer suggested in the quote above – we have neither the theory nor the math for representing climate dynamics. Model sensitive dependence and structural instability are another problem.

        Sensitive dependence and structural instability are humbling twin properties for chaotic dynamical systems, indicating limits about which kinds of questions are theoretically answerable. They echo other famous limitations on scientist’s expectations, namely the undecidability of some propositions within axiomatic mathematical systems (Gödel’s theorem) and the uncomputability of some algorithms due to excessive size of the calculation (see ref. 26). http://www.pnas.org/content/104/21/8709.full

      Great! Climate chaos can simplify our lives! We can’t use climate models, so make current policy based on the hard and soft data in hand.* Sure, people will argue about that, e.g., the validity of AGTs as viable metrics, but still there would be one less big contentious area (climate models) about which to argue. Works for me… timely policy is still needed… as is a commitment to updating.
        ———————
        * Or one could still get at pdfs or possibilities, etc., with individual models, but that would just add a layer of abstraction and more questions to the arguments.

        But it is past time to move on with cogent policy.

      • Yes – the principle of decision making under uncertainty or not is to minimise the downside and maximise the benefits.

        All of the Copenhagen Consensus development goals, for instance, have climate relevancy and a benefit/cost ratio greater than 15.

        http://www.copenhagenconsensus.com/post-2015-consensus

        It can provide a focus for aid and philanthropic spending. Wind and solar power are conspicuously absent.

        Domestically – the relevant policy is on research, development and commercialisation of 21st century energy sources.

      • And yes – we can theoretically devise pdf’s for individual models – but haven’t yet. And this would not solve the model quandary.

      • I’ve also thought for some time that K. Beven’s and others’ work on equifinality, and by extension multifinality, might provide some approaches for dealing with the climate model quandary in a transparent manner.

      • Steven Mosher

        “Whatever the equations intended to represent physical system – after a short lead time the solutions are dominated by the dynamics of the nonlinear set of equations of fluid transport at the model core. This is math and not a subject for debate.”

        Err No. That debate is not over, despite what you assert.

        Assertions are not math.

      • There is no debate.

        Lorenz was able to show that even for a simple set of nonlinear equations (1.1), the evolution of the solution could be changed by minute perturbations to the initial conditions, in other words, beyond a certain forecast lead time, there is no longer a single, deterministic solution and hence all forecasts must be treated as probabilistic. The fractionally dimensioned space occupied by the trajectories of the solutions of these nonlinear equations became known as the Lorenz attractor (figure 1), which suggests that nonlinear systems, such as the atmosphere, may exhibit regime-like structures that are, although fully deterministic, subject to abrupt and seemingly random change.

        Emphasis mine. Your assertion counts for sfa.
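        Lorenz’s point is easy to reproduce numerically. A minimal sketch – the classic Lorenz ’63 parameters and plain Euler stepping are assumed purely for illustration – showing a minute initial perturbation growing until the two solutions bear no relation to each other:

```python
# Two Lorenz '63 trajectories from nearly identical initial conditions.
# Classic parameters (sigma=10, rho=28, beta=8/3); plain Euler stepping,
# which is crude but sufficient to show sensitive dependence.
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

a = (1.0, 1.0, 1.0)
b = (1.0, 1.0, 1.0 + 1e-8)      # minute perturbation to one initial condition

early, late = 0.0, 0.0
for i in range(5000):           # 50 model time units
    a, b = lorenz_step(a), lorenz_step(b)
    dist = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    if i == 500:
        early = dist            # separation after a short lead time
    if i >= 4000:
        late = max(late, dist)  # peak separation late in the run

print(early < 1e-3 and late > 1.0)
```

        The separation stays minute for a while, then grows until it saturates at the scale of the attractor itself – beyond that lead time only probabilistic statements survive.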

    • Steven Mosher

      Let’s make it even easier for you.

      Here is the Abstract

      “Low frequency internal climate variability (ICV) plays an important role in modulating global surface temperature, regional climate, and climate extremes. However, it has not been completely characterized in the instrumental record and in the Coupled Model Intercomparison Project phase 5 (CMIP5) model ensemble. In this study, the surface temperature ICV of the North Pacific (NP), North Atlantic (NA), and Northern Hemisphere (NH) in the instrumental record and historical CMIP5 all-forcing simulations is isolated using a semi-empirical method wherein the CMIP5 ensemble mean is applied as the external forcing signal and removed from each time series. Comparison of ICV signals derived from this semi-empirical method as well as from analysis of ICV in CMIP5 pre-industrial control runs reveals disagreement in the spatial pattern and amplitude between models and instrumental data on multidecadal timescales (>20 years). Analysis of the amplitude of total variability and the ICV in the models and instrumental data indicates that the models underestimate ICV amplitude on low frequency timescales (>20 year in the NA, >40 year in the NP), while agreement is found in the NH variability. A multiple linear regression analysis of ICV in the instrumental record shows that variability in the NP drives decadal to interdecadal variability in the NH; whereas the NA drives multidecadal variability in the NH. Analysis of the CMIP5 historical simulations does not reveal such a relationship, indicating model limitations in simulating ICV. These findings demonstrate the need to better characterize low frequency ICV, which may help improve attribution and decadal prediction.”
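      The semi-empirical method the abstract describes – treating the ensemble mean as the externally forced signal and removing it from each series – can be sketched with synthetic data (the trend, noise level and 60-year internal mode below are all made up for illustration):

```python
import numpy as np

# Semi-empirical ICV isolation as the abstract describes: the ensemble mean
# is taken as the externally forced signal and removed from each series.
# Everything here is synthetic, purely to show the mechanics.
rng = np.random.default_rng(0)
years = np.arange(1900, 2006)
forced = 0.008 * (years - 1900)                   # shared forced warming (C)

# 40 pseudo-ensemble members: forced signal plus uncorrelated noise
ensemble = forced + 0.05 * rng.standard_normal((40, years.size))
# pseudo-observations: forced signal plus a multidecadal internal mode
obs = forced + 0.1 * np.sin(2 * np.pi * (years - 1900) / 60)

ensemble_mean = ensemble.mean(axis=0)   # estimate of the forced signal
icv = obs - ensemble_mean               # residual = internal variability

slope_obs = np.polyfit(years, obs, 1)[0]
slope_icv = np.polyfit(years, icv, 1)[0]
print(abs(slope_icv) < abs(slope_obs))  # trend removed, oscillation retained
```

      The residual keeps the multidecadal oscillation while the common trend is stripped out – which is exactly why the model-versus-observation comparison of that residual is informative.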

      Now.

      1. Do you agree with this, is it true?
      2. What do you think of the authors?
      3. Do you believe everything Mann writes?
      4. Does a model that underestimates suck?
      5. Don’t all models either overestimate or underestimate?
      6. Define skill.
      7. How much skill is required before you can use an estimate?

      I have a model in my car; it estimates that I will run out of gas in
      30 miles. It always overestimates. Does it suck?
      If it sucks, why have I never run out of gas?

      I have a model of stupidity. It always underestimates your ignorance.
      But I keep using it; there’s always hope and no better model is available.

    • The paper shows that there is a lot of red-teaming going on about models within the mainstream. People like Mann and Trenberth are observationalists not modelers. They have no favorite models and they hold all models to account. Climate science has many such scientists to red-team the model results, and that debate is publicly carried out in climate science publications.

      • Jim, you are quite right as far as you go. The rest of how it works is that you seek to put words in the mouths of others where I do not.

        Quotes and citation or it didn’t happen.

      • Here’s my quote “The paper shows that there is a lot of red-teaming going on about models within the mainstream. People like Mann and Trenberth are observationalists not modelers. They have no favorite models and they hold all models to account. Climate science has many such scientists to red-team the model results, and that debate is publicly carried out in climate science publications.”
        I stand by it. You haven’t said what your beef is.

    • Surely Mosh is asking the wrong question – it wouldn’t be the first time.

      The lack of an oscillatory model signal suggests that the inter-decadal global mean surface temperature signal derived from the observations and shown in Figs. 1A and 2B is indeed the signature of natural long-term climate variability. Removing this internal signature from the observed global mean temperature record should clean up the individual and unique realization of nature, isolating the forced climate signal. Fig. 3 shows that the resulting cleaned signal presents a nearly monotonic warming of the global mean surface temperature throughout the 20th century, and closely resembles a quadratic fit to the actual 20th century global mean temperature. Interdecadal 20th century temperature deviations, such as the accelerated observed 1910–1940 warming that has been attributed to an unverifiable increase in solar irradiance (4, 7, 19, 20), appear to instead be due to natural variability. The same is true for the observed mid-40s to mid-70s cooling, previously attributed to enhanced sulfate aerosol activity (4, 6, 7, 12). Finally, a fraction of the post-1970s warming also appears to be attributable to natural variability. The monotonic increase of the cleaned global temperature throughout the 20th century suggests increasing greenhouse gas forcing more-or-less consistently dominating sulfate aerosol forcing, although our technique cannot exclude other mechanisms not contained in the current generation of model forcing (22). http://www.pnas.org/content/106/38/16120.full

      And there is still no theory or math to translate natural variability into computerese. Long term predictability using models is impossible. Short term initialised models using nested components may be feasible but require 200 times more computing power. Let’s do a Manhattan Project on it – sarc.

      http://journals.ametsoc.org/doi/abs/10.1175/2009BAMS2752.1

      Do you imagine that Mann’s paper is but derivative and third rate?

    • Everything in the paragraph above the PNAS link is a quote.

    • And sorry – that’s 2000 times more computing power.

    • It looks like everyone agrees internal variability is not zero and is put at around +/-0.1 C in all these studies including Steinman et al. (2015). http://www.meteo.psu.edu/holocene/public_html/Mann/articles/articles/SteinmanEtAlScience15.pdf
      Natural variability, which is internal variability plus solar and volcanic forcing changes, may bump this up to +/- 0.2 C. The temperature rise has been 1.0 C, which is five standard deviations above this variability.

      • Jim, without an explanation of what caused temperature changes in the past there is no hope of pinning down a figure.

        I’m pleased that alarmists have now discovered clouds and that the climate of the earth has varied over the aeons, even before man evolved.

        On the other hand alarmists are only taking the first baby steps in acknowledging that the CAGW assumptions were woefully wrong.

    • The paper says it is about 0.1 for the AMO and 0.1 for the PMO in the Northern Hemisphere. And these two are linked via the polar annular mode. The quote I provided says most of the early century warming and some of the late century was quite natural – as well as the mid century cooling.

      There is no chance that this is a strictly periodic mechanism oscillating around a rising trend. The frequency and intensity of El Nino will decline from a 20th century peak this century.

    • The fact that all of these studies have to remove the trend to see the oscillations means that the trend is a separate thing that they are not trying to explain. These are more like stadium waves that go up and down with no net change over about 60 years.

    • Yours – Jim – is a convenient myth based on the 20th century experience.

    • RIE, if you don’t believe in detrending you should complain to all the authors you cite, because that is what they do to define internal variability.

    • And the meaning of the stadium wave continues to be misunderstood. It is about using a network to define the collective behaviour of nodes in the Earth system. It is not about rising and falling but about signals propagating around the planet.

      Climate is ultimately complex. Complexity begs for reductionism. With reductionism, a puzzle is studied by way of its pieces. While this approach illuminates the climate system’s components, climate’s full picture remains elusive. Understanding the pieces does not ensure understanding the collection of pieces. This conundrum motivates our study.

      Our research strategy focuses on the collective behavior of a network of climate indices. Networks are everywhere – underpinning diverse systems from the world-wide-web to biological systems, social interactions, and commerce. Networks can transform vast expanses into “small worlds”; a few long-distance links make all the difference between isolated clusters of localized activity and a globally interconnected system with synchronized [1] collective behavior; communication of a signal is tied to the blueprint of connectivity. By viewing climate as a network, one sees the architecture of interaction – a striking simplicity that belies the complexity of its component detail. Marcia Wyatt

    • Detrend the freakin’ Holocene Jim. You can see the chaotic variability in the ocean indices – and in the proxies. But please tell me with citations how your detrending diversion from the actual point is used to isolate signals in the surface record.

      Figure 12 shows 2000 years of El Nino behaviour simulated by a state-of-the-art climate model forced with present day solar irradiance and greenhouse gas concentrations. The richness of the El Nino behaviour, decade by decade and century by century, testifies to the fundamentally chaotic nature of the system that we are attempting to predict. It challenges the way in which we evaluate models and emphasizes the importance of continuing to focus on observing and understanding processes and phenomena in the climate system. It is also a classic demonstration of the need for ensemble prediction systems on all time scales in order to sample the range of possible outcomes that even the real world could produce. Nothing is certain.

      http://rsta.royalsocietypublishing.org/content/369/1956/4751

  16. While there is much of interest here – this one gave me happiness. “Hydrosemiosis and the Pursuit of Uberty.” Beautiful.

    The history of hydrology has been envisioned [e.g., Chow, 1964] as one of progress through successive stages in which science-as-knowledge is achieved, by methods that progress from empirical, to rational, to theoretical. Increasingly, moreover, the theoretical phase of progress is being empowered by computational efficiency. Predictive computer modeling is revolutionizing the ability of scientists and engineers to consolidate theoretical knowledge into convenient conceptual packages that can be used to simulate the behavior of “systems” over ranges of conditions and processes that are presumed to operate in the real world. Given the increased ease of performing such modeling, the apparent rigor of the methodology, and the utility of the results in generating confident belief for decision-making, predictive computer modeling has become the principal operating paradigm for modern hydrology.

    When I first realised that I could model water flows – I went out and bought an XT clone with 64kb of memory. On this wondrous machine I wrote programs for cubic spline interpolation in 4th order numerical solutions of the differential equation of hydrological storage. I could make a cup of tea and watch TV cartoons before the calculations finished. On this basis – I made a career out of it.
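    The calculation described above – a 4th-order solution of the differential equation of hydrological storage, dS/dt = I(t) − O(S) – would run in milliseconds today. A sketch along those lines; the triangular inflow pulse and linear-reservoir outflow O(S) = S/k are assumptions purely for illustration:

```python
# Classic 4th-order Runge-Kutta for the storage equation dS/dt = I(t) - O(S).
# A linear reservoir O(S) = S/k and a triangular inflow hydrograph are
# assumed for illustration; real routing uses observed inflows and a rating.
def inflow(t):
    # hypothetical triangular inflow pulse peaking at 10 units at t = 5
    return max(0.0, 10.0 - abs(t - 5.0) * 2.0)

def outflow(s, k=3.0):
    # linear reservoir: outflow proportional to storage
    return s / k

def rk4_step(s, t, dt):
    f = lambda ti, si: inflow(ti) - outflow(si)
    k1 = f(t, s)
    k2 = f(t + dt / 2, s + dt * k1 / 2)
    k3 = f(t + dt / 2, s + dt * k2 / 2)
    k4 = f(t + dt, s + dt * k3)
    return s + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6

s, dt = 0.0, 0.1
peak_out = 0.0
for i in range(240):                # route the event over 24 time units
    s = rk4_step(s, i * dt, dt)
    peak_out = max(peak_out, outflow(s))

# Attenuation: the routed outflow peak is lower than the inflow peak.
print(0.0 < peak_out < 10.0)
```

    The routed peak comes out lower and later than the inflow peak – the attenuation that storage routing exists to quantify.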


    This is science-as-knowledge with very practical applications in the real world. As this paper makes clear – extending the conceptual framework requires a scientific process of science-as-seeking.


    http://onlinelibrary.wiley.com/doi/10.1002/2016WR020078/full

    So what is the alternative to the hypothesis-testing framework in which theories (hypotheses) are evaluated for mirroring nature, and data are substituted for phenomena? The alternative involves inference-laden signification and a world-directed point of view. It is semiotic in that it understands the world, not in a detached manner as a mere source of data, but as a complex interpretive structure, in which the investigator is immersed. This is a world mediated and sustained by signs that exist in a continuous, connected flow, a semiosis, in which the signs are things that stand for something else (their object) in relation to something else (their interpretant). A scientifically fruitful aspect of this view is the recognition of indexical signs in which the relationship to objects is one of causation. Although the world contains, or is composed of a semiotic structure (a semiosis) of indexical signs, the interpretant aspect of these signs is what is triggered in the investigator, whose thoughts, in turn, become new signs, constituting a continuity of the signs in human thought with those in the world [e.g., Baker, 1999]. Thus, it is in through this semiosis, or action of signs, that the world “speaks” to the investigator [Baker, 2000]…

    The relationship of crime detection to semiotics is described in a book by the semioticians, Eco and Sebeok [1988], entitled The Sign of Three: Dupin, Holmes, Peirce. The analogy between science and crime scene investigation is not exact, however. The crime case can be resolved, at least in principle, by identifying a unique culprit. In the science investigation, the inquiry commonly remains open with each new sign leading the investigator to new levels of understanding.

    In the 1980’s a couple of geomorphologists observed that streams in eastern Australia had changed shape in the late 1970’s – from low energy meandering forms to high energy braided forms. It was found to have been the result of alternating flood- and drought-dominated regimes. It was quickly linked, of course, to the shift in the 1970’s of the Pacific climate state – a shift that was quite widely linked to climate change at the time. It eventually turned around, justifying my doubts.

    The mystery then was how a sub-decadal system could cause multi-decadal regimes. The mystery was partially resolved in the description of the PDO in 1996. It had the same periodicity as Australian flood and drought regimes – but how could a NH, eastern Pacific phenomenon influence Australian rainfall? The answer to that came with ENSO and PDO statistics – a cold (warm) PDO is associated with intense and frequent La Nina (El Nino). The periodicity of these regimes precisely mirrors changes in Earth surface temperature trajectories.

    People have been looking for the cause of ENSO for decades – and basically concluded that it was resonant or stochastically forced by an external factor. Correlations suggest a solar trigger (in the meantime the complex dynamics of the Earth system had suggested a chaotic mechanism – e.g. http://onlinelibrary.wiley.com/doi/10.1029/2007GL030288/abstract). Both ENSO and the PDO are upwelling phenomena – something that responds to winds and currents in the south and north Pacific respectively. These are driven by polar annular modes that in turn are modulated by solar UV/ozone chemistry.

    Is there at last a hypothesis that may be tested? It suggests that the next climate shift – due in a 2018-2028 window – will be to yet cooler conditions. And there will be comprehensive monitoring systems in place. One day soon the case may be closed.

  17. Geoff Sherrington

    Objection to the cited article on decision making under uncertainty for conservation purposes. This quote:
    “we regularly face choices that require us to weigh the expected costs and benefits associated with different options;
    rarely are we absolutely certain about what will happen in the future; and
    imperfect information, by itself, does not prevent us from making choices”

    In my view, it is more like real life if amended slightly to:
    “We face choices where we can choose to weigh the expected costs against benefits associated with different options;
    we are never certain about what will happen in the future; and
    we make choices based on information, and those choices probably become worse as the information becomes more imperfect.”

    There is no special pleading allowed simply because the article is couched in terms of decision making for conservation purposes.

    In the ideal world, we make our own choices in as many cases as possible. Minimising the choices that others make on our behalf is a major part of participation in Life for those with the intelligence to be in places where they can be heard.

    Summary – without the intervention of Luck, one cannot make perfect decisions from imperfect information input. Not consistently.
    Geoff

  18. David Wojick

    Chris Mooney says the Trump Admin must complete the latest National Climate Assessment (or as I call it, the National Scare). He is wrong.
    https://www.washingtonpost.com/news/energy-environment/wp/2017/03/21/trumps-budget-slashes-climate-research-but-these-scientists-still-have-a-job-to-do/

    • Apparently Mooney is not familiar with the word “repeal”. We routinely sent over proposals for repealing legislation with the annual budget.

      As a WWII vet with experience charging German pillboxes said to me “What man can make, man can break”

      • David Wojick

        Repeal would be nice but is not necessary. Note that the first National Assessment was not done until 10 years after the law was passed. Inhofe, CEI and I sued NSF to block it becoming federal policy.

        This minor law can simply be ignored and these ridiculous USGCRP (G-CRAP) Assessments not funded. Or they could be done correctly. Imagine that! Woohoo.

  19. David L. Hagen

    Does EPA need a new director? Or for Pruitt to memorize his briefings and practice answering questions?
    Delingpole: EPA’s Scott Pruitt Gets Eaten Alive by Fox

  20. Interesting material – how about helping me with a simple question. Where did the vast amounts of moisture evaporate from during the last ice ages? To accumulate enough ice to cover half of our continent, it seems the Arctic must have remained warm and ice free for most of the eighty thousand cold years. Does this correspond with the suggestion that the Bering Strait area between Russia and Alaska remained a temperate climate, allowing early man to slowly cross from Asia? It seems the Arctic has had the ability to behave as strangely in the past as it does today. Thanks.

    • David Wojick

      As I recall sea levels were about 400 feet (roughly 120 meters) lower than today, so the ice water came from the oceans.

  21. About Lovejoy and temperature accuracy:

    It still amazes me that educated people claim they can divine instrument measurement accuracy without including the calibration and installed accuracy of the actual physical instruments used. Statistical techniques can sometimes increase the precision or consistency of a set of data. Accuracy is determined solely by the installed calibration accuracy of an instrument. Even then, for critical measurements, such as those involving safety or billions of dollars of expenditure, instrument accuracy is rechecked after the measurement to verify the instrument was within specified accuracy tolerance during the measurement.

    0.1 to 0.2 degree accuracy on field thermometers on data from 1880 to 2012? It is just silly to claim that.

    • And that doesn’t address the human factor: thousands of individuals doing what they always do in such large numbers – screwing things up.

      In the early 60s I worked in a factory where some employees formed one piece of bent steel – basically a one-step process repeated hundreds of times a day. And yet the inspector would make his visit and occasionally a half day’s work would be thrown out.

      I can only imagine how many reasons the proper reading was not recorded.

      • Geoff Sherrington

        In agreement with CK,
        If your career has had more coalface measurements than time for philosophical thought, you might agree that the instrument properties need accounting for. If, for example, you are an analytical chemist, you would likely find the Lovejoy assertions alarming.
        The Australian record has documentation of the wrong end of the thermometer being read; of thermometers in a room warmed by fire; of thermometers in fields overgrown by grass; and many more less laughable excursions.
        To estimate a real error of less than about +/- 0.4 deg C for the historic Australian land T record is bad science. That is an order of magnitude larger than Lovejoy, maybe representing the difference between hope and experience. Note that an error band should mostly contain all past records that are plausible, including for Australia raw, Acorn, AWAPS, HQ and various other official attempts. That envelope is about that order of magnitude greater than Lovejoy’s guess.
        Much the same comment can be made and should be examined for TOA radiation balance, ARGO float temperatures and satellite era level estimates, to name but a few.
        The whole CMIP scheme, including averages of computer runs, also fails classical formal tests and procedures for error analysis.
        Geoff.

  22. I was thinking about decision making under uncertainty and decided that it is better to be lucky than smart. This is borne out by Philip Tetlock, who classified forecasters as either fox or hedgehog types. It originates in Greek poetry (Archilochus) – “a fox knows many things, but a hedgehog one important thing”.

    Berlin expands upon this idea to divide writers and thinkers into two categories: hedgehogs, who view the world through the lens of a single defining idea (examples given include Plato, Lucretius, Dante Alighieri, Blaise Pascal, Georg Wilhelm Friedrich Hegel, Fyodor Dostoyevsky, Friedrich Nietzsche, Henrik Ibsen, Marcel Proust and Fernand Braudel), and foxes, who draw on a wide variety of experiences and for whom the world cannot be boiled down to a single idea (examples given include Herodotus, Aristotle, Desiderius Erasmus, William Shakespeare, Michel de Montaigne, Molière, Johann Wolfgang Goethe, Aleksandr Pushkin, Honoré de Balzac, James Joyce and Philip Warren Anderson). Wikipedia

    In a modern context I would define Michael Mann as a hedgehog and Judith as a fox – the latter with perhaps multiple meanings. Foxes do better at forecasting than hedgehogs – tellingly monkeys with darts are somewhere in the middle.

    Tetlock (2015) identifies 10 rules to effectively predict the future, or more precisely, to effectively narrow the uncertainty about the future.

    1. Triage. Focus your effort on important questions that can be answered, and in situations in which your effort will improve the answer. Some questions are too easy and others are impossible to answer, both of which are a waste of your time.

    2. Reductionism. Break large problems into pieces. For example, what will the price of oil be in 2020? The question cannot be answered directly with any confidence, but it can be broken into many parts: Where is oil produced today? How will production change in each location between now and 2020? Where is the oil used? How will demand change? Will there be oil revenue tax increases in the UK? These are not easy questions, but they are more manageable and can be reduced further into more granular questions.

    3. Inside vs. outside views. Seek analogues. Consider the problem from multiple points of view and ensure that you are answering the right question. Identification of analogues is a good starting point, one which we do often in the oil field. For example, the best first guess about the performance of a new reservoir is the performance of a similar reservoir elsewhere. However, we are not as good at using previous data to predict the success of the project. Although the average major project exceeds budget by 33%, the historical data do not seem to be used to improve future budget estimates.

    4. React appropriately to evidence. Once we form an opinion, we generally favor the evidence that supports our view and discount evidence that does not. This phenomenon is called confirmation bias, and anchoring on the initial view compounds it; both degrade the quality of forecasts.

    5. Look for clashing factors. For every good argument, there is a counterargument, and for every action, there is a reaction. A decision often will produce both winners and losers. Almost everyone in the oil and gas industry would have predicted confidently that Saudi Arabia would cut oil production in the face of the declining oil price because the country’s producers historically did so. But the superforecasters accurately predicted that they would not by observing the different driving forces this time: The shale oil boom made this case different.

    6. Granularity and measurement. Strive for precision in estimates (for example, “I am 65% confident” instead of a general statement such as “I think that is likely.”) For example, George Tenet, director of CIA, said in 2002 that it was a “slam dunk case” that Saddam Hussein, the president of Iraq, had weapons of mass destruction. He may have said something more nuanced had he been required to state a numerical confidence limit.

    7. Strike a balance between under- and overconfidence. The main problem with being a subject matter expert (SME) can be overconfidence. SMEs, who often are hedgehogs, form opinions effortlessly and may not see the need to evaluate the situation carefully.

    8. Look for the errors behind your mistakes and successes. Understand why an estimate turned out to be wrong or right.

    9. Bring out the best in others and let others bring out the best in you. Create conditions in which people with varying opinions can contribute. The wisdom of teams applies and teams forecast better than individuals.

    10. Practice forecasting.

    To test predictions we need questions about the probability of relevant events happening within a limited timeframe. I suggest we do the experiment. What will happen, and with what probability?

    1. Will there be a La Niña or EL Niño in the next 12 months?
    2. Will the Pacific Decadal Oscillation stay positive in the next 12 months?
    3. Will the monthly surface temperature peak in the next 12 months in a new record?
    4. Will the satellite monthly troposphere temperature peak in a new satellite record in the next 12 months?
    5. What will be the Arctic ice minimum this year?
    6. Will the Atlantic Multidecadal Oscillation stay positive in the next 12 months?
    7. What are the chances of a climate shift in the next 12 months, decade?

    We could add to the questions – if any are suggested – and ask Judith to post. There are scoring methods devised by Tetlock – and the winner will get great kudos.
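    Tetlock’s scoring is based on the Brier score – the mean squared difference between forecast probabilities and outcomes (1 if the event happened, 0 if not). A minimal sketch, with entirely made-up forecasts and resolutions for seven yes/no-style questions:

```python
# Brier score: mean squared error between probabilistic forecasts and
# binary outcomes. Lower is better; 0.0 is perfect, and always saying
# "50%" earns exactly 0.25.
def brier(forecasts, outcomes):
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical probabilities and made-up resolutions, for illustration only.
confident = [0.8, 0.75, 0.3, 0.3, 0.5, 0.9, 0.65]
hedger = [0.5] * 7
outcomes = [1, 1, 0, 0, 1, 1, 1]

print(brier(hedger, outcomes))                              # 0.25
print(brier(confident, outcomes) < brier(hedger, outcomes)) # True
```

    Keeping score this way rewards well-calibrated confidence over both reckless certainty and permanent hedging.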

    • Well, the NOAA PDO index for March is in at .08, twice February’s .04… and positive… an indicator that the JISAO index is on the way back up… probably for the rest of 2017 and well into 2018.

      • I don’t have answers yet. There is a lot of variability in the PDO index. Is it random or is there a pattern? It makes a difference.

        In his foundation of the modern axiomatic theory of probability, A. N. Kolmogorov (1933) avoided defining randomness. He used the notions of random events and random variables in a mathematical sense but without explaining what randomness is. Later, in about 1965, A. N. Kolmogorov and G. J. Chaitin independently proposed a definition of randomness based on complexity or absence of regularities or patterns (which could be reproduced by an algorithm). Specifically, a series of numbers is random if the smallest algorithm capable of specifying it to a computer has about the same number of bits of information as the series itself (Chaitin, 1975; Kolmogorov, 1963, 1965; Kolmogorov and Uspenskii, 1987; from Shiryaev, 1989). http://www.hydrol-earth-syst-sci.net/14/585/2010/hess-14-585-2010.pdf
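        Compressed length is a rough practical stand-in for the Kolmogorov–Chaitin notion of description length. A sketch, with zlib assumed as the (imperfect) complexity proxy:

```python
import zlib
import random

# Kolmogorov-Chaitin intuition via a practical proxy: a series is "random"
# when no description much shorter than the series itself reproduces it.
# zlib's compressed length stands in (imperfectly) for description length.
periodic = bytes([i % 4 for i in range(10_000)])           # strong pattern
random.seed(42)
noisy = bytes(random.randrange(256) for _ in range(10_000))  # pseudorandom

len_periodic = len(zlib.compress(periodic, 9))
len_noisy = len(zlib.compress(noisy, 9))

print(len_periodic < 100)    # the patterned series compresses drastically
print(len_noisy > 9_000)     # the noisy series stays near its original size
```

        A strongly patterned series compresses to almost nothing, while the pseudorandom one needs roughly as many bits as it contains – which is the distinction the quote is drawing, applied here only as an intuition pump.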

        We can discern a 20 to 30 year pattern of warm or cold dominant conditions – but it is not exclusively warm or cool and the system jumps around with no obvious pattern. The variability in the system seems extreme in the past couple of years – strong negative to strong positive. It may indicate a dragon-king event – a property of chaotic systems – and that a climate shift is happening now. But which way?

        The PDO originates in upwelling off the Californian coast. Upwelling changes with changing patterns of wind and ocean circulation which have everything to do with the polar annular modes.

        And the AO is influenced by solar activity – with low solar activity favouring a negative AO.

        I’d put the chance of a climate shift happening now at 65% – the chance of a negative PDO as 75% and the chance of a La Nina as 80% given the state of the Southern Oscillation Index as a leading indicator.

        I’d put the chance of a climate shift happening now at 65% – the chance of a negative PDO as 75% and the chance of a La Nina as 80% given the state of the Southern Oscillation Index as a leading indicator.

        This year, or sometime this century?

      • Hey chief –

        How many years ago did you begin your certain predictions for cooling a decade or three? Five? How’d that work out so far?

      • In 2003 I was staring at surface temperature data – and it struck me that the shifts in surface temperature occurred with precisely the same periodicity as the ocean and hydrological regimes I had been studying for decades.

        So better than most – and certainly better than Joshua.

    • The question says 12 months. There will be 3 or 4 climate shifts this century – just as there were four last century. The timing is about right and the dragon-king of the east is angry.

      We are looking for extra large fluctuations – in the 1970s, 1990s and now?

      There is no shame in being wrong – but we can explore some ideas and get better by keeping score.


    • Robert I. Ellison | April 3, 2017: “I was thinking about decision making under uncertainty and decided that it is better to be lucky than smart.”
      Better not to be under pressure or uncertainty. I always make the wrong decision at Bridge and can see the mistake as I am putting the wrong card down. It would almost be better to recognize the pressure and play any other card! Is there an algorithm for decision making under pressure?

    • No one wants to test themselves? No courage of conviction.

      • The two ONI numbers for 2017 are -0.4 and -0.2. Coming off the 2016 La Niña, the GMST anomalies for Jan, Feb, and Mar should have been low. They were not. They’re exceptionally high.

        El Niño is forecast,

        but even if it fails to launch by July, the equatorial Pacific is already hot:

        The PDO has been skating just above negative territory. It should go solidly positive in a few months.

        The AMO? It doesn’t matter. It could fall off a cliff and not much would happen.

      • JCH: El Niño is forecast,

        Does that imply a forecast of more rain for California?

      • At this time of year – the ENSO forecasts are far less reliable than at other times.

        The skill (or forecasting ability) of model runs based on February–October observations to predict the November–January (NDJ) average value in the Niño-3.4 SST region (ENSO). Results shown here are an average correlation coefficient from each of the 20 models between 2002–2011 (data used from Barnston et al., 2012). Percent explained variance (%) is calculated by squaring the correlation coefficient and multiplying by 100 (see footnote #1). Models that explain all ENSO variability would equal 100%, while explaining none of the ENSO variance would equal 0%. Graphic by Fiona Martin based on data from NOAA CPC and IRI. https://www.climate.gov/news-features/blogs/enso/spring-predictability-barrier-we%E2%80%99d-rather-be-spring-break
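        The percent-explained-variance arithmetic described in that caption is just the squared correlation coefficient times 100. A minimal sketch in Python for concreteness (the r = 0.6 below is an illustrative value, not a figure taken from Barnston et al.):

```python
def explained_variance_pct(r):
    """Percent of ENSO variance explained by forecasts with correlation r."""
    return 100.0 * r ** 2

# A model whose NDJ Nino-3.4 forecasts correlate with observations at
# r = 0.6 explains only about 36% of the ENSO variance.
print(explained_variance_pct(0.6))
```

        So even a respectable-looking correlation leaves most of the ENSO variance unexplained – part of why forecasts initialized across the spring barrier are treated with caution.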

        The north-east Pacific has cooled a little this year – more since the last PDO index update. We will see where that goes. Upwelling is facilitated by winds and currents driven by changes in polar/sub-polar surface pressure fields. These are to an extent modulated by solar activity.

        I’ll post this again for convenience.

        The Oceanic Niño Index (ONI) is centred on the Niño 3.4 region in the central Pacific. It has been solidly negative as a result of a La Niña Modoki – the blue blob of upwelling that can be seen in the central Pacific in the thermally enhanced satellite image JCH posted. It is fading, as it does this time of year. The result of a La Niña Modoki is a double Walker Cell pushing warm surface water both east and west and causing flooding on both sides of the Pacific. Higher water levels in the eastern equatorial Pacific will disperse both north and south, allowing for more upwelling – the origin of a ‘normal’ La Niña.

        The AMO is linked to the polar annular mode as well – a symptom rather than a cause.

        It is the annular mode that causes major variability in NH climate.

      • The March NOAA PDO index was posted just a couple of days ago and is twice the February number:

      • I said the north-east Pacific – which may or may not be consistent with the PDO. I eyeballed it from the satellite SST anomalies, as the regional SST data is not available.

        The PDO has been dropping for months – and an increase of 0.04 in the latest month is a straw to clutch at I suppose.

        But please – prediction and probability, because these sorts of back and forth with mights and coulds – and straws – are very unproductive. Numbers, not verbiage, are the forecasting bottom line.

    • Having spent almost two decades as a simulation SME, I’ve learned to break problems into boxes by defining their limits; sometimes this is easy, other times it requires looking at the problem from a different point of view.

      I am convinced CO2 has but minor effects on our climate and temperature, and that the modern warming is from the oceans pushing warm water north of the equator, headed for the Arctic to cool. And I think we have reached the maximum GAT, and it will drop, but I’m not sure when.

      I’ll also note it seems possible that some of the warm water could be coming from the bottom of the ocean, from thermal ridges and such (like open lava fields). Maybe that was the source of the warm blob. I’m not saying it is, but if we get another one, it might be interesting to see if it’s from the ocean floor.

      • Steven Mosher

        Let me suggest two boxes

        1. Things you have actual demonstrated experience and skill with
        2. Things you should remain silent about.

        You have no demonstrated skill or experience modelling the earth system.
        You have no experience modelling radiative transfer, and not enough
        understanding to say anything about the effects of CO2. Scientists 100 years ago knew more than you do. Read more, comment less.
        And learn spatial stats if you want to discuss temperature stations.

        Next, I will explain the difference between laymen and scientists.

      • You have no demonstrated skill or experience modelling the earth system.
        You have no experience modelling radiative transfer, and not enough
        understanding to say anything about the effects of CO2.

        Blah, blah, blah, Steven. I have more hands-on modeling and simulation experience with systems that were actually built – not video games – and you didn’t write avionics code.
        I was also licensed by the FCC 40 years ago, after passing their test on the transmission and reception of EM waves.
        Now, you might want to notice cooling rates, measured radiation, and how they change under clear calm skies in the middle of the night.

        As for spatial statistics, I prefer my measurements actually measured and not made up. You seem not to notice that weather systems are not linear spatially.

  23. Ulric Lyons

    “The Arctic has seen rapid sea-ice decline in the past three decades, whilst warming at about twice the global average rate. Yet the relationship between Arctic warming and sea-ice loss is not well understood. Here, we present evidence that trends in summertime atmospheric circulation may have contributed as much as 60% to the September sea-ice extent decline since 1979.”

    Well, really the rapid decline was from 1995, with the rapid warming of the AMO. And there’s a logic issue with the attribution, which is that AMO warming is negative NAO/AO driven, but rising greenhouse gases are modeled to increase the positive NAO/AO.
    According to UAH 6.0 lower troposphere, the north pole cooled from Dec 1978 to Mar 1995.

  24. The polar bear link is a bit off – is it a case of Scottish fake news used as an excuse to hawk for donations?

    01 April 2017

    April Fool! Our Scottish Polar Bear sighting may be untrue, but the threats from climate change are very real. Current predictions are that polar bear numbers may decline by 30% by 2050 due to the threats from polar ice melt…

  25. Another thread that got deleted was where I was explaining to RIE that internal variabilities of +/-0.1 C are commonly cited, so he keeps posting nothing new there. The example I used is Steinman et al. (2015).
    http://www.meteo.psu.edu/holocene/public_html/Mann/articles/articles/SteinmanEtAlScience15.pdf
    RIE didn’t take it well and started to change the subject, so that thread is no more.

    • Are there gremlins? The discussion was about this – http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-16-0712.1 – Michael Mann’s contention that models underestimated natural variations. So seemingly entirely on topic. An explanation is due.

      Models can’t model natural variation.

      1. The theories and the math are both lacking.
      2. After a short lead time, the evolution of the many plausible solutions of a model is dominated by the nonlinear dynamics of the set of equations of fluid transport at their core – rather than by equations of state (or the relevant parameterisations).

      Jimmy’s rare cite says that the AMO is 0.1 K – and the PMO 0.1 K in the Northern Hemisphere. It is still unclear who does the ‘common’ citing – but I suspect it is a climate blogosphere meme.

      My citation says most of the early 20th century warming, all of the mid century cooling and some of the late century warming was entirely natural. http://www.pnas.org/content/106/38/16120.abstract

      I quoted Julia Slingo and Tim Palmer on the richness of ENSO behaviour over 2000 years – because 20th century variability is so much less than the extremes of the Holocene. See Figure 12 – http://rsta.royalsocietypublishing.org/content/369/1956/4751

      That the pattern of the 20th century will not be repeated is not diversion but central to comprehending the scope of natural variability. As an observation – warmist climate narratives only work if history starts in 1950.

      Finally – if Jim imagines that I take any of this personally – even absurd personal misrepresentations – he is mistaken.

      • When you take decadal averages to remove the ENSO, the detrended decadal variations are about 0.1 C in everything you cite as well as the stadium wave and what I cited by Steinman. When you take 60-year averages, these too disappear, as they should because internal variations don’t change the forcing in any permanent way. You just need to remember this 0.1 C for perspective, because for climate change we are talking about several whole degrees. The 11-year solar cycle is also about 0.1 C on a sub-decadal scale, and detectable even though the forcing is only ~0.2 W/m2, a tenth of what CO2 has done already. So I find it unsurprising that CO2 has already had such a large effect. Some are still in disbelief, but they also haven’t compared solar signals for scale.
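        The point about decadal averaging suppressing ENSO-scale wiggles is easy to demonstrate on synthetic data. A minimal sketch (the trend, amplitude, and period below are invented illustrative numbers, not fitted values): a ±0.2 °C, roughly 4-year oscillation riding on a 1 °C/century trend all but vanishes in 10-year block means.

```python
import numpy as np

years = np.arange(1900, 2020)
trend = 0.01 * (years - 1900)                  # 1 degC/century, illustrative
enso = 0.2 * np.sin(2 * np.pi * years / 4.0)   # ~4-yr, +/-0.2 degC wiggle
temps = trend + enso

# Twelve 10-year block means.
decadal = temps.reshape(-1, 10).mean(axis=1)

# Residual variability about a straight-line fit to the decadal means:
# the sub-decadal wiggle shrinks by roughly an order of magnitude.
x = np.arange(decadal.size)
resid = decadal - np.polyval(np.polyfit(x, decadal, 1), x)
print(np.ptp(enso), np.abs(resid).max())
```

        The same block-averaging logic is why the ~0.1 °C decadal modes survive 10-year means but largely disappear in 60-year ones.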

    • Well, except that JCH, on many sites, was trying to predict record 2017 warming. Will it not happen now? It should not happen anyway, with the lingering effect of the recent near-La Niña affecting temperatures for the rest of the year.

      • When it comes to the surface, UAH is a knuckleball. Remember Feynman… don’t swing and miss; you’re the person most likely to strike yourself out.

        GISS March is projected to come in from 1.00 to 1.07 ℃, down slightly from February. That makes the first-quarter mean at least 1 ℃ versus last year’s record: 0.98 ℃. Essentially agreeing with them will be NOAA, HadCRUT4, BEST, JMA, C&W, etc. RSS will tell you the truth about their surface product… don’t swing.

        But you are right in believing that if the forecast 2017 El Niño does develop in May–June, then the first quarter should have started cold. It did… 0.92 ℃ in January. And you’re right again… a couple of years ago that would have been considered sky high.

        The 2016 La Niña has not had much of a knock-on effect because the anorexic little thing lived its entire life completely fenced in by a vast area of ACO2-heated ocean surface:

      • And angech, what you need is the return of the Divine Wind. Pray for it. In the history of the Pacific, the tables have turned. The original Divine Wind sank the Chinese fleet… twice. Now it originates in not so boomy China.

      • Bring on the Divine Wind – if it sinks some of the AGW hot air. I have held off posting to you till we got a good result: 0.19 °C, so I could cherry-pick back at your forecasts. Still going to claim 2017 as the hottest year ever, as you did earlier, or cooling off that idea? Read the Chief’s offering. Your chances seem to be receding rapidly.

      • Before his hiatus, he called me all sorts of names, provided a cut-and-paste blizzard of all sorts of science, the cooling/warming hiatus was going to last two or three more decades… as I predicted back then, the PDO suddenly went positive, and the GMST shot through the roof… still is.

        So I’m not too impressed, okay?

        What he means is that I quoted lots of science – that I don’t get from climate blogs. It is consistently dismissed as cutting and pasting, and I am told I don’t understand what I am quoting – inevitably by the least informed. The 20 to 30 year regimes – the IPO – have persisted for at least 1000 years. This is the core of what I have been rattling on about for a decade. The implications just keep getting more obvious and significant – but they still haven’t quite got the idea.

        Over the last 1010 yr, the LD summer sea salt (LDSSS) record has exhibited two below-average (El Niño–like) epochs, 1000–1260 ad and 1920–2009 ad, and a longer above-average (La Niña–like) epoch from 1260 to 1860 ad. Spectral analysis shows the below-average epochs are associated with enhanced ENSO-like variability around 2–5 yr, while the above-average epoch is associated more with variability around 6–7 yr. The LDSSS record is also significantly correlated with annual rainfall in eastern mainland Australia. While the correlation displays decadal-scale variability similar to changes in the interdecadal Pacific oscillation (IPO), the LDSSS record suggests rainfall in the modern instrumental era (1910–2009 ad) is below the long-term average. In addition, recent rainfall declines in some regions of eastern and southeastern Australia appear to be mirrored by a downward trend in the LDSSS record, suggesting current rainfall regimes are unusual though not unknown over the last millennium.

        There are several interesting results here – but primarily the importance of this research is putting it into a high resolution millennial scale context. In the long term the long view is likely to prevail.

        I imagine that taking a big view may be more instructive than examining the PDO entrails hoping to divine a pattern that doesn’t exist – there are no trends – merely seemingly random changes in state on annual to millennial scales. The last Pacific climate shift happened after 1998/99 – the next is due in a 2018–2028 window. The solar connection suggests it will be to yet cooler conditions. I have put it at a 60% chance in the next 12 months – a big call, mostly based on notions of dragon-kings.

        We develop the concept of “dragon-kings” corresponding to meaningful outliers, which are found to coexist with power laws in the distributions of event sizes under a broad range of conditions in a large variety of systems. These dragon-kings reveal the existence of mechanisms of self organization that are not apparent otherwise from the distribution of their smaller siblings… We emphasize the importance of understanding dragon-kings as being often associated with a neighborhood of what can be called equivalently a phase transition, a bifurcation, a catastrophe (in the sense of René Thom), or a tipping point. The presence of a phase transition is crucial to learn how to diagnose in advance the symptoms associated with a coming dragon-king. https://arxiv.org/ftp/arxiv/papers/0907/0907.4290.pdf

        The fox knows many things – the hedgehog has one big idea. Seemingly examining the entrails of the PDO in this case. Predicting a warm phase of the PDO is pretty simple – eventually you will be right. The fox wins more consistently. What I really find amusing is that these are natural variations and have nothing much discernible to do with global warming. Why do they yearn so for a warm Pacific? It will turn around.

  26. No point.

  27. This is raw data from the site in the footnote.

  28. … Furthermore, what has been assumed as natural variability in the North Pacific, based on 20th century instrumental data, is not consistent with the long-term natural variability evident in reconstructed SSTs pre-dating the anthropogenic influence.

    Big hiatus, 1944 to 1975, and the little hiatus, 2005 to 2013… both caused by the negative phase of the PDO… all three major warming spurts, commencing in 1920, 1975, and 2013… caused by the ramp-up of the positive phase of the PDO. The 21st century will be worse.

  29. Trump suddenly seems interested in a carbon tax just before a meeting with China. Don’t worry!
    1) It will be forgotten within a few days after the meeting. It’s just another head fake to appear to appease China.
    2) Congress will never go along.
    3) If there is such a thing as a carbon tax it will be funneled into defense spending and not returned to the consumers.
    http://www.marketwatch.com/story/report-trump-administration-weighing-both-vat-and-carbon-tax-2017-04-04

  30. David Wojick

    Dr. Curry posted this tweet a few hours ago: “I wonder if the Executive Director of the American Meteorological Society actually read my testimony: https://www.ametsoc.org/ams/index.cfm/about-ams/ams-position-letters/letter-to-house-committee-on-science-space-and-technology-on-opening-remarks-for-the-29-march-2017-hearing/.”

    Clearly not, or not with any understanding, which is more likely. Incredibly (literally, with no credibility) he says this: “We can now say with very high levels of confidence, based on literally thousands of independent research efforts and multiple independent lines of evidence, that most of the warming our planet has experienced over the past 50 years is due to human activity. Indeed, to suggest that humans are not responsible for most of the warming we have experienced over the past 50 years indicates a disregard for the scientific process and the vast amount of testable evidence that has been amassed on this subject.”

    I cannot imagine what thousands of efforts and vast amount of testable evidence he is referring to, except modeling of course, which is not testable evidence. The satellites show no sign of GHG warming since records began, just some mild ENSO induced warming.

    • David

      Here is the curve we all know. 50 years ago the level was around 310ppm

      http://scrippsco2.ucsd.edu/history_legacy/keeling_curve_lessons

      If most of the warming over the last 50 years is due to man from that very low level, the question must be asked as to whether we can live on earth without having a dramatic effect on temperatures?

      Or we can perhaps look at the historical context whereby we have not yet exceeded natural variability

      Tonyb

      • Steven Mosher

        the exceedance of “historical” levels has ZERO to do with understanding the impact of going from 310 to 400.

        There is more to the climate than CO2. That doesn’t make CO2 unimportant; it merely means that studying history will not help you very much UNLESS all other variables were held constant in history.

        I recently dropped from 210 to 190 lbs. There’s nothing out of the ordinary
        in my history about weighing 190. That fact – the fact that my current weight is within my natural variability – says NOTHING about the effect of caloric intake on weight. And of course, going back up to 200 is nothing new. I like donuts. Now, I ate donuts as a kid and never put on a pound.
        Heck, I weighed 175 for wrestling and ate donuts. Donuts can’t cause weight gain.

      • “I recently dropped from 210 to 190 lbs. There’s nothing out of the ordinary in my history about weighing 190”

        I would be more impressed with this comment if you had not already given the extraordinary [meaning quite out of the ordinary] reason for your recent weight loss in previous comments.
        The fact is you increased to 210 lbs for all the usual reasons: getting older, a love of good food, not enough exercise and, most importantly, not enough discipline.
        Mind reading, but hey, I do not know you so it is all in good fun. Those were the reasons for my 10 kg weight gain three times in the last five years. There was a lot out of the ordinary in getting down 10 kg. Stopping work, riding a bike 200 km a week and walking daily helped. Reducing alcohol helps [me].
        So there is a lot out of the ordinary in getting back down to 190. Mind reading again – I am surprised you did not drop a lot lower. To claim it was ordinary weight loss for the sake of this argument is therefore very disingenuous, but again, if you wish to use alternative facts, go for it.

      • In Jan 2014 I had a painful rash… went to the Doc… shingles. The nurse said I weighed 220. Went home. Didn’t feel like eating. Now I weigh 160.

    • He’s trying to reassure Smith that the worldwide climate science community is acting in accordance with all scientific disciplines, and is not some kind of conspiracy, but I think Smith is so far gone, that this will fall on deaf ears. What to do?

    • Climate is potentially predictable for much longer time scales than
      weather for several reasons. One reason is that climate can be meaningfully characterized by seasonal-to-decadal averages and other statistical measures, and the averaged weather is more predictable than individual weather events.

      ???

      Climate is predictable because it will be about average?
      They probably haven’t looked at the centennial-scale power spectrum of climatic change recently.

      There are predictable aspects of climate, of course, beyond the rather meaningless global mean temperature. Manabe ’79, though for 4xCO2, not the 1.something effective RF we’ve observed, thought global warming would lead to:

      1. Arctic amplification
      2. Increased water vapor
      3. A hot spot
      4. Increased precipitation at high latitudes
      5. Decreased temperature variability
      6. Decreased kinetic energy

      How has this gone?

      1. Arctic amplification appears, though we must mind confirmation bias, as at least some of it is from the dynamics of ice flow

      3. The hot spot, though not as pronounced in Manabe’s early models as recent ones, has been a no show since 1979.

      4. Increased precipitation at high latitudes? Some corroboration:

      5. Decreased temperature variability? Global surface temperature recording stations are probably too transitory to tell. The US record of more than a century tends to confirm this, though.

      6. Decreased kinetic energy?

      • They are saying that it is easier to predict these average properties than the weather. For example, what the weather will be like in January 2050 is a much harder question to answer to within a degree than what the mean January temperature will be in that decade, because weather is much more variable on monthly scales than decadal averages are.

      • Don Monfort

        “Climate is potentially predictable…blah..blah..blah” That’s very useful. But thanks for the clarification, jimmy dee. POTUS The Donald has rendered all this BS moot. Just as I predicted.

  31. The Problem Is Epistemology, Not Statistics: Replace Significance Tests by Confidence Intervals

    That was 20 years ago, but the suggestion was made even earlier than that. I think the essence of the recommendation has been widely adopted (if not universally), namely the publication of the standard deviations of samples, SEMs, and standard errors of parameter estimates. At least one problem will likely remain: instead of publishing unreproducibly small p-values, the journals will publish unreproducibly narrow confidence limits, via essentially the same processes.

    • David Wojick

      What we need is just the confidence interval without the mean. After all, if you take the 49% confidence interval then the true value is more likely outside than inside. The likelihood that the mean is true then decreases as the confidence interval is narrowed.

      Instead we get the mean, with the rest as a minor modifier, as though the mean were somehow likely to be true, which is wildly false. I call this the fallacy of the mean.

      How much so-called climate science depends on this fallacy? Most I think.
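      The 49% claim above is easy to check by simulation: build 49% confidence intervals for a known mean over many repeated samples and count how often they actually contain it. A minimal sketch (the N(0,1) population, sample size, and trial count are arbitrary illustrative choices):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
true_mean, n, trials = 0.0, 20, 20000

# Two-sided critical value for a 49% interval with estimated variance.
tcrit = stats.t.ppf(0.5 + 0.49 / 2.0, df=n - 1)

hits = 0
for _ in range(trials):
    x = rng.normal(true_mean, 1.0, n)
    half = tcrit * x.std(ddof=1) / np.sqrt(n)
    hits += (x.mean() - half) <= true_mean <= (x.mean() + half)

# Coverage comes out near 0.49: the interval misses slightly more
# often than it hits, as the 49% argument says.
print(hits / trials)
```

      Running the same simulation at 95% gives coverage near 0.95, which is why the conventional interval, unlike the 49% one, contains the true value far more often than not.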

  32. The WMO named some new clouds recently. I don’t quite know what to think about the tuba yet. lol

    https://www.wmocloudatlas.org/clouds-supplementary-features-asperitas.html

  33. This suggests one way of reducing surface temperature – increase the water holding capacity of soils by increasing the organic content.

    Land-atmosphere interactions play an important role for hot temperature extremes in Europe. Dry soils may amplify such extremes through feedbacks with evapotranspiration. While previous observational studies generally focused on the relationship between precipitation deficits and the number of hot days, we investigate here the influence of soil moisture (SM) on summer monthly maximum temperatures (TXx) using water balance model-based SM estimates (driven with observations) and temperature observations. Generalized extreme value distributions are fitted to TXx using SM as a covariate. We identify a negative relationship between SM and TXx, whereby a 100 mm decrease in model-based SM is associated with a 1.6 °C increase in TXx in Southern-Central and Southeastern Europe. Dry SM conditions result in a 2–4 °C increase in the 20-year return value of TXx compared to wet conditions in these two regions. http://www.sciencedirect.com/science/article/pii/S2212094715000201

    It suggests as well – yet again – that surface temperature records are susceptible to drought artifacts. Tropospheric records are in theory more reliable – and give much more comprehensive information on the atmospheric energy content – and the methodology has come of age this century.
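    The quoted sensitivity – about 1.6 °C of extra TXx per 100 mm of soil-moisture deficit – enters the analysis as a covariate on the location parameter of the fitted GEV distribution. A toy version of that setup can be simulated and the slope recovered. This is a sketch under invented numbers: the soil-moisture range, GEV shape, and noise scale are assumptions, and a plain linear fit stands in for the covariate-GEV fit used in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sm = rng.uniform(50.0, 250.0, 500)   # soil moisture (mm), hypothetical range

# GEV location drops by 1.6 degC per 100 mm of soil moisture,
# matching the sensitivity reported in the paper.
loc = 30.0 - 0.016 * sm
txx = stats.genextreme.rvs(c=0.1, loc=loc, scale=1.0, random_state=1)

# Recover the sensitivity with an ordinary least-squares line
# (a crude stand-in for fitting SM as a GEV location covariate).
slope, intercept = np.polyfit(sm, txx, 1)
print(100.0 * slope)  # degC per 100 mm, close to -1.6
```

    The recovered slope wanders with the extreme-value noise, which is why the paper fits the covariate inside the GEV likelihood rather than regressing block maxima directly.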

  34. Cosmic Rays Increase Cloud Cover, Earth’s Surface Cools [link]

    If that is correct, the surface temperature equilibrates to increased insolation very rapidly.

  35. Judith Curry:

    ‘Comment: Research needs more competence, less ‘excellence’ : Nature  [link] : ” – – – excellence would be defined chiefly by how results were obtained, rather than by what actually was found”/Adrian Barnett.

    The Problem Is Epistemology, Not Statistics: Replace Significance Tests by Confidence Intervals [link] : ” Finally, the most important property of an empirical finding is intersubjective replicability, that other investigators, relying on the description of what was done, will (almost always) make the same (or closely similar) observations.”/Meehl.’

    Actually the IPCC scientists have not stated any appropriate basis for the recent climate warming. They only believe that it has been caused by CO2 emissions from fossil fuels. Lack of any evidence in reality made the IPCC scientists turn to climate model results. Climate models themselves are ‘excellent’, but the main problem seems to be the deep uncertainty and overestimation in model results, caused both by the models themselves and, more especially, by poorly known parameters. That is why the climate model results cannot be regarded as competent and credible.

    Even UN politicians at the Rio conference in 1992 stated that there was no final evidence that the recent climate warming could have been caused by CO2 emissions from fossil fuels. In spite of that, in order to lessen the risk of anthropogenic warming, they regarded it as necessary to cut CO2 emissions ‘cost-effectively’. However, the cuts under the Kyoto protocol were only disastrous, without any cost-effective influence. As the basis of the Paris agreement is of the same kind, only the same kind of disasters can be expected there.

    There are many facts which prove that any cutting of CO2 emissions from fossil fuels has so minimal an influence on climate warming that it cannot be found in reality. For instance: first, the share of CO2 from fossil fuels in the recent total increase of CO2 content in the atmosphere has been only about 4% at most; and second, geological and present observations prove that trends of CO2 content in the atmosphere follow trends of climate temperature, and not vice versa; https://judithcurry.com/2017/03/11/scott-pruitts-statement-on-climate-change/#comment-841843 .

  36. More climate FUD from Nature Communications. From the Slashdot post:


    If we do nothing to reduce our carbon dioxide (CO2) emissions, by the end of this century the Earth will be as hot as it was 50 million years ago in the early Eocene, according to a new study out today in the journal Nature Communications. This period — roughly 15 million years after dinosaurs went extinct and 49.8 million years before modern humans appeared on the scene — was 16F to 25F warmer than the modern norm. […]

    https://science.slashdot.org/story/17/04/05/1319240/were-creating-a-perfect-storm-of-unprecedented-global-warming

  37. Dear Judith,

    Thank you for sharing these notes and links. I’m writing to call your attention, and the attention of your readers, to an excellent article on a topic frequently discussed on this blog: uncertainty in climate models.

    http://www.hoover.org/research/flawed-climate-models

    I found this article particularly valuable, because it explains uncertainty in a manner easily understood by the non-scientist.