Week in review – science edition

by Judith Curry

A few things that caught my eye this past week.

“Runs that reduce sea ice also result in a significant decrease in the frequency and magnitude of extreme warm and cold temperature anomalies.” Reduction in northern mid-latitude 2-m temperature variability due to Arctic sea ice loss: [link]  JC note: more on this paper in a forthcoming blog post.

Role of humans in past hurricane potential intensity is unclear [link]

News from Georgia Tech: Role of soil erosion in carbon budgets [link]

Mike Hulme on climate change and “extinction” [link]

“The steady evolution of climate modeling…has led to significant strides in seasonal climate prediction, but forecasting the climate over decades has proved more challenging.” [link]

Instead of inundation from rising seas, 12 of 15 Florida Bay islands GREW in size during 1953-2014. Zhai et al., 2019

Weakening of the teleconnection from ENSO to the Arctic stratosphere over the past few decades: What can be learned from subseasonal forecast models?

Every few years a giant hole opens up in the Antarctic sea ice. Two ocean drifters caught in a swirling vortex helped scientists figure out why this happens: [link]

There really was a hiatus in global warming

Interesting website: Australian indigenous knowledge about the weather [link]

A study of the Yangtze River Delta shows how urbanization dries out the atmosphere.

Sun’s 11-year cycle appears to be driven by alignment of the planets [link]

“Is the Noble Gas‐Based Rate of Ocean Warming During the Younger Dryas Overestimated?”.

On the emerging relationship between the stratospheric Quasi-Biennial Oscillation and the Madden-Julian Oscillation

The delicate energy balance in the over the and how play into it. [link]

Hydrothermal vents trigger massive phytoplankton blooms in Southern Ocean [link]

Rising seas could spur growth of some islands, like in the Maldives, that initially formed when sea levels were higher than they are today. [link]

The causes of recent rises in methane: a new science challenge [link]

New radio sounding study finds little evidence of lakes under Antarctica’s Recovery Glacier [link]

AMOC sensitivity to surface buoyancy fluxes – differing response of AMOC to heat loss in winter and summer.

“Our results support the existence of a European Holocene Thermal Maximum and data-model temperature discrepancies.”

Response of the Northern Stratosphere to the Madden‐Julian Oscillation During Boreal Winter

Decadal variability of the Southern Ocean carbon sink, and drivers of change (lots about wind). Carbon sink on the way down since 2011.

Circulation and temperature in South Atlantic intermediate waters during the deglaciation

Stratospheric ozone important for understanding the sea-ice impacts on atmospheric dynamics [link]

What is the role of the ocean in Arctic amplification? [link]

Shifting currents in the past 2 decades might be driving faster carbon dioxide uptake. [link]

On June 6, 1912, the Novarupta-Katmai eruption in Alaska was the largest of the 20th century – an entire valley, later named the “Valley of the 10,000 Smokes,” was filled by hot, steaming eruption material.

Roughly 1,500 years ago, the Tierra Blanca Joven eruption blanketed Central America in ash and likely displaced settlements [link]

Making lives worse – the flaws of green mandates

A new paper indicates the 19th century was, on average, ~1.7°C warmer than the 20th century in Northeastern China. Northeastern China was also 7-10°C warmer than today ~9000 years ago.

Social science, technology & policy

UN agency criticizes carbon offsets [link]

Should we fertilize oceans or seed clouds?  No one knows [link]

A rather nuanced study is cautioning against the popular claim that climate change is already causing violent conflicts.

U.S. Democrats: don’t overthink a climate debate [link]

Why we do nothing to prepare for climate change [link]

“The law of unintended consequences is that actions of people always have effects that are unanticipated. Economists and other social scientists have heeded its power for centuries; for just as long, politicians and popular opinion have largely ignored it” [link]

US Climate Change Litigation in the Age of Trump – Year Two.

We all want to change the world, and we often have strong ideas about how to do it. But how good are those ideas? Are we actually willing to test them? [link]

China is cutting back on solar and wind units due to their cost, the ballooning subsidies the state owes the solar and wind power builders, and the lack of grid-connected transmission capacity. [link]

“Evaluating rotational inertia as a component of grid reliability with high penetrations of variable renewable energy”

Three surprising solutions for climate change – project drawdown [link]

Fracking with CO2 instead of water is greener, say researchers [link]

Desalinating water in a greener and more economical way

The remarkable decline in the US temperature-mortality relationship over the twentieth century. Journal of Political Economy.

Juliana vs. U.S.: climate change “game theory” vs. doing what is right?

The higher cost of electricity reflects the costs that renewables impose on the generation system, including those associated with their intermittency, higher transmission costs, and stranded asset costs assigned to ratepayers. [link]

ENERGY TRANSITIONS: Are old Midwest coal plants pushing renewables offline?

KILLING THE EARTH TO ‘SAVE’ IT: Rainforest Trees Cut Down To Make Way For Industrial Wind Turbines [link]

New research suggests nature-based solutions, such as wetland and floodplain restoration, can improve water quality and increase flood resiliency in the Champlain Basin. [link]

Turbocharge plants’ ability to capture and store larger amounts of carbon from the atmosphere in their roots and keep it buried in the ground for hundreds of years. [link]

We need to get serious about critical materials [link]

US DOE: increasing geothermal energy 26-fold by 2050 [link]

The false enforcement of unpopular norms. “people enforce unpopular norms to show that they have complied out of genuine conviction and not because of social pressure…some groups may be more prone to unpopular norms because of individuals’ anxiety about being regarded as insufficiently sincere”

About science and scientists

A great question for journalists to ask when they interview scientists: “What are the wrong conclusions to draw from this study?” How to combat overhyped science news [link]

Alabama’s stand for campus free speech sets an example for the nation [link]

On the scandal at Oberlin college. Gibson’s Bakery, a family-owned business near Oberlin College accused of racism, just won a big payout. [link]

Another interesting perspective on the Oberlin scandal [link]

“This willing constriction of intellectual freedom… corrupts the ability to think clearly, and it undermines both culture and progress. Good art doesn’t come from wokeness, and social problems starved of debate can’t find real solutions.”  The unheeded lesson of 1984

Why facts don’t change our minds. New discoveries about the human mind show the limitations of reason. [link]

Big rise in academics’ mental ill-health [link]

On campus, hate speech is an opposing view [link]

Intellectual humility: the importance of knowing you might be wrong [link]

“Campus norms proscribe any discourse that might offend women, minorities, or anyone perceived as a victim of patriarchal white societies. However, this rule, no matter how well intentioned, is harming the very people it aims to protect.”

The future for academic publishers lies in navigating research, not distributing it [link]

‘over 30% of PhD students develop a psychiatric condition. This is a higher rate than for people working in defence and emergency services, which is about 22%’ [link]

Supreme Court Asked To Hear Case Involving Leaked ‘Climategate’ Emails

Interesting survey of AGU members whose research is related to climate change: 74% claim to be progressive or very progressive; 3% claim to be conservative or very conservative. A factor in the pathological politicization of climate?


129 responses to “Week in review – science edition”

  1. Sun’s 11 year cycle appears to be driven by alignment of the planets [link]

    Good to see interesting findings that can actually be verified.

    • The more interesting problem is the aperiodicity of ‘cycles’ – and their implications in a resonant climate system.

      “Hyperbolic – or unstable – periodic orbits” in the insoluble solar system N-body problem – or in the chaos of internal solar turbulence – or both.

      Here’s an arxiv version.

      https://arxiv.org/pdf/1803.08692.pdf

      • The use of the word ‘dynamo’ is interesting, much like Mazzarella’s concept of a, ‘torque’…
        “Earth’s rotation and sea temperature as a single unit (ut unum sint): the arrival on the Earth of fronts of hydrodynamic shock waves during epochs of strong ejection of particles from Sun gives rise to a squeezing of the Earth’s magnetosphere and to a deceleration of zonal atmospheric circulation which, like a torque, causes the Earth’s rotation to decelerate which, in turn, causes a decrease in sea temperature. Under this holistic approach, the turbulence of solar wind and the zonal atmospheric wind behave cumulatively rather than instantaneously, where energy inputs are first conveniently accumulated and then transmitted…”

      • I will not waste time looking for the source of a disembodied quote.

      • You sound like you’re interested… (Adriano Mazzarella, Solar Forcing of Changes in Atmospheric Circulation, Earth’s Rotation and Climate, The Open Atmospheric Science Journal, 2008, 2, 181-184)

      • A pleasant little paper – and I read science if it is recommended. But not blogs. The link of solar activity and length of day is polar annular modes via top down modulation of polar surface pressure – spinning up or not winds and currents in the sub-polar regions. With the Coriolis force turning meridional flows zonal near the equator providing the ‘torque’ needed. The gyre hypothesis.


        https://digitalcommons.uri.edu/gsofacpubs/140/

        Multi-decadal variability in the Pacific is defined as the Interdecadal Pacific Oscillation (e.g. Folland et al, 2002; Meinke et al, 2005; Parker et al, 2007; Power et al, 1999) – a proliferation of oscillations it seems. The latest Pacific Ocean climate shift in 1998/2001 is linked to increased flow in the north (Di Lorenzo et al, 2008) and the south (Roemmich et al, 2007; Qiu, Bo et al, 2006) Pacific Ocean gyres. Roemmich et al (2007) suggest that mid-latitude gyres in all of the oceans are influenced by decadal variability in the Southern and Northern Annular Modes (SAM and NAM respectively) as wind driven currents in baroclinic oceans (Sverdrup, 1947).

        There is a growing literature on the potential for stratospheric influences on climate (e.g. Matthes et al 2006, Gray et al 2010, Lockwood et al 2010, Scaife et al 2012) due to warming of stratospheric ozone by solar UV emissions, the Mansurov effect of ionization and cloud nucleation in the global electric current or yet more obscure stratospheric/tropospheric coupling.

      • David Appell

        “There is a growing literature on the potential for stratospheric influences on climate (e.g. Matthes et al 2006, Gray et al 2010, Lockwood et al 2010, Scaife et al 2012) due to warming of stratospheric ozone by solar UV emissions”

        You’re about 30 years behind the times.

      • David is not aware that there is a curve apparently. “Understanding the influence of solar variability on the Earth’s climate requires knowledge of solar variability, solar‐terrestrial interactions, and the mechanisms determining the response of the Earth’s climate system.” https://agupubs.onlinelibrary.wiley.com/doi/pdf/10.1029/2009RG000282

        The mechanisms include modulation of the polar annular modes – polar vortices – as discussed above. Here’s a model study. Valid if the UV/ozone geophysics are correct.

        “Here, we explore possible impacts through two experiments designed to bracket uncertainty in ultraviolet irradiance in a scenario in which future solar activity decreases to Maunder Minimum-like conditions by 2050. Both experiments show regional structure in the wintertime response, resembling the North Atlantic Oscillation, with enhanced relative cooling over northern Eurasia and the eastern United States.” https://www.nature.com/articles/ncomms8535

        Here’s GEC version.


        https://agupubs.onlinelibrary.wiley.com/doi/full/10.1002/2014GL061421

        Here’s Mike Lockwood’s et al 2010 ‘correlation’ between solar activity and cold winters. Possible with enhanced polar vortex waviness in low solar activity conditions.

        https://iopscience.iop.org/article/10.1088/1748-9326/5/2/024001/meta

        The original correlation between the sun and crops was by William Herschel in the Philosophical Transactions of the Royal Society in 1801. Not denied until the IPCC got into the act 30-odd years ago.

        You’ll notice the difference between Davids and myself. I cite science and they don’t.

      • David Appell

        Sorry Robert, there is no evidence the sun has caused modern climate change. This is very well known. See chapter 1 of the WG1 assessment report.

      • And there is no chance that models have included decadal to millennial geophysics of natural variability.

        e.g. https://www.nature.com/articles/s41612-018-0044-6

        “The global-mean temperature trends associated with GSW are as large as 0.3 °C per 40 years, and so are capable of doubling, nullifying or even reversing the forced global warming trends on that timescale.”

        Missed that too they did.

      • Good to know DA relies so completely on the IPCC. I don’t think I will ignore the hundreds of papers that beg to differ.
        If we were to accept everything in AR5 as the final word then we would have no knowledge of the geothermal activity under either Ice Sheet. Since its publication much has been learned about the potential influence of geological forces.
        I think I will just keep on believing there is more we have to learn than what we know.

    • I think everyone well understands that nominally, all climate change is the result of solar activity. It is amazing that Western science pretends to be unaware of that simple fact but in reality, such ignorance is ideological, grounded in an unprovable and unverifiable conjecture about AGW, belief in which is based only on its utility to the Left to demonize America and free enterprise capitalism, i.e., a Left vs. right issue, not science. Apparently, we all must learn anew what everyone has always known since the dawn of reason: climate change happens and it has always happened without our help and despite our ignorance as to all of the variables involved. All of the geophysical evidence that we do have, however, indicates that humanity’s CO2 is the least significant of all the variables involved; and evidence to the contrary has been plagued by knowing corruption and fraud.

    • I think I’ve never heard so loud
      The quiet message in a cloud.

      Hear it now, what were the odds?
      The raucous laughter of the Gods.

      Kim 2011

      The global first order differential energy equation can be written as the change in heat in oceans is approximately equal to energy in less energy out at the top of the atmosphere (TOA).

      Δ(ocean heat) ≈ Ein – Eout

      But as energy in from the sun is so invariable – a solar driver of climate remains no more than wild conjecture until there is observational evidence of a link to energy out.
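      As a back-of-envelope check on that first-order balance, here is a small script (my own illustration; the 0.8 W/m2 imbalance is an assumed, typical CERES-era magnitude, not a figure from this thread) converting a global-mean TOA imbalance into annual ocean heat uptake:

```python
# Convert a global-mean TOA energy imbalance (E_in - E_out, in W/m^2)
# into heat gained per year, per the first-order balance above.
# Illustrative only: 0.8 W/m^2 is an assumed, typical-magnitude value.

EARTH_SURFACE_M2 = 5.1e14     # Earth's total surface area
SECONDS_PER_YEAR = 3.156e7

def annual_heat_gain_zj(imbalance_w_m2):
    """TOA imbalance (W/m^2) -> heat gained per year in zettajoules (1 ZJ = 1e21 J)."""
    return imbalance_w_m2 * EARTH_SURFACE_M2 * SECONDS_PER_YEAR / 1e21

print(round(annual_heat_gain_zj(0.8), 1))  # prints 12.9
```

      Most of that heat ends up in the oceans, which is why the change in ocean heat is a good proxy for the global imbalance.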

      Figure 1: CERES Shortwave (a) and Infrared (b) TOA power flux [graphs not reproduced]

      The graphs show SW warming exceeding IR cooling. The post hiatus warming is mostly from sea surface temperature/marine stratiform cloud coupling in the upwelling region of the eastern Pacific. (Loeb et al 2018) The mechanism involves closed and open cell cloud formation – discovered in the satellite era – in Rayleigh–Bénard convection in a fluid (the atmosphere) heated from below. Closed cells rain out more quickly over warmer oceans decreasing domain albedo. (Koren 2017)


      Now all that is needed is the link between minimal solar variability and upwelling in the eastern Pacific discussed above.

  2. https://en.wikipedia.org/wiki/Orbital_resonance
    Venus and Earth are like clockwork, with a little error, and that happens to bring them to an Earth-Jupiter alignment at the right time. Someone said planetary cycles are a waste of time; they used two decimal places and a long time frame. So if Venus and Earth had a different period ratio, this regular alignment might not happen.
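    The “clockwork with a little error” can be checked from standard sidereal periods (my own illustration, not taken from the linked page): 13 Venus orbits fall short of 8 Earth years by under a day.

```python
# 8:13 Venus-Earth near-resonance and the Venus-Earth synodic period,
# computed from standard sidereal orbital periods in days.

VENUS_DAYS = 224.701
EARTH_DAYS = 365.256

def resonance_error_days(p1, n1, p2, n2):
    """Mismatch, in days, between n1 orbits of period p1 and n2 orbits of period p2."""
    return abs(n1 * p1 - n2 * p2)

# Time between successive Venus-Earth alignments (inferior conjunctions)
synodic_days = 1.0 / (1.0 / VENUS_DAYS - 1.0 / EARTH_DAYS)

print(resonance_error_days(EARTH_DAYS, 8, VENUS_DAYS, 13))  # under a day
print(synodic_days)                                         # about 584 days
```

    Whether that near-commensurability has any dynamical consequence for the solar dynamo is exactly what the linked paper debates.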

  3. Ireneusz Palmowski

    Abstract
    While a connection between the El Niño‐Southern Oscillation (ENSO) and the Northern Hemisphere wintertime stratospheric polar vortex appears robust in observational studies focusing on the period before 1979 and in many modeling studies, this connection is not evident over the past few decades. In this study, the factors that have led to the disappearance of the ENSO‐vortex relationship are assessed by comparing this relationship in observational data and in operational subseasonal forecasting models over the past few decades. For reforecasts initialized in December, the models simulate a significantly weaker vortex during El Niño (EN) than La Niña (LN) as occurred before 1979, but no such effect was observed to have occurred.
    https://agupubs.onlinelibrary.wiley.com/doi/10.1029/2018JD029961#.XP9Q-rEbx7Q.twitter

  4. Re: paper about the “pause” or “hiatus”

    Here it is, and imo it is well worth reading: “Inference related to common breaks in a multivariate system with joined segmented trends with applications to global and hemispheric temperatures.”

    The copy at the Journal of Econometrics is gated. Here is an open copy:

    https://arxiv.org/abs/1805.09937

    • Steven Mosher

      Rather hilarious effort, when they depend on finding breaks in a forcing series without regarding the large uncertainties in that series. Further, they used an out-of-date forcing series (Hansen 2011).

      • David L. Hagen (HagenDL)

        Steven Mosher
        Presumably later data will provide a longer time series showing greater significance using their method. Under the scientific method, YOU bear the burden of proof to show the error in their methodology and/or results using more extensive radiative data. Otherwise your rhetorical ad hominem attack is inconsequential.
        Inference Related to Common Breaks in a Multivariate System with Joined Segmented Trends with Applications to Global and Hemispheric Temperatures, Kim et al. https://arxiv.org/pdf/1805.09937.pdf

        For the filtered series from 1900 to 1992, we reject the null at less than 5% significance level in all pairs of radiative forcing and temperature series. Hence, there is clear evidence of a break in temperatures that is near 1960 (varying between 1954 and 1966 depending on the series and the forcing). This concurs with the common break results obtained in the previous section. When using the sample 1963-2014 we reject the null in seven and eight cases out of eleven filtered series at the 10% significance level with well-mixed greenhouse gases and total radiative forcing, respectively. Since the sample size is small (T = 52), we might expect that the LR test may have little power, but our result suggests strong evidence for the presence of a break in temperature series. Note that the evidence for a break is stronger when using bivariate systems involving TRF. This is due to the fact that TRF exhibits a larger decrease in slope compared to WMGHG. The errors are also more strongly correlated. For instance, solar irradiance is a part of both TRF and temperatures and an important source of variations. When considering systems with TRF, the only pairs that do not allow a rejection at the 10% level are those associated with Northern hemisphere temperatures. For these, the p-values range from .14 to .19, while for all other pairs they are below 10%. Given that the sample size is small, the overall evidence strongly indicates a break in temperatures in the early 90s consistent with the presence of the much-debated “hiatus” …
        Conclusion: … Our empirical results show that, once we filter the temperature data for the effect of the Atlantic Multidecadal Oscillation (AMO) and the North Atlantic Oscillation (NAO), the breaks in the slope of radiative forcing and temperatures are common, both for the large increase in the 60s and the recent “hiatus”… Our results indicate that indeed the “hiatus” represents a significant slowdown in the rate of increase in temperatures, especially when considering global or southern hemisphere series, for which our test points to a rejection of the null of no change for all data sources considered.

        Data
        Data
        The annual temperature data used are from the HadCRUT4 (1850-2014) (http://www.metoffice.gov.uk/hadobs/hadcrut4/data/current/download.html) and the GISS-NASA (1880-2014) datasets (http://data.giss.nasa.gov/gistemp/). The Atlantic Multidecadal Oscillation (AMO) and the North Atlantic Oscillation (NAO) series (1856-2014) are from NOAA (http://www.esrl.noaa.gov/psd/data/timeseries/AMO/ and http://www.esrl.noaa.gov/psd/gcoswgsp/Timeseries/NAO/). As stated above, for global temperatures, we also use the data from Berkeley Earth (Rohde et al., 2013) and the dataset in Karl et al. (2015). We also use series from databases related to climate model simulations by the Goddard Institute for Space Studies (GISS-NASA). The radiative forcing data obtained from GISS-NASA (https://data.giss.nasa.gov/modelforce/; Hansen et al., 2011) for the period 1880-2010 include the following (in W/m2): well-mixed greenhouse gases, WMGHG (carbon dioxide, methane, nitrous oxide and chlorofluorocarbons); ozone; stratospheric water vapor; solar irradiance; land use change; snow albedo; stratospheric aerosols; black carbon; reflective tropospheric aerosols; and the indirect effect of aerosols. The aggregated radiative forcing series are constructed as follows: WMGHG is the radiative forcing of the well-mixed greenhouse gases and has a largely anthropogenic origin; Total Radiative Forcing (TRF) is WMGHG plus the radiative forcing of ozone, stratospheric water vapor, land use change, snow albedo, black carbon, reflective tropospheric aerosols, the indirect effect of aerosols and solar irradiance.
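        The segmented-trend idea in the passages quoted above can be illustrated with a toy least-squares break search (my own sketch on synthetic data; the paper's actual inference uses a likelihood-ratio test on a multivariate system, which this does not reproduce):

```python
# Toy illustration of locating a single slope break in a trend by least
# squares -- far simpler than the likelihood-ratio test in the Kim et al.
# paper quoted above, but it shows what a "joined segmented trend" is.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1950, 2015)
true_break = 1992
# Synthetic series: slope 0.02/yr before the break, 0.005/yr after, plus noise.
trend = np.where(years < true_break,
                 0.02 * (years - years[0]),
                 0.02 * (true_break - years[0]) + 0.005 * (years - true_break))
y = trend + rng.normal(0, 0.02, size=years.size)

def fit_sse(break_year):
    """Sum of squared errors of a continuous piecewise-linear fit."""
    x1 = (years - years[0]).astype(float)
    x2 = np.maximum(years - break_year, 0).astype(float)   # hinge term
    X = np.column_stack([np.ones_like(x1), x1, x2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

candidates = years[5:-5]                 # keep both segments non-trivial
best = min(candidates, key=fit_sse)
print(best)                              # should land near the true 1992 break
```

        The hinge term keeps the fitted trend continuous at the candidate break, mirroring the “joined segmented trends” in the paper's title.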

  5. Re: the reference to a new desalination technology. It looks like the Lausanne people are reproducing what the Chlor-Alkali industry has been doing in caustic evaporation trains for over a century. It works. For desal, these guys have a better idea: https://www.saltworkstech.com.

  6. Ireneusz Palmowski

    Solar disk without spots and coronal holes.

  7. Ulric Lyons

    ‘Sun’s 11 year cycle appears to be driven by alignment of the planets’

    This correlation has been noted since the mid-1800s. I independently found it in 2004, and in 2013 I found the more complete solution for the ordering and timing of each cycle maximum and for the occurrence of each centennial minimum. I can also show why grand solar minima series occur on average every 863 years, e.g. from 350 AD, 1220 AD, and 2095 AD.

  8. Ulric Lyons

    ‘Improving Climate Predictions over Decades’

    As usual, I argue that the AMO functions as a negative feedback to changes in the solar wind temperature/pressure, and also controls global lower troposphere water vapour and low cloud cover changes as directly associated negative feedbacks.

    https://www.linkedin.com/pulse/association-between-sunspot-cycles-amo-ulric-lyons/

  9. Here I correct some of the misrepresentations and strawman arguments in this comment https://judithcurry.com/2019/05/25/week-in-review-science-edition-102/#comment-893801 and in following comments by Ellison on that WIR thread. I'll expand on my previous comments to clarify the point I made that has been misrepresented.

    If the pre-1967 learning rates for nuclear power had continued to 2015, the overnight construction cost of nuclear power plants in the US, Canada, France, Germany, Japan, India and South Korea would now be:
    US, CA, FR, DE, JP, IN, KR

    • At the actual historical deployment rate over this period:
    9%, 15%, 5%, 7%, 13%, 37%, 11%.

    • At the linear rate:
    6%, 10%, 3%, 4%, 7%, 33%, 8%.

    • At the accelerating rate:
    5%, 7%, 2%, 3%, 4%, 31%, 5%.

    (source: Table 3 https://www.mdpi.com/1996-1073/10/12/2169/htm )

    If the regulatory impediments to nuclear power are removed, nuclear could return to rapid learning rates and an accelerating deployment rate. This does not mean a rapid drop in OCC to 10% of current levels. I did not say that nor imply it.

    If the health externality of power generation technologies is internalised, the deployment rate could accelerate more rapidly than the accelerating rates quoted above.

    Whereas, over the 50 year period from 1967 to 2017 the OCC of nuclear power could have reduced by the percentages quoted above, it will take much longer in future. This is because learning rate is the cost reduction per doubling of global capacity of construction starts. Cumulative global capacity in 1967 was just 32 GW. In 2017 it was about 512 GW – i.e. 4 doublings. Now the cumulative global capacity of construction starts is about 512 GW, so one doubling requires building another 512 GW of global capacity. At 25% learning rate, the cost would reduce by 25%. The higher the deployment rate, the faster the cost reductions will be achieved.
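    The doubling arithmetic in that last paragraph can be sketched as follows (the function and its name are mine; the 512 GW base and 25% learning rate are the figures quoted above):

```python
# Learning-curve cost model: cost falls by a fixed fraction (the learning
# rate) per DOUBLING of cumulative capacity of construction starts, so each
# successive percentage reduction needs twice the absolute build-out.
import math

def cost_after(capacity_gw, base_capacity_gw=512.0, learning_rate=0.25):
    """Relative overnight construction cost once cumulative construction
    starts reach capacity_gw, from a 512 GW base (current cost = 1.0)."""
    doublings = math.log2(capacity_gw / base_capacity_gw)
    return (1.0 - learning_rate) ** doublings

print(cost_after(1024))  # one doubling: 0.75
print(cost_after(2048))  # two doublings: 0.5625
```

    This is why the same 25% cost reduction that once took a few years of construction starts now takes another 512 GW of global capacity.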

    To achieve higher rates we need to remove the regulatory impediments to nuclear power. We could increase the rate substantially if we add the health externality of power generation technologies to the cost of the electricity each technology supplies.

    • I think you will find the regulatory problem stems from the vast and out-of-control proliferation of administrative rules, regulatory guides and allied items generated since the early 1970s. Basically, the bureaucrats in the NRC went wild, as unrestrained government agencies invariably do. The excessive administrative rules are the creation of the bureaucracy, not the overriding regulations. The administrative rules can be removed and/or simplified by the simple stroke of a pen within the NRC. No change to the overarching laws (Code of Federal Regulations) is really necessary.
      However, has any bureaucracy ever volunteered to reduce their own power? Nope, requires a higher authority – for example the Executive Branch (President Trump)
      By way of a contrast, fossil power plants and components are built with relatively little government interference. The plants are significantly more cost effective to build than nuclear power plants. The forces of the competitive marketplace and industry codes/standards produce superior and cost effective power plants. The heavy hand of government does not.
      If new nuclear power in the US is to emerge, the NRC must be given a major “haircut”, just as was recently done with the EPA.

  10. “Is the Noble Gas‐Based Rate of Ocean Warming During the Younger Dryas Overestimated?”. https://rdcu.be/bFTlr

    A recent result from the clathrate‐containing WAIS Divide Ice Core showed tight covariation between ocean and Antarctic temperatures throughout the last deglaciation, except for the Younger Dryas interval.

    Look again, there were little warming and cooling cycles in Greenland ice core data during the long cold that did not show up in Antarctic Ice core data.

    Ice cores from the NH oceans are stored in Greenland. Ice cores from the SH oceans are stored in Antarctica. The Greenland ice cores showed much colder temperatures, sometimes warming to equal Antarctic temperatures.
    There were melt water evaporation and snowfall events in the NH that did not show up in the SH data. The NH was colder because ice extent was much larger.
    Younger Dryas was only in the NH because it was caused by dumps of ice cold melt water, that quickly mixed with massive oceans and caused the rapid cooling and warming spikes.

    If you will ever look at it and discuss it, the ice core data is a true treasure that can help understand what did happen in ice ages and warm periods and how the major ice ages got bigger and bigger until suddenly we have smaller warm and cold cycles for a new normal.

  11. “The cost of new nuclear plants is high, and this significantly constrains the growth of nuclear power under scenarios that assume ‘business as usual’ and modest carbon emission constraints. In those parts of the world where a carbon constraint is not a primary factor, fossil fuels, whether coal or natural gas, are generally a lower cost alternative for electricity generation.”
    http://energy.mit.edu/wp-content/uploads/2018/09/The-Future-of-Nuclear-Energy-in-a-Carbon-Constrained-World.pdf

    The MIT study recommends ways to reduce costs in the US – and these have very little to do with regulatory reform. Reduction to 10% of current costs would require a 90% reduction in materials and a 90% increase in productivity.

    The solution is standardization and production efficiencies – in factories especially.

    Modest carbon constraint – btw – is the Paris fallback for the parts of the planet where power demand is growing most strongly. Fossil fuels in modern plants save far more lives than more expensive options.

    http://www.aseanenergy.org/resources/reports/aseans-energy-equation/

    • Nope, the problem is excessive regulation. Having been heavily involved in building (and running) both nuclear and fossil power plants, I can say the nuclear plant construction activities are utterly mind-numbing and grossly inefficient. This is the direct result of the fiendish complexities caused by a stupefying armada of rules, regulations and stunning layers of bureaucrats and watchers who only wrap progress around the axle.
      Last time I checked, the eggheads at MIT had never built a commercial power plant of any kind.

      • News flash, I have been in the power industry for nearly 50 years. Do the math.

      • Your reference was written by a “green energy” advocate with virtually no experience in nuclear and fossil power plant construction. As near as I can tell, he has no technical credentials either. I gather you are in a similar boat.

      • This is a journalist at Forbes who does a reasonable job of summarizing the MIT study and interviewing the study leader. The study itself is the product of well credentialed academic and industry people. But I take it you haven’t actually built anything for a very long time?

        One thing that they identify as a major factor in the cost of nuclear in the US is management failures. So it’s your fault?

      • Nope, problem stems primarily from excessive government regulations that create projects beyond the ability of mortal man to manage. Plain to see for those of us who have built both fossil and nuclear plants.
        Pretty much what happens when “big government” shows up and tries to run the show. The bureaucrats have no incentive to do a good job – they use other people’s money and are completely de-coupled from accountability.

      • Actual in-depth experience versus theory from academia. Sound kind of like climate theoretical models versus reality and actual observation.

      • “The study is designed to serve as a balanced, fact based, and analysis-driven guide for stakeholders involved in nuclear energy. Policy makers, utilities, existing and startup energy companies, regulators, investors, and other power-sector stakeholders can use this study to better understand the challenges and opportunities currently facing nuclear energy in the U.S. and around the world. The report distills results and findings from more than two years of primary research, a review of the state of the art, and quantitative modeling and analysis.”

        Check the foreword at least for contributors and reviewers.

        ““A shift towards serial manufacturing of standardized plants, including more aggressive use of fabrication in factories and shipyards, can be a viable cost-reduction strategy in countries where the productivity of the traditional construction sector is low,” says MIT visiting research scientist David Petti, study executive director and Laboratory Fellow at the Idaho National Laboratory. “Future projects should also incorporate reactor designs with inherent and passive safety features.”

        These safety features could include core materials with high chemical and physical stability and engineered safety systems that require limited or no emergency AC power and minimal external intervention. Features like these can reduce the probability of severe accidents occurring and mitigate offsite consequences in the event of an incident. Such designs can also ease the licensing of new plants and accelerate their global deployment.”
        http://news.mit.edu/2018/mitei-releases-report-future-nuclear-energy-0904

        But thanks for your unsubstantiated personal opinion.

      • Without substantial reform of the regulatory agencies, the advantages of advanced passively safe reactors will not be enough to overcome the severe competitive disadvantages caused by excessive regulations.
        The Westinghouse AP1000 reactors are demonstrably orders of magnitude safer than earlier designs. However, no reduction in regulatory costs occurred – they actually ballooned. Reality versus the theory that safer reactors reduce costs.
        The point is that without regulatory reform, the unnecessary monetary burden imposed by the government is so great that the safer advanced designs will be unable to compete in the marketplace.
        The nuclear industry’s response of bleating for subsidies from consumers and taxpayers is disheartening and ultimately self-defeating.

      • Fix the root cause of the problem: excessive government regulations.

    • “But thanks for your unsubstantiated personal opinion.”

      Seen from the outside, his opinion is closer to reality than yours or that of MIT’s academic gurus.
      If I may add, Robert, you often have this tendency to argue as if you were an expert in everything… if you think that is a good style, go ahead and keep on doing it.

      It is a fact that it takes endless time, paperwork, review panels, etc… just to inspect and re-do one single weld.
      Small accidents, like a twisted ankle of a worker in the NON-nuclear part of a reactor under construction can lead to investigations and the sure call for stopping the whole project by the “green” propaganda machine.
      For this we can thank, in the western world, the litigation lawyers who basically blocked tens and tens of reactor projects in the USA alone in the 70s and 80s.
      If this weren’t true, how do you explain that the same reactor (EPR design) can be built efficiently, two of them!, in a much shorter time in China, compared to the endless ages in Finland and France?

      • “The recent experience of nuclear construction projects in the United States and Europe has demonstrated repeated failures of construction management practices in terms of their ability to deliver products on time and within budget.”

        There is a whole chapter on regulation in this MIT nuclear futures document.

        “Regulatory agencies around the world have adopted basic principles similar to those described in the policies of the IAEA and in U.S. NRC regulations, though they vary in their detailed application of these policies and principles—for example, with respect to required burden of proof. While significant cultural, social, and political differences may exist between countries, the fundamental basis for assessing the safety of nuclear reactors is fairly uniform among countries with established nuclear power programs.”

        And the focus of the report from authoritative industry and academic sources is to pave the way for advanced reactors.

        The difference between you and me seems to be relying on an authoritative source as opposed to waffling on about sprained ankles.

      • Robertok06,

        Thank you for dropping in and adding your contribution, which is backed by enormous expertise in nuclear energy matters. Further comments would be most welcome and helpful.

        It is unfortunate that so many people have caught the anti-nuclear disease, just like the CAGW disease. The paranoia in these two areas is doing untold harm to human well-being.

        kellermfk is dead right that the root cause of the problem is excessive government regulations. The regulatory environment for nuclear is totally out of whack with the safety and health impacts of nuclear power compared with other electricity generation technologies. Root cause is:
        > Anti nuclear power protest movement
        > public fear
        > driven by activists organisations like Greenpeace, WWF, FOE, etc. and MSM and entertainment industry
        > governments have to act or they don’t get elected
        > Legislation is implemented, regulatory agencies established, masses of regulations build up over time
        > designs take many years to decades to get approved and cost the industry billions of dollars to get through the approval process
        > then regulations are changed, and long delays and additional costs are imposed to get the design re-certified or get the plant completed with the new design changes.
        > Designs have to become huge to spread the design cost
        > The plants have to operate for 40 to 80 years to be worth building
        > Operation and maintenance cost is huge because of the regulations
        > fuel costs and the back end of the fuel cycle are tens of times more costly than they should be if all technologies were regulated on the same basis regarding equivalent risks and health externalities.

        If not for the excessive regulation (for some 60 years), the cost of nuclear could now be around 5% to 10% of what it is. We’d have SMRs of 5 to 200 MW capacity. Highly flexible, able to follow load (like submarines do), fueled for life. Short operating lives so they are replaced quickly. In this case technology would develop rapidly.

        Unjustified regulatory ratcheting set the world back enormously. The lost 50 years can never be recovered.

        But we could remove the impediments and make faster progress in future, for the benefit of humanity.

      • “There are also significant variations in capital costs by country, particularly between the emerging industrial economies of East Asia and the mature markets of Europe and North America. Variations have a variety of explanations, including: differential labour costs; more experience in the recent building of reactors; economies of scale from building multiple units; and streamlined licensing and project management within large civil engineering projects.”
        http://www.world-nuclear.org/information-library/economic-aspects/economics-of-nuclear-power.aspx

        The central problem was the concentration on light water reactors with uranium enrichment to provide a supply for armaments. Just as is happening in Iran and North Korea today. To get even 34% thermal efficiency required that these plants’ cores be enlarged substantially, resulting in capital costs in the order of $10B. This causes problems in terms of construction risks and in operating in deregulated markets.

        “Construction interest costs can be an important element of the total capital cost but this depends on the rate of interest and the construction period. For a five-year construction period, a 2004 University of Chicago study shows that the interest payments during construction can be as much as 30% of the overall expenditure. This increases to 40% if applied to a seven-year construction schedule, demonstrating the importance of completing the plant on time. Where investors add a risk premium to the interest charges applied to nuclear plants, the impact of financing costs will be substantial.” op. cit. But attributing all construction delays in large and complex projects to regulatory factors is incorrect.
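        As a rough illustration of why the construction schedule matters so much for financing cost, here is a toy calculation (my own sketch, not from the Chicago study: it assumes uniform annual spend, mid-year interest accrual, and an illustrative 10% rate):

```python
def idc_fraction(years, rate, total_spend=1.0):
    """Fraction of the all-in cost that is interest during construction (IDC),
    assuming uniform spend per year accruing interest until completion."""
    annual = total_spend / years
    # Each year's spend compounds from its mid-year point to the end of construction
    idc = sum(annual * ((1 + rate) ** (years - y - 0.5) - 1) for y in range(years))
    return idc / (total_spend + idc)

five = idc_fraction(5, 0.10)   # five-year schedule
seven = idc_fraction(7, 0.10)  # seven-year schedule
```

        Under these assumptions the seven-year schedule carries a noticeably larger interest share than the five-year one, in line with the direction of the quoted 30%/40% figures; the exact numbers depend on the rate and spend profile.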

        The MIT study in chapter 5 – and I doubt that any of these guys have read past the title – discusses LNT and a more risk-based approach for advanced nuclear designs. The TVA and the NRC have recently gone in that direction by reducing safety zones for advanced reactors to site boundaries. There are, however, regulatory principles informed by past failures. Which of them should be ditched?

        And how much do they add to cost?


        Rational engineering – and I have decades of design and construct experience on projects worth up to $10B – says very little. Only advanced modular designs with radically reduced capital cost, passive safety and factory production of standardized units can turn the nuclear tide.

  12. David Appell

    Judith wrote:
    “Alabama’s stand for campus free speech sets an example for the nation”

    But not so their stance for public TV, huh?

  13. And if you want to conserve water, build resilience to flood and drought, restore water tables, ensure food security and increase agricultural productivity – carbon is much better returned to soils and ecosystems than in the atmosphere.

    A new agricultural fund.

    https://www.cnbc.com/2019/06/11/this-is-a-15-trillion-opportunity-for-farmers-to-fight-climate-change.html

    A media report on carbon neutral Australian farmers.

    https://www.abc.net.au/news/rural/2019-06-08/carbon-neutral-livestock-achievable-by-2030-says-mla/11046592

    A study on soil carbon and grazing methods in China.

    https://peerj.com/articles/7112.pdf

  14. Curious George

    “We all want to change the world.” The “world” probably means the human society, the social system. Yes, the society is far from ideal, and it could use an improvement. Assuming that the desire is pure, will we change it for better, or for worse? There are always thousands of ways to do something wrong for every way to do it right. Does the Precautionary Principle apply here?

  15. The main argument for rejecting the important role that solar variability has on climate that proxies support is that total solar irradiance changes are too small. It is essentially the same argument for rejecting that solar variability has a planetary cause, as the tidal effect of the planets on the Sun is too small.

    In essence it boils down to the confrontation between evidence and theory that has played out multiple times in science, due to the reluctance to accept evidence that contradicts our understanding. Most of the time the evidence-supported possibility turns out to be right and our understanding is shown to be wrong or incomplete. When Alfred Wegener presented evidence in 1912 that the continents had changed their position, supported by abundant observations, it was rejected by most geologists mainly because it lacked a mechanism. It took 50 years to advance our knowledge enough to show him right and discover the mechanism.

    The evidence is clear that solar variability has a huge effect on climate on the multi-decadal to millennial scale. Those that stick to a lack of known mechanism and the smallness of TSI changes to reject it will be shown as wrong as the geologists who rejected Wegener’s theory. In the confrontation between evidence and theory it is safer to bet on the evidence.

    Stefani et al., 2019 is an advance in our understanding of how the planets can affect solar activity, an idea that is heretical among most solar physicists, including Prof. Svalgaard, who has attacked it continuously at WUWT. They have too much confidence in what they know and neglect to consider that there is a lot more that they don’t know.

    As an experimentalist, my education consisted of seeing many beautiful hypotheses slain by ugly facts, and the lesson learned was to always stick to the evidence as the best default approach to science. I started my climate analysis convinced that it could not have been the Sun, due to the powerful arguments of theory, but the evidence is obstinate and I was forced to change my preconception. Milankovitch forcing is responsible for getting us in and out of glaciations, and solar variability is the main forcing responsible for centennial to millennial climate variability. Changes in greenhouse gases do not have much role in the climate of the Pleistocene-Holocene. Modern global warming (1750-now) is mainly the result of the recovery of solar activity after the Maunder Minimum and the Modern Solar Maximum of 1935-2005. The contribution by GHGs since 1950 is an important but secondary factor.

    The evidence for the important role of solar activity is going to be reinforced by the below average solar activity of solar cycles 24 and 25. While CO2 is expected to continue increasing unabated over the next decades, solar activity should continue being reduced until around 2033, producing strikingly opposite predictions that should help clarify the issue.

    Let’s just hope that climatologists don’t fall into the denial geologists maintained for 50 years, ignoring the evidence because it didn’t fit the theory. The CO2 hypothesis has proven to be very accommodating: after being dealt a huge blow by Milankovitch theory, many scientists still believe it is the cause of glacial/interglacial transitions, and not the effect.

    The planetary theory of solar variability still has a long way to go to be properly established, but Stefani et al. 2019 is an important step. No doubt die-hard opponents like Leif Svalgaard will be quick to dismiss its relevance. As Max Planck famously said, “science advances one funeral at a time.” Almost nobody in a scientific dispute is ever convinced by contrary evidence.

    • Ulric Lyons

      Have you actually looked at my findings on the ordering of sunspot cycles and centennial solar minima? The phase relationships of Earth-Venus and Jupiter-Uranus dictate the timing of each cycle maximum, and of each centennial minimum. Their long term cycle of 1726.6 years dictates the occurrence of grand minima series, which happen at the half cycle on average every 863 years. Around these nodes the quadrupole alignments are more displaced, physically resulting in each series of longer centennial minima. It is my second proof of the next grand solar minima series beginning late this century. The best previous analogue for the next two centennial solar minima, is the pair between 1350 and 1195 BC, which was likely the greatest period of human civilisation collapse during the Holocene.

      • Ulric Lyons

        “An interesting animal 2200, a warm peak but also an abrupt downturn to cold”

        The warm peaks in Greenland are accompanied by increased cold periods to the mid latitudes because of increased negative AO/NAO states during weak solar periods. That drives an increase in El Nino conditions and a warmer North Atlantic phase, hence the Greenland warming. Regions like the Amazon, northern India and China, the northern Levant, and southern Australia dry out and have greater temperature variability, and regions like the Sahel and southern Europe become wetter. One can only make sense of it all by first knowing what the ocean phase response is, and then how that plays out regionally. The same applies to the 1350-1195 BC warm peak in GISP2, and the near 1000 AD warm peak in GISP2, which was the Oort solar minimum.
        https://en.wikipedia.org/wiki/4.2_kiloyear_event

        It is worth noting that the grand solar minima of 1250-1195 BC and 350-400 AD both had a series of great earthquakes around the Mediterranean.

        The 2354 BC to 2345 BC period in your link had a series of Jovian configurations driving extreme cold events. 2348 BC was the same type as in 1010 when the Nile froze, and also the extreme winters of 1600-1602, 1784, and 1963. The UK-Ire inundation is either cool wet summers/autumns or strong winter blocking like in Jan-Feb 2014.

      • Thanks again UL, you hint to an improved perspective.
        My archaeo evidence stops at around 2000bce, so I won’t stray beyond.

        Quoting you “The 2354 BC to 2345 BC period in your link had a series of Jovian configurations driving extreme cold events. 2348 BC was the same type as in 1010 when the Nile froze,–“.

        2348/2345 bce links to my main archaeological evidence: a post-build alteration to a wider obliquity angle (from ~14.5, as several others, to 23.x). For mainstream science this is tough, but it is the strongest and easiest to verify of evidence. Which was my starting point to Javier’s remark, viz ‘theory versus evidence’.

        The 4.2kyrs Wiki link in your post describes many extreme events that may be easier to explain by the mentioned evidence rather than by climate theory. I say so because the evidence is there. The wiki piece does not mention the Quelccaya in Peru, quote from wiki: ”As the ice cap is retreating, it is exposing almost perfectly preserved, unfossilized plant specimens that have been dated to 5,200 years before present, indicating that it has been more than 50 centuries since the ice cap was smaller than it is today.” First, here the date was revised downwards (last I know as of 2009? it was nearer to the 2345); second, it was an abrupt increase in permanent ice cover that froze and preserved plants. The location is perfect for that, at 14deg S, which would be the original tropic, and so a max insolation decrease. But see here https://kb.osu.edu/bitstream/handle/1811/80465/1/StahlHenry_thesis.pdf

        The Jupiter – and Saturn – connection hints at a possible primary cause due to the gravitational ‘tug of war’; it was much more than Mediterranean earthquakes in the Holocene max (the effect has also surfaced in Lampedusa).

      • “The 4.2kyrs Wiki link in your post describes many extreme events that may be easier to explain by the mentioned evidence rather than by climate theory. I say so because the evidence is there.”

        The regional climate proxies are not evidence for the cause, which has to involve ocean phases because of the regional shifts in rainfall.

        “The Jupiter – and Saturn- connection hints to possible primary cause due to the gravitational ‘tug of war’”

        The larger gravitational tugs are at every syzygy, there is nothing special gravitationally about the trigon period. Good extensive correlations are far more important than hints.

      • You read me wrong.
        Exactly my view “regional climate proxies are not evidence for the cause” and neither are ” ocean phases because of the regional shifts in rainfall.” because all seem as secondary effects of a major driver.
        Planetary alignments corresponding to event dates are clues to follow (apparently a favourable condition/trigger but not driver).
        The evidence I referred to is here, (and we picked the right date for it – the solstice). http://old.culturemalta.org/71/162/Hagar-Qim-and-Imnajdra

        The structure on the right is still fully functional and capable of predicting the solstice day and hour. Designed/built ~2900bce to an obliquity of 14.5, then modified to ~23.x deg. The middle is pre 3200; the left pre 5200 and both to obliquity ~14.5 . The axial orientation shows micro-plate tectonic rotations.
        Design requirement is an axial alignment to the equinox sunrise point on horizon, ie EW. If the last is still fully functional, which I tested repeatedly with different setup, then the earlier ones are not ignorant trials until they got it right. This is the hard evidence. (pls ignore the last paragraph, they have been carried away by the Hancock speculations)

      • Ulric Lyons

        Obviously the Sun is the driver, with the solar variability being ordered by the planets at various scales. I cannot take this obliquity shift idea seriously, sorry.

      • I told you it is a tough one. But never mind; thanks for the discussion.

      • Don’t mean to press the point, but this I discovered in a link on the Ice-man paper. Link: https://www.researchgate.net/profile/Jean_Haas/publication/227538423_A_major_widespread_climatic_change_around_5300_cal_yr_BP_at_the_time_of_the_Alpine_Iceman/links/5c50227c299bf12be3eb76bc/A-major-widespread-climatic-change-around-5300-cal-yr-BP-at-the-time-of-the-Alpine-Iceman.pdf?origin=publication_detail
        From abstract “Palaeo-environmental and archaeological data from Arbon Bleiche, Lake Constance (Switzerland) give evidence of a rapid rise in lake-level dated by tree-ring and radiocarbon to 5320 cal. yr BP. This rise event was the latest in a series of three successive episodes of higher lake-level between 5550 and 5300 cal. yr BP coinciding with glacier advance and tree-limit decline in the Alps.”
        Note the start at 5550BP (~3550bce) as per my link https://melitamegalithic.wordpress.com/2019/03/15/searching-evidence-update-2/ , and it all fits.
        All tell-tale proxies indicate abrupt events.

    • Javier said “In essence it boils down to—-enough to show him right and discover the mechanism.” To that, Amen to the power of ten.
      However the geologists are still in denial, the few who are plodding ahead are meeting great resistance. Especially when it comes to timing. The ages are far shorter than thought (eg in tectonic rotations and their time and place).
      A second point: Milankovitch built his theory on the assumption that the Stockwell/Newcomb assertion is correct. Evidence says otherwise, and that upsets all.

      Ulric Lyons points to 1350 and 1195bce as ” likely the greatest period of human civilisation collapse during the Holocene.”. That stretch of time is an Eddy cycle root (thanks to Javier for the tip). Probably not, as the earlier roots were far more drastic, as evidence shows. Incidentally it seems the ‘Kepler Trigon’ planetary alignments seem to have a finger in the pie here. See https://melitamegalithic.wordpress.com/2019/06/14/ask-otzi/

      • Ulric Lyons

        melitamegalithic

        The Eddy period reverses sign from 8200 BC to 1200 BC through the GISP2 series, and it has no known source. My mean 863 year nodes are from: 2220 BC, 1357 BC, 494 BC, 369 AD, 1232 AD, and 2095 AD.
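        The spacing of the node dates quoted above is easy to check in a couple of lines (a sketch; BC years written as negative integers, ignoring the missing year zero):

```python
# Node years from the comment above: 2220 BC, 1357 BC, 494 BC, 369 AD, 1232 AD, 2095 AD
nodes = [-2220, -1357, -494, 369, 1232, 2095]

# Gaps between consecutive nodes; each should be the claimed 863-year half cycle
gaps = [b - a for a, b in zip(nodes, nodes[1:])]
```

        Every gap comes out at 863 years, matching half of the quoted 1726.6-year cycle to within rounding.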

        Hamlet’s Mill has no place in grand solar minima series because Saturn does not have a finger in the pie.

      • Ulric Lyons
        I was surprised to find the Eddy cycle as had been presented for the last 2kyrs, also fitted so close at the roots to dates ~2345, 3195, 4375; dates for which I already had correlations from several proxies, plus, importantly, archaeo/geological. The evidence I have is limited to that stretch of time, ie ~5200 to ~2345bce.
        More than that I do not know, but it is beyond coincidence. You added the 1350-1195bce historical; a root.
        I do not know anything about reversing of sign in Gisp2 (Gisp2 data from wiki ‘temp anomaly’ -my source- appears to be presented ~1100yrs out of step with Vostok and Kilimanjaro; important when attempting correlations of global events).
        I don’t ‘do’ solar minima -not informed-, and less so Hamlet’s Mill (did not read it, so ???), I seek to explain the events told by the archeological structures which tell their story clearly. That story agrees with the heretical not with the accepted dogma; hence my questioning.
        And yes, thank you for the data and the clues that are provided here.

      • Ulric Lyons

        Sorry that should have been 6200 BC, which is cold on GISP2. Then move forward in 1000 year steps and by the time you reach 4200 BC and 3200 BC they are warm spikes instead.
        I have no idea what you mean by root dates.
        Hamlet’s Mill, the trigon over 40 or 43 Jupiter-Saturn synodic periods.
        And if you “don’t ‘do’ solar minima”, you need not have replied to my comment.

      • Ulric Lyons

        Correction.. 2200 BC and 1200 BC.

      • Ulric L
        Thank you for the reply. Somehow there always seems to be a new lead/hint to follow in all correspondence. I’ll read H’sM.

        “Sorry that should have been 6200 BC, which is cold on GISP2” . Yes, but more than cold, it was an abrupt turning point. The evidence I have is for 4375bce and 2345, however 6200 is similar and links to the Dogger Bank sinking.
        But see my link here: https://melitamegalithic.wordpress.com/2019/03/15/searching-evidence-update-2/ Those three dates correspond to an abrupt change from cold at both Gisp2 and Vostok (polar) but in reverse for equatorial Kilimanjaro. They also correspond to the root of the Eddy, as it is later for the LIA and the earlier DACP (and your (1350+1195)/2 =1272 ).
        Root is bottom of Eddy curve, as opposite to peak/top. RWP and MWP were peaks. 355xbce was a near? peak, in the Med signalling a change to adverse times as evident in prehistoric archaeo. In the holocene max that change was also abrupt, unlike in the last 2k yrs.

        Your 4200bce is a root/bottom, 4375 is a date from tree-rings, but they are most likely one and the same event. 3200 is the next root and here tree-rings give 3195 (3202 is a trigon) Again it is likely the same event.
        But not in archaeology/geology since there occurs an evident tectonic event (~3200, as is ~5200). There seems to be certain conditions that favour drastic change and a trigger that sets it going.

      • “Yes, but more than cold, it was an abrupt turning point.”

        Around 6200 BC there was a positive AO/NAO regime with strong trade winds, as with the next two coldest periods on GISP2 at 2700-2500 BC and the 700s AD. They are all warmer periods in the Vostok proxy. Their periodic interval is a mean 2 × 1726.6 years, a product of the long cycle of centennial minima that I have identified.
        There is great temptation in believing that cycles should conform to sinusoidal curves. Show me what produces an Eddy curve and I will take note. Otherwise I will continue as usual mapping the noise.

        “Your 4200bce is a root/bottom, 4375 is a date from tree-rings, but they are most likely one and the same event.”

        I corrected that to 2200 BC. It’s not as if extreme cold winters don’t occur outside of centennial solar minima. A tree frost ring from a single year is seasonal weather and not evidence of a climate regime or of abrupt climate change.

      • UL, thanks again.
        My slip with 4200; acknowledged. It is 2200. An interesting animal 2200, a warm peak but also an abrupt downturn to cold (itself an Eddy root – for what it’s worth). Equally interesting is the earlier 2345 point where Gisp2 and Vostok surge up but Kilimanjaro takes a sharp downturn. As had been pointed out here: https://www.researchgate.net/publication/301621337_Why_we_shouldn't_ignore_the_mid-24th_century_BC_when_discussing_the_2200-2000_BC_climate_anomaly
        Allow me please to make something clear; I am not interested in climate per se, but why between 2900-2200 a megalithic calendar appears to have its dimensions modified by extension for an apparent change of obliquity from ~14.5 to ~23.5 . There are a number of proxies indicating some abrupt event, but none indicate what it was.
        Agreed, the Eddy sinusoidal is only a hypothetical construct, which somehow seems to fit, but with no hint of what it is or what causes it, afaik.
        There appears to be a random element when it triggers (I point here that: 6200-doggerland sinking; and 5200 and 3200 both altered the central Med to some extent tectonically -that is certain- , and possibly elsewhere on the globe too (Otzi is case in point > quote: “Since they (grasses) apparently grew also at 3210 m (the Ötzi finding place) indicates that a warmer period than today may have existed at the respective time period.” One date for Otzi is 3200. Now 3200 was an abrupt temp downturn but temp rose again by even more by 2800. An abruptly frozen body where it had been warm, and remained frozen in spite of higher temp at 2200. Was it also an altitude change? a tectonic event as in the Med?)
        That is the evidence.

      • Ulric Lyons

        Sorry my latest reply went in the wrong place up the thread ^^

      • Melitamegalithic

        How significant is the impact of next Eddy cycle root likely to be for global economy and, therefore, for human well-being? Could you provide answers to these questions:

        1. When is the next Eddy cycle root expected, and how long would it last?

        2. On what areas of the world would it have the most impact?

        3. What impact sectors is it most likely to affect positively and which negatively? (Impact sectors: agriculture, forestry, health, sea level, storms, fresh water, ecosystems, energy).

        4. I suspect the main negative impact would be on agriculture. But agriculture comprises only 6% of world GDP. And trade can adapt rapidly to move agriculture products to areas with excess production to those that are badly impacted. So, what is the likely impact of the next Eddy cycle root on total global food production?

      • Peter Lang
        We seem to be heading to another Eddy peak; a warming. What that heralds I have little idea.
        If it will be like the Mediaeval warm period, we have a fair idea what, and we may be seeing the beginning of it in population upheaval. The political trigger may be a symptom of it.
        If on the other hand it will trigger events as in the Holocene max, then forget your questions and take RI Ellison’s advice – aim for max resilience. The main foe will be nature.

    • Ulric Lyons

      Javier writes:
      “While CO2 is expected to continue increasing unabated over the next decades, solar activity should continue being reduced until around 2033, producing strikingly opposite predictions that should help clarify the issue.”

      Stronger cooling should occur just after 2033 when the solar wind strengthens, driving colder ocean phases. Post-1995 warming is largely down to warmer ocean phases reducing low cloud cover and increasing lower-troposphere water vapour, as a response to weaker solar wind states.

  16. Judith,

    Thank you again for this remarkable selection of informative (mostly) articles! It generally takes me a few weeks to read through most of them.

    • Ireneusz Palmowski

      On the night of June 21 in the Oregon mountains, the temperature will drop below 0 C.

  17. Ireneusz Palmowski

    The beginning of winter in Australia.

  18. Ireneusz Palmowski

    You can see that the stratospheric winter polar vortex in the south is also weakened.


    There is a minimum of solar wind activity.

  19. Ireneusz Palmowski

    Strong wave is visible in the upper stratosphere in the area of operation of the polar vortex.

  20. Those of you with a mathematical statistical bent might enjoy the article by Kim et al. using multivariate break point analysis (aka “change point” analysis) to test the existence of the “hiatus”. Here is a selection:

    begin quote

    To motivate our quasi-likelihood ratio test, we first assume that $u$ is multivariate normal with mean 0 and covariance $\Sigma \otimes I_T$. This assumption will be relaxed when we derive the asymptotic distribution of our test. For a generic break date vector $k$ and the corresponding regressor matrix $X$, the log-likelihood function of $y$ is then given by

    $$l(k, \theta, \Sigma) = -\frac{nT}{2}\log 2\pi - \frac{T}{2}\log|\Sigma| - \frac{1}{2}(y - X\theta)'(\Sigma^{-1} \otimes I_T)(y - X\theta).$$

    Let $\hat{\theta} = (\hat{\theta}_1', \ldots, \hat{\theta}_n')'$ and $\hat{\Sigma}(k)$ be the maximum likelihood estimators for $\theta$ and $\Sigma$, which jointly solve

    $$\hat{\theta} = [X'(\hat{\Sigma}^{-1}(k) \otimes I_T)X]^{-1}[X'(\hat{\Sigma}^{-1}(k) \otimes I_T)y]$$

    and $\hat{\Sigma}(k) = T^{-1}\hat{U}_k'\hat{U}_k$, where $\hat{U}_k = [y_1 - X(k_1)\hat{\theta}_1, \ldots, y_n - X(k_n)\hat{\theta}_n]$.

    Thus, the maximized log-likelihood function is

    $$l(k) = -\frac{nT}{2}(\log 2\pi + 1) - \frac{T}{2}\log|\hat{\Sigma}(k)|. \qquad (4)$$

    It is useful to work with break fractions; hence define $\lambda_{ij} = k_{ij}/T$, $\lambda_i = k_i/T$ and $\lambda = k/T$, with the true break fractions $\lambda_{ij}^0$, $\lambda_i^0$ and $\lambda^0$ equivalently defined. With these definitions, functions of $\lambda$ such as $l(\lambda)$ and $\hat{\Sigma}(\lambda)$ will be used interchangeably with $l(k)$ and $\hat{\Sigma}(k)$.

    We now state the assumptions needed for the asymptotic analysis. Let $[\cdot]$ denote the integer part of the argument.

    Assumption 1. $0 < \lambda_{i1}^0 < \cdots < \lambda_{i m_i}^0 < 1$, with $k_{ij}^0 = [T\lambda_{ij}^0]$ for $i = 1, \ldots, n$.

    Assumption 2. $\delta_{ij} \neq 0$ for $j = 1, \ldots, m_i$ and $i = 1, \ldots, n$.

    Assumption 3. Let $u_t = (u_{1t}, \ldots, u_{nt})'$. Then $u_t$ is stationary with $E(u_t) = 0$ and $\operatorname{Var}(u_t) = \Sigma$. In addition, $T^{-1/2}\sum_{t=1}^{[Tr]} u_t = T^{-1/2}\Psi^{1/2}\sum_{t=1}^{[Tr]} e_t + o_p(1)$ and $T^{-1/2}\sum_{t=1}^{[Tr]} e_t \Rightarrow W(r)$, where “$\Rightarrow$” denotes weak convergence under the Skorohod topology, $W(r)$ is the $n$-dimensional standard Wiener process, and $\Psi = \lim T^{-1} E\left(\sum_{t=1}^{T} u_t\right)\left(\sum_{t=1}^{T} u_t\right)'$.

    end quote

    Not all of the typography survives copy/paste, especially superscripts and subscripts, but you get an idea of the authors' rigor and proficiency. I quote this so that readers will not automatically skip an article in the econometrics literature, misperceiving it as trivial. They obtain standard errors of estimates and p-values of the test statistics through bootstrapping.
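For readers who want to play with the core idea, here is a toy univariate sketch of my own (not the authors' multivariate quasi-likelihood procedure): estimate the break date of a joined segmented trend by searching candidate dates for the minimum residual sum of squares, which under Gaussian errors is equivalent to maximizing the log-likelihood l(k). All series and parameters below are synthetic.

```python
import numpy as np

def fit_one_break(y):
    """Grid-search the break date of a continuous ('joined') segmented
    linear trend by minimizing the residual sum of squares."""
    T = len(y)
    t = np.arange(T, dtype=float)
    best_ssr, best_k = np.inf, None
    for k in range(5, T - 5):  # trim the sample ends, as break tests do
        # regressors: intercept, trend, and slope change after date k
        X = np.column_stack([np.ones(T), t, np.maximum(t - k, 0.0)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        ssr = np.sum((y - X @ beta) ** 2)
        if ssr < best_ssr:
            best_ssr, best_k = ssr, k
    return best_k

# synthetic series: slope 0.02 per step, increasing by 0.08 after t = 60
rng = np.random.default_rng(1)
t = np.arange(100.0)
y = 0.02 * t + 0.08 * np.maximum(t - 60.0, 0.0) + rng.normal(0.0, 0.05, 100)
k_hat = fit_one_break(y)
```

In the multivariate setting of the paper the same search runs jointly over break dates in several series with a full error covariance, but the profile-likelihood logic is the same.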

    The authors find that, according to statistical criteria that they explain, the time series of forcing measurements and the time series of global temperature have common break points (within a few years of each other), supporting the idea that they are causally related, over the periods 1880–2010 and 1963–2014.

    Another quote: It has been proposed that effects of natural variability modes such as AMO, NAO and PDO were able to mask the warming trend since the 90s, creating the illusion of a slowdown in the underlying warming trend (e.g., Guan et al., 2015; Steinman et al., 2015; Li et al., 2013; Trenberth and Fasullo, 2013). However, Estrada and Perron (2017) argue that low-frequency oscillations do distort the underlying warming trend but cannot account for the current slowdown. Instead, the main effect of these oscillations has been to make it more difficult to detect the drop in the rate of warming, a real feature of the warming trend imparted by the slowdown in the radiative forcing from well-mixed greenhouse gases.

    Establishing whether the ‘‘hiatus’’ period is statistically significant is important because it would counter the widely held view that it is the product of natural internal variability (Kosaka and Xie, 2013; Trenberth and Fasullo, 2013; Meehl et al., 2011; Balmaseda et al., 2013). Using standard tests (e.g., Perron and Yabu, 2009), the results are mixed across various series and sometimes borderline. Our aim is to provide tests with enhanced power by casting the testing problem in a bivariate framework involving temperatures and radiative forcing. We consider bivariate systems with one temperature series and one forcing variable. We use the LR test for the presence of a break in temperature series given the presence of a structural break in radiative forcing series.

    Coauthors Estrada and Perron have other publications on this topic.

    They conclude that the hiatus was coincident with a change in the forcing regime, hence not a part of “natural internal variability”.

    To me, this is a good and important paper. The main limitation, common to many papers, is that they have no model of a possible long-period background natural variability, with a period of approximately 1000 years, which is inestimable with time series only about one tenth that long.

    Thanks again to Judith Curry for bringing this paper to our attention.

    • Those of you who, like me, are weak on Skorohod topology (aka Skorokhod topology), search on wikipedia for “cadlag”, or “Skorokhod Spaces”.
      (my copy/paste of the address does not seem to work here.)
      What the authors assume is essentially that there is a distance function in the two-dimensional time series space that they are using as a model.

    • Temperature breaks are obvious – and it is not 1963. Using network analysis – they are the result of synchronized chaos between nodes of global climate variability.

      And the changes in temperature primarily emerge from changes in cloud cover associated with shifts in ocean and atmospheric circulation.

      “The top-of-atmosphere (TOA) Earth radiation budget (ERB) is determined from the difference between how much energy is absorbed and emitted by the planet. Climate forcing results in an imbalance in the TOA radiation budget that has direct implications for global climate, but the large natural variability in the Earth’s radiation budget due to fluctuations in atmospheric and ocean dynamics complicates this picture.” https://link.springer.com/article/10.1007/s10712-012-9175-1

      • Robert I Ellison: Temperature breaks are obvious – and it is not 1963. Using network analysis – they are the result of synchronized chaos between nodes of global climate variability.

        Different methods of analysis produce different results. Who knew?

      • Can’t tell the difference between observation and model?

      • Robert I Ellison: Can’t tell the difference between observation and model?

        Of which model are you speaking, Robbie: the model produced by the network analysis, the model produced by the bivariate change point analysis, or another not yet mentioned? Each is compared to observations. Maybe you mean the poorly articulated “eyeball model”, which nearly always fails to give an accurate representation of reality when there are random measurement errors (probably autocorrelated) on a system composed of linked chaotic processes.

      • Temperature and cloud. Network math merely confirms what is evident in temperature data – cloud observations show how it works. Obvious if science is substituted for statistics.

        e.g. https://www.sciencedaily.com/releases/2013/08/130822105042.htm

      • Robert I Ellison: Temperature and cloud. Network math merely confirms what is evident in temperature data – cloud observations show how it works. Obvious if science is substituted for statistics.

        If you model the trajectory of temperature since the late 1800s as a piecewise linear function of time, then the breakpoints are reasonably accurately obtained by eyeballing, as in the third figure of your post:
        Robert I. Ellison | June 18, 2019 at 5:14 pm |

        Such a model is not a veridical representation of any of the processes generating the temperature record. For an accurate veridical model you need at least 3 components:

        A: a (likely monotonic, non-decreasing) function of time, which Kim et al treat as piecewise linear;

        PLUS

        B: an oscillatory process generated by the linked chaotic systems that have been at least partly quantified: ENSO, PDO, Stadium Wave, AMOC, etc.;

        PLUS

        C: an autoregressive process representing all the random background variation (variation that is non-predictable).

        Estimating breakpoints in A, given B & C, is the task undertaken by Kim et al. It requires a sophisticated mathematical/statistical analysis of non-stationary bivariate processes; maximum-likelihood estimation of the parameters (computationally intensive for a model this complex); bootstrapping to obtain reasonably accurate p-values of tests of null hypotheses and confidence intervals on estimands.
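The bootstrapping step can be illustrated in a similarly stripped-down univariate form. The sketch below is entirely synthetic and holds the break date fixed rather than maximizing over it, so it is only a cartoon of the real procedure: compute a likelihood-ratio-type statistic for a slope change, then bootstrap its null distribution by resampling residuals of the no-break fit.

```python
import numpy as np

def lr_stat(y, t, k):
    """Likelihood-ratio-type statistic: no-break linear trend vs.
    a joined segmented trend with a slope change at date k."""
    X0 = np.column_stack([np.ones_like(t), t])
    X1 = np.column_stack([X0, np.maximum(t - k, 0.0)])
    def ssr(X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return np.sum((y - X @ beta) ** 2)
    return len(y) * np.log(ssr(X0) / ssr(X1))

rng = np.random.default_rng(3)
t = np.arange(100.0)
# synthetic series with a genuine slope change at t = 50
y = 0.02 * t + 0.1 * np.maximum(t - 50.0, 0.0) + rng.normal(0.0, 0.1, 100)
obs = lr_stat(y, t, 50)

# bootstrap the null: resample residuals of the no-break fit
X0 = np.column_stack([np.ones_like(t), t])
b0, *_ = np.linalg.lstsq(X0, y, rcond=None)
resid = y - X0 @ b0
boot = [lr_stat(X0 @ b0 + rng.choice(resid, size=resid.size, replace=True), t, 50)
        for _ in range(200)]
pval = float(np.mean([b >= obs for b in boot]))
```

With the strong injected slope change the bootstrap p-value comes out near zero; the full method additionally searches over break dates and handles the multivariate error structure, which is where the computational burden lies.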

        Eyeballing the temperature record, and the network analysis that you reference, are seriously underpowered for finding the breakpoints of a piecewise linear representation of the signal modeled by A; they are very unlikely to produce a veridical representation of the processes generating the observations. Including cloud cover in the model might be informative, but is hardly sufficient given A, B, and C.

      • But here we have both cause – albedo – and effect – global average temperatures. And very obvious multi-decadal climate shifts.

      • Robert I Ellison: And very obvious multi-decadal climate shifts.

        Those are the results of the natural oscillations, B in my outline. Kim et al estimated something more subtle: a change in the monotonic upward component (hypothetically driven by anthropogenic CO2, which they include in their model, though any hypothetical driver could be included) of the temperature trajectory, given the presence of natural oscillations and autoregressive noise.

        If there is an anthropogenic component to the temperature increase since the late 1800s, it can only be detected via an accurate model that includes accurate estimates of the background variation — about which much has been learned in the last 20 years, and even just as I have been reading and commenting here (e.g. Stadium Wave); Kim et al have made a major contribution to such an effort. The network analysis that you linked to years ago was a major contribution as well, but without explicit models for a hypothetical anthropogenic signal and autoregressive noise, it is too simple to elucidate any potential anthropogenic influence in the temperature increase.

      • These are physical systems with multiple consilient data series. And the multi-decadal, stadium wave component is about half the warming of the past 40 years.

        https://www.nature.com/articles/s41612-018-0044-6

      • Makes me wonder about the impact of decadal to millennial internal variability

      • And what they did was reject internal variability in favor of common – if unphysical – breakpoints in temperature and external forcing.

      • Robert I Ellison: And what they did was reject internal variability

        Robbie, that is not true. You did not read the article, did you.

      • JudithCurry: Makes me wonder about the impact of decadal to millennial internal variability

        Really reliable time series for testing the hypothesis that Kim et al tested are only about 10% of a millennium; long-period oscillations require more information than they could incorporate. If the upswing in temps since the late 1800s is caused by the increase in a process with a period of 1000 years, as I think possible, then it is going to take a long time to develop statistical methods to distinguish between the effects of such a process and the effects of the monotonic rise in CO2.

        For the decadal oscillations the authors say (second sentence; first included for completeness): The annual temperature data used are from the HadCRUT4 (1850–2014) (http://www.metoffice.gov.uk/hadobs/hadcrut4/data/current/download.html) and the GISS-NASA (1880–2014) datasets (http://data.giss.nasa.gov/gistemp/). The Atlantic Multidecadal Oscillation (AMO) and the North Atlantic Oscillation (NAO) series (1856–2014) are from NOAA: (http://www.esrl.noaa.gov/psd/data/timeseries/AMO/) and (http://www.esrl.noaa.gov/psd/gcos_wgsp/Timeseries/NAO/).

        Other data include: As stated above, for global temperatures, we also use the data from Berkeley Earth (Rohde et al., 2013) and the dataset in Karl et al. (2015). We also use series from databases related to climate model simulations by the Goddard Institute for Space Studies (GISS-NASA). The radiative forcing data obtained from GISS-NASA (https://data.giss.nasa.gov/modelforce/; Hansen et al., 2011) for the period 1880–2010 include the following (in W/m2): well-mixed greenhouse gases, WMGHG, (carbon dioxide, methane, nitrous oxide and chlorofluorocarbons); ozone; stratospheric water vapor; solar irradiance; land use change; snow albedo; stratospheric aerosols; black carbon; reflective tropospheric aerosols; and the indirect effect of aerosols. The aggregated radiative forcing series are constructed as follows: WMGHG is the radiative forcing of the well-mixed greenhouse gases and has a largely anthropogenic origin; Total Radiative Forcing (TRF) is WMGHG plus the radiative forcing of ozone, stratospheric water vapor, land use change; snow albedo, black carbon, reflective tropospheric aerosols, the indirect effect of aerosols and solar irradiance.

        I called this paper a good step forward. Hopefully next steps will include reliable data on PDO, ENSO, and others. This paper is the latest in a series, and hopefully the authors will extend the series.

      • Kim et al filtered out the effects of AMO and NAO:
        The temperature series are affected by various modes of natural variability such as the AMO and NAO, which are characterized by low frequency movements (Kerr, 2000; Hurrell, 1995). Since trends and breaks are low frequency features, it is important to purge them from the temperature series allowing more precise estimates of the break dates.2 Other high frequency fluctuations in temperature series do not affect the precision of the estimates of the break dates and the magnitudes of the changes in slope. Accordingly, we filter out the effect of these modes of variability by regressing each temperature series on these modes and a constant. Since the effect of natural variability might have occurred with a time lag, we choose an appropriate lag using the Bayesian Information Criterion (BIC); Schwarz (1978). The candidate regressors for the filtering are the current value and lags (up to order kmax − 1) of AMO and NAO. We first work with G from HadCRUT4. We start with kmax = 2, so that the candidates are the current value and the first lag only. BIC chooses the current value of AMO and the first lag of NAO. Since the maximum lag allowed is selected, we increase kmax to 4. Then, BIC chooses the current value of AMO and the second lag of NAO (the first is not included given that we search over models not necessarily including all lags up to some chosen order). When applying the BIC, the number of observations used is limited by kmax (e.g., Perron and Ng, 2005). Having decided on the current value of AMO and the second lag of NAO, we apply the filtering to all available observations, not limited by kmax. We could repeat the same procedure to each series. However, it does not make much sense to have the same mode affect each temperature series with a different lag. Hence, we filter all series with the current value of AMO and the second lag of NAO.3 The filtered temperature series are denoted as ˜ G, Ñ and ˜S. Fig. 
1 presents graphs of the original and filtered series.
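As a concrete (and heavily simplified) illustration of that filtering step: regress a temperature-like series on the contemporaneous AMO index and the second lag of NAO, then keep the residuals. The series and coefficients below are synthetic inventions for the example, and the BIC lag search described in the quote is omitted.

```python
import numpy as np

def filter_modes(temp, amo, nao, nao_lag=2):
    """Regress temperature on contemporaneous AMO and lagged NAO;
    return the filtered series (residuals) and fitted coefficients."""
    y = temp[nao_lag:]                       # drop the first nao_lag years
    X = np.column_stack([np.ones_like(y), amo[nao_lag:], nao[:-nao_lag]])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta, beta

# synthetic check: temperature = trend + 0.5*AMO(t) + 0.3*NAO(t-2) + noise
rng = np.random.default_rng(4)
n = 165                                      # roughly 1850-2014, annual
amo = rng.normal(0.0, 1.0, n)
nao = rng.normal(0.0, 1.0, n)
temp = (0.005 * np.arange(n)
        + 0.5 * amo
        + np.r_[0.0, 0.0, 0.3 * nao[:-2]]
        + rng.normal(0.0, 0.05, n))
filtered, beta = filter_modes(temp, amo, nao)
```

The regression recovers coefficients near the injected 0.5 and 0.3, and the residuals retain the trend (and any breaks), which is the point of the filtering.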

      • One problem is that AMO, etc are not easily filterable, since the external forcing projects on these modes

      • Judith Curry: One problem is that AMO, etc are not easily filterable, since the external forcing projects on these modes

        Do you have a specific criticism of their specific method? They do not claim that it was “easy”.

  21. Ireneusz Palmowski

    Cold air does not leave the North Atlantic.
    https://earth.nullschool.net/#2019/06/23/0600Z/wind/isobaric/700hPa/overlay=temp/orthographic=-29.33,55.05,653
    Unfavorable conditions for the formation of hurricanes in the Atlantic.

  22. ‘over 30% of PhD students develop a psychiatric condition. This is a higher rate than for people working in defence and emergency services, which is about 22%’

    Anonymous writes in the Guardian

    “…the organisation (sic) provides infrastructure and services. With these resources, hundreds of academics are then free to pursue their goals and further their own positions as quasi-entrepreneurs.”

    I am unaware of a research university or research establishment writing a blank check for someone to come in and kinda do-what-you-want. Rather, a scientist enters a research establishment with a source of funding of their own to pay for their research and a portion of their own salary. In the US, government funding is taxed by universities at 45%+ for “administration” expenses. The scientist then contracts with the Department Chair on responsibilities serving the Department and the larger establishment, frequently including teaching, committees, and “public outreach”. The ascent from Instructor to Assistant Professor to Associate Professor to Professor has timelines set by the tenure system. One can’t remain in the lower ranks forever.

    The pressures on the scientists are to write grants, perform the research for which the grant was written, write reports and publications, attend national and governmental conferences while chairing some of them, and recruit future trainees (graduate students and post-doctorates), all the while delegating undergraduate teaching to said trainees. Undergraduate ranks are the resource for future graduate students, who will not flock to one’s door if they don’t believe they will be treated fairly. Hence, they have to connect in a positive way with the senior research scientist, usually on a one-to-one basis.

    Today’s research scientist is not the lonely figure in a lab surrounded by test tubes, some gadget he/she developed to measure something arcane. Today’s research scientist is a CEO of an enterprise. They need to learn how to be a CEO just like every other CEO. The road to success is bumpy and twisted given the trek is further complicated by human foibles.

    Anonymous writes from a naive perspective. Maybe a specialist in Medieval High German literature can be a solitary soul although I don’t see many positions for such advertised.

  23. Drought and Famine in India 1870-2016
    “Using station‐based observations and simulations, we reconstruct soil moisture (agricultural) drought in India for the period 1870–2016. We show that over this century and a half period, India experienced seven major drought periods (1876–1882, 1895–1900, 1908–1924, 1937–1945, 1982–1990, 1997–2004, and 2011–2015) based on severity‐area‐duration analysis of reconstructed soil moisture. Out of six major famines (1873–74, 1876, 1877, 1896–97, 1899, and 1943) that occurred during 1870–2016, five are linked to soil moisture drought, and one (1943) was not. The three most deadly droughts (1877, 1896, and 1899) were linked with the positive phase of El Niño–Southern Oscillation.”

    https://agupubs.onlinelibrary.wiley.com/doi/abs/10.1029/2018GL081477

  24. Judith,
    I think this is something to keep an eye on:
    https://www.climatechange.ai/
    Coverage of their just finished 6/14/2019 conference here:
    https://slideslive.com/38917142/climate-change-how-can-ai-help

    Meanwhile the IBM A.I. debater program is getting better. Maybe we should have an A.I. Red/Blue team debate?
    https://www.expresscomputer.in/artificial-intelligence-ai/ibms-ai-debating-system-argues-for-and-against-cannabis-legalisation/36905/

    • Some interesting work is being done using ML to study climate change.
      One thing machine learning does well is create mathematical equations that describe patterns in data. One complaint frequently cited as a weakness of GCMs is the use of parameterizations to substitute for the finite computational resources needed to do high-resolution decadal simulations. One possible solution would be to incorporate these ML processes at scale, so that each cell of the grid can use its own custom dynamic equation to better model these chaotic processes.
      Recovering the parameters underlying the Lorenz-96 chaotic dynamics:
      https://www.climatechange.ai/CameraReady/70/CameraReadySubmission/icml_l96_theta_edited_v5.pdf
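For anyone curious what parameter recovery on Lorenz-96 can look like, here is a minimal sketch (my own toy version, not the method of the linked paper): simulate the standard 40-variable Lorenz-96 system, then recover its forcing parameter F from the trajectory using the known model structure and finite-difference tendencies.

```python
import numpy as np

def l96_rhs(x, F):
    """Lorenz-96 tendency: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def integrate(x0, F, dt, steps):
    """Fixed-step RK4 integration; returns the full trajectory."""
    traj = np.empty((steps + 1, x0.size))
    traj[0] = x0
    x = x0
    for i in range(steps):
        k1 = l96_rhs(x, F)
        k2 = l96_rhs(x + 0.5 * dt * k1, F)
        k3 = l96_rhs(x + 0.5 * dt * k2, F)
        k4 = l96_rhs(x + dt * k3, F)
        x = x + dt * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0
        traj[i + 1] = x
    return traj

dt, F_true = 0.01, 8.0
rng = np.random.default_rng(0)
traj = integrate(rng.standard_normal(40), F_true, dt, 2000)

# central-difference tendencies, then solve for F from the model structure:
# F = mean( dx/dt - advection + x ) over all grid points and times
dxdt = (traj[2:] - traj[:-2]) / (2.0 * dt)
x = traj[1:-1]
adv = (np.roll(x, -1, axis=1) - np.roll(x, 2, axis=1)) * np.roll(x, 1, axis=1)
F_est = float(np.mean(dxdt - adv + x))
```

Here the functional form is known exactly, so recovery is easy; the interesting ML problem starts when the form of the sub-grid terms is unknown and must itself be learned.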

    • Curious George

      It is a very modern hope: If we don’t understand it, maybe computers can.

      • Or maybe the reverse is true? Neural networks were designed based on our understanding of how our brains store, retrieve and modify information.
        It’s entirely possible there are other algorithms that can generate machine intelligence that use a completely different schema than our biological models.
        Example:
        One could then imagine a “general” intelligence as simply an algorithm that is extremely good at matching the task you ask it to perform to the specialized service algorithm that can perform that task. Rather than acting like a single brain that strives to achieve a particular goal, the central AI would be more like a search engine, looking through the tasks it can perform to find the closest match and calling upon a series of subroutines to achieve the goal.
        https://singularityhub.com/2019/06/02/less-like-us-an-alternate-theory-of-artificial-general-intelligence/

  25. https://journals.ametsoc.org/doi/pdf/10.1175/JCLI-D-18-0446.1

    This study identifies the AMO role in Northern Hemisphere extreme temperatures

  26. “An extreme drought was observed in the year 1707. Also, the years 1705, 1706, 1784, 1786, 1809, 1810, 1813, 1821, 1849, 1858, 1861, 1909, 1967, 2006, and 2009 experienced severe drought conditions. Consecutive years with moderate drought were 1702–1706, 1783–1789, 1796–1798, 1812–1814, 1816–1827, 1846–1848, 1856–1864, 1873–1877, 1879–1882, 1899–1901, 1908–1911, and 1967–1968. Three historic mega-drought events that occurred in Asia were also captured in our reconstruction: Strange Parallels drought (1756–1768), the East India drought (1790–1796), and the late Victorian Great Drought (1876–1878). Very few wet years (1776–1979, 1989, 1991, and 2003) were observed during the reconstruction period.”

    https://www.researchgate.net/publication/328624704_Drought_scPDSI_reconstruction_of_trans-Himalayan_region_of_central_Himalaya_using_Pinus_wallichiana_tree-rings

    This study covers droughts in the trans-Himalayan region, providing more evidence that droughts were extensive before 1950.

  27. https://file.scirp.org/pdf/ACS_2019010914482656.pdf
    “In this paper, we have found that the dramatic upward rising signals can be perfectly fitted with periodic functions, which suggests that the major climate factors can still be the main reason for the recent global climate warming, and the secondary climate factor such as anthropogenic emissions might be the secondary reason.”
    This paper takes on the control-knob theory head on.

    • I would be wary, though, of papers published in a SCIRP journal: a Chinese vanity publishing operation.

      • Noted.

        I have only cursory knowledge of papers concerning solar impact on climate. Since the establishment has given persona non grata status to studies of the sun, I was surprised a couple of years ago when I accumulated a list of over 200 solar papers. I’m sure there were even some whose authors were not on the take from big oil. :)

        I’ve linked a few more recent papers here. I’m curious if you have seen these papers and what your views are about the quality of the papers.

        https://www.hindawi.com/journals/aa/2019/1214896/

        https://www.ias.ac.in/article/fulltext/joaa/040/02/0011

        https://www.sciencedirect.com/science/article/pii/S1364682618301469

        https://www.mdpi.com/2073-4433/10/1/29/htm

      • cerescokid,

        The Scafetta-Wilson article is part of a long controversy over two TSI reconstructions that differ in their long-term trend. The bottom line is whether solar activity has been decreasing while temperature has been increasing, or not. Scafetta defends several controversial points of view in solar physics and climate, but he is well known and, whether correct or not, there are no quality issues.

        The Zao et al. 2019 article is just a wavelet analysis of TSI. It is not a good paper and the significance of the periodicity found remains to be determined. Low frequency periodicities in the Sun have been studied and there are several of them that are well characterized in multiple solar-related phenomena. The main ones are the 1.3 and 1.7-year periodicities. See for example:
        Ruzmaikin, A., Cadavid, A.C. and Lawrence, J., 2008. Quasi-periodic patterns coupling the Sun, solar wind and the Earth. Journal of Atmospheric and Solar-Terrestrial Physics, 70(17), pp.2112-2117.
        https://www.sciencedirect.com/science/article/pii/S1364682608002526

        For Zherebtsov et al. 2019, the journal is good but I have problems with the article. The sunspot graph in figure 1 is incorrect. Figures 2 and 3 are about a single event. And then they use their own model, based on the Svensmark hypothesis, with the idea that alterations in atmospheric electricity regulate cloud formation and distribution. In my opinion, all very weak and speculative.

        Kushnir & Stein 2019 is a comparison of proxy climate data from the Eastern Mediterranean with reconstructed solar activity. It is moderately interesting and the association of Eastern Mediterranean precipitation with solar activity appears correct. The publisher MDPI is borderline, and has been involved in multiple controversies.
        https://en.wikipedia.org/wiki/MDPI

  28. “We derive the total rainfall, number of raindays, wettest day of the month and the simple daily intensity index for each city over the past 178 years, and find relatively consistent relationships between all indices despite potential data quality issues associated with the historical data. We identify several extreme daily rainfall events in the pre-1900 period in Sydney and Melbourne that warrant further examination as they appear to be more extreme than anything in the modern record”

    More evidence of natural variability when the establishment narrative is otherwise.

    https://www.sciencedirect.com/science/article/pii/S221209471930009X

  29. So now that we know, again, that the Holocene Temperature Maximum is real and was not caused by CO2, and that the cooling since then was not caused by CO2, perhaps we can do something rational, like diverting the exorbitant CO2 research funds to cleaning up the plastic debris in the oceans. That is not only reasonable but possible, neither of which is true for CO2 mitigation.

    • Or is it not possible to convince those whose livelihoods depend on CO2 mitigation to devote themselves instead to ocean sweep-up?

      • aporiac1960

        “is it not possible to convince those whose livelihoods depend on CO2 mitigation to devote themselves instead to …?”

        According to Upton Sinclair it is impossible. I’m sure he’s right. That is not to say their efforts cannot be redirected. The same coercive forces that led them to discover an interest in CO2 mitigation would be equally effective in the service of any other program. I’m not sure that counts as ‘conviction’ in an absolute sense, but in the Upton Sinclair pragmatic sense, it is a difference without a distinction.

  30. Ireneusz Palmowski

    The jetstream will bring heavy thunderstorms to the central US on June 23.
    https://earth.nullschool.net/#2019/06/23/0000Z/wind/isobaric/250hPa/orthographic=-113.01,40.75,862

  31. Ireneusz Palmowski

    This is the stratospheric intrusion forecast this week.

  32. The notion that there are “common breaks in a multivariate system with joined segmented trends” is something one might expect from econometricians accustomed to dealing with the vagaries of man-made market indices. Natural geophysical variables, by contrast, are physical processes that operate continuously–without any abrupt breaks. In any event, far greater insight into the resultant system and its signals is provided by multi-dimensional Fourier analysis than by classic statistical methods, no matter how sophisticated, applied to highly arbitrary constructions of “joined segmented trends.”
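For what it is worth, the Fourier point can be illustrated in a few lines (on a synthetic series with an invented 60-year oscillation, not any real temperature index): detrend, take the FFT, and read off the dominant period.

```python
import numpy as np

# synthetic annual series: linear trend + 60-year oscillation + noise
rng = np.random.default_rng(2)
t = np.arange(180.0)                       # 180 "years"
y = (0.005 * t
     + 0.2 * np.sin(2.0 * np.pi * t / 60.0)
     + rng.normal(0.0, 0.05, t.size))

# remove the secular trend first, so low-frequency leakage does not mask the peak
y_detrended = y - np.polyval(np.polyfit(t, y, 1), t)

freqs = np.fft.rfftfreq(t.size, d=1.0)     # cycles per year
power = np.abs(np.fft.rfft(y_detrended)) ** 2
peak_period = 1.0 / freqs[1:][np.argmax(power[1:])]  # skip the DC bin
```

With only two to three cycles of a multidecadal oscillation in a 150-year record, the spectral peak is broad, which is precisely why the break-versus-oscillation question is hard on real data.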

  33. This important palaeo paper confirms that the Holocene temperature history in central Europe is more similar to the Greenland based proxies, showing a clear early Holocene warm optimum, than to computer models and other biological proxies favoured by Shakun, PAGES2000 and others.

    <>

    So the early Holocene optimum, just like the MWP and LIA, are real.

    In other news (that you won’t have read in the MSM):

    https://notrickszone.com/2019/06/16/antarctic-dome-a-station-sets-new-record-low-82-7c-and-arctic-ice-hasnt-melted-this-decade/

    https://chiefio.wordpress.com/2019/06/13/crop-failure-year-looms/

    https://www.iceagenow.info/nasa-largest-glacier-in-greenland-is-growing-for-third-year-in-a-row/

  34. Syntax fail.
    This is the above link I meant to copy:

    “Our results support the existence of a European Holocene Thermal Maximum and data-model temperature discrepancies.” https://advances.sciencemag.org/content/5/6/eaav3809?utm_campaign=toc_advances_2019-06-07

  35. Wade Allison – The energy revolution must be nuclear
    http://www.onlineopinion.com.au/view.asp?article=20371&page=0
    Excerpt:
    “Today, however, we stand at the threshold of a new energy revolution. The benefits of fossil fuels no longer outweigh the costs, and standard renewables remain as weak and unreliable as before the Industrial Revolution.”

    • “The public accepts the use of nuclear technology for human health; it should do the same for the health of the planet. Yet, although fears of nuclear power have no scientific basis – indeed, nuclear power is far safer than any other energy source – they pervade public policy, with the risks often being fictionalized for the sake of entertainment.

      Our future, and the health of our environment, requires us to change course and embrace nuclear power. To support this shift, more comprehensive and accurate education should be provided, both for the general public and for today’s young people who one day will be building and operating nuclear power stations throughout the world. At the same time, much of the precautionary bureaucracy put in place during the Cold War needs to be reconsidered.

      In terms of safety, reliability, efficiency, and environmental friendliness, nuclear energy is the best candidate to replace fossil fuels. Without it, the energy revolution the world urgently needs will never happen.”

  36. Ireneusz Palmowski

    The production of these maps is possible thanks to the free public data policy of the United States of America. My thanks go to the National Centers for Environmental Prediction and National Weather Service for their reliable service providing global numerical weather prediction data.

  37. One wonders why…
    “WMGHG is the radiative forcing of the well-mixed greenhouse gases and has a largely anthropogenic origin; Total Radiative Forcing (TRF) is WMGHG plus the radiative forcing of ozone, stratospheric water vapor, land use change; snow albedo, black carbon, reflective tropospheric aerosols, the indirect effect of aerosols and solar irradiance.”
    1. “radiative forcing of the well-mixed greenhouse gases and has a largely anthropogenic origin” – Most of the well-mixed GHG is CO2, and most of that – 95%? – is of non-human origin
    and
    2. stratospheric water vapor is mentioned but tropospheric water vapor is not – tropospheric water vapor being 99% of the water vapor in the atmosphere and responsible for 60% – 90% of GHG effect, depending on your politics, as well as for – after condensation – most of the albedo of the globe.

    That sounds like an incantation that might begin a convocation of true believers. Scientists, skeptics, are supposed to be better than that. For shame!

  38. There really was a hiatus in global warming

    More confirmation here:

    K. Haustein, F.E. Otto, V. Venema, P. Jacobs, K. Cowtan, Z. Hausfather, R.G. Way, B. White, A. Subramanian, and A.P. Schurer, “A limited role for unforced internal variability in 20th century warming.”, Journal of Climate, 2019. http://dx.doi.org/10.1175/JCLI-D-18-0555.1

    from whom: “Overall, our results support previous work that has shown that using updated external radiative forcing (Huber and Knutti 2014; Schmidt et al. 2014) and accounting for ENSO-related variability explains the so-called ‘hiatus’.”

    It would seem obvious that if the so called “hiatus” has been [explained], then it must have occurred.

    h/t RealClimate

  39. Ireneusz Palmowski

    A dangerous storm front in Louisiana, Texas and Arkansas.
    https://www.accuweather.com/en/us/ruston-la/71270/weather-radar/333433

  40. Strengthening tropical Pacific zonal sea surface temperature gradient consistent with rising greenhouse gases

    As exemplified by El Niño, the tropical Pacific Ocean strongly influences regional climates and their variability worldwide. It also regulates the rate of global temperature rise in response to rising GHGs. The tropical Pacific Ocean response to rising GHGs impacts all of the world’s population. State-of-the-art climate models predict that rising GHGs reduce the west-to-east warm-to-cool sea surface temperature gradient across the equatorial Pacific. In nature, however, the gradient has strengthened in recent decades as GHG concentrations have risen sharply. This stark discrepancy between models and observations has troubled the climate research community for two decades. Here, by returning to the fundamental dynamics and thermodynamics of the tropical ocean–atmosphere system, and avoiding sources of model bias, we show that a parsimonious formulation of tropical Pacific dynamics yields a response that is consistent with observations and attributable to rising GHGs. We use the same dynamics to show that the erroneous warming in state-of-the-art models is a consequence of the cold bias of their equatorial cold tongues. The failure of state-of-the-art models to capture the correct response introduces critical error into their projections of climate change in the many regions sensitive to tropical Pacific sea surface temperatures.

    • It’s very reassuring that models can be tweaked to correspond to past data. Now all we have to do is make them predict reliably. That’s Freeman Dyson’s opinion, not just mine.