
Will the Oroville Dam survive the ARkStorm?

by David Hagen

Should California plan for permanent drought or climate persistence?

After six years of drought, California was deluged in 2017 by sequential “atmospheric rivers” causing the highest January/February rainfall on recent record. California’s Oroville Dam experienced failures of both its primary and emergency spillways. The accumulated debris even blocked the residual hydropower flow. That triggered an emergency drawdown and round-the-clock emergency spillway repairs, at the cost of destroying the lower main spillway. A modeled larger ARkStorm (“Atmospheric River 1000 [year] Storm”), or one of the seven larger megafloods that have recurred about every 270 years, would likely cause California 350% or more of the damage of a 7.8 magnitude San Andreas fault ShakeOut earthquake scenario! These very real dangers of breaching the US’s highest dam, with consequent catastrophic flooding, confront us with challenging issues:

1. Are dams designed for millennial confluences of cyclones, weather fronts, and thunderstorms?
2. How many dams will be breached by the next “Pineapple Express” driven AD 1862 or 1605 level megaflood?
3. How much does human global warming add to such natural variations with climate persistence?
4. Will California rely on fragile central planning or develop robust distributed resilience?

To explore these issues, let’s examine the Oroville Dam failures and their larger context.

Oroville Dam Spillway Failures

On Feb. 12th, 2017, California’s authorities ordered the astonishing emergency evacuation of 188,000 people in Butte, Sutter, and Yuba counties, in the flood path of the Oroville high dam (Mercury News 2017a, DWR 2017a, NDPD 2017). That was triggered by a flash warning that catastrophic failure of the emergency spillway could occur within 45 minutes due to rapid erosion by the “managed” overtopping flow – right after calming official assurances that there was no danger of dam failure.

This is soberingly similar to official assurances that the Soviet-designed “iron dam” at China’s Banqiao was invincible (Si 1998). Officials had even authorized retaining another 32 million cubic meters of water above the dam’s safe design capacity. Yet some 171,000 to 230,000 people died in the 1975 catastrophic failure of China’s Banqiao and Shimantan Dams, deluged when Super Typhoon Nina was blocked by a cold front. See Britannica, Fish (2013), Si (1998), Ingomar200 (2013). That, with the domino failure of sixty downstream dams, displaced eleven million people. An overflow-caused catastrophic breach of the USA’s highest dam is thus no empty threat.

Built on “solid rock”, Oroville Dam’s emergency spillway had an official design capacity of 350,000 cubic feet per second (cfs) with a 17 ft (5.2 m) deep overflow – 350% of Niagara Falls’ normal treaty flow of 100,000 cfs (Niagara Parks). Yet on Feb. 12th, a flow of 12,000 cfs – just 3.4% of design – began rapidly eroding the “weathered fractured” schist (metamorphic rock) of the downhill part of the emergency spillway, progressing upstream towards its concrete ogee weir (Boxall 2017, MetaBunk 2017).

The chaotic Oroville emergency evacuation of 188,000 people took 7.5 hours! Yet should Oroville Dam actually fail, downstream residents would not be warned in time (AP 2017). Geophysicist Steven Ward modeled floods from partial breaches to full failure of Oroville’s Dam. Ward found that even a 20 m (66′) deep partial breach extending 400 m (1310′) along Oroville’s emergency spillway ogee weir would flood the nearest SR70 escape route within 35 minutes, and flood Yuba City in 18 hours. Like the sudden Banqiao flood, many downstream residents would likely drown.

Regulatory Failures?

Have government regulators been protecting the public welfare? Or their jobs? The day after Oroville’s emergency evacuation, the US Federal Energy Regulatory Commission (FERC) ordered California’s Dept. of Water Resources (DWR) to:

“initiate immediate design of emergency repair to minimize further degradation of both the emergency spillway and the service spillway. In addition, DWR shall convene an Independent Board of Consultants (BOC).” (FERC 2017).

During Oroville’s 2005 relicensing, Stakeholder EE56 had specifically asked:

“EE56 – – prepare flood inundation maps for a 1997(?) worse case with 300,000 cfs coming out of the dam’s normal and emergency spillways. In 1997, it is believed that Oroville storage was almost to a point where the 300,000 cfs of inflow was going to pass through the reservoir. DWR was making plans to evacuate the power plant. The 300,000 would have topped the levees and put 10 feet of water into the town of Oroville.”

Yet Oroville Dam was designed and authorized to operate with the unhardened emergency spillway. If the reservoir was already above the lowered design level, then some or all of the peak design inflow of 960,000 cfs would flow out, overtopping the emergency spillway (DWR 2005a, FERC 2007a, FERC 2007b). An actual breach of the main dam would cause a roughly 100-fold higher outflow of about 32 million cfs (~900,000 m3/s; see the quick check after the quote below). That would raise the Feather River 144 ft (44 m) above its bed in Oroville (Ward 2017). Why did regulators not investigate in 2005, with such urgency and thoroughness, the recommendations by Friends of the River, the Sierra Club and the South Yuba Citizens League:

“to require that the dam’s emergency spillway be armored with concrete rather than remain as just a hillside”? (Mercury News 2017b)
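As a quick sanity check on those outflow figures (a minimal sketch; all flow values are those quoted above, and only the standard cfs-to-m³/s conversion factor is added):

```python
# Sanity-check the quoted breach outflow: ~32 million cfs should be roughly
# 900,000 m3/s, and roughly 100-fold the emergency spillway's 350,000 cfs
# official design capacity. Flows are those quoted in the article; only the
# exact cfs-to-m3/s conversion factor is added here.
CFS_TO_M3S = 0.0283168                    # cubic feet/s to cubic meters/s

breach_cfs = 32e6                         # quoted main-dam breach outflow
spillway_design_cfs = 350_000             # emergency spillway design capacity

print(f"Breach outflow: {breach_cfs * CFS_TO_M3S:,.0f} m3/s")          # ~906,000
print(f"Ratio to spillway design: {breach_cfs / spillway_design_cfs:.0f}x")  # ~91
```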

The Probable Maximum Flood design only considered “non-operation of one and two spillway gates.” The DWR chose “not to pursue installing Obermeyer gates, or any other structure requiring human or mechanical operation, on the emergency spillway.”

Why has Oroville’s main spillway now failed, needing massive rebuilding – after so many “satisfactory” regulatory inspections? Why had the emergency spillway never been tested? Why were dual main and emergency spillway failures, combined with hydropower flow blockage, not addressed during relicensing?

Were Oroville’s severe main and emergency spillway failures and the emergency evacuation merely bad management by bungling bureaucrats? Did federal law transferring DWR’s “authority to prescribe flood protection measures at the Oroville Facilities to USACE” cause that danger to “fall through the cracks”?

Have regulators been blinded by the mantra of “permanent drought” caused by “indisputable” “climate change”? Is this just another case of “extreme weather” caused by “catastrophic majority anthropogenic global warming”?

OR does this reveal a systemic failure to explicitly prepare for worst case conditions, including a failure tree of all possible failures? Like the Banqiao regulators, do these failures unveil unfamiliarity with the underlying natural phenomenon of “climate persistence”?

Have regulators fully addressed sequential “Pineapple Express” atmospheric rivers, with cyclones colliding with weather fronts and thunderstorms (NASA 2017, NOAAa, NOAAb)? Will these cause more frequent “perfect storms” than expected?

Climate Persistence

“Seven years of great abundance are coming throughout the land of Egypt, but seven years of famine will follow them.” – Joseph

Why were Pharaoh’s wise men unable to anticipate the seven year bounty and seven year famine that Joseph interpreted and predicted? After six years of “perpetual” drought, California is now deluged with the highest precipitation for January and February on human record (Sacbee 2017b). Have dam designers, regulators, and climate modelers fully grappled with the “climate persistence” that Joseph warned of and planned for?


Such natural extremes from climate persistence were quantitatively modeled by Harold E. Hurst (1951) in his breakthrough hydrological analysis of the 813 year record of Nile river flows (Rikert 2014). Russian mathematician A.N. Kolmogorov (1940) also published theoretical models. Demetris Koutsoyiannis and ITIA colleagues at the National Technical University of Athens have systematically extended Hurst-Kolmogorov dynamics. Wyatt and Curry have further quantified the “Stadium Wave” sequence of oceanic/atmospheric oscillations around the earth (Wyatt 2013).
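For readers who want to experiment, here is a minimal sketch of Hurst’s rescaled range (R/S) analysis, the method behind his Nile result. The series below is synthetic white noise, so the estimate should land near H = 0.5 (small-sample R/S estimates run a little high); persistent records like the Nile famously give H near 0.7–0.9:

```python
# Rescaled-range (R/S) estimate of the Hurst coefficient H.
import numpy as np

def rescaled_range(x):
    """R/S statistic for one window of data."""
    z = np.cumsum(x - x.mean())   # cumulative departures from the window mean
    r = z.max() - z.min()         # range of those cumulative departures
    s = x.std(ddof=1)             # sample standard deviation
    return r / s

def hurst_rs(series, window_sizes):
    """Estimate H as the slope of log(mean R/S) versus log(window size)."""
    log_n, log_rs = [], []
    for n in window_sizes:
        rs = [rescaled_range(series[i:i + n])
              for i in range(0, len(series) - n + 1, n)]
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs)))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

rng = np.random.default_rng(42)
series = rng.normal(size=813)     # synthetic stand-in for an annual flow record
print(f"Estimated H ≈ {hurst_rs(series, [8, 16, 32, 64, 128]):.2f}")
```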


Koutsoyiannis et al. show that natural climate persistence causes consistently higher extreme event frequency than Markovian or “Las Vegas roulette” random “white noise” phenomena. The little known Hurst standard deviations due to Hurst-Kolmogorov dynamics (a.k.a. climate persistence) are much higher than Markovian variations, and typically TWICE as large as the commonly calculated standard deviations of random “white noise” in climate models.


Actual peak climate variations are often more than DOUBLE those confidently predicted by blind application of random statistics. This >200% difference between “climate persistence” and “white noise” climate models is vitally important when designing hydropower dams, ocean breakwaters, and similar power and protective structures, and in modeling climate change.
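A back-of-envelope calculation shows where that “more than DOUBLE” figure can come from, assuming the standard Hurst-Kolmogorov scaling in which the standard deviation of a k-year average falls as k^(H−1), versus k^(−1/2) for white noise; H = 0.9 is an assumed, illustrative Hurst coefficient:

```python
# Compare how the standard deviation of a k-year climate average shrinks
# under white noise (sd ~ k**-0.5) versus Hurst-Kolmogorov scaling
# (sd ~ k**(H-1)). H = 0.9 is assumed for illustration.
H = 0.9
for k in (1, 10, 30, 100):               # averaging scale in years
    sd_white = k ** -0.5                 # white-noise sd of a k-year mean
    sd_hk = k ** (H - 1.0)               # Hurst-Kolmogorov sd of a k-year mean
    print(f"k = {k:>3} yr: HK sd is {sd_hk / sd_white:.1f}x the white-noise sd")
```

At the 30-year scale of standard climate normals this ratio is already near 4x, so a factor of two underestimation is easily reached.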

Have engineers and regulators fully incorporated climate persistence when quantifying the Probable Maximum Flood (PMF)? Such failures to recognize and calculate climate persistence Hurst deviations will likely cause further catastrophic failures of critical infrastructure in California and elsewhere. Will regulators now recognize climate persistence and redesign emergency management for such peak flows from statistically quantifiable natural events?


More Perfect Storms?

Are California’s current storms unusually strong “extreme weather” amplified by human “climate change”? The “perfect storm” that breached Banqiao Dam in China’s Henan province on Aug 5, 1975 was due to Super Typhoon Nina being stopped by a cold front (Britannica 2014, Ingomar200 2013). It dropped the area’s average yearly rainfall in less than 24 hours. That day’s 106 cm (41.7″) of rain dwarfed the 30 cm (11.8″) daily limit the dam’s designers had considered and the 10 cm forecast. Driven by the political pressure of “primacy to accumulation and irrigation” above all else, Banqiao Dam had only 5 of the 12 sluice gates recommended by hydrologist Chen Xing. The deluge of 120% of annual rainfall in 24 hours breached the 387 ft (118 m) high Banqiao Dam and the Shimantan Dam. That in turn destroyed ~62 downstream dams. Zongshu & Nniazu (2011) report that by 2003 China had seen 3,481 dams collapse!

Why did Chinese bureaucrats not prepare for the 1975 deluge – 353% higher than the Banqiao dam’s emergency design level on the Ru river? Authorities claimed the 1975 storm was a 1 in 2000 year event, when the dam was designed for 1 in 1000 year events. Liu et al. (2011) note China’s Hydraulic Design Code is based on a Log Pearson Type 3 distribution with a 100 year return period. Yet 147 dams, including the Banqiao, had already failed from larger floods.
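For illustration, here is a minimal sketch of the Log Pearson Type 3 estimate such a design code prescribes, applied to an invented, seeded 107 year peak-flow record (the record length echoes Oroville’s, but the numbers are synthetic):

```python
# Log Pearson Type 3 (LP3) 100-year flood estimate: fit Pearson III to the
# log10 of annual peak flows, then back-transform the 99% quantile.
# The peak-flow record below is synthetic, purely for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
peaks = rng.lognormal(mean=11.5, sigma=0.5, size=107)  # hypothetical peaks, cfs

logq = np.log10(peaks)
mu, sd = logq.mean(), logq.std(ddof=1)
g = stats.skew(logq, bias=False)          # station skew of the log flows

# Pearson III frequency factor for 99% non-exceedance (the 100-year flood)
K = stats.pearson3.ppf(0.99, g)
q100 = 10 ** (mu + K * sd)
print(f"LP3 100-year flood estimate: {q100:,.0f} cfs")
```

Note that a fit like this extrapolates far beyond a century of data; if the underlying flows exhibit Hurst persistence, the tail is fatter than the fit suggests.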

Riebsame’s (1988) review shows California’s Folsom and Oroville Dams were confidently designed for 500 year Reservoir Design Floods (RDFs) based on the Dec. 1937 rainfall. However, the major 1950 and 1955 rainfalls forced designers to drastically cut Folsom Dam’s RDF by 76%, to 120 years. The Oroville dam design then had to be rapidly resized before being built in 1965 – yet its capacity was still severely stressed in 1986. From 2005 through June 2013, US:

“state dam safety programs reported 173 dam failures and 587 “incidents” – episodes that, without intervention, would likely have resulted in dam failure.” (ASDSO 2017)

Now Dowdy and Catto’s (2017) analysis of storm combinations and extreme weather finds:

“The highest risk of extreme precipitation and extreme wind speeds is found to be associated with a triple storm type characterized by concurrent cyclone, front and thunderstorm occurrences.”

In its Oroville Dam relicensing, DWR (2004) assumed a 1 in 500 year average recurrence period Probable Maximum Flood (PMF). Oroville Dam was designed for an additional 8 day volume of 5,217,300 acre ft (227 billion cu ft, 6.44 billion m3) from 28.9″ precipitation (4.5″ snowmelt + 22.6″ rainfall) on its 3607 sq mi drainage area. The design outflow was calculated from the overtopping and expected failure of the upstream Butt Valley Dam.
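Those design numbers can be roughly cross-checked with simple arithmetic (only standard unit conversions are added; the small surplus over the quoted volume presumably reflects infiltration and other losses in DWR’s runoff model):

```python
# Cross-check DWR's PMF volume: 28.9 inches of precipitation over the
# 3,607 sq mi drainage, versus the quoted 5,217,300 acre-ft 8-day volume.
ACRES_PER_SQMI = 640
area_acres = 3607 * ACRES_PER_SQMI        # 2,308,480 acres
depth_ft = 28.9 / 12                      # inches of precipitation, in feet
gross_volume_af = area_acres * depth_ft

print(f"Gross precipitation volume: {gross_volume_af:,.0f} acre-ft")  # ~5,560,000
print("Quoted PMF design volume:    5,217,300 acre-ft")
```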

In February 2017, a Niagara Falls’ 100,000 cfs flow was discharged through Oroville Dam’s severely damaged spillway – sacrificed to protect and repair the emergency spillway. How would the downstream cities of Oroville, Biggs, Gridley, Live Oak on down to Yuba City handle even a partial failure of Oroville Dam’s emergency overflow as modeled by Steven Ward? How well would they survive 960% of Niagara’s flow, if a damaged Oroville Dam were again filled to overflowing and then breached with the full peak design inflow?

The Coming ARkStorm? And the next California megaflood?


The United States Geological Survey (USGS), with California’s Dept. of Water Resources, led a study modeling a 25 day ARkStorm – the “Atmospheric River 1000 [year] Storm” (USGS 2011, Porter et al. 2011). Will the Oroville Dam survive that larger ARkStorm? OR will downstream Oroville, Biggs, Gridley, Live Oak on down to Yuba City be inundated by catastrophic Oroville dam breaches following main spillway and/or emergency spillway breaches? What then of California’s other 832 “high-hazard dams” (52% of the total) – each of which could cause death on failure (Snibbe 2017)?



Could the Oroville Dam survive an AD 1861-62 deluge that flooded 3 to 3.5 million acres of the Sacramento and San Joaquin valleys (Newbold 1991)? Ingram (2013) describes this 43 day storm that flooded central and southern California for up to 6 months (Maximus 2017).


“One-quarter of the state’s estimated 800,000 cattle drowned in the flood, . . . One-third of the state’s property was destroyed, and one home in eight was destroyed completely or carried away by the floodwaters.”


Porter et al. (2011) further warned that seven even larger megastorms had hit California in the last 1,800 years. What then if it were hit by another AD 1605 scale megaflood, which was 50% greater than the 1861 flood? Are we fully accounting for natural ranges of “extreme weather”? Or are regulators fixated on “climate models”? Are regulators accounting for California’s 1800 year record of megafloods about every 270 years?


The nearly 50″ rainfall of 1861-62 was followed by a drought with little more than 4″ for 1862-63! Benson et al. (2002) found “during the past 2740 yr, drought durations ranged from 20 to 100 yr and intervals between droughts ranged from 80 to 230 yr.” Ingram et al. (2005) document previous droughts and floods. These suggest California’s recent 6 year drought is unremarkable.


Lamontagne (2014) analyzed California rainfall records, including the 107 year record at Oroville. The 2017 storms are now raising concerns about the next ARkStorm (Dowd 2017). That “Atmospheric River 1000 [year] Storm” (ARkStorm) could cause more than 300% of the damage from a 7.8 magnitude San Andreas fault ShakeOut earthquake scenario (Perry et al. 2008):


“The ARkStorm storm is patterned after the 1861-62 historical events but uses modern modeling methods and data from large storms in 1969 and 1986. . . . Megastorms are California’s other Big One. A severe California winter storm could realistically flood thousands of square miles of urban and agricultural land, result in thousands of landslides, disrupt lifelines throughout the state for days or weeks, and cost on the order of $725 billion. This figure is more than three times that estimated for the ShakeOut scenario earthquake, that has roughly the same annual occurrence probability as an ARkStorm-like event.”


The USGS (2011) in its ARkStorm Scenario warns:


“An ARkStorm is plausible, perhaps inevitable: The geologic record shows 6 megastorms more severe than 1861-1862 in California in the last 1800 years. . . megastorms that occurred in the years 212, 440, 603, 1029, 1418, and 1605, coinciding with climatological events that were happening elsewhere in the world.”


As California sought to restock its reservoirs, depleted by six years of drought, January’s rainfall soared to 232% of average. Lake Shasta rose rapidly from 31% of capacity to 95% in 7 months (August 2016 to February 23, 2017). This is raising concerns for the rest of California’s dams (Snibbe 2017). Engineers tested the 602 ft (183.5 m) tall Shasta Dam’s top overflow gates for the first time and then began dumping water to prepare for the next storm (Serna 2017).


Fearing further damage and failure of Oroville’s main spillway, authorities instead allowed Lake Oroville to rise and overflow Oroville Dam’s emergency spillway (Nazarian 2017). The Lake Oroville reservoir rose much faster in 2016-17 than in 1982-83 or 1977-78, just after falling in 2015 to 33% of capacity from its 6 year drought (DWR 2017d). The resultant severe erosion and potential breach of Oroville Dam’s emergency spillway was then compounded by more atmospheric rivers (Watts 2017a).


Meteorologist R. Maue warned of 10 trillion gallons of rainfall on California (Watts 2017b). The National Oceanic and Atmospheric Administration forecast up to 11″ of rain for the Lake Oroville watershed from Feb. 20-27 (NOAA 2017). Authorities urgently began emergency drawdown through the damaged main spillway, sacrificing the main spillway’s lower portion to prevent a worse failure and breach of the emergency spillway. All hydropower turbine flow was blocked by downstream debris, as well as being shut off to prevent “no load” failure.


Like the Banqiao Dam workers, crews have been working around the clock to patch Oroville Dam’s emergency spillway (Mercury 2017c). By March 16th, the massive $4.7 million/day emergency repair effort had dredged about 1.24 million of the 1.7 million cu yds of eroded debris (while damaging the access roads). That opened a downstream channel large enough to pass 13,000 cfs through Oroville’s Hyatt Power Plant’s hydro turbines, barely keeping up with low inflows (DWR 2017e). Round-the-clock hardening of the main spillway breach and the emergency spillway erosion continues, preparing for oncoming rains.


In just two years, Nevada’s snowpack has gone from near record low in 2015 to a record high in 2017, reaching 17′ at Mt. Rose and outstripping measurement tools (Spillman 2017). California’s North Sierra Precipitation: 8-Station Index for February 2017 was ~293% of average and the cumulative precipitation was running 26% above the wettest 1982-83 year (DWR 2017c). The Feather River Basin had reached 158% of average snowpack, with more snow on the way. California’s average snowpack had reached 186% of normal for March 1st, well above levels that caused its 1983 floods (DWR 2017b).


Will California seriously address such far greater dangers of the next natural 1000 year ARkStorm? Or will it persist in prioritizing mandated “climate change” mitigation (a.k.a. “catastrophic majority anthropogenic global warming”)? Will authorities clearly warn downstream residents that the ARkStorm or the next megaflood will require evacuation, with a high likelihood of major flood damage? And advise that other taxpayers should NOT have to cover that clearly foreseeable risk?


Archaic Design and Operating Requirements


How well was Oroville Dam designed for such California megafloods? Even with sub-megaflood rains, Oroville Dam’s lower spillway has now failed. Could it have tolerated a full 1 in 1000 year Atmospheric River ARkStorm? What of a much larger AD 1605 megaflood overflow? Oroville Dam’s unhardened earthen emergency spillway began to rapidly erode uphill, almost breaching the weir, from just 3.4% of the design overflow. The combined 1.7 million cu yds of debris even blocked the residual 14,000 cfs design flow through Oroville’s Hyatt Power Plant.


Dam experts like Scott Cahill (2017) are now explaining and detailing Oroville Dam’s multiple dangerous failures. Why were none of the current major repairs and necessary upgrades even envisioned in the Oroville Dam relicensing capital and maintenance budgets (DWR 2005b)? Cahill’s (2017) articles further expose Oroville’s numerous coverups and cultural and regulatory failures. Oroville Dam’s flood-control manual hasn’t been updated for half a century (Sacbee 2017a):


“The critical document that determines how much space should be left in Lake Oroville for flood control during the rainy season hasn’t been updated since 1970, and it uses climatological data and runoff projections so old they don’t account for two of the biggest floods ever to strike the region. . . . At Oroville, the manual cites weather patterns prior to the 1950s, and data doesn’t account for the catastrophic floods of 1986 and 1997.”


The mandated Oroville Dam flow control requirements were not designed for even the coming ARkStorm (USGS 2011). The ARkStorm study:


“represents an important wake-up call about the extensive devastation and long-term consequences an extreme event of this magnitude might cause. . . . The ARkStorm scenario raises serious questions about the ability of existing national, state, and local disaster policy to handle an event of this large magnitude.”


Overtopping failure and collapse from a severe storm and neglect caused the May 31, 1889 failure of South Fork Dam near Johnstown, Pa., killing 2,209 people – the largest US disaster until 9/11 (Ward 2011). California had 45 dam failures from 1883 to 1965. Cavitation erosion severely damaged the Glen Canyon spillway in 1983. Aeration was added to correct that problem, and then to the Hoover and Blue Mesa dams (Falvey 1990). Why was the Oroville Dam not retrofitted with protective aeration to prevent erosion per best practice (US Army Corps/DOI 2015, Rogers 2017)?


Yet the ARkStorm study still assumed all requirements by the California Division of the Safety of Dams (DSOD) would be met with only:


“. . . minor spillway damage or erosion in downstream channels. . . . When spillways sit untested for years, then are subjected to continuous flow for an extensive period, damage is possible or even likely.”


The ARkStorm study excluded or hid the dangers of Oroville Dam’s current spillway failures. Instead it assumed:


“. . .A component of this (DSOD) inspection is to verify spillways are unobstructed and fully functional. . . .Because of the extremely sensitive nature of a dam-damage scenario, the selection of a particular dam to imagine as hypothetically damaged in such a way is left to emergency planners.”


Would the Oroville dam survive this much greater 1 in 1000 year ARkStorm? Could today’s conventional planning allow a greater disaster than the Johnstown flood? What then, when California experiences another of its 270 year megafloods? Far better to address that worst case Probable Maximum Precipitation (PMP) up front now than stumble into a predictable disaster greater than 9/11.


Underestimation by Classical Statistics

Do Las Vegas roulette statistics describe nature? OR does nature “bias” the dice by “climate persistence”?

Demetris Koutsoyiannis (2010b) Memory in climate and things not to be forgotten

Demetris Koutsoyiannis summarizes how classical statistics severely underestimate natural extreme weather events, which are dominated by Joseph-like climate persistence (edited personal communication):

“Classical statistics underestimates the standard deviation; the underestimation could be 50% or less, but it could be even (much) higher than that, depending on:

  1. the sample size,
  2. the time scale to which the estimation refers, and
  3. the Hurst coefficient.

There are two factors leading to underestimation. At the basic (lowest) time scale (i.e. that on which the measurements are made) the underestimation is due to statistical bias.

E.g., in “Things not to be forgotten” (Koutsoyiannis 2010b), you will see the explanation of the bias in slides 29-20. Then you may see how this materializes in some real world case studies. For example at the graph in slide 23 at the smallest scale 1 (Log (scale) = 0) there is a huge underestimation as shown in the graph.

As the time scale increases, i.e. as we move from weather to climate, the underestimation inflates, as seen by comparing the HK and “white noise” curves in these graphs. This is the second factor, which represents the effect of aggregating at longer time scales.

See Fig. 9 and its discussion in Markonis & Koutsoyiannis (2013) Climatic Variability . . . . This is reproduced as Fig. 15 in Koutsoyiannis (2013) Hydrology and Change.”
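To make the quoted bias concrete, here is a minimal sketch using one standard form of the Hurst-Kolmogorov bias formula for the classical variance estimator, E[s²] = σ²·n(1 − n^(2H−2))/(n − 1), as given in Koutsoyiannis’s papers; the sample sizes and Hurst coefficients below are illustrative:

```python
# How much of the true standard deviation the classical estimator recovers
# for a Hurst-Kolmogorov process. For white noise (H = 0.5) the estimator
# is unbiased; for persistent records it underestimates sigma.
def sd_bias_factor(n, H):
    """Expected classical sd divided by the true sd for an HK process."""
    return ((1 - n ** (2 * H - 2)) * n / (n - 1)) ** 0.5

for H in (0.5, 0.7, 0.9):
    for n in (50, 100, 1000):
        print(f"H = {H}, n = {n:>4}: classical sd recovers "
              f"{100 * sd_bias_factor(n, H):.0f}% of the true sd")
```

For H = 0.9 and a 50-year record, the classical estimate recovers only about three quarters of the true standard deviation – before the additional inflation at longer aggregation scales that Koutsoyiannis describes.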

Global climate models severely underestimate climate variability through their systemic reliance on Las Vegas roulette statistics. Serrat-Capdevila et al. (2016) observe:

“. . .since uncertainty is a structural component of climate and hydrological systems, Anagnostopoulos et al. (2010) found that large uncertainties and poor skill were shown by GCM predictions without bias correction. . . it cannot be addressed through increased model complexity. . . . Koutsoyiannis (2011) showed that an ensemble of climate model projections is fully contained WITHIN the uncertainty envelope of traditional stochastic methods using historical data, including the Hurst phenomena. . . .the Hurst phenomena (1951) describes the large and long excursions of natural events above and below their mean, as opposed to random processes which do not exhibit such behavior. The uncertainty range in time-averaged natural Hurst-Kolmogorov processes is much greater than that shown in GCM projections and classical statistics.”

Null hypothesis of climate persistence

What will it take to distinguish anthropogenic from natural climatic changes? Taking at face value the IPCC’s 1990 warning of 3°C/century warming, I wrote a 330 page report on using solar thermal technologies to redress that predicted anthropogenic warming (Hagen & Kaneff 1991). However, actual global surface warming over the satellite era since 1979 has been only about 50% of what the IPCC predicted. Model predictions of the anthropogenic signature of tropical tropospheric temperature are running 300% of actual warming (Christy 2016). The harms from projected warming appear to have been strongly overstated and the benefits understated (Michaels 2017). Furthermore, the IPCC (1990) stated:

“The climate system is a coupled non-linear chaotic system, and therefore the long-term prediction of future climate states is not possible.”

We now recognize more of the natural climatic variations caused by the interaction of five highly nonlinear coupled chaotic processes: earth’s atmosphere, oceans, volcanoes, solar weather, and galactic cosmic rays (Curry 2016, Curry 2017). It appears that we need to replace deterministic with stochastic models. (There are still strategic economic and national security benefits in developing dispatchable solar power and fuels cheaper than fossil power and oil.)

In light of climate models’ neglect of climate persistence and their poor predictions, as detailed by Koutsoyiannis et al. and summarized above, I propose the following climatic null hypothesis:

“Natural climatic variation is quantified by the stochastic uncertainty envelope of historical and paleo data, embodying the nonlinear chaotic interaction of atmospheric, oceanic, volcanic, solar, and galactic processes, including climate persistence quantified by Hurst-Kolmogorov dynamics.”

Scientists proposing catastrophic majority anthropogenic global warming models (a.k.a. “Climate change”) bear the burden of proof of providing clear robust evidence supporting validated model predictions of anthropogenic warming with strong significant differences from this climatic null hypothesis.
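To illustrate statistically what this null hypothesis implies, here is a minimal sketch (my illustration, not a method from the cited papers) of the “equivalent sample size” effect of Hurst-Kolmogorov persistence, using the same Var ∝ n^(2H−2) scaling discussed above, with an assumed, illustrative H = 0.9:

```python
# Under HK scaling the variance of an n-year mean falls as n**(2H-2) instead
# of 1/n, equivalent to shrinking n to an effective independent sample size
# n_eff = n**(2 - 2H). That raises the bar for detecting a trend against
# natural variability. H values and record length are illustrative.
def effective_sample_size(n, H):
    return n ** (2 - 2 * H)

n = 137                                   # e.g. years of instrumental record
for H in (0.5, 0.7, 0.9):
    print(f"H = {H}: {n} years behave like "
          f"{effective_sample_size(n, H):.0f} independent samples")
```

With H = 0.9, a 137-year record carries the statistical weight of only about three independent observations, which is why claimed anthropogenic deviations must clear a far higher evidentiary bar under this null hypothesis than under white-noise statistics.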

Bailey (2017) shows 5 sigma scientific models often deviate by up to five orders of magnitude beyond naive normal distribution assumptions. Christy’s (2016) comparisons show that today’s climate models suffer from severe Type B systemic errors: their collective climate sensitivity assumptions put the tropical tropospheric “anthropogenic signature” about 300% above long term observations.

Johnson (2013) quantifies the much more stringent statistical requirements essential to develop reproducible scientific models, especially for highly significant findings with trillion dollar consequences:

“evidence thresholds required for the declaration of a significant finding should be increased to 25–50:1, and to 100–200:1 for the declaration of a highly significant finding.”

Koutsoyiannis (2011) showed that “an ensemble of climate model projections” of (realistic) global climate models is statistically likely to be within this climatic null hypothesis. Thus, the anthropogenic climate hypothesis will need to show alternative robust statistics per Johnson’s criteria, with clear anthropogenically caused deviations from the natural climate persistence of full Hurst-Kolmogorov dynamics.

Anthropogenic contributions to climate are obvious from the Urban Heat Island effect and hawks circling over ploughed fields. However, when comparing model outputs against Hurst climate persistence, I find the catastrophic majority anthropogenic global warming hypothesis to be formally “Not Proven” per Bray (2005) and Johnson (2013).

Consequences?

How can we grapple with China’s 62-64 domino dam catastrophe from the 1975 failures of Banqiao and Shimantan Dams, and comprehend how policy decisions killed 170,000 to 230,000 people?

Have we grasped that the next ARkStorm would likely cause three times the damage of California’s “Big One”? Has California confronted its seven megafloods with their sobering potential of destroying Los Angeles’ stored Sierra water supply while causing more deaths than 9/11?

Even a partial failure of Oroville Dam’s emergency spillway confronts us with flooding downstream cities faster than they could be evacuated. Have actuaries and insurance companies incorporated these risks? Why should “We the People” be forced to pay for foolish actions and politically driven failures of States to maintain and update dams based on known risks?

Has today’s focus on global warming embedded a “persistent drought” mentality? Has it blinded scientists and politicians from addressing the massive shifts from climate persistence that cause seven-year periods of plenty or famine such as Joseph predicted and managed? Will California and FERC upgrade and update their dam peak flow capabilities, emergency dam management flow policies, and energy and water management to fully account for the much higher probability of Hurst “climate persistence”?

Will hydrology, climate science, and management incorporate full Hurst-Kolmogorov dynamics with natural stochastic models? While hydrologists analyze natural event distributions (such as Log Pearson Type 3), have they included historic and paleo evidence of California’s 270 year megafloods and droughts? Will they account for sequential cyclones and “atmospheric rivers” hitting California? Will they address worst case hurricane-level triple storm combinations of “concurrent cyclone, front and thunderstorm occurrences”?

Will we have statesmen who will confront the Super Typhoon Nina-weather front “perfect storm” that caused China’s 62 dam Banqiao Dam/Shimantan Dam catastrophe? OR will politicians lead California to suffer through cascading dam failures combined with a catastrophic Oroville Dam failure from the next Atmospheric River 1000 year level ARkStorm?

Has California really prepared for combined dam flood damage three times greater than the 7.8 magnitude San Andreas fault ShakeOut earthquake scenario? Or considered the even greater AD 1605 level flood?

How well would Los Angeles and Southern California survive cut off from their Northern Sierra Feather river water supply? Will they now pay for repairs and improvements to protect their drinking water supply? Will California continue to rely on its Russian roulette game of climate models? OR will Californians develop the robust resilience of distributed decisions and actions based on climate persistence?

Who will decide? Who will suffer, survive, or prosper?

I pray readers review and take action to redress these failures and warnings before 100,000 more people die unnecessarily, like those drowned by the Banqiao and Shimantan Dam failures.

David L. Hagen

PS: These observations are those of a general energy researcher without extensive hydropower (one course) or civil engineering expertise. While I raise issues, I look forward to experts addressing details and to their feedback. Please cite the section and quote the statement you are addressing.

[ REFERENCES ]

[ APPENDIX ]

Moderation note:  As with all guest posts, please keep your comments civil and relevant.
