Will the Oroville Dam survive the ARkStorm?

by David Hagen

Should California plan for permanent drought or climate persistence?

After six years of drought, California was deluged in 2017 by sequential “atmospheric rivers” causing the highest January/February rainfall on recent record. California’s Oroville Dam experienced failures of both its primary and emergency spillways. The accumulated debris even blocked the residual hydropower flow. That triggered an emergency drawdown and round-the-clock emergency spillway repairs, even at the cost of destroying the lower main spillway. A modeled larger ARkStorm (“Atmospheric River 1000 [year] Storm”), or one of seven larger ~270 year megafloods, would likely cause California 350% or more of the damage of a 7.8 magnitude San Andreas fault ShakeOut earthquake scenario! These very real dangers of breaching the US’s highest dam, with consequent catastrophic flooding, confront us with challenging issues:

1. Are dams designed for millennial confluences of cyclones, weather fronts, and thunderstorms?
2. How many dams will be breached by the next “Pineapple Express” driven AD 1862 or 1605 level megaflood?
3. How much does human global warming add to such natural variations with climate persistence?
4. Will California rely on fragile central planning or develop robust distributed resilience?

To explore these issues, let’s examine the Oroville Dam failures and their larger context.

Oroville Dam Spillway Failures

On Feb. 12th, 2017, California’s authorities ordered the astonishing emergency evacuation of 188,000 people in Butte, Sutter, and Yuba counties, in the flood path of the Oroville high dam (Mercury News 2017a, DWR 2017a, NDPD 2017). That was triggered by a flash warning that catastrophic failure of the emergency spillway could occur within 45 minutes due to rapid erosion by the “managed” overtopping flow – right after calming official assurances that there was no danger of dam failure.

This is soberingly similar to official assurances that the Soviet “Iron Dam” design of China’s Banqiao Dam was invincible (Si 1998). Officials had even authorized retaining another 32 million cubic meters of water above the dam’s safe design capacity. Yet some 171,000 to 230,000 people died in the catastrophic 1975 failures of China’s Banqiao and Shimantan Dams, deluged when Super Typhoon Nina was blocked by a cold front. See Britannica, Fish (2013), Si (1998), Ingomar200 (2013). That, with the domino failure of sixty downstream dams, displaced eleven million people. An overflow-caused catastrophic breach and failure of the USA’s highest dam is thus no empty threat.

Built on “solid rock”, Oroville Dam’s emergency spillway had an official design capacity of 350,000 cubic feet per second (cfs) with a 17 ft deep (5.2 m) overflow – 350% of Niagara Falls’ normal treaty flow of 100,000 cfs (Niagara Parks). Yet on Feb. 12th, a flow of 12,000 cfs – just 3.4% of design – began rapidly eroding the “weathered fractured” schist (metamorphic rock) of the downhill part of the emergency spillway and progressing upstream towards its concrete ogee weir (Boxall 2017, MetaBunk 2017).

The chaotic Oroville emergency evacuation of 188,000 people took 7.5 hours! Yet should Oroville Dam actually fail, downstream residents would not be warned in time (AP 2017). Geophysicist Steven Ward modeled floods from partial breaches to full failure of Oroville Dam. Ward found that even a 20 m (66′) deep partial breach extending 400 m (1310′) along Oroville’s emergency spillway ogee weir would flood the nearest SR70 escape route within 35 minutes, and flood Yuba City in 18 hours. Like the sudden Banqiao flood, many downstream residents would likely drown.

Regulatory Failures?

Have government regulators been protecting the public welfare? Or their jobs? The day after Oroville’s emergency evacuation, the US Federal Energy Regulatory Commission (FERC) ordered California’s Dept. of Water Resources (DWR) to:

“initiate immediate design of emergency repair to minimize further degradation of both the emergency spillway and the service spillway. In addition, DWR shall convene an Independent Board of Consultants (BOC).” (FERC 2017).

During Oroville’s 2005 relicensing, Stakeholder EE56 had specifically asked:

“EE56 – – prepare flood inundation maps for a 1997(?) worse case with 300,000 cfs coming out of the dam’s normal and emergency spillways. In 1997, it is believed that Oroville storage was almost to a point where the 300,000 cfs of inflow was going to pass through the reservoir. DWR was making plans to evacuate the power plant. The 300,000 would have topped the levees and put 10 feet of water into the town of Oroville.”

Yet Oroville Dam was designed and authorized to operate with the unhardened emergency spillway. If the reservoir were already above the lowered design level, then some or all of the peak design inflow of 960,000 cfs would flow out, overtopping the emergency spillway (DWR 2005a, FERC 2007a, FERC 2007b). An actual breach of the main dam would cause a 100-fold higher outflow of about 32 million cfs (~900,000 m3/s). That would raise the Feather River 144 ft (44 m) above its bed in Oroville (Ward 2017). Why did regulators not investigate in 2005, with such urgency and thoroughness, the recommendations by Friends of the River, the Sierra Club and the South Yuba Citizens League:

“to require that the dam’s emergency spillway be armored with concrete rather than remain as just a hillside”? (Mercury News 2017b)

The Probable Maximum Flood design only considered “non-operation of one and two spillway gates.” The DWR chose “not to pursue installing Obermeyer gates, or any other structure requiring human or mechanical operation, on the emergency spillway.”

Why has Oroville’s main spillway now failed, needing massive rebuilding – after so many “satisfactory” regulatory inspections? Why had the emergency spillway never been tested? Why were dual main and emergency spillway failures, combined with hydropower flow blockage, not addressed during relicensing?

Were Oroville’s severe spillway and emergency spillway failures and emergency evacuation but bad management by bungling bureaucrats? Did federal law transferring DWR’s “authority to prescribe flood protection measures at the Oroville Facilities to USACE” cause that danger to “fall through the cracks”?

Have regulators been blinded by the mantra of “permanent drought” caused by “indisputable” “climate change”? Is this just another case of “extreme weather” caused by “catastrophic majority anthropogenic global warming”?

OR does this reveal a systemic failure to explicitly prepare for worst case conditions, including a failure tree of all possible failures? Like the Banqiao regulators, do these failures unveil unfamiliarity with the underlying natural phenomena of “climate persistence”?

Have regulators fully addressed sequential “Pineapple Express” atmospheric rivers, with cyclones colliding with weather fronts and thunderstorms (NASA 2017, NOAAa, NOAAb)? Will these cause more frequent “perfect storms” than expected?

Climate Persistence

“Seven years of great abundance are coming throughout the land of Egypt, but seven years of famine will follow them.” – Joseph

Why were Pharaoh’s wise men unable to anticipate the seven year long bounty and seven year famine that Joseph interpreted and predicted? After six years of “perpetual” drought, California is now deluged with the highest precipitation for January and February on human record (Sacbee 2017b). Have dam designers, regulators, and climate modelers fully grappled with the “climate persistence” that Joseph warned of and planned for?


Such natural extremes from climate persistence were first quantitatively modeled by Harold E. Hurst (1951) in his breakthrough hydrological analysis of the 813 year record of Nile river flows (Rikert 2014). Russian A.N. Kolmogorov (1940) also published theoretical models. Demetris Koutsoyiannis and ITIA colleagues at the National Technical University of Athens systematically extended Hurst-Kolmogorov dynamics. Wyatt and Curry have further quantified the “Stadium Wave” sequence of oceanic/atmospheric oscillations around the earth (Wyatt 2013).
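
As a minimal illustration of what such analysis involves (a sketch, not code from any of the cited studies), the Hurst coefficient H of a flow record can be estimated with Hurst’s classic rescaled-range (R/S) analysis – the slope of log(R/S) against log(window length). The series and window sizes below are illustrative only:

```python
import numpy as np

def rescaled_range(x):
    """Hurst's R/S statistic for one segment of a series."""
    y = np.cumsum(x - x.mean())   # cumulative departures from the mean
    r = y.max() - y.min()         # range of the accumulated departures
    s = x.std(ddof=1)             # sample standard deviation
    return r / s

def hurst_exponent(series, min_window=8):
    """Estimate H as the slope of log(R/S) versus log(window length)."""
    n = len(series)
    windows, rs_means = [], []
    w = min_window
    while w <= n // 2:
        segs = [series[i:i + w] for i in range(0, n - w + 1, w)]
        rs_means.append(np.mean([rescaled_range(s) for s in segs]))
        windows.append(w)
        w *= 2
    slope, _ = np.polyfit(np.log(windows), np.log(rs_means), 1)
    return slope

# White noise should give H near 0.5 (simple R/S is biased slightly high);
# Hurst found H ~ 0.7+ for the Nile, the signature of long-term persistence.
rng = np.random.default_rng(42)
print(f"H (white noise) = {hurst_exponent(rng.normal(size=4096)):.2f}")
```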


Koutsoyiannis et al. show that natural climate persistence causes consistently higher extreme-event frequency than Markovian or “Las Vegas roulette” random “white noise” models. The little-known standard deviations due to Hurst-Kolmogorov dynamics (a.k.a. climate persistence) are much higher than Markovian variations – typically TWICE as large as the commonly calculated standard deviations of random “white noise” in climate models.


Actual peak climate variations are often more than DOUBLE those confidently predicted by blind application of random statistics. This >200% difference between “climate persistence” and “white noise” “climate” models is vitally important when designing hydropower dams, ocean breakwaters, and similar power and protective structures, and in modeling climate change.
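
That doubling can be demonstrated with a short simulation. The sketch below is an illustration (not from the cited papers), assuming an exact Cholesky generator for fractional Gaussian noise and an illustrative Hurst coefficient H = 0.75; it compares the standard deviation of 32-year “climate” means under Hurst-Kolmogorov persistence with the white-noise assumption:

```python
import numpy as np

def fgn_sampler(n, H, rng):
    """Sampler for fractional Gaussian noise via Cholesky decomposition of
    its autocovariance matrix (exact but O(n^2); fine for short series)."""
    k = np.arange(n)
    gamma = 0.5 * ((k + 1.0)**(2*H) - 2.0*k**(2*H) + np.abs(k - 1.0)**(2*H))
    L = np.linalg.cholesky(gamma[np.abs(k[:, None] - k[None, :])])
    return lambda: L @ rng.normal(size=n)

rng = np.random.default_rng(0)
n, H, window = 512, 0.75, 32  # H = 0.75 is an illustrative mid-range value

def sd_of_window_means(draw, reps=200):
    """Pooled std dev of 32-year block means over many simulated records."""
    means = [draw().reshape(-1, window).mean(axis=1) for _ in range(reps)]
    return np.concatenate(means).std()

sd_hk = sd_of_window_means(fgn_sampler(n, H, rng))
sd_wn = sd_of_window_means(lambda: rng.normal(size=n))
print(f"SD of 32-yr means, HK persistence: {sd_hk:.3f}")
print(f"SD of 32-yr means, white noise:    {sd_wn:.3f}  ratio ~{sd_hk/sd_wn:.1f}x")
# Theory: the ratio is window**(H - 0.5), i.e. 32**0.25 ~ 2.4x here.
```

With H nearer 0.85, as found in some long hydrologic records, the ratio rises well above 3x.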

Have engineers and regulators fully incorporated climate persistence when quantifying the Probable Maximum Flood (PMF)? Failure to recognize and calculate the Hurst deviations of climate persistence will likely cause further catastrophic failures of critical infrastructure in California and elsewhere. Will regulators now recognize climate persistence and redesign emergency management for such peak flows from statistically quantifiable natural events?


More Perfect Storms?

Are California’s current storms unusually strong “extreme weather” amplified by human “climate change”? The “perfect storm” that breached Banqiao Dam in China’s Henan province in early August 1975 was due to Super Typhoon Nina being stopped by a cold front (Britannica 2014, Ingomar200 2013). It dropped the area’s average yearly rainfall in less than 24 hours. That day’s 106 cm (41.7″) of rain dwarfed the 30 cm (11.8″) daily limit the dam’s designers had considered and the 10 cm forecast. Driven by the political pressure of “primacy to accumulation and irrigation” above all else, Banqiao Dam had only 5 of the 12 sluice gates recommended by hydrologist Chen Xing. The deluge of 120% of annual rainfall in 24 hours breached the 387 ft (118 m) high Banqiao Dam, and the Shimantan Dam. That in turn destroyed ~62 downstream dams. Zongshu & Nniazu (2011) report that by 2003 China had seen 3,481 dams collapse!

Why did Chinese bureaucrats not prepare for the 1975 deluge – 353% above the Banqiao dam emergency design level on the Ru river? Authorities claimed the 1975 storm was a 1 in 2000 year event, when the dam was designed for 1 in 1000 year events. Liu et al. (2011) note China’s Hydraulic Design Code is based on a Log Pearson type 3 distribution with a 100 year return period. Yet 147 dams, including the Banqiao, had already failed from larger floods.
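
For readers unfamiliar with the procedure, the standard Log Pearson type 3 recipe fits a Pearson III distribution to the logarithms of the annual peak flows and reads off return-period quantiles. A hedged Python sketch on synthetic data (not actual Ru River or Feather River records):

```python
import numpy as np
from scipy import stats

# Synthetic 80-year record of annual peak flows (cfs) -- illustrative only.
rng = np.random.default_rng(1)
peaks = 10 ** rng.normal(loc=4.5, scale=0.35, size=80)

# Log Pearson type 3 via method of moments on the log-flows,
# as in US Bulletin 17B practice:
logq = np.log10(peaks)
mean, sd = logq.mean(), logq.std(ddof=1)
skew = stats.skew(logq, bias=False)

for T in (100, 1000):  # return period in years
    # Frequency factor = standardized Pearson III quantile for this skew:
    k = stats.pearson3.ppf(1 - 1/T, skew)
    q = 10 ** (mean + k * sd)
    print(f"{T:>5}-yr flood estimate: {q:,.0f} cfs")

# Caveat: this assumes independent annual peaks. Under Hurst-Kolmogorov
# persistence the same record implies wider uncertainty and larger extremes,
# which is exactly the underestimation discussed in this post.
```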

Riebsame’s (1988) review shows California’s Folsom and Oroville Dams were confidently designed for 500 year Reservoir Design Floods (RDFs) based on the Dec. 1937 rainfall. However, the major 1950 and 1955 rainfalls forced designers to drastically cut Folsom Dam’s RDF by 76%, to 120 years. The Oroville dam design then had to be rapidly resized before being built in 1965 – yet its capacity was still severely stressed in 1986. From 2005 through June 2013, US:

“state dam safety programs reported 173 dam failures and 587 “incidents” – episodes that, without intervention, would likely have resulted in dam failure.” (ASDSO 2017)

Now Dowdy and Catto’s (2017) analysis of storm combinations and extreme weather finds:

“The highest risk of extreme precipitation and extreme wind speeds is found to be associated with a triple storm type characterized by concurrent cyclone, front and thunderstorm occurrences.”

In its Oroville Dam relicensing, DWR (2004) assumed a 1 in 500 year average recurrence period Probable Maximum Flood (PMF). Oroville Dam was designed for an additional 8 day volume of 5,217,300 acre ft (227 billion cu ft, 6.44 billion m3) from 28.9″ precipitation (4.5″ snowmelt + 22.6″ rainfall) on its 3607 sq mi drainage area. Design outflow was calculated from the overtopping and expected failure of the upstream Butt Valley Dam.
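
A quick arithmetic cross-check of that design volume (an illustrative calculation, not DWR’s):

```python
# Gross precipitation volume over the drainage area, if every drop ran off.
ACRES_PER_SQ_MI = 640

drainage_acres = 3607 * ACRES_PER_SQ_MI  # 3607 sq mi drainage area
precip_ft = 28.9 / 12                    # 28.9 inches of precipitation
gross_af = drainage_acres * precip_ft    # acre-feet

print(f"Gross precipitation volume: {gross_af:,.0f} acre-ft")    # ~5.56 million
print(f"Implied runoff fraction:    {5_217_300 / gross_af:.2f}")  # ~0.94
# The 5,217,300 acre-ft design volume is thus consistent with nearly all
# of the 8-day precipitation running off the saturated basin.
```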

In February 2017, a Niagara Falls-scale 100,000 cfs flow was discharged through Oroville Dam’s severely damaged spillway – sacrificed to protect and repair the emergency spillway. How would the downstream cities of Oroville, Biggs, Gridley, and Live Oak on down to Yuba City handle even a partial failure of Oroville Dam’s emergency overflow as modeled by Steven Ward? How well would they survive 960% of Niagara’s flow, if a damaged Oroville Dam were again filled to overflowing and then breached with the full peak design inflow?

The Coming ARkStorm? And the next California megaflood?

The United States Geological Survey (USGS), with California’s Dept. of Water Resources, led a study modeling a 25 day ARkStorm – the “Atmospheric River 1000″ (year storm) (USGS 2011, Porter et al. 2011). Will the Oroville Dam survive that larger ARkStorm? OR will downstream Oroville, Biggs, Gridley, and Live Oak on down to Yuba City be inundated by catastrophic Oroville dam breaches following main spillway and/or emergency spillway breaches? What then of California’s other 832 “high-hazard dams” (52% of the total) – each of which could cause death on failure (Snibbe 2017)?


Could the Oroville Dam survive an AD 1861-62 deluge that flooded 3 to 3.5 million acres of the Sacramento and San Joaquin valleys (Newbold 1991)? Ingram (2013) describes this 43 day storm that flooded central and southern California for up to 6 months (Maximus 2017).


“One-quarter of the state’s estimated 800,000 cattle drowned in the flood, . . . One-third of the state’s property was destroyed, and one home in eight was destroyed completely or carried away by the floodwaters.”


Porter et al. (2011) further warned that seven even larger megastorms had hit California in the last 1,800 years. What then if it were hit by another AD 1605 scale megaflood, which was 50% greater than the 1861 flood? Are we fully accounting for natural ranges of “extreme weather”? Or are regulators fixated on “climate models”? Are regulators accounting for California’s 1800 year record of megafloods about every 270 years?


The nearly 50″ rainfall of 1861-62 was followed by a drought with little more than 4″ for 1862-63! Benson et al. (2002) found “during the past 2740 yr, drought durations ranged from 20 to 100 yr and intervals between droughts ranged from 80 to 230 yr.” Ingram et al. (2005) document previous droughts and floods. These suggest California’s recent 6 year drought is unremarkable.


Lamontagne (2014) analyzed California rainfall records, including the 107 year record at Oroville. The 2017 storms are now raising concerns about the next ARkStorm (Dowd 2017). That “Atmospheric River 1000 [year] Storm” (ARkStorm) could cause more than 300% of the damage from a 7.8 magnitude San Andreas fault ShakeOut earthquake scenario (Perry, S., et al. 2008).


“The ARkStorm storm is patterned after the 1861-62 historical events but uses modern modeling methods and data from large storms in 1969 and 1986. . . . Megastorms are California’s other Big One. A severe California winter storm could realistically flood thousands of square miles of urban and agricultural land, result in thousands of landslides, disrupt lifelines throughout the state for days or weeks, and cost on the order of $725 billion. This figure is more than three times that estimated for the ShakeOut scenario earthquake, that has roughly the same annual occurrence probability as an ARkStorm-like event.”


The USGS (2011) in its ARkStorm Scenario warns:


“An ARkStorm is plausible, perhaps inevitable: The geologic record shows 6 megastorms more severe than 1861-1862 in California in the last 1800 years. . . megastorms that occurred in the years 212, 440, 603, 1029, 1418, and 1605, coinciding with climatological events that were happening elsewhere in the world.”


As California sought to restock its 6 year drought depleted reservoirs, January’s rainfall soared to 232% of average. Lake Shasta rose rapidly from 31% of capacity to 95% in 7 months (August 2016 to February 23, 2017). This is raising concerns for the rest of California’s dams (Snibbe 2017). Engineers tested 602 ft (183.5 m) tall Shasta Dam’s top overflow gates for the first time and then began dumping water to prepare for the next storm (Serna 2017).


Fearing further damage and failure of Oroville’s main spillway, authorities instead allowed Lake Oroville to rise to overflow Oroville Dam’s emergency spillway (Nazarian A. 2017). The Lake Oroville reservoir rose much faster in 2016-17 than in 1982-83 or 1977-78, just after falling to 33% of capacity in 2015 from its 6 year drought (DWR 2017d). The resultant severe erosion and potential breach of Oroville Dam’s emergency spillway was then compounded by more atmospheric rivers (Watts 2017a).


Meteorologist R. Maue warned of 10 trillion gallons of rainfall on California (Watts 2017b). The National Oceanic and Atmospheric Administration forecast up to 11″ of rain to the Lake Oroville watershed from Feb. 20-27 (NOAA 2017). Authorities urgently began emergency drawdown through the main damaged spillway, sacrificing the main spillway’s lower portion to prevent a worse failure and breach of the emergency spillway. All hydropower turbine flow was blocked by downstream debris as well as being shut off to prevent “no load” failure.


Like the Banqiao Dam workers, crews have been working around the clock to patch Oroville Dam’s emergency spillway (Mercury 2017c). By March 16th, the massive $4.7 million/day emergency repair effort had dredged about 1.24 million of the 1.7 million cu yds of eroded debris (while damaging the access roads). That opened a downstream channel large enough to flow 13,000 cfs through Oroville’s Hyatt Power Plant’s hydro turbines, barely keeping up with low inflows (DWR 2017e). Round-the-clock hardening of the main spillway breach and the emergency spillway erosion continues, preparing for oncoming rains.


In just two years, Nevada’s snowpack has gone from near record low in 2015 to a record high in 2017, reaching 17′ at Mt. Rose and outstripping measurement tools (Spillman 2017). California’s North Sierra Precipitation: 8-Station Index for February 2017 was ~293% of average and the cumulative precipitation was running 26% above the wettest 1982-83 year (DWR 2017c). The Feather River Basin had reached 158% of average snowpack, with more snow on the way. California’s average snowpack had reached 186% of normal for March 1st, well above levels that caused its 1983 floods (DWR 2017b).


Will California seriously address such far greater dangers of the next natural 1000 year ARkStorm? Or will it persist in prioritizing mandated “climate change” mitigation (a.k.a. “catastrophic majority anthropogenic global warming”)? Will authorities clearly warn downstream residents that the ARkStorm or the next megaflood will require evacuation with a high likelihood of major flood damage? And advise that other taxpayers should NOT have to cover that clearly foreseeable risk?


Archaic Design and Operating Requirements


How well was Oroville Dam designed for such California megafloods? Even with sub-megaflood rains, Oroville Dam’s lower spillway has now failed. Could it have tolerated a full 1 in 1000 year Atmospheric River ARkStorm? What of a much larger AD 1605 megaflood overflow? Oroville Dam’s unhardened earthen emergency spillway began to rapidly erode uphill and almost breached the weir from just 3% of the design overflow. The combined 1.7 million cu yards of debris even blocked the residual 14,000 cfs design flow through Oroville’s Hyatt Power Plant.


Dam experts like Scott Cahill (2017) are now explaining and detailing Oroville Dam’s multiple dangerous failures. Why were none of the current major repairs and necessary upgrades even envisioned in the Oroville Dam Relicensing capital and maintenance budgets (DWR 2005b)? Cahill’s (2017) articles further expose Oroville’s numerous cover-ups and cultural and regulatory failures. Oroville Dam’s flood-control manual hasn’t been updated for half a century (Sacbee 2017a):


“The critical document that determines how much space should be left in Lake Oroville for flood control during the rainy season hasn’t been updated since 1970, and it uses climatological data and runoff projections so old they don’t account for two of the biggest floods ever to strike the region. . . . At Oroville, the manual cites weather patterns prior to the 1950s, and data doesn’t account for the catastrophic floods of 1986 and 1997.”


The mandated Oroville Dam flow control requirements were not designed even for the coming ARkStorm (USGS 2011). The ARkStorm study:


“represents an important wake-up call about the extensive devastation and long-term consequences an extreme event of this magnitude might cause. . . . The ARkStorm scenario raises serious questions about the ability of existing national, state, and local disaster policy to handle an event of this large magnitude.”


Overtopping and collapse, from a severe storm and neglect, caused the May 31, 1889 failure of South Fork Dam near Johnstown, Pa., killing 2,209 people – the largest US disaster until 9/11 (Ward 2011). California had 45 dam failures from 1883 to 1965. Cavitation erosion severely damaged the Glen Canyon spillway in 1983. Aeration was added to correct that problem, and then to the Hoover and Blue Mesa dams (Falvey 1990). Why was the Oroville Dam not retrofitted with protective aeration to prevent erosion per best practice (US Army Corps/DOI 2015, Rogers 2017)?


Yet the ARkStorm study still assumed all requirements by the California Division of Safety of Dams (DSOD) would be met, with only:


“. . . minor spillway damage or erosion in downstream channels. . . . When spillways sit untested for years, then are subjected to continuous flow for an extensive period, damage is possible or even likely.”


The ARkStorm study excluded or hid the dangers of Oroville Dam’s current spillway failures. Instead it assumed:


“. . .A component of this (DSOD) inspection is to verify spillways are unobstructed and fully functional. . . .Because of the extremely sensitive nature of a dam-damage scenario, the selection of a particular dam to imagine as hypothetically damaged in such a way is left to emergency planners.”


Would the Oroville dam survive this much greater 1 in 1000 year ARkStorm? Could today’s conventional planning allow a greater disaster than the Johnstown flood? What then, when California experiences another of its ~270 year megafloods? Far better to address that worst case Probable Maximum Precipitation (PMP) up front now than stumble into a predictable disaster greater than 9/11.


Underestimation by Classical Statistics

Do Las Vegas roulette statistics describe nature? OR does nature “bias” the dice by “climate persistence”?

Demetris Koutsoyiannis (2010b) Memory in climate and things not to be forgotten

Demetris Koutsoyiannis summarizes how classical statistics severely underestimate natural extreme weather events, which are dominated by Joseph-like climate persistence (edited personal communication):

“Classical statistics underestimates the standard deviation; the underestimation could be 50% or less, but it could be even (much) higher than that, depending on:

  1. the sample size,
  2. the time scale to which the estimation refers, and
  3. the Hurst coefficient.

There are two factors leading to underestimation. At the basic (lowest) time scale (i.e. that on which the measurements are made) the underestimation is due to statistical bias.

E.g., in “Things not to be forgotten” (Koutsoyiannis 2010b), you will see the explanation of the bias in slides 29-20. Then you may see how this materializes in some real world case studies. For example at the graph in slide 23 at the smallest scale 1 (Log (scale) = 0) there is a huge underestimation as shown in the graph.

As the time scale increases, i.e. as we move from weather to climate, the underestimation inflates, as seen by comparing the HK and “white noise” curves in these graphs. This is the second factor, which represents the effect of aggregating at longer time scales.

See Fig. 9 and its discussion in Markonis & Koutsoyiannis (2013) Climatic Variability . . . . This is reproduced as Fig. 15 in Koutsoyiannis (2013) Hydrology and Change.”
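
The basic-scale bias Koutsoyiannis describes can be made concrete. The HK literature (e.g. Koutsoyiannis’s papers cited above) gives the expected classical sample variance for an HK process as E[s²] = σ²(n − n^(2H−1))/(n − 1); a short sketch of what that does to a century of annual data:

```python
import numpy as np

def sd_recovery(n, H):
    """Approximate fraction of the true sigma that the classical sample SD
    recovers for an HK process: sqrt of E[s^2]/sigma^2 = (n - n**(2H-1))/(n - 1).
    (Taking the square root is itself an approximation, via Jensen's inequality.)"""
    return np.sqrt((n - n**(2*H - 1)) / (n - 1))

n = 100  # a century of annual observations
for H in (0.5, 0.7, 0.8, 0.9):
    print(f"H = {H}: classical SD recovers ~{sd_recovery(n, H):.0%} of sigma")
# H = 0.5 (white noise) -> ~100%; H = 0.9 -> ~78%. And this is only the
# basic-scale bias; aggregating to climatic (30-yr) scales inflates the
# underestimation further, as the quoted text explains.
```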

Global climate models severely underestimate climate variability by their systemic weakness of Las Vegas roulette statistics. Serrat-Capdevila et al. (2016) observe:

“. . .since uncertainty is a structural component of climate and hydrological systems, Anagnostopoulos et al. (2010) found that large uncertainties and poor skill were shown by GCM predictions without bias correction. . . it cannot be addressed through increased model complexity. . . . Koutsoyiannis (2011) showed that an ensemble of climate model projections is fully contained WITHIN the uncertainty envelope of traditional stochastic methods using historical data, including the Hurst phenomena. . . .the Hurst phenomena (1951) describes the large and long excursions of natural events above and below their mean, as opposed to random processes which do not exhibit such behavior. The uncertainty range in time-averaged natural Hurst-Kolmogorov processes is much greater than that shown in GCM projections and classical statistics.”

Null hypothesis of climate persistence

What will it take to distinguish anthropogenic from natural climatic changes? Taking at face value the IPCC’s 1990 warning of 3°C/century warming, I wrote a 330 page report on using solar thermal technologies to redress that predicted anthropogenic warming (Hagen & Kaneff 1991). However, the actual global surface warming over the satellite era since 1979 has only been about 50% of what the IPCC predicted. Model predictions of the anthropogenic signature of tropical tropospheric temperature are running 300% of actual warming (Christy 2016). The harms from projected warming appear to have been strongly overstated and the benefits understated (Michaels 2017). Furthermore, the IPCC (1990) stated:

“The climate system is a coupled non-linear chaotic system, and therefore the long-term prediction of future climate states is not possible.”

We now recognize more of the natural climatic variations caused by the interaction of five highly nonlinear coupled chaotic processes: earth’s atmosphere, oceans, volcanoes, solar weather, and galactic cosmic rays (Curry 2016, Curry 2017). It appears that we need to replace deterministic with stochastic models. (There are still strategic economic and national security benefits in developing dispatchable solar power and fuels cheaper than fossil power and oil.)

In light of climate models’ neglect of climate persistence and their poor predictions, as detailed by Koutsoyiannis et al. and summarized above, I propose the following climatic null hypothesis:

“Natural climatic variation is quantified by the stochastic uncertainty envelope of historical and paleo data, embodying the nonlinear chaotic interaction of atmospheric, oceanic, volcanic, solar, and galactic processes, including climate persistence quantified by Hurst-Kolmogorov dynamics.”

Scientists proposing catastrophic majority anthropogenic global warming models (a.k.a. “Climate change”) bear the burden of proof of providing clear robust evidence supporting validated model predictions of anthropogenic warming with strong significant differences from this climatic null hypothesis.

Bailey (2017) shows that “5 sigma” scientific models often deviate by up to five orders of magnitude beyond naive normal distribution assumptions. Christy’s (2016) comparisons show that today’s climate models suffer from severe Type B systemic errors: their collective climate sensitivity assumptions put the tropospheric tropical temperature “anthropogenic signature” about 300% above long term observations.

Johnson (2013) quantifies the much more stringent statistical requirements essential to develop reproducible scientific models, especially for highly significant findings with trillion dollar consequences:

“evidence thresholds required for the declaration of a significant finding should be increased to 25–50:1, and to 100–200:1 for the declaration of a highly significant finding.”
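
To put those odds in perspective, under the simplifying assumption of equal prior odds (a gloss, not Johnson’s full Bayesian-test framework), a Bayes factor B corresponds to a posterior probability of B/(1 + B):

```python
# Posterior probability implied by Johnson's evidence thresholds,
# assuming equal prior odds (an illustrative simplification).
for bf in (3, 25, 50, 100, 200):
    print(f"Bayes factor {bf:>3}:1 -> posterior probability {bf / (1 + bf):.3f}")
# The common 3:1 threshold gives only 0.750; Johnson's 25-50:1 gives
# 0.962-0.980, and 100-200:1 gives 0.990-0.995.
```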

Koutsoyiannis (2011) showed that “an ensemble of climate model projections” from (realistic) global climate models is statistically likely to lie within this climatic null hypothesis. Thus, the anthropogenic climate hypothesis will need to show robust statistics per Johnson’s criteria, with clear anthropogenically caused deviations from the natural climate persistence of full Hurst-Kolmogorov dynamics.

Anthropogenic contributions to climate are obvious from the Urban Heat Island effect and hawks circling over ploughed fields. However, when comparing model outputs against Hurst climate persistence, I find the catastrophic majority anthropogenic global warming hypothesis to be formally “Not Proven” per Bray (2005) and Johnson (2013).

Consequences?

How can we grapple with China’s 62-64 domino dam catastrophe from the 1975 failures of Banqiao and Shimantan Dams, and comprehend how policy decisions killed 170,000 to 230,000 people?

Have we grasped that the next ARkStorm would likely cause three times the damage of California’s “Big One”? Has California confronted its seven megafloods with their sobering potential of destroying Los Angeles’ stored Sierra water supply while causing more deaths than 9/11?

Even a partial failure of Oroville Dam’s emergency spillway confronts us with flooding downstream cities faster than they could be evacuated. Have actuaries and insurance companies incorporated these risks? Why should “We the People” be forced to pay for foolish actions and politically driven failures of States to maintain and update dams based on known risks?

Has today’s focus on global warming embedded a “persistent drought” mentality? Has it blinded scientists and politicians from addressing the massive shifts from climate persistence that cause seven year type periods of plenty or famine as Joseph predicted and managed? Will California and FERC upgrade and update their dam peak flow capabilities, emergency dam management flow policies, and energy and water management, to fully account for the much higher probability of Hurst “climate persistence”?

Will hydrology, climate science and management incorporate full Hurst-Kolmogorov dynamics with natural stochastic models? While hydrologists analyze natural event distributions (such as Log Pearson type 3), have they included the historic and paleo evidence of California’s ~270 year megafloods and droughts? Will they account for sequential cyclones and “atmospheric rivers” hitting California? Will they address worst case hurricane-level triple storm combinations of “concurrent cyclone, front and thunderstorm occurrences”?

Will we have statesmen who will confront the Super Typhoon Nina-weather front “perfect storm” that caused China’s 62 dam Banqiao Dam/Shimantan Dam catastrophe? OR will politicians lead California to suffer through cascading dam failures combined with a catastrophic Oroville Dam failure from the next Atmospheric River 1000 year level ARkStorm?
Has California really prepared for combined dam flood damage three times greater than the 7.8 magnitude San Andreas fault ShakeOut earthquake scenario? Or considered the even greater AD 1605 level flood?

How well would Los Angeles and Southern California survive cut off from their Northern Sierra Feather river water supply? Will they now pay for repairs and improvements to protect their drinking water supply? Will California continue to rely on its Russian roulette game of climate models? OR will Californians develop the robust resilience of distributed decisions and actions based on climate persistence?

Who will decide? Who will suffer, survive, or prosper?

I pray readers review and take action to redress these failures and warnings before 100,000 more people die unnecessarily, like those drowned by the Banqiao and Shimantan Dam failures.

David L. Hagen

PS These observations are those of a general energy researcher without extensive hydropower (one course) or civil engineering expertise. While I raise issues, I look forward to experts addressing the details, and to their feedback. Please cite the section and quote the statement you are addressing.

[ REFERENCES ]

[ APPENDIX ]

Moderation note:  As with all guest posts, please keep your comments civil and relevant.

271 responses to “Will the Oroville Dam survive the ARkStorm?”

  1. Scott Pruitt is onto something.

    The apparent ignorance of the longer dated mega events by policy makers (not well served by climate activists) borders on the criminal.

  2. Note: the inline links to the references aren’t currently working (we are working to fix this). In the meantime, at the end of the post there is a link to a file [REFERENCES] that contains all the links.

    JC NOTE: inline refs now fixed

  3. The 2005 FERC decision not to agree with the request to concrete the emergency spillway slope has an interesting twist. If it had been designated an auxiliary spillway (which it obviously also is) concrete armoring would have been appropriately ordered. The SoCal water districts who buy the water but don’t live in the flood zone didn’t want to pay the estimated $140 million. So they lawyered up and said the original design and permit called the weir an emergency spillway so concreting wasn’t necessary.
    The necessary armoring is now in place at a much higher emergency basis cost.

    • Given they “lawyered up” originally, one wonders if they retained those lawyers and will fight it now. Given the state, my guess is probably so – it survived this storm, so it is good enough!

      Of course while the 200k people below the dam will suffer the most, those water districts will be having more long-term problems should the dam fail catastrophically.

    • “So they lawyered up…”

      I can only suggest that the residents at risk also “lawyer up” and demand that dam levels be kept low enough so that the dam can contain the inflow for sufficient enough time to allow evacuation of any and all areas likely to be at risk due to catastrophic failure. If this means that, eg, LA residents need to live with severe water restrictions, so be it – if they (LA residents) want the restrictions lifted, they need only spend the money required to mitigate the risks.
      Residents at risk need merely make the same arguments as the climate alarmists re: the precautionary principle and “fairness” – with “geographical” rather than “generational” arguments (that is: “your neighbor’s lives will be ruined” rather than “your children’s lives will be ruined”)

  4. The post raises an unsolvable engineering dilemma best illustrated by Hurricane Andrew and Fukushima Daiichi. What are the occurrences to which you design and build? After Andrew, all new construction is Cat 5 proof, including my building. But that does not solve the storm surge problem. After Tohoku, my guess is most Japanese construction will be above that tidal wave elevation, as the ancient stone warnings advise – like those from the Cascadia event of January 26, 1700 (dated not by Amerindians but by the Japanese tsunami it caused). This is a risk/reward tradeoff like the climate precautionary principle. Building a structure to withstand a 1000 year event makes little sense if the structure has a 100 year life.
    Sometimes bad stuff just happens. Like the next now overdue Cascadia earthquake/tsunami.

    • David L. Hagen

      ristvan –
      The engineering/cultural issues are solvable – by a combination of mitigation (protection for known effects up to a certain size) and then adaptation – plan to evacuate and rebuild, with risk explicitly addressed by insurance plans – NOT expecting all other tax payers to fork out.

  5. The Oroville dam fiasco. This is the result of our CA fascist government at work. They put our tax dollars at work subsidizing cheap illegal immigrant labor for corporations while our infrastructure literally crumbled before our eyes!

    • It is still crumbling. Google images of hwy 101 at Big Sur. More than a little inconvenient. The disaster separates the residential from the commercial areas of Big Sur. Sure you can detour around–a mere 250 mile round trip.
      Meantime, Moonbeam Brown is building the high speed rail to nowhere.
      Factoid: since 1970, California’s population has increased 87% (not counting uncountable [violates sanctuary] illegals). Its water reservoir capacity has increased 26%. Self inflicted drought wound in a drought prone state. Don’t ask the rest of us to pay for it.

      • Willard, you do not get the difference. People buy into Mar-a-Lago with their own money. Not yours and mine. They can do whatever they want with theirs. I object when you all try yours with mine.

      • > People buy into Mar-a-Lago with thier own money.

        From the article you may not have read, Sir:

        American taxpayers must foot a bill of more than $3 million (£2.4 million) each time he travels there, rather than staying at the official presidential residence of the White House — amounting to $600 million over the four years.

        Now, do you think I get the difference or what?

      • Ristvan
        A very valid point. California is not in drought. It is a simple case of natural cycles not being able to provide the current water requirements given the population growth, and the high demands for crop irrigation to satisfy the country’s produce.

        After the New Zealand earthquakes, new standards were introduced, significantly increasing engineering and build costs. A new building in Wellington constructed to these standards was condemned after a quake of lesser magnitude occurred many km away. Demolished.

        Life is a risk, negligence is another matter.

      • @Willard, and how much did it cost each time Obama traveled to Hawaii for a break?

        If you are going to claim that Trump should not be traveling to Mar-a-Lago, you need to compare it to other Presidents behavior.

        I also expect the cost-per-trip to decline over time as much of the work of the trip becomes routine.

      • Willard,

        Good point. Obama never cost taxpayers a cent when traveling to Martha’s Vineyard or Hawaii or any of his other many vacations.

        Oh, wait.

      • =={ @Willard, and how much did it cost each time Obama traveled to Hawaii for a break?

        […]

        Willard,

        Good point. Obama never cost taxpayers a cent when traveling to Martha’s Vineyard or Hawaii or any of his other many {==

        It really is remarkable just how resilient and useful “They did it first/do it too” arguments have been over the history of Climate Etc.

        https://twitter.com/realDonaldTrump/status/155014799909064704

        President Trump’s trips to Mar-a-Lago have cost taxpayers an estimated $10 million
        President Obama spent $97 million on travel while he was in office. Trump is on pace to outspend him in less than one year.

        http://www.vox.com/policy-and-politics/2017/2/21/14683940/trump-travel-cost-mar-a-lago

      • > And how much did it cost each time Obama

        Scratch your own itch, DavidE.

        The billion above-mentioned excludes what it costs Palm Beach:

        The Palm Beach Sheriff’s Office says that the county has spent $570,000 to help security and other measures to protect [teh Donald] during his presidency, and that doesn’t include overtime pay. Law enforcement officials assigned to aiding the president’s visits are paid overtime, totaling another $60,000 or so per day. [teh Donald] also visited Palm Beach during the winter holidays after he was elected but before he became president, costing the county around $548,000. Altogether, the bill comes to well over $1 million in [teh Donald]-related costs incurred by the county over the past few months.

        NYC’s case is even worse than that. And besides costing taxpayers’ money, teh Donald is making money out of guests who want to mingle with him, which leads to an interesting ethical quandary.

        Perhaps you’d like me to ask rhetorical questions instead?

        It seems to be DavidH’s theme.

      • What really jumped out at some people, though, was that Trump was proposing cuts to some relatively low-cost programs shortly before he prepared to fly to his Mar-a-Lago resort in Florida. According to an analysis from Politico, that’s a trip that costs about $3 million each time — and it’s a trip that he’s made four times this year.

        If that $3 million estimate is true, he could have funded the U.S. Interagency Council on Homelessness — budgeted at $4 million in 2016 — for nearly four years if he’d just stayed in the White House.

      • re: Mar-a-Lago trips

        It’s not a case of “but he did it so it’s justified” and more a matter of pointing out the hypocrisy of the people raising the complaint, because they utterly ignored the behavior when it wasn’t Trump doing it. This isn’t even something recent; EVERY President has had some place outside of the White House that they went to on a regular basis.

        But you also seem to be advocating that the President should not leave his (very nice) house and home office except for business trips for the duration of their term. They aren’t under house arrest. The cost of their protection is just one more budget item.

        You raise the issue of the costs to the local economy when Trump is at Mar-a-Lago, but do you consider the benefits to the economy of the horde of reporters/etc who are in the area just because Trump is?

      • David Springer

        Mar-a-Lago cost probably highly inflated according to WaPo.

        https://www.washingtonpost.com/news/politics/wp/2017/03/17/how-much-is-donald-trumps-travel-and-protection-costing-anyway/

        And it has nothing to do with Mar-a-Lago because the lion’s share of the cost is the hourly price of flying AF1 and support aircraft. It’s all about distance traveled and little else.

        Hawaii->Washington is 5 times the distance Miami-Washington so one Obama trip to Hawaii is equal to 5 Trump trips to Miami.

        But I didn’t begrudge Obama his annual vacation in Hawaii because I’m not mean spirited like some people.

        What DID incense me about Obama’s travel is how much he, Michelle, and Biden separately did in support of Hillary Clinton’s campaign. Taxpayers picked up a huge tab for the campaigner-in-chief’s unprecedented support for his preferred successor.

      • David Springer

        But think of how many jobs are funded by all the money going into POTUS travel. Pilots, aircraft mechanics, aircraft manufacturers, secret service, caterers… the list goes on and on! Why are those people any less important than Bert & Ernie?

  6. CA plans a $100 Billion train to LA from SF with tickets the same price as SW airlines. But wait, we can also buy a $10 billion tunnel under the delta to deliver more water down to So Cal. Carlsbad desalination plant for $1 Billion produces 50,000,000 gal per day. Ten of those, or 100 of those for the price of the other projects would solve the water problems.

    Scott

    • David L. Hagen

      scotts4sf – Globally, the Israelis have developed the cheapest bulk RO water.

      Sorek will profitably sell water to the Israeli water authority for 58 U.S. cents per cubic meter (1,000 liters, or about what one person in Israel uses per week), which is a lower price than today’s conventional desalination plants can manage. What’s more, its energy consumption is among the lowest in the world for large-scale desalination plants.

  7. Reblogged this on Climate Collections.

  8. “the IPCC (1990) stated: “The climate system is a coupled non-linear chaotic system, and therefore the long-term prediction of future climate states is not possible.” How true! wicked problem anyone?

  9. Curious George

    Nowhere do you mention that the dam was opened by Governor Ronald Reagan. Why not blame him personally?

    That said, it is spillways – far from the dam – that failed. Clearly an engineering fiasco. There is a lot to blame California Democrats for, but not this one.

    • David L. Hagen

      Curious George – those in Los Angeles reap the benefits of clean reliable drinking water – but refused to pay for the risks that those downstream of the dam have to bear. The dam needs a combination of mitigation – fixing and providing known economic solutions – and adaptation – plans to get out in time with risk insurance paid by Los Angeles to rebuild the destroyed homes.

  10. Unless something breaks, the outflow from the system only can equal the inflow.

    I can’t get a clear statement whether even that would be a problem, in which case the dam has nothing to do with it.

    Is the problem something breaks, or just that there’s too much water?

    • David L. Hagen

      rhhardin
       The problem is when you have so much rain that you have “overtopping” and a consequent catastrophic breach. When the ARkStorm or another megaflood comes (or mismanagement again fills up the reservoir within the peak rainy season), the main spillway will be inadequate. At least one of the upstream dams is expected to fail, dumping its stored volume and flow into the main Oroville Lake. The “emergency”/auxiliary spillway would likely be overtopped again with severe downstream erosion, jeopardizing the integrity of the crest. When a break occurs in the top of the ogee weir, you have a “breach” scenario with rapid erosion downward and rapid catastrophic flow of water deluging the downstream towns. See Steven Ward’s dam breach models for both “partial” and “full” breaches. For further discussion see MetaBunk.org https://www.metabunk.org/oroville-dam-spillway-failure.t8381/
      For another catastrophe waiting to happen, see Mosul Dam where the construction is a problem.

      • So the problem is erosion at the max river flow rate if it flows in the spillway?

        That’s a spillway problem, not a dam problem proper. You’d have the same problem if the reservoir were filled with dirt rather than water.

      • David L. Hagen

        rhhardin – But if the spillway goes, it can take the dam with it. Thus the emergency evacuation over erosion rapidly moving upstream towards the emergency spillway ogee weir.

  11. It amazes me that we are warned all the time about extreme weather, and yet the same authorities seem not to care for the infrastructure needed for all these extremes. Is it just a scam to lure money out of our pockets and keep the scare alive, while they themselves do not take it that seriously?
    There is or was a demand that every state should incorporate climate change in their planning.
    Would any state dare to say that we plan just as usual and don’t see any special considerations for “climate change”?

  12. Meanwhile, will California survive the union-driven, one-party state, Leftwing politics and bullet train to nowhere boondoggle of Governor “Moonbeam” Brown?

  13. One simple way to judge an article is to do a mere count of exclamation marks and question marks.

    Surprised and confused is no way to go through life.
    Neither is Alarmed ! and dumbfounded?

    That said, I said it better. We don’t even plan for the past.

    • Steven,

      These matters of style reflect how much writing one does, and one’s editors. When I started back in 2003, I asked the great Martin van Creveld what he thought about one of my articles. His reply: “Lose the exclamation marks.” Judging content by style is like reviewing a book by its cover.

      “We don’t even plan for the past.”

      In my opinion that is the best summary of us climate policy yet written. I’ve used that quote scores of times (always with attribution).

      • > Judging content by style is like reviewing a book by its cover.

        You should know that syntax ain’t style, Editor.

        Question marks indicate questions.

        The number of questions a post can try to answer is limited.

        Too many question marks may indicate:

        – Just Asking Questions
        – Underhanded assertions
        – Overoptimistic problematization

        As an editor, you should be able to recognize that this is a two-pronged post: Cali matters and non-linear matters.

        How are these matters connected exactly?

        Which of these two topics come first?

        What should we conclude from DavidH’s questioning?

        The Bible, srsly?

        Communist China, really?

        Why does it go all over the place like that?

        Where’s DavidH’s argument exactly?

        Is that how engineers produce reports?

        Do you start to feel the power of just asking questions?

        Should I make it the theme of this thread?

      • David Springer

        Is Willard sane?

        No!

        Note that I distilled the essence of Willard’s entire existence into four words plus one exclamatory and one question mark. Talk about style…

    • Steven
      That is why think tanks are populated with bright, YOUNG intelligent graduates. Particularly political ones on foreign policy relating to military. Why confuse future plans with the mistakes of the past.

      Virtually every state in the USA runs at a deficit, confirmation that the current cost of society exceeds that society’s ability to support it. Asking that society to fund remedial work for historic inadequate planning and spending, and on top of that pay for future generations’ infrastructure and safeguards, is a political no-go.

    • Steven Mosher: That said, I said it better. We don’t even plan for the past.

      Lots of people have said that.

      The details in David Hagen’s essay are worth the read, even counting the punctuation marks.

    • David L. Hagen

      Steven Mosher
      We look forward to your pearls of wisdom in addressing each of the questions I raised.

      Finally, creating is considered the most difficult task in terms of cognitive processing. Questions that address this cognitive domain may require learners to generate alternative hypotheses based on observed phenomena, devise a new procedure to accomplish a task, or conceptualize a new product.

      Best Practice Strategies for Effective Use of Questions as a Teaching Tool

  14. If the dam fails, it will be blamed on climate change … and DJT.
    I live 300 yards from the campus of one of the premier science universities in the world.
    Every single person I interact with in my daily life believes unusual weather events are the result of human caused climate change.
    They will blame the dam failure on DJT’s budget cuts.
    DJT has nefarious magical powers just like CO2.

    • Curious George

      So far the flow never exceeded design parameters – yet spillways have failed. We should definitely blame the shoddy construction of the 1960s on DJT and climate change and Republicans and anybody except Democrats.

      • Geoff Sherrington

        Curious George says –
        “So far the flow never exceeded design parameters – yet spillways have failed.”
        So what we are looking at is a mistake made by engineers, perhaps being an underestimate of the ability of flow to erode the bedrock in the spillways. It is not too easy to model or predict erosion rates like this, so the engineers were tested.
        In the parallel case, we are looking at global warming consequences, if they exist. We do have a significant number of informed people claiming that climate sensitivity is zero and the bulk of the climate experts being unwilling to give more than a broad range of climate sensitivities, AR5 IPCC. It is not too easy to model or predict CO2 warming rates like this, so the climate workers were tested.

        After plausible evidence of failure emerges, critics look for the scapegoat. An evil side of human nature works. The comments on this blog show the diversity of scapegoats in a series of words that really do not advance society much except for the self-satisfaction of the writers.
        Would it not be productive to forget the blame game and proceed with appropriate remediation?
        For the dam case, remediation is in progress as more concrete is poured and no doubt engineering has incorporated the need for more skill in spillway design.
        In the global warming case, I see little or no equivalent of remediation, more a movement of sticking to the guns that seem not to be able to fire with acceptable accuracy. Defence of models and asserting their dominance, as is being done, is the equivalent of continuing to build dams with known fatal flaws. This is not a desired outcome.
        Geoff

      • David L. Hagen

        Geoff Sherrington Re: “looking at is a mistake made by engineers”
        I seriously doubt that was an engineering “mistake”. Much more likely it was a refusal by water boards to pay to have it properly designed and built.

      • Oroville Dam was built to standards but those standards are for it to “fail safe”, as there is no way to design a dam to withstand a major seismic event or a 1,000 year deluge.

      • David L. Hagen

        Wayne Lusvardi
        That sounds good. Any details?
        But was it BUILT to fail safe?
        e.g., consider the emergency spillway. The far end of the ogee weir is a parking lot that some at MetaBunk.org speculate was left as a sacrificial plug to begin failure as far from the dam as possible. See:

        California State Water Project, Vol. III Storage Facilities, Chapter V. Oroville Dam and Lake Oroville p 63-140, Bulletin No. 200, November 1974 Dept of Water Resources, State of California
        https://ia800302.us.archive.org/3/items/zh9californiastatew2003calirich/zh9californiastatew2003calirich_bw.pdf
        See Spillway p 133: “In part of the emergency spillway, an additional 10 feet of excavation was required to reach acceptable foundation rock, resulting in considerable additional time for excavation and placement of the backfill concrete to subgrade.”
        Then why the evacuation over concern that the emergency spillway would fail?
        Steven N. Ward’s model of even partial failure of this emergency spillway would flood Oroville in ~ 35 minutes and Yuba City in 18 hours.

    • David L. Hagen

      rebelronin
      Ask what is the null hypothesis of natural climate change and how do we distinguish anthropogenic effects from it? If you cannot distinguish a model from the null hypothesis, what is left of the scientific method? See Richard Feynman.

  15. Don’t slam the builders. From memory, the Oroville Dam was designed to accompany another dam upstream. When that dam was cancelled, Oroville Dam should have been reassessed. It wasn’t. Probably because of the immense cost.

    Also, the operators didn’t retrofit the spillway with aerators — as done at other dams (e.g., Hoover and Blue Mesa) after the near-failure of Glen Canyon on the Colorado River in 1983.

    http://www.mercurynews.com/2017/02/17/oroville-dam-what-made-the-spillway-collapse/

    • As they say, “ya-pays-yer-money-and-takes-yer-chances”! Or perhaps a better phrase would be “ya-do-not-pay-yer-money-and-then-suffer-the-consequences.”

  16. A question for the mineshaft –

    Do the models used in climate science adequately adjust for skewness — aka “fat tails” — in weather events?

    As a cautionary tale, consider stock market investing. That security market prices had large skews was discovered by Benoit Mandelbrot (of fractal geometry fame) and his student Eugene Fama (2013 Nobel) in 1963-65. But it was not adequately accounted for in the tactical asset allocation models that became popular in the mid-1980s (aka “portfolio insurance”). This mistake was discovered by Jack Treynor in mid-1987, but still not accounted for. The result was the 1987 market crash.

    Many professionals still have not absorbed the lesson, as large moves are often described as 5-sigma events — or even 12-sigma events. The kind of events that should be seen in a market once in the lifetime of the Sun.

    The appeal of simple bell curves outweighs knowledge and even common sense.
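
    To see how much the tail assumption matters, here is a minimal Python sketch comparing the frequency of a “5-sigma” day under a normal model and under a modestly fat-tailed Student-t (df=3 is an assumed tail heaviness for illustration, not an estimate from market data):

```python
# How rare is a "5-sigma" daily move? Under a Gaussian model, vanishingly
# rare; under a fat-tailed Student-t with the same scale parameter, merely
# uncommon. Illustrative only: df=3 is an assumed degree of tail heaviness.
from scipy import stats

sigma = 5.0
p_normal = 2 * stats.norm.sf(sigma)   # two-sided Gaussian tail probability
p_fat = 2 * stats.t.sf(sigma, df=3)   # same threshold, Student-t tails

for name, p in (("normal", p_normal), ("t(df=3)", p_fat)):
    print(f"{name}: P(|move| > 5 sigma) = {p:.2e}, about one day in {1 / p:,.0f}")
```

    Under the Gaussian, a 5-sigma day is a once-in-millennia event; under the fat-tailed alternative it is routine, which was the hard lesson of 1987.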

    • The problem presented by CO2 only worsens if you promote persistence.
      This is a skeptic own goal.

      • David L. Hagen

        Steven Mosher
        You presume CO2 is a “problem”. Try including the benefits of greening the earth, increasing agricultural productivity for 2/3rds world farmers. I am not “promoting” persistence, but rather acknowledging reality.

      • Not really, unless you wish to argue that CO2 causes persistence too.

      • Berényi Péter

        I can’t quite see how persistence leading to the Californian megaflood of 1605 could be related to CO2.

        Similar 1,000 year events happened in 212, 440, 603, 1029, 1418, and 1862 AD, so maybe, but only maybe the problem is with statistics, not carbon dioxide.

        IMHO truth is never an own goal.

      • “You presume CO2 is a “problem”.

        No presumption. Known physics.

        When you prove otherwise, your Nobel awaits you.

    • Editor
      One should not overlook market manipulation and greed. These seem to be more of a force in the past twenty years. Try tracking the recommendation profiles of a select group of stocks by brokerage houses.

      On a different topic, key individuals within the IPCC/CAGW group have stated on record that the CO2 hoax was purely in place to dismantle 150 years of growth under the industrial model. There has been no real discussion past that point. There are obvious reasons: fat, bloated, the ratio of productive to administrative, end of useful life, etc.

      No one has asked these folk why, what if any support structures will be put in place, or whether that society just withers and dies. What is the intended future?

      It should make an interesting article.

  17. Some comments on occurrence and recurrence. I lived in Darwin, a small city flattened by Cyclone Tracy in 1974. Expect roughly one cyclone every 30 years on average; it is overdue to get flattened again.
    I live near Eildon Dam. It was down to 12% in a recent prolonged drought, and nearly overflowed last year. Our flood plain suffers a severe flood every 30 years. It usually takes two years of rain to bring on the big flood, so look out this year after a near-full year last year.
    Oroville is hence at much greater risk in the year ahead. Lots of snow and meltwater runoff. Lots of work to do. Another atmospheric express or two and a disaster is in the offing.
    Personally, I think no planning really helps for a large-scale catastrophe. I like where I live, but I know one day it will flood. Being human, resilient, and insured, I will rebuild I guess, preferably in the third year!

  18. Probable maximum precipitation is not statistical at all. It is based on the most extreme recorded rainfall events that are transformed taking into account physical factors according to a standardised methodology.

    “The primary aim of assembling the storms database was to identify rare storm events. By searching over such a large zone as that for GTSMR (see Figure 1), the probability of identifying a very rare event improves. However, not all storms from this large region are directly comparable. There are features specific to a location that will influence the depth of rain that could be expected. These include:

    1. Spatial distribution of rainfall
    2. Storm type
    3. Topographic enhancement of rainfall
    4. Local moisture availability
    5. Geographic variation in decay of storm mechanism”

    It is a global methodology, and as long as it is competently performed, engineers have done their duty as far as this aspect of public safety is concerned. It is backed up by comprehensive risk management strategies. It is almost invariably meticulous engineering, used for high risk structures like dam spillways. Criticism is almost always ill-informed and ill-considered.
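
    To make the transformation step concrete, here is a minimal sketch of moisture maximization, the core adjustment in PMP estimation: each extreme observed storm is scaled up by the ratio of maximum to observed precipitable water, and the envelope of the maximized depths is taken. All depths, precipitable-water values, and the cap are invented for illustration:

```python
# Moisture maximization, the core transformation in PMP estimation: scale
# each extreme observed storm by how much wetter the atmosphere could
# plausibly have been, then take the envelope. All depths, precipitable
# water (PW) values, and the cap are invented for illustration.
storms = [
    # (name, observed depth mm, storm PW mm, maximum PW mm)
    ("Storm A", 480.0, 55.0, 85.0),
    ("Storm B", 620.0, 70.0, 90.0),
    ("Storm C", 390.0, 48.0, 88.0),
]
CAP = 1.8  # some agencies cap the adjustment factor

maximized = []
for name, depth, pw_storm, pw_max in storms:
    factor = min(pw_max / pw_storm, CAP)  # in-place moisture maximization
    maximized.append(depth * factor)
    print(f"{name}: {depth:.0f} mm x {factor:.2f} = {depth * factor:.0f} mm")

print(f"PMP estimate (envelope of maximized storms): {max(maximized):.0f} mm")
```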

    It does miss large events. Over the Holocene, immense variability is evident in a high-resolution sediment record. Christopher Moy and colleagues examined a sediment core from Laguna Pallcacocha in southern Ecuador. More rainfall and runoff from a warmer sea surface in the eastern Pacific washes more red sediment into the lake. So we know it was pretty rainy in South America 1,000 years ago. Some 5,000 years ago there was a change from more upwelling to less, which dried the Sahel. Just 3,500 years ago there was a long series of warm Pacific events with red intensity greater than 200, and civilisations fell. For comparison, red intensity in the ‘monster’ 1997/1998 El Niño event was 99. Extremes in the Holocene put those of the 20th century to shame.

    All climate activist hydrology, on the other hand, starts in the 1950s, and everything since is extreme. Climate science has imagined that climate is static unless acted on by global warming – but what they lack is a convincing deterministic framework for demonstrating climate non-stationarity.

    Most of the relevant terms are discussed here – http://www.itia.ntua.gr/getfile/1001/1/documents/2010JAWRAHurstKolmogorovDynamicsPP.pdf

    https://watertechbyrie.files.wordpress.com/2014/06/moys-20023.png

    Sensible hydrologists assume stationarity – that is that there are a limited number of deterministic processes that operate consistently – although with great variability – in the climate system. But the data is lacking in detail and deterministic prediction is impossible. The best that can be done – without massively over-engineering – is to build in safety margins and manage risk. Engineering as a profession routinely operates under uncertainty.

    Models – and a statistical scheme is a model – are applied to data. For less extreme events a skewed, exponential distribution is assumed. It works well enough. A subtle distinction is a stratified stochastic analysis – that takes into account 20 to 30 year climate shifts. Hurst applied a power law to Nilometer data – to explain departure from random variation. Didier Sornette applied a power law to extreme events – dragon kings – that occur at tipping points.

    The model I prefer is the deterministic, chaotic one. Climate means and variance shift every 20 to 30 years and add up to variability over millennia. With dragon kings at transitions. In practice similar to the Hurst model – but recognising the underlying complex and dynamic system origins of variability. Climate is clearly better described as an ergodic, chaotic system – but stationary over a sufficient period will do.
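
    For readers who want to quantify that distinction, here is a minimal sketch of estimating the Hurst exponent by the aggregated-variance method; under Hurst-Kolmogorov scaling the standard deviation of k-point averages falls off as k^(H-1), so white noise gives H ≈ 0.5 while persistent records such as the Nile minima famously give H near 0.9. The synthetic series below is an assumption for illustration:

```python
# Estimate the Hurst exponent H from the slope of log(std of k-point
# averages) against log(k). For white noise the slope is -0.5 (H = 0.5);
# persistent (Hurst-Kolmogorov) series decay more slowly (H > 0.5).
import numpy as np

def hurst_aggvar(x, scales=(1, 2, 4, 8, 16, 32, 64)):
    stds = []
    for k in scales:
        n = len(x) // k
        agg = x[: n * k].reshape(n, k).mean(axis=1)  # k-point block averages
        stds.append(agg.std(ddof=1))
    slope, _ = np.polyfit(np.log(scales), np.log(stds), 1)
    return slope + 1.0  # std ~ k**(H - 1)  =>  H = slope + 1

rng = np.random.default_rng(0)
print(f"white noise: H ≈ {hurst_aggvar(rng.normal(size=8192)):.2f}")  # ~0.5
```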

    “The final bill includes several of our priorities, including the creation of a High Hazard Dam Rehabilitation program…

    While Congress has vowed to pass a water resources bill every two years (the last one passed in 2014, but before that 2007) the authorization components of the bill still must be funded through annual appropriations. The 114th Congress concluded by passing a continuing resolution (CR) to keep the government open and running through April 2017. This sort of stopgap funding measure is not the type of major injections of infrastructure investment necessary to reduce the estimated $1.6 trillion infrastructure funding gap that is expected by 2020. We are hopeful that the 115th Congress will work diligently to fund infrastructure programs and increase federal appropriations to important programs.”

    http://www.infrastructurereportcard.org/a-big-wiin-for-water-resources/#p/dams/conditions-and-capacity

    I doubt that the problem was PMP estimates – but perhaps maintenance is an issue. But whatever the fallout – engineers stayed at their posts, managed damage from the immense forces in play and took responsible action to minimise public risk.

    • Robert I Ellison: Probable maximum precipitation is not statistical at all. It is based on the most extreme recorded rainfall events that are transformed taking into account physical factors according to a standardised methodology.

      The second sentence contradicts the first. Perhaps you have an idiosyncratic definition of “statistical” that excludes analysis of recorded data. I expect you meant to write something different.

      You prefer a deterministic model? That would be nice if anyone had one with known parameters and a track record of successful prediction. Meanwhile, ignoring the statistical analyses linked by David Hagen is foolish, as he showed.

      • Picking the largest rainfall on record is not statistics and deterministic chaos is very different to simple cause and effect. Everything is fully deterministic – including the ‘atmospheric rivers’ that transport moisture from oceanic hot-spots to drop it on land – but so complex and dynamic as to be seemingly random and unpredictable.

        Major spillways are designed for what is nominally a return period of in excess of 10,000 years. The design capacity of the Oroville spillway derived using the PMP methodology was not exceeded it seems.

        David’s statistics boil down to the Hurst-Kolmogorov power rule for Nile data. As seen in the Nilometer data, there is a clustering of floods and droughts at different times. The long record allows the data to be put into a more comprehensive theoretical framework.

        “Change is not synonymous to nonstationarity, since even an ideal stationary white noise process involves change, which however becomes less and less distinct as the time scale of viewing the process (e.g., time scale of averaging) increases. However, the climatic and all geophysical processes demonstrate more prominent change at large scales in comparison to white noise or even to typical stochastic models such as Markovian. This does not reflect nonstationarity. Rather it warns us to change our perception of natural processes as resembling these simple idealized mathematical processes and to move towards a new type of stochastic dynamics. The “new” description does not depart from the 60- to 70-year old pioneering works of Hurst on natural processes and of Kolmogorov on turbulence. Essentially, Hurst’s discovery in 1950 of the behaviour named after him and the model that had been proposed by Kolmogorov 10 years earlier recognize the multi-scale fluctuation of natural processes and describe it in stationary terms.”
        http://www.cwi.colostate.edu/nonstationarityworkshop/SpeakerInfo/Koutsoyiannis_Abstract.pdf

        The Nilometer data is fascinating. Rainfall in the Mediterranean Basin is influenced by ocean surface temperatures in the tropical Pacific and the north Atlantic. The variability in ocean surface temperature year to year, decade to decade, and century to century results in persistent regimes of droughts and floods. Because of the importance of Nile River flows to the Egyptian civilisation, water levels have been measured for 5,000 years and recorded for more than 1,300. The ‘Nilometer’ – known as al-Miqyas in Arabic – in Cairo dates back to the Arab conquest of Egypt. The Cairo Nilometer has an inner stilling well connected to the river and a central stone pillar on which levels were observed.

        There is a little more on Joseph and the Nile here.

        https://watertechbyrie.com/2015/06/06/obama-and-the-syrian-drought/

        I’d suggest it may be you – Matthew – who misunderstands David’s statistics. I suggest that David misunderstands hydrological design methodology.

      • David L. Hagen

        Robert I. Ellison Re: “Major spillways are designed for what is nominally a return period of in excess of 10,000 years.”
        viz: On the Relationship between the 10,000 Year Flood and Probable Maximum Flood

        Re: “The design capacity of the Oroville spillway derived using the PMP methodology was not exceeded it seems.” Why then did both main and emergency spillways fail though the PMP methodology was not exceeded?
        Note Richard Donnelly documents 10 dams where the spillways had to be upgraded by 100% to 369%.
        ICOLD 2015 Question 97 Spillways

      • Robert I Ellison: deterministic chaos is very different to simple cause and effect. Everything is fully deterministic – including the ‘atmospheric rivers’ that transport moisture from oceanic hot-spots to drop it on land – but so complex and dynamic as to be seemingly random and unpredictable.

        You still do not have an actual relevant model that is as informative as the statistics.

        Picking the largest rainfall on record is not statistics

        It is not all of statistics, but the study of extremes is a subset of statistics. Ignoring the evidence of extremes is foolish.

      • Robert I Ellison: I’d suggest it may be you

        Sure. But what you wrote does not establish that. What you wrote is foolish.

      • Estimating a 10,000 year flood with a log-Pearson extrapolation of 100 years of data is the most absurd procedure I have ever heard of. I did say nominally – and it is based on PMP procedures. Let me say it just once more: the flows did not exceed spillway design flows, which are considerably higher than the ARkStorm flows. The hydrology is not the problem.

        I am an engineer and hydrologist which should be obvious to anyone but the numerically challenged. You make some amateur errors and then make a great noise about how hydrologists have ignored Hurst-Kolmogorov dynamics for white noise this and Markovian that. I doubt that you understand any of it. It is complete nonsense and I am heartily bored with it.

        As for Matthew – the name of the theory is deterministic chaos, and it implies that deterministic prediction is impossible. And picking the highest number in a year of flows and then ranking them is something a 10 year old can do – defining it as statistics reveals his level of relevant expertise. That of a 10 year old.

      • Robert I Ellison: defining it as statistics reveals his level of relevant expertise.

        You have never studied the statistics of extremes, have you?

        And you have yet to propose a single alternative that is better than taking account of the 1862 rainfall.

        But what is your recommendation going forward? Ignore statistics and write about deterministic chaos? While you are doing that, the CO2 opponents in California are still supporting the “bullet train to nowhere”.

      • The statistics of picking the largest number in a year and then ranking them? I have certainly been trained in flood frequency analysis and it is not generally defined as statistics of extremes or anything else – but if Matthew wants to insist – by all means – he can call it what he wants. It is an utterly trivial point but repeatedly being called foolish by someone whose only points are repetitive and trivial is a waste of everyone’s time.

      • Did he finally get deterministic chaos? I doubt it. But now we return to the problem that it is the same behaviour that Hurst pondered over. Calling it deterministic chaos identifies the underlying mechanism that we know operates at all scales in the climate system – but that Hurst didn’t. Still a trivial point emerging from a complete misunderstanding of the term by Matthew – who suggests that I should contemplate a bullet train instead.

      • Robert I Ellison: . Calling it deterministic chaos identifies the underlying mechanism that we know operates at all scales in the climate system – but that Hurst didn’t. Still a trivial point emerging from a complete misunderstanding of the term by Matthew – who suggests that I should contemplate a bullet train instead.

        How you do go on!

        The “bullet train” is an actual California project, pushed by Gov. Jerry Brown, a serious misallocation of resources. The aim is to reduce future global warming by reducing Californians’ use of transportation fuel.

        Without an actual tested model, one with accurately estimated parameters, deterministic chaos provides less information relevant to planning for the future than does the history of extremes. If you knew the system was in its attracting set (or its “strange attractor”), the history of extremes would tell you something about the limits of the attractor. That’s more information than naming the system a deterministic chaos.

        The claim “calling it deterministic chaos identifies the underlying mechanism” is false: more is involved in identifying a system than merely classifying it broadly and naming it. That’s no more “identifying” than if you said it can be described by a high-dimensional dissipative system of nonlinear differential equations. While possibly true, it provides less information for planning than the series of extremes.

      • Yet the Nilometer data remains the same. Dynamical complexity explains both persistence and abrupt shifts in climate data – and demands that we change our expectations about future behaviour from slow and gradual to abrupt and potentially large.

        How you do go on.

      • Robert I Ellison: Dynamical complexity explains both persistence and abrupt shifts in climate data – and demands that we change our expectations about future behaviour from slow and gradual to abrupt and potentially large.

        That doesn’t help Californians. What should our expectations be about rainfall events? Will another 1862 event occur in less than 100 years? Less than 50 years? More than 200 years? How about the next 2017 event?

        How exactly is the dynamical complexity more informative than the history of extremes, without actual parameter values and tested predictions?

        I have to leave now. I think I have used up more than my allotted posts.

  19. I worked for California’s largest urban water agency for 20 years and conducted valuation studies of probable property damage loss in the event of breach of any of its dams or dikes; along with a statistical analysis of the probability of dam failure; for Lloyds of London insurance underwriters.
    Dams typically don’t fail at the dam itself but because an adjacent slope fails or a gate valve fails. Thus, the Banqiao Dam and Shimantan Dam failures may not be the most apt example. In my recent article posted at MasterResource.org titled “Denial is a River in California: Can Oroville Spark Dam Building” I propose that the 1963 breach of Vajont Dam in Italy from a landslide may be the more apt example. Link:
    https://www.masterresource.org/water-policy/denial-oroville-dam-building/

  20. As a student at the National Technical University of Athens, I was very interested to see the work of my professor, Demetris Koutsoyiannis, mentioned in this article.
    Also, as I am currently studying climate persistence and the causes of the Oroville Dam spillway failure for my thesis, this article could not have come at a better time, and will help me greatly in my upcoming research. Thank you very much.

    • David L. Hagen

      Aristotelis Koskinas
      We look forward to your thesis and findings, which promise to provide important insights. I encourage you particularly to focus on the differences between climate persistence and traditional hydrologic distributions such as the log Pearson Type III, combined with understanding how the Stadium Wave fits in.
      Can we redesign/upgrade dams to safely accommodate the massive flows from an ARkStorm and California’s megafloods? Can we predict cumulative precipitation far enough ahead to adequately lower reservoir levels to accommodate those deluges? How much do spillway flows need to be increased and/or auxiliary spillways provided for such flows?
      Oroville engineers appear to have been concerned about the main spillway eroding back upstream towards the main dam – so they risked using the untested “emergency spillway” – but that was worse. AND it left the Oroville reservoir full near the height of the precipitation season, instead of at minimal levels for safety. Lots to study.
      My compliments to Prof. Koutsoyiannis and his major developments on HK dynamics in the face of “persistent” opposition over orthodox climate positions.
      PS I encourage exploring the posts at MetaBunk.org for interesting technical details, but then comparing/discussing with dam expert Steve Cahill etc.

      • Thank you for recommending the MetaBunk.org forum. The Oroville Dam Spillway Failure thread contains a wealth of technical information which will surely prove invaluable for my thesis. Especially helpful are the questions you raised concerning improving dam safety. I do indeed have a lot of interesting study ahead of me.

      • David L. Hagen

        Aristotelis Koskinas
        From MetaBunk.org, for Oroville Dam design and construction details see particularly the document:
        California State Water Project, Vol. III Storage Facilities, Chapter V. Oroville Dam and Lake Oroville p 63-140, Bulletin No. 200, November 1974 Dept of Water Resources, State of California
        https://ia800302.us.archive.org/3/items/zh9californiastatew2003calirich/zh9californiastatew2003calirich_bw.pdf
        e.g., See Spillway p 133: “In part of the emergency spillway, an additional 10 feet of excavation was required to reach acceptable foundation rock, resulting in considerable additional time for excavation and placement of the backfill concrete to subgrade.”

      • David L. Hagen

        Correction Scott Cahill

      • Aristotelis Koskinas, it is good to see the works of Demetris Koutsoyiannis mentioned in this article. I wonder, are you going to publish it here, as Demetris did with one of his works? I, and several others, had the chance to comment and ask questions then. It was a great experience. However you publish, I hope you will do a guest post here.

      • David L. Hagen

        Additionally from the link you provided, Spillway p.133:

        “The area below the emergency spillway was not cleared.”
        “The depth of overburden in the approach channel was deeper than estimated and the slopes had to be changed [..] to prevent sloughing.”
        “The slopes in the flood control outlet gate section proved to be of a lower quality rock than anticipated.”

        It seems that the warning signs were clearly there, but no additional precautions/safety measures were taken. Apparently boards did not approve funding for such measures, and it is definitely worrying if this lax approach has been applied to other dams with similar issues.

        jfpittman

        When it is complete, the diploma thesis will definitely be published online at the ITIA research team website here: http://www.itia.ntua.gr/en/documents/9/

        This webpage contains theses written by various students at the National Technical University of Athens over the years, focusing on topics such as Water Resources Management and Hydroclimatic Prognosis.

        Most of them are written in Greek. However, since my thesis directly concerns the Oroville Dam, I intend to also publish it in English.

        After that, I would be very interested in creating a guest post here as well, containing the most important findings of my research.

      • David L. Hagen

        Thanks Aristotelis. We look forward to your findings. For official communications see FERC.GOV on Oroville Dam Service Spillway (P-2100) e.g. the FERC Project No. 2100 – Oroville Emergency Recovery – Spillways, Independent Board of Consultants. Memorandum No. 1 with further details on the issues you raised. Note the clay and voids and water flows under the spillway!

      • Hello Aristotelis, yes I would be very interested in your guest post on this topic. Send me an email when it is ready curryja at yahoo.com

      • It is interesting to note that in California, Gov. Jerry “Moonbeam” Brown pushed legislative reform of the state’s environmental law (CEQA) to fast-track and streamline concentrated solar farm installations without regard for loss of bird life and the habitat the birds live on (e.g. insects), and more recently fast-tracked the installation of expensive batteries on the electric grid in record time, without the usual delays for environmental clearances (see “A Big Test for Big Batteries”, New York Times, Jan. 14, 2017). Yet apparently he did not have even a back-burner project to fix the open and notorious Oroville Dam spillway defects and deficiencies. Dams are life-line facilities, as is the power grid, and a dam failure can cause more loss of life and property damage than an atom bomb in some cases (Fukushima losses were mostly from water damage due to the tsunami, not radiation per se). California has forsaken public safety (the San Bruno gas pipeline explosion, constant blow-offs and breaks of the water distribution system in the City of Los Angeles, run-down highways in the interior parts of the state causing avoidable accidents and fatalities, Oroville) in favor of anything that creates a “green” power grid at breakneck speed.

      • David L. Hagen

        Thank you for the additional data! I’m looking at it now.

        Judith Curry

        Excellent! I will inform you when it is ready.

  21. Artificial lakes, which is basically what a dam creates, have things in common with natural lakes:
    1) they don’t last forever; over time they will silt up unless regularly dredged. 2) the water out-flow will undercut the dam, at a rate that comes down to the geology under the dam and the rate of flow.

  22. My forecast two years ago for California was rains starting to return late 2015, increasing through 2016, and too much by early 2017. And I predict a negative AO/NAO and El Nino bias through to around the next sunspot maximum, with California continuing a generally wetter phase until then.
    Solar wind geometry through the sunspot cycles is currently running higher around sunspot maxima and weaker between the sunspot maxima. So the next positive AO/NAO and La Nina regime will be around the next sunspot maximum, as occurred around the last two sunspot maxima. Here this is expressed with reference to the AMO, which is due a warm phase until the next sunspot maximum. This will move U.S. drought away from the southeast and into the Plains.

    https://www.linkedin.com/pulse/association-between-sunspot-cycles-amo-ulric-lyons?trk=pulse_spock-articles

  23. David Hagen, thank you for the essay.

    We’ll probably have an opportunity for a long update when the Sierra Mt. snowpack melts.

    My prediction is that Californians will continue to let the flood control/irrigation infrastructure continue to degrade. I don’t have a large network of liberal, middle, and conservative friends and associates, and I obviously can’t read everything, but my sense is that a majority of Californians remain more interested in trying to prevent future warming than in trying to prepare for the weather fluctuations that will persist. There are times when it would be nice to be wrong, and it would be nice if I were wrong about this.

    • oops. “Sierra Nevada mountain snowpack” reads better than Sierra Mt snowpack.

    • It is a political issue. Much easier to scare people about an imaginary threat and send money to cronies and gain control over resources than prioritize real problems and solve them.

      I think Obama clung to climate change/global warming because it was happening far in the future and no measure of success or failure was possible.

      It became “the greatest threat” versus Muslim terrorism because attacks can be seen, while projections of computer programs for 100 to 300 years can’t be evaluated for 100 to 300 years.

      CA won’t change till reality smacks us in the face many times.
      Scot

      • Reality: the unacceptable face of life, at least for some.

      • You might want to read why the condition of Oroville was not corrected even though ample funding was available. Google: Denial is a River in California: Can Oroville Spark New Dam Building?

        https://www.masterresource.org/water-policy/denial-oroville-dam-building/

      • Wayne

        Great illustration of how ignoring reality works for just so long but in the end reality has the last laugh. They do live in a different universe out there.

        Several years ago I was following what I thought were massive, insurmountable budget problems in California. And then they went poof, gone. Maybe they really didn’t disappear. It might have been just an illusion after using a lot of smoke and mirrors and duct tape.

      • David L. Hagen

        Wayne Lusvardi Thanks for linking to your excellent essay. Denial is a river in California

        Oroville is an apt metaphor and symbol for California government that continues to be in denial about its dysfunctional water and energy infrastructure policies and priorities. . . .Failure to fix the designed weaknesses of the spillways surely was “penny wise but pound foolish” given the enormity of the consequences of failure. Why California policy makers were in denial about this looming disaster- waiting-to-happen when they had ample resources to fix it will be elaborated upon later in this article.

  24. “Cavitation erosion severely damaged the Glen Canyon spillway in 1983. Aeration was added to correct that problem and then to Hoover and Blue Mesa dams (Falvey 1990). Why was the Oroville Dam not retrofitted with protective aeration to prevent erosion per best practice? ”

    The Glen Canyon spillways are tubes. Oroville is a chute, thoroughly aerated below the damage.

  25. David Wojick

    The Oroville situation was created by a simple error, which should not have happened. The emergency spillway involved creating a waterfall, which is unusual. The erodibility of the material hit by that fall should have been assessed, which we know how to do. Either this was not done or they got it wrong. There is no overarching issue here.

    As to whether we should be designing dams for ARKstorms and megafloods, the answer is no. There are simply too many highly unlikely extreme events to design for (including catastrophic climate change). We design for expected events.

    • David L. Hagen

      David Wojick – The ARkStorm and megafloods are very specific expected events to be designed for. You ignore the multiple failures of the main spillway, the emergency spillway and blockage of the hydropower flow etc.

      • David
        My name is Wayne Lusvardi and I worked in a non-engineering capacity for California’s largest urban water district for 20 years and conducted valuation estimates of probable property damage loss in the event of a breach of any of its dams or dikes, for both that agency and Lloyd’s of London insurance underwriters. I also conducted a study of the statistical probability of dam failure to accompany my valuation estimate.

        According to California officials, the Oroville Dam main spillway was designed to fail in the case of a peak event. There is no way to design any facility for, say, a 9.0 Richter Scale earthquake. This “fail safe” policy of engineering design was also adhered to in the infamous “O-rings” sealing joints on the U.S. Space Shuttle that failed in 1986, resulting in disaster. Domestic building standards for earthquake retrofit of old seismically unsafe structures in California only call for preventing the structure from falling on people, not for preventing the building from being broken and thereafter uninhabitable.

        Most dam failures are not of the dam but of the adjacent slopes or in some cases the gate valves. The underlying bedrock below the Oroville Main Spillway was evaluated prior to construction and deemed able to withstand a peak event without excessive cavitation. The Vajont Dam failure in Italy in 1963 is the more comparable event because it was caused by a landslide (see my article “Denial is a River in California” at MasterResource.org).

        All dams are designed with redundancies. The major problem of the Oroville near failure was the downslope of the emergency spillway, where the foundation below the spillway had to be buttressed with riprap rock to prevent cavitation and possible collapse of that structure.

        A complicating factor was that the tube that goes under the dam, called the penstock, was not operable even before the event; it could otherwise have helped draw down the water behind the dam. The water that flows through the penstock spins the turbines in the power house.

        Another complicating factor hardly mentioned was that the power lines coming out of the power house run parallel to the dam and the emergency spillway on the very downslope that was being washed out. The concrete bases of these transmission line towers could have been eroded causing even more damage even if the dam and spillways remained intact.

        Bottom line, all three ways to draw down the dam were compromised to some degree during the event – a Black Swan event.

      • David L. Hagen

        Wayne Lusvardi
        Thanks for bringing local expertise. I presume we can design/build up to a given risk level and then evacuate when events are proceeding above that.
        How did the current Oroville dam failures fit the probability models?
        I would be interested in your comments on the ARkStorm evaluation of 3x higher damage than the 7.8 magnitude earthquake study.
        We would welcome further documents detailing such risk modeling if publicly available.

  26. Here’s a link to the Jones and Ricketts paper:

    http://www.earth-syst-dynam-discuss.net/esd-2016-35/esd-2016-35.pdf

    –e.g.,

    The history and philosophy of gradualisms and trend analysis as its key tool for understanding how the world works has its origins in the scientific enlightenment and since then has defined H1 as the dominant paradigm of climate change (Jones, 2015b). This has been reinforced by the success of methods for long-term trend analysis (Jones, 2015a). Phenomena that do not fit this model are labelled as noise and considered to be random.

  27. Prevailing winds pick up moisture over oceans and transport it to land. Nothing mysterious or novel.

    https://watertechbyrie.files.wordpress.com/2017/03/usgs-ar.jpg

    The USGS created a synthetic storm from two relatively recent storms – and stalled it over California. The runoff was then compared to a flood frequency analysis. Typically, the highest flow in each year is chosen and then the floods are ranked. If there are 100 years of data this automatically gives the 1 in 100 year flood. The floods were then fitted to a log Pearson distribution with a regional skewness factor to extrapolate for less frequent events – and as I said compared to the ARkStorm. Pretty standard hydrology. It assumes the recent past is representative of the near future – but persistence suggests that’s a reasonable assumption. Less reasonable the further out we go.
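
    As a concrete illustration of that procedure, here is a minimal sketch, with invented flows standing in for a real gauge record and scipy’s Pearson III fitted to log flows:

```python
# A sketch of standard flood frequency analysis: take annual peak flows,
# rank them, and fit a log-Pearson Type III distribution to extrapolate
# beyond the record. The flows below are invented for illustration.
import numpy as np
from scipy.stats import pearson3

rng = np.random.default_rng(42)
annual_max_cfs = 30_000 * rng.lognormal(mean=0.0, sigma=0.6, size=100)

# Ranking: with the Weibull plotting position m/(n+1), the largest of
# 100 years of data sits at roughly the 1-in-100-year flood.
n = len(annual_max_cfs)
ranked = np.sort(annual_max_cfs)[::-1]
print(f"largest of {n} years (~1-in-{n + 1}-year flood): {ranked[0]:,.0f} cfs")

# Log-Pearson III: fit a Pearson Type III distribution to log10 flows,
# then extrapolate -- the T-year flood is the (1 - 1/T) quantile.
skew, loc, scale = pearson3.fit(np.log10(annual_max_cfs))
for T in (100, 500, 1000):
    q = 10 ** pearson3.ppf(1 - 1 / T, skew, loc=loc, scale=scale)
    print(f"{T:>4}-year flood estimate: {q:,.0f} cfs")
```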

    The result is a runoff map showing varying runoff intensity – up to a 1000 year average return interval.

    https://watertechbyrie.files.wordpress.com/2017/03/usga-ari.jpg

    As I explained above – major dams are designed by a different method of estimating probable maximum precipitation from recorded high intensity storm precipitation transformed by local and regional factors including moisture availability. The end product is nominally a storm with an average return interval of greater than 10,000 years. If designed, constructed and operated satisfactorily – the Oroville dam should survive the ARkStorm.

    • David L. Hagen

      Robert I. Ellison Then why the spectacular failures we see on the Oroville main and emergency spillways at far below the PMF? Being designed for and actually withstanding such flows are very different issues.

      • The flows seem less than the design capacity – as far as Wikipedia can be relied on. There was a quote above from the ASCE infrastructure report card on new funding allocations for major dam rehabilitation. That may explain it. They talk about a $2.6 trillion shortfall for water infrastructure.

        But the hydrology doesn’t seem to be the problem.

      • David L. Hagen

        Robert I. Ellison If we have such spectacular failures below hydrology design limits, how will the dam manage the ARkStorm and megafloods, which are clearly ABOVE the 500 year design used?

      • You have misunderstood. More frequent events – 500 or 100 year events for instance – may be used to show flood inundation downstream or to design flood levees or other infrastructure. Housing would be above the 100 year level for instance – emergency services like hospitals above the 500 year level. But the dam is not designed to a 500 year event.

        https://watertechbyrie.files.wordpress.com/2017/03/pmf-oroville.jpg

      • David L. Hagen

        Robert I. Ellison Re: “But the dam is not designed to a 500 year event.” Reality Check: Please read William E. Riebsame, pages 85-86, ADJUSTING WATER RESOURCES MANAGEMENT TO CLIMATE CHANGE:

        4.2.3. Flood Control Sensitivity to Climate Fluctuation . . . Simple comparison of flood control requirements for Folsom and Oroville dams indicates large differences in climate sensitivity. The most striking contrast is the estimated return periods of their respective RDF’s – a direct indicator of reliability. Both design floods had estimated recurrence intervals of roughly 500 years when the dams were designed. Subsequent flood events have, however, resulted in reduced RDF expected return intervals. Folsom Dam’s original RDF was based on the rainstorm of December, 1937, then the worst on record. Using daily runoff data through the late-1940s, hydrologists estimated that its return period was over 500 years. But, precipitation episodes in 1950 and 1955, while the dam was under construction, would have exceeded the RDF. When factored into updated hydrologic analyses in 1977, these events (and floods in 1964-65 which slightly exceeded the RDF) yielded a recurrence interval of roughly 120 years (Neal, 1986). On the other hand, designers of Oroville Dam, built in 1965, had benefit of the 1950s floods in their calculations, and enlarged its capacity accordingly. Its flood control capacity was not severely stressed until 1986. . . .
        4.2.4. . . . The reservoir tended to operate close to design flood standards more frequently than expected, and the new RDF return period of 120 years calculated in 1977 was a dramatic drop in apparent reliability.

      • “Flood control is typically tied to thresholds such as the 100-year event (a flood expected to occur, on average, once a century) in rural areas, or to the 200-year flood in heavily developed basins. In flood control systems based on reservoir storage, this event is referred to as the reservoir design flood (RDF). In flood plain management it is called the ‘base flood’, and zoning decisions are tied to it. A ‘maximum probable flood’ (MPF), usually a hypothetical event with a recurrence interval of 1000 years or more, may also be calculated to design critical control works like spillways, whose failure would be catastrophic. The design-flood return period is essentially an estimate of system reliability, because failure should not occur during more frequent (less severe) events.”

        The RDF in the licence is down to 200 years from 500 after 1997 – it means various flow parameters downstream can be met in a 200 year storm. It seems to approximate the attenuation of flows downstream to 300,000 cfs.

        The PMF isn’t typically given a return period in technical documentation. But it is used to design critical structures on major dams. The PMF is 700,000 cfs.

        I really can’t be bothered getting across all the details of flood attenuation in a particular scheme – but it is clear that spillways of major dams are not designed to a return period – even the 1000 year or more suggested by your obsolete reference on Oroville and global warming.
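
        A useful companion to any quoted return period is the encounter probability: the chance of seeing at least one T-year event over an n-year horizon, 1 - (1 - 1/T)^n, assuming independent years. A minimal sketch:

```python
# Chance of at least one T-year event in an n-year window, assuming
# independent years (persistence/clustering would violate this).
def encounter_probability(T: float, n: float) -> float:
    return 1.0 - (1.0 - 1.0 / T) ** n

for T in (100, 200, 500, 10_000):
    p = encounter_probability(T, 50)
    print(f"{T:>6}-year event over a 50-year design life: {100 * p:4.1f}%")
```

        Note that the independence assumption is exactly what the persistence argument in this thread challenges; clustering makes some windows far riskier than these averages suggest.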

    • David L. Hagen

      Robert I Ellison
      State requirements for spillways vary widely, from 100 year to 3,000 year events to the full Probable Maximum Precipitation (PMP) or Probable Maximum Flood (PMF).
      FEMA publishes: Summary of Existing Guidelines for Hydrologic Safety of Dams
      Ch 9 Summary of Current State Hydrologic Design Guidelines July 2012
      Note: Figure 9.5 Range of Spillway Design Flood Criteria for New High Hazard Dams

      • Note the Californian risk based guidelines.

      • David L. Hagen

        Robert I. Ellison Wonderful. But how was that actually implemented?

        “The maximum event is a storm derived from the Probable Maximum Precipitation and is equated with a TCW of 30.”

        Oroville Dam has a TCW of 36. PMP required! Yet the main spillway failed at some 10-15% of the peak spillway flow. The emergency spillway failed at 3% of its design flow. Even the 1.3% hydropower flow could not be used for lack of power lines and from being blocked by eroded spillway material. The marvelous California Spillway Risk Analysis was completely undermined by inadequate construction of both the main and emergency spillways, lack of aeration retrofit (though implemented on other dams), lack of maintenance, and fragile power line design.

      • The original Oroville PMF was 7000,000 cfs – that it cracked at 95,000 cfs has nothing to do with the hydrology. As I keep saying.

        https://watertechbyrie.files.wordpress.com/2017/03/oroville-outflow.jpg

      • That’s 700,000 not 7 million.

      • David L. Hagen

        Robert I. Ellison I am raising issues of persistence, where natural variations exceed common calculations, with consequent inadequate design, construction, maintenance, and operation.
        The original PMF study was updated by USACE in 1980.
        The Peak PMF inflow to Lake Oroville is 960,000 cfs.
        Table 12.7-1 Summary of the PMF Study by USACE 1980
        Drainage Area 3607 square miles.
        Sub-basins 18
        PMP 28.9 inches.
        Month of Storm: January-February
        Basis for PMP: Hydrometeorological Report No. 36
        Butt Valley Dam: Failed
        Snowmelt: 4.5 inches
        Peak Inflow: 960,000 cfs
        Total 8-day Volume 5,217,300 acre-feet.
        In 2003, the peak INFLOW was increased and the peak OUTFLOW set at 798,000 cfs due to changes in precipitation and flood modeling.

        The 2003 HEC-HMS model is recommended as an updated, calibrated model and the resulting PMF is recommended for use in subsequent operational studies for Lake Oroville. The PMF routing considering full operation of all spillway gates and the effect of non-operation of one and two spillway gates is under way at this time.

        12.8.2 Lake Oroville Storage Routing
        The inflow Hydrograph, natural PMF plus Butt Valley Dam-break Flood, was routed through Lake Oroville. The results of routing are shown in Figure 12.8-1 and listed below:
        Table 12.8-1 Oroville Dam PMF with Dynamic Routing (1983)
        PMP: 28.9 inches
        Snowmelt: 4.5 inches

        Peak Inflow: 1,167,000 cfs (Occurs at hour 40)
        Eight day Inflow Volume: 5,217,300 acre-feet
        Initial Elevation: 855 feet
        Maximum Reservoir Elevation: 921.4 feet (Occurs at hour 58-59)
        Peak Outflow 798,000 cfs (Occurs at Hour 58-59)

        SP-E4: FLOOD MANAGEMENT STUDY FINAL REPORT Oroville Facilities Relicensing FERC Project No. 2100
        http://www.water.ca.gov/orovillerelicensing/docs/wg_study_reports_and_docs/EO/SP-E4.pdf See: Page 1-2; 1-3; 12-13; 12-14

        The 2013 ARkStorm study came 33 years after the 1980 study and 10 years after the 2003 study. On top of that, the 1605 megastorm was ~50% larger than the ARkStorm – and one of 7 megafloods bigger than the ARkStorm in the last 1800 years.
        Thus I doubt that the Oroville spillway was designed for an updated Probable Maximum Precipitation or Probable Maximum Flood that included the ARkStorm, the 7 California megafloods, or a full Hurst-Kolmogorov persistence analysis. Of equal concern are the main and emergency spillway failures at far below the design peak outflow of 798,000 cfs.
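
        For readers unfamiliar with what “routing” an inflow hydrograph through Lake Oroville involves, here is a minimal level-pool routing sketch: a mass balance on the pool combined with a weir rating for the spillway. The surface area, weir coefficient, crest length, and hydrograph below are all invented; only the method mirrors the studies quoted above:

```python
# Level-pool (storage) routing of an inflow hydrograph through a reservoir
# with an uncontrolled weir spillway. All numbers are invented.
import numpy as np

dt = 3600.0                    # time step, s
AREA = 6.0e7                   # assumed lake surface area, m^2 (constant)
CW, CREST = 2.0, 500.0         # assumed weir coefficient and crest length

t = np.arange(0.0, 8 * 24 * 3600.0, dt)                # an 8-day event
peak, t_peak = 25_000.0, 40 * 3600.0                   # peak inflow at hour 40
inflow = peak * (t / t_peak) * np.exp(1 - t / t_peak)  # gamma-shaped hydrograph

head = 0.0                     # water level above the spillway crest, m
outflow = np.zeros_like(t)
for i, q_in in enumerate(inflow):
    q_out = CW * CREST * max(head, 0.0) ** 1.5         # weir rating curve
    head += (q_in - q_out) * dt / AREA                 # mass balance on the pool
    outflow[i] = q_out

print(f"peak inflow:  {inflow.max():>9,.0f} m^3/s")
print(f"peak outflow: {outflow.max():>9,.0f} m^3/s (attenuated and delayed)")
```

        The routed peak outflow comes out lower and later than the peak inflow, which is the attenuation that the quoted routing tables summarise.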

  28. Said above, will say again. This post and its comments raise a real engineering conundrum: to what level of observational risk do you engineer, at what cost? My permanent residency building (1998), directly on the ocean in Fort Lauderdale, was engineered to withstand a Cat 5 with 150 mph winds, thanks to Hurricane Andrew. The concrete/rebar footings are massive and frequent (~10 foot spacings) and set down into bedrock ~80 feet below the sand. All the shatterproof balcony glass is set in 1/4 inch thick heavy aluminum frames, stainless steel (salt corrosion), double bolted every 18 inches into reinforced concrete 2 inches thicker than before. (The window frame footings are 4 inches wide, 5 inches high, and 1/4 inch thick.) The lower garage is designed to be storm-surge flooded despite berms and dunes. Cheap? Hardly. Sufficient against another Hurricane Andrew – also no. We took a direct Wilma hit near the eyewall. Two tornados spun between the two towers on the 10 acres (we think). Repairs took 18 months. Several projecting end unit condos had their hurricane-proof glass shattered with total interior destruction. Wilma was a weak Cat 3 here. Was quite a night.
    No such thing as a perfectly safe design. Ditto Oroville, although it sure looks like some mistakes were made there concerning ‘bedrock’. Any amateur geologist knows greenschist is soft and erodible.

    • Wilma was pretty sporty.

    • I’d guess your footings are more for settlement than cyclones.
      But maintaining rigidity comes at a cost.

      I have a circa 1990’s timber house nominally designed to a category five cyclone. Corrugated steel roof, aluminium windows, everything timber – beach bohemian style. Everything tied back with threaded steel to poles and besser block footings set on emerging rock at the back.

      Marcia hit as a solid category 3 – it was forecast as a category 5. I imagine the house was fairly bendy but there was no damage when we returned. We taped up the windows and hid out at the local shelter.

      In general you are right – over engineering is horrendously expensive. It is quite often preferable to design to a specific risk and engineer in contingency planning. It seems likely that the Oroville evacuation was a contingency plan.

      • A black swan is defined as an event or occurrence that deviates beyond what is normally expected of a situation and is extremely difficult to predict. Dam engineers in California stated they had never experienced the confluence of such events (record rainfall, busted main spillway, blocked penstock through power station and threat of emergency spillway collapse).

        Environmental groups opposed the construction of new outlet gates at Oroville that would have allowed the dam to hold 10 feet more water.

      • Taped up the windows? To what end?

      • David L. Hagen

        Wayne Lusvardi
        Not having experienced it does NOT mean it is a “Black Swan” event – just that they did not take the effort to do a full fault tree analysis. The severe erosion of the emergency spillway from overflow was explicitly warned about by the Sierra Club et al. The consequent blockage of the hydropower was a simple extrapolation of the consequences of erosion. Upstream-trending erosion was the next implication, as the surface was not hardened. The challenge is whether the concrete ogee weir would have withstood downstream erosion or not. But by then there was not the time to evacuate to find out, in case it did fail.

      • as far as paying attention to the Sierra Club, does the story of the boy who cried wolf mean anything to you?

        The Sierra Club has very little credibility among most people.

      • David L. Hagen

        davidelang How about Steven N. Ward’s models?

      • David L. Hagen

        Robert I. Ellison The combined main and emergency spillway design flows appear to be 750,000 cfs. See Fig. 81 page 99 (OR Pdf 151/546) of 10/16/67.
        Consider:

        “It is important to recognize that during a rare event with the emergency spillway flowing at its design capacity, spillway operations would not affect reservoir control or endanger the dam,” wrote John Onderdonk, a senior civil engineer with FERC, in the Federal Energy Regulatory Commission’s San Francisco Office, in a July 27, 2006, memo to his managers.
        “The emergency spillway meets FERC’s engineering guidelines for an emergency spillway,” he added. “The guidelines specify that during a rare flood event, it is acceptable for the emergency spillway to sustain significant damage.”

        http://www.mercurynews.com/2017/02/…tate-officials-ignored-warnings-12-years-ago/

      • Yes David. Progressive rather than catastrophic failure is a recognised engineering technique.

        And roving – the tape is simply to hold the broken glass together. Might be marginally better than millions of shards tearing into everything.

        The only safe place in an intense cyclone is a purpose-designed community shelter, or inland.

      • No matter the probability or intensity of the USGS ARkStorm, it would have to drop precipitation (not snowpack, which melts more slowly) where the watersheds are for the 10 dams in Northern California shown on the map at the following link (the 2 dams in Southern California only receive water from the Northern California dams and should not be considered):

        http://www.sacbee.com/news/7jvjhh/picture133210924/binary/reservoir0216.JPG

        I visually plotted whether the hot spots for the ARkStorm are geographically close to the watersheds of the dams. There are 9 major backbone dams in Northern California and 5 of them would be roughly conterminous with the ARkStorm “hot spots” (shown in red and dark blue). But New Melones Dam is an “all-green” dam, only for fish water releases and flood control. Oroville and Shasta Dams are the largest, and both appear to be within a drop zone for the ARkStorm.

        The ARkStorm would conspicuously leave Lake Cachuma in Santa Barbara County, Lake Casitas in Ventura County, and Lake Castaic in Los Angeles County without much water. Lake Casitas is not connected to the State Water Project and depends totally on local rainfall.

        Such ARkStorms once formed a lake (called Tulare Lake, in Kern County in the southern part of the state) that was so big it bisected the state.

        There are 1,400 reservoirs in California, of which the Federal government owns 20 and the State of California 22. All of these reservoirs hold about 42 million acre-feet of water at capacity. 30% of water comes from groundwater in a normal year, but 60% in a dry year.

        In an average year California gets 194 maf (million acre-feet) of rainfall/snowpack, and only about 82 maf is captured within the labyrinth of water systems: of which 8.8 maf goes to cities, 34.3 maf to agriculture, and 39.4 maf to the environment. But in a WET YEAR (e.g. ARkStorm) cities get 7.7 maf, farms 27.7 maf, and the environment a whopping 62.1 maf (or 64% of the water pie). A WET YEAR gets 335.8 maf of water (think double-plus for an ARkStorm).

    • It’s not even a real storm – but if it were you couldn’t expect either the temporal or spatial pattern to be repeated in the next storm.

      Flows are routed through the watershed to a point where they are compared to a 1000 year flow calculated in a flood frequency analysis at that point.

      It is a storm that happens over about 12 days. It is assumed the catchment is saturated – so perhaps more likely in a wet year.

      These are just general principles. I know nothing about snow and I only know that it never rains in Southern California – but man it pours.

  29. If there was a lesson from this, it is not about 1000 year events, it is about weak links in the infrastructure system. Oroville was a weak link. These occur through human error, or negligence, or footdragging or lack of information. Extreme events, even 30-year events, challenge weak links, and more extreme events in a future climate challenge these more. It is only to be expected. Climate change is an issue for infrastructure that cannot be dismissed because these weak links exist and because climate change will affect the frequency of what has been a 30-year event until the 20th century. In some cases limits will be exceeded that were not even known about, for example urban drainage or energy and water supply, and this will be first seen in an extreme event. Planning can mostly adapt in advance, but that planning has to account for significant change and not a static 20th century climate.

    • Jim D, Oroville seems less like a weak link and more like a deliberate lack of planned maintenance and advancing-knowledge upgrades. Glen Canyon taught a lesson in the 1980s; Oroville did nothing. The spillway cracks in 2013 (?) prompted little. The auxiliary/emergency spillway issue raised in 2005 was resolved by lawyers, not engineers.
      I don’t think the 20th century climate was ‘passive’ or ‘static’ (both your words); remember the dust storms of Grapes of Wrath? I do think the climate model projections for the future are garbage, since they have also miserably failed so far this century. Please do provide ‘alt evidence’ if you dare.

      • That’s just it. Planning for the 1930s again is not sufficient, far from it. The background is already a degree warmer, with maybe a few degrees more this century, so the next of those events will be worse, and infrastructure needs to be better than that. That area is relying on the deep aquifers that are running out in a few decades. Are the farms even sustainable in a changing climate? Those are real questions. They need to plan for transferring water if they want to keep farming there. 20th-century thinking is fatal, and most people already realize this.

      • Oroville is just the most obvious example of the kinds of things that will continue to go wrong in the US and globally, and climate change will expose them. Droughts followed by floods is going to be the norm. Perhaps California has learned for this generation at least not to be too complacent about extreme events. But it looks like we need these reminders now and then.

      • “Droughts followed by floods is going to be the norm.”

        It seems that “droughts followed by floods” is the norm in most places – most especially CA and Oz. Certainly there is a great deal of evidence to suggest this is so. So your prediction is “more of the same”, yet you appear to believe this is somehow “new”.

      • Look, dams are designed for a partial breach in the event of a catastrophic event. There is no such thing as a bullet-proof dam, freeway, bridge, etc. The foundation at the base of the Oroville emergency spillway was buttressed with riprap rock to prevent cavitation, but my guess is that guniting or shotcreting the steep downslope of the emergency spillway wouldn’t work anyway in the event of a full dam breach. In other words, economics dictates that dams are built only up to public safety standards where harm can be minimized or limited. Reportedly, the Fukushima 9.0 quake was equivalent to 31,250 atom bombs. There is no way to design anything to withstand that. The problem at Oroville was a Black Swan event – where all 3 methods of drawing the reservoir down were inoperable or not desirable, and thus design redundancy was ineffective.

      • David L. Hagen

        Wayne Lusvardi
        I don’t think this is a “black swan” event. The lack of hardening on the emergency spillway was raised in 2004. Issues with the main spillway have caused earlier inspections. Both were preventable but not addressed. See Curry’s discussion: https://judithcurry.com/2011/05/02/anticipating-the-climate-black-swan/

      • kneel63, in that case I think the prediction must be for droughtier droughts and floodier floods.

      • Well, look. I worked for the largest urban water district in California. They were the agency the Sierra Club appealed to through the Federal Energy Regulatory Commission (FERC) to finance upgrades to the Oroville Dam emergency spillway. Several problems with this. First, the Metropolitan Water District of Southern California’s charter doesn’t allow it to finance or undertake flood control projects. Second, FERC had jurisdiction only over the hydropower station and releases of the portion of the reservoir water in reserve for flood control purposes; FERC had no jurisdiction over the state-owned dam. Next, there was ample funding available in 3 water bonds to undertake such a project (Prop. 1, Prop. 84 and Prop. 1-E), so funding wasn’t the issue. Probably the biggest hurdle though was engineering. Would guniting or shotcreting the steep downslope below the emergency spillway work at all, when the weight of the water might just crush the concrete lining of the slope? Just a minor release of water washed out a downslope asphalt road. However, a new auxiliary spillway was recently completed with mostly federal funds on Folsom Dam — see here: https://www.usbr.gov/mp/mpr-news/docs/factsheets/folsom-dam.pdf.

        Why didn’t the Sierra Club just lobby the state to use ample bond monies to fix the dam? Why did the Sierra Club think the state would not respond, so that it went to the federal level instead?

        For documentation that ample funds were available see here:

        Why Trump Should Not Fund Oroville Dam Fix
        https://www.masterresource.org/water-policy/trump-not-fund-oroville-dam-fix/

      • Re: David L. Hagen “I don’t think this is a “black swan” event. The lack of hardening on the emergency spillway was raised in 2004. Issues with the main spillway had prompted earlier inspections. Both were preventable but not addressed”

        That the problems at Oroville were foreseeable or fixable does not obviate that they were a Black Swan event. What made it an anomaly or outlier is that multiple events all happened simultaneously (early, heavy rains; blocked penstock; un-repaired main spillway; unreinforced foundation to the emergency spillway wall; transmission line towers on the downslope that could have been washed out). The main spillway is designed to fail, but to “fail safe”. Hardening the downslope of the emergency spillway may not have worked (the weight of the spilled water wiped out a road on the downslope and may just have wiped out any concrete hardening of the slope as well).

        A similar potential problem was present at Folsom Dam, a federal dam, and instead of hardening the slope they installed a new auxiliary spillway – see photo here – http://safcanews.org/folsom-dam-joint-federal-project/

        I’m making no excuses for not fixing the problems at Oroville but we need to understand the mindset and fail safe policy of the dam designers.

        I don’t know yet whether any silted-up upstream feeder dams contributed to the problem at Oroville. The highest probability of dam failure is from the surrounding slopes, which is what we see at Oroville. Those slopes are not man-made fill but natural bedrock covered with eroded material and vegetation.

        The real story is why Folsom, a federal dam, was recently retrofitted with an auxiliary spillway while Oroville, a state dam, was not, when there were ample state bond monies available. A likely answer is that California is so focused on environmental projects that redistribute wealth to various constituencies that it is no longer concerned about public safety (the San Bruno natural gas line explosion in 2010; the constant blow-ups of water pipes under the streets of Los Angeles; and the retrofitting of the San Onofre nuke plant into a peaker plant that could ramp up and down to back up intermittent green power, which failed – the plant was eventually shut down, resulting in the Southwest Blackout in 2011).

      • Jim D: “I think the prediction must be for droughtier droughts and floodier floods.”

        Sigh.
        I am not aware that we have approached the 1930’s “dust bowl”, or the 1860s CA floods, so clearly this must be referring to “… than if there was no global warming…” That’s highly speculative and completely unprovable as you well know. And since we currently seem incapable of predicting when droughts/floods will happen, or of what severity they will be when they do (in advance, that is), of what use are such “predictions”? I can’t see that they are of ANY use, but please advise if you think otherwise.

      • kneel63, feel free to deny that it is even possible that a warmer climate has a modified water cycle that leads to this situation. I just report what the scientists say, and you can say no way, and that is fine by me.

    • Climate change? What climate change? The entire southwestern United States is one gigantic hydraulic water system that was mainly built in the 1930’s to overcome… climate change! It’s called drought, and it is abnormal in many parts of the world but NORMAL in California. Dry years typically occur in succession over a 4-year interval, followed by a deluge in the fifth year. The recent California drought started in 2012 and, like clockwork, ended in 2017. California was once bisected by huge lakes in the southern Central Valley. All that climatologists have been able to do is forecast for about a decade (it’s called the Pacific Decadal Oscillation), which means every 10 years the Pacific Ocean temperature changes, causing weather changes. But even then forecasters can’t predict when it will actually occur or its magnitude.

      Jim Steele, former biologist for the State of California, conducted a study of temperature changes over the past 100 years in the state and could find no evidence of climate change. His conclusion: It’s not getting hotter, just less cold.

      Climate change is built into the raison d’etre of dams. If you want to do something about climate change build more dams.

      • This is only the beginning of the ramp. One degree is a little harder to notice than three or four. Maybe a few more forest fires, or a few fewer skiing months, and the Sierras getting generally less reliable as a water source. These things already figure into the planning there, and I think regular Californians have noticed these changes, which is why California is among the most forward-thinking states in terms of climate action.

      • The biggest cause of big forest fires in California is that natural forest fires have not been allowed to take place for decades. This has resulted in conditions where the fires that do start (for whatever reason) are much harder to knock down and much more likely to become major fires.

        This has nothing to do with ‘global warming’

      • Yes, the skeptics keep hoping that, and that the season isn’t getting any longer. I have heard that one before.

      • > Jim Steele, former biologist for the State of California, conducted a study of temperature changes over the past 100 years in the state and could find no evidence of climate change. His conclusion: It’s not getting hotter, just less cold.

        Like a boss, no doubt.

        Citation needed, BTW.

      • It is 20-to-30-year periods involving a cool (warm) PDO and increased frequency and intensity of La Niña (El Niño).

        https://earthobservatory.nasa.gov/IOTD/view.php?id=8703
        https://eoimages.gsfc.nasa.gov/images/imagerecords/8000/8703/sst_anomaly_AMSRE_2008105.jpg

        More salt in the Law Dome ice core indicates cool Pacific conditions. There was a warm Pacific peak last century that caused most 20th-century warming. A cooler Pacific this century seems likely.

        https://watertechbyrie.files.wordpress.com/2014/06/vance2012-antartica-law-dome-ice-core-salt-content.jpg

      • Here is the source
        Jim Steele, Landscapes and Cycles.
        http://landscapesandcycles.net

      • Come on, Wayne. Be a little more specific.

      • OK here you go.

        We all know that deserts were once rainforests and vice versa – witness the La Brea Tar Pits in Los Angeles.

        Link https://en.wikipedia.org/wiki/La_Brea_Tar_Pits

        And we know, in a regional view, that the Anasazi Indian Tribe died out at the urban village at Chaco Canyon, New Mexico in the late 1200’s, ostensibly due to prolonged drought. The Anasazi had flourished during a wet period, during which they captured the water and diverted it into irrigation ditches to live an agrarian life. But the Yurok Indians in the northwest still thrived.

        Link Chaco Canyon: https://www.learner.org/exhibits/collapse/chacocanyon.html

        Link Yuroks
        https://www.gilderlehrman.org/history-by-era/american-indians/resources/cultures-americas-1200-bc–ad-1600

        So we knew that, for civilization and a technological economy to work in the southwestern US, water resources – abundant at high elevations where people didn’t live, and arriving in short cycles of 4 years dry and 1 year wet, with a deluge every so often – could be developed to grow cities and economies.

        Historical water cycle in California –
        http://www.bytemuse.com/post/drought-historical-rainfall-california/

        And we knew in the 1930’s that droughts can sometimes be lengthy (10-15 years) but were mostly local or subregional, while other areas still had abundant water. So during the FDR New Deal era, large dam and hydro projects were built to mitigate these long/short, local/regional water cycles.

        So the American southwestern civilization is one huge hydraulic water conveyance system, plus local effluent treatment systems to maintain public health. And it was all built to alleviate or mitigate climate changes, not daily or seasonal weather changes.

        So the recent “drought” wasn’t a drought at all. It was a dry spell and a human-created water shortage. For if 4 years dry and 1 year wet is the typical cycle, then a drought would have to be longer than 4 years (presumably California water agencies were storing 4 years of water to withstand the 4 dry years). The recent Cal water shortage lasted from 2012 to 2015, 3 years, and in spots it persisted to the end of 2016. So technically it wasn’t a drought. Where drought was occurring was over on the Colorado River system, over the past decade or longer. But the Colorado River system had a water storage ratio of 4/1 while California had 0.5/1. The Colorado system could outlast dry years and actually had much more in backup storage that was gradually depleting, while California couldn’t withstand more than about a half-year dry spell.
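        A toy Python sketch of what those storage ratios buy you in dry years (the 4/1 and 0.5/1 ratios are the figures above; the normalization is illustrative only):

          # Storage ratio = water in storage divided by one year of demand,
          # so the ratio itself is the number of zero-inflow years covered.
          storage_ratios = {"Colorado River system": 4.0, "California": 0.5}
          for system, ratio in storage_ratios.items():
              print(f"{system}: ~{ratio:.1f} dry years of demand covered")
          # Colorado rides out the typical 4-year dry run; California covers ~6 months.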

        This prompted me to propose different definitions of drought than it being anything politicians wish it to be. What I came up with, somewhat humorously, is as follows (note: punta means peak, año means year, and the prefix a- means non, so apunta año means a non-peak water year).

        Judicial apunta: a court-ordered water shortfall.

        Regulatory apunta: a water shortfall caused by some regulatory agency but not the state Legislature.

        Legislative apunta: a water shortfall caused mainly by legislation (such as AB 32).

        Demographic apunta: a population-induced water shortfall.

        El Niño or La Niña apunta: a water shortfall from ocean water temperature in the Pacific Ocean.

        Ag apunta: a water shortfall caused mainly by farmers switching to permanent all-year-round crops.

        Lago apunta: a water shortfall caused by a lack of water storage reservoirs in watersheds.

        Junta apunta: the humorous sounding but serious term referring to a water shortfall caused by a small group militantly seizing power in a coup or junta (e.g., the Aral Sea in Uzbekistan — an inland sea intentionally shrunk to the status of a lake and wasteland resulting from Soviet central planning in the 1950’s).

        Junta punta: a term referring to the aggregation of water caused by centralized planning (e.g., when the Metropolitan Water District of Southern California built the Colorado River Aqueduct and spillage resulted in the creation of the Salton Sea in southeastern California).

        UNTA punta: a totally unserious term meaning an accidental spill of water caused by some bureaucratic agency (e.g., the United Nations Transitional Authority or the fictional University of North Tulare Authority).

      • > So the recent “drought” wasn’t a drought at all.

        Thy Wiki might need revision:

        The state’s reservoirs are not even big enough to hold 1 year of precipitation. On February 11, 2017 — less than 1 year after the worst drought in 1200 years[3][4][5][6][7][8] — 3 of the largest reservoirs were simultaneously dumping water into the ocean for flood control. Lake Oroville reached 101% of its design capacity for the first time in 48 years.[9] Combined outflow from lakes Shasta, Oroville, and Folsom was 370,260 acre-feet that day.[10] This water would have been worth $370M if it had been available for watering lawns.[11] Residents who ripped out their lawns less than 6 months ago watched in horror as all of this water spilled into the Pacific Ocean in 1 day.[12]

        https://en.wikipedia.org/wiki/Droughts_in_California

        A few scientific studies might also deserve due diligence, if Denizens are into denying the worst Californian drought in 1,200 years.

      • Re: “So the recent “drought” wasn’t a drought at all” – Willard

        Precisely. That ignoramus Donald Trump had it right when he stated that there was no drought in California.

        By the way, you are using $1,000 per acre-foot as the price for water, which is misleading. That unit price reflects the finished or developed price of potable water at the retail level, not the wholesale price of raw water. Raw ag water rates (not prices, for there is only a tiny market for water in California) vary wildly from, say, $25 to $150 per acre-foot. System water is a “public good” (it is not excludable and there are no rivals for its use); therefore, it is valued based on cost recovery. To charge water ratepayers more than cost at the retail level requires a vote of the ratepayers. However, water rates charged by private regulated water utilities are cost plus a rate of return approved by the Public Utilities Commission. How low the raw water rates are depends on whether the financing costs of the dams and conveyance system have been paid off. For example, it costs only on the order of $100 per acre-foot to convey raw water from the Colorado River to Southern California, because the Colorado River Aqueduct bonds were retired long ago. If that system had to be rebuilt, the cost would easily be $1,000 for RAW water. So, no, farmers in California do not get water subsidies because of low raw water rates (any more than homeowners with a 3% mortgage are getting a subsidy if mortgage rates climb to 8%).
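        To put numbers on the pricing dispute, here is a quick Python sketch using the figures from this exchange (370,260 acre-feet is the one-day outflow quoted from the Wiki above; the unit prices are the retail and raw-ag figures cited here):

          acre_feet = 370_260                 # one day's combined flood-control outflow
          prices = {"retail potable": 1000, "raw ag (low)": 25, "raw ag (high)": 150}
          for label, usd_per_af in prices.items():
              print(f"{label}: ${acre_feet * usd_per_af / 1e6:.0f}M")
          # retail potable: $370M; raw ag (low): $9M; raw ag (high): $56M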

      • > That ignoramus [teh Donald] had it right when he stated that there was no drought in California.

        Nice proof by assertion, Wayne.

        The words “So the recent “drought” wasn’t a drought at all” were your own, BTW.

        Here’s NG on California’s drought:

        John Nielsen-Gammon, professor of atmospheric sciences who also serves as Texas’ State Climatologist, says the current drought in California is so far comparable in many ways to the 2011 Texas drought, the worst one-year drought in the state’s history that caused more than $10 billion in damages and led to numerous wildfires and lake closings.

        “This is the third year of California’s drought and it is on pace to be as dry as Texas was in 2011,” Nielsen-Gammon, a California native who grew up in the San Francisco area, explains.

        https://www.sciencedaily.com/releases/2014/02/140210161236.htm

        Not sure you’d want to challenge NG with your proofs by assertion, but hey, you got 20 years of experience etc.

      • Re: Nice proof by assertion, Wayne. — Willard
        I can always tell those on the political Left. They project their own fallacies onto others. Look, I offered data, not empty assertions, that California had no drought.

        I will briefly repeat that data for readers: on average, four out of every 5 years are dry, and 1 year is either wet or a deluge. Therefore, it is the task of government to have water in storage to meet that 4-year dry interval (give or take 1 or 2 years). That storage can be in a surface reservoir, a groundwater basin, a water bank, water conservation credits along the Colorado River system, or carry-over water from the prior year.

        You relied on an appeal to authority, not data. I also offered links to authorities such as Bob Johnson, US Bureau of Reclamation water consultant and the US Dept. of Interior “Water Master” of the Lower Colorado River, now president of the Water Education Foundation – link here:
        http://www.watereducation.org/board-member/robert-w-johnson-president

        Drought is anything government and the media wants it to be in California but that doesn’t mean it is anything but a dry spell and a man-made water shortage.

      • > Look, I offered data, not empty assertions, that California had no drought.

        You offered no such thing, Wayne. You just handwaved around stuff. And we’re supposed to infer from that that California had no drought?

        Sure. That’s just enough for denial. To dispute what we can read in thy Wiki and its citations, or what NG himself says, you need better than that.

        Look. You can’t even cite Sierra Jim’s crap properly. You cite some random dude, yet you don’t quote him. You don’t even seem to realize who NG is. Heck, you can’t even recognize your own words, and you put them in my mouth.

        You must be new here.

      • Re: “You offered no such thing, Wayne…. You don’t even seem to realize who NG is”.

        Well, you must be losing the argument to resort to twisting and filibustering. I will let readers decide whether I offered data. You offered no data. You just cited an academic saying that a one-year hot spell in Texas was a drought and a 3-year dry spell in California was a drought. Wow, what sophisticated analysis that takes.

        It so happens that I wrote a couple of articles comparing how Texas and California were handling their droughts when I was a water policy analyst for the Pacific Research Institute. Anyone can read those articles at the links below:

        In Fighting Drought San Antonio Leaves LA in the Dust
        http://www.capoliticalreview.com/top-stories/in-fighting-drought-san-antonio-leaves-l-a-in-the-dust/

        Texas Anti-Drought Plan is SWIFT; CA’s is slow.
        http://calwatchdog.com/2013/08/02/tx-anti-drought-plan-is-swift-cas-is-slow/

        I have also been to Texas and investigated the San Antonio Water System’s proposed Vista Ridge Pipeline Project.

        You only have words to offer. Neither you nor NG has offered any data or analysis. I am not trying to refute NG or you, just offering an alternative way of defining “droughts”. The very word “drought” sells newspapers and web hits and manufactures public hysteria, and thus is a useful word for politicians. There is no god in the sky, or in some Blue GAIA, or some academic at a university, who has decreed definitively what a drought is. You and NG want it defined as any 1-year hot spell. I don’t, because it is up to government to manage such dry periods by storing as much water as possible during the expected 1-year deluge.

        By the way what is your full name?

      • > You offered no data.

        Neither did you, Wayne. The only difference is that, unlike you, I don’t pretend I did. Neither do I conflate data with analysis. And if I were disputing a definition, I would pay due diligence to it, something you still fail to do.

        My two citations provide enough evidence for what I’m claiming. NG, whom you brush aside as an “academic,” is the Texas State Climatologist. I suspect he’s in a good position to know when using the word “drought” is appropriate or not. Not only is NG the Texas State Climatologist, he’s the Bobby Orr of Climate Ball.

        Let Denizens do whatever they please.

      • > here is the data I relied on

        The data of Fresno, yes, and to do what exactly – to substantiate that the worst drought in 1200 years in California wasn’t a drought?

        Perhaps you are refuting the claim when you say “It’s no mystery that California is in a state of drought”?

        Can you spell cherrypick?

        And where’s the analysis?

        Heck, where’s your discussion of the concept of drought?

        Have you ever considered hiring a technical writer before submitting your posts, if you can call these posts?

      • Allow me to drill deeper into the definition of drought. A drought in Texas, to a meteorologist, is not the same as a drought in California, because a true drought in California is a lack of enough water in northern Cal reservoirs to meet human needs over the anticipated dry period. So Gammon’s simplistic definition does not fit California realities.

        You accused me of cherry-picking data, but you and your expert picked out only one year to define as drought, while I trended data from the last 100 years.

        The California Dept. of Water Resources does not necessarily consider the recent drought its worst. It was the most severe hot spell but not the worst drought:

        “California’s most significant historical droughts of statewide scope were those with the longest duration or driest hydrology – the six-year drought of 1929-34, the two-year drought of 1976-77, and the six-year event of 1987-92. Although the two-year event of 1976-77 was brief in duration, water year 1977 was the single driest year of observed statewide runoff and 1976 was also extremely dry. The state’s most recent drought was 2007-09, and it is briefly covered in this report to provide context for drought impacts under a recent institutional setting”.
        LINK
        http://www.water.ca.gov/waterconditions/docs/California_Signficant_Droughts_2015_small.pdf

      • Thanks, Wayne. That’s better.

        Here’s NG’s comparison of Texas and Cali’s droughts:

        http://www.chron.com/news/science-environment/article/A-M-researchers-compare-Texas-and-California-6231430.php

        Since California is the #1 agricultural state of the US of A, I’m not sure I’d prefer reservoir levels myself.

        In fact, I’d stick to simpler things, e.g.:

        https://www.ncdc.noaa.gov/temp-and-precip/drought/historical-palmers/psi/201003-201502

      • OK, now we’re resonating. Yes, the Palmer Drought Severity Index is a long-time standard for measuring drought. But much of California’s water is in basins deeper than what the Palmer Index can measure. There is no known precise measure of how much groundwater California has. By the way, here is the Palmer Drought Severity Index for California in historical perspective, which shows the recent drought was not as deep as the one in 1898. Here is the link:

        http://icons.wxug.com/hurricane/2014/pdsi-ca-dec-2013.png

        So much for the media notion that it was the worst California drought in 1,200 years.

      • Once again, I am not trying to get sucked into refuting or supporting John Gammon’s no-analysis data showing that 2011 was a record hot year in Texas. What Gammon wrote is self-evident. He called it a drought. I don’t. It was a hot spell.

        Here is some more data – California’s Water Balance Table, which was embedded in the articles I provided links to.

        http://www.water.ca.gov/swp/watersupply.cfm

        What Gammon defines as drought is the first of these:

        meteorological drought – lack of precipitation
        agricultural drought – lack of soil moisture
        hydrologic drought – reduced streamflow or groundwater levels
        reservoir (reserve) drought – lack of sufficient water in reservoirs to meet human needs through historical dry periods; a water deficiency

        The first three definitions don’t mean much in a technological society, where water first has to be made potable and conveyed to users, and users in turn rely on effluent removal and reconveyance to sewage treatment plants. Yes, if you lived on a barren island, meteorological drought would matter most. If you lived in California or Iowa in, say, 1850, agricultural drought would matter most. If you are a farmer in California’s Central Valley circa 2012–2017, then hydrological drought is more critical, especially in California, because during drought farmers shift from imported water to groundwater, thus bailing out coastal cities from dangerous water shortages.

        I worked in a water agency so I prefer the last definition.

      • The people in Texas sure thought it was a drought in 2011. This famously happened.
        https://en.wikipedia.org/wiki/Days_of_Prayer_for_Rain_in_the_State_of_Texas
        Their prayers didn’t work unfortunately. That governor is now Trump’s energy secretary, so maybe he has some novel solutions for climate change too.

      • Jim
        Texas takes 5 years less than California to build a major dam (if it isn’t killed by a green lawsuit), and in Democrat-controlled San Antonio they are building a privately financed pipeline that will bring water from other nearby counties, and they will pay those counties for their water (not take it). So whether Republican or Democrat, Texas seems to do better than California with water projects. There is something in the water in Texas that is different from California. The last thing Democrats in Texas want is to become like California Democrats, and they know it.

      • It helps to have a much higher annual rainfall too, and not to rely on rivers from the mountains. Texas just needs a few reservoirs and they are set.
        https://s-media-cache-ak0.pinimg.com/originals/f6/32/7c/f6327c0a1af4bcd4457bc9c1c17005f2.jpg

      • Yes, the reason that Texas does so much better at water conservation than California is that it is more decentralized, with a large number of agricultural and municipal water districts. But imagine if two-thirds of Texans lived in West Texas, where the map you posted shows it to be dry, and water somehow had to be imported from East Texas. That is the pickle California has been in historically.

        Even worse, most of the groundwater in the state is located in inland areas, but the big cities are all along the coast, where there is less groundwater (see groundwater map below).

        Due to geological formations, San Diego has no groundwater.

        And as you can plainly see, the desert areas of Southern California have abundant groundwater, contrary to the liberal book “Cadillac Desert” by journalist Marc Reisner. Reisner’s book is probably the first source that commenters on blogs, both liberal and conservative, refer to when they want to discuss water policies. But it is nothing more than a metaphor, out of touch with geophysical reality.

        https://mavensnotebook.com/wp-content/uploads/2015/06/MHall_UCDavisLawClass10-19_-2-2015_v2_Page_03.jpg

        Readers may be interested in reading my data-based related article, “Big Coastal Cities, Not Wealthy Cities and Farms, Have Drained Northern California Reservoirs” – link here:

        http://hydrowonk.com/blog/2015/05/28/big-coastal-cities-not-wealthy-cities-and-farms-have-drained-northern-california-reservoirs/

        Here’s a mind-blower for all the “Cadillac Desert” fans out there in waterworld: statistics indicate the City of San Francisco uses only a miserly 92 gallons of water per person per day. But it still imports 87,840 acre-feet of water (28.6 billion gallons) per year – more than all the cities in the Palm Springs area, even with their 120 golf courses. In San Francisco, they drink the Bay for dessert.

        Stay tuned for a mind-blowing follow-up post on how California had a structural drought, not a meteorological drought.

      • Wayne, thanks for your comments here. The Hydrowonk blog looks really interesting; I’m now following on twitter. Pls send me an email if you would like to post one of your essays at CE. The company also looks interesting (from the perspective of my company, Climate Forecast Applications Network).

    • Re: “That area is relying on the deep aquifers that are running out in a few decades.” – Jim D

      The major regional groundwater basins in California are NOT being depleted. That is bunkum. They withstood a very similar drought in 1977-78 and recharged. The National Academy of Sciences conducted a study of groundwater depletion in 2012, and only the Tulare Basin (in the south) was under threat of depletion – in another 390 years!

      Graph of Historical Groundwater Basin Depletion in California

      http://calwatchdog.com/wp-content/uploads/2014/04/Groundwater-chart-300×213.png

      The NAS study can be found at the following link:

      http://www.pnas.org/content/109/24/9320.short

      • To be clear, I was talking about the dustbowl area in response to a previous question about the 30’s.

  30. It’s not raining, what could go wrong? Looks solid to me.

    Maybe some changes in material design to stop the water from eating away at the concrete.

  31. Pingback: Will the Oroville Dam survive the ARkStorm? | privateclientweb

  32. Jones & Ricketts (Ibid.):

    Over the long term, this warming conforms to a complex trend that can be simplified as a monotonic curve, but the actual pathway is steplike… this rules out gradual warming, either in situ in the atmosphere or as gradual release from the ocean, in favour of a more abrupt process of storage and release. The precise mechanisms by which this occurs remain to be determined. This conclusion supports the substantive hypothesis H2 over H1, where the climate change and variability interact, rather than vary independently.

    In the field of climate science, we are apparently confronted with the reality that no matter how much we learn, the only certainty we come away with is that science is asking the wrong questions.

    • They’re not asking a wrong question.

      • Who bears the burden of proof?

        Should scientific skeptics of AGW be required to prove the negative of all of the claims of global warming alarmists?

        Should we abandon the scientific method and spend trillions of dollars without proof AGW is even a problem?

      • Wagathon
        Seems to be a lot of jellyfish these days, wonder why?
        “Then you’ve got climate warming, which just amps up jellyfish in unbelievable ways. Fractions of degree changes above normal water temperatures amp up their metabolism, they eat more and breed more and live longer — it’s astounding what a little bit of warming can do for jellyfish. Trawling gives them new room for their polyps to settle, and while acidification or chemical pollution doesn’t hurt jellyfish, it hurts everything else like fish and shellfish that struggle with environmental change.”
        http://www.popsci.com/built-for-survival-jellyfish-are-quickly-becoming-pests

        My comment:
        The last species to experience distress will be Homo sapiens, whose big brains will use technology to prevent their extinction and accelerate their evolution.
        As for the rest of the flora and fauna, we’ll try to protect the ones we need (including pets!) and the rest will just be gone. Not a good/bad outcome, but predictable collateral damage.

    • What wrong question did Jones17 ask?

      • David L. Hagen

        Jim D So that is evidence of “catastrophic majority anthropogenic warming”?
        You compare a low pass filter geological record with recent data?
        Then you presume 1 parameter out of hundreds dominates the model?
        Be serious:

        With four parameters I can fit an elephant, and with five I can make him wiggle his trunk.

        Attributed to von Neumann by Enrico Fermi, as quoted by Freeman Dyson in “A meeting with Enrico Fermi” in Nature 427 (22 January 2004) p. 297

      • Your reference to HK statistics was imprecise about what time scale you would use as a basis. The way I read it, global warming would have to exceed Ice Age deviations to register as significant by any statistical measure that includes them, which I don’t think you meant to say. Lovejoy (2014) has looked at centennial-scale statistics and finds the current perturbation to be significantly outside random variation, even allowing for fat-tailed distributions.
        http://www.physics.mcgill.ca/~gang/eprints/eprintLovejoy/neweprint/Anthro.climate.dynamics.13.3.14.pdf

      • David L. Hagen

        Jim D
        Will anywhere within a 9-order-of-magnitude range be sufficient?!
        For the tropical tropospheric temperature “anthropogenic signature”, global climate model predictions since 1979/81 are already ~300% too high relative to the satellite range. See Christy Feb. 2, 2016.
        With a period N of ~60 years for the Pacific Decadal Oscillation, comparing global PDO temperature oscillations will probably require records at least 2N long, etc.
        See the references I provided. e.g.
        Y. Markonis, and D. Koutsoyiannis, Climatic variability over time scales spanning nine orders of magnitude: Connecting Milankovitch cycles with Hurst–Kolmogorov dynamics, Surveys in Geophysics, 34 (2), 181–207, doi:10.1007/s10712-012-9208-9, 2013.
        http://www.itia.ntua.gr/en/docinfo/1297/ corrected preprint

      • DH, I was asking you to be more precise about your null hypothesis. Does the climate have to go outside the range of all previous climate before you regard it as a significant event? Your HK criteria suggest that, if you take them at face value. As for AGW, it explains the relation between CO2 and temperature seen in graphs like Lovejoy’s and this one, and the gradient is 2.3 C per doubling. This is just observations and theory, as is Lovejoy’s work. No models. The observations fit AGW with its central estimate of transient sensitivity, so you can credit AGW with a century of observational evidence.
        http://woodfortrees.org/plot/gistemp/from:1950/mean:12/plot/esrl-co2/scale:0.01/offset:-3.2

      • Op-ed, Steven? Is that what the alarmist argument has come down to? Mere opinion?

        Such a waste of science. We no longer need it! We just opine the answers!

      • David L. Hagen

        Jim D
        Waving a multi-millennial low-pass filter is a poor excuse for trying to hide the Medieval Warm Period, the Roman Warm Period and the Minoan Warm Period, etc. https://wattsupwiththat.files.wordpress.com/2013/03/gisp2-ice-core-temperatures.jpg

        FYI: Blackbody radiation proportional to T^4 is an overriding negative feedback that totally dominates all other “amplifications”.
        Try realistic physics that includes ALL the major feedbacks.

        PS By “stable” I mean that the global temperature has been alternating between warm interglacials and cold glacial periods – but it is stable in not going outside those ranges. Paleoclimate evidence shows we have the majority of the warm interglacial period behind us and are moving towards the end of the Holocene interglacial.
        So what are you willing to do to prevent rapid cooling into the next glaciation – rather than chanting the lemming mantra of global warming?

      • David L. Hagen

        Jim D
        Those advocating majority anthropogenic global warming bear the burden of proof to show statistically solid evidence of anthropogenic warming contributions substantially different from the HK statistics that track natural variations. To date, they have not seriously addressed HK statistics, let alone shown statistically sound, validated evidence of differences from them.

      • > Waving a multi-millennial low-pass filter is a poor excuse for trying to hide the Medieval Warm Period, the Roman Warm Period and the Minoan Warm Period, etc.

        What does a high MWP tell you about the odds of a low CS, DavidH?

        Asking for a friend.

      • Oh, and what if I told you that not everyone’s equally impressed by Kousto’s stuff, e.g.:

        http://www.climatedialogue.org/long-term-persistence-and-trend-significance/

        ?

        Why do you keep pushing statistical irrelevancies in a post that is supposed to be about a dam in Cali?

      • DLH, my question about HK statistics, which you still haven’t answered, was a quantification one. Does anything less than the size of an Ice Age register as significant in the way you use those methods? What size perturbation would register as significant? How about the speed of change being unprecedented – does that count in any way? What we have now, in paleo terms, is a step function of a degree in a century.
        As for cooling, the risk of an AMOC slowdown is increased by high emission rates, and that leads to a substantial cooling of European climate with, maybe, sea ice or icebergs reaching Scotland. So if cooling is a genuine concern to you, think about that.

      • DLH, some don’t count the Planck response as a feedback. Positive forcing leads to warming, and warming stops when the surface gets warm enough to counter the forcing. That warming itself is considered a negative feedback by some, but it is not the good sort of negative feedback that skeptics mean when they talk about negative feedbacks, and it is better to distinguish it as simply the response to the forcing.
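        A back-of-envelope Python sketch of that Planck response (T = 255 K, the usual effective emission temperature, is an assumed round number; this is illustrative physics, not a climate model):

          sigma = 5.67e-8               # Stefan-Boltzmann constant, W/m^2/K^4
          T = 255.0                     # assumed effective emission temperature of Earth, K
          planck = 4 * sigma * T**3     # derivative of sigma*T^4: extra emission per K of warming
          print(f"{planck:.2f} W/m^2 per K")   # -> 3.76
          # A ~3.7 W/m^2 forcing (one CO2 doubling) is thus offset by roughly 1 K of
          # warming before any other feedbacks act.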

    • David L. Hagen

      Wagathon See my article:

      I propose the following Climatic null hypothesis that:
      “Natural climatic variation is quantified by the stochastic uncertainty envelope of historical and paleo data, embodying the nonlinear chaotic interaction of atmospheric, oceanic, volcanic, solar, and galactic processes, including climate persistence quantified by Hurst-Kolmogorov dynamics.”

      Scientists proposing catastrophic majority anthropogenic global warming models (a.k.a. “Climate change”) bear the burden of proof of providing clear robust evidence supporting validated model predictions of anthropogenic warming with strong significant differences from this climatic null hypothesis.

      • The scientific hypothesis is more like the first two of Newton’s Laws: (1) the climate changes only when it is nudged, and (2) the change is proportional to the amount it is being nudged. The last 1000 years of data support this hypothesis, and suggest that increasing that nudging by a factor of a few more in the century would continue to support this hypothesis.
        https://static.skepticalscience.com/graphics/past_change_med.jpg

      • > The last 1000 years of data support this hypothesis, and suggest that increasing that nudging by a factor of a few more in the century would continue to support this hypothesis.

        Ok, what was the nudge that moved the world out of the medieval warm period into the little ice age, and what moved it out of the little ice age?

        We don’t have any idea what the temps were 1000 years ago; we barely have them for 100-150 years ago.

        To be making predictions with a claimed accuracy of 0.1 C based on this is just silly.

      • The MWP and LIA were relatively small compared to what we have now (see graph). Mostly the changes were solar and volcanic until after the LIA. Also note the downward trend from the Milankovitch precession effect, which explains why the LIA was about the coldest it has been in thousands of years. If someone is claiming a prediction of 0.1 C accuracy, that would be wrong, but the scale of change is of order one degree already, and we can see it is getting larger in line with AGW’s forcing. Skeptics call this just coincidence or ignore it altogether.

      • Jim

        This is a theoretical question, as that graph of yours does not reflect reality; but if it did, would you really want to live in a world one degree centigrade or more cooler than it was in the 1800’s and 1900’s?

        How would we keep warm? How would we grow enough food in such a cool climate? How would the trees grow well enough to produce nonsensical tree-ring readings?

        Tonyb

      • With Greenland and AMOC tipping points, pushing the climate could lead to a cold surprise for Europe along with fast-rising sea levels, so don’t get too comfortable with the rapidly changing climate we find ourselves in. I am for a stable climate.

      • David L. Hagen

        Jim D Climate is “highly stable” between -7 C and +4 C over the glacial record. https://upload.wikimedia.org/wikipedia/commons/f/f8/Ice_Age_Temperature.png If you want a “stable climate”, what do you propose to prevent the global climate from descending into the next glaciation?

      • We know from the last 10000-15000 years that deglaciation is an unstable process, subject to abrupt reversals and meltwater sea-level pulses. That is what we are pushing against with added forcing because there are still substantial glaciers left, and these are on the brink. Apart from that, if you really define an 11 degree fluctuation as “highly stable”, I would question your judgment. Mass extinctions need far less. What is relatively stable and favorable to the growth of civilization is the Holocene that has mostly been within half a degree of today’s temperature for 10000 years.

      • David L. Hagen

        Jim D By what remarkable evidence do you support your claim that the Holocene has “mostly been within half a degree of today’s temperature for 10,000 years”? Contrast a typical university lecture:

        By 5000 to 3000 BC average global temperatures reached their maximum level during the Holocene and were 1 to 2 degrees Celsius warmer than they are today. Climatologists call this period either the Climatic Optimum or the Holocene Optimum.

        Have you been working too hard to eliminate the Medieval Warm Period and ignoring the Little Ice Age? etc.

      • I just posted that.

      • ‘We have to get rid of the Medieval Warm Period.’

      • No, the MWP is clearly seen around 1000 AD. It was a blip by comparison with the Holocene Optimum, but the skeptics love it anyway, because they have no sense of the long view.
        http://www.realclimate.org/images//Marcott.png

      • You probably missed the worldwide drilling expedition that confirmed that the MWP was 1) worldwide and 2) as warm for as long as – indeed warmer than – the current warm period. A small blip it wasn’t. Neither were the other periods that were evident but not noted historically.
        So perhaps you can tell me: since CO2 lags temperature by 800 years, is the current rise in CO2 entirely from anthropogenic sources, or is it due to that warm phase 800 years ago? Perhaps you can tell me why the growth of CO2 has leveled off at 3 ppm since 1998? I just want to know how you can add 10 to 12 billion metric tons of anthropogenic CO2 in 2016 and get almost the same annual CO2 increase as in 1998? A one-off variation? Let me answer that for you: No.

      • Actually temperature lags CO2 in some clear examples. You see this from the way paleoclimate volcanic periods got warm after the volcanic events, not before, e.g. the Permian-Triassic transition. The volcanoes raised the CO2 level, and it got warm. As CO2 levels dropped due to mountain building, as in the Eocene, it got cooler. There are these aspects of paleoclimate where CO2 levels changed due to geological processes and the temperature followed, only now it is us instead of geology doing it. As for the CO2 growth rate leveling off, I don’t think so.
        http://woodfortrees.org/plot/gistemp/from:1950/mean:12/plot/esrl-co2/scale:0.01/offset:-3.2
        If only more skeptics knew more about paleoclimate we wouldn’t be having these arguments about cause and effect.

      • From 1998, when the annual increase was 2.93 ppm (2005 was adjusted from 2.52 ppm to 3.10 ppm after March 2015), the CO2 increase did not exceed 2.93 ppm until last year, and then only by a few hundredths of a ppm. You are wrong on that. CO2 levels also follow cosmic ray activity and the solar cycle, from peak CO2 to peak CO2.
        The year following 1998, the increase was 0.93 ppm. 1998 was the only year where the amount of CO2 that made its way into the atmosphere matched the amount produced less what was calculated to be sunk. Since 1998, there were 8 years where the annual CO2 increase did not exceed 2.0 ppm. For 2011, 7.5 billion metric tons was missing from the accounting. Every year is short. For at least the last 12 years, anthropogenic CO2 emissions increased by 1 billion metric tons over the year before.
        Anthropogenic CO2 emissions, according to the European agency, leveled off in Feb of 2015 at 38 BMT, later revised down to 36. NOAA is in agreement with those numbers. The sinks are continuing to grow – or they are the same as, or smaller than, 60 years ago. Consider that when 12 BMT was produced in 1965, the sinks took half, and the other half became CO2 in the atmosphere, at 0.93 ppm. Today the entire 12 BMT is being sunk, along with another 7 BMT. If you graph the CO2 ppm per year against the change in temperature, there is no doubt that CO2 is following temperature. There is an underlying warming trend. The ratio of natural to anthropogenic CO2 in the increase or decrease is unknown. As you just pointed out, the carbon cycle is not in balance. There are other factors as well. AGW is not my theory; I don’t have to prove what it was. AGW does. For it to be a valid theory, it has to explain why, in the absence of increased CO2, it became warm enough to grow grapes in England (and barely in Greenland) and similar plants in China. It has to prove that whatever was making it warm then is not happening now.
        I’m not arguing about instances of volcanoes. I am arguing that the ice cores were used to verify AGW, except that the ice cores showed CO2 lagging temperature by 800 years. Which AGW still fails to acknowledge.
        I don’t think so, I know so. I also know NOAA has adjusted the CO2 record at least twice since March of 2015, and correspondingly adjusted the temperature record. Not by much, but enough to throw the results into question. There is no reason to think that at some future date they won’t adjust the record for 2016 – which came in at a little over 3 ppm, versus what I calculated it should have been, at least 5 ppm. Adjusting the data to fit the calculated amounts is not science. And that’s what AGW is doing.

      • There is a realization that a warming ocean by itself outgasses CO2, at perhaps 10-15 ppm per deg C from chemistry considerations alone. This accounts for the rise of CO2 from 190 ppm in the Ice Ages to 280 ppm afterwards. It was not emissions but just equilibrium chemistry, and that is important to note. Likewise, even as the globe warms now, some of the contribution is from this chemistry effect of warmer oceans holding CO2 less efficiently – so maybe 10 ppm of the modern rise is contributed by the degree of warming, but the other 100+ ppm is from emissions. In warmer years such as 1998, this outgassing also increases, and that correlation is seen in the data, so you have to allow for it when comparing the increase to emissions.
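        A rough consistency check on those numbers in Python (the 10-15 ppm per deg C range is the figure asserted above; the rest is arithmetic):

          deglacial_rise = 280.0 - 190.0      # Ice Age to interglacial CO2 rise, ppm
          for ppm_per_degC in (10.0, 15.0):
              print(f"{deglacial_rise / ppm_per_degC:.0f} deg C of implied ocean warming")
          # -> 9 and 6 deg C, bracketing typical glacial-interglacial estimates, while
          # today's ~1 deg C of warming would contribute only ~10-15 ppm of the modern rise.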

      • I will repeat: last year should have been on the low side of 5 ppm, to a high of perhaps 8 ppm – it being a declining solar cycle, an El Niño, and the warmest year on record. The accounting for CO2 is not adding up.
        You haven’t addressed any of the issues with the sinks. This is another dance-around-the-issue type of response. Nor have you addressed the short-term lag, over the last 60 years, of CO2 behind temperature. It is beyond obvious. Nor the longer-term lag of 800 years, which we do have records of, no matter how much AGW proponents wish it would go away. And they keep trying … see the current issue in this blog of solar forcing. As if. Another dead-end discussion, as this one is; it is a common tactic among warmists to engage in endless futile debate that resolves nothing – it’s in the brochure from 2002 … I am well aware of the geologic past.
        And really, warmer oceans holding less CO2 – I agree with that; then how do you explain the sinks that are currently, by official standards, pulling out one and a half times more CO2 than all that was produced in 1965? Tell me, in addition to the acknowledged 17 BMT that was pulled out of the atmosphere in 2011, where is the other 7.5 BMT that isn’t in the atmosphere?

      • If you are suggesting no one expects sinks to take half of the CO2, I am not sure where you get that from. Do you want me to explain why I think there is a sink that takes half of the emissions? The rise rate is about 2.5 ppm per year, but we emit about 5 ppm per year. Natural sinks try to keep up but can’t. If we can slow emission growth below the previous exponential rate, nature could keep up better, and that sink fraction may exceed a half.

      • NOAA says 26% land, 24% oceans, 50% atmosphere. In fact not only are the sinks taking half – more is missing from the equation. And the sinks are growing when they should be shrinking. The sinks are not only keeping up; if all of the CO2 increase is anthropogenic, they are eating into it. Let me say it again: for 8 years since 1998, the annual increase was below 2.00 ppm. Another 4 years were barely over 2 ppm each. The CO2 increase in 2014 was 0.05 ppm above 1977’s, and 2013’s was 0.05 ppm below 1977’s. The year Pinatubo went off, the CO2 increase fell to 0.48 ppm. Let me put it another way: just in the 8 years where the CO2 increase was below 2.00 ppm, there is a total of 96 BMT that went somewhere, in addition to the 50% that had to have been sunk. It’s well over half already. I estimate the sinks are taking 67 to 80% of anthropogenic CO2 – if anthropogenic CO2 is the only cause of the increase. Last year the ppm increase in the atmosphere should have been 5 at the lowest estimate; it should actually have been between 6 and 7 ppm. It was 3.1 ppm.
        No year since 1998 should have been below 3.00 ppm. NOAA adjusted 2005 from 2.52 ppm to just over 3. I’m sure if I keep at it they will adjust them again.

      • Since 1998, the CO2 level has risen by 38 ppm and emissions have totalled just under 600 GtCO2. This works out to close to 50% of the emissions since 1998 remaining in the atmosphere.
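        The arithmetic behind that, as a Python sketch (7.8 GtCO2 per ppm is the standard conversion factor; the other figures are the ones above):

          ppm_rise = 38.0            # atmospheric CO2 rise since 1998, ppm
          emissions = 600.0          # cumulative emissions since 1998, GtCO2
          gtco2_per_ppm = 7.8        # ~2.13 GtC per ppm times 3.67 GtCO2 per GtC
          print(f"{ppm_rise * gtco2_per_ppm / emissions:.0%}")   # -> 49%: about half stayed airborne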

      • So Jim D, if the oceans release CO2 as they warm, and CO2 causes warming, why did the earth not spiral into an oven thousands or millions of years ago?

        What is there that causes this CO2 to go away? Or what else is there that limits temps?

        David Lang

      • The CO2 is a feedback to ocean warming, but not strong enough for a runaway effect. For a degree of warming, CO2 adds 10-15 ppm which may add another couple of tenths of a degree including its own water vapor feedback. This is a converging feedback because it adds less than the cause. It converges to 1/(1-f) where f=0.1-0.2.

      • Any positive feedback is enough for a runaway effect, unless there is negative feedback elsewhere.

        But we’re being told that there is no negative feedback.

      • No – mathematically, the series converges to an amplification factor, because each response to the other gets smaller as long as f is less than 1, and the infinite series converges to a finite number, 1/(1-f). There is a class of positive feedback that is amplification and not runaway, namely f between 0 and 1.

      • x * 1.00000(…)0001 repeated forever won’t converge on any number.

        If adding CO2 raises temps, and raising temps increases CO2 release, both factors are > 1, and you have a runaway.

        The only way you don’t have a runaway is if there is some other factor limiting one effect or the other.

      • Try it for yourself: the ocean warms by 1.0 C; CO2 outgasses and, due to GHG effects, makes that 1.2 C; CO2 responds to that extra 0.2 C with 0.04 C (20%), so now it is 1.24 C. The next cycle adds 20% of 0.04, which is 0.008, so we have 1.248, etc. It is an infinite series that converges mathematically to 1.0/(1-0.2), which is 1.25 C. However many cycles you go, you won’t exceed 1.25 C.
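        The same series in a few lines of Python, for anyone who wants to watch it converge (f = 0.2, as in the example):

          f = 0.2                     # each feedback cycle adds 20% of the previous increment
          total, increment = 0.0, 1.0
          for _ in range(50):         # far more cycles than convergence needs
              total += increment
              increment *= f
          print(round(total, 4))      # -> 1.25, i.e. 1/(1 - f); no runaway while 0 < f < 1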

      • David L. Hagen

        Jim D
        You still have not explained why you are working so hard to avoid warming, and thus to encourage a return from the current warm interglacial back down to full glaciation temperatures.
        Please study HK dynamics, with its specific H index for each type of climatic region/period – that is very different from random or Markovian variations.

      • DH, as I mentioned to tonyb who is also averse to cooling, there are tipping points ahead related to Greenland and the AMOC that cool Europe substantially and raise sea levels faster. These reversals of warming are not new, and pushing these glaciers with more forcing is not the way to go.

      • Long view, Jjim, lol as they say.
        Hockey stick mann, melder together
        of lotsa’ past data. Geologists, climate
        historians, ice core-scientists say ‘no.’ …
        And others with due scepticism of narrow context,
        a fer bristlecones ‘n such.

        http://wmbriggs.com/post/195/

      • Apology fer the stutter in yr name Jim, not Jjim.

      • As I said, the MWP is there, and skeptics love it. Was it the sun? We don’t know. Some would say so. Then what? They would have to say that perhaps forcing does affect climate after all, and finally we would all be on the same page.
        https://www.ncdc.noaa.gov/sites/default/files/Reconstructions-of-volcanic-forcing-and-total-solar-irradiance-from-proxies-v2.jpg

      • Jim D,

        Maybe you want a stable climate. I live in the tropics, and I’d like it a little cooler. I guess people in Siberia might want it a little warmer.

        Could you suggest action to make the world wide climate not too hot, not too cold, but just right?

        That way, people would not have to worry about going elsewhere on holiday, to nice sunny places. Just a reasonable question.

        Cheers.

      • “Since irradiance variations are apparently minimal, changes in the Earth’s climate that seem to be associated with changes in the level of solar activity—the Maunder Minimum and the Little Ice age for example—would then seem to be due to terrestrial responses to more subtle changes in the Sun’s spectrum of radiative output. This leads naturally to a linkage with terrestrial reflectance, the second component of the net sunlight, as the carrier of the terrestrial amplification of the Sun’s varying output.” http://bbso.njit.edu/Research/EarthShine/literature/Goode_Palle_2007_JASTP.pdf

        Not even close Jimbo.

      • …but the MWP.

      • But… but… can’t be bothered.

      • jimd

        you said

        ‘….but the skeptics love it anyway, because they have no sense of the long view.’

        You are being ironic aren’t you?

        tonyb

      • If I noted that the MWP was possibly solar forcing, what would the skeptics think of that? Yes, solar forcing was important? No, it can’t possibly be solar forcing? Yes, the sun was stronger then, but it was coincidental? I think skeptics would differ in their responses, but would be wary of allowing solar forcing to be important for the MWP.

      • > If I noted that the MWP was possibly solar forcing, what do the skeptics think of that?

        The skeptics would be happy that you admitted that human actions aren’t the only thing that affects the climate. They would then start asking for information about how much the sun changed, and whether we have any idea when it’s going to change again (hint: solar scientists are reporting that sun readings over the last few years are odd, and they don’t understand what’s happening).

        Then skeptics would ask how many of the climate models take the changes to the sun into consideration.

      • Solar effects are even visible today, and these are included in assessments of forcings. Back in the MWP, a solar increase could cause the couple of tenths of a degree of warming that is inferred from paleo records, as it could early in the 20th century too, when the sun strengthened – but that didn’t last till today. These episodes occur occasionally. Perhaps the MWP was an exceptionally long solar-active period, so it showed up in paleo records. Compared to these solar variations, on a W/m2 basis, the current anthropogenic CO2 forcing is much higher, which is why it has had so much more effect.

      • Mike Flynn, check with 350.org. They have materials that answer your question about what a just-right climate would be, and maybe you would want to join them.

  33. David Springer

    I live on the shore of a large impounded lake – far and very safely above the spillway elevation, in case you were wondering. That said, I still have stuff like boats and docks and assorted other things in the Corps of Engineers designated flood plain. There have been five “100-year” floods in the past 25 years. The designated 100-year flood plain elevation has been raised 3 times since the dam was built in 1930, the last time less than 10 years ago.

    I believe statistical determinations such as 100-year flood plain elevations are driven more by politics than climatology.
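    For what it’s worth, a short Python sketch of how improbable five “100-year” floods in 25 years would be if the 1% annual probability were actually right (a stationary binomial model is my simplifying assumption):

      from math import comb

      p, years = 0.01, 25                     # nominal annual probability, window length
      prob = sum(comb(years, k) * p**k * (1 - p)**(years - k)
                 for k in range(5, years + 1))
      print(f"{prob:.0e}")                    # -> 4e-06: extraordinary luck, or a mis-specified flood frequency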

    The best laid plans o’ mice an’ men gang aft agley but almost always so when those plans are laid by bureaucrats.

  34. The article would have been much more helpful without constant question marks. Learn from the IPCC and make statements. If uncertain, say how uncertain.

  35. As is true of many things, the discussion of climate change, professional and non-professional, is a computation of an equation with too many variables to obtain usable results. Climate changes are a myriad of cycles – some of a day, a month, or a year, some of ten thousand years. Drilling through the ice of Antarctica, we find fossils of ferns. At this point, mankind has records of climate that reach back 400 years, and of those only about 50 years of data are professionally accurate. To assume that one could project based on such limited history is optimistic at best.

    Dams will suffer extremes of rainfall. This will be a part of cyclic change and a part of random events and influences of many variables.

    Dams must be designed and maintained with prudent recognition of such events. I was on a panel where a man argued that for a Class 1 dam in a certain state, the probability of the design storm over a particular watershed was one in two thousand. Not very likely, he said. Someone chimed in: “But there are PMF storms.” I asked him how many Class 1 dams there were in that state, and he answered three thousand. I then pointed out that, by his math, the event would be encountered yearly.
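
    A quick check of that arithmetic, as a minimal sketch (the 3,000-dam count and 1-in-2,000 annual probability are the panel figures; the independence assumption, which a later comment disputes, only affects the second result):

    ```python
    # Expected design-storm exceedances per year across many dams.
    n_dams = 3000          # Class 1 dams in the state, per the panelist
    p_annual = 1 / 2000    # annual design-storm probability per dam

    # Linearity of expectation holds even without independence between watersheds.
    expected_per_year = n_dams * p_annual

    # The chance of at least one event somewhere DOES assume independence.
    p_at_least_one = 1 - (1 - p_annual) ** n_dams

    print(f"Expected events per year: {expected_per_year:.1f}")   # 1.5
    print(f"P(at least one per year): {p_at_least_one:.2f}")      # ~0.78
    ```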

    Calculating weather is like soothsaying. It is false science. It is the infrastructure design that we can control. Trying to calculate the incalculable will only serve to frustrate.

    • David L. Hagen

      Scott Cahill, thanks for your pragmatic expertise.
      How practical is it to design main and emergency spillways for an ARkStorm? Or for the seven larger California megafloods?
      Or at what reservoir level and predicted rainfall should we lay out evacuation procedures?

  36. Lake Henshaw, a Reservoir in San Diego County, California was built in 1923. The lake’s full capacity was lowered in 1978 from its original >251M m3 full capacity to 68M m3 out of earthquake concerns…

    • Lake Wohlford Dam in San Diego was strengthened in 2016-17 to prevent dam failure, and San Vicente Dam in San Diego was recently raised 117 feet to double its capacity.

  37. https://youtu.be/kQe0J5NLLT4 Oroville spillway grand re-opening.

  38. November 2016 – La Niña to bring more drought to California – Scientific American. Don’t you love it when they get it wrong.

    Instead a full blown La Niña Modoki evolved in the Pacific. A self sustaining upwelling in the central Pacific with twin Walker Cells pushing warm surface water both east and west. Moisture rises over warm zones in both the east and west and is transported over land masses in Hadley Cells and by Coriolis forces. Descending air over the cold water zone condenses what moisture remains into low level cloud.

    http://www.ospo.noaa.gov/data/sst/anomaly/2017/anomnight.1.30.2017.gif

    The Modoki is fading – as it does this time of year. What follows next depends on the winds – factoring in Ekman transport – driving the north and south Pacific Ocean gyres: clockwise in the north and anti-clockwise in the south. The gyres spin up with negative polar annular modes – high pressure at the poles pushes circumpolar winds and storms into lower latitudes. The annular modes are modulated by solar UV changes warming and cooling ozone in the upper atmosphere, through changes in the intensity of Polar Cell circulation.

    Thus small changes in solar activity produce large changes in the climate system that modulate the Earth energy budget – over millennia at least.

    https://watertechbyrie.files.wordpress.com/2014/06/moys-2002-2.png

    The proxy of Pacific conditions is based on the presence of more or less red sediment in a lake core. More sedimentation is associated with El Niño rainfall. It has continuous high resolution coverage over 12,000 years. It shows periods of high and low ENSO activity alternating with a period of about 2,000 years. There was a shift from La Niña dominance to El Niño dominance that was identified by Tsonis 2009 as a chaotic bifurcation – and is associated with the drying of the Sahel. There is a period around 3,500 years ago of high ENSO activity – in excess of a 200 red intensity – associated with the demise of the Minoan civilisation (Tsonis et al, 2010). In comparison – red intensity in the 1997/98 ENSO event was 99.

    El Niño intensity and frequency was at its highest level in a thousand years last century. This mechanism was responsible for most 20th century warming. A return to more normal La Niña conditions seems quite likely this century – but then – surprises seem quite likely as well.

    It is not as if science has missed this – just that, in some quarters, nonsensically simple physical hypotheses prevail over inferences derived from data.

  39. This is the USGS ARkStorm. It is a combination of two storms stalled in the model over California.

    https://watertechbyrie.files.wordpress.com/2017/03/usga-ari.jpg

    Over small areas there are 1000 year flows. Over the larger area the catchment flows are on average much less intense but the outlet flows on the larger catchment add up to a 1000 year flow. Flood frequencies are based on flood frequency analysis using a skewed log Pearson Type 3 distribution. Short intense bursts of rainfall will give you flash flooding in a small catchment – a big storm front with less intense average rainfall will give major flooding in large catchments.
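
    For readers who want the mechanics, here is a minimal log Pearson Type 3 sketch in Python; the annual-peak series is invented for illustration, and a real study would also weight the station skew against a regional skew:

    ```python
    import numpy as np
    from scipy import stats

    # Invented annual peak flows (cfs); a real analysis uses long gauged records.
    peaks = np.array([12000, 8500, 31000, 5400, 22000, 9800, 41000, 15000,
                      7600, 18500, 27000, 6300, 11200, 35500, 13800, 9100])

    # Log Pearson Type 3: fit a Pearson Type 3 distribution to log10 of the flows.
    logs = np.log10(peaks)
    mu, sd = logs.mean(), logs.std(ddof=1)
    g = stats.skew(logs, bias=False)

    for T in (100, 500, 1000):
        q = stats.pearson3.ppf(1 - 1 / T, g, loc=mu, scale=sd)  # log10 quantile
        print(f"{T:4d}-year flow ≈ {10 ** q:,.0f} cfs")
    ```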

    The ARkStorm combines both modes, following recorded events. It is a broad storm front with intense pockets of rain that is calculated to happen on average once every thousand years.

    As for 2000 year storms happening every year somewhere in the state – these can’t be treated as independent events – like a coin toss – and thus the probabilities do not add up to once per year somewhere.

    The nature of storms is that the most intense rainfall happens from relatively small storm cells – where the rain falls is the result of turbulent transport – a matter of chance. Where the most intense rainfall happens can be different in different storms of course.

    There is a concept called intensity, frequency and duration (IFD). The highest outflows for a specific frequency (say 1 in 100 years) occur in a storm with the same duration as the time taken for water to travel from the top of the catchment to the bottom – with a rainfall intensity of the specified frequency. Synthetic storms of various IFDs have been developed to facilitate the calculation of design rainfall. If you want to see what a 1000 year flow looks like – use a 1000 year synthetic storm of the right duration and route it through the catchment of interest, including any dams.

    The probable maximum flood is something very different and has nominally a 10,000 year return period.

    http://arr.ga.gov.au/__data/assets/pdf_file/0013/40603/ARR23_Nathan.pdf

    And yes – spillways of Class 1 dams are designed for this.

    Having to close the main spillway seems a bit unfortunate but nothing to do with the hydrology or the spectre of mega-floods.

    Finally – I might ask what you understand in this?

    ” We apply advanced, well-documented spectral methods to estimate the regularities in our data sets. SSA arises from an interest in connecting nonlinear dynamics with time series analysis [Broomhead and King, 1986]…

    The periodicities shared by the present analysis of Nile River records and global or North Atlantic modes of ocean or coupled ocean–atmosphere variability [Ghil and Vautard, 1991; Schlesinger and Ramankutty, 1994; Dijkstra and Ghil, 2004; Simonnet et al., submitted manuscript, 2004] do not demonstrate a causal relationship. Our results suggest, though, quite strongly that the climate of East Africa has been subject to influences from the North Atlantic, besides those already documented from the Tropical Pacific [Walker, 1910; Quinn, 1992; Kumar et al., 1999] and those of possibly astronomical origin [Hameed, 1984; De Putter et al., 1998; Mann et al., 1995; Vaquero et al., 2002]. Moreover, the fairly sharp shifts in the amplitude and period of the interannual and interdecadal modes over the last millennium-and-a-half support concerns about the possible effect of climate shifts in the not-so-distant future [Alley et al., 2003].” http://onlinelibrary.wiley.com/doi/10.1029/2004GL022156/full

    Perhaps there are other questions come to think of it. How do climate shifts factor into the null hypothesis and why am I repeatedly asked for a deterministic model of Hurst–Kolmogorov dynamic processes?

  40. Pingback: Weekly Climate and Energy News Roundup #263 | Watts Up With That?

  41. One clarification for those listening in on the conversation: farmers don’t use potable water; they use raw water. So in the link to the California Water Balance Table, the rainfall that falls in California is not the same quality as the water that is eventually allocated to cities, which is potable water.

    And to clarify, if you live in California you hear the drumbeat by academics that farmers use 80% of the water. Look, humans use 52% of the developed system water (cities and agriculture for food) and the environment uses 47% on average. You can’t split hairs and claim that farmers use 80% of the water; people eat food grown with water. Moreover, the official statistic is that agriculture uses 41% of the developed water in California on average, and 28% in a wet year.

  42. David

    Great research and write up.

    My “dam engineer” creds may be a bit rusty; my career changed unexpectedly in the mid 80s.

    In 1976 I was head of power and water scheduling for the Colorado River Storage Project in Montrose, Colorado. I was the first to computerize the management of the CRSP dams’ water releases.

    A couple things to your piece:
    – there wasn’t an acknowledgement in the designs and ops procedures prior to about 1980 that we needed to consider warning the downstream folks of what was happening. Thankfully this has mostly changed.
    – study the 1976 Colorado Big Thompson disaster. There are elements of your analysis of storms there.
    – study the Oklahoma Arkansas River floods of 1973, specifically Keystone Dam above Tulsa. A few inches more and releases would have been required to meet inflows, and the consequences could have been significant.
    – While with USAID in the UAE in 78 & 79, we saw evidence of the type of storms you describe in the wadis.

    The USBR had computer dam-failure models in the ’70s; these should be invaluable in evaluating failure of either the emergency spillway or the main one, as they considered the strength of the underlying material.

  43. Was the 2012-2015 California drought structural or meteorological? A US Bureau of Reclamation (BoR) study in 2008 seems to have answered that question in advance. The BoR study estimated that California had a supply-demand gap of 2.28 million acre-feet of water in a normal water year in 2008, four years before the onset of California’s supposed Big Drought. In other words, California reservoirs were in a structural (not cyclical) water deficit even before the 2012-2015 drought.

    The State Water Project delivers only 10 percent, and the Federal Central Valley Project only 23 percent, for a total of 33 percent of the state’s urban and agricultural water supplies (called “developed water”). The other 67 percent comes from local water systems and the Colorado River (link forthcoming).

    LINK to BoR study:
    https://www.usbr.gov/mp/cvp/docs/Water%20Supply%20and%20Yield%20Study.pdf

    But a “drought” is declared in California not when farmers have their water allocations cut but when cities have their water resources threatened and have to raid farms and desert water basins of their water. So in reality in California, a “drought” is only when Big Blue cities may be impacted.

  44. This article was a great insight into the Oroville Dam and the methods used in hydrology. For me, this article also serves as a reminder of the responsibilities engineers and scientists have toward society. The work done by Prof. Koutsoyiannis and the ITIA team is at the very least fascinating, and I feel very fortunate to be their student. As a small contribution to this conversation, I leave this article by DK, A Random Walk on Water (https://www.itia.ntua.gr/en/docinfo/923/). In it, DK shows that even if we know and understand fully the dynamics of a system, we cannot predict it in the long term via deterministic models, and thus design and decision-making cannot depend on such predictions.

  45. From the study linked to by George above, which I quote because it provides useful questions to apply in thinking about natural systems.

    “The following summarizing questions can represent the conclusions of this article:
    – Can natural processes be divided in deterministic and random components?
    – Are probabilistic approaches unnecessary in systems with known deterministic dynamics?
    – Is stochastics a collection of mathematical tools, unable to give physical explanations?
    – Are deterministic systems deterministically predictable in all time horizons?
    – Do stochastic predictions disregard deterministic dynamics in all time horizons?
    – Can uncertainty be eliminated (or radically reduced) by discovering a system’s deterministic dynamics?
    – Does positive autocorrelation (i.e. dependence) improve long term predictions?
    – Are deterministic predictions of climate possible?
    – Are the popular climate “predictions” or “projections” trustworthy and able to support decisions on water management, hydraulic engineering, or even “geoengineering” to control Earth’s climate?

    The most common answer to all these questions is “yes”.

    Hopefully, the above discourse explained why my answers to all of them are “no”.

    I would tend to agree with Koutsoyiannis and not with Matthew.

    This example shows that for long horizons the use of deterministic dynamics gives misleading results and a dangerous illusion of exactness. Unless a stochastic framework is used, neglecting deterministic dynamics in long-term prediction is preferable. In very complex systems, the same behaviour could emerge also in the smallest prediction horizons. This justifies, for example, the so-called ensemble forecasting in precipitation and flood prediction. In essence, it does not differ from this stochastic framework discussed, and is much more effective and reliable than a single deterministic forecast.

    All practical hydrology is based on data that is then transformed either by statistical distributions or by consideration of physical factors in the watershed. The latter builds on stochastic analysis with deterministic augmentation to calculate potential rainfall. There are globally utilised methodologies for this. Most hydrologists simply use a few standardised tools – along with relevant expert judgement. There is a rarefied domain of high level expertise where the tools are constantly being improved.

    Deterministic models of regional rainfall would seem impossible, certainly in the foreseeable future. It is a problem orders of magnitude greater than a climate model in which only a single global parameter matters.

    Finally, Lorenz’s theory of the atmosphere (and ocean) as a chaotic system raises fundamental, but unanswered questions about how much the uncertainties in climate-change projections can be reduced. In 1969, Lorenz [30] wrote: ‘Perhaps we can visualize the day when all of the relevant physical principles will be perfectly known. It may then still not be possible to express these principles as mathematical equations which can be solved by digital computers. We may believe, for example, that the motion of the unsaturated portion of the atmosphere is governed by the Navier–Stokes equations, but to use these equations properly we should have to describe each turbulent eddy—a task far beyond the capacity of the largest computer. We must therefore express the pertinent statistical properties of turbulent eddies as functions of the larger-scale motions. We do not yet know how to do this, nor have we proven that the desired functions exist’. Thirty years later, this problem remains unsolved, and may possibly be unsolvable. http://rsta.royalsocietypublishing.org/content/369/1956/4751
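
    Lorenz’s point is easy to reproduce with his own 1963 three-variable system. A minimal sketch of divergence from an unmeasurably small perturbation (an illustration of the principle, not of any particular climate model):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        """Lorenz (1963) convection equations."""
        x, y, z = s
        return [sigma * (y - x), x * (rho - y) - z, x * y - beta * z]

    t_eval = np.linspace(0, 40, 4001)
    base = solve_ivp(lorenz, (0, 40), [1.0, 1.0, 1.0],
                     t_eval=t_eval, rtol=1e-10, atol=1e-12)
    # Perturb one coordinate by far less than any measurement could resolve.
    pert = solve_ivp(lorenz, (0, 40), [1.0 + 1e-9, 1.0, 1.0],
                     t_eval=t_eval, rtol=1e-10, atol=1e-12)

    sep = np.linalg.norm(base.y - pert.y, axis=0)
    for t in (0, 10, 20, 30):
        print(f"t = {t:2d}: separation = {sep[np.searchsorted(t_eval, t)]:.2e}")
    # The 1e-9 difference grows by orders of magnitude until the trajectories
    # are unrelated; ensembles give an envelope of outcomes, not a single answer.
    ```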

    Climate models have the same problem as the hypothetical deterministic water model: the impossibility of defining a single deterministic solution. Perturbed physics models have thousands of starting points that are within the measurement error of initial conditions. Solutions diverge exponentially through time.

    https://watertechbyrie.files.wordpress.com/2017/02/ngeo1430-f1.jpg
    Rowlands et al. 2012

    It gives an envelope – an estimate of model imprecision – in which the real solution may fall. There is of course the additional problem of whether the processes and coupling in the model adequately reflect processes and coupling in the real system. The answer there would be no.

    The IPCC opportunistic ensemble is very different. It consists of single deterministic solutions from a number of models, each of which individually “gives misleading results and a dangerous illusion of exactness.” Together they converge on reality? That would be no as well.

  46. David

    I am raising issues of persistence where natural variations exceed common calculations, with consequent inadequate design, construction, maintenance, and operation.

    This is the question you asked.

    1. Are dams designed for millennial confluences of cyclones, weather fronts, and thunderstorms?

    I have answered that question several times now. The answer is that – internationally – major dam spillways are designed to a nominal 10,000 year storm.

    Koutsoyiannis discussed Nile River variability exceeding that expected from random variation – the latter represented as white noise or a Markov chain. You assumed wrongly that hydrologists used white noise or a Markov chain and thus underestimated long term variability.

    The original PMF study was updated by USACE in 1980. The Peak PMF inflow to Lake Oroville is 960,000 cfs.

    This is the graph from the 2005 licence review. The PMF was revised downward (to approximately the original figure) by the updated US HMR59 method in 1999. Inflow PMF is not relevant directly to spillway design. Inflow is not necessarily equal to outflow in dams. The applicable differential equation is that the change in storage is equal to inflow minus outflow. Flood attenuation is one of the purposes of the Oroville Dam.

    https://watertechbyrie.files.wordpress.com/2017/03/pmf-oroville.jpg
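
    As a minimal sketch of that storage equation, here is an invented triangular inflow hydrograph routed through a linear reservoir (O = S/K stands in for real stage-storage-outflow relations; none of the numbers are Oroville’s):

    ```python
    import numpy as np

    # Level-pool routing: dS/dt = I(t) - O(S), all values invented.
    dt = 600.0                                  # time step (s)
    t = np.arange(0.0, 5 * 86400.0, dt)         # five days
    inflow = np.interp(t, [0.0, 1.5 * 86400, 4.0 * 86400],
                       [5000.0, 300000.0, 5000.0])   # triangular hydrograph (cfs)

    K = 86400.0                                 # reservoir response time (s), invented
    storage = 0.0                               # surcharge storage (ft^3)
    outflow = np.zeros_like(t)
    for i in range(1, len(t)):
        out = storage / K                       # outflow follows stored volume
        storage += (inflow[i] - out) * dt       # explicit Euler step of dS/dt = I - O
        outflow[i] = out

    lag_h = (t[outflow.argmax()] - t[inflow.argmax()]) / 3600.0
    print(f"Peak inflow : {inflow.max():9,.0f} cfs")
    print(f"Peak outflow: {outflow.max():9,.0f} cfs (attenuated, lagged {lag_h:.0f} h)")
    ```

    The outflow peak is lower and later than the inflow peak, which is the attenuation referred to above.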

    On top of that, the 1605 megastorm was ~ 50% higher than the ARkStorm – and one of 7 megafloods bigger than the ARkStorm in the last 1800 years.

    Citation is needed. In general – from Santa Barbara channel sediment – ‘megastorms’ occur on average every 200 years – but can we get accurate flows from that?

    Looking in detail at the ARkStorm scenario – there are small areas where there are 1000 year flows – based on comparison with the log Pearson Type 3 method – and we presume that 1000 year flows – and perhaps greater – are possible anywhere in the watershed in a megastorm. The areal extent of the storm causes broadscale flooding that gives it the nature of a megastorm.

    Over 1800 years – somewhere in the watershed – you might get two 1000 year storms – three 500 year storms – nine 200 year storms. Or you might get three 100 year storms in a month. It is just probabilities. In general – the longer the period, the more rare and extreme the flood you might get.

    As for a “full Hurst-Kolmogorov analysis”.
    “Therefore such models are proven inadequate in stochastic hydrology, if the long-term persistence of hydrologic (and other geophysical) processes is to be modeled. This property, discovered by Hurst [1951], is related to the observed tendency of annual average streamflows to stay above or below their mean value for long periods.”
    http://onlinelibrary.wiley.com/doi/10.1029/2000WR900044/epdf

    How far above or below can only be determined where data points are available for stochastic analysis. Hurst-Kolmogorov dynamics is not a magic trick by which data can be conjured out of nothing.

    Persistence is a property of climate shifts. The mean and variance of the climate state shift abruptly, persist for a while, and then shift again. If there were long enough records, it could be observed directly and long-term extremes calculated using simple flood frequency analysis. Unfortunately, records are usually less than 100 years long, and recourse to both statistical dodges and estimates based on recorded data and physical properties is necessary in practical hydrology.
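
    To put rough numbers on that, a minimal sketch of the Hurst-Kolmogorov scaling of the standard deviation of an n-year mean, per Koutsoyiannis; H = 0.8 is illustrative only, not a fitted value for any record:

    ```python
    # Standard deviation of an n-year climatic mean:
    #   classical (white noise) statistics:  sigma / n**0.5
    #   Hurst-Kolmogorov persistence:        sigma / n**(1 - H)
    sigma = 1.0   # annual standard deviation, arbitrary units
    H = 0.8       # illustrative Hurst coefficient (assumed, not fitted)

    for n in (10, 30, 100):
        classical = sigma / n ** 0.5
        hk = sigma / n ** (1 - H)
        print(f"n = {n:3d} yr: classical {classical:.3f}, HK {hk:.3f}, "
              f"ratio {hk / classical:.1f}x")
    ```

    For a 30-year mean the ratio is already near 3x, which is why presuming white noise understates long-term variability.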

    • And yes I do realise the graph says inflow PMF. Sometimes the method changes – but a modelling assumption can give significant differences in outflows. In the ballpark is probably the best that can be expected.

  47. “The big thing this year has just been the volume of inflow that we’ve had to manage,” he said. “In just a 6-week period beginning in mid-January through the end of February, we passed through the lake the entire volume of the lake essentially. So 3.5 MAF is the maximum storage for Lake Oroville; we passed that complete volume through the combination of the spillway and the power plant over a 6-week period.”

    THE NUMBERS
    Inflow into Lake Oroville
    February was 540% of average
    Jan/Feb inflow was 4.4 million acre feet (MAF) (= average annual)

    • Previous high was 3.45 MAF in 1909
    • Recent highs: 3.19 MAF in 1986; 3.07 MAF in 1997

    – Released the entire volume of the lake (3.5 MAF) in a 6-week period beginning in mid-January

    https://mavensnotebook.com/2017/03/20/hydrology-and-state-water-project-operations-update/
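
    For scale, converting that statement into a sustained average flow (a back-of-envelope sketch using only the figures quoted above):

    ```python
    # Average release implied by passing 3.5 MAF through the dam in six weeks.
    ACRE_FT_TO_FT3 = 43560.0                 # cubic feet per acre-foot

    volume_ft3 = 3.5e6 * ACRE_FT_TO_FT3     # 3.5 million acre-feet
    seconds = 6 * 7 * 86400                 # six weeks

    print(f"Average flow: {volume_ft3 / seconds:,.0f} cfs")   # about 42,000 cfs
    ```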

  48. Looks like all the 1000 year flood return periods will have to be adjusted.

    http://www.nature.com/nclimate/journal/vaop/ncurrent/full/nclimate3239.html

  49. David L. Hagen

    Oroville Dam Update
    Damage, design flaws in Oroville Dam spillway point to lengthy repairs, consultants say

    The main spillway at Oroville Dam is riddled with design flaws and so badly damaged that an independent panel of experts hired by the state has concluded it’s probably impossible to repair the structure completely before the next rainy season begins in November. . . .
    Above the crater, consultants described design problems in the intact portion of the chute that are so “gross and obvious” they will have to take priority this year, said J. David Rogers, a dam expert from Missouri, who reviewed the report at The Sacramento Bee’s request. Rogers said the problems the consultants described were so egregious he was surprised the spillway didn’t fail decades ago. . . .
    “We will state what appeared clear to everyone … that it is imperative that the Emergency Spillway not receive additional flows and that a long-term mitigation and re-design plan begin now,” the panel wrote.
    The panel report also said that while touring the spillway, consultants spotted “extraordinarily large” amounts of water gushing out of drains designed to move water out from beneath the intact portion of the chute. The water was flowing even though the spillway’s gates were closed and it wasn’t raining, the consultants wrote, adding that they believed further investigation is needed.

    For details see reports on Oroville Dam Service Spillway (P-2100) at FERC.GOV on
    FERC Project No. 2100 – Oroville Emergency Recovery – Spillways, Independent Board of Consultants. Memorandum No. 1
    Note:

    Operate the reservoir to prevent spill over the Emergency Spillway. . . .
    MO – 1.1 In some areas of the foundation of the chute slab, compacted clay was used to fill depressions in the rock foundation. This calls into question whether the portions of the slab that appear undamaged by the failure should be replaced during the restoration.
    MO – 1.2 The drains appear to flow for some appreciable time after the gates are closed and no precipitation is occurring. The BOC believes this situation should be investigated.
    M1 – 1.3 It is imperative that the Emergency Spillway not receive additional flows and that a long-term mitigation and re-design plan begin now.

    Geophysicist Steven N. Ward has provided a new summary Youtube presentation on a full dam breach, a partial dam breach, and a partial emergency spill breach.

    • There are 57 piezometers located around the dam to measure subsurface water flows. I wonder what they show?

      • David L. Hagen

        Wayne Lusvardi
        A clue:

        On August 1, 1975, an earthquake of magnitude 5.7 occurred near Oroville Dam. Its focus was located 5.5 miles underground, 7.5 miles southwest of the dam along the previously unknown Cleveland Hill fault. Foreshocks began three days earlier, with 29 foreshocks within 5 hours before the main shock (DWR, Bulletin 203, 1977). A special investigation of the dam was prompted by the foreshocks on the morning of August 1. Engineers from the Division of Safety of Dams took measurements from the 57 hydraulic piezometers located in the dam; 36 measurements were completed before the main shock at 1:20 PM. After the shock, measurements were taken again for comparison. Immediately following the main quake, the piezometers’ nominal pressure increased 14 feet in the foundation, 50 feet in the core block, and 23 to 27 feet in the embankment. As significant as these increases are, none of the pressures exceeded values for previous high reservoir stages (DWR, Bulletin 203, 1977).

        Oroville Dam Tour Information

      • Yes, I am aware of the 1975 earthquake. But the recent visual inspection by a team of experts, reported in newspapers, found that water is flowing long after spilling from the dam has stopped, reportedly from under the main spillway. Piezometers might be able to tell if there is subsurface movement of water and at what rate.

  50. David L. Hagen

    From Drought to Normal in 3 weeks in California!
    Hydrology/SWP Operations Update – John Leahigh

  51. David L. Hagen

    Mr. David E. Capka, P.E. Director, Division of Dam Safety and Inspections
    Federal Energy Regulatory Commission, 888 First Street, N.E., Routing Code: PJ-123 Washington, D.C. 20426 Phn (202) 502-6314; Fax (202) 219-2731 email David.Capka@FERC.gov
    March 27, 2017
    Director Capka
    Re: Review of Design Probable Maximum Precipitation/Probable Maximum Flood for Oroville Dam (P-2100)
    Reevaluation: I recommend reevaluating the critical Design Probable Maximum Precipitation (PMP) and Probable Maximum Flood (PMF) for Oroville Dam in light of developments since the 2004 reauthorization. See attached review “Will the Oroville Dam Survive the ARkStorm?”, Climate Etc., March 17th.

    Record 2017 Precipitation: John Leahigh (2017), Principal Engineer for the California State Water Project, reported that record February inflows to Oroville Lake were an unprecedented 540% of average. The Northern Sierra 8-Station precipitation index was 211% of average from Oct. 1, 2016 to Mar. 10, 2017, and its snowpack 166% of average. Its Jan/Feb precipitation of 46” was 115% of the previous record. Oroville Lake’s Jan/Feb inflow of 4.4 Million Acre Feet (MAF) was 126% of the previous 1909 high of 3.48 MAF, and equal to the annual average inflow.
    Concurrent Events: Dowdy and Catto’s (2017) storm/extreme weather analysis found: “The highest risk of extreme precipitation and extreme wind speeds is found to be associated with a triple storm type characterized by concurrent cyclone, front and thunderstorm occurrences.” A similar event caused the 1976 Colorado Big Thompson flash flood (USGS 2006).
    Climate Persistence: Koutsoyiannis (2010) finds that standard deviations with climate persistence (Hurst Kolmogorov dynamics) are often double those of white noise presumed by climate models.
    ARkStorm 1000 year study: The United States Geological Survey, with California’s Dept. of Water Resources, modeled a 25 day ARkStorm – the “Atmospheric River 1000 [year] Storm” – likely to cause 350% of the damage of a 7.8 magnitude San Andreas fault ShakeOut earthquake (Porter et al. 2011).
    7 California MegaFloods: The USGS reports seven California megafloods over the last 1800 years larger than the ARkStorm. The 1605 megaflood was ~50% higher than the ARkStorm (Porter 2011).
    Dam Operation: Keystone Dam operation in the 1973 Oklahoma Arkansas River floods came within inches of forcing releases to match inflows with likely catastrophic results (Davis, 2017). Oroville Dam was controlled to overflow its emergency spillway Feb. 11th in 2017’s record rainfall season.
    Oroville Flood Models: UCSC geophysicist Steven N. Ward’s (2017) flow models from full and partial breaches of the Oroville Dam, and a partial emergency spillway breach, show floods at Oroville within 30 minutes, with the Feather River flood reaching 44 m, 22 m, or 20 m (144′, 72′, or 66′) above its bed.
    Oroville Dam Design: The major 1950 and 1955 rainfalls cut Folsom Dam’s Reservoir Design Flood (RDF) by 76% from 500 to 120 years. The Oroville Dam was rapidly resized before construction in 1965. Yet its 1968 Spillway Design PMF 720,000 cfs and 2.51 MAF capacity was severely stressed by the record 1986 flood. Including the 1969 and 1986 floods, the 2004 reauthorization increased the Peak Inflow from USACE 1980 design of 960,000 cfs to a 1983 design of 1,167,000 cfs with a Peak Outflow of 798,000 cfs. (Coe 2004)
    2017 Oroville Dam Redesign: In light of record precipitation, why did the 2017 Design Team REDUCE the Peak Outflow 19% to 646,000 cfs for the PMF? “Restore both spillways to pass the Probable Maximum Flood (PMF) flows without failing, and with damage below the Emergency Spillway to be expected. These include the following estimated flows: Gated Spillway peak design outflow of 277,000 cfs; Emergency Spillway peak design outflow of 369,000 cfs” (Craddock 2017)
    Increase PMF? With record precipitation, megaflood evidence, and studies warning of higher extreme weather, with the likelihood of confluence and climate persistence, should not the Oroville Dam Design Probable Maximum Flood and Spillway Peak Outflow parameters be increased rather than decreased?
    Yours sincerely
    David L. Hagen, PhD
    61485 County Road 13
    Goshen IN 46526-8713 USA
    Encl. Hagen, D. L. (2017) Will the Oroville Dam Survive the ARkStorm? Revised 27 March.
    cc Ted Craddock, Project Manager, Oroville Emergency Recovery – Spillways Executive Division, California Dept. Water Resources, PO Box 942836 Sacramento CA 94236-0001, Ted.Craddock@water.ca.gov

    William Croyle, Acting Director, California Dept. of Water Resources P.O. Box 942836 Sacramento, California, 94236-0001 c/o Janiene.Friend@water.ca.gov

    Mr. David Panec, Chief, Dam Safety Branch, California Department of Water Resources

    P.O. Box 942836, Sacramento, CA 94236-0001
    Sharon Tapia, Chief Division of Safety of Dams, California Dept. of Water Resources, 2200 X Street, Room 200 Sacramento, CA 95818
    Paul Shannon, Dam Safety Branch Chief, FERC 202-502-8784 Rm 62-63 Paul.Shannon@FERC.gov
    Douglas Boyer, Risk Informed Decision Making Branch, FERC Chief, FERC, 503-552-2709
    Frank L Blackett Regional Engineer FERC, 100 First Street, Suite 2300 San Francisco, CA 94105-3084

    References

    Coe J. et al. (2004) SP-E4: Flood Management Study, Oroville Facilities Relicensing FERC Project No. 2100, Final Report, California Dept. of Water Resources. Table 12.8-1 Oroville Dam PMF with Dynamic Routing (1983) bit.ly/2m24ZAA; Attachment 3, Table 3, p. 7 bit.ly/2lmdDfz
    Craddock, T. (2017) Letter from DWR regarding Independent Board of Consultants Memorandum No. 1, California Department of Water Resources, March 17, 2017
    Davis, Terry. (2017) Comment 84265, March 21 12:41 am, Will Oroville Dam Survive the ARkStorm? Climate Etc.
    Hagen, D. L. (2017) Will the Oroville Dam Survive the ArkStorm? Climate Etc. March 17th. References
    & Appendix https://judithcurry.com/2017/03/17/will-the-oroville-dam-survive-the-arkstorm/

    Koutsoyiannis, D. (2010b) Memory in climate and things not to be forgotten (Invited talk), 11th International Meeting on Statistical Climatology, Edinburgh, doi:10.13140/RG.2.2.17890.53445, Presentation; http://bit.ly/2mpRnSQ

    Leahigh, J. (2017) CA Water Commission: Hydrology and State Water Project operations update, March 20, 2017
    Porter, K. et al. (2011) Overview of the ARkStorm Scenario: U.S. Geological Survey Open-File Report 2010-1312, 183 p., and appendixes.
    USGS (2006) The 1976 Big Thompson Flood, 30 years later. Fact Sheet 2006-3095
    Ward, Steven N. (2017) Oroville Dam Failure Simulations.mov bit.ly/2nnE3MA bit.ly/2nYgRHq

  52. David L. Hagen

    What “permanent drought”? New all-time rainfall record set for California

    Percent of average for this date: 207%
    Northern Sierra Precipitation Sets Water Year Record: Atmospheric Rivers Pushed Total to 89.7 Inches since October 1.
    . . .Today’s total surpassed the previous record of 88.5 inches recorded in the entirety of Water Year 1983. The region’s annual average is 50 inches.

    Source: https://cdec.water.ca.gov/cdecapp/precipapp/get8SIPrecipIndex.action

    5 things to know about Oroville Dam’s immediate future

    1) Damaged primary spillway will be back in operation
    DWR forecasting shows the potential for two more rounds of rain over the next 10 days. Coupled with increased runoff from the Sierra, DWR announced it will need to put the primary spillway back into operation for 10 to 14 days beginning at 9 a.m. Friday.