by Judith Curry
Something a bit different this week.
The Fascinating Ways People Are Adapting to #ClimateChange [link]
Which diet makes best use of farmland? You might be surprised. [link]
NYTimes: How Markets Can Help Us Adapt to Climate Change [link]
A nuanced look at the question of “climate refugees” [link]
Welcome to the emerging world of the circular economy [link]
Six Conflict Prevention Takeaways from New Climate and Conflict Research [link]
Dams in China, Vietnam will block 80% of sediment from #Mekong delta. Why that’s bad: [link]
6 maps that could change your perspective on the world [link]
The best way to predict #poverty is by combining satellite images with machine learning [link]
A single disaster can undo years of economic development. New #PacificPossible report calls for better resilience [link]
#Microinsurance could play vital role in #DisasterRiskManagement in South Asia [link]
Ottawa: Preparing for climate change-driven disaster [link]
Against all odds, rural women in #Myanmar cope with natural disasters & #climatechange [link]
Research shows need for wildlife corridors under #climatechange [link]
This Salmon Farm May Disrupt A Whole Industry – For The Better! [link]
Killing lakes, felling trees for wired cities. India’s urban development: technologically smart, ecologically stupid [link]
The Louisiana floods are devastating, and climate change will bring more like them. We’re not ready. [link]
New study shows impact of man-made structures on #Louisiana’s coastal #wetlands [link]
Louisiana’s sinking coast is a $100 billion nightmare for big oil [link]
Vital #disasterprep read on lack of public awareness ahead of unnamed #Louisiana mega-rain [link]
Fish (and Food Security) on the Move: Implications for International Security [link] …
Special Issue: Health Security and Climate Change [link]
The Toilet Revolution: [link]
The World Bank Climate Change Action Plan [link]
Gardens in the sky: The rise of green urban architecture [link]
Re: A nuanced look at the question of “climate refugees” [link]
Well, this lawyer is certainly considerably less vituperative in her dutiful recitation of UNEP/UNFCCC inspired and oh-so-very-tedious climate change memes.
But in my dictionary, ‘less vituperative’ <> “nuanced”. In this particular lawyer’s case – considering that there have never been any documented “climate refugees” – one can only hope that, assuming Burkett is a practicing lawyer, her research and advocacy on behalf of her clients is considerably more thorough.
Ah, the old “it hasn’t happened in the past” gambit. You know what they say: “There’s a first time for everything”. The Vikings migrating to Greenland during the MWP and leaving due to the LIA ought not be a terribly controversial precedent in this forum.
Then there’s Pandey et al (2003), Rainwater harvesting as an adaptation to climate change:
Climate change and human migration
The archaeological and historical records show many instances of societal collapse, attributed to a combination of social, political and economic factors54. However, recent studies have implicated climate change and drought as the primary agent in human migration, cultural separation, population dislocation and the collapse of prehistoric and early historic societies (see refs 1–5). Human migration due to climate change and environmental stress is a well resolved survival strategy across continents, including Africa55, Eurasia, South America and Australia (refs 1–5).
Ice cores from Kilimanjaro provide an ~ 11.7 kyr record of Holocene climate variability for eastern equatorial Africa56. The periods of abrupt climate change shown in ice cores include ~ 8.3, ~ 5.2 kyr ago, and a third period, ~ 4 kyr ago coinciding with the ‘First Dark Age’ known for the most severe drought in tropical Africa. In a more recent history, effects of variable precipitation and drought in equatorial east Africa during the past 1100 years bear links with climate change and cultural effects57. Oral traditions as well as scientific evidence both indicate drought-induced famine, political unrest and large-scale migration of native population, and establish strong connection between cultural development, climate change and water stress (see ref. 57).
Recently, a robust study by Nunez et al.4 found widespread evidence for human occupation influenced by water availability in the Atacama desert in northern Chile from 13,000 calibrated 14C years before the present (cal yr BP) to 9500 cal yr BP, and again after 4500 cal yr BP. Initial human occupation coincided with a change from very dry phase to a humid phase. Archaic open campsites at elevations above 3600 m show that humans lived around late Glacial/early Holocene palaeolakes on the Altiplano. Human use of these sites was terminated between 9500 and 4500 cal yr BP due to drying of the lakes. The fact that climate change was instrumental for human occupation and abandonment of the area also provides support to proposition that early people in the Atacama region focused on favourable, humid habitats such as lakes, springs and streams58.
Never happened obviously, because you’re omniscient and scientists don’t know nuffin’. How well am I conforming to your expected level of vituperation? I wouldn’t want to disappoint.
Something to keep in mind is that the Holocene is the longest period of relative climate *stability* our species has ever known. And anyway, it’s not all bad news; Table 1 is chock full of historical and pre-historical examples of successful adaptation to climate changes by way of rainwater harvesting. If they could do it, so can we. What’s a little minor disruption amongst billions of friends, amirite?
Brandon, Brandon quite contrary
How does his comment-count grow?
By hooking on snippets
And/or context-free clippets
His importance above all, he must show.
I’d say the evidence I presented to rebut your main argument is at least as good as your poetry.
“Something to keep in mind is that the Holocene is the longest period of relative climate *stability* our species has ever known. ”
Depends on how we define our species. If modern humans date back less than 120,000 years, then the Holocene may have been the only period of more or less stable climate. I say “may” because it’s uncertain what life may have been like in eastern and southern Africa during millennia when the continental glaciers in the polar regions partially receded. But almost certainly, in Africa there were centuries of relatively stable climate during periods of glaciation.
The Eemian (MIS-5e, 124,000–119,000 years ago) was a period of relatively stable climate. Can we say that none of the species of Homo during the 25,000-year Eemian were Homo sapiens?
I would accept the claim that our species dates back at least to the start of Eemian and probably further.
How might Africans of our species have survived the glacial period prior to the Eemian? The African ancestors of all modern humans could have found refuge on the shores of the great lakes of southern Africa. Various (research) drilling programs have shown that these lakes are deep enough and big enough to survive the desiccation of the continent during times of glacial advance.
I don’t accept Neanderthals as our species but some people do, classifying them as “Homo sapiens neanderthalensis”, a variety of our species able to produce fertile descendants with modern humans. Thus, some people would accept that our species has been around since at least the start of Marine Isotope Stage 11 (the Hoxnian) 424,000 to 374,000 years ago. If so, that would have been 50,000 years of more or less stable climate.
Recommended reading: Paleoclimate and Evolution, edited by Elisabeth Vrba (Yale, 1995). Worth buying.
This source puts Homo sapiens sapiens first appearing around 200 ka, which would cover the Eemian.
No, I would not make that argument.
They’re extinct, so it’s moot for purposes of this discussion whether we can consider them “us”. :)
Anyway, back to relative stability …
… the Vostok record from Petit et al. (1999) is typically the image I see in my mind’s eye, and to me the Holocene looks like the most stable climate regime in the past 400 ka. We’ll never know, but I’m of the opinion modern civilization owes its success very much in part to that stability. Doing what we can to not destabilize climate by our own activities seems like a reasonably prudent idea. Most folks here don’t dig on Marcott et al. (2013), but I don’t have a problem with it …
… and that sliver of instrumental data doesn’t look much like stability to me. YMMV.
Brandongates, nice job with the Marcott 2013 spliced to hadcrut. That drops you down to -30 credibility points.
Give the old guard a few years and the Pilgrims will be climate refugees. The press release will look something like this:
“It was an extraordinary finding,” said Associate Professor Abram, from the ANU Research School of Earth Sciences and ARC Centre of Excellence for Climate System Science.
“It was one of those moments where science really surprised us. But the results were clear. The climate warming we are witnessing today started about
<strike>180</strike> 395 years ago.”
Ah, the joys of scientific discovery.
Read more at: http://phys.org/news/2016-08-humans-climate-years.html#jCp
I know, right? Heaven forfend someone put the present in context with the past. Does this …
…get me to minus infinity?
brandon, funny. I would try to explain the issues to you, but your purpose is political, not advancing science.
Oh sure. Like I couldn’t think of a less elaborate scam to raise my own taxes.
brandon, “Oh sure. Like I couldn’t think of a less elaborate scam to raise my own taxes.”
Possibly, release your tax returns :) One of the reconstructions used by young master Marcott was produced by Jessica Tierney, one of the co-authors of the paper reporting the surprising discovery of anthropogenic global warming starting in the tropics circa 1830. Marcott used part of her ~70,000-year low-resolution reconstruction but omitted her 1500-year high-resolution “cap” reconstruction.
Since I have already provided you with links to the Marcott supplemental material and a critique of Marcott et al. by Rosenthal et al., and since splicing a time series with greater than 120-year smoothing to one with much less smoothing is a great con-game trick, political bias is the more generous assumption.
You wouldn’t believe them. Being a good alarmist, my returns are heavily adjusted.
I’m constantly amused that people accusing me of being political are the ones scoring the political points.
I read Rosenthal, of course, and fiddled with the data as you might recall. What I don’t recall is seeing that paper as a critique of Marcott; it was its own reconstruction.
Would you like to try again without any assumptions, generous or otherwise? Try this: post the same plot with the instrumental record spliced on as you best see fit, and then we might better measure how big is my political bias. 2-sigma error bars might also be illustrative. I’m pretty sure what you’ll find is what Marcott and Co. wrote in their conclusions:
Our results indicate that global mean temperature for the decade 2000–2009 (34) has not yet exceeded the warmest temperatures of the early Holocene (5000 to 10,000 yr B.P.). These temperatures are, however, warmer than 82% of the Holocene distribution as represented by the Standard5×5 stack, or 72% after making plausible corrections for inherent smoothing of the high frequencies in the stack (6) (Fig. 3). In contrast, the decadal mean global temperature of the early 20th century (1900–1909) was cooler than >95% of the Holocene distribution under both the Standard5×5 and high-frequency corrected scenarios. Global temperature, therefore, has risen from near the coldest to the warmest levels of the Holocene within the past century, reversing the long-term cooling trend that began ~5000 yr B.P.
Brandon, 2-sigma error bars for a time series with 120-year smoothing are only meaningful when comparing to other time series with 120-year smoothing. This is the con-game issue. The instrumental period would be a dot at the end. It isn’t rocket science.
As for the comparison, take a look at fig. 2; note that the variability of IWT, which has a minimum 50-year smoothing, is greater than the uncertainty of surface water temperature (Marcott). I believe this is the one you “thought” confirmed Marcott. IWT should be much more stable than SST, making the difference even more significant. Also notice the lack of a spurious spike in the work by seasoned professionals.
I want to compare modern instrumental temps to Marcott for perspective. That gives me some information about what is happening now relative to the past. I have proposed that you do it in the way that you feel is most appropriate.
You tell me that what I’m asking for is a con job. And then you refer me to Rosenthal, which I’ve already reviewed and which we’ve already discussed.
Which one of us is running the shell game here?
brandon, “I want to compare modern instrumental temps to Marcott for perspective. That gives me some information about what is happening now relative to the past. I have proposed that you do it in the way that you feel is most appropriate.”
I want to win the lottery, but based on the real math, not some feeling for perspective, it is pretty unlikely. If you want to compare in some scientifically relevant way, you can do some work. Start simple: average whichever temperature series you like to one average point, then average the first half, then the last half. Those are three possible values you could get for the last paleo point. About a half-degree range. Then average the dates on the time scale the same way; those are your potential time splice points. About 50 years, for simplicity. Now you have an uncertainty for your temperature series and paleo series that you can add to your dots. Right now you should be realizing there are simple ways to check your “perception” before you start polluting the minds of others. If you just feel like sticking a high-resolution series on the end of an extremely low-resolution one and sharing it with others, you might want to put some uncertainty limits on it.
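The halves-averaging check described above is easy to try. A minimal sketch, using synthetic annual anomalies in place of a real instrumental series (the trend and noise levels are made-up illustrative numbers, not HadCRUT or ERSST):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for 150 years of annual temperature anomalies (deg C):
# a gentle warming trend plus interannual noise. Illustrative only.
years = np.arange(1850, 2000)
anoms = 0.005 * (years - years[0]) + rng.normal(0.0, 0.15, years.size)

# Three candidate values for "the last paleo point":
whole = anoms.mean()        # average of the full period
first = anoms[:75].mean()   # average of the first half
last = anoms[75:].mean()    # average of the second half

# Do the same for the dates to see the potential splice-point spread.
t_first = years[:75].mean()
t_last = years[75:].mean()

print(f"temperature spread: {max(first, whole, last) - min(first, whole, last):.2f} C")
print(f"date spread: {t_last - t_first:.0f} years")
```

With these toy numbers the first-half, whole-period, and second-half averages spread over a few tenths of a degree, and the candidate splice dates spread by decades, which is the same kind of ambiguity being described for attaching an instrumental point to a low-resolution paleo series.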
Since your methodology isn’t scientifically sound, I would say you are conning yourself.
As I said, the Marcott supplemental material is online, and Steven McIntyre has a good commentary. There is a wealth of information at your fingertips. Show us your statistical kung fu.
That’s a first…
captd, you can estimate a 200-year average for 1900-2100 based on your projections, maybe 1 C, maybe 2 C, then use that if it makes you more comfortable to compare with Marcott. Or you can refuse to do that estimate, and that is fine too because it is what some skeptic (can’t remember who) did last time I raised this issue.
Then do some “real math” and show me the results already.
We’re investigating a centennial-scale problem. We have over a century’s worth of modern instrumental data. Marcott gives us centennial-scale resolution for the entire Holocene. From this, it stands to reason that there is some “scientifically sound” method to compare the two.
Note again how the only “useful” information I’m getting is reserved wisdom, and that I’m a con artist.
No. You’re the one who has elevated himself to final arbiter on what constitutes scientific soundness and real math. You get to be the one to demonstrate the statistical kung fu.
“No. You’re the one who has elevated himself to final arbiter on what constitutes scientific soundness and real math.”
I gave you the math, it is just that simple to find an uncertainty range. Marcott has a little something on the subject concerning his reconstruction and the past 100 years:
“Q: What do paleotemperature reconstructions show about the temperature of the last 100 years?
A: Here we elaborate on our short answer to this question above. We concluded in the published paper that “Without filling data gaps, our Standard5×5 reconstruction (Figure 1A) exhibits 0.6°C greater warming over the past ~60 yr B.P. (1890 to 1950 CE) than our equivalent infilled 5° × 5° area-weighted mean stack (Figure 1, C and D). However, considering the temporal resolution of our data set and the small number of records that cover this interval (Figure 1G), this difference is probably not robust.” This statement follows from multiple lines of evidence that are presented in the paper and the supplementary information: (1) the different methods that we tested for generating a reconstruction produce different results in this youngest interval, whereas before this interval, the different methods of calculating the stacks are nearly identical (Figure 1D), (2) the median resolution of the datasets (120 years) is too low to statistically resolve such an event,…”
Each time series is different so it is the author that is the arbiter. Perhaps you can explain to him why he is wrong.
Gee, Numbnuts, you just stepped on your crank with this one. You do realize that all of your examples have to do with natural climate variation, right? Meaning they are irrelevant as a counterpoint to Hillary’s point of there being as yet no evidence of AGW related refugees.
Way to go.
Not only does Gates step on his crank, he then proceeds to trip over it.
Assuming he is correct about the Holocene being the longest period of stability regarding climate, he misses which direction climate is due to head as it leaves that period of stability.
Marcott reports their data in 20-year bins, with a 1-sigma uncertainty estimate. I see no reason therefore to have to rebin the paleo data into 50 (or 60) year samples. I should be able to simply multiply the 1-sigma uncertainty by two and plot the error envelope. Then I should be able to resample the instrumental record in 20-year bins, calculate the standard deviation of the residuals for each bin and multiply that by two.
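The binning recipe just described can be sketched roughly like this, again on synthetic annual data (the trend and noise are invented for illustration; the real exercise would use HadCRUT or a similar product):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for annual instrumental anomalies, 1850-2009 (deg C).
# Trend and noise amplitude are invented for illustration.
years = np.arange(1850, 2010)
anoms = 0.006 * (years - years[0]) + rng.normal(0.0, 0.12, years.size)

# Resample into 20-year bins: the bin mean, plus twice the standard
# deviation of the annual values within each bin as a rough 2-sigma band.
n_bins = years.size // 20
binned = anoms[: n_bins * 20].reshape(n_bins, 20)
bin_center = years[: n_bins * 20].reshape(n_bins, 20).mean(axis=1)
bin_mean = binned.mean(axis=1)
bin_2sigma = 2.0 * binned.std(axis=1, ddof=1)

for c, m, s in zip(bin_center, bin_mean, bin_2sigma):
    print(f"{c:.0f}: {m:+.2f} +/- {s:.2f} C")
```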
Here’s what that looks like:
The final instrumental data point (a 20 year bin centered on the year 2000) is about even with the warmest point of the Holocene. The maximum uncertainty values for each series are also about dead even. That’s all quite consistent with the results as reported in the paper.
The tail end of the reconstruction not being robust — as stated in the paper itself — is what makes it particularly informative to splice in the instrumental data for the 20th Century.
I have nothing against natural variability. It obviously happens.
Not so much:
“The tail end of the reconstruction not being robust — as stated in the paper itself — is what makes it particularly informative to splice in the instrumental data for the 20th Century.”
It doesn’t tell me much. You have a low resolution reconstruction which means it isn’t capturing the minimums or the maximums and you have what appears to be a 0.8 C range where you can attach the instrumental record and still be within the 1 sigma range. Could be really really warm or could be pretty much where it was before the LIA. I’m betting on the where it was before the LIA answer.
My plot shows a 2-sigma error envelope, so we should be seeing just over 95% of the mins and maxes.
These aren’t absolute temperatures, they’re anomalies.
The uncertainty bars don’t show greater resolution than the measurements. They only show the uncertainty of the measurements used. That would be like showing the uncertainty bars of the annual temperatures of the Mojave Desert and claiming the daily variation was included within.
brandon, “Marcott reports their data in 20-year bins, with a 1-sigma uncertainty estimate. I see no reason therefore to have to rebin the paleo data into 50 (or 60) year samples.”
The easiest way to “see” that is to compare the standard deviation of the whole instrumental period to that of a 120-year smoothed instrumental series. There is a big difference. Binning doesn’t magically change that, it just makes it easier to work on a spreadsheet. What you have to do is average the instrumental record to the same frequency as the paleo to make the splice, so your instrumental start would be below the paleo finish in this case. You used 20-year bins, but basically you need a longer instrumental period, averaged to match Marcott’s averaging, then rebinned to 20 years. As it is, two or three instrumental points are the best you can do. This is pretty basic; it’s hard to move past the basic stuff if you aren’t willing to do a few what-ifs, mentally or on a spreadsheet.
“The tail end of the reconstruction not being robust — as stated in the paper itself — is what makes it particularly informative to splice in the instrumental data for the 20th Century.”
Not if you can’t pick a “robust” point to splice to. That uptick in Marcott you spliced to is a large part of the problem. What Marcott tried was splicing to Mann et al. to get a temperature reference, but the little dips and pops in Marcott aren’t robust. Rosenthal et al. gets into that with the Moberg, Mann, Oppo comparison.
Taking a reconstruction that you know isn’t particularly robust and making it appear to be is what?
Brandon, 2 sigma should capture 95% of events. So you only miss fairly unusual events, like, say, the LIA. Nice job rationalizing.
Here, Nick Stokes emulated Marcott, looking for events hidden by Marcott’s method. You can add the already-available reconstructions, often by the same authors, to improve the data back a couple of thousand years, and get another answer.
brandon, on the validity of 2 standard deviations being a 95% level with mixed smoothing methods: using ERSSTv4, the standard deviation for a 20-year smooth is 0.18, for 60 years 0.14, and for 100 years 0.03. There appears to be an exponential decrease in relevance, depending of course on the chosen time series. Oh, wait, that was something you could have done on your own, isn’t it?
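The general point, that heavier smoothing shrinks the standard deviation, is easy to demonstrate on a toy series. This sketch uses pure white noise, so the particular values will not match the ERSSTv4 figures quoted above, but the ordering should:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "annual anomaly" series: white noise only, so smoothing effects
# are isolated from any trend. Amplitude is arbitrary.
x = rng.normal(0.0, 0.2, 20000)

def smoothed_sd(series, window):
    """Standard deviation of a centered running mean of the given width."""
    kernel = np.ones(window) / window
    return np.convolve(series, kernel, mode="valid").std(ddof=1)

for w in (20, 60, 100):
    print(f"{w:3d}-year smooth: sd = {smoothed_sd(x, w):.3f}")
```

For white noise the standard deviation of a running mean falls roughly as one over the square root of the window width, which is why a 2-sigma band computed at one smoothing scale cannot be compared directly to one computed at another.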
BRG, you make a big mistake relying on Marcott. The Science paper is a clear case of academic misconduct that Science should have retracted, but did not, when presented with written evidence. That evidence is contained in my second guest post here at the time, Playing Hockey–Blowing the Whistle. You would have known if you had done any proper research. You drank the Kool-Aid on that one.
It is one of several clear-cut cases. My first-ever guest post here back in 2011 concerned deliberate misrepresentation by NRC of a paper that was itself gravely flawed. O’Leary’s Quobba Ridge misrepresentation in guest post By Land or By Sea. PMEL’s misrepresentation in guest post Shell Games. Blowing Smoke contains several other examples as well.
Beyond the uncertainty monster lies some very ugly stuff. Worse than Mann’s hockey stick Nature trick to hide the decline.
All this sound and fury about academic misconduct, and nobody has shown me what I’m asking for: a comparison of Holocene temperatures to the modern instrumental record. Do we care more about what’s actually happening with the planet, or whether climate scientists are con artists and I’ve drunk their Kool Aid?
brandon, “All this sound and fury about academic misconduct, and nobody has shown me what I’m asking for: a comparison of Holocene temperatures to the modern instrumental record. Do we care more about what’s actually happening with the planet, or whether climate scientists are con artists and I’ve drunk their Kool Aid?”
There aren’t any “global” surface temperature reconstructions of the Holocene that have the resolution to directly compare to instrumental. That should give you some idea of the issues involved. Rosenthal et al. took a different approach and looked at reconstructing ocean heat content; it still has some issues, but I consider it progress in the right direction. Their biggest problem, btw, was an overly exciting press release with a minor error that killed the excitement. PAGES2K is still working on just the last 2000 years to get a consensus. There is a geothermal reconstruction of the Holocene, but that is low frequency.
The problem is that global mean temperature anomaly means more to climate scientists than it means to the globe. Changes of 10 degrees, from -35C to -25C, in Arctic winter are not something you will find in temperature proxies. Most everything is dead or frozen below minus 20C, and most proxies are based on living things, diffusion into liquids, or evaporation rates, but not on how cold the surface of the ice is.
You need apples to compare to apples.
BRG, you changed the subject rather than address the very specific allegations that stuff you rely on is more than just wrong. Not good form. At least read and reply factually to my allegations. You have been criminally misled. Multiple ways, multiple times. You have the references now. Check them out and stop trying to duck truth when it is going to hit you upside the head. (My old uncle Joe Papke coached some Olympic Gold Medal boxers, so the analogy seems apt.)
I spliced to the final two 20-year bins of the 19th Century.
I don’t disagree. Appreciate my frustration at repeatedly asking you to make the comparison I’m attempting to do, and you repeatedly not doing it.
We’re looking for century-scale trends. Marcott has that resolution. We’ve got almost two centuries of instrumental data. You should be able to give me a plot that answers century-scale questions. Can you do it? Will you do it?
You, and now Stoat have got me thinking about whether they even do that.
I retract my comment about 95% of the variability.
brandon, “We’re looking for century-scale trends. Marcott has that resolution.’
No, you don’t. One point every 120 years or so doesn’t provide century-scale trends. Synthetic points, binning, don’t count. Some of the 73 reconstructions used had coarser than 300-year resolution. Even with 25-year samples you don’t get a true indication of century-scale trends; you get about 80% of a century-scale trend.
Another problem with paleo is matching the average temperature of a 100 year period to the actual date that average was meaningful. You can’t assume away one uncertainty because you like the other.
Then once you have all that covered, your “global” temperature index has to be the same 73 locations, which you can compare with whichever index you like to find a correlation coefficient. You have more than five stages of uncertain processes that can have additive “sensitivity”. Assuming a “normal” distribution in that case is a huge leap.
This would have been a lot easier if you had downloaded Marcott’s spreadsheet and played with it yourself. Just “seeing” the reason for the spurious uptick would have sent you someplace else. I went to Rosenthal.
Then I can get a fair reconstruction of tropical ocean temperature using one proxy type, G. ruber Mg/Ca, which reproduces about 60% of century-scale variation. Try it: one point every 50 years shows just how much detail?
Unsurprisingly, all roads again lead to selective ignorance.
The paper addresses this argument:
The results suggest that at longer periods, more variability is preserved, with essentially no variability preserved at periods shorter than 300 years, ~50% preserved at 1000-year periods, and nearly all of the variability preserved for periods longer than 2000 years (figs. S17 and S18).
For the global 5×5 reconstruction, the 2-sigma uncertainty averages +/- 0.27 C for the 20-year bins. So, statistically we can argue that it is *possible* for a 0.5 C temperature excursion to “hide” undetected inside any 300-year interval we choose, and not register … all we get is a smoothed average which looks like the previous 300-year interval. This is more or less the argument you’re making, yes?
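The arithmetic behind that “hiding” argument checks out: a 0.5 C excursion lasting 100 years contributes only 0.5 × 100/300 ≈ 0.17 C to a 300-year running mean, which sits inside a ±0.27 C envelope. A toy demonstration:

```python
import numpy as np

# Toy series: flat climate with a 0.5 C warm excursion lasting 100 "years".
series = np.zeros(1000)
series[450:550] += 0.5

# A 300-year running mean stands in for the resolution loss in the stack.
kernel = np.ones(300) / 300
smoothed = np.convolve(series, kernel, mode="valid")

peak = smoothed.max()
print(f"peak of smoothed excursion: {peak:.3f} C")  # 0.5 * 100/300, about 0.167
print(f"inside a +/- 0.27 C envelope: {peak < 0.27}")
```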
Who said anything about preferring one uncertainty to another? To my knowledge, stevereincarnated is the only one in this thread placing bets for the splice point to be on the low side of the uncertainty range over 1860-1880. Is it not equally probable that the attachment point is higher?
More declarations by fiat. Whence your confident knowledge of the necessity you propose? It’s not like this hasn’t been tested:
Figure 1: Temperature anomaly data (thin colored lines) at the same locations as the 73 paleotemperature records used in Marcott et al. (2013), the average of these 73 temperature anomaly series (bold black line), and the global average temperature from the National Climatic Data Center blended land and ocean dataset (bold red line) (data from Smith et al., 2008).
It would be most helpful if you’d tie these declarations to something people have actually said.
Perhaps it wasn’t clear …
… that this is my own plot of that very data.
Not when the paper itself tells us that the uptick isn’t robust.
Which, as we’ve previously discussed, is geographically limited. I refer you back to your assertions about 73 locations and assumptions of distributions. Perhaps see also, “You can’t assume away one uncertainty because you like the other.”
What is your opinion on Borehole temperature profiles?
I examined these in some detail and corresponded with the University of Michigan.
I am naturally suspicious of all novel proxies, perhaps because an extensive study of tree rings left me completely baffled as to why anyone would think they are anything more than a possible indicator of moisture within a tight micro climate.
Borehole reconstructions demonstrate exactly what is shown by looking at the instrumental record, which is that we can detect a generally upwards change in temperature dating back to the late 1600’s. In other words, records such as GISS and Hadley are staging posts of the temperature rise and not the starting point (see my article ‘The long slow thaw’).
Looking at the individual borehole data it can be seen they can be interpreted in a number of ways with many showing a temperature decrease, which is overwhelmed by those showing an increase, thereby giving us an unsatisfactory global ‘average’ which hides the interesting nuances.
I think more attention needs to be paid to regional or Köppen-classification reconstructions. As Marcel Leroux commented, ‘the world has many climates.’
Looking at the globe as one averaged temperature disguises what may be happening in regions/countries/areas
I don’t know much about them except that they’re very low resolution. I’ve never played with the data.
All proxies were once novel. All proxies also have their own particular strengths and weaknesses. For tree rings, I see their main advantage being their annual resolution — which is decidedly not an advantage of boreholes at the opposite extremes. A major disadvantage is as you mention, confounding climatic factors. Another one is seasonality — they only say something about conditions during the part of the year when trees experience the most growth, i.e., not wintertime.
Perhaps the most infamous problem is Mike’s Nature trick of hiding the decline, better known to dendroclimatologists as the divergence problem. What people who simply cannot let bad PR go to waste might not realize is that it’s not a ubiquitous problem. Another charming feature of the prejudice against treemometers is the implied argument that all of Dr. Mann’s Hockey Schticks are made entirely of wood. They aren’t, and that goes right back to his ambitious debut … MBH98 was a multi-proxy study:
We use a multiproxy network consisting of widely distributed high quality annual-resolution proxy climate indicators, individually collected and formerly analysed by many palaeoclimate researchers (details and references are available: see Supplementary Information). The network includes (Fig. 1a) the collection of annual resolution dendroclimatic, ice core, ice melt, and long historical records used by Bradley and Jones6 combined with other coral, ice core, dendroclimatic, and long instrumental records.
That’s my tree-ring lecture.
Some borehole reconstructions more than others (as you allude below), just as some multi-proxy studies show a more distinct MWP and LIA. Moberg (2005) comes to mind, and it agrees well with Huang (2000), a borehole study.
So you’ve argued with me before. The thing about looking only at a temperature time series is that it doesn’t say a thing about what’s driving a given trend. At best, all it tells you is that a trend happened. As there are multiple factors affecting temperature variability, I don’t subscribe to the notion that late 20th Century warming can be a priori explained as a rebound from the LIA.
I’m certainly not going to balk at the idea of doing more paleo work, especially not in currently undersampled regions. I caution that doing so might require the use of novel proxies.
I don’t think it disguises anything, it simply doesn’t reveal local nuances. All the major instrumental surface temperature products come as gridded data, and KNMI Climate Explorer makes it easy to obtain. Anyone with an Internet connection and the desire to do so can drill into regional variability to their hearts’ content.
brandonrgates: “Another charming feature of the prejudice against treemometers is the implied argument that all of Dr. Mann’s Hockey Schticks are made entirely of wood.”
Nothing of the sort!
He used sediments too, but the damn things accumulated upside down, so he quite rightly inverted the graph to get his Hokey Schtick the right way up.
Can’t have any of those declines left unhidden, can we?
I dealt with MBH98 in great detail in my article ‘The long slow thaw’. Primarily tree rings, primarily peak summer, and primarily Northern Hemisphere. I do not have this visceral hatred of Mann, but neither do I believe his work to be unblemished or of the highest quality.
Let me put a ‘what if’ to you.
The borehole data, with its many faults, show a rising temperature trend from the 1600s. The instrumental records, backed up by such evidence as tree lines, crop records, etc., show the same. Climate historians over hundreds of years had also observed the lessening of the cold winters and noted the many hot summers.
If there had been a fresh-faced, eager, ambitious, newly minted PhD ‘Michael Mann’ type at, say, the University of Michigan, who had the get-up-and-go to promote their rising temperature chart based on boreholes and tie it in with all the other information available, would the IPCC have been wound up after a decade as we reverted to the notion of dramatic climatic variability throughout history, which such researchers as Lamb and Budyko had realised was the ‘norm’?
Well then I guess we now both know that I missed that part when I first skimmed it.
Specifically, mostly N. America and W. Europe. I would be the last person to argue that statistical infilling is preferable to better spatial sampling. I don’t have a problem with the primacy of tree ring networks, AFAIK they have the best temporal resolution of any extant proxy. I would however argue that it’s advisable, if not essential, that tree rings not be used all by their lonesome. I would say the same thing about any other proxy; I consider it axiomatic that a single line of evidence is a far weaker basis for establishing claims than multiple lines. Or to put it another way, treemometers make a stronger case combined with other evidence than they would in isolation.
I do harbour a fairly deep loathing of Mann-bashing. I apologize if my own prejudices spoke too loudly.
I can’t really comment on whether his work is highest quality or not. I can only point out that his methods appear to have gained much traction within the field, which I take to be an endorsement. I also consider subsequent works by independent investigators which are broadly consistent with the original Hokey Schtick to be a vindication of his earliest studies.
I wouldn’t say that any study by any author or team is ever unblemished.
I don’t think I can reasonably answer your what-if scenario until you better qualify (or better, quantify) and/or expand on its numerous embedded assertions. What does “all the other information available” at the time of MBH98/99 suggest? How is it quantitatively different from those Hockey Schticks? When, where, and how did the IPCC back away from the notion of past climate variability? What’s the dividing line between “dramatic” past climate change and … vanilla … change?
Earlier researchers than Lamb and Budyko recognized that past climate change is the norm. One of them was Svante Arrhenius, who proposed in 1896 that variations in the levels of carbonic acid in the atmosphere were the main driver of Ice Age cycles. Since FAR, the IPCC have disagreed with him, instead pointing to orbital forcing by way of the later works of Milankovitch and Berger as the main driver of multi-millennial climate change. But they hung onto Arrhenius’ notion that atmospheric CO2 concentration modulates heat loss from the surface because it comports with physical theory subsequently developed by Beer, Lambert, Stefan, Boltzmann, Planck and Einstein (to name just a few) AND is readily observed inside and outside laboratory controlled conditions.
Please take care when appealing to “all the other information available” … I might hold you to considering some of it.
Must’ve been some powerful sediments. You’d think somebody would have corrected this by now.
No hidden decline is safe from the penetrating gaze of McI’s sampling algorithm:
Figure 4.4: One of the most compelling illustrations that McIntyre and McKitrick have produced is created by feeding red noise [AR(1) with parameter = 0.2] into the MBH algorithm. The AR(1) process is a stationary process, meaning that it should not exhibit any long-term trend. The MBH98 algorithm found a ‘hockey stick’ trend in each of the independent replications.
Yup, those are some *mighty* red cherries, the dozen ripest of the reddest 100 of 10,000 independent replications. Where would random sampling be without R’s “order” function?
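The selection step being mocked here is easy to reproduce in miniature. Below is a minimal Python sketch — emphatically not McIntyre’s actual code, and using a crude “stick-likeness” score of my own invention in place of the MBH principal-components machinery — that generates many independent AR(1) red-noise series with parameter 0.2, scores each for hockey-stick shape, then keeps only the top dozen:

```python
import random

def ar1_series(n, phi=0.2, seed=None):
    """Generate a stationary AR(1) 'red noise' series: x[t] = phi*x[t-1] + e[t]."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, 1.0)
        out.append(x)
    return out

def hockey_stick_index(series, blade=100):
    """Crude stick-likeness score: mean of the final 'blade' segment minus the
    overall mean, in units of the series' standard deviation."""
    n = len(series)
    mean = sum(series) / n
    sd = (sum((v - mean) ** 2 for v in series) / n) ** 0.5
    blade_mean = sum(series[-blade:]) / blade
    return (blade_mean - mean) / sd if sd else 0.0

# Simulate many independent replications, then keep only the most stick-like:
sims = [ar1_series(600, phi=0.2, seed=i) for i in range(1000)]
scores = sorted(hockey_stick_index(s) for s in sims)
top_dozen = scores[-12:]  # cherry-picking the 'ripest' red cherries
```

Sorting and taking the tail is exactly the screening step described above: even pure noise yields a few stick-shaped outliers if you sift through enough replications.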
Some of the ‘other information’ is contained in the reference section of ‘The long slow thaw’ sections 4, 5, and 6
There is also other information within the article itself. Many hundreds of references are contained in various of my other articles.
I got into a huge argument with Willis once when I defended Dr Mann. As I say, I do not think he is a ‘great’ scientist, but equally the visceral hatred shown to him by many sceptics is unpleasant.
Sky gardens. Sure, why not. Heck, Germans have been doing that forever. Herbs and flowerboxes front every south window in rural Bavaria. Makes urban space nicer. Does not solve basic ag/food issues. Pretty does not equal productive.
Pingback: Week in review – climate adaptation edition – Enjeux énergies et environnement
I read the “killing lakes, felling trees” link. It’s a clichéd and hyperbolic rant. The only concrete proposal is that Indian cities should plant fruit trees instead of ornamental trees. I guess to ensure that while the Indian city-zens are surfing the internet and dying of thirst and pollution they will have fruit to eat.
How about no trees at all to free up water for The Toilet Revolution:[link]?
But then what would they *eat*?
Thanks. Did not know about the toilet revolution. Now I do. Really great link.
Sometimes it’s the little things. :)
PS, credit for the link to our host, not me.
brandongates – Thanks for the toilet revolution link. The very first time I heard about open defecation was in a Facebook post from an Indian expressing his disgust of the practice after his return to the US from India. In a country with such a high population density it is a hideous and dangerous practice.
All the climate change (née Global Warming) hoo hah and its accompanying “sustainable energy” nonsense for the third world is misdirected attention. Those poor people have a hierarchy of needs: medical care, food, clean water, sanitation, electricity, education, and the rule of law. Windmills, solar panels, and phony bio-fuels schemes waste time and cost lives.
Credit for the toilet link goes to Dr. Curry from the head post. All I did was work it into the conversation. This is the first time I’ve heard that it was a problem, it’s one of those things that again makes me feel quite fortunate to have #firstworldproblems.
I can’t fault the argument that the developing world has more immediate pressing issues than future warming. OTOH, pretty much every government in the world accepts that AGW is an issue, including developing world governments. And the consensus is that the poorer nations stand to be the most impacted if the warming were allowed to go unchecked, being the ones least equipped to adapt. All this while the developed world has owned the bulk of cumulative emissions.
As someone who agrees with the consensus of world governments, I basically have two equitable choices:
1) The industrialized world decarbonizes first, developing world second.
2) Everyone decarbonizes at the same time, but the industrialized world shoulders most of the burden for the developing world’s decarbonization.
A third option is some sort of mix, which may be more realistic, but it’s easier to think about as a dichotomy. A fourth option of course is not decarbonizing (or at least not until the market decides to do it on its own), but I don’t see that as a viable option.
I fiddled with how (1) might work. If the target is zero emissions by 2100, the rich nations (inc. China) have until about 2055 to completely decarbonize by my growth assumptions for the developing world.
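The fiddling described above can be sketched as a toy model. Every number below is a hypothetical placeholder (the emission rates, the split between blocs, the linear ramps), not the growth assumptions referred to in the comment; the point is only the shape of the calculation — given a cumulative emissions budget, find the latest year by which the rich world must hit zero if the developing world then ramps down to zero by 2100:

```python
def rich_world_deadline(budget_gtco2,
                        rich_rate=22.0,   # hypothetical rich-world emissions, GtCO2/yr
                        dev_rate=14.0,    # hypothetical developing-world emissions, GtCO2/yr
                        start=2016, end=2100):
    """Toy sequential-decarbonization model: the rich world ramps linearly from
    rich_rate to zero between `start` and year Y; the developing world holds
    dev_rate until Y, then ramps linearly to zero by `end`. Returns the latest
    Y keeping cumulative emissions within the budget, or None if infeasible."""
    for y in range(end, start, -1):
        rich_total = rich_rate * (y - start) / 2.0                 # triangle area
        dev_total = dev_rate * (y - start) + dev_rate * (end - y) / 2.0
        if rich_total + dev_total <= budget_gtco2:
            return y
    return None
```

With these placeholder rates, a budget of 1290 GtCO2 puts the rich-world deadline at 2055 — the flavor of result described above, though not the actual calculation behind it.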
In the rural US, people dug a couple of deep holes and built a little house with a two-hole seat in it. Not exactly rocket science.
A data-rich study of N. American agriculture adapting to natural climate change.
Ron Clutz: https://rclutz.wordpress.com/2016/08/22/adapting-works-mitigating-not/
Thank you for the link.
re: Words of Caution on “Climate Refugees”
… “if we”… “we might”… “specific interventions”… like, funneling petro dollars into Jew-hating ME cultures and then, intervene with the US Army 3rd Infantry Division?
Pretty good argument for devaluing the petro dollar, iff’n you ask me.
The Louisiana Flood (this time) ==> Those interested in the “climate change cause” of the Louisiana Flood should see the wiki page on the Great Flood of 1927. I have on my bookshelf, as I type, a documentary DVD of the flooding in 1927, titled The Great Flood (inquire at your local library). Aaron Neville recorded the great Randy Newman song, titled Louisiana 1927, about the flood.
All this by way of illustrating that the recent flood was not a record breaker, not even in the same league, and that humans have a very poor memory for past weather disasters.
As always, our hearts go out to those whose lives have been disrupted, and to those who lost loved ones.
Funny. The article said nothing about it being a record flood. Quite the opposite really:
Though smaller than the devastation wrought by Hurricane Sandy in 2012, this latest flood reminds us of what a changing climate has in store for us: Places that have flooded before will flood again, and places that haven’t in the past will do so for the first time.
These disasters are the new normal — several other states are currently recovering from disasters of their own.
It’s a frequency argument illustrated by an anecdote, which neither your strawman nor anecdotal appeal to the 1927 flood addresses.
As always, let’s hope this generation is making good choices for future ones.
Floods were none have been before! Sounds like time to revive the US Flood Control Program, which the enviros shut down with NEPA circa 1970.
No squirrel is too small to chase for the bias brigade.
Can you clarify, by way of a link, where the data is that looks at modern-day floods and compares them to older ones? Presumably they would make allowances for far more people often living in unsuitable places, in areas where flood mechanisms such as flood plains have been built on?
I don’t know that I can, or why I should need to. The article isn’t making a claim that flood frequency has already increased, only that it will. It would be up to the folks listing anecdotal examples about past floods by way of saying nothing unprecedented is happening to go do the legwork you’re asking.
Now, if you’d like to ding the article for not giving a source for the prediction they invoke, I’ll happily join you in that criticism.
Aaack, as I’ve already quoted, the article states: These disasters are the new normal — several other states are currently recovering from disasters of their own.
That is arguably a statement that the present has changed from the past, and they don’t cite support for it. So I’ll have to ding them for that too.
Yes, I agree a source should have been cited for the author’s assertions. Presumably it doesn’t exist.
Good to know that you apparently agree that, as yet, nothing out of the ‘ordinary’ has happened (as regards flooding) and we are in the world of computer simulations.
Are there any areas of climate change happening now that you believe are genuinely previously unprecedented during the Holocene?
‘Are there any areas of climate change happening now
that you believe are genuinely previously unprecedented
during the Holocene?’ h/t tony b
As a serf student of history, I’d say we humans needst
beware ass-umptions regardin’ patterning of the past
repeated in the present, but needst also beware goin’
overboard regardin’ some point in time as a unique
tipping-point event … usually not. Pays ter look critically
(sceptically ) at the long record, cross -referencing,
( we’re prone ter confirmation bias) and ter hold our
theories tentatively, ‘ tenta – tiv – ly…’
IPCC ARs are full of language predicting higher rates of precipitation and increased frequency of storms, and as a result floods. A specific citation is nice though, so one can read all the caveats, etc.
I’m not saying that nothing unusual has happened. What I think is that it’s difficult to tell, because weather is localized and noisy, observational quality decreases as one goes into the past, etc.
With respect to inland flooding, JCH makes a good point that here in the US, flood control has improved since the early 20th Century because of building dams, levees, etc. Thus, there could be (and hopefully has been) a *decrease* in the number of flood events even as the planet has been warming. That gives hope that inland flooding is one of those things we in the US will be readily able to cope with. The article is urging more preparation, with which I don’t disagree. I don’t know that I can stand by the statement, “we’re not ready”.
I would argue that the rate of temperature rise over the instrumental period is unprecedented over the entire Holocene judging by the Marcott data. The HADCRUT4 trend over 160 years ending in 2010 is 0.45 °C/century. The maximum rate in the Marcott data for 160-year intervals is 0.17 °C/century, around 1780 CE. Back toward the beginning of the Marcott reconstruction, the rate was just under 0.10 °C/century. I’m not sure how real those two comparisons are — first derivatives are noisy and the Marcott reconstruction gets much smoother around 400 CE.
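The comparison above hinges on nothing more exotic than least-squares slopes over sliding 160-year windows. A minimal Python sketch — the data here are synthetic; the HadCRUT4 and Marcott figures quoted above would come from the actual series:

```python
def trend_per_century(years, temps):
    """Ordinary least-squares slope of temperature vs. year, scaled to °C/century."""
    n = len(years)
    my = sum(years) / n
    mt = sum(temps) / n
    cov = sum((y - my) * (t - mt) for y, t in zip(years, temps))
    var = sum((y - my) ** 2 for y in years)
    return 100.0 * cov / var  # slope in °C/yr times 100 years

def max_rolling_trend(years, temps, window=160):
    """Largest OLS trend over any contiguous `window`-year interval."""
    best = float('-inf')
    for i in range(len(years) - window + 1):
        best = max(best, trend_per_century(years[i:i + window], temps[i:i + window]))
    return best
```

Note the caveat above still applies when doing this for real: rolling slopes are noisy first derivatives, and a heavily smoothed reconstruction will suppress exactly the high-frequency variance that drives short-window trends.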
There you go with your loopy, upside down logic again, demanding that skeptics prove a negative.
But this time you’ve managed to outdo yourself with the part about it “isn’t making a claim that flood frequency has already increased, only that it will.” As Hannah Arendt:
Ah, no. In that case I’m saying that applying anecdotal evidence to a question of frequency is fallacious.
Yes, that was an error on my part. The article states that we’re seeing a “new normal”, which I had forgotten when I wrote that comment. I also immediately caught my own error and corrected it.
This reader wants substantiation of that claim, and the article doesn’t provide it. Therefore, the article gets dinged in my book … as I have already stated.
Please continue exhibiting poor reading skills, Glenn. It does you much credit.
Well there you go with your argument by gibberish again.
But at the end of the day, and after all the rhetorical back flips and flourishes, the question remains the same: Do you have the empirical evidence to demonstrate that burning fossil fuels causes more frequent and severe flooding or don’t you?
Kip, on a smaller scale, turn your thoughts to the Ellicott City, MD flood.
The authorities did what they were set up for from previous events, but by the time the phones were ringing telling folks to flee, those folks’ cars were being washed down the street.
JCH, sorry I’m getting back late; I only just noticed your comment. A couple of points about your observation with regard to the Ellicott City flood. Floods at that location are a known risk. The county has a full-time emergency management center which is responsible (in part) for vulnerability to floods, with an automated early-warning system of gauges throughout the drainage basin, mitigation studies, practice drills, and (believe it or not) computer models. A couple of years ago the Crisis Center hosted an open house so us taxpayers could see how our money was spent. It looked like a Hollywood set right out of the movies. I’ve read elsewhere that Howard County is one of the “richest” in the country.
It seems we can afford the best. I hope that means professional staff as well as window dressing. So despite these efforts, we get a flood that the system could barely cope with. The best wasn’t good enough? Of course, the “blame game” is in full swing with fingers pointing in all directions. It is an election year. Well the county had a PR type on the local TV station. I’ll point out two things, he wasn’t standing by a bridge, and was not wearing a clown costume. I can’t speak for Missouri or Louisiana.
More anecdote not answering a frequency argument.
Gates, the people claiming we will see more frequent flooding events, and events in places we haven’t seen them before, lack even anecdotal evidence. They got nothing – except models and faith.
Irrespective of whether that’s true or not (hint: it’s real tough to prove a negative claim), how does that excuse the anecdotal arguments Denizens are making?
What a horrible thing to only have physics.
No negative claim, Gates, meaning you don’t have an answer and just threw out chaff to hide behind.
The claim being discussed is the one made by the alarmist camp, saying that we can expect to see more frequent flooding events and events in places we haven’t seen them before. Someone responding to that claim by pointing out past similar events may not be strong evidence against that claim, but it doesn’t excuse the proponents of that claim from providing the evidence which supports it. It really is that simple. They are making the claim (or stating their hypothesis), thus the responsibility to provide data supporting that hypothesis is on them. And as I said, they lack even anecdotal evidence.
There you go with your loopy, upside down “logic” again.
Let’s take a very simple example of someone being asked to prove a negative claim.
In this case, the district attorney is demanding that the defendant prove the negative claim, the negative claim being, “No I didn’t (shoot Joe).”
Now let’s apply that to the immediate case:
In this case, it is brandonrgates who is demanding that the skeptic prove a negative, the negative claim being, “I doubt that (burning fossil fuels causes more frequent and severe flooding).”
Trivia question: how many Missouri River dams were there in 1927?
Before you can compare historical floods with a contemporary flood event, you have to account for every single mitigation project between then and now, including ones that might have made modern flooding worse. Only then can they be compared. It’s unlikely such an analysis has been done for the recent Louisiana flooding.
Too often some bozo looks at a flood gauge by some bridge and shoots his mouth off on the internet.
What’s your point?
Has all of the channeling and flood control infrastructure made comparisons between now and the past something not so simple? I’ll agree with that.
Or are you implying that extreme “climate events” are being hidden (i.e., can’t be assessed against past events) by all this infrastructure? That’s a lot tougher case to be made. Because in addition to all that flood control infrastructure, we also have a huge increase in impervious surface area and a large decrease in lowland marshes and other wetland areas which would have acted as a sponge for flood waters.
As I said, anyone interested in the recent Louisiana flooding should compare to the Great Flood of 1927.
Those wishing to try to turn everything into some kind of climate wars fight will have to do so without me. (I am not much interested in that.)
It is the press, the media, in general, that were pushing the climate change angle, as they do with every major weather event.
Why don’t you mention the flood mitigation built since 1927 that may have made this look like a smaller event in the areas affected by the 2016 flood?
compare the Great Flood of 1927.
1927 was a flooding of the Mississippi over a great area with rains over a large area.
2016 LA floods were from locally extremely heavy rains:
In the areas affected by the 2016 flood, there are records of prior flood events, so historical floods can be compared… it doesn’t have to be 1927. 1953 would also do fine, and a flood during WW2 basically drove the funding to turn the Missouri into multiple large dams.
They are saying the Amite River crested at a record 46 feet, so the little 2016 rain event should be very competitive with history.
The Great Flood of 1927 ==> Many believe that it is the flood mitigation efforts taken after the 1927 flood, intended to ensure that such a flood never happened again, that have contributed to the current flooding patterns and to the “sinking” of the Mississippi delta.
In the current flooding, many areas that flooded were developments built on drained wetlands… a hint that maybe that wasn’t the right place to build. Revkin, at the NY Times, is quite a promoter of the idea that a lot of the losses due to present day weather disasters is due to “building in harms way”. I agree with him on this. My wife and I live half the year on our sailboat and have been up and down the East Coast too many times. We see million dollar homes, and entire neighborhoods, being built on land no more than a few feet above current normal high tide — not to mention cities appearing on coastal barrier islands, which are nothing more than sandbars.
My Aunt and Uncle owned an old military barracks on a barrier island along the east coast. When they bought it there was a ferry to that end of the island, and a bridge on the far end of the island. The ferry was faster than driving 80 extra miles to use the bridge. At the time there were several small fishing villages on the island. Their end of the island was remarkably wild… beautiful dunes and live oak forests… full of rattlesnakes.
The development forces demanded a new bridge. Soon resort hotels and big beach houses followed. Huge money led it… anybody who opposed development was branded an environmental wacko. The forest is mostly gone. It’s basically shore-to-shore construction.
‘ Revkin, at the NY Times, is quite a promoter of the idea that a lot of the losses due to present day weather disasters is due to “building in harms way”. I agree with him on this. ‘
I fully agree. I worked with the UK environment agency for some 10 years on one of their flood defence committees. They had no legal mandate to tell councils to refuse developers wanting to build on flood plains and the public purse had to pick up the cost of subsequent flood defences, often once a flood had occurred.
There are many many examples of people building in highly desirable areas that have road names such as ‘Floods reach’ ‘water street’ ‘Cliffs edge’ ‘beach road’ etc. They didn’t seem to get the idea that these roads were called that for a reason…
It’s a gigantic “so what?” moment. Opposing development is opposing economic growth. Don’t be a commie.
Reply to JCH ==> The point here is where development is done in areas prone to flooding, hurricanes, etc. Building on known 50 and 100 year flood plains, for instance. In Louisiana, many of the flooded neighborhoods were built on drained wetlands — areas where water normally pools — those known to be prone to flooding.
Building on the beachfront of the Outer Banks of North Carolina — totally in harm’s way. Hurricane Irene opened new inlets on Hatteras Island, for instance, and damaged many homes built on the beaches there.
The people, via taxation by local, state and federal governments often are asked to pick up the bill for repairs through government sponsored insurance programs.
Sometimes this type of foolishness is used to try to “prove” that weather is causing more disasters, as the dollar amount keeps going up — when the dollar amount really reflects simply more development in harm’s way.
The point, JCH, is that one of the artifacts used by alarmists whenever a weather event occurs is to point to the record amount of damage being caused. Conveniently leaving out the part about growth from development – or, as Kip quoted, “more stuff in harm’s way”.
No one here is necessarily arguing to stop or reverse development. What is being asked is for some honesty.
Grab hold of your seats!
Almost unbelievably, there was a segment on the CBS Evening News last night that attributed the increase in fires in California to something else besides CAGW.
It begins here at minute 00:09:40
eGads! California forest fires caused by Charlie Osgood’s retirement? (actually, couldn’t get the video to play…)
I don’t know why it won’t play. Maybe this link will work. It’s the 8/28 program.
Health security and climate change. What exactly is “health security”? Is it that an ambulance will likely be available in an emergency? The information is brought to us by THE CENTER FOR CLIMATE & SECURITY,
EXPLORING THE SECURITY RISKS OF CLIMATE CHANGE. The advisory board is full of retired admirals and generals. I already feel we are at war with climate – or maybe climate change.
The title of this thread and the articles linked got me thinking about the journey you’ve taken to get from AGW conformist to “right on the money”.
I am thinking about where you’ve come from, how you got to here and my interpretation of where I think you may be heading. I think you are making a significant contribution to where policy analysts need to focus.
In my opinion GHG emissions are not a serious threat and the public, voters and policy makers are slowly becoming more convinced of this.
Therefore, there will be no significant world agreement on command-and-control policies to limit or control global GHG emissions. There will be no global carbon pricing schemes. Therefore, any attempts by countries or regions to impose such policies will fail, eventually.
However, what I think you are advocating – policies to improve resilience – are highly relevant no matter if GHG emissions are a problem or not.
What the world needs most are policies to increase GDP growth rate – i.e. development and productivity – for the whole world. Policy needs to focus on removing the barriers that hold the poor countries back.
One of the most important factors to focus on is to reduce the cost and increase the availability of reliable and plentiful energy for all countries as rapidly as possible.
Peter: written over 2 years ago (so some of the links may be out of date), this article concludes in much the same way: https://ipccreport.files.wordpress.com/2014/04/uk-climate-policies-are-pointless2.pdf
Thank you for that. It’s interesting. So is your bio. I am interested in unbiased analyses of the risks (both positive and negative) of GHG emissions. (Project risk management considers risks to be both positive and negative, where positive risks are opportunities to improve the outcomes, e.g., increase scope, improve quality, reduce schedule, reduce cost.)
Reblogged this on Climate Collections.
Are we also preparing for cooler global temperatures?
New study shows impact of man-made structures on #Louisiana’s coastal #wetlands
Louisiana’s sinking coast is a $100 billion nightmare for big oil
For at least 40 years it has been very clear that the combination of river levees, which starve the delta of sediments, and man-made canals, which encourage the intrusion of salt water and mechanical erosion, leads to the reduction of wetland areas which protect man-made infrastructure.
Will slow sea-level rise slightly exacerbate the problem? Of course. Otherwise there is no significant nexus between the loss of the lower Mississippi delta and “climate change”. This is a real problem which requires its own solution set, and it should be divorced from its more popular cousin.
The World Bank’s Climate Change policies and those of the UN’s IPCC are as valid as the Bethe-Weizsäcker model of nuclear “binding energy”.
re: 6 maps…
We can’t all agree climate change, “is not getting reversed or slowed down,” by anything we do; otherwise, we couldn’t refer to a hindsight look at weather as ‘climate change.’ (italics added)
We’d have to call it, ‘climate no-change.’ Can we all, however, agree the author understood that an article about life around the world 50 to 100 years hence needs global warming to be shoehorned in to augur up some relevance?
Re: Bjørn Lomborg’s Toilet Revolution …
$11 per person. Cheap! Most First World people spend more than that to repair the flapper valve in just one of their flush toilets.
I can imagine New York City, Chicago, LA, London and Paris without solar panels or wind turbines. I can’t imagine them without sanitary toilets.
This is actually a retread from the “Week in review – energy, water and food” from the 20th, but an article worth highlighting again. At the risk of boring readers a second time I will repeat my comments from that post:
I have always had a fascination with simple inventions which had a huge impact upon society.
Few ideas are as simple or as impactful as the “s” trap and the indoor toilet and sink which it enabled. With the indoor flushing toilet, mankind could occupy housing stacked vertically and compressed horizontally with few limits and still enjoy clean and healthy conditions. The “petri dish” effect of large cities was eliminated with this one improvement, using far less water than was needed for Roman-style sanitation. Once carried away, waste could be treated at a few key locations chosen by urban planners and engineers.
But country folks and mariners, among others, have long known that the flush toilet may not always be the optimum solution. Many smart people have studied the problem for a long time, and for certain situations some pretty good solutions already exist. In any case, the world could certainly use more options which are socially acceptable and capable of improving local living conditions and community health. This seems like a very worthwhile project. I hope that it is managed well enough to avoid the human tendency to infest and destroy such well-intentioned initiatives.
What diet makes best use of farmland?
From the article:
“The researchers concluded that a strategic shift toward a plant-based diet could reduce the amount of land needed to feed U.S. consumers and at the same time increase the number of people who can be fed from our agricultural resources. The result could be more food — without the need to clear more land — for hundreds of millions of people around the globe. ”
So, how will that be achieved? By using the power of the state? People make their own food choices based on habit, cultural preference, tastes, availability, and cost. How do you change that, assuming the change is desirable?
It is interesting to note:
1. Humans evolved a large brain as omnivores – eating meat provides far more energy for a high-calorie burning brain than plant foods.
2. Food crops are cyclical and vulnerable to drought, weather, pests, and disease. Feast and famine. They are also incomplete proteins.
3. Animals are food storage and insurance on the hoof, or chicken foot.
4. Some animals survive on marginal land – goats in the mountains of Afghanistan, for example. Others survive in cold wet climates – like sheep, cows, reindeer, etc. They can eat tough crops that are not that interesting to people. Consider turnips during the enclosure and the LIA in England. In that case, climate change induced an agricultural revolution away from wheat crops grown on commons.
5. Pigs are very efficient – they eat human refuse and grow rapidly in small spaces.
6. Chickens can scratch out a living. :)
7. Domesticated fish: There are human waste eating fish cultivated for human consumption in SE Asia. Eeeewwww! There are other species that tolerate warm water low in oxygen. Not for me, but whatever. I’m more of a brook trout guy…
As for the future, well, you know the thing about predictions.
I suspect we will be “growing” food from cells in the lab someday. Hopefully we’ll get that done before we have to start eating Soylent Green.
In the meantime, we’ll have to rely on modern industrial agriculture regulated by well-designed environmental laws. It actually works pretty well.
Poverty, not agriculture, is the problem.
Nice comment. In much of the world, food animals can graze a landscape that cannot be farmed: too dry, too short a growing season, too rocky, too steep. Sheep, goats, yaks, Masai cattle, camels, Swiss dairy cows, alpacas, llamas, and various poultry occupy this very large ecological food niche globally. There was a reason Plains Indians like the Arapahoe depended mainly on buffalo while eastern tribes like the Piscataway grew corn, squash, and beans to supplement fish and game—Arapahoe didn’t have center pivot irrigation systems. And many animals give ‘double’. Milk then meat. Eggs then meat. Wool then meat.
Homo sapiens evolved as an omnivore in hunter gatherer fashion. Agriculture just made both activities much more efficient. The game is in the pastures, the gathering is in the fields. The ‘more vegan is better’ view is very parochial. The whole vegan diet thing is nuttery unless one is Hindu.
If South Dakota had to produce plants that people directly eat… its agricultural economy might simply disappear. Ultimately, they produce meat. There would appear to be many other places that are significantly more efficient at producing plants that people eat directly.
May I say that I share your confusion? Just in the last weeks WAPO had articles, first saying farmland is threatened by agw, then saying farmland causes agw. I suppose I can see both ideas, but it seems that the coin has two sides with us on the edge. Someone up thread mentioned tripe. My old granny came over on the boat from Scotland. We ate tripe. Your point about cultural tastes struck a chord. I can go to my local grocer and get huge jars of pickled pigs feet, but if I want tripe I have to put in a special request with the meat department. Keep fishing.
Good commentary, Justin.
“The researchers concluded that a strategic shift toward a plant-based diet could reduce the amount of land needed … The result could be more food — without the need to clear more land”
The article is behind the times with some of its ideas and theories. There’s a lot of disinformation spread by greenies. Most 1st world nations reached peak land use for agriculture decades ago. More and more land has been released back to nature as a result of both technology and more efficient land use. The results have led to a doubling of forest for most developed nations over the last 100 years. Chicken won the protein wars because of the limited amount of resources needed to raise them.
The below article was posted on this site some months ago, but it details the agricultural trends that we are seeing today. It’s pretty fascinating.
As I have understood, even cutting anthropogenic CO2 emissions may be regarded here as an adaptation action. However, there is no need to cut anthropogenic CO2 emissions, since there is no real risk of global warming that could be caused by anthropogenic CO2 emissions; https://judithcurry.com/2016/08/23/measuring-bias-in-the-u-s-federally-funded-climate-research/#comment-807304 .
Looks that way, Lauri, but you could make it no real risk of harmful global warming, to include the Lukewarmers.
So there is a ‘Center for Climate Security’.
And it’s a bunch of military guys.
Climate Industrial Complex?
I notice that as there are more and more non-profits, I feel less and less secure.
“The Louisiana floods are devastating, and climate change will bring more like them. We’re not ready.”
Vox appears to me to be highly ideological and close minded.
Baton Rouge recorded about 30cm on 12 August: <a href="https://www.wunderground.com/history/airport/KBTR/2016/8/28/MonthlyCalendar.html?req_city=Baton%20Rouge&req_state=LA&reqdb.zip=70786&reqdb.magic=1&reqdb.wmo=99999#calendar">Weather Underground monthly calendar for KBTR</a>.
In the US, such rains of 30cm or more do not show any trend, despite IPCC assertions that they are increasing:
The NatGeo article on people adapting to climate change is an excellent example of utterly patronising waffle – Wow! Look here! People coming up with solutions without the aid of the white man! Remarkable! What any of these “remarkable” actions have to do with climate change is also never even mentioned, let alone explained.
Other folk’s questions help us sort our own thoughts. Thank you all.
scientists declare: The Anthropocene epoch
Humanity’s impact on the Earth is now so profound that a new geological epoch – the Anthropocene – needs to be declared, according to an official expert group who presented the recommendation to the International Geological Congress in Cape Town on Monday.
You may remember that our old –and much lamented – friend, RGates used to promote the idea of the anthropocene.
Our new friend Brandon Gates does not seem to mention it as much, seems to agree that the dire things predicted by models haven’t happened as yet and that there is such a thing as natural variability.
We can almost make him an honorary sceptic.
Yes, I do remember RGates (a true gentleman of the blog); we had a few exchanges on the subject of SSW.
re Anthropocene: I think the whole thing is nonsense, judging by the so-called evidence, next to worthless, listed at the article’s end.
They are now looking for a “golden spike”. I would suggest they go back some 5000+ years, to that now much-troubled area once known as Mesopotamia, where the first written-language records were found.
Not until he forswears Marcott. Hope he will read my old post on that.
I take it you mean this post?
Makes the same point that McI, Tamino, Nick Stokes etc. all make: the 20th Century uptick isn’t robust (as stated in the original paper). This is old news. My response is the same as it always is: we have real thermometers for the 20th Century.
A few weeks ago you created an interesting chart and I supplemented it with some of my own
You were hoping to combine them all. Anything come of it?
It seemed to demonstrate that tracking highly variable thermometer readings, taken every day or month and averaged to an annual figure, presents an entirely different picture from somewhat theoretical novel proxies such as corals and tree rings, where the sampling interval is often 50 or even 100 years.
In short, the spaghetti type reconstructions appear to be a very coarse sieve through which real world annual variability seems to fall.
What isn’t robust is the blanket assumption that there were no such “upticks” in previous centuries.
But there were none (or not enough) previously.
plenty of evidence of upticks (and downticks in CET)
It’s a period I am still researching, but there looks to be another uptick around 1500, early in the 1400s, and late in the 1200s…
Upticks and downticks appear to be the ‘norm’ if anything in climate can be said to be normal, rather than the gently changing data that averaging and novel proxies with coarse sampling produces
Mostly negative results I’m afraid, the main one being that proxy reconstructions disagree with each other in annoying ways. It would be best to generate plots illustrating what I’m talking about. I can briefly describe what I attempted to do.
For each paleo temperature reconstruction, I attempted to do a multiple linear regression which took into account solar, volcanic aerosols, orbital and CO2 forcings as well as modes of internal variability (to wit: ENSO, PDO and AMO taken from Mann 2009b). Getting good fits involved dialing in leads/lags for the forcings and internal variability, sometimes dubiously radical ones. No one single set of parameter twiddling provided decent fits for all of the temperature recons.
My conclusion was that I was doing nothing more useful than wiggle-matching and overfitting, thus getting visually plausible but physically implausible results. One diagnostic for that was sensitivity to CO2 forcing, which was widely different across all temperature reconstructions. That was to be expected from the beginning as the long-term trends for each temperature recon are also widely different.
My struggle also reaffirmed something I already knew; my simple linear models simply aren’t up to the task of emulating a highly non-linear system at such a highly simplified global level. And ultimately, that the temperature recons I was working with so widely disagree with each other it was impossible to consistently torture the data in a way that obtained a cohesive narrative from them.
I spent at least a full working week in my pursuit, perhaps two. Unless I’m confident that I know what I’m doing, if I can’t solve it in half a month, I can’t solve it.
I’ll bang away at making some plots this week, for better or for worse, because a picture is worth a thousand words — especially of failure. Feel free to follow up with me about it if (when) I procrastinate further, I did enjoy our discussions on the topic.
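For what it’s worth, the mechanics of the lagged multiple regression described above can be sketched in a few lines. Everything below is synthetic stand-in data invented for illustration, not the actual forcing series or temperature reconstructions; it shows only the shape of the exercise, including why dialing in a lag per predictor invites overfitting.

```python
# Sketch of a lagged multiple linear regression of a "reconstruction"
# on several drivers. All series here are synthetic stand-ins; the
# real exercise used solar, volcanic, orbital and CO2 forcings plus
# ENSO/PDO/AMO indices.
import numpy as np

rng = np.random.default_rng(0)
n_years = 1000
# Synthetic annual-anomaly predictors, one per driver.
predictors = {name: rng.standard_normal(n_years)
              for name in ("solar", "volcanic", "co2", "enso")}
# Synthetic "reconstruction": a lagged mix of the drivers plus noise.
target = (0.3 * np.roll(predictors["co2"], 10)
          + 0.2 * predictors["enso"]
          + 0.1 * rng.standard_normal(n_years))

def fit_with_lags(target, predictors, lags):
    """Ordinary least squares of target on lag-shifted predictors.

    lags maps predictor name -> lag in years. Note np.roll wraps
    around circularly, which is fine for a sketch but not for real
    series. Returns (coefficients, residual RMS); the coefficient
    order is intercept first, then the predictors in dict order.
    """
    X = np.column_stack([np.roll(series, lags.get(name, 0))
                         for name, series in predictors.items()])
    X = np.column_stack([np.ones(len(target)), X])  # intercept
    coefs, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ coefs
    return coefs, np.sqrt(np.mean(resid**2))

# Searching over many lag combinations for the best fit is exactly
# the wiggle-matching hazard noted above: the "best" lags may be
# curve-fitting artifacts rather than physics.
coefs, rms = fit_with_lags(target, predictors, {"co2": 10})
print("coefficients:", np.round(coefs, 2), "residual RMS:", round(rms, 2))
```

With the correct lag supplied, the fit recovers the planted coefficients; with a wrong lag the residual grows, which is the signal one would tune against, for better or worse.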
Thanks for your interesting and detailed reply. I think wiggle matching is probably an all too common problem, as people spend a long time creating data and following a hypothesis they have invested a lot of time in.
I am thinking of making you an honorary RGates, but I promise not to tell Sou that you seem to be enjoying it over here. :)
Let’s assume there were. Options are:
1) External forcing.
3) Internal variability.
4) The laws of physics changed.
5) Something else.
6) Some combination of all the above.
If we can’t rule out (4) right off the bat, we may as well give up. (5) isn’t very satisfying because it allows us to make up anything we wish to believe, and that’s not good science in my book. (6) is valid, and I would argue necessary because One Explanation To Rule Them All doesn’t seem appropriate in a discussion about a complex physical system such as a planet.
(1) implies a high climate sensitivity, which doesn’t bode well for CO2 and other anthropogenic GHG emissions.
(2) implies strongly positive feedbacks, and again a high climate sensitivity to external forcing.
(3) implies a low sensitivity to external forcing, but an outbound energy imbalance when the surface and atmosphere are warming and a loss of ocean heat content — which is inconsistent with warming trends since 1950.
We have them today. What they suggest is that we’re warming, and that we’re driving it. That they don’t tell us the history of the entire planet is an impossible standard of proof which is the typical refuge of people who, for whatever reason, don’t like what a consilience of evidence backed by well-established theory is telling us. You know, the sorts of folks who declare by fiat that there are no transitional fossils and ask, “If we evolved from monkeys, why are there still monkeys?”
You’re welcome, it was my pleasure to write it up.
As you can see, we’re already back to my regularly scheduled acerbic programming. Keeping up appearances for Sou is important, donchaknow. :-)
Without making unwarranted assumptions about how clouds were behaving, you can’t conclude anything about “outbound energy imbalance”.
Due to potential variations in cloud-generated albedo, you can’t say anything about how much solar SW gets captured.
And due to potential variations in cloud-generated “greenhouse” effects, you can’t even say anything about the capture of outbound IR from the surface, which means you can’t make assumptions about how it relates to surface temperature.
You can’t make unwarranted assumptions that clouds reflect sunlight.
Yup. Do you suppose a change to the downwelling IR due to GHG’s might change the balance of day clouds and night clouds?
That’s a different mechanism from what is posed in my item three. Answering “but clouds” to (3) implies a strongly positive cloud feedback, not a strongly negative one.
Worse than unwarranted assumptions is trying to have things both ways.
For some reason, I cannot seem to properly thread my replies today. My apologies.
The problem, IMO, is that you can’t assume there’s any causal relationship between global surface temps and heat flow into the deep (subthermoclinal) ocean.
Assume that both that heat flow, and some aspects of clouds, vary in (pseudo-)random ways on time-scales from annual to millennial. Assume that the “feedback” from clouds to CO2 “forcing” is strongly negative at a global level.
In that case, you could have a very low global sensitivity to CO2, while surface temperatures, rainfall, and deep ocean heat flow all vary substantially and (pseudo-)randomly at a regional level. And have been all along.
The “global averages” of those features wouldn’t vary as strongly, but could easily show sharp upticks and downticks of the sort we see in the 20th century.
Of course, those foregoing assumptions were just for the sake of illustration. Actually making them would be as unwarranted as assuming that the 20th century uptick is totally unprecedented.
As I’ve said before, my personal opinion is that it’s likely the regional effects of CO2 strongly outweigh its global effects, and combine in a variety of non-linear ways with regional internal unforced variation. But that’s just a guess. Based on the probable (IMO) existence of substantial regions of strong negative cloud feedback to CO2 “forcing”, adding to likely positive feedbacks in other regions.
Heat had to get there somehow. Options are from above or below.
OK, I was leaving a few parts out as commonly understood.
AFAIK the major flows of heat into and out of the subthermoclinal ocean are as follows:
1. Upward [sic]* via subduction of cold water at/near the poles
2. Upward via upwelling off the west coast of continents
3. Downward via wind/eddy-driven turbulence over much of the ocean.
Other flows like volcanic (from below), diffusion (from above), etc., can probably be ignored as trivial for a first-order analysis. If I’ve missed a major, please let me know.
Most of the deep ocean is within a few degrees of freezing, and century-scale variations in transport, even 20th-21st century variations, add up to a tiny fraction of a degree.
1. Cold water subduction could well be influenced by current and wind patterns, as well as temperature at the polar end of the mid-latitude gyres. It’s certainly plausible that there’s some sort of longer-term correlation between temperature and subduction rate, but it’s also plausible that there isn’t, or even a reverse correlation within limited bounds.
2. ENSO (and its Atlantic equivalent to a much smaller degree) provide major variations of some sort in some aspects of upwelling, by bringing more or less cold water into effective range of surface mixing. There is no good reason to assume any direct (e.g. linear) relationship between average surface temperatures at a longer time-scale than ENSO and upwards heat flow.
And while El Niño would seemingly involve both higher global surface temperatures and lowered upwelling rates, it also seems to involve higher rates of planetary heat loss (AFAIK).
3. Eddy-driven heat transport involves bringing heat from a warm surface into the colder depths. Small changes to surface temperature would seem (to me) to have a smaller effect than variations in storm tracks in the West Pacific, South China Sea, and (to a lesser extent) Caribbean.
AFAIK fluid transport by eddies has two components: a sort of “scooping up/down” of boundary layer fluid when the eddy is stretched, and lateral transport in the plane of the eddy, which could accomplish vertical heat transport when the eddy’s plane of rotation isn’t vertical.
I would assume (in the absence of studies, which I couldn’t find when I last looked) that both of these mechanisms would be substantially amplified by increased bottom relief. Thus, we should look to regions with high bottom relief as well as substantial wind-driven meso-scale eddies for the greatest downward heat flow. It helps if they’re in the tropics, where the surface water is much warmer.
For variations, which is what we’re looking for, I would guess (but that’s all it is) that the major source of variation would be changes to storm tracks, synoptic activity, and gyre edges. Specifically, how they interact with variations in bottom relief.
As far as I can see, none of these major heat flow mechanisms offers any justification to assume a linear (or pseudo-linear) response to small changes in “global average temperature”.
* A downward flow of water that’s colder than the main body is tantamount to an upward heat flow.
Fair enough. I could have done a better job asking for clarification.
AFAIK (which ain’t much), you’ve covered the majors. It’s not clear to me that we can entirely ignore diffusive transfers. Other than that, no issues here.
Sure. Down to 2,000 m, the oceans have about 540 times the heat capacity as the atmosphere. About 1,000 times to full depth. Figuring out land masses’ “effective” heat capacity (if there is such a thing) is trixy. Point is, the oceans are a tremendous heat sink and thus I would argue that they’re the single best diagnostic of planetary energy imbalance (if someone held a gun to my head and forced me to pick one and only one).
From the mid-20th century to present, the ratio of 2,000 m ocean temperature to global land/ocean surface temperature (HADCRUT4) is 1:7.6. Bintanja et al. 2008 (a three million year paleo model study constrained by temperature proxies) puts the “deep ocean”/surface ratio at about 1:5, with peak to peak (and trough to trough) lags of about 2,000 years. That lag is about double the commonly-cited round-trip time of the thermohaline “conveyor belt” of 1,000 years.
True or not, that’s my multi-millennial-scale picture of forced change and surface/ocean temperature equilibrium.
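As a rough sanity check on the capacity ratios quoted above, one can redo the back-of-envelope arithmetic with approximate textbook values for ocean area, seawater density and specific heats. These inputs are my own round-number assumptions, not figures from the comment’s sources:

```python
# Back-of-envelope ocean vs atmosphere heat-capacity ratios.
# All constants below are approximate textbook values (assumptions).

OCEAN_AREA = 3.6e14        # m^2, total ocean surface area
SEAWATER_DENSITY = 1025.0  # kg/m^3
CP_SEAWATER = 3990.0       # J/(kg K)

ATMOSPHERE_MASS = 5.1e18   # kg, total mass of the atmosphere
CP_AIR = 1004.0            # J/(kg K), at constant pressure

def ocean_heat_capacity(depth_m: float) -> float:
    """Heat capacity (J/K) of a uniform ocean layer of the given depth."""
    mass = OCEAN_AREA * depth_m * SEAWATER_DENSITY
    return mass * CP_SEAWATER

atmos_capacity = ATMOSPHERE_MASS * CP_AIR

ratio_2000m = ocean_heat_capacity(2000.0) / atmos_capacity
ratio_full = ocean_heat_capacity(3700.0) / atmos_capacity  # mean depth ~3.7 km

print(f"0-2000 m ocean vs atmosphere: ~{ratio_2000m:.0f}x")
print(f"full-depth ocean vs atmosphere: ~{ratio_full:.0f}x")
```

The crude layer model lands in the same ballpark as the figures quoted (a few hundred times for the top 2,000 m, roughly a thousand times for the full depth), which is about as much agreement as such round numbers deserve.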
Deep-water formation pretty much has to be density driven, so don’t leave out salinity.
I’ll restate that just to make sure I understand: ENSO frequency is too high to explain any secular surface temperature trend.
Yes, for the physically plausible reason that higher surface temps correspond to higher rates of heat loss. This is my main problem with internal variability diddit to explain surface temperature rise since the mid-20th century.
I think this is one of several chicken-egg problems we’re facing here. That would tend to put more heat into the oceans, but I’m (as usual) having difficulty imagining how that would get us a warming surface at the same time.
You keep saying that and I keep trying to figure out where you think I argued it.
From the article:
China’s government-run nuclear industry is based on foreign technology but has spent two decades developing its own with help from Westinghouse Electric Co., France’s Areva and EDF and other partners. A separate export initiative is based on an alliance between Westinghouse and a state-owned reactor developer.
The industry is growing fast, with 32 reactors in operation, 22 being built and more planned, according to the World Nuclear Association, an industry group. China accounted for eight of 10 reactors that started operation last year and six of eight construction starts.
Abroad, builders broke ground in Pakistan last year for a power plant using a Hualong One, supported by a $6.5 billion Chinese loan. Also last year, Argentina signed a contract to use the reactor in a $15 billion plant financed by Chinese banks.
Sales come with financing from state banks, a model that helped Chinese companies break into the market for building highways and other public works in Africa and the Middle East. State-owned companies also are lining up to invest in nuclear power plants in Britain and Romania.
“This is generating significant build-up of skills and industrial experience,” said Mycle Schneider, a nuclear energy consultant in Paris, in an email.
Still, Beijing is “seriously underestimating” how hard global sales will be, said Schneider. He said obstacles include strict quality controls, regulations that differ from country to country and competition from the falling cost of wind and solar.
“There is simply no market out there,” said Schneider.
At home, Beijing faces public unease about nuclear power following an avalanche of industrial accidents and product safety scandals.
Norway drips with hypocrisy. These people need a swift kick in the pants. Their economy is based almost solely on oil, but then they do this … from the article:
Norway’s $900-billion wealth fund can no longer invest in Duke Energy, the biggest U.S. power firm by generation capacity, due to alleged breaches of environmental law at its coal-fired plants, Norway’s central bank said on Wednesday.
The fund, which owns 1.3 percent of the world’s listed company equity with stakes in some 9,050 firms, is barred from investing in companies that make nuclear weapons, anti-personnel landmines or tobacco, among other ethical criteria.
Duke Energy and its subsidiaries Duke Energy Carolinas, Duke Energy Progress and Energy Progress Inc were excluded “based on an assessment of the risk of severe environmental damage”, the central bank’s board said in a statement.
At the end of 2015, the Norwegian fund, the world’s largest, held 0.62 percent of Duke Energy, a stake worth $304 million, but it has since sold the shares in Duke and its subsidiaries.
The board’s decision was based on a recommendation by the ethics watchdog for the fund, the Council on Ethics.