by Steve Mosher and Zeke Hausfather
Today the Berkeley Earth Surface Temperature Project publicly released its accumulated minimum, maximum, and mean monthly data.
The data set is a composite of fourteen different surface temperature datasets, including:
- Global Historical Climatology Network – Monthly
- Global Historical Climatology Network – Daily
- US Historical Climatology Network – Monthly
- World Monthly Surface Station Climatology
- Hadley Centre / Climate Research Unit Data Collection
- US Cooperative Summary of the Month
- US Cooperative Summary of the Day
- US First Order Summary of the Day
- Scientific Committee on Antarctic Research
- GSN Monthly Summaries from NOAA
- Monthly Climatic Data of the World
- GCOS Monthly Summaries from DWD
- World Weather Records (only those published since 1961)
- Colonial Era Weather Archives
This represents an unprecedented amount of land measurement data, with 40,752 unique station records comprising over 15 million station-months of data. It is also an invaluable resource for those of us interested in analyzing land temperature data, as it provides considerably better spatial coverage than any prior dataset.
The overall picture is unsurprising: the Berkeley Earth data shows nearly the same long-term land warming trend found in NCDC, GISTemp, and CRUTEM records.
Note that the CRUTEM record used here is the Simple Average land product rather than the more commonly displayed hemispheric-weighted product, as it is more methodologically comparable to the records produced by other groups. The GISTemp record shown here has a land mask applied.
The Berkeley group does go considerably further back in time than any prior records, with data available since 1750 (or, more reliably, since 1800) with uncertainty ranges derived in part by comparing regions where coverage is available during early periods to the overall global land temperature during times when both have excellent coverage.
The Berkeley group also applies a novel approach to dealing with inhomogeneities. It detects series breakpoints based both on metadata and on neighbor comparisons, and cuts series at those breakpoints, turning them into effectively separate records. These "scalpeled" records are then recombined using a least-squares approach. For more information on the specific methods used, see Rohde et al. (submitted).
Having access to the raw data allows us to examine how the results differ if raw data is used and no homogenization process is applied.
Here we see published Berkeley data compared to two different methods using their newly released data. The “Zeke” method uses a simple Common Anomaly Method (CAM) coupled with 5×5 lat/lon grid cells, and excludes any records that do not have at least 10 years of data during the 1971-2000 period. The “Nick” method (using Mosher’s implementation of Nick Stokes’ code) uses a Least Squares Method (LSM) to combine fragmentary station records within 5×5 grid cells and only excludes stations with less than 36 months of data in the entire station history. The “Zeke” method employs a land mask to adjust the weights of grid cells based on their land area, while the “Nick” method does not. The dataset analyzed here is the “Quality Controlled” release, which involves removing obviously wrong data (e.g. a few 10000 C observations) but no adjustments.
The results show that our “raw” series are similar to Berkeley’s homogenized series, but show a slightly lower slope over the century period (and less steep rise over the last 30 years). How much of this might be due to differing methodological choices vs. homogenization is still unclear.
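For readers who want to experiment with the released data themselves, the gist of the Common Anomaly Method behind the "Zeke" series can be sketched in a few lines of base R. This is a minimal sketch rather than the code used for the figures above: it assumes a data.frame obs with columns id, lat, lon, year, month and temp (illustrative names, not the Berkeley file layout), approximates the 10-years-in-1971-2000 rule as ten values per calendar month, and uses a plain cos(latitude) cell weight with no land mask.

camGlobalMean <- function(obs, baseStart = 1971, baseEnd = 2000, minYears = 10) {
  # 1. Station climatology: per-station, per-month mean over the base period,
  #    kept only when at least `minYears` values contribute to that month.
  inBase  <- obs$year >= baseStart & obs$year <= baseEnd
  key     <- paste(obs$id, obs$month)
  climTab <- tapply(obs$temp[inBase], key[inBase],
                    function(x) if (length(x) >= minYears) mean(x) else NA_real_)
  clim    <- climTab[key]                 # align each observation with its climatology

  # 2. Anomalies, binned into 5x5 degree cells.
  ok <- !is.na(clim)
  g  <- data.frame(year = obs$year[ok], month = obs$month[ok],
                   lat  = floor(obs$lat[ok] / 5) * 5 + 2.5,
                   lon  = floor(obs$lon[ok] / 5) * 5 + 2.5,
                   anom = obs$temp[ok] - clim[ok])

  # 3. Average stations within each cell, then combine cells with a
  #    cos(latitude) area weight (a land-fraction weight would multiply
  #    this term; it is omitted here).
  cells     <- aggregate(anom ~ year + month + lat + lon, data = g, mean)
  cells$w   <- cos(cells$lat * pi / 180)
  cells$wa  <- cells$anom * cells$w
  glob      <- aggregate(cbind(wa, w) ~ year + month, data = cells, sum)
  glob$anom <- glob$wa / glob$w
  glob[order(glob$year, glob$month), c("year", "month", "anom")]
}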
Finally, the newly released Berkeley data includes metadata flags indicating the origin of particular station data. This allows us to compare the standard Global Historical Climatology Network – Monthly (GHCN-M) data that underlies the existing records (GISTemp, CRUTEM, NCDC) to all of the new non-GHCN-M data that has been added.
Here we see (using the “Zeke” method described earlier) that non-GHCN-M stations produce a record quite similar to that of GHCN-M stations, though there is a bit more noise early on in the record as the non-GHCN-M set has fewer station months that far back.
Steven Mosher has done extensive work on building a (free and open source) R package to import and process the Berkeley Earth dataset. Details about his package are below:
The first official release of the Berkeley Earth dataset can be a bit daunting, but everything is there for people to get started looking at the data. An R package, BerkeleyEarth, has been created to provide an easy way to get the data. First, some background on the data ingest process. Berkeley Earth Surface Temperature ingests many different source datasets, which are then transformed into a common format. The package has not been tested with the common format and won't support reading daily data until a future release. The sources are defined in the file source_flag_definitions.txt, which is read with the function readSourceFlagDefine().
The next step in the data process is to create a multi-valued file, which is created by merging the source data into one dataset. A site will typically have 3-4 series that make up its complete record. At this point only limited QC is done on the data. In future releases of the package the process of merging data and the QC applied will be documented. At this stage the package has not been tested on the multi-valued dataset.
Two steps are then applied to the multi-valued data: a quality control step and a seasonality step. After these two processes there are four datasets, all single-valued. The QC step applies quality flags to the data, removing suspect data elements. The seasonality step removes seasonality, as described in the main readme for the data. For the R package the following dataset was used: single-valued, quality controlled, no seasonality removed. Over time, and with help from others, the other single-valued datasets will be tested, as well as the multi-valued dataset and the source datasets.
Reading the data:
There are three different functions for reading the main datafile, data.txt. That file contains 7 columns of data: the station ID, the series ID, the date, the temperature, the uncertainty, the number of observations, and the time of observation. The data is monthly, so the number of observations indicates how many days of the month reported observations. The data is presented in a sparse time format: missing months are simply not represented in the file, rather than being written out as NA values.
A simple routine is provided for reading in this data as is:
readBerkeleyData(). The size of the Berkeley dataset is such that it may overrun the user's RAM. To manage this, the function creates a memory-mapped file of the data: on the first invocation, if the memory-mapped file does not exist, it is created. The function is called like so: Bestdata <- readBerkeleyData(Directory, filename = "data.bin"). On the first call "data.bin" does not exist, so it is created; this takes about 10 minutes. Once that file is created, subsequent access is immediate and the function simply returns access to the file. This is accomplished using the package bigmemory. All 7 columns are returned as a matrix.
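Under the hood this is the standard bigmemory pattern: build the file-backed matrix once, then re-attach to it on later runs. A rough sketch of that pattern (not the package's actual code; the file names, the seven column names, the separator and the number of header lines to skip are all illustrative and would need to match the release):

library(bigmemory)

cols <- c("station_id", "series_id", "date", "temperature",
          "uncertainty", "n_obs", "time_of_obs")   # names are illustrative

if (!file.exists("data.desc")) {
  # first run: parse data.txt and write a file-backed matrix (slow, done once)
  big <- read.big.matrix("data.txt", sep = " ", skip = 0, type = "double",
                         col.names = cols,
                         backingfile = "data.bin",
                         descriptorfile = "data.desc")
} else {
  # later runs: attach to the existing backing file (effectively immediate)
  big <- attach.big.matrix("data.desc")
}
dim(big)   # all seven columns as one big matrix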
To access only the temperature data, two functions are provided: readBerkeleyTemp() and readAsArray(). At present readBerkeleyTemp() is not optimized: it reads the data and creates a file-backed matrix for just the temperature data, which currently takes hours; once the buffering is optimized this will be reduced. The approach is the same as with reading all the data: data.txt is read and processed into a 2D matrix, or data.bin is read if it already exists. The 2D matrix has a row for every time in the dataset, from 1701 to present, and a column for every station, over 44,000 of them. Missing months are filled in with NA, and there will be stations that have no data. Once the file-backed matrix is created, access to the data is immediate, but the first call to the function takes hours to rebuild the dataset into a 2D matrix; as noted above, buffering will be added to speed this up. If your system has less than 2GB of RAM you are almost forced to use this method of reading. The function is called like so: Bestdata <- readBerkeleyTemp(Directory, filename = "temperature.bin"). Again, if temperature.bin does not exist it will be created by reading data.bin (or creating data.bin from data.txt); if temperature.bin exists it will be attached and access is immediate.
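Conceptually, what readBerkeleyTemp() and readAsArray() do is turn the sparse (station, time, temperature) triples into a dense time-by-station structure, with NA for unreported months. A toy sketch of that reshaping (the three-row data.frame and its column names are made up purely to show the indexing trick, not the real file contents):

best <- data.frame(station_id  = c(1, 1, 2),
                   date        = c(1850.042, 1850.125, 1850.042),
                   temperature = c(-3.2, -1.1, 10.4))

times    <- sort(unique(best$date))
stations <- sort(unique(best$station_id))
temps    <- matrix(NA_real_, nrow = length(times), ncol = length(stations),
                   dimnames = list(as.character(times), as.character(stations)))
# fill the dense matrix by (row, column) index pairs; unreported months stay NA
temps[cbind(match(best$date, times),
            match(best$station_id, stations))] <- best$temperature
temps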
The readAsArray() function is for users who have at least 4GB of RAM. This function reads data.txt into RAM and converts it into a 3D array of temperatures. At this time the function does not create a file-backed, memory-mapped version of the data. The dimensions of the array are station, month, and year. All of the functions in the package RghcnV3 use this data format. So, for example, to read the data in, window it from 1880 to 2010, and remove the stations that have no data, we do the following:
Berkdata <- readAsArray(Directory = myDir, filename = "data.txt")
Berkdata <- windowArray(Berkdata, start = 1880, end = 2010)
Berkdata <- removeNaStations(Berkdata)
The array format has R dimnames applied so that the station IDs are the dimnames for margin 1, months are margin 2, and years are margin 3. Berkdata[1, "jan", "1980"] extracts the first station's value for January 1980. It's a simple matter to create time series from the array. The array can also be turned into a multiple time series with asMts(), which unwraps the 3D array into a 2D matrix of time series with stations in columns and time in rows. Berkdata <- asMts(Berkdata); plot(Berkdata[,3]) just plots the 3rd station as a time series.
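As a small worked example (assuming Berkdata still holds the windowed 3D array from the readAsArray() snippet above, before the asMts() conversion), one station can be wrapped as an ordinary R time series:

stn  <- Berkdata[1, , ]                    # month-by-year slice for station 1
tser <- ts(as.vector(stn), start = c(1880, 1), frequency = 12)
plot(tser, ylab = "temperature")
mean(window(tser, start = 1951, end = c(1980, 12)), na.rm = TRUE)   # 1951-1980 mean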
In addition to temperature data there are various subsets of station metadata. For every subset of metadata, from the most complete to the summary format, there is a corresponding function: readSiteComplete() reads the complete metadata, and readSiteSummary() reads the file that contains the site ID, site latitude, site longitude and site elevation. These functions output data.frames that work with the RghcnV3 format. That means we can take a station inventory and "crop" it by latitude and longitude, using the RghcnV3 functions:
Inventory <- cropInv(Inventory, extent = c(-120, -60, 20, 50))
The package is currently at version 1.0, available on CRAN. The version described above is 1.1, which is in the build queue. Version 1.2, which has buffering implemented, is due for release shortly. The package RghcnV3 is also on CRAN.
Disclaimer: Both Steven Mosher and Zeke Hausfather are participants in the Berkeley Earth Surface Temperature project. However, the content of this post reflects only their personal opinions and not those of the project as a whole.
JC comment: The new version of the Berkeley Earth Surface Temperature dataset is now achieving its goal as an unprecedented data resource, including transparency and user-friendliness. The addition of Steve and Zeke to the team was an excellent move. They have clearly added value to the product. Further, they provide a welcome and needed link between the Berkeley team and the blogosphere.
Thank you, thank you, thank you for additional quantitative information on Earth’s surface temperature.
These results are probably too late to “save” the reputation of those scientists who blindly supported the AGW dogma.
But these results may persuade major research journals and the US NAS, the UK RS, and the UN IPCC to address four decades of quantitative data (1972-2012) on the origin, composition and source of energy in Earth’s heat source – the Sun (summarized on one page below):
http://dl.dropbox.com/u/10640850/The_Sun.pdf
The entire Western scientific structure may otherwise have to be rebuilt after AGW collapses.
One observation concerning the early part of the data is that the apparent variability is very large in the period 1800-1850. That raises the question: has the temperature really been more variable, or is the effect just a consequence of the small number of weather stations that contribute to that period?
My guess is that the reason is indeed the small coverage, as it's well known that wide-area averages vary less than local temperatures, but it would be interesting to know how well this issue has been analyzed. The first step towards answering that question would be to present the average based on the same very limited coverage up to the present.
Pekka, they have done this; it is actually the main feature in assessing uncertainty in their global averages. See the Rohde et al. paper
http://berkeleyearth.org/pdf/berkeley-earth-averaging-process.pdf
Judith,
From the error bands it’s clear that the uncertainties in the early temperatures relative to the more recent ones have been analyzed in that way, but I’m looking at the variability that is perhaps most strongly shown in Figure 5 (lower band) of the paper you linked. I couldn’t find discussion of the large swings between the maxima around 1800 and 1825 compared to the minima around 1815 and 1835.
The visual effect leads one to think that, while the overall temperatures are uncertain, the early variability is a real effect; I think the variability is instead just a consequence of the limited coverage and the small total number of measurements. Perhaps you have studied this as well and I was just unable to spot the discussion.
Pekka, Muller and Rohde's presentations at the Santa Fe Conference described this quite well; unfortunately the link to Muller's presentation no longer works, and Rohde's is not publicly available.
Judith,
Based on your comment and using Google I found this page with working links to Santa Fe presentations by both on the right
http://berkeleyearth.org/available-resources/
thank you! I’m glad they finally posted Rohde’s presentation, since this is a good qualitative explanation of his approach.
I checked the slides. Neither presentation has comments on the apparent strong variability. I haven't seen anybody claim that the data show a real effect, but it would certainly be interesting to know if the data give significant evidence for stronger variability in the period 1800-50. Therefore I'm surprised about the lack of any comments on that.
In Muller’s presentation, he discusses the volcanoes, I think that is a big part of the early 19th century variability
Pekka,
The early nineteenth century saw some very strong tropical volcanic activity. A dip on the scale of what is seen in the BEST data is actually an expectation from modelling of that period, as can be seen in this recent post (and paper) by Mike Mann.
Volcanic activity was also an important part of pre-1940 forcing in the twentieth century. Declining volcanism contributed significantly to warming in the first half of the twentieth century, along with increasing CO2 and other ghgs, and increases in solar irradiance. See, for example Figure 5, and similar data are available from other sources.
Pekka said, "My guess is that the reason is indeed the small coverage, as it's well known that wide-area averages vary less than local temperatures, but it would be interesting to know how well this issue has been analyzed. The first step towards answering that question would be to present the average based on the same very limited coverage up to the present."
I have been looking into the variability a good bit. The swings appear to be mainly due to Northern Hemisphere volcanic activity, Kuril Islands, Iceland, Aleutian Islands and Alaska. The extremes in the northern high latitude temperatures are enough to have an impact on the global average.
Pekka,
I believe that Richard has taken a look at that (using the same coverage).
The large variability and its phasing is a current topic of investigation.
Pekka,
There’s also larger than usual variation in solar cycle frequency (or solar cycle length) in that period. It correlates very well with the temperature variation. After 1856 (start of sc 10), the variation in frequency decreased, just like the temperature variations. Frequency generally increased in 20th century, just like the temperatures. Check it out.
Data (these are Timo Niroma’s scl numbers, there are others that are slightly different):
sc#_year (cycle start)_scl_trend
02_1766.5_9.0_warming
03_1775.5_9.2_warming
04_1784.7_13.6_cooling
05_1798.3_12.1_cooling
06_1810.4_12.9_cooling
07_1823.3_10.6_warming
08_1833.9_9.6_warming
09_1843.5_12.5_cooling
10_1856.0_11.2_flat
11_1867.2_11.7_cooling
and so on.
Short cycles (high frequency) are strong (warming), long cycles are weak (cooling).
Thanks Steve and Zeke, and Dr. Curry. I was a bit frustrated by the reactions of some posters on another blog who quickly rubbished BEST. It seems to be the nearest we’ve got to the open-data, open-source requirement that has always been asked for. Congrats on your contribution.
However, I do have some questions:
(1) Peer review in sight anywhere?
(2) UHI effects – are these really dealt with adequately? This was a bugbear with BEST 1.
(3) Changes from BEST 1 to BEST 2 – are they adequately explained and transparent?
(4) How does BEST deal with areas without stations, eg: Arctic, or doesn’t it try?
(5) How is the land + sea version coming along? Will the sea results be ‘preliminary’ in the same way as BEST 1?
(6) Is this version 'final', or will there be further versions? If so, how will their veracity be checked? (compare NCDC / GISS mucking about with the past).
(7) Where do I get a bigger PC? :-)
Thanks again.
(1) Peer review in sight anywhere?
Its a slow process from what I observe. But its happening
(2) UHI effects – are these really dealt with adequately? This was a bugbear with BEST 1.
There is a paper about the size of the effect ( if any) in the dataset.
(3) Changes from BEST 1 to BEST 2 – are they adequately explained and transparent?
Largely it's putting data into formats that people can use. The original release is posted so you could compare. Not a fruitful exercise if you apply two brain cells to the question.
(4) How does BEST deal with areas without stations, eg: Arctic, or doesn’t it try?
it's kriging
(5) How is the land + sea version coming along? Will the sea results be ‘preliminary’ in the same way as BEST 1?
SST is a ways off
(6) Is this version 'final', or will there be further versions? If so, how will their veracity be checked? (compare NCDC / GISS mucking about with the past).
nothing's final: this dataset ingests from multiple sources and creates a compilation of all of them. That is a big task. Not sure what you mean exactly.
(7) Where do I get a bigger PC?
Thanks Steven.
“SST is a ways off”
You bet it is. There’s no anthropogenic warming in the ocean. That’s because where there’s an infinite supply of water to evaporate back radiation doesn’t slow down heat loss. Restricting the ability to lose energy via radiation just increases the amount lost by evaporation. Over land where there is no infinite supply of water the response to increased radiative restriction is a higher surface temperature. So-called global warming is really land warming. Over land we can expect 1.1C per doubling of CO2. Over water we can expect little to no rise at all. Given that there’s twice as much water as land then the average global surface temperature is only going to rise a half degree C per doubling. This agrees with observations. There is no missing heat. The missing heat is evenly distributed in a sphere with a radius of 100 light years with the earth at its center.
I have Kevin Trenberth heself for a reference @ NPR that the ‘missing heat’ may have already been radiated off into space. Now, there’s a scientist.
Somewhere there under that velvet growth of policy.
===================
Dave you seem a bit overconfident. have you looked at icoads? when you finish building a dataset from the ground up please share it
until that work is done I think its best to be open minded. sceptical
steven
your thoughts on remaining open minded have been duly noted
ding!
next!
Seconded.
Do they have a plan to include the sea?
Yes. but first things first.
Mosh
As you may recall, I looked at the reliability of SSTs some months ago in an article here, and then carried out a long email conversation with one of the (very good) scientists at the Met Office.
I believe that SSTs are utterly unreliable until around the 1970s, and I personally think you would be wasting your time if you were to try to include the historic data in your calculations.
Judith also seems to be turning against them, so it would be interesting to have her perspective on them
tonyb.
I see a huge need to sort through the SST data and expand the database with historical ship records, etc. The uncertainty associated with the ocean part of the data set is much greater than for land, and the ocean covers more than twice the area of land. "Official" international groups are taking this on, but I would like BEST to take this on also, for an independent evaluation and application of new methods, etc.
Jack & Steven shudder @ the suggestion.
============
Tony, I did ICOADS work a while back. I'm going to suspend judgement and comment; it's not fresh in my brain. However, I'm not as pessimistic as you are.
By accident or design, the North Atlantic SST, with 2 corrections, happens to be pretty good. Reykjavik atmospheric pressure has been measured regularly and accurately since 1860. The barometer was invented in the 1640s, and the more accurate aneroid barometer in 1840 (with easy re-calibration against modern instruments).
Existence of a good correlation between the AMO and the Reykjavik atmospheric pressure was not known until very recently (late 2011), hence there is no reason to suspect that the atmospheric pressure data was ‘adjusted’ to correlate to the AMO.
http://www.vukcevic.talktalk.net/theAMO-NAO.htm
John Kannarr | February 18, 2012 at 3:43 pm |
Just out of curiosity, is this accounted for in the climate models?
———
Good question. Here’s a place to read a bit about the answer for just one of the models:
http://data1.gfdl.noaa.gov/~arl/pubrel/r/mom4p1/src/mom4p1/doc/guide4p1.pdf
It’s a technical manual for one ocean model and shows how incredibly complicated these models are, and certainly downwelling and the storage of heat in the deeper ocean is accounted for in the models and even predicted by it, but I think Dr. Curry could more directly answer your question.
Peter317 | February 18, 2012 at 4:02 pm |
R. Gates,
I know about downwelling, but:
a) the NOAA data goes down to 300m, says nothing about >2000m
b) what goes down must come up. Any warmer water getting down below the thermocline will come back up as soon as it's beyond the influence of the currents.
——-
Don’t know what data you are looking at, but on this page:
http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT/
of NOAA ocean heat content: if you click on the number 2 under the chart on the right-hand side, you'll see the data goes down to 2000m. The additional 23 x 10^22 Joules of heat stored since 1970 or so in the upper 2000m of ocean could represent only a fraction (maybe around 1/3?) of the total energy stored in the ocean over this period. Even more importantly, the entire heat capacity of the ocean surface (top meter or so) and atmosphere combined is of course a very, very tiny fraction of what the full ocean could store, and thus trying to get a good handle on the total changes in energy content of the ocean all the way down to abyssal levels is critical to understanding the full energy balance (or imbalance) of the Earth climate system.
yes Steven, the steadily rising ocean temperature is undoubtedly what allowed Amundsen to navigate the Northwest Passage over 100 years ago and what allowed the Vikings to grow apples and raise cattle on Greenland 1000 years ago.
spare me, please
Don’t hold your breath. Global ocean temperatures have a different story to tell. A very chilly story. Average global ocean temperature is 4C. That’s not because it’s the highest density point either. Saltwater at ocean salinity keeps right on increasing in density until it reaches its freezing point of approximately -2C. The ocean is 4C because that’s the average surface temperature over complete glacial/interglacial cycle of 100,000+ years where even the slow process of conduction has ample time to reach maximum entropy. The real story is our semi-warm interglacial world is thin skin of 16C luke warm water riding atop a bucket of 3C icewater.
Stir, don’t shake.
===========
Hard to see the ocean temperatures telling a “very chilly story” with ocean heat content down to 2000m rising by about 23 x 10^22 Joules since 1970, and 10 x 10^22 Joules just in the past 10 years alone (when the Earth system was supposedly “cooling” according to skeptics). Seems that AGW skeptics always want to talk about tropospheric temps (when they are flat) but have a hard time wanting to discuss where most anthropogenic warming would go naturally– into the deeper ocean.
Ocean data is definitely in the works, but it's a rather non-trivial project.
We definitely need to get more accurate, widespread, and independent ocean temperature information, especially in the deeper ocean below 2000m. (It's a travesty that we don't have it.) As the biggest heat sink on the planet, if there is an energy imbalance going on, the oceans are the place most of that energy would go. If the Heartland Institute wants to fund a valuable climate research project, this would be a very good one.
A couple of questions:
1) How does the heat get to the deeper ocean without first warming the surface layers?
2) Because of the thermocline, heat takes a very long time to get to the deeper ocean. So how do you imagine so much heat got there in such a short time?
Peter317,
Energy transport to the deeper ocean is primarily driven through large downwelling currents that occur in several specific areas around the global ocean. One of those, of particular interest, is in the western Pacific, and is part of the Pacific Warm Pool. The downwelling of warm water from the surface, especially in La Niña years, can be quite high in these areas. One way to get a good look at this is to go to the ENSO update page, here:
http://www.cpc.ncep.noaa.gov/products/analysis_monitoring/lanina/enso_evolution-status-fcsts-web.pdf
And scroll down and look at the sub-surface temperature departures, and the progression over the past few months. What you'll see is a large body of warm sub-surface water building in the western Pacific (the Pacific Warm Pool). As you'll see, it doesn't take as long as you might think for warm water to be forced downward into the deeper ocean in these downwelling areas. Also remember, there are many of these downwelling areas at points around the global ocean.
Finally, keep in mind that SSTs really are measuring heat flux, as they are measuring energy that is in the process of leaving the ocean. So in really gauging the Earth's energy imbalance, one needs to look at the biggest possible battery that holds energy on the planet, and ocean heat content (not SSTs or tropospheric temps) is the best and most accurate way of seeing whether that battery is charging (i.e. more energy going into than coming out of the Earth's systems).
If it was going to the deep ocean it would have to pass through the surface first. Local downwelling currents do not explain why the global surface is not heating. What you describe is unphysical and is a crude excuse to explain away what nobody predicted, including the models: after-the-fact revisionism of models cannot be called a prediction even in this mendacious field.
As for you presenting revised model results as fact, well, you can persuade yourself of anything if you really want to, but don't piss on our shoes and say it's raining. Most of us prefer a plausible theory with backup data. You present an implausible theory with no data. Yet there is really no need for missing heat skilfully evading the surface searchlights and going beyond the barbed wire into the murky depths if there is indeed no large positive feedback of GHGs. This lack of GHG heat-trapping effect is even backed up by the stratosphere not cooling since 1995 and by the radiosonde results. There is no data at all that shows enhanced GHG warming. In any normal endeavour the hypothesis would be accepted as disproven by now. Too many snouts in the trough and too much institutionalised pessimism make it linger longer.
If you want to salvage something from this farrago then you can only say that the very lower bound of the models (no feedback) is closest to reality. However that represents a scenario which even the IPCC says will be of net benefit to mankind. Too bad eh?
@gates
Warm currents downwelling? That’s new. How does warmer water sink into colder water, exactly?
I think what you are struggling to mangle is how the oceanic conveyor belt works: warm tropical currents move on the surface to the poles, where they cool and sink. Cold currents from the poles travel along the ocean bottom to the tropics, where they upwell by being forced against a solid rising ocean bottom like an archipelago or continental shelf.
This is great stuff for people in other disciplines wanting to get their hands dirty analyzing these records.
This is a valuable contribution. Especially as it takes the slog out of reading data from different temperature series, which have different formats.
I still wonder about some of the metadata. I was intrigued to discover a record for Guernsey AIRPORT from 1856 – 1901 (!), while the coordinates pointed to the house I grew up in and certainly aren't the current position of the airport.
You will find that the metadata now includes an "uncertainty" measure for lat and lon (I need to get into this more); however, this is a measure of precision, not location accuracy. With 40K-plus sites you WILL find stations that are located precisely, but wrong. Please forward corrections.
We have seen detailed analyses of previous temperature/time graphs. There is no CO2 signature on any of them. Just eyeballing the BEST data, there seems to be no CO2 signature, either. If anyone wants to claim that there is a CO2 signature, then where is it?
The CO2 trend would be one that goes from 0.03 degrees per decade in the early 1900s to 0.2 degrees per decade in the later part. While the early part is hard to pick out from natural variability, the later part of the land record exceeds the CO2 effect. Hansen said in the '80s that it would be hard to see the CO2 signal until the '90s, and there it is, sure enough.
Thanks for saving us the need to do any scientific analysis on the data – just choose a random bit and eyeball in your prejudice. Show us where Hansen told us that we wouldn't be able to see the CO2 'fingerprint' *after* the 90s.
Feel free to dismiss the bits of the graph that show rate and extent of global temperature increase that dwarf the puny uptick that exercised hysterics between 1980 and 1998 on whatever grounds you like. That usually works.
What we actually know for certain is that we know practically nothing, to slightly paraphrase Ed Cook.
From Hansen’s 1981 Science paper.
“The predicted CO2 warming rises out of the 1-sigma noise level in the 1980’s and the 2-sigma level in the 1990’s.”
I really don’t understand people who look at these temperature records and see nothing unusual going on in recent decades. I suppose if you blur your vision (perhaps with tears), look sideways, hide the upwards parts, maybe you don’t see anything unusual here. To me it is just denialism at its best.
Jim D, pre-1950 is agreed upon to be natural warming/cooling.
1910 to 1940 = .7C which is more than .2C/decade.
From 1944 to 1980 it was flat.
The modern claim of .2C/decade really only lasted from 1980 to 1998.
And most of that was the same natural warming as 1910 to 1940.
Post-1998, both RSS and HADCRUT trends are slightly below zero to flat.
CO2 warming = Zero.
Yeah Jim, except for the fact that for the past 15 years CO2 has continued rising at an accelerating pace yet the temperature rise went flat. Doesn't it seem rather obvious that something besides CO2 is at work? Global average temperature, you see, has been rising and falling since long before humans entered the picture. Dramatically so in the past several million years as the Pleistocene Ice Age waxes and wanes in cycles tens of thousands of years long. Until the natural causes of temperature anomalies have been identified and subtracted, it's not possible to determine the anthropogenic contribution. And even in that case there are many anthropogenic contributions which constitute both positive and negative forcings. Soot is a good example, which tends to shade the planet while airborne, causing cooling, but accumulates on snow, which darkens it, causing warming. Methane is now suspected of causing half as much GHG effect as CO2. But methane and soot aren't politically correct boogeymen, you see, so we don't hear much about them. Soot and methane are largely produced by countries that grow rice, use wood stoves and furnaces, practice slash-and-burn agriculture, and are in love with diesel fuel in the transportation sector. The United States does none of these things. The United States is the designated scapegoat, and thus CO2 is, was, and shall remain the boogeyman in this story, because that's the only way to blame the U.S. for global warming.
Skeptics are sick of warmenizers claiming .2C per decade.
If you measure pre-1950 peak to post-1950 peak (1944 to 1998) you get .442C, or .08C/decade.
If you measure to 2011, you get .242C, or .036C/decade.
Extending a statistically dubious flat part into the future rather than the more robust trend of the last 40 years is wishful thinking at best. If you plot the BEST land record, you don’t see the flat part at all. That should give pause for thought, but somehow doesn’t among the skeptics.
David Springer brings up soot and methane. Being short-lived in the atmosphere, their atmospheric concentration is determined by their rate of production. Being long-lived, CO2’s concentration is determined by its integrated production over time. The effects of methane and soot are capped by how quickly they can be produced and are self-limiting in this way. Both also have natural sources that can’t be controlled, while CO2 doesn’t unless you count the warming ocean.
Statistically dubious?
http://www.woodfortrees.org/plot/hadcrut3vgl/from:1909/plot/hadcrut3vgl/from:1909/to:1943/trend/plot/hadcrut3vgl/from:1977/to:1998/trend/plot/hadcrut3vgl/from:1998/trend/plot/hadcrut3vgl/from:1943/to:1977/trend
Sunshine and Albedo
http://i40.tinypic.com/xgfyok.jpg
http://i39.tinypic.com/34oc184.jpg
It is dubious to suggest the small amount of warming from 1980 to 1998 will continue past 1998 when it didn’t.
Albedo did not keep going down and sunshine appears to be flattening out.
Bruce, I see your Hadcrut and raise you BEST
http://www.woodfortrees.org/plot/best/from:1970/mean:12/plot/best/from:1970/trend
Jim D, that's BEST? Wow. Unbelievable.
BEST seems to think 1998 was just another year.
BEST joke ever!
http://www.woodfortrees.org/plot/best/from:1998/mean:12/plot/best/from:1998/trend/plot/rss/from:1998/mean:12/plot/rss/from:1998/trend
http://www.woodfortrees.org/plot/rss-land/from:1979/to:1998/trend/plot/rss-land/from:1998/trend/offset:-.3/plot/best/from:1979/to:1998/trend/offset:-.36/plot/best/from:1998/trend/offset:-.556
Jim D
The BEST work has undoubtedly added new knowledge, and that’s good.
But to compare BEST (land only) with HadCRUT3 (entire planet) is silly, as would be to draw too many conclusions from it.
It’s like comparing the temperature trends of the USA only with that of the entire land record.
Let’s wait until BEST does a comprehensive SST record,as well.
And let’s make sure they address the many misgivings already expressed here by tony b and others regarding data prior to around 1980.
Then, for the land record, let’s wait until BEST has made a reasonable assessment of the UHI impact, which they have not done to date.
Still a lot of work to be done, before BEST tells us much.
Max
@bruce
Trenberth isn’t claiming 0.2C/decade actual at this point. I’m not sure anyone is but Trenberth does point out that according to TOA satellite measurements 0.5W/m2 more energy is entering the atmosphere than is leaving it. It’s fairly straight forward to calculate how much that half watt will heat a given volume of water. If we’re talking about the ocean mixed layer it’s enough to warm it just about exactly 0.2C decade. This expected warming can’t be found in the ocean. The atmosphere can’t hold that much energy without turning the earth into a convection oven rather quickly so that only leaves the deep ocean for where the missing heat is residing. It can’t be found there either but that may very well be due to the fact that distributed equally across the deeps the half Watt is only going to warm it 0.02 per decade and that’s outside our ability to measure. Presumably though any heat reaching the deeps must first pass through the mixed layer where it is easily detectable yet it’s still missing. Therein lies the travesty. The reasonable conclusion is that the satellite data for TOA energy budget is wrong and the earth isn’t really warming but rather patterns of heating and cooling, exchanges between mixed layer and deep ocean, are changing in cyclic ways we don’t understand and hence can’t really predict except by expecting that repeating patterns we see will continue repeating. Climatology consists of identifying those patterns and making predictions based upon expected repetitions.
To find the CO2 “signature” you need to filter the data for known causes of natural short-term variability. Oh, wait, that’s been done several times by different groups and skeptics didn’t like the results…nevermind.
R. Gates writes "To find the CO2 'signature' you need to filter the data for known causes of natural short-term variability."
What about the unknown causes of natural variability? Or are you claiming we have complete, detailed knowledge of everything that causes natural variability, and that we know in precise detail what caused the ice ages, the Roman Warming Period, and the Little Ice Age?
Sorry, I just don't believe you.
Those like Jim Cripwell who insist that there is no "signature of CO2" in the temperature record are probably unaware of the paper How Robust is the Long-Run Relationship Between Temperature and Radiative Forcing?, by Terry Mills. The abstract states:
This paper examines the robustness of the long-run, cointegrating, relationship between global temperatures and radiative forcing. It is found that the temperature sensitivity to a doubling of radiative forcing is of the order of 2 ± 1 °C. This result is robust across the sample period of 1850 to 2000, thus providing further confirmation of the quantitative impact of radiative forcing and, in particular, CO2 forcing, on temperatures.
Although the article is paywalled, a copy was kindly provided to me by email from Prof. Mills. I’m sure he would oblige other requests.
2 plus/minus 1, not 10.
============
There are certainly many gaps in our knowledge of the causes of short-term variability, but more of those gaps are being filled in every year. For example, EUV effects on the stratosphere have recently been added to some climate models.
So, from volcanoes, to ocean cycles, to various solar effects, a pretty good picture is emerging of some root causes of natural variability. Some of the gaps are big ones though, aerosols and clouds in particular. Still, I think the general estimate of a 3C (+ or – 1C) rise in global tropospheric temperature for a doubling of CO2 from pre-industrial levels is a pretty good one.
Ah. Thanks, kim.
Pat Cassen writes "It is found that the temperature sensitivity to a doubling of radiative forcing…"
No, I am not familiar with this paper. I will try and find it, but I have some trouble already. Radiative forcing cannot be measured, so I am not sure where Prof. Mills gets his data for radiative forcing.
R. Gates writes "Still, I think the general estimate of a 3C (+ or – 1C) rise in global tropospheric temperature for a doubling of CO2 from pre-industrial levels is a pretty good one."
You have used the right word, "estimate", not "measured". Until I see a measurement for this number, I consider it to be both hypothetical and meaningless.
Jim Cripwell | February 18, 2012 at 4:17 pm |
R. Gates writes "Still, I think the general estimate of a 3C (+ or – 1C) rise in global tropospheric temperature for a doubling of CO2 from pre-industrial levels is a pretty good one."
You have used the right word, "estimate", not "measured". Until I see a measurement for this number, I consider it to be both hypothetical and meaningless.
__
You want to see an actual measurement of temperature when CO2 reaches 560ppm? Wow, you intend on living a long time. I’ll have some of whatever you’re drinking!
R. Gates writes "You want to see an actual measurement of temperature when CO2 reaches 560ppm?"
No. I want to see a measurement of the climate sensitivity for a doubling of CO2. And I have absolutely no doubt at all that, hopefully, when CO2 reaches 560 ppmv, global temperatures will not have been affected by the CO2.
Sorry, Jim, I forgot that you reject all quantitative information related to forcings, on the basis that they are not measured in situ. So feel free to continue imagining that any correlations between independent data sets such as those examined by Mills are purely fortuitous.
Jim Cripwell,
Amazing that you have "no doubt" about what is one of the most important issues in the entire study of climate right now; certainly no scientist would say they have "no doubt" about what the sensitivity of the climate will be to a doubling of CO2. Your certainty means you are not basing your statement on science and knowledge, but on something else…perhaps your personal belief system.
R. Gates writes "no scientist would say they have 'no doubt' about what the sensitivity of the climate will be to a doubling of CO2."
I have never said I have no doubt about climate sensitivity. What I have said is that it has never been measured, and what little observed data we have convinces me that its value is so small that it cannot be detected against the background noise of natural variations.
Pat Cassen, you write “on the basis that they are not measured”
Not quite. On the basis that they CANNOT be measured.
Bah! We’ve been here before, but why is an article written in 2007 and published in 2008 still paywalled? It’s like having to subscribe to Sky Movies (US: cable premium movie channel) to watch a 4-year old film.
R. Gates
Unfortunately, the procedure you recommend leaves out one major step.
Acknowledging that there may be major UNKNOWN causes of natural climate forcing or variability.
An example: solar influence via the cosmic ray / cloud mechanism. We know from the preliminary experimental results at CLOUD CERN that this mechanism exists, and we see from empirical data of past temperature and solar activity that it could explain almost all of the past climate change, but more work is needed to confirm its magnitude experimentally.
You are proposing a “we can only explain it if we assume” approach – and this is logically flawed.
We have to first admit that we do not know anywhere near to everything that makes our planet’s climate behave as it does.
Otherwise we fall into the logic trap of IPCC:
1. Our models cannot explain the early 20th century warming.
2. We know that the statistically indistinguishable late 20th century warming was caused by human GHGs.
3. How do we know this?
4. Because our models cannot explain it any other way.
Max
Nice work in progress. Even perhaps a clue in there to attribution, for which we must progress the work.
==================
If you find errors in the data, can you contact someone? I checked one station in Sweden and it didn't match the data in the dataset. Maybe it is my mistake… but it didn't look correct. Is there anyone responsible per country?
Henrik,
You can email info@berkeleyearth.org, or just give the station name or ID here and Steve and I can try and look into it.
Thanks, I email them.
Make sure you cite your source
For folks with access to MatLab (albeit probably a rather small segment of commenters here), you can find the full Berkeley code (and run it yourself, though it will take a while!) here: http://berkeleyearth.org/our-code/
Zeke,
Thank you and Steve. About the above, when I downloaded the initial version of the code I was unable to run it (I have MatLab) and there was a note in the readme file that seemed to suggest I might have this problem. While I understand that it’s pretty typical to have to tweak systems a little bit to get big new programs to run, is this still an expected result?
billc,
I don’t personally use MatLab, but I wouldn’t be surprised if it requires a few tweaks. Try emailing info@berkeleyearth.org for support in getting it to run.
The combining of city and airport data is a problem that still exists in the new release of BEST. Amarillo is just one example of this:
This image shows the actual raw data for Amarillo, along with the 2-year overlap between city and airport:
http://img710.imageshack.us/img710/8624/amarilloraw.gif
This image shows how the Amarillo timeseries looks to unknowing users using GHCN or BEST:
http://img26.imageshack.us/img26/6742/amarilloghcn.gif
The red dotted line is where airport data begins to be used, but until the solid red line, it is airport data *adjusted upward* to match the city data – in other words, *NOT* raw. From 1948 on, raw airport data is used.
The first image is a red flag to me about why these two timeseries should not be combined and homogenized. The trend is very clearly upward for the city data, and very clearly flat for the airport data. It would be quite a coincidence that the climatic trend changed exactly at the same time that the station moved.
it helps if you cite sources. it's the only way to assess claims. raw data is full of bogus stuff. also, which BEST data did u use?
JR,
Looks like you are using some older GHCN v2 data in your plot. It would be instructive to look at the records that the Berkeley group provides for that location in the links above.
This image shows BESTv2 TAVG minus GHCNv2 tmean:
http://img818.imageshack.us/img818/8130/bestv2ghcnv2.gif
They are essentially the same through the period that I referenced, except for 1947, where BEST is lower. That is because BEST moved to raw airport in 1947 while GHCN doesn’t change till 1948.
The citation for the adjusted 1941-1948 (or 1947 for BEST) is:
http://www.archive.org
Enter “world weather records” in the search box, scroll down to the entry that says: “World weather records (Volume 1941-50)”, on the next page, click on the book (under “View the book”) to open it, and scroll to page 52. In the notes for Amarillo, it explains the adjustment. If you compare the data printed on page 956, it matches the monthly data in BESTv2 TAVG.
JR,
Ahh, that explains it. Berkeley currently only uses World Weather Records published since 1961, but version 3 will incorporate all World Weather Records data (it's next on our list of things to add).
Ultimately we want to include all data we can find in the least processed format. The Berkeley homogenization procedure will also effectively split up records at breakpoints (and there is a rather obvious one here), so they will be treated as two separate station records.
I think one of the points of the Amarillo case is being missed. If, for the sake of argument, there was only one thermometer on Earth and it was in Amarillo, the BEST result would make it look like the earth was warmest in the 30s and coolest around 1980, whether or not the timeseries was treated as one or two stations. This is the same result as if the combined timeseries is anomalized. We can’t be sure from either method whether or not we have the right answer because we’ll never know what happened at the city station after the 40s or at the airport site before then. It can be argued that the cooling in Amarillo is due to the Age of Aviation, rather than aerosols or clouds. And also, it shows that it does matter where the thermometers are. Now extend this to the entire Northern Hemisphere and why don’t you see a similar cooling in the Southern Hemisphere?
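A synthetic illustration of why the treatment matters (made-up numbers, nothing to do with the real Amarillo series, and a simple overlap-matched splice rather than Berkeley's least-squares recombination): concatenating the raw fragments turns the station move into an apparent mid-century cooling step, while matching the segments in their two-year overlap removes the step. Which of those two pictures is closer to the truth is exactly the ambiguity described above.

set.seed(1)
yrs     <- 1900:1990
trend   <- 0.005 * (yrs - 1900)              # weak underlying warming
cityYrs <- 1900:1947                         # "city" segment, runs 0.5 C warm
airYrs  <- 1946:1990                         # "airport" segment
city    <- 0.5 + trend[yrs %in% cityYrs] + rnorm(length(cityYrs), sd = 0.1)
airport <-       trend[yrs %in% airYrs]  + rnorm(length(airYrs),  sd = 0.1)

# (a) naive concatenation of raw values: the move looks like mid-century cooling
naive <- c(city, airport[airYrs > 1947])

# (b) splice with an offset estimated from the 1946-47 overlap: step removed
ovl     <- 1946:1947
offset  <- mean(city[cityYrs %in% ovl]) - mean(airport[airYrs %in% ovl])
spliced <- c(city, airport[airYrs > 1947] + offset)

plot(yrs, naive, type = "l", ylab = "temperature (arbitrary)")
lines(yrs, spliced, col = "red")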
Judith
A few months ago you carried my article here;
http://judithcurry.com/2011/12/01/the-long-slow-thaw/
There were numerous graphs in Section five detailing the gradual rise in temperature that can be observed for 350/400 years. This includes BEST and Michael Mann in graph 15a
I’ve got no reason to doubt that this new series of graphs shows the generality of the gradual rise in temperatures fairly well, although the degree of precision that has been set out is speculative. I had previously criticised the BEST figures during the first few decades of the 19th Century, these seem to be more coherent now.
The hockey stick seems to have disappeared in the latest BEST version, but surely we need to now recognise that GISS, starting from 1880, was being highly selective in its start point and doesn't show the broader context: James Hansen did not capture the start of the warming trend but merely plugged in halfway, and there has been a long slow thaw (with numerous advances and retreats) since the days of Shakespeare.
There are also several periods within that 350/400 year warming timescale that were about as warm as today, the early 1700s and late 1500s in particular.
tonyb
1880 is a consequence of data set and method.
Mosh
Of course it was, but 1910/15 would have been a better time to start, as the Stevenson screen had by then introduced greater accuracy and there was far better coverage of both hemispheres.
Anyway, you and Zeke have done a great job on this, congratulations.
tonyb
Tony.
Judgement call.
I prefer to present all the data. If somebody wants to make an argument about coverage and accuracy, then it needs to be made objectively, not subjectively.
My thanks to Zeke and Steven for their work on this, kudos to both of you.
w.
Mosh
I have written about the start date a number of times. I don't want to say the date was deliberately chosen to make a point, just that there were better start dates if the data to be presented was to be significantly different to that of Hadley. 1880 wasn't sufficiently different in terms of spatial coverage or accuracy, but 1910/15 was. It's a strange date to have chosen.
Tonyb
still subjective, sorry. If you want to argue that 1910 is somehow better, that's a quantitative argument. make it. eyeballing doesn't count. how much better? why? etc etc.
For dabbling (rather than deep diving), it is helpful to have the summary data as graphed above. The original release had such a file, BEST Full_Database_Average_complete.txt
I couldn’t locate an analogous summary file. Is there one which I’ve overlooked?
HaroldW,
The link to the chart data is here: http://berkeleyearth.org/downloads/analysis-data.zip
However, I’m not sure that its been updated to version 2 yet. I’ll double check with the team.
Thanks Zeke. Yes, that’s the October version data there.
Mosh
Not sure there is a point to demonstrating that 1910 would have been a more sensible start date. Would it make the slightest difference to what you, Hansen, or world governments think?
tonyb
yeah i’m not sure steve’s objection. i get using all the data, but your choice of a cutoff at least invokes a methodological improvement and is therefore at least somewhat objective? what is the criterion that BEST uses to start their record in 1810? a few records go back farther right?
correction 1800 +/-
Bill, the data in the files goes back to 1701, I believe. So, all that data is present.
I can only speak for myself. The record would not change the physics, no matter where you start it. erase the record entirely, and our best science still says that GHGs cause warming.
The Berkeley study in all likelihood will help resolve the discrepancies between datasets, but the issue is not about dataset temperature discrepancies as much as it is about whether or not CO2 emissions can have a major effect on global temperature: specifically, whether the level of CO2 predicted for 2050 will cause catastrophic global warming, or at least indicate that such catastrophic warming will take place by the year 2100.
In this regard the BEST study does not demonstrate any such trend over the past decade of increasing CO2 emissions, so in order for the catastrophic global warming predicted by AGW orthodoxy to take place, those predicting it must first present evidence for a date when the global temperature will start to rise at an as yet unseen, unprecedented rate.
The BEST data shows that the warming throughout the 20th century was under 1°C for the entire century, and unless something three times this rate begins immediately we will not be seeing the 3°C temperature increase predicted by the models occurring by the year 2100.
The BEST data also confirms the overall cooling from 1942 to 1975 that was accompanied by an increase in CO2 emissions of 16gt/y, from 4gt/y in 1942 to 20gt/y in 1975, refuting any possible correlation of increased emissions with increased global temperature.
Perhaps this honest attempt at science will finally put the fraudulent AGW conjecture to rest!
The projections are that five times as much CO2 will be added to the atmosphere between 2000 and 2100 as had been before 2000. That will leave a mark.
It’s clearly off to a flying start, isn’t it? Air temperatures are flatlining since before the start of the century and sea surface temperatures appear to be bombing. Why?
Consider the BEST land temperature which is continuing to rise fast, consider the length of the last solar minimum which only temporarily hindered warming, and consider the increase in ocean heat content, and you won’t have anything left to stand on.
Somehow, despite the rapid increase in CO2 emissions, which reached over 33gt/year in 2010, the atmospheric CO2 concentration has not increased accordingly and is only increasing at a near perfectly linear rate of just 2ppmv/year.
We are currently at approximately 390ppmv, and at this rate the concentration will only be 466ppmv by 2050 and 566ppmv by 2100, so not only will the temperature suddenly have to increase, the concentration that is supposedly driving this temperature increase will also have to change dramatically from the linear rate of the past 15 years.
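For reference, the arithmetic behind those figures is just a linear extrapolation from roughly 390 ppmv now at roughly 2 ppmv per year:

c2012 <- 390                              # approximate current concentration, ppmv
rate  <- 2                                # approximate recent linear growth, ppmv/yr
c2012 + rate * (c(2050, 2100) - 2012)     # 466 and 566 ppmv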
If more and more CO2 is being added to the environment from fossil fuel emissions but this increase is not being reflected in the atmospheric concentration, common sense would tell us that human-sourced CO2 is not the primary source, and knowledgeable scientists would tell us that a steady-state increase such as we are observing comes from an extremely large source such as ocean outgassing. But those who predict rapidly increasing temperature from rapidly increasing CO2 concentration (neither of which is happening) have neither common sense nor proper scientific knowledge.
Norm, if you want there to be only 560 ppm by 2100, you need to be backing the change to renewable energy and leaving coal in the ground, because the IPCC business-as-usual scenario has over 800 ppm by 2100. Don’t count on the oceans and sun to keep this natural variability pause going very long. That would be a poor bet to make because natural variability is only good for plus or minus 0.2 degrees now and then.
Jim check out http://www.esrl.noaa.gov/gmd/ccgg/trends/
which shows the linear trend which will have to change dramatically to meet your IPCC scenario of 800ppmv by 2100.
Note on the front page is the Jan 2012 value of 393.09ppmv and the Jan 2011 value of 391.19ppmv. This means that with business as usual the increase from 2011 to 2012 is just 1.90ppmv/year, which is less than the 15-year average of 2.0ppmv/year!
This is just one year, but the annual growth in CO2 concentration from 2010 to 2011 was also less than the 15-year 2.0ppmv/year running average, with concentrations of 389.78 in 2010 and 391.57 in 2011 representing a growth rate of just 1.79ppmv/year. So does this mean the rate of increase is slowing?
This makes sense because the rate of ocean warming is slowing and along with this the rate of degassing will slow as well.
The only thing that isn’t slowing is “business as usual” CO2 emissions, and these do not seem to be having much effect on the CO2 concentration.
The question is do you believe more in unfounded IPCC projections or in hard data measured at MLO?
Alas, Norm and his oil patch buddies are coming up dry.
They tried to hide the decline
but reality bit them in the behind.
Jim D
Let me correct you on the IPCC model-based “scenarios and storylines” for CO2 emissions and concentrations by 2100.
http://www.ipcc.ch/pdf/special-reports/spm/sres-en.pdf
CO2 levels have increased by between 0.4 and 0.5% per year compounded (exponential increase) since Mauna Loa measurements started and also over the most recent period.
If this exponential rate were to continue, we would arrive at around 580 ppmv by 2100.
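A minimal sketch of the compounding arithmetic behind that estimate; the 0.45 %/year rate and the 2012/390 ppmv starting point are assumptions chosen to match the figures quoted above.

    # Minimal sketch: compounded (exponential) CO2 growth.
    start_year, start_ppm = 2012, 390.0
    rate = 0.0045  # 0.45 % per year, assumed

    def compound_co2(year):
        """CO2 concentration assuming a constant fractional growth rate."""
        return start_ppm * (1.0 + rate) ** (year - start_year)

    print(round(compound_co2(2100)))  # -> 579, i.e. roughly 580 ppmv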
This is IPCC’s “scenario and storyline B1” (AR4 WG1 SPM), with moderate economic growth, a population growth rate that declines sharply with population leveling off at between 9 and 10.5 billion by 2100, and no climate initiatives.
This looks like the most reasonable “business as usual” projection, although the question arises: why should human CO2 emissions continue to grow at the same exponential rate as in the past when population growth rate is expected to slow down from around 1.7% per year to 0.3% per year? (But let’s ignore this discrepancy for now.)
IPCC has other “scenarios and storylines”, involving an increase in the exponential CO2 growth rate, based on very rapid economic growth projections, continuously increasing population, etc., but these would appear to be less likely to me. These have CO2 levels reaching up to 850 ppmv.
If one considers that there is only enough carbon in all the optimistically inferred possible fossil fuel resources remaining on our planet to reach a bit more than 1000 ppmv in the atmosphere (based on the latest WEC estimates), these “storylines” appear to be exaggerated and highly unlikely.
http://www.worldenergy.org/documents/ser_2010_report_1.pdf
But, hey, it’s anyone’s guess. I’m just saying that IPCC is “guessing” on the high side, in order to frighten the “policymakers”.
What do you think?
Max
Rather, more data is likely to narrow the bands of uncertainty about past trends.
“but the issue is not about dataset temperature discrepancies as much as it is about whether or not CO2 emissions can have a major effect on global temperature”
So why have skeptics put so much time into arguing the surface temperature records are worthless?
The BEST data disproves the claims of prominent skeptics such as Anthony Watts and Joseph D’Aleo.
Nice attempt at redirection though, Norm.
yes, the great dying of the thermometers has been put to rest as an argument. It will stumble around dead for a while
Could someone please comment on my impression that the “great dying of the thermometers” argument was more about the degree of warming than about whether warming occurred? The example is the far north, which is invariably shown as bright red in graphics of warming. However, the warming there is magnified by a much smaller number of stations located in areas where the spatial mean gets increased. In particular, if you lose a station that is cooler than a warm station still in the record, then the grid-point data near the lost cooler station shows a jump, because it now reflects the warmer station. So yes, it has warmed in the Arctic, just not as much as some graphics would have you believe.
Billc
This has been well buried in the thread so I don’t know if you will see it.
There are numerous records going back well before 1800. Some of them, like CET, are a reasonably good proxy for global temperature, and we have those from 1659. You may have seen my article The Long Slow Thaw, where I reconstructed it back to 1538?
All in all, 1880 seems a somewhat arbitrary date to start GISS, and I suspect it was mostly motivated by the slew of data coming on board from the substantial expansion of the US network from around that date. However, if you wanted better spatial coverage of BOTH hemispheres AND some degree of standardisation, which came with the Stevenson screen, the period 1910/15 would seem more logical.
Perhaps BEST will try to extend their coverage to earlier than 1800
tonyb
David Springer said:
“There’s no anthropogenic warming in the ocean”
_____
You have absolutely no evidence to make a claim like this with such certainty, but lots of evidence that there very well might be anthropogenic warming of the oceans. This is obviously a political statement and not a scientifically based one.
http://bobtisdale.files.wordpress.com/2012/02/1-global.png
That’s the trouble with science – if you do it you find stuff out. See the data? It’s apolitical – current SST anomaly is the same as in 1983.
http://bobtisdale.files.wordpress.com/2012/02/13-southern.png
Getting very chilly in the Southern Ocean – anomaly DOWN nearly 0.8°C since the 1980s – funny stuff this warming – there’s none of it in the ocean for sure.
“It’s apolitical – current SST anomaly is the same as in 1983.”
1983 contained the 2nd strongest El Nino of the 20th century.
What does that say that we are at the bottom of a La Nina and yet global temperatures are as warm as what it took a very strong El Nino to produce?
Witness SayNoToFearmongers. Is there any question he is dishonest?
Is it really possible that he can’t see the warming trend in the first linked chart he provides?
No it isn’t possible. He’s a liar.
Why would you use the metric of ocean surface temperature (which is really one of heat flux, rather than energy storage), when you can actually look at how much energy the ocean has stored (down to 2000m) over the past several decades? Approximately 23 x 10^22 Joules since 1970, and 10 x 10^22 Joules in the past decade. A better question, for someone with scientific curiosity, is what is causing the ocean to gain so much energy over the past 40+ years?
10 x 10^22 Joules sounds like a lot, but what is the average power per square meter that corresponds to that rise? Another way of illustrating the number is the average increase in temperature of the top 2000 m of the oceans that corresponds to this energy. These two numbers would tell us something, whereas the total amount of energy has no obvious meaning to anybody.
lolwot, No, it just makes it the wrong tool.
Let’s see, what is the largest and most universal metric whereby one could measure the Earth’s energy imbalance?
1) Tropospheric temperature over the past 10 years
2) Sea Surface Temperatures over any period
3) Central England Temperatures over the past 2 centuries
4) Ocean heat content over the longest time-frame available
Those who choose 1, 2, or 3 might want to justify why they didn’t choose the biggest energy storage system on the planet, which in temporal, spatial, and physical terms dwarfs any other metric available. Choices 1, 2, and 3 will be subject to much more natural and local variability and noise, and so, if you’re looking for a longer-term signal, the deeper global ocean is the place to look.
Going back to my own question: from the NOAA site I estimated the average heat flux of the last 20 years as 0.7 x 10^22 J/year. That gives about 0.6 W/m^2 average power over the total area of the oceans. That’s about one half of the value of 1.3 given in the Energy Budget paper of Trenberth, Fasullo and Kiehl based on indirect arguments.
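For readers who want to reproduce that conversion, a rough sketch of the arithmetic; the ocean area, layer depth, density and specific heat below are generic round numbers, not values taken from the NOAA page.

    # Rough sketch: convert an ocean heat uptake (J/year) into an average flux
    # over the ocean surface and a mean warming rate of the 0-2000 m layer.
    SECONDS_PER_YEAR = 3.156e7
    OCEAN_AREA = 3.6e14        # m^2, approximate global ocean surface area
    DEPTH = 2000.0             # m
    RHO, CP = 1025.0, 3990.0   # kg/m^3 and J/(kg K), typical seawater values

    uptake = 0.7e22            # J/year, the estimate quoted in the comment

    flux = uptake / (OCEAN_AREA * SECONDS_PER_YEAR)   # W/m^2 of ocean surface
    layer_heat_capacity = OCEAN_AREA * DEPTH * RHO * CP
    warming_rate = uptake / layer_heat_capacity       # K/year

    print(round(flux, 2))           # -> about 0.6 W/m^2
    print(round(warming_rate, 4))   # -> about 0.002 K/year for the 0-2000 m layer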
R. Gates,
You say that the ocean heat content is the best metric we have for the energy imbalance.
But, especially as little is known of the deep ocean heat content, it’s effectively estimated from the energy imbalance.
Yet you’re rather confident of its value?
lolwot:
R.Gates:
Seconds out, Round number 1.
just because a spanner isn’t the right tool for putting a nail into a wall doesn’t mean it’s a worthless tool.
lolwot, No, it just makes it the wrong tool.
(accidentally posted higher up)
It’s based in the physics I explained. Energy loss preferentially goes through the path of least resistance. This is at the forefront of my thinking perhaps because it’s exhibited by electricity, but it’s generally applicable to all kinds of flows, from heat to water to radiation.
Over the ocean the principal path of heat loss is evaporative, which accounts for about 70% of it. Radiation is second at 25% and conduction comes last at 5%. Over a dry desert it is over 90% radiative. This is well known and indisputably establishes which paths offer the least resistance.
What happens when we constrict one path (whether a pipe or a wire or LWIR through the air) is that the flow will redistribute over all possible paths. Given evaporation is the easiest when we restrict the radiative path over the ocean by additional greenhouse gases it simply shunts more energy flow through the radiative path. Over land where there isn’t an infinite supply of water for evaporation when we restrict the radiative path the alternative is conduction. Conduction is very poor in air so what happens is that surface temperature rises which accelerates the flow through both the radiative and conductive paths.
The rise in temperature can be equated to what happens at a dam when you restrict the discharge through the gates so that less water is leaving the containment than is flowing into it. The water level behind the dam rises which in turn increases the pressure at the base of the dam which in turn increases the rate of flow through the gate.
Let me know if there is any part of that you don’t understand.
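The “path of least resistance” argument can be illustrated with a toy parallel-conductance model. The 70/25/5 split below is taken from the comment and treated as relative conductances; everything else is an illustrative assumption, not a claim about the actual magnitudes.

    # Toy model: three parallel heat-loss paths driven by the same surface-air
    # "potential difference". Restricting one path redistributes flow over all.
    paths = {"evaporation": 70.0, "radiation": 25.0, "conduction": 5.0}

    def flows(conductances, driving=1.0):
        """Flow through each path for a given driving potential."""
        return {name: g * driving for name, g in conductances.items()}

    base = flows(paths)
    total = sum(base.values())

    # "Restrict" the radiative path by 10% (e.g. extra greenhouse-gas opacity).
    restricted = dict(paths, radiation=paths["radiation"] * 0.9)

    # If energy out must still balance energy in, the driving potential rises
    # until the summed flows match the original total.
    new_driving = total / sum(restricted.values())
    rebalanced = flows(restricted, new_driving)

    print(base)        # original 70 / 25 / 5 split
    print(rebalanced)  # evaporation and conduction carry more, radiation less

With the radiative path restricted, the same total flow is carried at a slightly higher driving potential, and the evaporative and conductive paths each carry a little more, which is the redistribution the comment (as corrected below) describes.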
Correction to above “shunts more flow to radiative path” should read “shunts more flow to evaporative path”.
You mean ‘evaporative path’ in the second sentence of your third paragraph instead of ‘radiative path’.
================
Ah, I went back twice to check which sentence and which paragraph.
==========
David Springer,
So the deeper ocean has been gaining energy because less is being lost at the surface? If that’s your contention, then perfect, as it is quite in line with the current thought that DWLWR affects the thermal gradient at the surface of the ocean, and creates a “dam” (in your terms) to restrict the flow of heat out of the ocean.
Well done…
R. Gates,
With all respect, your assertion is political as well.
How can a data record of ocean heat content be political?
Speaking of land temperatures, how many of you are aware that the highest recorded mean annual temperature is 34.4°C and occurs in an equatorial desert? This should make anyone who believes that water vapor is a net contributor to greenhouse warming pause and reassess his or her belief.
Guinness Book of World Records. 1999: 250.
“Between 1960 and 1966, the highest average annual mean temperature in Dallol, Ethiopia was recorded at 94 °F”
http://en.wikipedia.org/wiki/Dallol,_Ethiopia
Dallol is a salt desert at 14 degrees north latitude with 4 to 7 inches of annual rainfall.
One needs to satisfactorily explain to oneself why the highest mean annual temperature in the world is associated with an equatorial desert instead of an equatorial rain forest. The only explanation I can offer is that the hydrological cycle is a net source of cooling, not warming.
One might also wonder why this record was set 50 years ago when CO2 levels were much lower than today. That’s a little more difficult to explain but I will below.
Since ARGO was mentioned in this thread, one might also note that the highest sea surface temperature ARGO has found is a tad under 35C. If you go here:
http://www.spectralcalc.com/blackbody_calculator/
and start plugging in some numbers you might find some surprises and possibly an epiphany about why 35C seems to be a magic number for maximum annual mean temperature and maximum open ocean surface temperatures. The vast majority of ARGO maximums forms what’s just about a brick wall at 30C with rare departures above it. Those rare departures I figure are exceedingly calm seas and clear skies where surface temperature is able to approach the S-B temperature for 500W/m2 at 0.9 emissivity which approximates a calm equatorial ocean with clear sky.
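For anyone who wants to plug in the numbers without the linked calculator, a minimal sketch of the Stefan-Boltzmann inversion; the 500 W/m^2 flux and 0.9 emissivity are simply the values quoted in the comment, and the result is shown for both that emissivity and a pure black body.

    # Minimal sketch: invert the Stefan-Boltzmann law, T = (F / (eps * sigma))**0.25
    SIGMA = 5.670e-8  # W m^-2 K^-4

    def sb_temperature(flux, emissivity=1.0):
        """Temperature at which a surface of given emissivity emits `flux` W/m^2."""
        return (flux / (emissivity * SIGMA)) ** 0.25

    for eps in (1.0, 0.9):
        t_kelvin = sb_temperature(500.0, eps)
        print(eps, round(t_kelvin, 1), round(t_kelvin - 273.15, 1))
    # eps = 1.0 -> about 306 K (33 C); eps = 0.9 -> about 315 K (41 C)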
So back to Dallol and why its record annual mean temperature was set 50 years ago. That’s simply as hot as the annual mean can get on this planet. No greenhouse gas can raise temperature above the S-B limit for an ideal black body.
The earth’s temperature is driven by albedo. Greenhouse gases effectively lower the albedo by forcing (through back radiation) the surface (land not water) to absorb & retain more energy. But there is a limit and that limit is an albedo of 0.0 and nothing can make the average temperature warmer than that except a hotter sun or a closer orbit.
” This should make anyone who believes that water vapor is a net contributor to greenhouse warming pause and reassess his or her belief.”
hahaha. idiot.
I think it’s against site policy to call someone an idiot even if you’re calling yourself one, loltwot. :-)
David – “…why [is] the highest mean annual temperature in the world… associated with an equatorial desert instead of an equatorial rain forest…”
Think heat capacity and diurnal cycle.
Consult a textbook on atmospheric physics, e.g. Goody & Yung. The rest of your post would also benefit.
When are skeptics going to correct their own “science”?
In the skeptic report “SURFACE TEMPERATURE RECORDS:
POLICY DRIVEN DECEPTION?”
The Summary for Policymakers begins with this point:
1. Instrumental temperature data for the pre-satellite era (1850-1980) have been so widely, systematically, and unidirectionally tampered with that it cannot be credibly asserted there has been any significant “global warming” in the 20th century.
Take note: this is skeptics claiming that it cannot be said that there has been any warming over the 20th century. So much for their claim that “we don’t deny global warming exists, we just question the cause”!
Don’t be fooled, climate “skeptics” will switch arguments inconsistently to avoid taking any flack from their thoughtless agenda.
Anyway, in light of BEST completely blowing this report out of the water when will errata be produced for this report for the many claims that can now be proven to be erroneous?
link for anyone who wants to read the kind of “quality” report skeptics aspire to: http://scienceandpublicpolicy.org/originals/policy_driven_deception.html
lolwot: (btw, is that really your name?) link for anyone who wants to read the kind of “quality” report skeptics aspire to
JK: Please point out which passages in this 209 page paper support your claim.
Thanks
JK
Do you imagine that there’s some sort of conspiratorial sceptic ‘script’ which all sceptics sheepishly adhere to?
Don’t be fooled, climate “skeptics” will switch arguments inconsistently to avoid taking any flack from their thoughtless agenda.
JK: Oh, just like the alarmists!
Thanks
JK
In a recent thread the “similarity” between the 1910-40 warming and more recent warming was regarded as a reason for questioning the explanation for the more recent warming until explanations for the earlier warming are satisfactory.
However, that “similarity” is quite strongly dependent on the SST data. In this thread, doubts about early SST data are being expressed by one of the people who has noted the “similarity”.
So the two positions seem inconsistent unless you simply wish to suspend judgement till it is found that the 1910-40 warming is not at all similar and the SST data was misleading.
Be uncertain, all ye who enter here.
==============
Steve
If you are referring to me, a case can be made for some degree of very broad accuracy if the time scale and location are very highly specified.
There are some well defined sea routes in the Arctic but much of it is a blank canvas. The studies I have seen on that period of warming note that the seas were ‘warm’, as the ice had melted. This is a pretty good e-book on the subject, but I also hope to write about that era of warming myself at some point.
http://www.arctic-heats-up.com/chapter_1.html
tonyb
The interesting bits for me are the southern latitude routes. Data collection there is confounded by seasonality; I’m eyeballing some animations I did long ago of ICOADS.
All that said, heat in the ocean is spatially coherent. That’s the argument for using EOFs to estimate it when data is sparse.
Peter (Webster) has some serious questions about using EOFs. I haven’t paid enough attention to it, so at this point I will wave my arms… like so… arm wave… EOF… arm wave.
not flying, drat.
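As a concrete illustration of the EOF point above, here is a toy sketch of how leading EOFs computed from a well-sampled field can be used to infill a sparsely observed month; the data are synthetic and the approach is generic, not the specific method used by any of the groups discussed here.

    import numpy as np

    # Toy field: 200 "months" x 50 "grid cells" with two coherent spatial modes.
    rng = np.random.default_rng(0)
    t = np.arange(200)
    mode1 = np.sin(np.linspace(0, np.pi, 50))
    mode2 = np.cos(np.linspace(0, 2 * np.pi, 50))
    field = (np.outer(np.sin(t / 12.0), mode1)
             + 0.5 * np.outer(rng.standard_normal(200), mode2)
             + 0.1 * rng.standard_normal((200, 50)))

    # EOFs (spatial patterns) from the well-sampled period, via SVD of anomalies.
    anom = field - field.mean(axis=0)
    _, _, vt = np.linalg.svd(anom, full_matrices=False)
    eofs = vt[:2]                      # keep the two leading EOFs

    # A sparsely observed month: only 10 of the 50 cells report.
    obs_idx = rng.choice(50, size=10, replace=False)
    truth = anom[150]
    obs = truth[obs_idx]

    # Fit EOF amplitudes to the sparse observations by least squares,
    # then reconstruct the full field from those amplitudes.
    amps, *_ = np.linalg.lstsq(eofs[:, obs_idx].T, obs, rcond=None)
    reconstruction = amps @ eofs

    print(np.corrcoef(truth, reconstruction)[0, 1])   # typically high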
Mosh and Zeke –
From general principles and other data sets, I assume that when ocean figures are added the warming trend is reduced.
Is it possible to merge the BEST land data series with an existing ocean series to see what trend arises? As a ‘just for fun’ exercise at this stage.
R. Gates –
re. Trenberth’s missing heat in the oceans. You keep citing figures of “10 x 10^22 Joules in the past decade”. What does this actually mean to those of us who don’t measure our energy output in ergs? And could you point me at a paper which uses Argo data only to show ocean temperatures over the past few years? And finally, there must be *some* measurements done under the ‘travesty level’ of 2000m. What do they show?
Cui bono,
The most comprehensive data on ocean heat content is of course here:
http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT/
In terms of the deeper ocean, some of the better recent studies are here:
http://journals.ametsoc.org/doi/abs/10.1175/2010JCLI3682.1
http://www.sciencedirect.com/science/article/pii/S0967064511001809
Of course, it is a “travesty” that we have no consistent and widespread measure of the ocean heat content below 2000m, as this represents a huge part of the overall energy storage in the Earth system.
Finally, on the question of putting 23 x 10^22 Joules of energy into a metric that is more easily understood: we are talking fractions of a degree over the whole ocean mass (down to the 2000m mark). Such is the power of the specific heat of such a large mass of water!
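A back-of-the-envelope check of the “fractions of a degree” statement; the layer depth, density and specific heat are generic round numbers.

    # Back-of-the-envelope: mean warming of the 0-2000 m layer implied by an
    # added 23 x 10^22 J (the figure quoted above for roughly 1970 onward).
    OCEAN_AREA = 3.6e14        # m^2
    DEPTH = 2000.0             # m
    RHO, CP = 1025.0, 3990.0   # kg/m^3, J/(kg K)

    energy = 23e22             # J
    delta_t = energy / (OCEAN_AREA * DEPTH * RHO * CP)
    print(round(delta_t, 3))   # -> about 0.08 K, i.e. a fraction of a degree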
Some skeptics may say, “A few fractions of a degree! So what’s the big deal?” I would not disagree with that initial reaction; discovering the answer to that is what scientists are hard at work on, finding out what the “big deal” might be from these seemingly small changes. But we should not lose sight of the larger question of whether the Earth system is gaining energy right now. Based on the largest, best metric we have, it is: the Earth, as an energy system, has definitely been gaining continuously over the past 40+ years, with no slowdown this past decade but rather an acceleration, as nearly half the energy the ocean has gained in the past 40 years has occurred in the past 10. Whether this is a “big deal” or not remains to be shown, seen, or discovered.
R gates
Although we have no consistent view of heat lurking below 2000 metres, the IPCC in the draft of AR5 assert that the research demonstrates it is warming down there.
When I ask for access to the research they refuse, saying it was not cited in the draft and therefore they do not have to provide it.
Tonyb
Tony B.,
Such a response to your request seems unreasonable. Have you tried contacting directly some of the scientists who might have this data? Specifically, some of those from this study might be open:
http://www.sciencedirect.com/science/article/pii/S0967064511001809
I guess that is why the sea volume has increased over the last decade. All those ergs are causing a swelling of the oceans.
No wait, sea levels are falling.
http://www.climate4you.com/images/UnivColorado%20MeanSeaLevelAnnualChangeSince1992%20With3yrRunningAverage.gif
http://sealevel.colorado.edu/files/current/sl_global.txt
Just keep saying all the heat is in the oceans.
Sea levels aren’t falling. You’ve graphed the rate of sea level rise, not sea level.
Sea level is rising
http://sealevel.colorado.edu/
Yes, adding SST will decrease the warming trend. However, it’s interesting to note that the ratio of the SST trend to the land trend is a constant of sorts.
There’s a physical reason for that.
Just for fun I suppose I could throw in SST and do a combined plot, but I’m a bit focused on other things.
SST’s will tell us very little about total energy gain/loss by the Earth as a system over the past several decades (just as lower troposphere temps don’t) but might provide an interesting mechanism for explaining why the system (mainly the deeper ocean) has been gaining energy. If SST’s have fallen slightly over the period, but Ocean heat content down to the measured 2000m has increased, then it means there has been a sort of governor on the heat flux from ocean to atmosphere, and perhaps the notion of DWLWR altering the thermal gradient of the ocean skin layer has another form of confirmation by data. Certainly the oceans have been taking up more energy during La Nina and ENSO neutral periods than they’ve been letting out during El Ninos. A full data set of SST’s might quantify this even better.
Thanks Steven. Appreciate you’re busy!
Steven, with complete respect, I’d like to share a comment verbatim; to date, it has made more sense than nearly anything I’ve seen so far in the blogs.
Joe Bastardi says [on WUWT]:
December 3, 2011 at 5:48 am
Again the problem with all the AGW ideas is the quantification of ENERGY. This is my theory on hurricanes and the global temperature, which comes from very basic lessons my father, a degreed meteorologist, taught me when I was young, and is very simple, so much so that it problem would threaten alot of research grant money if correct The “reason” for hurricanes are to take heat out of the tropics and redistribute them into the temperate regions. To lead to hyper hurricane activity, a buildup of energy sufficient to cause excess hurricane activity must take place in the areas where hurricanes are known to occur relative to the need to redistribute the heat.
Suppose we assume that there is, for the globe, x amount of energy available for the processes in the weather. What if the position of the energy becomes “distorted” in other words suppose the need for tropical cyclones is less, because there is already a reason for them not to have to develop, ie more heating ( energy) in the northern latitudes, without a corresponding increase in energy in the hurricane areas. The idea is that a certain relationship between the temperate regions and the breeding grounds must occur to force development.
But here is where the global warming comes in. Say tropical breeding ground water temps stay the same or rise very little, but there is much more rise in the northern areas. What is the need for storms to take energy out of the tropics and redistribute it in the temperate regions, if it already has been done? By this logic, there is a chance that so called global warming can cause less activity since it distorts the ideal pattern for focused accumulation of energy in the breeding grounds, that then must be redistributed! But the rise in northern temps is probably offset by subsequent cooling, as we are seeing now in the tropical pacific with the reverse of the PDO as far as the energy aspects of what a saturated parcel of air means at a given temp. The dirty little secret is that a drop in the wet bulb of 1 degree F at 80 degree has far more implication to the energy of a parcel of air, then the same parcel over a dry climate at 20 below 0. But tropical cyclones are “delicate” as far as how they function, it takes a series of aligned circumstances to produce them in abundance. Disruption of this, even if it is with “apparent” warming, could cause the disruption in the ideal energy pattern globally to produce the major down turn.
One may then argue, well there has to be global warming. Yes and no. There may be an increase in the average temperature, but no increase in the total energy budget of the atmosphere which means the earth will cycle back to where it was before, constantly in search of the balance it can never attain!. And there is a reason to see that, which I will discuss below. One problem is when it comes to seeing if there is truly a rise in temps you cant just add up all the observations and divide them. You must quantify what it means in terms of total tropospheric energy as the where rises and falls are taking place. The global temperature will cool in the coming years, but it will simply be because where it got warmer, cooled. When it does, we will see the increase in ace as the temperature distribution goes back to one where if its colder in the north, it will force the energy buildup near the equatorial ares, primarily over the Pacific where the earths greatest source region for weather is ( I believe the continent of Asia and the southwest tropical Pacific and Indian ocean are key to overall global patterns) and then even as temps cool the ace goes back up. The atlantic in the coming years, responds and cools, but the drop there is not as significant as the return of the pattern in the Pacific to one that breeds more storms in that area of the world, specifically the southwest tropical Pacific. The atlantic will return to the 60s and 70s but ace activity globally will go up.
How can we tell in a quick and dirty fashion that there is no accumulation of excess heat in the lower troposphere, just a distortion of the temperature? The keys are in the tropical production and ace index, and the ice caps. It is a well known point that the northern ice caps have diminished but the southern sea ice is generally going up from where it was when we starting accurately measuring it. The rise in southern ice should not be as great as northern fall in this distortion theory, simply because it takes more change in energy to cool the maritime air around the southern ice cap, than it does to warm the mix of maritime and continental air around the northern ice cap, since it is almost surrounded by land with responds to the warming and cooling of the ocean in the northern hemisphere more, hence the response in the time of the warm pdo and the predicted response in the coming years. So the increase in ice around antarctic and the profile of the ocean in the warm pdo, which is cooler south of 40 south in the warm pdo, argues that the cumulative increase in energy in the north over dry land land that the increase in temp of drier air represents, is offset by the fall, slight as it may seem, of the water around the southern ice cap. There is no change in the total energy of the system ( this was my argument that got people to attack me over) and there is no buildup of energy in the lower troposphere being caused by co2, there is no trapping, that this is not a greenhouse as is commonly referred to, but a thin flimsy veil surrounding the earth that agenda driven types wish to portray as some kind of trapping mechanism that will somehow destroy a process that the earth readily adapts to through a cyclical pattern. There is simply a natural cyclical pattern that develops to respond. Distort for a time that pattern via too much warming in one place and cooling in another and there is a response in the observed features that can tell us that.
So the forecast from all this ( that is what I do, forecast) is that as the Pacifics cold pdo takes hold that the corresponding return toward temps in the northern areas of colder will mean that there is more demand from the earth to redistribute the heat that builds in the tropical oceans even in their colder times, to bring the ace index back up. ITS THE RELATIONSHIP between north and south and where the center point of the the earths max temp is that is important. Imagine if you will a line that represents where the warmest temp is circling the globe. If it is naturally distorted north of the prime tropical breeding grounds, it means there is less reason to breed storms in the tropics. If the energy in the lower troposphere was truly accumulating to cause the need for heat to get redistributed, then the ace would not have dropped and the southern ice cap, would have contracted also, since it would have meant that the waters there over the years we have measured via satellite would have held steady or warmed. The apparent rise in global temps will soon correct ( relatively speaking) and the ace index returning to normal over the coming 10 years will coincide with an observed drop in global temp in the jagged up/down fashion we saw it come up, the northern ice cap will start its recovery, the southern ice cap start its retreat a bit. It is a matter of where the energy relative to what TEMPERATURE REALLY MEANS, and that is not a linear relationship. The fall of the ace index may simply be this message, the earths energy distribution due to “global warming” is distorted. The earth will correct that and as it does, it will prove the cyclical nature of the climate. In a way what is being observed now indicates to one that refused to look at what a measure of temperature really means, that there is some warming, when in reality it is a simply the earth in a natural cycle. Incidentally, as it corrects and cools, in the north, there will be an increase in the ace, which of course will then be attributed to global warming.
One more thing, just one example of distortion and reaction. Think about the years when the worst winter storms in the temperate regions near the US occur. Its when there is blocking, warming in the polar regions. THAT DISTORTS THE PATTERN TO PRODUCE THE UPTICK FURTHER SOUTH Why? if overall there is no change in the energy from the equator to the pole and there is a buildup of heat in one place,there is a corresponding decrease in another. DISTORTION of the pattern is the key folks, but it does not mean that there is a change in what is available in the overall pattern. Once the pattern that set in motion the abnormal storminess is corrected, then the process returns toward normal. There is nothing mysterious about nature simply reacting… to nature!
We see it all the time and I suspect the smoking guns of the climate can be found in the “meaning” of the ace index and the cause for the fall while we are apparently so warm ( its distortion of the global temp) which will correct itself in the coming years. The test of the theory will be to watch the global temps, the ice caps , and the ace index. All they are doing is searching for a balance they can never have, simply because of the nature of the creation!
In the end, for those that love the majesty of the atmosphere, as helpless as we all are to change it, the challenged provided are truly a wonderful gift for the curious and objective among us. In the end, there is nothing new under the sun, there is no buildup of lower tropospheric heat when one considers what heat really is, and the problem is we measure temps in a form where 1 degree in the arctic is given the same importance as 1 degree in the tropics. Events such as hurricanes and icecap increase and decrease are simply the reaction to the earth using what it has, and us now being able to observe it.
John, this is off topic but I will make one comment on this. See my previous post on hurricanes at
http://judithcurry.com/2010/09/13/hurricanes-and-global-warming-5-years-post-katrina/
I don’t buy Bastardi’s argument for the following reason. Tropical cyclones occur in the summer hemisphere, where the pole-to-equator temperature gradient is small (not in the winter hemisphere, where the pole-to-equator temperature gradient is large).
R gates
It was Catch-22. Only cited material could be provided, but as it wasn’t cited it couldn’t be provided. The moment is lost now, as this was part of my expert review of the draft and it’s now too late to find any documentation, especially as it would have to be the precise documentation that the IPCC so enigmatically refer to but won’t cite.
Tonyb
Dr. Curry,
I apologize for the off-topic post; I will read over the link. Yet, from my perspective, there is a reason to consider the global redistribution of energy within the context of climate patterns when weighing the importance of land-based temperature sets and their relative value.
The climate Algorithm is poorly defined, thus a small aspect of it is equally so?
That Bastardi explanation looks completely to be based on his gut-level TV weatherman’s intuition (and what his dad told him).
R. Gates –
Thanks for the links. I’ll take a look.
JC comment: The new version of the Berkeley Earth Surface Temperature dataset is now achieving its goal as an unprecedented data resource, including transparency and user-friendliness. The addition of Steve and Zeke to the team was an excellent move. They have clearly added value to the product. Further, they provide a welcome and needed link between the Berkeley team and the blogosphere.
=========
I couldn’t agree more with your comments. It’s a great first step.
Skimming the graphs, it jumps out that either the timeframes are too short or land-based observations from the existing and predominantly Northern Hemisphere station records are insufficient to properly reflect natural climate cycles.
However, the ability to reflect an urban view of temperature changes over the timeframe will be a tremendous asset for City and Regional Planning.
I tried to find the answer to this, but could not do so. Is BEST going to give us monthly updates in the same way as GISS, NOAA and CRU do, into the indefinite future? In other words, have we seen all the data BEST is going to give us, or will the data be updated monthly?
Will BEST do Bright Sunshine and Clouds?
On UHI, it would be interesting to compare the BEST interpretation on S. Korea with a recent paper by local scientists.
See http://wattsupwiththat.com/2011/07/28/new-paper-uhi-alive-and-well-in-china/
“The cities that show great warming due to urbanization are Daegu, Pohang, Seoul, and Incheon, which show values of about 1.35, 1.17, 1.16, and 1.10°C, respectively.”
Obviously the rapid industrialisation in this area since the 1950s would lead to a huge increase in the UHI effect, and a good test of BEST.
Just a thought that someone more confident with the data might like to follow up?
Cui. There are around 7 studies that look at UHI on a long term basis.
They all look at major cities: Major cities are a small portion of the database.
The long term studies of major cities have results that indicate values of 0.05C per decade to 0.12C per decade.
That’s the worst case: major city versus rural surroundings.
The actual database has a wide range of urbanization, from not urban at all to large city. So the UHI effect, if it exists, will be less than what you see in an isolated study of big-city cases.
Also, south of the equator UHI is very attenuated. Search on latitude and UHI.
Thanks again Steven.
I wasn’t suggesting the UHI figures were widespread across the globe; only that the scientists in the specific study cited seemed to conclude that almost half of S. Korea’s *measured* warming in recent decades was down to UHI, so it seemed like an opportunity to cross-check BEST’s interpretation with a local study, which might prove interesting.
Shame we haven’t got any figures from N. Korea. :-)
cui.
You have to be careful to ensure that you are comparing the same things. For example, are the stations used in the Korean study the same as those present in the BEST data?
You see the same thing in China, where the local scientists have data that they don’t share with the agencies who collect data. At AGU I ran into guys from China who had loads of data that do not get downstreamed to the big agencies that collect data, so it doesn’t get into BEST.
That said, having looked at UHI on a country basis and a continent basis, I can say that you find a wide divergence. Why? UHI is not simple.
In Asia you have certain building practices: dense urban high-rise. In South America, a different pattern. In India, a different pattern.
At this stage the best you can do is answer the following question: if we remove all urban stations from the station inventories and run a rural-only dataset, do we get a different answer? The answer is no, with some wiggle. Sometimes we can find a small effect, other times not.
WHY the effect is so small (or even statistically zero) is an unsolved mystery. Peterson thought it was “cool parks”. That’s not been proven.
Steven, as we know, global population has doubled since the 60s, and this increase has been even more accentuated for urban localities than for rural ones, so I would expect a certain signal of this increase in the database.
Do you know the number of rural locations compared to urban locations in the database?
This would give us some information on how diluted the urbanisation signal is.
There is also the RUTI experiment which is based only on the rural locations – unadjusted data. Would be interesting to compare the 2.
I saw an interesting chart of RUTI versus the 1999 GISS chart, before adjustment, showing nice concordance:
http://hidethedecline.eu/pages/ruti/north-america/usa-part-1.php
Lars.
1. The RUTI analysis is not replicable, as they haven’t defined urban or rural in an objective way that is tied to the physical causes of UHI.
2. Yes, population has about doubled; unfortunately, population is only a proxy for SOME of the physical causes of UHI. The cause of UHI is a change in the energy balance equation in the local area. The most important variable is the area (or log(area)) that has been transformed and the transformation that has taken place. Put another way, population growth doesn’t explain much. It is modulated by how that growth took place: see Oke and Stewart on Local Climate Zones.
Is anyone looking at the incorrect implementation of the Jackknife formula?
I’ve asked; what I should probably do is get a better understanding of your issue so I can explain it when I ask the question again.
Steven – and anyone else who might read:
On my first reading of the paper, I discovered that the method which estimates the CI is improperly implemented. I thought it was clever, like Steig’s stuff was, but it is also flawed in its application.
The method removes a fraction of the data and looks at the differential “damage” (Lubos’s word) caused by removal of that information. BEST damages the data by removing 1/8th of the dataset 8 different times. This 1/8th is then used to estimate the true CI of the methods and data. It really was very cool.
Like a Mannian hockey stick though, I realized immediately that BEST had made an error. They re-weight the data after each 1/8th removal. The tendency of this is likely to underweight the uncertainty (I think), but the effect could go either way.
Any way you look at it though, the estimate is incorrect. The problem in correcting it is that they were trying to estimate the uncertainty of the method but there aren’t any easy answers now. I believe they realize the problem but communication has stopped on their end. The CI section of their paper needs to be redone completely from the early publication.
A link to my work on this is here: http://noconsensus.wordpress.com/2011/11/20/problems-with-berkeley-weighted-jackknife-method/
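For readers unfamiliar with the technique being criticized, a minimal sketch of a plain delete-one-group jackknife (8 folds) applied to a simple statistic. This is the textbook version, not BEST’s implementation; the re-weighting step described above is exactly where the two differ.

    import numpy as np

    def delete_group_jackknife(data, statistic, n_groups=8, seed=0):
        """Textbook delete-one-group jackknife standard error of `statistic`."""
        rng = np.random.default_rng(seed)
        idx = rng.permutation(len(data))
        groups = np.array_split(idx, n_groups)
        estimates = []
        for grp in groups:
            keep = np.setdiff1d(idx, grp)            # drop one 1/n_groups chunk
            estimates.append(statistic(data[keep]))  # re-run the statistic unchanged
        estimates = np.asarray(estimates)
        var = (n_groups - 1) / n_groups * np.sum((estimates - estimates.mean()) ** 2)
        return np.sqrt(var)

    data = np.random.default_rng(1).normal(size=1000)
    print(delete_group_jackknife(data, np.mean))  # close to data.std() / sqrt(1000)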
Jeff, I have sent this several times to the team. I think they have been focused on getting v2 out. I think Robert Rohde is starting to revise the methods paper, and will be looking at the critiques. It would be good if Steve can follow up with Robert on this.
“that they were trying to estimate the uncertainty of the method ”
should say “that they were trying to estimate the uncertainty of the method and data combined”
Thanks for the reply Judith. The lack of reply was more of a problem than the problem itself. If there is something I can do to clarify, you only need to ask. If I’m missing something, I’m quite happy to post that on my blog as well.
I will be curious to hear the thoughts of the team.
“R. Gates | February 18, 2012 at 3:39 pm |
if you’re looking for a longer-term signal, the deeper global ocean is the place to look.”
“DocMartyn | February 18, 2012 at 5:44 pm |
No wait, sea levels are falling.”
There is no contradiction between the sea surface cooling and the deep ocean warming by a slightly greater amount and the fact that sea levels may be falling. The sea surface is at a much warmer temperature than the deep ocean.
The thermal expansion coefficient for sea water at one atmosphere pressure and 20 C is 250 x 10^-6 K^-1. This contrasts with the coefficient for sea water at one atmosphere pressure and 0 C, which is 52 x 10^-6 K^-1.
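A toy sketch of the point being made, using the two expansion coefficients just quoted; the layer thicknesses and temperature changes are invented purely to illustrate how the signs can work out.

    # Toy sketch: sea-level change from thermal expansion of two layers,
    # delta_h = alpha * delta_T * layer_thickness (per unit area of ocean).
    ALPHA_WARM = 250e-6  # 1/K, near-surface water at ~20 C (from the comment)
    ALPHA_COLD = 52e-6   # 1/K, deep water near ~0 C (from the comment)

    # Hypothetical changes: surface layer cools a little, deep layer warms a little.
    surface_mm = ALPHA_WARM * (-0.10) * 700.0 * 1000   # 700 m layer cooling 0.10 K
    deep_mm = ALPHA_COLD * (+0.10) * 2000.0 * 1000     # 2000 m layer warming 0.10 K

    print(round(surface_mm, 1))            # -> about -17.5 mm
    print(round(deep_mm, 1))               # -> about +10.4 mm
    print(round(surface_mm + deep_mm, 1))  # net fall even though more heat went in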
However, my problem with R. Gates’ position is twofold. First, who cares if the deep ocean gets a tenth of a degree hotter, assuming it can be proved? And second, even if the temperature of the deep ocean did rise from 4.0 C to 4.1 C, how will this “heat” ever make it through the surface of the ocean and into the air at any point in time?
Werner, I would not disagree with your basic analysis, or your “who cares” reaction. My overall point is not about what the effects might be, but rather that the sub-surface and deeper ocean must be the place to look for energy imbalance in Earth’s climate system, as other areas, such as the land surface and sea surface, have far too little heat capacity and are far too variable to identify longer-term signals. Those who say the Earth has seen no warming this past decade are simply wrong; all they can say is that tropospheric temperatures have been flat this past decade. By the far better and greater metric of Earth’s energy balance, ocean heat content shows the past decade had the largest increase in total Joules of energy of any decade in the past 40 years. Hardly the sign of an Earth system that is not gaining energy.
“R. Gates | February 19, 2012 at 12:45 am | Reply
…ocean heat content shows the past decade had the largest increase in total Joules of energy of any decade in the past 40 years”
Even if this could be proven to be true, how could you prove it was due to increased CO2 instead of a huge increase in undersea volcanic activity?
Werner, the amount of such activity it would take to add that many Joules of energy to the deep ocean would be massive: roughly 10,000 times all the undersea volcanoes known to have existed over the past 50 years. Granted, there are many undersea volcanoes that exist undetected, but certainly not 10,000 times as many. Additionally, of course, we can now observe massive downwelling in areas such as the Pacific Warm Pool, so we can see this warmer water being forced downward from surface layers. The warming is not coming from the bottom but being forced down from the surface through downwelling.
“R. Gates | February 19, 2012 at 12:45 am | Reply
shows the past decade had the largest increase in total Joules of energy of any decade in the past 40 years
…Additionally of course, we can now observe massive downwelling in areas such as the Pacific Warm pool”
If the last decade had such an increase, why was the El Nino from 2010 weaker than the one in 1998? (At least according to HadCrut3 and RSS and UAH)
Is BEST really sure that they are getting truly raw data? Is it really the original data written down by the station people day by day?
Or has it been adjusted by intermediaries?
Thanks
JK
Data comes from the sources as identified.
Judith,
Velocity differences were NOT included in the Berkeley averaging equations.
Ocean heat content not following climate models:
http://pielkeclimatesci.wordpress.com/2011/12/04/upper-ocean-heat-content-change-by-bob-tisdale/
Bob B.,
I am wondering why you’d choose to use a shorter-term, very selective range of data from Tisdale that doesn’t even go down to 2000m when a much broader range of data is available from NOAA here:
http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT/
Tisdale does his best to cherry pick the data to make his point, whereas the full set of data tells the true story. You can eat those Tisdale Cherries if you want, but they are psychotropic and will give you a skewed perception of things.
Because that longer term ‘data’ is regarded as extremely dubious even by those who produce the estimates. Tisdale is using the only accurate measurements. As we don’t expect fluctuations in this massive damper, there should be no flattening at all. That is what was expected when the experiment was begun. The results were highly unexpected, especially as they initially showed cooling prior to being ‘corrected’ upwards.
Stop and think for a minute before berating skeptics. You are obsessing over a largely natural and pretty minute rise of 0.6 degrees a century and believing an extrapolation to 3 degrees next century – based entirely on models that are unfit for the purpose of such a task. There is ample reason for skepticism.
Now if you want to argue that in the valley of huge uncertainty you fear the upper limit, that’s all well and good, but some of us have long memories of similar scaremongering by over-excitable scientists that turned out to be flat wrong. What we see here is a familiar story: the data – stratospheric, tropospheric, sea surface and radiosonde – all say the hypothesis is disproven, but the scientists who have been preaching this hypothesis for 25 years with zero predictive success just can’t let it die. The global cooling scare died when temps went up. What does it take this time?
Tisdale is the king of cherry pickers. He’s very good at making charts that carefully pick the data to make his point. Again, the Tisdale Cherries are psychotropic and will lead to a very skewed perception.
Bummer. I thought you might be able to rein in your insults.
vukcevic has an intriguing comment upthread February 18, 2012 at 6:19 pm thus
‘By accident or design the North Atlantic SST, with 2 corrections happens to be pretty good. Reykjavik atmospheric pressure has been measured regularly and accurately since 1860. Barometer was invented in 1640s, and more accurate aneroid barometer in 1840 (with easy re-calibration against modern instruments).
Existence of a good correlation between the AMO and the Reykjavik atmospheric pressure was not known until very recently (late 2011), hence there is no reason to suspect that the atmospheric pressure data was ‘adjusted’ to correlate to the AMO.’
http://www.vukcevic.talktalk.net/theAMO-NAO.htm
He seems to have teased out a key clue. A bit ironic for an unfunded(?) sceptic to trump hundreds of multi-billion funded consensus experts if true. But inevitable perhaps in the end if they’re all chasing an anthropogenic decoy ( Murphy’s Law plus the sarcastic saying ‘Good enough for government work’ also spring to mind).
Going through Berkeley’s paper: berkeleyearth.org/pdf/berkeley-earth-uhi.pdf
I find it odd that the station trend analysis compares the trend of all very-rural locations with the global trend, without an analysis of the distribution of the locations.
Looking at figure 2 in the paper, we see different areas covered by the two subsets. My question is whether the trends of those areas are shown to be the same. I do not think so, which would simply explain the results.
The rural sites do not cover the same areas with the same density of measured points.
Judging by the “look” of fig. 2 it is a comparison of the rural trend of North America, Scandinavia and some Russian and Australian locations to the rest of the world.
We find that North America, Scandinavia and a mix of the remaining rural stations show more warming than all stations, the opposite of what is expected from UHI. But this does not show that the trends from rural and urban measurements are the same and that UHI can be ignored.
It measured the trend of a certain region based on rural stations against the global trend based on all stations, including UHI-polluted ones.
A very simple verification is to plot rural versus urban data for the same area, which is what RUTI does, and to my understanding this invalidates the conclusions of this paper.
The trends shown by RUTI do differ to the overall trend.
Looking further in the Berkeley paper at the warming stations in the US, it is relatively easy to find big cities shown in intense red in fig. 4, as well as warming in the northern half, where UHI would have a bigger influence.
Lars, that is an interesting comment. I had been wondering the same thing; I have forwarded your comment to the BEST team.
Judith,
Thank you for your feedback.
There was another thing that kept me wondering – the new urban cooling effect – and I think I have a theory of how this could happen:
UHI is not a constant value. It differs from city to city, and for the same city, as the city grows, the absolute value of the UHI grows – as we can see in the Tokyo example.
Logically the urban heat increase would be nonlinear in population increase. I understand what Steven says – that the dependency is rather on the type of development or location – but assuming two different cities with the same type of location and the same type of development, I would expect the bigger one (population-wise) to have a larger UHI deviation. So as a city grows, its UHI effect – the absolute temperature difference – would increase (keeping the same type of development), but as the city grows larger the temperature increase does not keep pace with it. I would imagine the dependence on population is a logarithmic function. Something like
1° for first 100 000
2° for 1 000 000
3° for 10 000 000
(numbers only as example for the dependency with no meaning whatsoever)
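Lars’s illustrative numbers correspond to a simple logarithmic dependence; a toy sketch follows, with coefficients that merely reproduce his made-up example and carry no physical meaning.

    import math

    def toy_uhi(population):
        """Illustrative only: +1 C per factor of 10 in population, anchored so
        100,000 people -> 1 C (reproduces the example in the comment above)."""
        return math.log10(population) - 4.0

    for pop in (1e5, 1e6, 1e7):
        print(int(pop), toy_uhi(pop))  # -> 1.0, 2.0, 3.0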
Berkeley did a comparison between what they considered very rural locations and the rest.
Considering that the whole database is UHI biased, this UHI would influence the trend, producing a growing trend wherever the location grows. As per the example above, smaller locations would show a higher UHI increase than bigger locations.
Furthermore, from demography we know the cities/locations in the developed countries are not growing or have very limited growth.
Such a difference in growth would manifest itself as a much smaller UHI increase for a big city and a much bigger one for a small, growing location – which is exactly what the Berkeley team measured.
So the fact that they found a “cooling trend” for urban locations is not proof that UHI has no effect on the trend. Quite the contrary, it might be proof that the whole database is UHI “contaminated” and that the UHI effect on trend is not as strong for big cities as it is for small, growing ones.
In that case it is not a cooling trend but a difference in UHI growth: the lack of growth of many big cities in temperate regions (Europe, Russia) and slow growth elsewhere (Canada, USA) result in slow UHI growth, while small locations considered very rural – airports with increased air traffic, changes in insulation, growing small towns – create a bigger UHI increase.
It should be noted that both datasets are used as input to the whole methodology according to the paper. Thus both are used to estimate the area weighted global average. The coverage of the rural dataset may be worse in some areas leading to more uncertainty in the average, but there should not be any systematic bias due to the fact that there are more rural stations on certain parts of the world.
If you look at UHI on a continental basis you find some very interesting things.
In our poster for AGU (using GHCN Daily) we had coverage for Europe, Australia, Russia, and the US. You find a small positive UHI effect.
But we had no coverage for South America and little in Africa.
Adding coverage in those places will diminish the effect.
The other thing that needs to be controlled for is coastal versus inland.
R Gates
http://i54.tinypic.com/2lcp5p5.jpg
http://bobtisdale.wordpress.com/2011/10/24/introduction-to-the-nodc-ocean-heat-content-anomaly-data-for-depths-of-0-2000-meters/
Regarding David Springer’s authoritative, no-nonsense proclamations about no warming in the ocean, please see: http://imageshack.us/photo/my-images/837/02000mohca.jpg/
The ENSO oscillation is now in a deep La Nina which has produced a 20 month-long running heat transfer from atmosphere to ocean. This La Nina (http://www.esrl.noaa.gov/psd/enso/mei/ ) has been the most powerful since 1975 in the MEI index, but the La Nina temperatures are considerably higher than in 1975 (http://www.woodfortrees.org/plot/gistemp/from:1970 ). When El Nino returns to a level of greater than 1.5 on the MEI scale, we will most certainly break all records for the atmospheric temperature. The baseline for atmospheric temperature oscillations keeps increasing as the upper ocean warms due to AGW.
Owen, your analysis regarding ENSO in 1975 is too simplistic. You only take into account the short-scale multiannual variations. On multidecadal time-scales, ENSO was much colder in 1960s and 70s than in 80s/90s/00s. Take a 10-year average of that MEI and you will see how much colder 1975 was.
Owen,
ENSO is driven by trade winds. It’s a surface phenomenon. Slower winds hamper the evaporation rate and allow the sun to heat the calm water higher than normal, and the absence of wave action prevents the normal mixing of the mixed layer down to 1000 meters, so heat builds up at the surface. Since the ocean can’t dissipate heat efficiently via longwave radiation (for the same reason it doesn’t absorb it efficiently either), this ends up increasing conductive heating and we get abnormally warm air. It also results in an accelerated rush of warm water to the Arctic, which goes into latent heat of fusion (it melts ice). If you examine Arctic ice extent you’ll see a step-change downward in extent approximately 18 months after the 1998 Mother of all El Ninos. This is about how long it takes the oceanic conveyor belt to carry water from the southern Pacific to the Arctic circle.
And the stubborn fact remains that we don’t have to speculate about global ocean temperature any longer. We have it pegged fairly well by ARGO, especially in the mixed layer, and it just isn’t warming. It’s a travesty, by the way, that we can’t explain why the global ocean isn’t warming. Well, it’s not a travesty for me because I can easily explain it to my own satisfaction. True believers in global warming don’t care for my explanation yet have none of their own. What a shame for them.
Oops sorry. The ocean mixed layer is 1000 feet not 1000 meters. Damn you metric system!
“We have it pegged fairly well by ARGO, especially in the mixed layer, and it just isn’t warming. ”
——————————————
What data are you looking at?
Edim,
Then take instead the La Nina in 1988-89 ( http://www.esrl.noaa.gov/psd/enso/mei/ ) – smaller than the current one, but global temps are considerably higher with this current one (http://www.woodfortrees.org/plot/gistemp/from:1970 ). We have been in La Nina conditions roughly for 4 of the past 5 years, and it is still quite warm. The ocean is heating and the baseline is changing.
Owen, give it some time to develop. 1980s and 1990s were strongly positive in average. 2000s were kind of ENSO-neutral in average and it did stop the warming (whatever did – I don’t claim ENSO is the driver). I expect ENSO to be negative in average in 2010s. Cooling will be at least as steep as the 90s warming.
R. Gates and Pekka Pirilä
All this fretting about the (unmeasurable) Gigajoules going into the deep ocean (slipping past the ARGO devices in the upper ocean undetected on the way) is a bit absurd.
The heat content of the ocean is so immense that if ALL the theoretical 2xCO2 equilibrium warming were to go into the deep ocean instead of the atmosphere, this would warm the ocean by 0.002°C.
Yawn…
And this is supposed to come back out some day and fry us all?
Gimme a break.
This is voodoo science at its best.
Max
The heat content of the ocean is so immense that if ALL the theoretical 2xCO2 equilibrium warming were to go into the deep ocean instead of the atmosphere, this would warm the ocean by 0.002°C.
Max – Do you ever pause to think before you press the Post Comment button? I wonder if you realize how foolish statements like that make you look. For reference, remember that “warming” is measured in deg C or K, not in joules. If at equilibrium, the ocean surface became 3C warmer, why not reconsider your statement that “the ocean” would warm by 0.002C? Even the deepest couple of thousand meters would warm well above that amount at equilibrium, even if not by 3C.
Fred Moolten
Looks like you jumped the gun before engaging your brain.
The premise has been expressed that the “missing heat” is going into the “deep ocean” (miraculously by-passing the ARGO sensors scattered around all over the globe measuring the “upper ocean”).
I simply pointed out how absurd it is to think that the lower ocean would store the “missing heat” for later miraculous release back to the atmosphere to “fry” us.
Use your head, Fred. Otherwise you sound stupid (even though I know you really aren’t).
Max
You don’t lose gracefully, Max. Reread what you wrote. Do you truly think anyone will believe that the “ocean” will only warm 0.002C at equilibrium ?
Your comment about the warming at equilibrium is irrelevant to questions about the “missing heat” of a period of less than a decade, if it is in fact missing. I don’t think you understood that either, but it’s a different issue. It’s an interesting one, and if you want to discuss it, we can, but I think you are using it to divert attention from your error.
If you make a mistake, admit it. Trying to insult me will simply hurt your image more than acknowledging you were wrong. Step back and consider how other readers encountering this exchange will react to your 0.002C claim.
0.002 C is ridiculous just from the standpoint that the effective thermal diffusion coefficient will keep most of the heat near the surface. I think 0.002 C comes about if one takes the entire volume of the ocean, which is not the way these kinds of calculations are done.
What happens is the heat from the GHG forcing from the atmosphere will divert partly into the atmosphere+land and partly into the ocean based on the interfacial diffusion rates. This is a basic idea that I lifted from James Hansen’s early papers and modified to use a heat-equation solution for planar diffusion:
http://3.bp.blogspot.com/-A9QQUeX8gPw/TzkO83PDjSI/AAAAAAAAA9A/3fYrJn5d9mE/s1600/thermal_diffusion_best_gistemp.gif
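For readers who want a feel for what a planar-diffusion response looks like, here is a minimal sketch in R of the textbook constant-flux solution for a semi-infinite medium. The flux, heat capacity, and effective diffusivity below are illustrative assumptions, not WHT’s actual parameters or code:

    # Surface temperature rise of a semi-infinite medium heated by a constant
    # flux q0 starting at t = 0 (standard heat-equation result):
    #   dT(t) = (2 * q0 / rho_cp) * sqrt(t / (pi * D))
    surface_warming <- function(t_years,
                                q0     = 1,      # W/m^2, assumed flux into the ocean
                                rho_cp = 4.1e6,  # J/(m^3 K), volumetric heat capacity of seawater
                                D      = 1e-4) { # m^2/s, assumed effective (eddy) diffusivity
      t_sec <- t_years * 3.156e7                 # years -> seconds
      2 * q0 / rho_cp * sqrt(t_sec / (pi * D))
    }
    round(surface_warming(c(10, 50, 100)), 2)    # roughly 0.5, 1.1, 1.5 C

With those assumed numbers most of the warming stays near the surface, which is the point being made about the 0.002 C figure.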
Actually Fred, Max has the number about right for the annual temperature rise of the deep ocean under a half-watt imbalance. The numbers are relatively straightforward and work out to 0.002C per year for the entire ocean, which is to say for the deep ocean, since 90% of it is below the thermocline.
http://wattsupwiththat.com/2011/12/30/losing-your-imbalance/
We question whether the ARGO sondes have the accuracy and precision required to find a half-watt/m2 of missing heat in the deep ocean since it amounts to only 2 thousandths of a degree C per annum.
I fear you don’t know your oceanography any better than you know atmospheric physics. Perhaps you should stick to cancer research. We’re still looking for improvements there so you abandoned your post before your mission was complete. Did you think you might have more success at curing Mother Earth’s “fever” than you did at curing cancer?
Dave – I’m going to give you some advice that you might not welcome, but is worth considering. If you become obsessed with finding fault with someone else because you have fared badly in a previous encounter, you are likely to lose objectivity and make very foolish statements. Having said that, I’ll leave it to you now to explain how, at equilibrium (not annual rates of rise), a warming of the surface to 3C will leave the temperature of the “ocean” only 0.002C warmer. I think it’s apparent to knowledgeable readers that such a result is absurd, but if you want to try to justify it with numbers, mechanisms, and gradients, you should go ahead.
Roy Spencer graphed 15 models of deep-ocean temperature against “complete ocean data archived for the period 1955-1999”:
http://www.drroyspencer.com/2011/08/deep-ocean-temperature-change-spaghetti-15-climate-models-versus-observations/
The trend seems to be 0 +/- 0.02C/decade.
Even the “experts” don’t think the deep ocean has warmed at all.
The most recent data gathered from the deep waters of both the Atlantic and Pacific would indicate warming per year at least several orders of magnitude greater than an estimate of 0.002C for a 3C surface temperature rise. Note: that’s per year. We are seeing warming of around 0.01C per year at abyssal depths in some regions. See:
http://journals.ametsoc.org/doi/pdf/10.1175/2007JCLI1879.1
https://darchive.mblwhoilibrary.org/handle/1912/1213
http://journals.ametsoc.org/doi/pdf/10.1175/1520-0442(1999)012%3C3297%3ATAOTNS%3E2.0.CO%3B2
http://journals.ametsoc.org/doi/abs/10.1175/2008JCLI2384.1
An estimate of 0.002C warming for the deeper ocean over the next century is already very wrong, and looks to be off by at least a factor of a thousand.
The confidence limits suggest the deep ocean could be cooling.
Max presents the value 0.002 C in a way that I cannot understand and which appears to be totally false. It’s, however, a good number for giving the scale.
An average net heat flux of 1 W/m^2 leads to the warming of the oceans by the average amount of 0.002 C/year. The estimated change in the ocean heat content (top 2000 m) given in graphs on the NOAA site corresponds to 60% of that.
It’s clear that the warming of the oceans is very far from uniform. Thus any particular volume may warm much faster or slower, but the average appears presently to be 0.001 – 0.002 C/year. This is an indication of the fact that it takes an extremely long time for the oceans to get even close to thermal equilibrium (close to equilibrium might be defined e.g. by the requirement that 90% of the volume has gone through 90% of the ultimate change). As part of the volume is effectively warmed only by heat conduction, there are regions that warm still much more slowly than volumes influenced by thermohaline circulation and other forms of mixing, but most of the ocean takes part in some form of mixing.
A warming that’s 1000 times stronger than 0.002 C for the total ocean volume takes 1000 years with an average net energy flux of 1 W/m^2; reaching it in 100 years would require an average net flux of 10 W/m^2. It’s always good to check such numbers before making categorical statements.
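The scale argument is easy to check with back-of-envelope numbers; a minimal sketch in R using standard rounded reference values for ocean area, ocean mass, and seawater heat capacity:

    area_ocean <- 3.6e14          # m^2, ocean surface area
    mass_ocean <- 1.4e21          # kg, total ocean mass
    cp_sw      <- 3990            # J/(kg K), specific heat of seawater
    secs_year  <- 3.156e7
    flux       <- 1               # W/m^2 average net imbalance over the ocean surface
    joules_per_year <- flux * area_ocean * secs_year   # ~1.1e22 J
    joules_per_year / (mass_ocean * cp_sw)             # ~0.002 C per year, whole-ocean average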
Presumably the “missing heat” (which is allegedly 0.5 W/m2 according to TOA satellite measurements) has to pass through the ocean mixed layer. While that is enough energy to raise the entire ocean by 0.002C annually, it would be roughly 0.02C annually in the mixed layer. Two hundredths of a degree per year in the upper 1000 feet of the global ocean is well within ARGO diving range and the precision of the instruments on them. The imbalance can’t be found. I suspect the TOA-balance measurement isn’t accurate or precise enough to put the correct minus or plus sign on the TOA energy exchange.
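For scale, the same back-of-envelope arithmetic with the half-watt imbalance confined to the top ~1000 feet (~305 m) lands in the same ballpark as the figure quoted above; this is only a rough check, and says nothing about where the heat actually goes:

    depth_ml <- 305                                    # m, assumed mixed-layer depth (~1000 ft)
    mass_ml  <- 3.6e14 * depth_ml * 1025               # kg of seawater in that layer
    0.5 * 3.6e14 * 3.156e7 / (mass_ml * 3990)          # ~0.013 C per year in the mixed layer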
In light of current mainstream climate theory as represented by GCM ensembles, we can’t explain why there has not been any significant temperature increase since 1998. Atmospheric CO2 has grown at an accelerating pace during those years, yet the temperature increase just plain stalled. It’s a travesty that it can’t be explained. Actually it’s not a travesty; it’s a falsification of one or more of the underlying hypotheses and/or a demonstration that the GCM ensemble is incomplete.
Seriously, if you assume (for the sake of argument) that land surface temperatures warm 1.1C per doubling of atmospheric CO2 and that ocean surface temperature remains the same then compare that to any observation you can find and it will make perfect sense. All kinds of numbers start falling into place. The “missing heat” is found to be evenly distributed in a sphere surrounding the earth with a 100 light-year radius.
Nothing about earth’s energy budget makes sense except in the light of greenhouse gases not being able to significantly raise the equilibrium temperature of a deep body of water. This is easily explained with basic physics. Where there’s an effectively unlimited supply of water to evaporate, that is how surfaces preferentially cool; it’s the path of least resistance for heat to escape. Greenhouse gases insulate land surfaces by restricting radiative heat loss, and on land the only alternative path of escape is conduction. Rocks don’t evaporate. Conduction is very inefficient, so the surface equilibrium temperature rises, which is the only thing possible in that case. Water, however, has an extremely efficient path for heat to escape other than radiation, and that’s evaporation. Because evaporation entails no temperature rise between the ocean surface and the water vapor above it, conduction also slows way down over the ocean, and thus there is seldom much difference between the ocean surface temperature and the air just above it. No temperature differential means no conductive heat exchange.

This is what is happening, and all observations of earth’s energy budget, SSTs, ARGO, and the instrumental land record make perfect sense in light of it. If you focus on northern-hemisphere land temperature records you can dig out a small anthropogenic warming driven by CO2 and CO2 equivalents, which works out. If you average the SST record in with the land records, the anthropogenic signal is greatly diminished because there’s twice as much ocean as there is land. If you subtract the southern hemisphere you can still make out the anthropogenic signal in the northern hemisphere, because the northern hemisphere has twice as much land as the southern hemisphere and land is the only place the warming signal is formed. EVERYTHING makes sense with easy, consistent explanations if you only assume that GHGs do not insulate deep bodies of water which are free to evaporate.
Max,
Check again what the CERN CLOUD experiment actually tells us. (Hint: it doesn’t really tell anything about what happens in the atmosphere due to cosmic rays.)
Heat coming back from the oceans is also a misrepresentation of expectations, although some unfortunate formulations by climate scientists have helped in creating that misrepresentation.
Hi Pekka
Yep.
I did check CLOUD.
So far it has experimentally confirmed the cloud nucleation hypothesis of Henrik Svensmark, but the quantification of how this will actually work in our planet’s atmosphere is still uncertain.
Let’s see what the experiment comes up with.
It would be exciting if we had an experimental validation of an alternate hypothesis to AGW as the principal driver of our climate, thereby shifting the currently held paradigm.
Don’t you think so?
Max
I found it interesting that the strongest results from the CLOUD experiment were related to the role of ammonia rather than radiation. It was also clear, even before any experiment, that radiation has an effect on the early phase of nucleation (everybody who has worked with a cloud chamber knew that). The experiment confirmed that and gave some quantitative results on it, but how those compare quantitatively with expectations has not been stated.
The experiment will hopefully help in improving understanding of cloud formation, but it’s only a small step towards that goal, and it’s totally premature to foresee what further work will reveal.
The motivation of Kirkby in pushing for the experiment is related to the theories of Svensmark, but that doesn’t mean the qualitatively anticipated results tell us anything about the validity of Svensmark’s ideas.
CERN CLOUD tells us that GCRs increase the number of possible nucleation precursors, but that these are in and of themselves still too small to serve as nucleation sites. Left to demonstrate is that the increased number of smaller particles eventually combine into a greater number of larger particles. It’s not an unreasonable leap to presume there’s a good chance they do indeed combine. If only climate boffins were held to such high standards as requiring experimental proof, we wouldn’t be here having this argument; instead they’d be out experimenting and possibly finding different answers from the real world than from unphysical models of reality in computer memory.
At any rate the most important finding from CERN CLOUD is that it could have falsified Svensmark’s hypothesis and it did not. The hypothesis is alive, well, and now supported by CLOUD not falsifying it. Compare this result to mainstream climate models which failed to predict 15 years of stalled global average temperature despite heroic anthropogenic efforts to burn more and more fossil fuel each year during that time. That is a failure of a hypothesis.
In a nutshell the score is:
Svensmark Hypothesis 1
Alarmist CO2 Hypothesis 0
The game isn’t over of course but it isn’t looking good for the CO2 alarmist position.
I don’t think there was even the slightest possibility that the experiment would not have given qualitatively the results it gave. Thus there was no possibility either that it could have directly falsified Svensmark’s hypothesis. I say this because the basic phenomena were well known from numerous empirical sources. The quantitative details were not known, but nothing that has been published indicates that they tell us anything more specific without significant additional analysis.
No points to either side based on the steps that have so far been taken.
How can we be sure we are not dealing with more fakery?
Mac,
I’m not sure what “fakery” you are referring to, but the point of this data release is to encourage you to download it and find out for yourself.
I looked at one station in BC, and other than BEST being to 3 decimals and Environment Canada being to one decimal, it was the same.
The good news for me is that after looking at all of BC with data from 1995 to 2011, it is cooling slightly over that period.
We are saved.
If you are referring to location data, then the Environment Canada data would be old, since the recent upgrades to locations come in through WMO.
You can help by citing sources in ways that are traceable.
Not quite ready. Still working on my R scripts. Pretty useful after only a few days.
A teaser.
The one labelled 7998 is from the BEST data. (Haven’t parsed the station list table yet).
http://i44.tinypic.com/xr7o0.png
The one labelled Victoria Intl A is from EC.
http://i42.tinypic.com/ibj7o4.png
The minor differences are EC using one decimal and BEST 3.
Data from 1995 to 2011.
Since there are 15 BEST source datasets, citing the station number doesn’t do much good in terms of traceability.
Also, just citing the name of the Environment Canada site does no good.
What you need to do is write a program that fetches both sets of data. That program can then be re-run and checked. That is what we asked Mann to do; I’d expect no less of you.
I’ll get there.
The EC data was scraped from here:
http://climate.weatheroffice.gc.ca/prods_servs/cdn_climate_summary_report_e.html?intMonth=1&intYear=1995&prov=BC&txtFormat=text&btnSubmit=Submit%22
http://i40.tinypic.com/nqyez9.png
Brrr.
Bruce,
Leaving links around is NOT producing a traceable, replicable case.
Write up something proper that others can use and check.
Notice I didn’t say get it published. I just said write it up in a USEFUL way.
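For what it is worth, here is a minimal sketch in R of the kind of traceable script being asked for. The BEST file name, its column meanings, and the “%” comment convention are assumptions inferred from the sample rows quoted in this thread; the EC URL is the one Bruce posted below, and station 7998 is the one from his teaser plot:

    # Fetch one month of Environment Canada monthly summaries (a text report
    # that still needs parsing by station name) and read a BEST data file.
    ec_url <- paste0("http://climate.weatheroffice.gc.ca/prods_servs/",
                     "cdn_climate_summary_report_e.html?intMonth=1&intYear=1995",
                     "&prov=BC&txtFormat=text&btnSubmit=Submit%22")
    ec_raw <- readLines(ec_url)

    # BEST rows look like: station_id  series  decimal_date  temp  unc  n  flag
    cols <- c("id", "series", "date", "temp", "unc", "n", "flag")
    best <- read.table("data.txt", col.names = cols, comment.char = "%")
    victoria <- subset(best, id == 7998)
    plot(victoria$date, victoria$temp, type = "l",
         xlab = "Year", ylab = "Monthly temperature (C)")

A script along these lines, checked into a public repository with the exact file versions recorded, is the sort of thing anyone else could re-run.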
Do you want a list of the possibly 1,000 stations that fit the lat/long of BC but do not have a BC designator in the State/Province code?
From SITE_DETAIL.TXT in the single-valued release.
Here are the ones with BC in the name (a sketch of how to pull these out follows the list):
Station_ID Station_Name
7992 CARMANAH,BC
8070 COWICHAN LAKE FORESTRY,BC
8099 AMPHITRITE POINT,BC
8147 GRAND FORKS RAYFIELD,BC
8152 TOBACCO PLAINS,BC
8219 ROSSLAND CITY YARD,BC
8289 CLAYOQUOT,BC
8341 NORTH NICOMEN,BC
8342 KEREMEOS,BC
8356 NANAIMO DEPARTURE BAY,BC
8375 STAVE FALLS,BC
8388 VANCOUVER UBC
8473 FRENCH CREEK,BC
8482 HEDLEY,BC
8602 FERNIE,BC
8862 POWELL RIVER,BC
8863 OYSTER RIVER UBC
8900 KASLO,BC
8909 KELOWNA AWOS, BC
8969 CORTES ISLAND,BC
9014 MALIBU JERVIS INLET,BC
9059 VERNON COLDSTREAM RANCH,BC
9061 DUNCAN LAKE DAM,BC
9066 NAKUSP,BC
9148 THURSTON BAY,BC
9166 PEMBERTON MEADOWS,BC
9175 HIGHLAND VALLEY LORNEX,BC
9194 INVERMERE,BC
9204 QUATSINO,BC
9268 SINCLAIR PASS,BC
9307 SHALALTH,BC
9319 SIDMOUTH,BC
9325 CAPE SCOTT,BC
9343 TRIANGLE ISLAND,BC
9389 REVELSTOKE A, BC
9470 BARRIERE,BC
9493 EGG ISLAND,BC
9495 SEYMOUR ARM,BC
9501 GLACIER NP ROGERS PASS,BC
9575 DONALD,BC
9606 VAVENBY,BC
9612 MOSLEY CREEK SAND CREEK,BC
9619 DOG CREEK A,BC
9628 100 MILE HOUSE,BC
9639 RIVERS INLET,BC
9652 BIG CREEK,BC
9702 WINEGLASS RANCH,BC
9719 CAPE ST JAMES,BC
9762 ALEXIS CREEK,BC
9770 BOSS MOUNTAIN,BC
9837 IKEDA BAY,BC
9861 OCEAN FALLS,BC
9918 ALEXIS CREEK TAUTRI CRK,BC
9954 TASU SOUND,BC
9986 SEWELL INLET,BC
10008 RED PASS JUNCTION,BC
10013 BARKERVILLE,BC
10076 MCBRIDE 4SE,BC
10141 TLELL,BC
10170 KEMANO,BC
10200 PORT CLEMENTS,BC
10210 DOME CREEK,BC
10216 OOTSA L SKINS L SPILLWAY,BC
10228 WISTARIA,BC
10229 KILDALA,BC
10242 FORT FRASER 13S,BC
10291 MASSET CFS,BC
10312 ALEZA LAKE,BC
10339 BURNS LAKE DECKER LAKE,BC
10350 SALVUS CAMP,BC
10382 BABINE LAKE PINKUT CREEK,BC
10384 FORT ST JAMES,BC
10398 PORT SIMPSON,BC
10411 TELKWA,BC
10441 TOPLEY LANDING,BC
10469 BABINE LAKE FISHERIES,BC
10486 HAZELTON TEMLAHAN,BC
10495 NASS CAMP,BC
10505 FORT BABINE,BC
10512 PINE PASS,BC
10531 PINE PASS MT LEMORAY,BC
10555 POUCE COUPE,BC
10589 HUDSON HOPE BCHPA DAM,BC
10590 HUDSON HOPE,BC
10593 PREMIER,BC
10610 BALDONNEL,BC
10617 FORT ST JOHN,BC
10662 WONOWON,BC
10665 INGENIKA POINT,BC
10705 BEATTON RIVER A,BC
10706 WARE,BC
10711 TODAGIN RANCH,BC
10797 CASSIAR,BC
10804 PLEASANT CAMP,BC
10808 HAINES APPS NO 2,BC
10813 GRAHAM INLET,BC
10823 LOWER POST,BC
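A list like the one above can be pulled out by name matching; here is a minimal sketch in R, with the caveat that the separator and column layout of SITE_DETAIL.TXT are assumptions, so adjust to the actual file:

    # Grab stations whose name ends in "BC" from the single-valued site list.
    site <- read.delim("SITE_DETAIL.TXT", header = FALSE, sep = "\t",
                       comment.char = "%", strip.white = TRUE,
                       stringsAsFactors = FALSE)
    names(site)[1:2] <- c("Station_ID", "Station_Name")   # assumed column order
    bc <- site[grepl("BC$", site$Station_Name), c("Station_ID", "Station_Name")]
    nrow(bc)
    head(bc)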
Malahat BC 2002 Dec
Only a 17.89 C difference.
EC 2002-12:
MALAHAT |48.575 |-123.530 |BC | 4.1| 1| | 10.6| 0| -2.7| 1| | | | 3.0| 23| | | 1| | | | 416.5| 0.0|1014820
http://climate.weatheroffice.gc.ca/prods_servs/cdn_climate_summary_report_e.html?intMonth=12&intYear=2002&prov=BC&txtFormat=text&btnSubmit=Submit%22
BEST:
7973 1 2002.958 -13.790 0.0050 31 -99
Malahat June 2000
EC: 15.9
BEST: 14.5
And even more …
Feb 2003
BEST -4.45
EC 4.1
I’ll give up now. And assume more errors. And that was one station for 1995 to 2011 data.
As for location data …
I was going to try and match EC to BEST and I used “Pleasant Camp” as my first attempt.
EC: 59.450 -136.367
BEST: 59.45000 -136.37000
Google seems more correct at: 59° 27′ 14″ N, 136° 21′ 52″ W
Here is what you can check.
1. Download all the source datafiles (links are given).
2. Compare the source datafiles with their common-format versions.
3. Check to see that the sources are merged correctly in the multivalue file.
4. Check to see how these get combined into single-value files.
5. Compare the 4 versions of single-value files.
The BEST database is a compilation of all the source data that is openly available. It brings together disparate data sources, each with its own unique format. It identifies duplicate records, it applies (or not) the source quality flags, and it applies its own quality test for outliers.
What it doesn’t do: it cannot check on a source’s source.
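As a rough illustration of step 5 in the checklist above, assuming the single-valued releases sit in separate directories and share the column layout sketched earlier (both assumptions):

    # Compare two single-valued versions of the data month by month.
    cols  <- c("id", "series", "date", "temp", "unc", "n", "flag")
    v_raw <- read.table("raw/data.txt",                col.names = cols, comment.char = "%")
    v_qc  <- read.table("quality_controlled/data.txt", col.names = cols, comment.char = "%")
    m <- merge(v_raw[, c("id", "date", "temp")],
               v_qc[,  c("id", "date", "temp")],
               by = c("id", "date"), suffixes = c("_raw", "_qc"))
    summary(m$temp_qc - m$temp_raw)   # distribution of differences introduced by QC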
Quote, Steven Mosher: “What it doesn’t do: it cannot check on a source’s source.”
So you could be reproducing fakery without knowing it.
So the answer to my question, “How can we be sure we are not dealing with more fakery?”, is “You don’t know!”
Without checking quality controls at the source, we do not know which of the 14 datasets, or which parts of them, can be trusted.
We have come full circle. We are back at the same point where BEST first started this quest.
Lessons to be learnt: 1. Mixing the bad with good gives only more bad. 2. Only the good will do.
Of course. One never knows in a way that will satisfy cranks. The moon landing could be fake, etc., Obama’s birth certificate, blah blah blah.
All sorts of conspiratorial crap. I have never seen a single challenge to a source’s source that didn’t depend on a source itself.
When you say MORE fakery you have to establish that there is any fakery.
Too bad we don’t hear much about the raw data. As I’m sure you’re aware, Steven, there’s no warming to be found in the instrument record without pencil whipping the living shite out of it.
My favorite is the one where everyone ostensibly woke up in the morning in 1970 and decided they’d start recording the min-max readings in the afternoon instead of in the morning, and thus the TOB (time of observation bias) was born. That adjustment alone produced half the 20th century’s warming. Making it worse, the adjustment is universally applied as an invented offset from midnight. The invention was made by cherry picking (er, selecting) 190 stations with hourly recordings and calculating the average difference in mean daily temperature for every hour of the day used for min/max. In effect this makes 50% of 20th century “warming” dependent on just 190 pre-selected stations with hourly recordings. Isn’t that precious?

The other half of the warming is produced by SHAP (Station History Adjustment Program). This complex adjustment is a hodgepodge of assumptions so tangled that probably no one person understands everything in it anymore. It adjusts for station location moves and/or changes in station environment. In principle it sounds good when, say, a change in elevation occurs that would change station temperatures according to the lapse rate, but even there it starts getting into real pencil-whipping territory, because which lapse rate do you use – dry or saturated? So they then start torturing the adjustment with considerations for rainfall rate.
There’s some lovely graphs of how each of the series of adjustments changes the raw data here:
http://www.ncdc.noaa.gov/oa/climate/research/ushcn/ushcn.html
Anyone who uncritically accepts this pencil whipping isn’t playing with a full deck. And the bottom line remains that while man-made warming is indeed due to anthropogenic carbon it isn’t the carbon in fossil fuels. It’s the carbon in the lead pencils used to whip the raw data into the proper shape to fit the pet theory.
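For readers who have not seen how such an adjustment is estimated, here is a toy sketch in R with synthetic hourly data. It is not the NOAA procedure or its station selection, just the window-shifting idea: compute the mean of (Tmin+Tmax)/2 over 24-hour windows that reset at a given observation hour and compare it with the midnight-to-midnight value:

    set.seed(1)
    hours   <- 0:(24 * 365 - 1)                              # one year of hourly data
    diurnal <- 8 * sin(2 * pi * ((hours %% 24) - 8) / 24)    # diurnal cycle peaking near 2 pm
    weather <- as.numeric(arima.sim(list(ar = 0.95), n = length(hours), sd = 0.5))
    temp    <- 10 + diurnal + weather                        # synthetic hourly temperatures
    mean_minmax <- function(obs_hour) {
      window <- floor((hours - obs_hour) / 24)               # 24-h windows between resets
      mean(tapply(temp, window, function(x) (min(x) + max(x)) / 2))
    }
    tob <- sapply(0:23, mean_minmax) - mean_minmax(0)        # bias relative to midnight reset
    round(tob, 2)

Whether 190 hourly stations are enough to characterise that bias for the whole network is exactly the question being argued here.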
I really like what they’re doing here, we’re finally getting somewhere close to a ‘good’ data set.
The inclusion of differently analysed data was a joy to see (from my background); if I can find the time (which at the moment is scarce in the extreme), I’ll look forward to going through it in more detail.
Keep in mind that even a “good” dataset is severely limited temporally and spatially. The only real global temperature record we have that passes the giggle test for spatial coverage adequate to call it ‘global’ is the satellite record, which began in 1979. As far as we can determine from it, nothing unusual or unprecedented has happened. If the average temperature had kept increasing by 0.2C per decade since 1998 it would be one thing, but when, as of now, the satellite record shows 20 years of warming followed by 15 years of holding steady, a reasonable person reassesses the notion that CO2 was the major driver of those first two decades in the satellite record.
Hi Judith Curry, I was informed that the RUTI project has been discussed to some degree here on this blog.
Some argue that RUTI can’t be replicated.
This is nonsense.
The stations used by RUTI are there for all to see, just like Hadcrut etc.
Then some argue that because the RUTI criteria for choosing stations are not defined by a general sharp definition, we can’t use RUTI.
What are Hadcrut’s criteria for choosing 87 temperature stations in the USA with an average of 1.3 million people? What is Hadcrut’s clear definition for choosing only 10 officially rural stations in the USA when many hundreds are available?
First of all:
If all unadjusted temperature data were fully available, it would be much more relevant to “demand” one general rule for how to choose data valid worldwide. Such a demand shows that even sceptics sadly have no idea what we are up against (!)
Reality is that we have a SCARCE, CHERRY-PICKED, CUT-DOWN pile of sometimes adjusted “unadjusted” data, so what we simply have to do is explore each area of the globe manually (!!) and find out what is going on. There is NO simple definition when facing a corrupted dataset.
How one has to play “Sherlock Holmes” to recover data is very clearly shown here, where I restore the real NW Europe temperature trend from data supposed to hide it:
http://hidethedecline.eu/pages/ruti/europe/nw-europe-and-de-bilt.php
In many areas, the situation is better than in Europe; for example, Zambian data are more complete than the German datasets…
The great advantage of RUTI is that the workings, area for area, are there for all to see and judge for themselves. This makes RUTI the obvious choice compared with any other ground-based temperature source.
And by the way, it is simply wrong that there is no explanation of how RUTI evaluates rural stations; those who say so haven’t checked it out.
UHI.
In RUTI the UHI approach is explained in the general introduction:
http://hidethedecline.eu/pages/ruti.php
Here a part of the introduction:
“RUTI is not all rural nor all unadjusted. However, RUTI is a temperature index aiming to use still more rural data (less use of city and airport data), still more unadjusted data when available and reasonable.”
…
“Thus, the main criteria to evaluate if a temperature station is rural or not is to check out the position using google maps. It is the relative growth of a city that determines the UHI pollution for a temperature stations, not the absolute size of a city. Therefore for RUTI use, stations that are located outside urban area or at least do not show a temperature trend significantly different than the near by rural stations are preferred.
In many areas, rural data are scarce and to some degree we have to use some (sub-) Urban data.”
And Judith, RUTI shows area for area what stations are used and why. This is a golden deluxe transparency compared to the conventional data sources.
EX: Here I show Turkish data. I show that the whole area of Turkey has systematically been corrupted, since only a few large cities have data publicly available; thus I dismiss the whole country, and I have explained why:
http://hidethedecline.eu/pages/ruti/asia/turkey.php
EX: For Italy, not that many data series are available, but I show which stations are used, and that these data series mutually support each other:
http://hidethedecline.eu/pages/ruti/europe/central-mediterranean.php
K.R. Frank Lansner
PS: If you would like, I can make a presentation of RUTI on your site.
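For readers wondering what the rural cross-check in the quoted introduction amounts to in practice, here is a toy sketch in R with synthetic series. It is not RUTI’s actual procedure, just the idea of flagging a station whose trend departs from nearby rural stations:

    set.seed(2)
    yrs       <- 1950:2010
    rural     <- replicate(5, 0.010 * (yrs - 1950) + rnorm(length(yrs), sd = 0.3),
                           simplify = FALSE)                 # rural sites, ~0.010 C/yr trend
    candidate <- 0.030 * (yrs - 1950) + rnorm(length(yrs), sd = 0.3)  # possibly UHI-inflated site
    trend     <- function(x) unname(coef(lm(x ~ yrs))[2])    # linear trend, C per year
    excess    <- trend(candidate) - mean(sapply(rural, trend))
    round(excess, 3)                                         # ~0.02 C/yr above rural -> suspect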
Are there any plans to compare to the UAH data set where coincident? UAH provides data on three broad levels of the atmosphere:
1. The lower troposphere – TLT (originally called T2LT); includes the atmospheric mixed layer, roughly 1000 to 1500 meters (~1 mile) deep.
2. The mid-troposphere – TMT, up to ~12 miles.
3. The lower stratosphere – TLS.