by Judith Curry
Inside Higher Education (UK) has a lengthy article by Darrel Ince entitled “Systems failure”, with the subtitle “A scandal involving clinical trials based on research that was riddled with errors shows that journals, institutions and individuals must raise their standards.” An interview with Ince can be found here. There is discussion of this on two threads at Bishop Hill (here and here). The article and interview are very thought-provoking, with relevance to the context of the climate debate.
The article chronicles the publication of a 2006 paper on chemotherapy by a group of researchers at Duke University that was touted as a major breakthrough and stimulated several clinical trials. Two biostatisticians, Baggerly and Coombes, investigated the research, found major problems with the statistical analysis, and pointed this out to the Duke researchers. They corrected some of the small errors, but remained adamant that the core research was solid. Then follows a story whereby Baggerly and Coombes attempted to publish their concerns, with little success. The issue morphed into an ethical one when the Duke research was being used on cancer patients in clinical trials. Finally they managed to get their critique published in the Annals of Applied Statistics. This came to the attention of the National Cancer Institute. Duke responded but continued with the clinical trials. Baggerly and Coombes finally resorted to a FOIA request, Duke’s investigation found no problems, and on and on the saga went until finally there was an Institute of Medicine inquiry.
Excerpts from the article about what we should learn from this:
No one comes out of this affair well apart from Baggerly and Coombes, The Cancer Letter and the Annals of Applied Statistics. The medical journals and the Duke researchers and senior managers should reflect on the damage caused. The events have blotted one of the most promising areas in medical research, harmed the reputation of medical researchers in general, blighted the careers of junior staff whose names are attached to the withdrawn papers, diverted other researchers into work that was wasted and harmed the reputation of Duke University.
What lessons should be learned from the scandal? The first concerns the journals. They were not incompetent. Their embarrassing lapses stemmed from two tenets shared by many journals that are now out of date in the age of the internet. The first is that a research paper is the prime indicant of research. That used to be the case when science was comparatively simple, but now masses of data and complex programs are used to establish results. The distinguished geophysicist Jon Claerbout has expressed this succinctly: “An article about computational science in a scientific publication isn’t the scholarship itself, it’s merely advertising of the scholarship. The actual scholarship is the complete software development environment and the complete set of instructions used to generate the figures.”
Baggerly and Coombes spent a long time trying to unravel the Duke research because they had only partial data and code. It should be a condition of publication that these be made publicly available.
The second tenet is that letters and discussions about defects in a published paper announcing new research have low status. Journals must acknowledge that falsifiability lies at the heart of the scientific endeavour. Science philosopher Karl Popper said that a theory has authority only as long as no one has provided evidence that shows it to be deficient. It is not good enough for a journal to reject a paper simply because it believes it to be too negative.
Journals should treat scientists who provide contra-evidence in the same way that they treat those putting forward theories. For an amusing and anger-inducing account of how one researcher attempted to have published a comment about research that contradicted his own work, see “How to Publish a Scientific Comment in 1 2 3 Easy Steps“.
The second lesson is for universities. University investigations into possible research irregularities should be conducted according to quasi-legalistic standards. In his evidence to the Institute of Medicine inquiry, Baggerly stated that he and Coombes had been hindered by the incompleteness of the Duke review – specifically in that the university did not verify the provenance and accuracy of the data that the researchers supplied to the review, did not publish the review report, did not release the data that the external reviewers were given and withheld some of the information Baggerly and Coombes had provided to the review.
The university’s explanation for not passing on the new Baggerly and Coombes material was a “commitment to fairness to the faculty” and a senior member of the research team’s “conviction and arguments, and in recognition of his research stature”. A similar argument in a court of law would not have been allowed.
The third lesson is for scientists. When research involves data and computer software to process that data, it is usually a good idea to have a statistician on the team. At the “expense” of adding an extra name to a publication, statisticians provide a degree of validation not normally available from the most conscientious external referee. Indeed, the statistics used might merit an extra publication in an applied statistics journal. Statisticians are harsh numerical critics – that’s their job – but their involvement gives the researcher huge confidence in the results. Currently the scientific literature, as evidenced by the major research journals, does not boast any great involvement by statisticians.
A fourth lesson from the Duke affair concerns reproducibility. The components of a research article should be packaged and made readily available to other researchers. In the case of the Duke study, this should have included the program code and the data. This did not happen. Instead, Baggerly and Coombes spent about 200 days exploring the partial materials provided to conduct their forensic investigation. In a pre-internet, pre-computer age, packaging-up was less of an issue. However, the past decade has seen major advances in scientific data-gathering technologies for which the only solution is the use of complex computer programs for analysis.
A number of tools are now being developed for packaging up research. Among the best is Sweave, a software system that combines an academic paper, the data described by the paper and the program code used to process the data into an easily extractable form. There are also specific tools for genetic research, such as GenePattern, that have friendlier user interfaces than Sweave.
What is worrying is that more scandals will emerge, often as a result of the pressure on academics, who are increasingly judged solely on the volume of their publications (some systems even give an academic a numerical rating based on paper citation) and their grants, and on how patentable their work may be. Our universities are ill-prepared to prevent scandals happening or to cope with the after-effects when they do happen. There is a clash here between collegiality and the university as a commercial entity that needs to be resolved.
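The packaging idea Ince describes is easy to sketch. Below is a minimal illustration in Python (a generic hash-manifest scheme chosen here for illustration; it is not how Sweave itself works — Sweave is an R/LaTeX literate-programming tool): every artifact of a paper (manuscript, data, code) is hashed, so a later reader can confirm they hold exactly the files the authors used.

```python
import hashlib


def build_manifest(artifacts):
    """Map each artifact name to the SHA-256 digest of its content.

    artifacts: dict of file name -> bytes.
    """
    return {name: hashlib.sha256(data).hexdigest()
            for name, data in artifacts.items()}


def verify(artifacts, manifest):
    """True only if every artifact matches its recorded digest."""
    return build_manifest(artifacts) == manifest


# The full "research compendium" for a hypothetical paper:
package = {
    "paper.tex":   b"\\documentclass{article} ...",
    "data.csv":    b"sample,response\n1,0.42\n",
    "analysis.py": b"# code exactly as run to generate the figures\n",
}
manifest = build_manifest(package)

assert verify(package, manifest)        # untouched package checks out
package["data.csv"] = b"sample,response\n1,0.99\n"
assert not verify(package, manifest)    # any silent edit is detected
```

A real compendium would also pin software versions and record the exact commands used to produce each figure; automating that bookkeeping is roughly what tools like Sweave provide.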
About Darrel Ince
Darrel Ince is professor of computing at the Open University. His web site is here. He is writing an account of the Duke University affair with the title The Cracks in Science. Ince also wrote an article on Climategate in the Guardian.
The Times Higher Education Supplement has an accompanying editorial, also quite good, with the title “To get to the truth, open up” and subtitle “More transparency from scientists, journals and institutions would go a long way to ensuring that flawed research is quickly detected.” An excerpt:
So what lessons can be learned?
We may struggle to change human nature, but we ought to be able to ensure that journals, as Professor Ince says, “acknowledge that falsifiability lies at the heart of the scientific endeavour” – they must be less quick to dismiss challenges to their published papers and more willing to admit mistakes.
Duke itself has acknowledged that in work involving complex statistical analyses, most scientists could benefit from a little help from the statistics department before publishing.
Professor Ince goes a step further, arguing that all elements of all the work (in the Duke case, the full raw data and relevant computer code) should be made publicly available so that others can replicate or repudiate the findings.
In this age of information and the internet, that can’t be too difficult, can it?
JC comments. At the heart of this are the rather extreme disincentives for researchers to admit and correct mistakes; this needs to change. But the system will be self-correcting with greater transparency (public availability of raw data and computer code). Some of the parallels to recent episodes in climate science are striking. The interesting thing about this example is that it wasn’t just an academic dispute with the egos of researchers at stake: the research impacted chemotherapy decisions with life-and-death consequences. The climate dispute was mostly academic prior to the 1990s, but now it is at the heart of the international energy policy debate. The single most important thing that could be done is institutionalizing the requirement for complete transparency of data, methods, and code.
Sigh. Can we FedEx bound copies of this to appropriate individuals and institutions?
I agree, Tom.
It appears that the scientific community may have been used to destroy the USA’s dominant position in the world by getting it to focus its funds and resources on the impossible task of stopping climate change.
Why? I don’t know.
But I am intrigued by the fact that the rise of the AGW propaganda roughly corresponds with the fall of the Berlin Wall in 1989 and Nikita Khrushchev’s earlier warning that “We will bury you!”
With kind regards,
Oliver K. Manuel
Former NASA Principal
Investigator for Apollo
Major Mistakes Not Acknowledged
1. Solar wind data from Apollo lunar soils showed Earth’s heat source is NOT a ball of hydrogen.
Data from the 1995 Galileo Probe of Jupiter confirmed: The Sun is a plasma diffuser that sorts atoms by mass and covers the photosphere with lightweight elements: 91% H, 9% He.
Now four decades after Apollo, the current science news story: “Scientists surprised by solar wind data from the Genesis mission.”
2. Nuclear rest mass data revealed neutron repulsion as the most powerful source of nuclear energy – an energy source that causes violent eruptions and releases hydrogen (H) as a waste product.
Advocates of AGW ignored nuclear rest mass data and the information they contain about Earth’s violently unstable source of heat.
Today’s surprising science news stories:
“Crab Nebula’s gamma-ray flare mystifies astronomers”
“Fermi telescope spots ‘superflares’ in the Crab Nebula”
“Crab nebula: The crab in action & the case of the dog that did not bark”
Earth’s climate is controlled by a neutron star in the core of the Sun. It also behaves like “the dog that did not bark”.
William Herschel noted the link between sunspots and agricultural production in the 1800s. Carrington confirmed the Sun’s impulsive behavior in September 1859 – when Earth was totally engulfed by a solar eruption.
These mistakes expose that world leaders and their army of government-funded climatologists are totally powerless compared to the forces that control Earth’s climate.
1. “Solar abundances of the elements”, Meteoritics 18, 209-222 (1983).
2. “Isotopic ratios in Jupiter confirm intra-solar diffusion”, Meteoritics 33, A97, 5011 (1998).
3. “Neutron repulsion confirmed as energy source”, Journal of Fusion Energy 20, 197-201 (2001); “Super-fluidity in the solar interior: Implications for solar eruptions and climate”, Journal of Fusion Energy 21, 193-198 (2002).
4. Stuart Clark, “The Sun Kings: The Unexpected Tragedy of Richard Carrington and the Tale of How Modern Astronomy Began” [Princeton University Press, 2007] 211 pages.
Presumably the same Professor Ince who submitted evidence to the Sci/Tech committee at the Houses of Parliament re Climategate/CRU
I am Professor of Computing at the Open University and the author of 18 books and over a hundred papers on software topics. My submission to the committee is an expanded version of an article that I wrote for the Guardian and was published on 5th February 2010.
4. One of the spin-offs from the emails that were leaked from the Climate Research Unit at the University of East Anglia is the light that was shone on the role of program code in climate research. There is a particularly revealing set of emails that were produced by a programmer at UEA known as Harry ReadMe. The emails indicate someone struggling with undocumented, baroque code and missing data which forms part of one of the three major climate databases used by researchers throughout the world.
5. A number of climate scientists have refused to publish their computer programs; what I want to suggest is that this is both unscientific behaviour and, equally importantly, ignores a major problem: that scientific software has got a poor reputation for error.
6. There is enough evidence for us to regard a lot of scientific software with worry. For example, Professor Les Hatton, an international expert in software testing resident in the Universities of Kent and Kingston, carried out an extensive analysis of several million lines of scientific code. He showed that the software had an unacceptably high level of detectable inconsistencies. For example, interface inconsistencies between software modules occurred at the rate of one in every 7 interfaces on average in the programming language Fortran, and one in every 37 interfaces in the language C. This is hugely worrying when you realise that just one error – just one – will often invalidate a computer program. What he also discovered, even more worryingly, is that the accuracy of results declined from 6 significant figures to 1 significant figure during the running of programs.
7. Hatton and other researchers’ work indicates that scientific software is often of poor quality. What is staggering about the research that has been done is that it examines scientific software that is commercial: produced by software engineers who have to undergo a regime of thorough testing, quality assurance and a change control discipline known as configuration management. Scientific software developed in our universities and research institutes is often produced by scientists with no training in software engineering and with no quality mechanisms in place, and so, no doubt, the occurrence of errors will be even higher. The Climate Research Unit Harry ReadMe files are a graphic indication of such working conditions.
8. Computer code is also at the heart of a scientific issue. One of the key features of science is deniability: if you erect a theory and anyone produces evidence that it is wrong, then it falls. This is how science works: by openness, by publishing minute details of an experiment, some mathematical equations or a simulation; by doing this you embrace deniability. This does not seem to have happened in climate research. Researchers have refused to release their computer programs – even though they are still in existence and not subject to commercial agreements. One example is Professor Mann’s initial refusal to give up the code that was used to construct the hockey stick model that demonstrated that human-made global warming is a unique artefact of the last few decades. (He has now released all his code.)
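Hatton’s finding that results degraded from six significant figures to one has a familiar mechanism: cancellation and rounding inside a long computation. The sketch below (a textbook example of catastrophic cancellation, chosen here for illustration — it is not Hatton’s actual test material) solves x^2 − 1e8·x + 1 = 0 two ways; the formulas are algebraically identical, yet one loses almost all precision in the small root.

```python
import math


def small_root_naive(a, b, c):
    # Textbook quadratic formula: fine on paper, but it subtracts
    # two nearly equal numbers when b*b >> 4*a*c.
    d = math.sqrt(b * b - 4 * a * c)
    return (-b - d) / (2 * a)


def small_root_stable(a, b, c):
    # Rearranged so no cancellation occurs: compute the large root
    # first, then recover the small one from the product c/a.
    d = math.sqrt(b * b - 4 * a * c)
    q = -0.5 * (b + math.copysign(d, b))
    return c / q


a, b, c = 1.0, -1e8, 1.0   # true small root is ~1e-8

# The naive result retains roughly one significant figure;
# the stable one is correct to near machine precision.
assert abs(small_root_naive(a, b, c) - 1e-8) / 1e-8 > 0.01
assert abs(small_root_stable(a, b, c) - 1e-8) / 1e-8 < 1e-10
```

A single rearrangement like this, buried in a few million lines of undocumented Fortran or C, is exactly the kind of defect no referee reading only the paper could ever catch.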
Barry, thanks for spotting this.
Check DEVIL’S KITCHEN ( http://www.devilskitchen.me.uk/2010/10/is-it-in-their-nature-to-lie.html ), “Is it in their Nature to lie?”, for much more on the HARRY_READ_ME.txt files, in which “…your humble Devil actually delves into the methods and programming of said software” – the heart of this post.
I think you might agree that the title of this topic might well be “On avoiding, detecting, admitting and correcting mistakes”.
To that end, let me suggest the following thought-starters:
1) One proceeds from documented requirements.
2) Data base analysis and design is a sub-process within the overall development process.
3) Raw, original data may be revised to correct documented entry errors, but are not updated with the results of processing raw data (reanalysis, homogenization, adjustments, averaging, etc.).
4) Processing Results are kept in separate, designed database(s). Each Processing Result is distinctly identified by name, purpose and version.
5) Each Processing Result refers to the versioned Query and Parameters that extract its input from the database of raw data. The Query (queries) (code) is/are referenced by the Processing Result.
6) Each Processing Result refers to the versioned Processing Procedures and Code used to produce the results.
7) The Output of the Processing Result is referenced by the Processing Result.
So, in the spirit of Dr. Curry’s blog, I invite all to hammer away at this idea until it makes sense :-) .
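In that spirit, here is one way points 4–7 above might look in practice (a sketch with hypothetical names, not a reference to any existing system): each Processing Result carries versioned references to the query that extracted its input, the code that produced it, and its output, while the raw data are never overwritten.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class VersionedRef:
    """A named, versioned pointer to a stored artifact
    (a query, a processing procedure, or an output data set)."""
    name: str
    version: str


@dataclass(frozen=True)
class ProcessingResult:
    name: str                 # distinct identity (point 4)
    purpose: str
    version: str
    input_query: VersionedRef  # query + parameters over raw data (point 5)
    procedure: VersionedRef    # processing code used (point 6)
    output: VersionedRef       # where the derived data live (point 7)


# A hypothetical homogenization run, kept separate from the raw data:
result = ProcessingResult(
    name="gridded_anomalies",
    purpose="homogenized monthly anomalies",
    version="2.1",
    input_query=VersionedRef("select_stations_1900_2000", "1.3"),
    procedure=VersionedRef("homogenize.py", "4.0"),
    output=VersionedRef("anomalies_grid", "2.1"),
)

assert result.procedure.version == "4.0"  # every step traceable by version
```

With records like these, answering a FOIA request, or an auditor, reduces to handing over the referenced artifacts rather than reconstructing what was run years later.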
I understand the desire to see the code of someone else’s GCM, but I absolutely do not understand why a developer would ever provide that information. If I had developed a climate model that could accurately forecast (+/- 10%) temperature and rainfall at a local/regional level I would certainly protect that code to the maximum extent possible.
The truth is there is no such GCM today, but that does not mean that one will not exist at some time in the future. If I have written a model, why should it be publicly disseminated? It is proprietary information and has value in the marketplace.
If it’s funded and developed by a commercial entity for commercial purposes, then I agree.
If it’s being funded and developed with public funding for public purposes, then no dice.
I agree with your assessment. If the government pays for the development, then they own the code. That does not mean it would be made public however.
It may have value in the market place. But just because you wrote it does not mean it is yours. It is most likely that it belongs to the entity that paid you to write it. Once you have taken their shilling, you do not have control over how it is used.
And so, if you choose to work on the government’s dime, your work belongs to the government, not to you.
However much you may like to think yourself on a higher intellectual plane, as an academic, you are in many respects just another tradesman plying for hire by results/time. Don’t kid yourselves that there’s anything too special about it.
For me, it would be great if everything was open source, open access and universally available. Often, though, a model will be funded over many stages by many different organisations and could involve buying in data, which IS proprietary, so ownership and dissemination is not necessarily straightforward.
“as an academic, you are in many respects just another tradesman plying for hire by results/time”
this sounds like Marx to me!
Universities (particularly those in the USA) have simple rules about creating spinoffs and intellectual property, so it will be easy to assess on a case by case basis if the academic is able to exploit the value of the research they produce – some are special. There should certainly be incentives for academics to produce commercially valuable work, especially if there are commercial returns to the university AND the government, unless you have a socialist view of government funding?
Maybe the military should make its research open access if it is paid for by the government?
“Maybe the military should make its research open access if it is paid for by the government?”
Regrettably, you hold the view of skeptics as enemies rather than as scientists assisting in the clarification of climate hypotheses!
NO – I hold the view that scepticism is crucial in understanding the complexity of science and asking the right research questions. I think that, for example, Steve McIntyre would be a really useful contact for CRU to have made use of, had it not been for a clash of personalities.
I am here to learn about interesting articles and from comments by people who work on climate related topics. I also view there to be idiots at both ends of the spectrum (the real enemy are the dogmatic, whatever they believe). I don’t hide my views on this topic.
My question seems a fair analogy – IF the basis of something being open is that it is paid for by tax dollars THEN what about other things tax dollars pay for that private companies make money from that are NOT in the open. If NOT, then there must be another reason.
all one needs to do is change the names of the interested parties and it tracks very well…..except for the inquiry
When climate inquires match that performed by the Institute of Medicine inquiry in the noted case, climate science in general might win back a bit more respect.
“When climate inquires match that performed by the Institute of Medicine inquiry in the noted case, climate science in general might win back a bit more respect.”
Yes and no. Over the long term, it is the only way that climate science has any hope of restoring credibility. Over the short term, such an inquiry process would likely wreak havoc on the careers and the findings of many of the biggest names in the field. One has to conclude that this is why they fight so hard to avoid any such meaningful inquiry.
There is a wide and deep valley which has to be crossed. Once crossed, progress and repaired reputation can begin. Unfortunately, rather than getting it done quickly, it looks like it will take a long, long time.
This reminds me, by way of contrast, of a comment by Pekka yesterday:
It has always been true that I learn most, while trying to explain the issues to others, and that is true both when I’m basically right from the beginning and when I have erred and must find, where the error is. In collaborative research that has been one of the most efficient methods of getting over obstacles.
The humility and sincere desire for knowledge expressed above is a far better indictment of the activity described in this post than any I could craft. As was noted, medical research and researchers have been tarred by this fiasco. The fact that lives were endangered because of it makes it all the more disgusting.
Pekka’s pretty cool.
All research data and work papers ought, in my view, be published to the cloud live, as it happens.
That way, all these questions just go away.
People can access the raw and finished work from day one, and reproducibility is assured, as well as the opportunity for full, complete and even-handed search for and correction of errors.
I’m fine with things being behind the curtain until the authors are ready to publish. Once that point’s reached, however, I’d love to see a system where publishing (via the web) is gate-less. The journals could then compete to review (pro or con) what’s out there.
I must disagree. Unfortunately many mistakes can be made prior to the point of publication. I have not read the background to the specific case mentioned but I am familiar with many of the requirements for valid medical research.
The researchers must be completely familiar with the research question, background, definition of the condition being investigated, methods and accuracy of the diagnostic measures used, natural history of the disease, advantages and disadvantages of the various treatment alternatives available, etc. For many decades there have been exhortations for researchers to consult professional statisticians and epidemiologists at the outset, in the planning stages, BEFORE any data collection, to ensure that the statistical design is appropriate, that sample sizes are adequate, that any confounding factors can be analysed correctly, and to exclude bias (as far as possible).

In medical research the investigational protocol must be spelled out in its entirety, setting out all the relevant details: methods of data collection, data storage and security, privacy, etc. Along with this, the ‘investigator’s brochure’ is also developed to ensure that all participants have uniform procedures and comply with legal requirements. The point of all this is that the investigational protocol is an extremely important document which needs to be critically evaluated, preferably by independent experts (e.g. the FDA or similar), and any criticisms answered in full before any further work is done on the project. This is necessary not only from an ethical perspective but also to ensure that appropriate resources are available to complete the project.

Given that all this is well known to the medical community, I cannot understand how things went so far off the rails in the case cited. It sounds like the work of amateurs. In my view the journals are not ‘off the hook’ either, because they have an editorial responsibility, a duty of care to ensure that appropriate critical review of the manuscript by relevant experts is carried out prior to acceptance for publication. Was this a case of ‘pal review’?
While I can understand the desire for researchers to protect their results prior to publication in general (which was the basis for what I stated), I can certainly concede the need for much greater oversight much earlier in the case of medical research.
It’s not a matter of researchers ‘protecting their results’; the emphasis is on ‘getting it right’, minimising the possibility of errors at every stage of the project (planning, design, ethics approval, resource allocation, recruitment, data collection, safety monitoring, data collation and analysis, etc.). It’s a process of asking “What have we forgotten? What could go wrong? Is there something additional we should/must do?” This is routinely expected of medical research nowadays. If it is not done properly the whole endeavour is wasted, because someone (a regulator, an independent expert in the same field, a statistician or an epidemiologist) will emerge with a devastating critique.
I think we’re in agreement…medical research is a special case.
“It has always been true that I learn most, while trying to explain the issues to others“.
It was ever thus – the less the person you are explaining it to is involved, the more valuable it is. It’s really very simple – by forcing you to re-organise your understanding of the situation in order to produce a coherent explanation, you will see the issue from a “different angle” and you will “stumble on” the answer you were looking for. Research or engineering, it matters not – this technique is just as valuable when fixing a car as it is when creating a new scientific theory.
Steven Mosher popped up in the Guardian comments section of the Climategate article mentioned in the above article, with a memorable quote
“….So, I take a hard hard line on this. If you dont freely release your data and freely release your code in all cases then I am not rationally bound to even consider your claims. you haven’t produced science, you’ve just advertised it. the real science, is not the paper describing the data, its not the words describing the algorithm. the real science is the data AS YOU USED IT and the code AS YOU RAN IT. To check your science in the most efficient way, we need the data as used and the code as ran.”
and whilst we have disagreed about many things, I’m in total agreement on this. This medical scandal is an almost exact parallel with pre- and post-Climategate behaviour by all the parties (institutions/journals, etc.)
It is true that this seems to happen on all sides of the debate and I’m with Steve Mosher on condemning it all, wherever it comes from.
“Scafetta claims Benestad and Schmidt have made many errors of their own. However, he is now refusing to give them the program code he used to allow them to try to replicate his results.”
I was just about to link to that as well…. ;)
as 2 wrongs do not make a right, so presumably you agree Phil Jones, Mann et all will now publish all their code/data as well?
Absolutely – I believe that quite a bit of this has been published.
From the article that I linked to ” Back in 2003, Michael Mann, now at Penn State University in University Park, initially refused to make available the data and code relating to the “hockey stick” graph. Now he is releasing all the data and code relating to his papers.”
Why did it take him 8 years?
And why is CRU still refusing to release data?
Why has Scafetta refused to release code?
Don’t know – but at least he doesn’t have a 13+ year track record for refusal. Yet.
Actually we had a thread on Scafetta’s latest paper in which he participated, he did address this issue, saying his results should be easily reproduced. Not a defensible argument, IMO.
Since Scafetta is also at Duke, one can only hope that Duke has learned some lessons in this regard from the chemotherapy episode cited in this post.
If I had to hazard a guess about why Scafetta refuses to release code, I’d say that he enjoys the fight. When we asked him for code over at CA he wanted to turn it into a game. Like he knew something we didn’t. The problem was that his description of his algorithm did not allow people to reproduce it. gavin tried, others tried. To put it mildly Scafetta was an ass. I am under no rational obligation to believe anything he writes. Until he coughs up code he will be one of those guys I can dismiss out of hand. He needs to show his work. A paper of his which describes work he claims to have done is nothing more than a bad advertisement. I’m not buying what he is selling. And no one here can give me a rational argument why I should accept claims he refuses to support with a code and data release.
My PO is that he is having fun showing how little his opponents actually know about the tools that are needed to do the work. Since they pass themselves off as experts, he is probably extracting exquisite joy from showing how limited they really are and humiliating them in the community!!!
“I am under no rational obligation to believe anything he writes. Until he coughs up code he will be one of those guys I can dismiss out of hand. He needs to show his work. A paper of his which describes work he claims to have done is nothing more than a bad advertisement. I’m not buying what he is selling. And no one here can give me a rational argument why I should accept claims he refuses to support with a code and data release.”
This shouldn’t be a personal decision that Mosher makes. This should be an absolute legal requirement for policymakers. In the US, we have Constitutional rights to cross-examine our accusers. In essence, our life, liberty, or property cannot be put in jeopardy in a criminal proceeding by secret testimony. The parallel to secret science is easy to see.
Unfortunately, policymaking doesn’t come with the same procedural rights for those whose lives, liberty and property can be jeopardized by its enactment. But the importance of the issue should be recognized by policymakers. It is immoral for governments to make decisions which have enormous consequences for the citizenry while denying those citizens an opportunity to cross-examine the supposed “science”.
why is CRU refusing to release data?
That is the wrong question. The right question is
“why is CRU using data that is covered by confidentiality agreements” WHEN
A. the amount of confidential data is SMALL
B. the addition of this data doesn’t change the answer
C. FOIA guidelines insist that confidential data should only be used if it is NECESSARY to the MISSION of CRU.
Basically, why is CRU asking for MORE trouble by using confidential data? Seems stuck on stupid.
They have yet to show that more than a tiny amount of the data is covered. WHY DIDN’T THEY RELEASE THE DATA NOT COVERED!!
I think the ClimateGate email from Phil Jones covers it quite well. Something about, I will destroy the data before giving it to them!! Why shouldn’t we believe him? Is he a LIAR or something??
Why didn’t they release the data not covered?
The only evidence we have is what they wrote: it would take too long to segregate the data.
FOIA stipulates that IF the responders have reason to believe the request will take more than 18 hours, they can deny the request. This is important. Why?
1. Because the complaint is often made that we harassed CRU. We did not. Every request that was deemed to take more than 18 hours was denied. I had one denied for this reason. I did not resubmit, because I did not want to overburden them. What this means, katman, is that they cannot have it both ways: since they denied Mc’s appeal for the open data using the 18-hour excuse, they cannot also argue that the other requests were burdensome. Burdensome requests were denied routinely.
2. It gives us some evidence that their database wasn’t constructed very well.
Finally, they have moved forward making open data available. The BUT doesn’t address my issue.
How much of a shambles does there need to be, for a researcher to take more than 18 hours to dig up some of his own work?
“And why is CRU still refusing to release data?”
The data are being invented by CRU, sorry, no such data at CRU now.
“Why did it take him 8 years?”
A lot of data, and each piece has to be consistent with the hockey stick. He was working very hard to get it out in eight years. Phil Jones’s CRU data will probably take a hundred years. MM is efficient compared with PJ.
Michael Mann is now “releasing all the data and code relating to his papers?” That must be news to Chris Horner and ATI. The article Martha is quoting is from December 2009. But a more recent article indicates that:
“On January 6 , the American Tradition Institute (ATI), along with state Delegate Bob Marshall (R-Prince William), presented UVA with a Freedom of Information Act (FOIA) request seeking essentially the same information Cuccinelli demanded last year in a civil subpoena: e-mails Mann sent to and received from 39 scientists and all of his assistants; all documents generated by five specified grants; and Mann’s computer algorithms, programs and source code.”
The last article I can find on the subject, dated April 14, 2011, indicates that UVA, receiving letters of “support” from the ACLU and other capitalistic drivers (oops, sorry, couldn’t help it), has still not complied with the FOIA request.
Why is anyone still asking for Mann’s “computer algorithms, programs and source code” if they were already released in 2009?
Thanks Louise. From my beginning in all of this, Jon Claerbout has been a hero of mine, primarily because he articulated a dream for science that I shared. Jon is still around and he does answer emails; perhaps Judith can contact him.
With a few small exceptions (like a GCM run), there is nothing that prevents us from making science articles into executable documents.
Mosh must have been agitated! “the code as ran” s/b “the code as run”. /GrammarNasty
The take-away I got immediately from Climategate was that the CRUde-Krew knew damn well that their data and analyses were junk that couldn’t stand even flickers of the light of day.
yup, pissed. You see for years I have had to put up with stupid arguments from smart men. Smart scientists who know damn well that the real science is the data and the code. Smart men who know the importance of passing on our collected knowledge to others. Smart men who make stupid arguments piss me the hell off. The sad thing is that there was nothing of consequence to hide.
“… that their data and analyses were junk that couldn’t stand even flickers of the light of day”
Phil Jones would rather destroy the data than release it – no such data available. And this is the “solid science” consensus; the politicians buy it, and most countries buy it as “solid” evidence.
Yet another example of the problems created when you (increasingly?) have capitalistic drivers in academia. It certainly makes me wonder why some folks want to move even more into privatized education.
Because public education teaches so much that’s not true?
A couple illustrative books – both older, both still applicable.
Lies My Teacher Told Me by James Loewen
Don’t Know Much About History by Kenneth Davis
Have you ever read the Second Amendment section of any American Civics text? AFAIK, there are NONE that teach the truth about that. If you find one, let me know.
Public schools and colleges are producing brainwashed statists.
I blame statist ideologue drones who infest academia and run climate blogs.
Why would anyone want more private education, over the wonderfully successful education our progressives provide for free?
Well… “The rate of public school students entering college after graduation has fluctuated between 62-67% in recent years…In private schools the matriculation rate is typically in the 90-95% range.”
Oh, and for minority students, those who owe so much to their progressive benefactors? Check out this modestly interesting graph that shows that 87.8% of Black students attending private schools go on to college, while only 38% of those attending public schools do.
I know correlation doesn’t equate with causation, but is it really a coincidence that Barack Obama, Bill Clinton, Al Gore, the Kennedys, and pretty much every progressive politician in the country who can afford to sends their children to private school?
Defending the abomination that is public elementary and secondary education in this country is only slightly more laughable than defending government funded CAGW science as examples of the comparative weakness of the free market.
What happens when statisticians look at it:
Are Private Schools Better Than Public Schools?
Appraisal for Ireland by Methods for Observational Studies
Hebrew university of Jerusalem, Israel, and University of Southampton, UK;
National Cancer Institute, NIH, Rockville, MD, USA.
In observational studies the assignment of units to treatments is not under control. Consequently, the estimation and comparison of treatment effects based on the empirical distribution of the responses can be biased, since the units exposed to the various treatments could differ in important unknown pretreatment characteristics which are related to the response. An important example studied in this article is the question of whether private schools offer better quality of education than public schools. In order to address this question we use data collected in the year 2000 by the OECD for the Programme for International Student Assessment (PISA). Focusing for illustration on scores in mathematics of 15-year-old pupils in Ireland, we find that the raw average score of pupils in private schools is higher than that of pupils in public schools. However, application of a newly proposed method for observational studies suggests that the less able pupils tend to enroll in public schools, such that their lower scores are not necessarily an indication of bad quality of the public schools. Indeed, when comparing the average score in the two types of schools after adjusting for the enrollment effects, we find quite surprisingly that public schools perform better on average. This outcome is supported by the methods of instrumental variables and latent variables, commonly used by econometricians for analyzing and evaluating social programs.
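The selection effect the abstract describes can be reproduced in a toy simulation. Everything below is invented for illustration: the numbers, the logistic enrollment rule, and the use of a simple Frisch–Waugh regression adjustment are my assumptions, not the paper’s data or method. The point is only the mechanism: the raw gap favours private schools purely because abler pupils enroll there, while adjusting for ability recovers the true (public-favouring) effect.

```python
import math
import random

random.seed(0)
N = 20_000

# Toy data-generating process (my invention, NOT the paper's data or model):
# latent ability drives both school choice and test score, and the *true*
# school effect favours public schools by 5 points.
ability, private, score = [], [], []
for _ in range(N):
    a = random.gauss(0.0, 1.0)
    p = 1.0 / (1.0 + math.exp(-(a - 0.5)))  # less able -> public school
    pr = 1.0 if random.random() < p else 0.0
    s = 500.0 + 40.0 * a + (0.0 if pr else 5.0) + random.gauss(0.0, 10.0)
    ability.append(a); private.append(pr); score.append(s)

def mean(xs): return sum(xs) / len(xs)

def cov(xs, ys):
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

# Raw gap: selection makes private schools look better.
priv_scores = [s for s, q in zip(score, private) if q]
pub_scores = [s for s, q in zip(score, private) if not q]
raw_gap = mean(priv_scores) - mean(pub_scores)

# Adjusted gap: regress ability out of both score and school choice
# (Frisch-Waugh), then compare what is left over.
def residuals(ys, xs):
    b = cov(xs, ys) / cov(xs, xs)
    a0 = mean(ys) - b * mean(xs)
    return [y - (a0 + b * x) for x, y in zip(xs, ys)]

rs, rp = residuals(score, ability), residuals(private, ability)
adjusted_gap = cov(rp, rs) / cov(rp, rp)

print(f"raw private-public gap:      {raw_gap:+.1f}")       # positive
print(f"adjusted private-public gap: {adjusted_gap:+.1f}")  # near -5
```

Here the raw comparison flips sign after adjustment, exactly the pattern the abstract reports for the Irish PISA data.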
However, application of a newly proposed method for observational studies suggests that the less able pupils tend to enroll in public schools, such that their lower scores is not necessarily an indication of bad quality of the public schools.
In other words, they managed to find a way to justify their conclusion that –
Indeed, when comparing the average score in the two types of schools after adjusting for the enrollment effects, we find quite surprisingly that public schools perform better on average.
Again – bull feathers.
The most likely criterion for private school enrollment isn’t the ability of the student, but the ability of the parents to pay the freight.
It’s in the Annals of Applied Statistics. You think you’re smart enough to take them on, write them a letter. Here.
My kids went to private school. They’re bright, but not significantly more so than their friends who went to public school.
My grandchildren go to private school. They’re bright, but not significantly more so than their friends who go to public school.
There are liars, damn liars – and statistics. And if you use the wrong criteria, your statistics will lie to you. Especially if they confirm your biases.
Not obvious to you? Your blindness isn’t my problem.
No vested interest there.
Move along, nothing to see.
That’s hilarious. The data shows Owen to be spouting cow manure, and so he responds: “Don’t see that I’m obviously (and without any evidence) right? That’s proof that you are blind,” and “your blindness isn’t my problem.”
Classic pseudoskeptic logic. I’m gonna borrow that, Mkay?
> [T]he less able pupils tend to enroll in public schools […]
Who would have guessed.
Who would have guessed.
Only those who have a vested interest in that being the case.
> Only those who have a vested interest in that being the case.
Only those who do not share our own guesses have a vested interest.
BTW gentlemen and ladies, I hope y’all understand that Joshua has successfully executed his purpose as a troll by diverting the purpose of this thread, even if only temporarily.
Joshua is not responsible for anyone’s comments but his own.
Didn’t say he was – I was just commenting that we fell for it.
Jim, some of you fell for it. His comment was so stupid I don’t know why you bothered to respond… now of course we digress.
Yup, you’re right. At best I would have left the troll alone. At worst I should have stuck with my first answer and left him alone. Not one of my better days.
Now – back to playing with the computer problems – hardware? software? both?
Both of my kids attended inner-city public schools. It ruined their lives! They’re gangsters! The only white girl in my son’s high school class invited the only white boy, him, to the prom. It was kind of cute.
He’s in medical school.
Good for him. And you. But does he know the meaning and purpose of the Second Amendment to the Constitution?
But then, somehow I seem to recall that that’s not on the agenda for you so it doesn’t need an answer.
So much for using statisticians. Lol.
There’s your answer Judy. Only statisticians who agree with Jim and tailor their work accordingly need apply.
Too funny, Jim.
My “purpose” as a troll, eh? Because I’m secretly plotting to divert “deniers/skeptics” from their efforts to defeat my goal of enforcing carbon taxes that kill millions?
Actually, my “purpose” was to make my point – that profitizing education has negative ramifications. In terms of elementary schools through high schools, numerous studies show that when you control for the SES variables, funding, % of special needs kids, etc., private schools do no better (if not worse) than public schools. Despite rhetoric to the contrary, our public schools compete well against the schools of any other country where student populations are below a 20% poverty rate.
As for “higher” education, from an inside perspective at multiple universities, I have seen how a focus on financials has watered down academic focus and prioritized marketing; sacrificing academics for the sake of rankings, sacrificing academics to spend on facilities so as to attract tuitions, etc. I have seen how a focus on research outweighs a focus on delivering educational services to students. The obsession with prioritizing publishing over all (while paying lip service to teaching) has educational ramifications.
Here’s a nice interview about for-profit universities:
For profit charter schools do no better on average than traditional public schools.
There is no reason why privatizing education, and research driven by profit motivations, wouldn’t have the same kinds of flawed results as research by pharmaceutical companies pushing their products, companies that hide the toxicity of their products for profit, or companies that attract investors by misleading financial statements.
And Jim – if you wish not to respond to my posts, feel free to act accordingly.
To write that more clearly: “Despite rhetoric to the contrary, our public schools where student poverty rates are below 20% compete well against the schools of any other country.”
And none of this is on topic for this thread.
I believe you meant to link to this:
Tell us what the topic of the thread is, Jim.
Here’s a nice interview about universities in general:
“Higher ed, he said, was an elaborate scheme to deprive young people of their freedom of thought. He compared four years of college to a lab experiment in which a rat is trained to pull a lever for a pellet of food. A student recites some bit of received and unexamined wisdom—’Thomas Jefferson: slave owner, adulterer, pull the lever’—and is rewarded with his pellet: a grade, a degree, and ultimately a lifelong membership in a tribe of people educated to see the world in the same way.
‘If we identify every interaction as having a victim and an oppressor, and we get a pellet when we find the victims, we’re training ourselves not to see cause and effect,’ he said. Wasn’t there, he went on, a ‘much more interesting . . . view of the world in which not everything can be reduced to victim and oppressor?’
This led to a full-throated defense of capitalism, a blast at high taxes and the redistribution of wealth, a denunciation of affirmative action, prolonged hymns to the greatness and wonder of the United States, and accusations of hypocrisy toward students and faculty who reviled business and capital even as they fed off the capital that the hard work and ingenuity of businessmen had made possible. The implicit conclusion was that the students in the audience should stop being lab rats and drop out at once, and the faculty should be ashamed of themselves for participating in a swindle—a ‘shuck,’ as Mamet called it.”
I’m betting Joshua got lots and lots of pellets.
Go to the top of the page – it’ll tell you.
Jim – the “topic” of the thread comprises the subjects of the posts that people write. I wrote one post. In the subsequent thread, you wrote many. In reality, you contributed to the “topic” of the thread far more than I did. If it diverged from the focus of Judy’s post, you contributed far more than I. I’m a little surprised at you. Most people of what seems to be your political persuasion often speak about “personal responsibility.” I have no idea why you’d seek to blame me for posts that you have written.
Judy’s post was about the pressures that corrupt academia and academic research. My post was on that topic. Part of what corrupts academic processes is a pressure to publish, and the financial drivers that are prioritized in academia today are very much related to that pressure to publish.
Thanks Willard – that was the interview I wanted to link.
I had been a teacher in both public and private schools.
In the public school my job was mainly behavior management and there was little education. In the private school, it was the opposite. In the public school, the students were coming to class to socialize. In the private school, they came to learn.
The problem as I saw it was that the parents of the private school students, and as a result their children, value education, so these students don’t disrupt the class. It was the opposite in the public school, where only a couple of students were enough to disrupt the whole class. I fled teaching within a year after I saw this reality.
If you are a parent, send your children to private school.
I have taught in both public and private schools also, and my experiences do not match yours.
One of the public schools I taught in was in an upper class community. You would be hard pressed to find a more academically oriented parent population anywhere else in the country. The school graduates a very high percentage of students that attend the highest ranked colleges in the country. One of the private schools I taught in was a very expensive, very exclusive school. Many of the students there were extremely passive about their education – with an attitude of privilege and entitlement that worked in direct opposition to their academic development.
There are clear correlations between SES and academic performance in our schools. Those correlations can sometimes be seen reflected in differences between public and private schools. The causal relationship between SES and school performance is complicated, but it isn’t the fact of being public or private that creates those differences. Some valiant efforts are being made to address those correlations effectively. Unfortunately, in order to do so they have to overcome facile and counterproductive generalizations about public and private schools and the ignorance they foster.
I’ve taught in both private and public schools. So fire up a link to a scientific study; I’ll go download the code and data and see if it computes. Otherwise, your babbling about your personal experience or things you’ve read is just advertisements. Wish I had a Tivo.
That is the topic. I couldn’t care less why people have come to believe that papers without code and data are science. Fix the problem, not the blame.
JCH already cited this above:
Sorry, but that advertisement does not contain a reproducible result. There are two options:
1. A Sweave document which can be executed.
2. An operational link to the actual data as used and the code as run.
GIYF (try sweave).
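As a sketch of what option 2 might look like in practice (in Python rather than Sweave’s R, with a made-up three-row dataset inlined so the example is self-contained): the released script fingerprints the exact data it consumed, so a reader can verify they are rerunning “the data as used and the code as run” before comparing results.

```python
import csv
import hashlib
import io
import statistics

# Hypothetical mini "release": in a real paper this CSV would live at a
# permanent URL alongside the code; here it is inlined so the sketch runs.
raw = "station,anomaly\nA,0.12\nB,-0.05\nC,0.31\n"

# 1. Fingerprint the data *as used*, so readers can confirm they have
#    byte-identical inputs before rerunning the analysis.
digest = hashlib.sha256(raw.encode()).hexdigest()

# 2. The analysis itself: trivially, the mean anomaly.
rows = list(csv.DictReader(io.StringIO(raw)))
mean_anomaly = statistics.mean(float(r["anomaly"]) for r in rows)

print(f"data sha256: {digest[:16]}...")
print(f"mean anomaly: {mean_anomaly:.4f}")
```

Anyone who downloads the archive can recompute the checksum and the number; if either differs, the result has not been reproduced.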
“The term “reproducible research” is newer, and I don’t know who coined it. The basic idea is simple. It’s the scientific ideal.
Research should be reproducible. Anything in a scientific paper should be reproducible by the reader.
Whatever may have been the case in low-tech days, this ideal has long gone. Much scientific research in recent years is too complicated and the published details too scanty for anyone to reproduce it.
The lack of detail is not entirely the author’s fault. Journals have severe “page pressure” and no room for full explanations.
For many years, the only hope of reproducibility has been old-fashioned person-to-person contact. Write the authors, ask for data, code, whatever. Some authors help, some don’t. If the authors are not cooperative, tough.
Even cooperative authors may be unable to help. If too much time has gone by and their archiving was not systematic enough and if their software was unportable, there may be no way to recreate the analysis.
Fortunately, the internet comes to the rescue. No “page pressure” there!
Nowadays, many scientific papers also point to “supplementary materials” on the internet, either at the journal’s or the author’s web site. It doesn’t matter so long as the material is permanently available. Data, computer programs, whatever should be there.
But even more, the entire analysis should be reproducible. In real science, this is hard. Redoing all the chemistry, or all the field work, or whatever is asking a lot.
But in mathematical and computing sciences, like statistics, reproducibility is perfectly possible. It only takes will and knowledge to do it.”
Blaming capitalism for academics lying does not stand the historical test.
People lie whether capitalist, socialist, communist, theocratic, or atheist.
Think of Lysenko, as a nice place to start reviewing this.
“People lie whether capitalist, socialist, communist, theocratic, or atheist.”
No disagreement there, hunter. Government funding does not cause lying any more than does a profit motivation. Nor do either prevent lying. I’m glad that we can dispense with the myth that privatizing education (or anything else) will necessarily lead to better outcomes.
As reads: “capitalistic drivers in academia.”
Amend to read: “political drivers”. :-)
Recall that PNS calls for “democracy”.
Dr. Curry: “At the heart of this is the rather extreme disincentives for researchers to admit and correct mistakes; this needs to change.”
Joshua: “Yet another example of the problems created when you (increasingly?) have capitalistic drivers in academia.”
What insightful statements. Well, OK, the first one is, the second one is nonsense. Progressive researchers, working at progressive universities, funded by progressive governments, all insulated from the discipline of a free market, put out shoddy research, and are dishonest in trying to cover it up. The government controlled “investigations” uniformly find nothing wrong. And the problem is “capitalistic drivers?”
The researchers in these cases for the most part don’t get rich. What they get is what every progressive wants: more influence, i.e. more power. The “extreme disincentives” that Dr. Curry rightly sees as the problem are those that are everywhere in progressivism. A successful researcher is one who generates more government funding, which means a larger budget for his department and more press for the university, which means more government funding…. The reason Michael Mann and Phil Jones are still working where they are is that they are still highly successful researchers, in this progressive model.
A bad idea can have bad consequences in either a capitalistic system or a progressive one. The difference is that in a capitalist system, the bad ideas ultimately fail. If a company puts out a substandard product, it loses business to better capitalists. If it loses enough business, it goes bankrupt and its assets are sold to those other, better capitalists to use more productively.
In a progressive system, on the other hand, there are no competitors. The government is the only real producer, and a progressive “product” that is a total failure…just keeps on being produced. Just look at the East German Trabant, or Penn State climate research.
Universities like Duke, Penn State, and UEA are run by progressives, staffed by progressives, and their research funding is provided by progressive governments. So by all means blame their misconduct on “capitalistic drivers.” Just don’t expect to be taken seriously when you do so.
When Michael Mann gets fired, when Phil Jones is required to resign, and when the funding for Duke, Penn State and UEA dry up, get back to me about the free market.
A wonderful post, although I am uncomfortable with Prof Ince’s dislike of computer simulations stated in his article.
In engineering, we use computer models all the time. To do so we write codes that “simulate” the real world using material models that “approximate” the actual materials we are using. We build things, test them in the lab, and fix the material properties and loading simulations until we have a reasonable understanding of the differences between our simulations and our lab models. We then go forward (cautiously I might add) to applying the results to things we build for real. We have done this often enough that most of us use commercially proven codes for our designs. If we don’t use verified codes, our Quality Assurance process requires that the new code must go through a rigorous validation process before it can be used on a project. Once a project is done, we are required to retain all of the documentation pertaining to the design (including any code used) for specified periods of time in case of discovery requests. The files must be thorough enough to reconstruct the design in the same detail in which it was done. We are legally responsible for any errors caused by faulty data, code, or design process.
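The validate-against-the-lab loop described above can be caricatured in a few lines. All numbers and names below are invented for the sketch (a cantilever tip-deflection formula standing in for a real simulation code); the point is only the shape of the check: a model is accepted for design use only if every prediction falls within a stated tolerance of measurement.

```python
def tip_deflection(load_n: float, length_m: float,
                   e_pa: float, i_m4: float) -> float:
    """Cantilever tip deflection: delta = P * L**3 / (3 * E * I)."""
    return load_n * length_m**3 / (3.0 * e_pa * i_m4)

# Assumed material/section properties (steel-ish, made up for the sketch).
E = 200e9        # Young's modulus, Pa
I = 8.0e-6       # second moment of area, m^4
LENGTH = 2.0     # beam length, m

# Mock lab data: (applied load in N, measured tip deflection in m).
lab_data = [(1_000, 0.00167), (2_000, 0.00333), (4_000, 0.00668)]

TOLERANCE = 0.05  # accept the model only if within 5% of every measurement

def validate(data) -> bool:
    """True only if every prediction is within TOLERANCE of the lab value."""
    return all(
        abs(tip_deflection(p, LENGTH, E, I) - measured) / measured < TOLERANCE
        for p, measured in data
    )

print("model validated:", validate(lab_data))
```

A real QA process adds much more (documented test plans, retained inputs, sign-offs), but the core gate is this comparison, and it is rerun whenever the code or the material model changes.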
It appears to me that we are allowing unverified climate research to enter the political process and dramatically influence the setting of national priorities without anything close to a serious validation process. The science is still in its infancy. The validity of the data being gathered is constantly and rightly being challenged, the models are hotly debated, and the physical processes that govern the whole thing are just being understood. Go on with the research, but keep it out of the public policy arena until it is ready for prime time.
That is also my plea.
Gary M’s and some others’ comments above are more ideological than thoughtful. Look at what Gary M selected from the URL http://privateschool.about.com/od/choosingaschool/qt/comparison.htm and then look at the whole article. Consider the stereotypes (“statist ideologue drones,” one example from above). Hey guys, you do a disservice to the empirically grounded science that we all want by blathering simplistic opinions about off-topic subjects. I’m unaffiliated politically and a lifelong educator who agrees with some of your criticisms, but finds your ad hominem attacks very unhelpful and counterproductive. I direct many of my liberal and conservative friends here hoping they’ll find reasonableness and thoughtful comments. Judy Curry is a good example of thoughtfulness and civility, but some of you are not.
In perhaps one of the most penetrating and devastating essays in recent years, http://blogs.the-american-interest.com/wrm/2011/05/12/establishment-blues/ , Walter Russell Mead puts his finger squarely on the problem of which the underlying subject of this post is just a symptom. A couple of paragraphs from the essay that are relevant to this discussion:
The entire essay is well worth the read. As he points out, the vested interest of institutions is to perpetuate their existence, so their natural inclination is to defend against criticism so as to maintain the status quo. Although this is not an altogether new development, the interlinking of many existing institutions is more pronounced than it has ever been. Add to that the progressive belief structure which infects most institutions with its “the ends justify the means” mindset, and you get example after example of not only unscientific practices but out-and-out immoral ones.
It has gone beyond mere groupthink or cliquish self-defense; it has grown into a superiority cult spread across a wide spectrum of vital societal institutions. It is dangerous, but ultimately it will self-destruct. A lie can never win, because it is unnatural.
Jerry, thanks for the link.
The essay by Walter Russell Mead is thought-provoking and certainly equally relevant to other countries as well.
My very brief summary of what I read in the essay is:
The responsibilities have been impersonalized, i.e. those who make or influence the decisions or who should lead the society forward do not accept wide enough personal responsibility for their acts, but rather restrict their responsibility to following rules set by others, even when they should realize that the rules are deficient.
This sort of dovetails with (I won’t quote or link it, because it’s been done so many times before) Ike’s farewell address. If people really understood that beyond the misimpression given by taking the “military industrial complex” out of context.
The common element in all this malaise, is ever-increasing state funding and control of universities, guilds etc, to force them into being ‘progressive’.
My comment is based on years of scientific observation. You might want to direct your complaints to Joshua, who decided to open up the discussion to include his political preferences.
Thoughtful post and good discussion. By coincidence, I just had a painful experience with this. While I was writing a post on this very topic (May 11: “Who wants to live on the Real World? I’d rather live on Planet Denial” http://www.livingontherealworld.org/?p=276)
I was made aware of a mistake I’d allowed to creep into my prepared remarks, at a Senate hearing, of all places. Not a lot of fun to unwind that (“My Mistake!” http://www.livingontherealworld.org/?p=277). Easier to pass judgment on someone else’s flaws, than to confess your own! But by going back to the Senate staff immediately, I actually had an opportunity to extend the dialog/make the underlying point a bit more effectively. They proved to be gracious and understanding. Made me want to be more forgiving myself the next time out.
Hi Bill, I have been keeping up with your very interesting blog. Your acknowledgement of your mistake is very refreshing!
Some researchers commit statistical errors. Real statisticians point out the errors. The researchers circle the wagons, and attempt to censor the statisticians. The statisticians resort to FOIA. The battle continues.
Haven’t I seen this one somewhere before? Seems rather familiar.
From the article:
So Duke came into action because of Potti’s CV, not because of their deeply flawed research. Despicable attitude, and highly damaging for the reputation of Duke. I would ask the Board to resign.
Even top experts are wrong more often than not, but don’t expect a confession any time soon.
Any scientist using cosmic-ray records (with 10Be as the proxy) based on Greenland’s ice cores to backtrack temperature records prior to 1950 is going to find an encouraging correlation with England’s temperature (CET).
Solar scientists also found that the strength of the heliospheric magnetic field conveniently does the same.
Of course that is not so, but both groups persist with their erroneous belief that they know what 10Be data represent.
Admitting and correcting the mistake? No chance.
Two comments. First, I do not think it is good enough to involve a statistician only when the research is moving to publication. In my experience, consulting statisticians often discover that the research methodology was deficient from the beginning, and that they are really being asked to put adhesive plasters on work that could have been done so much better. I doubt that any competent statistician would have okayed the bristlecone work, for example, had he or she been brought in at the beginning.
Second, one of the weaknesses that has become embedded in the creation of the relatively new profession of researchers is that the great majority of them do not teach, and some have never taught. Bright undergraduates, interested in your subject, will pursue you if they think you are wrong or if they think you were not balanced in your presentation. I learned quite early in my own university teaching career that a public confession — ‘I was wrong’ — actually lifted my reputation with my class, and more widely. We do make errors, and we are often wrong. Digging our heels in and defending a position against all comers is usually (not always) a misguided strategy. As so many have said on so many of your threads, becoming one’s own severest critic before publication is the wisest course.
Just to ‘parrot’ my usual line with an observation:
Implement industry-standard procedures into basic academic practice and this issue of ‘not admitting mistakes’ disappears overnight. I’m not kidding, it REALLY is that simple.
I’ve never understood the tendency of some academics to do everything to avoid being proven wrong. Hell, I’m happy to be proven wrong; it saves me a LOT of time in the long run….
Though I agree with you about the need for academia to implement better “industry standard” procedures… I don’t think implementing such procedures will really fix the problem.
The problem has more to do with arrogance. Having someone from “outside” your field of expertise come along and riddle your theory with holes is very problematic for some personalities. This is not a problem unique to science, either. There will always be “know it all” types who cannot fathom anyone who’s not on their level actually understanding aspects of their own research better than they do. Because of that, they will not admit to any possible mistakes uncovered… they are too proud.
“The problem has more to do with arrogance.”
A lot of examples here, AGWers or skeptics.
Arrogance is a human condition independent of ideology.
Hubris leads to tragedy. Which is why we all should agree that we don’t want these arrogant ones on either side of the debate dictating policy. So how about it? Can we get everyone to agree that neither AGWers nor skeptics should be allowed to dictate policy? Doesn’t that seem logical, as well as fair? ;-) [I can see one side balking]
No appeals to authority. No science which isn’t fully cross-examined, i.e. transparent, audited, replicated, verified, validated. No room for scientific hubris to infect policy.
True, but when the people ‘outside your field’ are the ones checking it before you can have it released (for basic procedure, data integrity etc.) and there are very REAL consequences for errors, then I DO think it will have a significant effect – the ‘over night’ was flippancy, but the change would be quick.
It’s either that or they’d lose their jobs – same end effect though: better science.
Great post Judy! No one needs to be told why!
The simplest, and automatic, method of improving science is what I learned in elementary school:
SHOW ALL YOUR WORK
There is nothing new or revolutionary there.
“SHOW ALL YOUR WORK”
Hiding data is not up to elementary school standard!
Once upon a time, I was doing some research and a statistics issue came up. I sought to engage someone in our statistics department. I thought that the issue might be of sufficient theoretical interest to make it worth a statistician’s time. So I contacted someone I knew on the statistics faculty and asked if he could refer me to someone.
The conversation went something like this:
Statistics person: “Sure thing. How much money do you have?”
Me: “What do you mean?”
Statistics person: “I doubt you could find anyone to work with you for less than $20,000.”
This was not a bribe, per se, but would go to cover the salary of the statistician for the weeks that he or she would spend working on my problem.
I didn’t have the $20,000 to spare from my research grant. Hence, I did not engage a statistician.
Moral: Engaging a statistician is easier said than done.
At the moment, I have a great problem in temporal-spatial statistics that I’m working on, but no grant money for the problem. If anybody out there knows a statistician familiar with that area that will work for free, please put them in touch.
I didn’t have the $20,000 to spare from my research grant. Hence, I did not engage a statistician.
You were getting paid for the grant work you found interesting and you expected some uninterested person from another department to work for free? Really?
I think what he’s saying is that an easy way to low-ball the grant estimate is to leave stuff out that they never wanted to begin with. Imagine a contractor who does business like that. Oh, you wanted paint, too? There’s nothing wrong with this natural siding.
I have done the opposite – a research project needed a bit of out-of-discipline help from a climatologist and I have provided it free of charge. At the moment, I’m even listed in a grant as an unfunded co-investigator.
But that is what makes you, you.
While I have disagreed with you on issues over the years, I have seen demonstrated over that time that you have integrity and character, as well as a hunger to understand and to explain to others how things work.
You and Dr. Curry stand out in this mess as true pillars of the highest standards.
“At the moment, I have a great problem in temporal-spatial statistics that I’m working on, but no grant money for the problem. If anybody out there knows a statistician familiar with that area that will work for free, please put them in touch”
Depends what you want done. What do you want done? If you have an interesting enough problem (not a cookbook thing) I’ll see what I can do.
It would help if you worked in R, because that’s the only work some of us will do for free. That means the work we do for free for you will get shared with everybody. There is no free lunch, but some don’t cost money.
Wouldn’t it make more sense to include the calculation and cost of a statistician in the original grant application? I’m pretty shocked that this doesn’t happen as a matter of course.
I work in an engineering company and all the costs of completing a piece of work (including commercial, legal, admin, H&S, Quality Audit) etc are included when calculating the costs (and hence the price to the customer) – not just the engineering effort.
Yes quite. The underlying problem though is dogged resistance to the very idea of auditing, be it by statisticians or anyone else.
I don’t see that the role of a statistician as part of the multi-disciplinary team would be that of an auditor. In my own experience, I have used statisticians to advise on the experimental design and the data collection as well as the subsequent analysis. Perhaps it is the mindset of statistician as auditor rather than statistician as a full member of the team that causes problems?
My point was that statisticians should be an integral part of the team – I really don’t understand why you have a problem with that.
I don’t understand why you imagine I do, as I clearly indicated the exact opposite right at the outset.
Glad to see you don’t exhibit resistance to auditing yourself.
I see the problem as being one that thinks of statisticians only as auditors, which seems to be where you’re coming from.
Again, I have no idea where you get that from.
And even if there are such persons somewhere, whatever problems may come from an auditing-only approach to statistics, resistance to auditing is not one of them. The blame for that lies entirely with those being audited, and trying to hide and deceive.
It was you, Louise, who made the ludicrous suggestion that resistance to being audited might be the fault of the person doing the auditing. You pointed a finger at the mindset of the auditor, implying that anyone who dares to check an argument thereby cannot be a full member of a team.
It’s just more of the same old establishment anti-auditing mindset that the Climategate crooks and their apologists are so fond of.
No I did not.
I referred to people who think that a statistician’s role is that of an auditor as having the wrong mindset. Here, I’ve fixed it for you:
Perhaps it is the mindset of ‘statistician as auditor’ rather than ‘statistician as a full member of the team’ that causes problems?
This is just a ‘consensus’-style attempt to dress up resistance to auditing as being caused by the person doing the auditing.
So yes, actually you did say that – it flows logically from your comments. And resistance to auditing is prima facie evidence of fraud and/or incompetence.
My point was that statisticians should be an integral part of the team – I really don’t understand why you have a problem with that.
I get audited all the time – that’s not a problem. I see the problem as being one that thinks of statisticians only as auditors, which seems to be where you’re coming from.
There is of course no reason at all the statistician should not advise and contribute from the start.
But to suggest that resistance to being audited is the fault of the person doing the auditing is ludicrous. It’s precisely the sort of nonsense the fraud- and secrecy-ridden ‘consensus’ was built on.
“But to suggest that resistance to being audited is the fault of the person doing the auditing is ludicrous.”
Who is suggesting this?
The climategate e-mails are populated with statements based on the idea of unworthy or unacceptable auditors and how the team can thwart those they dislike.
And the latest from CRU regarding FOIA requests demonstrate this as well.
if you have a statistician planned for on your team then you would have a better plan and a better chance of passing any audit by a genuinely neutral auditor, at least technically.
Mosher – OK, here goes:
Objective: interpolation of missing (radar wind profiler) data (two dimensions: time & height), preserving as much information in a particular frequency band as possible. Gaps in time domain vary from single data points to most of observation period. Gaps in height domain more likely at higher altitudes, so most gaps are not bounded by data.
Current approach: Beckers & Rixen-type iterative PCA interpolation, designed to extract frequencies of interest, for longer gaps (in time). Linear interpolation in time domain over smaller data gaps.
Problem: Rather than filling each missing point with either PCA or linear, use a maximum likelihood blend of the two, taking into account the 2-d size and structure of the data gap, the location of a particular point within the data gap, and the characteristics of the observed data.
Fun part for statistician: At some point the availability of large swaths of data on which to test the accuracy of the two interpolation procedures places a severe constraint on the number of independent tests that are possible. Is curve fitting necessary to avoid noisy estimates of accuracy? What do you do in general when the accuracy of the individual estimates is itself only poorly estimated? When does the maximum likelihood approach become impractical, and how do you avoid artificial discontinuities in the interpolated data if the interpolation method changes?
I’ve been working in Python, which can call R procedures.
As an aside, this probably has some relevance for paleoclimate reconstructions, though that’s not why I’m doing it.
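The blending step described above can be sketched in a few lines of NumPy. This is not the author’s code, just a minimal illustration of the idea: fill each gap with a weighted combination of the linear interpolant and a second estimate (standing in for the iterative-PCA fill), down-weighting the linear fill as the gap grows. The function names and the exponential weighting rule are assumptions for illustration only.

```python
import numpy as np

def linear_fill(y):
    """Linearly interpolate across NaN gaps in a 1-D series."""
    y = np.asarray(y, dtype=float).copy()
    idx = np.arange(y.size)
    good = ~np.isnan(y)
    y[~good] = np.interp(idx[~good], idx[good], y[good])
    return y

def gap_length_at(y):
    """For each index, the length of the NaN run it belongs to (0 if valid)."""
    isnan = np.isnan(y)
    out = np.zeros(y.size, dtype=int)
    i = 0
    while i < y.size:
        if isnan[i]:
            j = i
            while j < y.size and isnan[j]:
                j += 1
            out[i:j] = j - i   # every point in the run gets the run length
            i = j
        else:
            i += 1
    return out

def blend_fill(y, alt_estimate, scale=3.0):
    """Fill gaps with a blend of linear interpolation and an alternative
    estimate (e.g. iterative PCA); long gaps lean on the alternative."""
    y = np.asarray(y, dtype=float)
    lin = linear_fill(y)
    gap_len = gap_length_at(y)
    w = np.exp(-gap_len / scale)   # weight on linear fill: ~1 for short gaps
    return np.where(np.isnan(y), w * lin + (1 - w) * alt_estimate, y)
```

A real implementation would also have to handle the 2-d (time × height) gap structure and the unbounded gaps at the top of the profile, which is where the interesting statistics live.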
Not having the cash to hire a statistician at one phase of your research is just a fact of life. Realizing that your research could use a statistician is common sense. Like the old western quote, “A man’s gotta know his limitations.”
It would seem reasonable that, depending on the potential impact of the results of your research, the journals, institutions or governments would pony up the cash for a statistician to verify the results. Post-publication scrutiny is the real test. So it amazes me that some researchers are fighting the natural evolution of the system.
Don Aiken’s observation that statisticians need to be involved from the get-go, to prevent pointless design and data collection from getting under way in the first place, is key.
But I shudder to think about the intensity of the abuse from the group-mind, and the sharpness of the screws that would be applied, to said statisticians. They would need phenomenal strength of character to withstand it, and might end up simply providing more opaque smoke-screens.
To be polite – baloney (my first reaction was a lot stronger).
Statisticians are frequently included as full members of a multi-disciplinary team where I work. They’re not subject to abuse or any other kind of harassment. They’re as much part of the team as the software engineer, the human factors practitioner and the systems architect.
And where I work, the testing staff are considered integral, contributing members of the software development team. Nonetheless, there are plenty of shops where a counter-productive, hostile relationship exists. Just because your enterprise does things right isn’t evidence of universality.
What Gene said. The resistance of the CRU-Krew, Hokey Team, and climatologists in general to input from competent specialists, especially statisticians, is becoming legendary.
I presume that this is because tight and professional sampling and analysis would rapidly discredit the kind of dreck they now publish.
If science were solid, there would be no role for the statisticians.
This could be instantly implemented, by following Steve Mosher’s approach of dismissing out of hand any work that fails a test of complete transparency of data, methods, code.
This would apply too to work that, while providing all its own data etc., cites in support others that don’t. So something like the IPCC report would be rejected out of hand if any of the papers it uses failed the basic test.
And to implement this approach, what is needed is a public register of all climate-related papers, along with an Accept/Reject-out-of-hand status. And if the journals in which each article was published were included, it would provide a means to assess the quality of journals too. Ditto the universities / institutions that employ the authors.
PUNKSTA: Do you not realize how many people your system would de-employ? The circuitous methods used by these people have kept many academics employed for years.
It’s actually pretty simple. In the same way a magazine has a fact checker, a science journal can employ a reproducibility checker.
That is, a paper needs to be submitted with TURNKEY code. The code should run and produce the charts and graphs and tables in the paper. There would of course be exceptions for codes that take too long to run (like a GCM), but we need not let these exceptions rule.
At one point I thought of creating an organization to do this independently. But, only so many hours in the day.
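As a sketch of what such a turnkey reproducibility check might look like in practice (the entry-point name, output manifest and timeout are all hypothetical, not from any journal’s actual workflow): delete the claimed outputs, run the submission’s script in its own directory, and verify that the figures and tables reappear.

```python
import subprocess
import sys
from pathlib import Path

def check_turnkey(paper_dir, entry="run_all.py",
                  expected=("fig1.png",), timeout=3600):
    """Run a submission's entry-point script from scratch and confirm it
    regenerates the outputs (figures, tables) claimed in the paper."""
    paper = Path(paper_dir)
    for name in expected:                       # remove stale outputs first,
        (paper / name).unlink(missing_ok=True)  # so the script must recreate them
    result = subprocess.run([sys.executable, entry], cwd=paper,
                            capture_output=True, timeout=timeout)
    if result.returncode != 0:
        return False, "entry script failed"
    missing = [n for n in expected if not (paper / n).exists()]
    return (not missing), ("missing outputs: %s" % missing if missing else "ok")
```

A GCM-sized exception could be handled the same way: a flag in the manifest that downgrades the check from ‘run everything’ to ‘run the post-processing only’.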
There is another aspect here which has not been addressed. That is the issue of ‘cherry picking’ – select the material which supports your favoured hypothesis and ignore any which opposes or refutes it. In the government regulatory agency where I worked for 20 years, the professionals were often asked to evaluate ‘literature-based submissions’. Eventually, ground rules emerged in relation to literature searches, and it became mandatory to provide detailed information on the specific databases searched, the time period covered by the search, the search strategy employed (search terms used, boolean logic used), the number of citations retrieved at each stage of the search, and the selection criteria employed (for including and excluding citations) for further analysis. This was accepted as part of due diligence and scientific rigour. On these criteria, IMO, the IPCC failed miserably!
Omission of these checks invites selection bias – under which framework one can ‘prove’ almost anything.
And as to the relative merits of public vs private education, the answer is to put them on a level playing field in terms of public funding, and letting the public decide how they want their tax allocations spent.
This could be done by ending all direct government grants to universities and schools etc, and instead using the money to back education vouchers given to students (or their parents), to be used at the institution of the latters’ choice. These would then be redeemed by the government, using the same money as before.
This would maintain political funding of education, but remove the political control of it.
Your proposal assumes that the public is in a position to determine the merit of scientific proposals. While there are certainly major headaches associated with the current grant allocation system, they are nothing compared to the problems we would see if it became a popularity contest among parents.
Perhaps eventually papers such as this one, and the situation that it addresses, will lead to statistical analysis of research designs not being an add-on afterthought, but built into initial estimates for the research project as a necessary component. I am aware that some software developers seem to regard testing of the software as an afterthought, too, but not any of the developers I would ever trust. And would anyone even think of doing a research project without considering that data might need to be collected and analyzed along the way? I hope that the appropriate statistical analysis becomes integral to all research, and it is such a disappointment to learn that it hasn’t been so throughout the sciences.
Have just read the very enlightening paper by Joseph Postma, an astrophysicist.
He explains how the “Thermodynamic atmosphere effect” fully accounts for normal observations of temperatures on earth. There is no need at all for an atmospheric greenhouse, since earth’s temperature according to Postma is exactly as it should be. He could have written the paper as a first-year uni student.
So how long will it take for the establishment to kill the unphysical CO2 greenhouse?
Thank god for science!
“Have just read the very enlightening paper by Joseph Postma, an astrophysicist.”
Thanks for the link. I think Science of Doom, Fred Moolten and Pekka should read it objectively and then rebut. James Hansen, Michael Mann and Phil Jones are on the gravy train and will never see this article objectively.
The first half of Postma’s paper is correct, as far as a rapid reading could determine. Then Postma starts to be sloppy and uses all kinds of averages without proper justification. It’s always difficult to tell precisely where the real errors are in a paper that doesn’t even try to be precisely correct, but rather tries to give a correct general view of the matter.
Here the problems are related to the assumption that the effective radiative temperature of the Earth as seen from space would automatically be about the same as the temperature of the troposphere at an altitude of about 5 km. This assumption is, however, totally false without the greenhouse gases, as without greenhouse gases all radiation would come directly from the surface. Thus the 5 in Postma’s formula (16) would be zero without the greenhouse effect and the temperature of the surface -18 C, not +14.5 C.
Once this fundamental error is made, a lot of totally false conclusions are drawn from it.
Thus the fact that the first part of the paper is correct doesn’t mean that the conclusions have anything to do with reality.
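The -18 C and +14.5 C figures in this comment can be checked with a back-of-envelope calculation. This is a sketch using standard textbook values (solar constant ≈ 1361 W/m², albedo ≈ 0.3, mean lapse rate ≈ 6.5 K/km), not numbers taken from Postma’s paper:

```python
# Effective radiative temperature from the Stefan-Boltzmann balance:
#   S * (1 - a) / 4 = sigma * T_eff**4
S = 1361.0        # solar constant, W/m^2
a = 0.3           # planetary albedo
sigma = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4

T_eff = (S * (1 - a) / (4 * sigma)) ** 0.25
print(T_eff - 273.15)     # about -18.6 C: the "-18 C" figure

# With greenhouse gases, the -18 C emission level sits roughly 5 km up,
# and the surface below is warmer by lapse_rate * height:
T_surf = T_eff + 6.5 * 5.0
print(T_surf - 273.15)    # about +13.9 C, close to the quoted +14.5 C
```

Without greenhouse gases the emission level drops to the surface (the “5” goes to zero), so the surface itself would sit near -18 C, which is exactly the point being made.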
As you note, Eq 16 is more or less correct. As is the explanation of the thermodynamics as far as it goes.
So I think he states a tolerable description of the thermodynamics of a semi-opaque atmosphere, aka the greenhouse effect.
Ironically this is precisely the way the effect was and sometimes is described.
As I read Postma, he does not automatically assume anything. He draws the -18 deg temperature altitude from observation (clearly a reference would be nice, but it is probably common knowledge for atmospheric scientists) and then computes the surface temperature based on the lapse rate. The fact that this is at approximately 5 km altitude is a function of the gaseous mass of the atmosphere and has nothing to do with any “radiative greenhouse properties”.
He cannot do that without making the unsubstantiated assumption that I mentioned, or alternatively making some nonsensical assumptions, like claiming that molecules emit radiation only upwards and not in all directions, as they are perfectly well known to do.
The comment of Fred Moolten presents one more experiment to check this fact. I presented a variation of the same idea, as Fred’s original proposal might have contained features that are difficult to realize in practice, while my proposal could be implemented easily. (Doing it in Chile is based on the fact that the results are most obvious, when the sky is guaranteed to be clear and the moisture of air low, but it could be done also elsewhere, when these conditions are satisfied.)
It’s not a joke but full fact that the Earth’s surface would be cold without the greenhouse effect, and the counterarguments of Postma are based on unsubstantiated and wrong assumptions.
@Pekka Pirilä “It’s not a joke, but full fact that the Earth surface would be cold without greenhouse effect “
A statement that can never be verified by any independent observer can never be a “fact”.
Personally, I found no major flaws in Postma’s description or logic, though I may have to take the time to independently verify the numbers he quotes.
Should I be surprised, or should anybody care (including yourself)?
The Postma paper regurgitates one of the common blogosphere misconceptions about radiative transfer – similar claims were made in the Skydragon thread and they can be found elsewhere. I ordinarily don’t respond to them, but what intrigued me about Postma is that he describes a hypothetical experiment that can actually be done to test his claims. He states:
“For example, imagine a blackbody which is absorbing energy from some hot source of light like a light-bulb, and it has warmed up as much as it can and has reached radiative thermal equilibrium. The blackbody will then be re-emitting just as much thermal infrared energy as the light energy it is absorbing. However, because the blackbody doesn’t warm up to a temperature as hot as the source of light, its re-emitted infrared light is from a lower temperature and thus of a lower energy compared to the incoming light that it is absorbing. Now here’s the clincher: imagine that you take a mirror which reflects infrared light, and you reflect some of the infrared light the blackbody is emitting back onto itself. What then happens to the temperature of the blackbody? One might think that, because the blackbody is now absorbing more light, even if it is its own infrared light, then it should warm up. But in fact it does not warm up; its temperature remains exactly the same.”
Well, that’s a testable claim, isn’t it? And in fact, it would not be impractical to test it, although not with a conventional “mirror” but with an infrared (IR) absorbing/emitting surface that can return some IR back to the heated object. It would take some setting up, but for $50,000 (winner take all and loser pay the experimental costs), I would be willing to do it if the materials are available. The exact setup would have to be refined, but I see it in terms of an absorbing body (e.g., a black object) in a vacuum chamber with insulated walls that are relatively transparent to IR at the start of the experiment, inside a large room that is kept at a constant cool temperature, and with a light shining through an aperture into the vacuum chamber (the light bulb kept at a constant temperature and radiance through appropriate thermostatic control). After the initial temperature of the object stabilizes, the chamber walls are painted black to send some of the IR back to the heated object and the temperature measurements are repeated. Postma claims the temperature won’t rise. Is anyone willing to put up $50,000 to support his claim?
How about an experiment done at night in the Chilean desert, close to the big observatories?
The setup is as follows.
A body of high emissivity (essentially a black body) is put under a cover made of plastic transparent to IR radiation and heated by a light source from one side. The cover would be built as a flat box to allow for the next step. When the temperature has reached a constant value, another sheet of plastic is put inside the box described above. This time the material is styrox, to provide good insulation, which allows the lower surface to respond to the influence of the black body without affecting radiation from the upper surface to space. If the IR emissivity is not high, the lower surface is covered with material of high emissivity. Careful arrangements are made to ensure that the sheet is not heated by any source warmer than the black body heated by the light source. Thus it will certainly remain colder than the black body, and this can be verified by measurements. Now check what happens to the temperature of the black body after this second sheet has been added.
Fred, the fact that you assume the temp change would be so small that it would take a significant amount of money for equipment to measure it shows me just how much your claim is worth. In the real atmosphere a warming of that magnitude is meaningless.
I have a couple of comments on the Postma paper.
For one, as many times as he mentions that the greenhouse effect violates the laws of thermodynamics, he never specifies which laws of thermodynamics the greenhouse effect violates. He does state a number of times that heat always flows from hot to cold and never in the opposite direction, which is utter rubbish. Heat flows in both directions independently.
My second point is anyone who wants to explain the atmosphere of Venus, needs to explain why there is an overabundance of deuterium in the atmosphere of Venus. And explain why there is so much pressure.
“He does state a number of times that heat always flows from hot to cold and never in the opposite direction, which is utter rubbish. Heat flows in both directions independently.”
He is referring to the 2nd law of thermodynamics. Please read up on it and then review your statement… you may want to rephrase what you wrote a bit.
If there were no planets orbiting our sun, would the sun be its current temperature?
Yes, because the difference with and without planets is minimal compared with the vast empty space. Only along the planets’ line of sight from the Sun is there a minute delay of the Sun’s energy flow towards the planets, in accordance with radiation.
I meant “decrease”, which is a better word than “delay”.
In my little theoretical solar system, the planets provide a greenhouse effect for the sun.
Right, the second law of thermodynamics, which does not prohibit the flow of heat from cold to hot.
Black body radiation from the earth shining on another object heats it no matter what the temperature of the other object, even though the other object may be hotter than the earth and actually heats the earth more.
Perhaps you and the rest are confusing the second law with earlier theories such as the caloric theory, or heat-transfer theories involving phlogiston.
How did you pass your thermodynamics at your university? Your professors must have been sleeping, or you did not take thermodynamics as one of your subjects.
Heat flows in both directions, how did you pass thermodynamics?
It must in order to satisfy the requirements of the Standard Model which limits the information a photon can carry to three pieces, two for the direction it is traveling and one for the energy.
Using mirrors and lenses, what limits the temperature that can be reached by focusing sunlight on an object?
The temperature of the surface of the sun, or the amount of energy that can be transmitted to the target?
“My second point is anyone who wants to explain the atmosphere of Venus, needs to explain why there is an overabundance of deuterium in the atmosphere of Venus. And explain why there is so much pressure.”
Don’t know anything about the deuterium, but it definitely sounds like something worth checking into.
I would like to ask if you know anything about the difference in mass between the atmospheres of Venus and earth? I think you will find an explanation of the pressure differential there.
Oh, since we are talking hypothesis here how about this explanation for the deuterium:
The electric universe theory, really?
Transmogrification of nitrogen into carbon, really?
You might study a little physics. What YOU call transmogrification or shapeshifting is solid physics. Ask the boys and girls at the accelerators what can happen under high energy levels.
I do O18(p,n)18F for a living dude, I am one of those boys and girls you are referring to.
I just call it transmogrification to honor Bill Watterson.
And I never called it shapeshifting.
Glad to hear I misunderstood your comment.
Bill made a transmogrifier, though. Sorry I missed the humor.
And I have never heard of any iron catalyzed nuclear reactions.
Found evidence of cutting edge cold fusion research involving iron or nickel catalysis, but not nitrogen into carbon.
Evidence of that would be worth a lot more than explaining what happened to Venus.
And that was evidence of research not evidence of cold fusion.
You are chasing your tail. What evidence do you or anyone else have that Venus was anything like the earth, where nitrogen would need to be converted to anything?
That was in response to your posting the Thunderbolt link, which you were citing as an answer to where all the CO2 on Venus came from.
“And the nuclear energy difference between the nitrogen molecule and the carbon monoxide molecule is quite small. In the presence of the hot, iron-bearing surface of Venus, acting as a catalyst, that planet’s nitrogen was converted to carbon monoxide.”
was from your cite.
What was the highest level science course you have taken?
Apologies. I don’t multitask well and am on other conversations.
Your Nuclear Reaction term and talking about cold fusion that is not at high temps threw me.
Yes, their hypothesis requires the high temps of the magma to work, and is based on a lot of nitrogen, like Titan, as a beginning point. The physics of the reactions is real. The rest? Who knows. I give it a higher probability than conventional nebular theory, due to known problems with nebular theory. Someone might be able to point out real problems with their hypothesis also, but most simply ignore it as heresy without admitting the problems with conventional planetary science.
On the CO2 band broadening, what was the number they gave? Was it an increase of 10%, 100%, 200%? My understanding is that it isn’t nearly enough without the cloud layer to reflect the IR, and the surface is still cooling.
Perhaps Postma should mention that the resultant heat flow is from the hot end to the cold end. You can always demonstrate that resultant heat flow in any laboratory: when a hotter body is placed near a colder body, the resultant heat flow is from the hot body to the cold body; the colder body gets a higher temperature and the hotter body gets cooler.
I don’t know anything about Venus deuterium. What has deuterium to do with Venus’s high ground-level density/pressure? I’d really like to know; please enlighten me.
Correct me if I am wrong. Venus has so much pressure that it must be due to heavy and thick layers of atmosphere. CO2 has a molecular weight of about 44, compared with the Earth’s atmosphere at about 29, so it is a lot (actually 44/29 = 1.517 times) heavier and denser than the Earth’s atmosphere. Venus’s gravitational force is 88% of the Earth’s. So effectively, with equal atmospheric heights, Venus would have a pressure of 1.517 × 0.88 = 1.335 times that of the Earth. Venus has 250 km of atmosphere and Earth has about 120 km of atmosphere. So the atmospheric pressure should be 1.335 × 250/120 = 2.78 times that of Earth. But the Venus ground-level pressure is 92 bar, which is about 90 times the Earth’s. So what caused the high ground-level pressure of Venus? It must be due to a much more densely packed (92/2.78 = 33 times the Earth’s) Venus atmosphere.
Oh, I forgot to take into account the ground-level temperature of Venus; the gas law PV = mRT should apply.
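The ~33× gap this comment ends on has a standard resolution: surface pressure is just the weight of the overlying atmospheric column per unit area, P ≈ M·g / (4πR²), so the relevant input is the total atmospheric mass, not an assumed shared density profile. A quick check, using commonly quoted reference values for Venus (atmospheric mass ≈ 4.8×10^20 kg, radius ≈ 6052 km, surface gravity ≈ 8.87 m/s²; these figures are assumptions from standard references, not from the thread):

```python
import math

M = 4.8e20    # mass of Venus's atmosphere, kg
R = 6.052e6   # radius of Venus, m
g = 8.87      # surface gravity of Venus, m/s^2

area = 4 * math.pi * R ** 2   # planetary surface area, m^2
P = M * g / area              # surface pressure = column weight per area, Pa
print(P / 1e5)                # about 92-93 bar, matching the quoted 92 bar
```

Earth’s value computed the same way (M ≈ 5.1×10^18 kg, R ≈ 6371 km, g ≈ 9.81 m/s²) comes out near 1 bar, so the ~90× ratio is simply the ratio of column weights.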
That’s net heat flow, which is the difference between the heat flow from hot to cold and cold to hot, which can be demonstrated in any lab.
The ideal gas law cannot be used to solve for temperature in a situation where the gases are free to expand.
Where did all the CO2 in the Venusian atmosphere come from?
The 2nd law of thermodynamics is generally thought of in terms of net heat flow. You are correct in saying gross heat flows in both directions. This is the problem with Postma’s paper, as Pekka points out below.
“Where did all the CO2 in the Venusian atmosphere come from?”
If I had to hazard a guess, you already have an answer and are chumming the water.
What did you do?
Look up the Second Law and change your opinion?
That was the first thing I said: that heat flows in both directions independently.
If there wasn’t a greenhouse effect, there wouldn’t be all that CO2 in the Venusian atmosphere, nor would there be an excess of deuterium.
It is the same as the evolution deniers using the Second Law to disprove evolution as the greenhouse effect deniers using it to disprove the greenhouse effect. They are both doing it wrong.
No… I understand the 2nd law. Your comment was phrased in a way that made it look as though you did not understand which thermodynamic law was being talked about; further, you did not specify net heat flow… similar to the way Postma did not specify which thermodynamic law he was talking about… it appears to me you are fishing for an argument with “deniers”… some call it trolling.
“If there wasn’t a greenhouse effect, there wouldn’t be all that CO2 in the Venusian atmosphere, nor would there be an excess of deuterium”
Can you elaborate on this comment? I’m not sure what you mean here. How does the greenhouse effect influence whether there is more or less CO2 or for that matter deuterium?
“It is the same as the evolution deniers using the Second Law to disprove evolution”
That is a new one for me, I have never seen the 2nd law of thermodynamics used to disprove evolution, how does that work?
Looking forward to your replies :)
Bob, please don’t keep us waiting. Tell us where the deuterium and CO2 came from. Since you were there to watch, I am sure it will be FACT and not HYPOTHESIS!
bobdroege, since when did gravity stop limiting the expansion of the atmosphere?
It hasn’t, but at any given temperature a small percentage of the molecules or atoms will have sufficient velocity to leave the atmosphere.
And the deuterium came from the water that is no longer there.
And the CO2 from the weathering of the rocks.
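The claim that light species escape while CO2 stays can be illustrated with a back-of-the-envelope Maxwell–Boltzmann estimate. A sketch in Python; the Venus escape speed (~10.4 km/s) is the commonly cited value, and the 300 K upper-atmosphere temperature is an illustrative assumption, not a measurement:

```python
import math

def escape_fraction(mass_kg, temp_k, v_esc):
    """Fraction of a Maxwell-Boltzmann speed distribution above v_esc."""
    k_b = 1.380649e-23                           # Boltzmann constant, J/K
    v_p = math.sqrt(2 * k_b * temp_k / mass_kg)  # most probable speed, m/s
    x = v_esc / v_p
    # Closed-form integral of the M-B speed distribution from x to infinity
    return math.erfc(x) + 2 * x * math.exp(-x * x) / math.sqrt(math.pi)

AMU = 1.66054e-27        # kg per atomic mass unit
V_ESC_VENUS = 10.36e3    # m/s, Venus escape speed
T_EXO = 300.0            # K, illustrative upper-atmosphere temperature

frac_h = escape_fraction(1 * AMU, T_EXO, V_ESC_VENUS)     # atomic hydrogen
frac_co2 = escape_fraction(44 * AMU, T_EXO, V_ESC_VENUS)  # carbon dioxide
print(frac_h, frac_co2)
```

With these numbers a tiny but nonzero fraction of hydrogen atoms exceeds escape speed at any moment, while the CO2 fraction underflows to zero. That is the qualitative point: hydrogen (and preferentially the light isotope over deuterium) leaks away over geological time, while CO2 does not.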
“And the deuterium came from the water that is no longer there.”
And you have this on good authority from what measurements made at the time??
Sorry, hypotheses are proof of nothing. The actual current data can be interpreted in several ways. Your accepted way is only one and probably isn’t the strongest interpretation.
But any hypothesis has to explain all the data, and the hypothesis that the temperature of Venus is explained by the high pressure doesn't do that.
And the hypothesis that CO2, being a trace gas, therefore has little effect is contrary to established gas theories such as Dalton's law of partial pressures, which states that the properties of a gas are independent of the other gases present.
Of course the pressure broadening of the CO2 absorption bands modifies this idea a bit, but still it is the amount of CO2 present, not the percentage that is important.
Even McIntyre won’t say there isn’t a greenhouse effect.
Bob, the temp of Venus is almost 500 °C. The BB radiation is centered close to the H2O absorption band and between the two CO2 bands at that point (I think about 7 μm). How does CO2 absorb all that IR with almost no H2O for feedback???? Your scenario starts with liquid oceans, so the temps could have been from Earth's up to 100 °C or so. At those temps the CO2 starts on the shoulder and drops down as the temps rise till the BB is centered in the hole between CO2 and H2O.
I am told that if the earth had that many doublings of CO2 we are only talking about 40 °C without H2O feedback. Exactly how much do you think H2O can augment 40 °C??? Double? Triple? You are still a couple hundred degrees short of 450 °C. There is little H2O now, or for quite a while in the past. How do the temps stay up?? As the process in your scenario was happening, the H2O was being lost, reducing the amount available for augmenting. It simply is wishful thinking by Hansen and his models that the greenhouse effect, as it is now understood even by Hansen and the IPCC, could ever have created Venus from a world even vaguely similar to earth.
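For reference, the blackbody peak wavelength invoked above can be checked with Wien's displacement law; a quick sketch (the 737 K Venus surface temperature is the commonly cited round value, used here only for illustration):

```python
# Wien's displacement law: peak wavelength of blackbody emission
WIEN_B = 2.8978e-3  # m*K, Wien displacement constant

def peak_wavelength_um(temp_k):
    """Blackbody emission peak in micrometres for a temperature in kelvin."""
    return WIEN_B / temp_k * 1e6

venus_peak = peak_wavelength_um(737.0)  # Venus surface, ~460 C
earth_peak = peak_wavelength_um(288.0)  # Earth surface, ~15 C
print(venus_peak, earth_peak)
```

This puts the Venus surface emission peak near 4 μm (micrometres, not nanometres), between the strong 2.7 μm and 4.3 μm CO2 bands, while Earth's peak sits near 10 μm.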
Here is something from a guy a lot smarter than me and possibly you too.
One theory would be pressure broadening, but others are here
Lubos Motl and Steve Goddard, hmm,
I heard a rumor someone questioned Goddard on a specific property of water on WUWT and Tony threw Steve under the bus, and now there is a new climate blog.
but these are better explanations and I think Chris Colose is smarter than me.
The best for understanding the Greenhouse effect is Science of Doom.
One other point, though: the calculated sensitivity for a doubling of the CO2 concentration is only good for that one doubling.
Each successive time CO2 doubles you really need to recalculate, partly due to the pressure broadening effect, which is why you got the wrong answer when trying it with the changes with respect to Venus.
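The per-doubling point can be made concrete with the commonly used simplified forcing expression ΔF ≈ 5.35 ln(C/C0) W/m². A sketch showing that this formula assigns the same ~3.7 W/m² to every doubling — which is exactly the approximation the comment above says breaks down (through effects like pressure broadening) over the many doublings separating Earth from Venus:

```python
import math

def co2_forcing(c, c0):
    """Simplified radiative forcing (W/m^2) for a CO2 change c0 -> c."""
    return 5.35 * math.log(c / c0)

# Forcing added by each successive doubling starting from 280 ppm
base = 280.0
increments = [co2_forcing(base * 2 ** (n + 1), base * 2 ** n) for n in range(4)]
print(increments)  # every doubling contributes the same amount under this fit
```

Each entry is 5.35 ln 2 ≈ 3.71 W/m²; the caveat is that the fitted constant itself must be recomputed far outside the range it was derived for.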
The best for understanding the greenhouse effect is Science of Doom? Uh, the Science of Doom who proclaimed that an error in math disproves the whole Moon greenhouse thesis, without completing the correct computations and showing the results?? The Science of Doom that proclaimed in the same presentation that since the earth emits about 350 W/m2 and they measure about 250 W/m2 at the satellite, that shows the greenhouse effect?? Without at least converting for the area differential??
No, I am not claiming Goddard gets everything right. Few of us do. I do believe that Motl gets it more right than Colose.
This blog referenced by Kuhnkat
is one more example of the ubiquitous error that planetary atmospheres with high surface temperatures and an adiabatic lapse rate could exist without the greenhouse effect.
Convection puts an upper limit on the lapse rate at the adiabatic lapse rate. It does not put any lower limit on it, or maintain such a lapse rate, unless some other mechanism is heating the bottom to add energy there. A high lapse rate leads to an upward flux of energy, and that cannot be maintained without a compensating downward flux, which is provided by the greenhouse effect and IR radiation.
Oh, and Bob, I didn’t get the wrong answer. I don’t compute because I don’t know enough math and physics. What I do is sort through the claims and find the best ones. That is why I pointed you to Motl rather than Goddard or others on the Venus question.
Pekka waves its arms and gravity disappears.
Take a bicycle pump and plug the nozzle. Now press on the pump handle. The gas in the pump compresses and heats. If you keep the pump handle at a specific position after raising the pressure and temp in the gas it will cool to an equilibrium. If you continue to press with a specific amount of force you will continue to generate more heat even as the heat dissipates keeping a higher temperature.
Now hang a weight on the handle. Does this weight continue to provide work similar to your muscle which burns chemicals to provide the energy??? If it doesn’t, please explain what prevents it??
In other words, the atmosphere isn’t in a static vice type situation. Gravity seems to actually do work and raise the temp higher.
Sorry Pekka, argument from authority just ticks some of us off. You may be more correct because the magnitudes are in your favor, but, you have presented nothing to indicate that. Maybe you should wander over to Motl’s and discuss it with him as he CAN do the math. Y’all might actually figure out some of the problems with the sloppy scenarios we currently have if you actually worked at it. Or are you just here to be an apologist like Nick Stokes and others?
Similar arguments have been presented in the discussion on that site. I didn't look at them carefully enough to tell whether they were presented correctly in all details, but the basic argument was presented correctly.
Convection is initiated when a forcing acts to increase the temperature gradient beyond the adiabatic lapse rate, because heating the lower-lying air reduces its density and makes it lighter than the equilibrium value above, after taking into account what happens to a parcel of air when it rises. Similarly, cooling air at the top of the atmosphere makes it so dense that it starts to fall. The same thing can be expressed in terms of potential temperature: convection is initiated when heating would give a parcel of air a higher potential temperature than the potential temperature above that parcel.
On the contrary, heating air at the top of the atmosphere makes it even lighter and prevents it from falling and transferring heat downwards. Convection leads to a net warming heat flux only when heating is from below. Convection cannot warm the low atmosphere or ground, but only cool it. To have a hot surface we need radiation or an internal heat source in the hot body. These two must also provide the extra heat that is taken off the surface by the convection that maintains the adiabatic lapse rate. The stratosphere is a well-known manifestation of the fact that heating from above stops convection and leads to a temperature profile far from the adiabatic lapse rate.
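The potential-temperature criterion mentioned above can be sketched numerically: a layer is stable against convection when potential temperature θ = T·(p0/p)^(R/cp) increases with height. The numbers below are illustrative, with the standard dry-air exponent R/cp ≈ 0.286:

```python
def potential_temperature(temp_k, pressure_hpa, p0_hpa=1000.0, kappa=0.286):
    """Temperature a parcel would have if brought adiabatically to p0."""
    return temp_k * (p0_hpa / pressure_hpa) ** kappa

theta_sfc = potential_temperature(288.0, 1000.0)

# Modest cooling with height: theta increases upward -> stable, no convection
theta_up_stable = potential_temperature(281.0, 900.0)

# Superadiabatic cooling with height: theta decreases upward -> convection starts
theta_up_unstable = potential_temperature(276.0, 900.0)
print(theta_sfc, theta_up_stable, theta_up_unstable)
```

Heating from above raises θ aloft and thus shuts convection off, which is the stratosphere point made in the comment.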
In the case of Venus the situation is tricky, because so little solar radiation reaches the surface. Therefore the greenhouse effect must be very strong to make the total radiative balance, formed by the little solar radiation and the downwelling IR, exceed the very strong IR emission from the hot surface, and leave a sufficient surplus for the convective cooling.
I’m not sure I understand this statement:
” …is one more example of the ubiquitous error that planetary atmospheres with a high surface temperatures and an adiabatic lapse rate could exist without the greenhouse effect.”
Are you saying that on a planet with only nitrogen for an atmosphere, the lapse rate would be zero?
your nicely crafted paragraph leaves out the reflection by the sulfur, chlorine and ??? clouds basically separating two atmospheres. Outside of that it simply states some basic facts. What was it supposed to accomplish??
I think you, as many others, are confused by trying to account for how Venus reached its current state. If the reality we can directly measure is not handled correctly how can we reach an understanding of how it was reached?
I have recently had a rather lengthy argument with Fred Moolten on that point, with contributions from others as well. The discussion started from this message
with some related comments here
I restate here my views, which are based on well-known physical theories. All counterarguments presented by Fred and others are in my opinion weak and without sufficient merit. Tomas Milanovic also presented differing views, claiming that the temperature variations of the surface would induce strong mixing, but after some more thinking, which I have explained in the discussion, my view is now firmly that the atmosphere will warm up enough to make even that effect weak.
According to the second law of thermodynamics, the equilibrium state for an undisturbed atmosphere with no IR absorption and emission is isothermal, as any deviation from that would make a perpetuum mobile of the second kind possible with fully realizable technologies, such as a thermocouple connected to the top and bottom of the atmosphere. (A comment of Quondam in the second link above led to this formulation of the argument.)
Thus it would indeed be true that the adiabatic lapse rate could not be maintained in an atmosphere of pure nitrogen, because such an atmosphere wouldn't have any mechanism to release heat from the top of the atmosphere to space (radiation from the surface would escape freely, but the atmosphere would not emit). Therefore the whole atmosphere would gradually reach the temperature of the surface. The temperature of the surface would be essentially the same as without any atmosphere. The temporal and spatial variations of the surface temperature would be reduced, and some convection would be induced, but not so much, and not reaching such altitudes, that the main conclusion would change.
I'm not trying to present any comprehensive description of the atmosphere of Venus, but that doesn't prevent me from pointing out where a specific argument is contrary to the well-known laws of physics (thermodynamics and very basic fluid dynamics).
Continuing on what I wrote to Willb.
Pierrehumbert discusses in Chapter 3.6 of his new book the properties of an optically thin atmosphere. That discussion starts with a formula that describes the energy balance of the top skin layer of the atmosphere requiring that absorption of IR radiation from the surface is equal to the emission of IR radiation from the layer.
This equation is valid as long as conduction can be neglected, and it cannot be used if the emissivity is exactly zero. When the emissivity of the atmosphere is reduced towards zero, a point is finally reached where conduction starts to be significant for the balance. At that point the temperature of the skin starts to be larger than that given by the formula of Pierrehumbert. In the limit of zero emissivity, the temperature becomes equal to the surface temperature, as the conductive term is proportional to the temperature difference between the top and bottom of the atmosphere, and this is the only remaining term when the emissivity is zero.
Convection does not enter this calculation, as it is initiated only when the temperature gradient would be steeper than the adiabatic lapse rate without convection, and the situation discussed here is safely on the other side, where the atmosphere is stratified and convection is prevented by the large density differentials.
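The limiting behaviour described above can be sketched with a toy energy balance for the skin layer: absorbed IR ε·σ·Ts⁴ plus conduction g·(Ts − T) equals emission 2·ε·σ·T⁴. The conductance g = 0.5 W/m²K and surface temperature 288 K are arbitrary illustrative values, not Pierrehumbert's numbers; only the trend as ε → 0 matters:

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def skin_temperature(emissivity, t_surface=288.0, conductance=0.5):
    """Solve eps*sigma*Ts^4 + g*(Ts - T) = 2*eps*sigma*T^4 by bisection."""
    def imbalance(t):
        absorbed = emissivity * SIGMA * t_surface ** 4
        conducted = conductance * (t_surface - t)
        emitted = 2 * emissivity * SIGMA * t ** 4
        return absorbed + conducted - emitted

    lo, hi = 1.0, t_surface  # imbalance is positive at lo, negative at hi
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if imbalance(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

for eps in (1.0, 1e-2, 1e-6):
    print(eps, skin_temperature(eps))
```

With these numbers the skin temperature sits near Ts/2^(1/4) for ε = 1 and climbs toward the surface temperature as ε shrinks and conduction dominates, matching the limit argued in the comment.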
Compressing gas heats it.
Thank you for the summary of your viewpoint and the links to your previous discussions on this topic. One problem I am having with your viewpoint is that I can’t seem to reconcile it with the thermodynamic gas laws that relate temperature and pressure.
Consider again the (hypothetical) planet with a pure nitrogen atmosphere. Due to gravity, a negative pressure gradient as a function of altitude exists. Through convection, heat is distributed uniformly from the planet surface to the top of the atmosphere. Assuming the ideal gas law applies in this situation, the pressure gradient causes the atmospheric temperature gradient to follow the adiabatic lapse rate. No heat is required to escape the atmosphere in order to maintain this lapse rate.
As I understand your viewpoint, I believe you more or less agree with this scenario as far as it goes. However, you maintain that for a nitrogen atmosphere the lapse rate will gradually disappear through conductive heat transfer from the lower to upper atmosphere. So your conclusion would be that convective atmospheric heat transfer results in an adiabatic lapse rate but conductive heat transfer will gradually zero out the lapse rate. Is that a correct assessment of your view? If so, my question to you is:
In your opinion, what is the fundamental difference between convection and conduction that causes this difference in lapse rate results? For convection, heat rises as hot gas molecules move upward in altitude. For conduction, heat rises as molecules come in contact with each other and exchange energy, passing the heat energy upwards in altitude through a chain of molecules. I don’t see any fundamental difference between these two heat transfer mechanisms as they relate to the lapse rate.
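For reference, the adiabatic lapse rate under discussion follows from Γ = g/cp. A quick sketch for both standard dry air and a pure nitrogen atmosphere (Earth's gravity is assumed for the hypothetical planet):

```python
R_GAS = 8.314   # J/(mol K), universal gas constant
G_EARTH = 9.81  # m/s^2, surface gravity assumed for the hypothetical planet

def dry_adiabatic_lapse_rate(cp_j_per_kg_k, g=G_EARTH):
    """Dry adiabatic lapse rate in K per km: Gamma = g / c_p."""
    return g / cp_j_per_kg_k * 1000.0

# c_p of a diatomic ideal gas like N2: (7/2) R per mole, divided by molar mass
cp_n2 = 3.5 * R_GAS / 0.0280  # ~1039 J/(kg K)
cp_air = 1004.0               # standard dry-air value

lapse_n2 = dry_adiabatic_lapse_rate(cp_n2)    # ~9.4 K/km
lapse_air = dry_adiabatic_lapse_rate(cp_air)  # ~9.8 K/km
print(lapse_n2, lapse_air)
```

Note that this formula only gives the steepest gradient convection permits; nothing in it forces the actual gradient to stay at that limit.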
Yes it does, but only when it gets compressed. Air parcels that go down get heated, but the point is that no air parcels will go down unless radiation maintains a situation where they move up and down (nothing will go down unless something else goes up).
I repeat: Heating from below will lead to convection. Then rising parcels cool and falling parcels warm. Without heating from below all that stops.
The fundamental difference between convection and conduction seems to be a very different point. That is the point that Fred Moolten was also uneasy about. Therefore it was the subject of a couple of my comments in that discussion.
We have two approaches. One is to use the second law to prove that it must be true, without explaining how that is realized at the micro level. That is done by describing how a perpetuum mobile of the second kind could be built unless the equilibrium were isothermal in the absence of an external heat flux through the gas column. That is the proof proposed by Quondam and made simpler in some of my later comments. From that proof we know what the correct answer is, but we don't know how it is realized in detail.
Quondam also provided this link
The second approach enters the micro physics, i.e., the kinetic gas theory with gravitation. There the difference is that conduction is totally due to the motion of individual molecules, while adiabatic convection is related to the motion of air parcels large enough that everything related to individual molecules can be neglected.
In the derivation of the adiabatic lapse rate, it's assumed that no molecules cross the parcel boundary. It's assumed that a thermodynamic equilibrium is maintained inside the parcel, but violated at the boundaries. When the parcel is big enough, this is a good approximation. I'm not going deeper into the derivation of the adiabatic lapse rate, as it can be read in many sources.
It’s much more difficult to find microscopic discussion of conduction with gravitation. This paper does some of it
but it doesn't do the full analysis, which may get involved, although I have some ideas about how it could actually be done without excessive mathematical complexity.
The main idea is clear and is formed by combining the following two effects and realizing that they compensate each other exactly:
1) When a molecule moves up, its vertical speed and kinetic energy decrease.
2) Particles that are slow are less likely to go up, and those with the least kinetic energy can move only very little up.
These two factors lead to lower density at the upper level. The natural thought is that the first point would lead to a lower temperature higher up, but checking the details shows that the second point does indeed compensate the effect completely, and the temperature is independent of altitude. I think that this is indeed a rather difficult point, but the effect is certainly there, and the first proof by the second law and the mathematics of Quondam's link show that it must lead to an exactly isothermal outcome.
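The exact compensation between the two effects can be checked numerically with a small collisionless Monte Carlo sketch (toy units with kT/m = 1 and g = 1; this is an illustration, not a full kinetic treatment). Molecules are launched from the ground with flux-weighted Maxwell–Boltzmann vertical speeds, followed ballistically, and sampled at a given height with a residence-time weight 1/v. The mean vertical kinetic energy should come out the same at every height, while the density falls off exponentially:

```python
import math
import random

random.seed(42)  # reproducible sketch

def sample_column(height, n=400_000):
    """Mean vertical KE and relative density at a height for a collisionless
    gas launched from the ground at temperature kT/m = 1, gravity g = 1."""
    kinetic_sum = 0.0
    weight_sum = 0.0
    for _ in range(n):
        # Flux-weighted Maxwell-Boltzmann launch: v0^2 ~ exponential(mean 2)
        v0_sq = -2.0 * math.log(1.0 - random.random())
        v_sq = v0_sq - 2.0 * height  # energy conservation under gravity
        if v_sq <= 0.0:
            continue                 # molecule never reaches this height
        v = math.sqrt(v_sq)
        w = 1.0 / v                  # residence time per unit height
        weight_sum += w
        kinetic_sum += w * 0.5 * v_sq
    return kinetic_sum / weight_sum, weight_sum / n

ke_low, dens_low = sample_column(0.2)
ke_high, dens_high = sample_column(1.0)
print(ke_low, ke_high)       # both near kT/2 = 0.5
print(dens_high / dens_low)  # near exp(-0.8), the barometric ratio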
Yes Pekka, as gas is compressed it heats.
When a parcel drops because a warmer parcel rises, the pressure on it increases as it falls, compressing and heating it. It is just the opposite of what happens to the rising parcel. If the rising parcel cools, the falling parcel will warm.
Still, you describe what happens when gas drops, but that is irrelevant when it doesn't drop. The whole point is that all vertical motion stops without radiative heat transfer in the atmosphere, i.e. without the greenhouse effect.
You describe things that are true in the presence of the greenhouse effect, but absent without the greenhouse effect. The laws of physics of convection are the same, but the physical situation is different. With greenhouse effect we have convection and the adiabatic lapse rate, without greenhouse effect the temperature gradient is reduced below the adiabatic lapse rate and the convection stops.
Convection is not a mechanism that maintains a large lapse rate; it is a mechanism that prevents the lapse rate from exceeding the adiabatic limit. It does not prevent smaller lapse rates, but stops when those are present, allowing the lapse rate to be reduced even more.
This is the common mistake in the literature, including the blog by Motl. All these writers miss the main point that the adiabatic lapse rate is the upper limit, not a rate that always results. Convection doesn't provide any lower limit for the rate of temperature decrease with altitude, or forbid increasing temperatures, as proven by the existence of the stratosphere.
Quondam’s proof is interesting. He uses the 2nd law of thermodynamics to prove that the gas column is isothermal. Yet his conclusion clearly violates the zeroth law if the gas column is adiabatically isolated and the adiabatic lapse rate in the column is not maintained.
Or are you saying that the adiabatic lapse rate is mis-named and the lapse rate has to be zero to be truly adiabatic?
The adiabatic lapse rate is the lapse rate that results from adiabatic convection when it is present, as it is in the tropospheres of the Earth atmosphere and the Venus atmosphere, at least at low and middle latitudes. In the polar winter, there is not enough convection even in the Earth troposphere to maintain the adiabatic lapse rate.
There is thus a good reason for the name “adiabatic lapse rate”, but it doesn't imply that this would be the equilibrium or maximum-entropy state in an atmosphere that is not heated from below by radiative heat transfer and cooled by radiative heat loss to space from the top layers of the troposphere.
The adiabatic lapse rate represents a stationary state of a system in (radiative) thermal contact with exterior systems, but not an equilibrium state of an isolated system. It's a stationary state only when there is a continuous convective net energy flux upwards.
I just wrote a related note and have it now available at
This note presents a mathematical derivation that shows how an isothermal atmosphere is possible with gravitation. We have the direct proofs from the second law, but they do not discuss the statistical thermodynamics or kinetic gas theory background that is described in this note. The derivation is mathematically simple, but I do not expect that it's as easy to understand its basic idea or to become convinced that this is a valid analysis.
Similarly, the mathematical derivation linked by Quondam is simple, but it's not as straightforward to know that all formulas used as its starting point are valid. These issues are complex enough to require much more knowledge for deciding whose derivations are valid and whose are not. After all, my entry into this thread was based on the claim that certain commonly presented arguments are not valid physics. I can provide my views on these issues, and I can satisfy myself that I know what I'm talking about, but somebody else has really learned something only when a reader of these arguments can herself understand why some arguments are valid and others are not.
If the nitrogen atmosphere on a hypothetical planet were to become isothermal, then, although the temperature gradient would be zero, the heat gradient (per mole of gas) would be positive as altitude increased. This would mean heat would be pooling at the top of the atmosphere. In addition to the extra heat energy, the nitrogen at the top would also have increased potential energy due to gravity. So despite the fact that there is a single uniform gas between the surface and the top of the atmosphere, the atmosphere’s isothermal characteristic means that an energy imbalance exists from top to bottom and the atmosphere is therefore not in thermal equilibrium.
Thank you for the link to your barometric derivation paper. I am reading it now and so far I have one observation. One of your assumptions is:
“The time differential is also taken to be so small that collisions can be left out of the consideration.”
With this assumption I believe you are analyzing only convective heat transfer, so I am somewhat surprised you do not end up with a derivation of the adiabatic lapse rate.
Thermodynamics tells us that the equilibrium is isothermal, not one of equal average energy at all altitudes. The main content of my note is to explain how this seemingly contradictory result is after all natural and understandable. The point is discussed also in the earlier thread and in the paper that I linked to in both threads earlier.
The assumption concerning collisions is that they do occur and maintain local thermal equilibrium, but the time span considered in the derivation is so short that the number of collisions is small enough for being ignored in that calculation. It can be argued that the collisions do not change the result at all, when the equilibrium is as described, but that would take more effort than I’m going to use now.
If, as you maintain, a nitrogen atmosphere must become isothermal to achieve thermal equilibrium, then do you agree or disagree that there will be much more heat energy contained in each mole of nitrogen gas at the top of the atmosphere than at the surface? If you agree, then one consequence of this is that the gas molecules at the top of the atmosphere are moving at a much higher average velocity than the gas molecules close to the surface of the planet (from the kinetic theory of gases). Furthermore, it appears you are saying that this is an equilibrium condition. I think your theory needs to explain what is preventing the molecular velocities from being gradually distributed randomly throughout the whole extent of the atmosphere.
Regarding your explanation for the meaning of adiabatic lapse rate, I don’t think you can say this:
“The adiabatic lapse rate represents a stationary state of a system in (radiative) thermal contact with exterior systems …”
The definition of the term “adiabatic” means that heat does not enter or leave the system, either via conduction or radiation.
No, the gas molecules are moving with the same average speed throughout the isothermal atmosphere. The additional energy of the molecules at the top is potential energy in the gravitational field. Average speed and temperature are directly related by the relation m&lt;v^2&gt; = 3kT, where the angular brackets refer to an average. This relation is valid at all altitudes.
The problem that many have had in accepting that the equilibrium can be isothermal is related to thinking that the molecules must be moving slower, as they have lost energy when moving up. Before your message nobody had proposed that they would be moving faster.
The resolution of the controversy is in the observation that the influence of gravity makes the upper atmosphere thinner, and that thinning is enough to allow the average speed to remain constant. Thus molecules of every speed are less likely at high altitude, but at equilibrium the ratio doesn't depend on velocity at any altitude. The combination of exponentially decreasing density and the Maxwell-Boltzmann distribution of velocities leads to a precisely isothermal result in a way consistent with the dynamics of individual molecules – and conduction is a phenomenon based on the dynamics of individual molecules.
Convection is different, when it occurs. It moves parcels of air, and in that all molecules of the parcel move together, including those with too little energy to move any further up on their own, and also those that are moving down at a particular moment, because their downward velocity relative to the parcel is larger than the upward velocity of the parcel as a whole.
There is a selection based on vertical velocity in the conductive process, but no such selection in the convective process. Therefore the parcel cools, and convection leads towards the adiabatic lapse rate, while conduction leads towards an isothermal atmosphere.
When convection is present, it dominates, but when it stops, we have only conduction (and radiative heat transfer, if that is present). When convection stops, it remains stopped in absence of heat flux going continuously upwards, which requires radiative heat loss from the top.
In the case of adiabatic lapse rate, adiabatic refers to the convective part of the whole process. The movement of air parcels is assumed to be adiabatic in the derivation of the adiabatic lapse rate. What happens at the bottom and at the top is not included in this consideration.
Neglecting conduction and radiation, a gas column with any temperature profile that is not cooling faster than the adiabatic lapse rate is stable, as it's stable against convection and everything else is assumed absent. Starting from a lapse rate higher than adiabatic, convection brings the lapse rate to adiabatic and stops at that point. We would have a stable atmosphere with the adiabatic lapse rate. It would not be the maximum entropy state, but there would be no mechanism to increase the entropy. In that imaginary world it would persist forever.
But conduction exists. It's so weak in the atmosphere that it's usually neglected in comparison with convection and radiative heat transfer, but in the absence of the others, conduction is enough to make the atmosphere isothermal.
Mixing by convection induced by spatial temperature differences at the surface might actually speed up the large-scale transition towards isothermal for a nitrogen atmosphere, while the surface temperature differences certainly also perpetuate some limited variations in the atmospheric temperatures. This combination of influences is likely because heat transfer from the hottest parts of the surface to the atmosphere is much more efficient than from the atmosphere to a cooler surface: the first case induces local convection, while the second stops it. Thus the atmosphere is heated effectively until it has reached a high enough temperature to slow down this mechanism.
Some low-altitude convection cells driven by the surface temperature differences will remain, but they'll not reach high altitudes, because the air there is already too warm for the convection to penetrate. I.e., the lapse rate is less than adiabatic above some rather low altitude, which is highest close to the point where the sun is at zenith, and at surface level at the poles and on the dark side, where the adiabatic lapse rate is totally absent if the days are long.
kuhnkat, Pekka et al.
For a quantitative rigorous thermodynamic development of the lapse rate, see:
Robert H. Essenhigh
Prediction of the Standard Atmosphere Profiles of Temperature, Pressure, and Density with Height for the Lower Atmosphere by Solution of the (S−S) Integral Equations of Transfer and Evaluation of the Potential for Profile Perturbation by Combustion Emissions
Energy Fuels, 2006, 20 (3), 1057-1067 • DOI: 10.1021/ef050276y
The abstract contains a reference to “a gray-body equivalent average for the effective radiation absorption coefficient, k, for the mixed thermal radiation-active gases at an effective (joint-mixture) concentration, p”. That makes me wonder what the value of the solution is, as it's not possible to describe the real atmosphere by a grey-body equivalent average.
In any case this calculation has no bearing on the issue discussed above as we have been discussing the case of totally transparent atmosphere, not a grey atmosphere, and as all the issues are specific to the totally transparent case.
“The resolution of the controversy is in the observation that the influence of gravity makes the upper atmosphere thinner, and that making thinner is enough to allow for the average speed to remain constant.”
Pekka – I don’t want to prolong the discussion in these columns, because it’s irrelevant to Earth’s atmosphere, but as you know, I’m reluctant to conclude that an adiabatic profile is not the equilibrium state for a non-emitting atmosphere coupled to a planetary surface, where equilibrium and entropy apply to the entire planet and not merely to an atmospheric column in isolation. Perhaps the discussion could continue via your website.
I would be more willing to accept the passage above that I quoted if it can be made quantitative. My thought experiment involved an arbitrarily chosen dividing line between an upper and lower portion. Over any small time element, the number of molecules crossing downward must equal those crossing upward, and their mean kinetic energies must also be equal. However, while the Boltzmann distribution describes the mean, we can't assume that in a gravitational field the energy distribution of the molecules that were traveling downward was equal to that of those that had traveled horizontally or upward from the same starting level. If that is true, it must be shown rather than assumed. An adiabatic profile might be expected if the downward molecules had a higher mean energy than those moving in other directions, and the upward-moving molecules a lower mean energy.
Regarding entropic considerations, I infer that an isothermal column involves greater potential energy than an adiabatic one. This should mean that the planet has to do more work to move the heat upward against gravity, thereby distributing the heat away from the distribution gravity would find for it. Does this not entail an entropy reduction for the planet outside of the atmosphere? Has this been considered in evaluating the maximum entropy state for an atmosphere coupled to the surface?
I’m not committed to a particular conclusion, but I remain agnostic about extrapolating conclusions regarding an “isolated” gas column to an atmosphere in equilibrium with the rest of a planet.
This discussion is indeed mostly off-topic, and the indenting of this site makes following it or participating in it cumbersome. Therefore I copied two of my above messages to my own site to the thread “Random topics”. Those who are interested are welcome to continue there.
I simply describe reality. You seem to pick and choose what you include in your thought experiments. In your world there is no viscosity, no rough surface, no rotation, no conductivity, no radiation from collisions by anything but GHGs, no solar wind, no currents, no moon and tidal effects… You ignore that for the gas to have gotten to altitude there had to be some process, which you don't consider, to get it there. Sorry, the simple-minded models do not represent our earth, and even without GHGs your simplified processes do not either.
The only reality is with GHG’s, but nothing in your list has any influence on the basic validity of my arguments. Viscosity is a part of the theory that I’m describing. Some other details do cause minor disturbances, but I have already mentioned that repeatedly. I have also referred to the more substantial comments of Tomas Milanovic and explained why I don’t believe that even they change the results essentially.
A totally isothermal atmosphere can exist only in isolation, but a fully transparent atmosphere can be close to isothermal even with some disturbances.
Sorry Pekka, isothermal transparent atmosphere fails the smell test.
Warming of the ground/water transmits heat by conduction to the gas near the ground/water, which must convect, and the atmosphere in a gravitational field must become less dense with altitude.
I wonder about your suggestion that nitrogen is completely transparent. NOTHING is completely transparent AFAIK except vacuum, and that is still being argued. It isn’t even transparent under atmospheric conditions.
Unfortunately even with this transparent gas the heating is also from the top unless you also wish to postulate no UV from the sun.
I will suggest that thought experiments are fine and all, but so many of them ignore real physical effects that they become pointless.
Then there are the northern lights. Even before gases glow they are being affected by current flow. Ignoring all these disparate effects may make it easier to compute what you THINK is happening, but it leaves us with the wrong impression and confusion when our computations do not match reality.
You have persuaded me that your argument has merit. However, I think that Fred Moolten’s arguments and thought experiment are persuasive as well. I am going to take some time to study the papers you and others have linked to.
Thank you for the interesting discussion.
You are right that even N2 is not fully transparent, and you are also right that it’s less transparent to UV. Thus an all-nitrogen atmosphere wouldn’t actually be exactly isothermal, but warmer at the top, like Earth’s stratosphere (it would also have deviations close to the surface due to differing surface temperatures).
What is transparent enough to be considered fully transparent in this context is determined by comparing thermal conductivity with absorptivity/emissivity. For a real analysis, both the conductivity and the very weak radiative heat transfer should be taken into account. As you noted, the largest difference comes from UV heating of the atmosphere, which must be balanced by heat conduction downwards to the surface.
The reason I entered this thread was, however, specifically to point out that Motl’s blog repeated a common misunderstanding of the role of convection and of why the adiabatic lapse rate is so prevalent in planetary atmospheres. In these real planetary cases we see a strong continuous convective flux of energy through the troposphere, and this can persist only as long as it’s driven by radiative heat transfer cooling the top of the troposphere and heating the bottom by a combination of the greenhouse effect and solar radiation.
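For reference, the dry adiabatic lapse rate that such convection drives the troposphere toward is simply g/c_p, a textbook result sketched here with standard Earth values:

```python
# A rising parcel does work against gravity and cools at the dry adiabatic
# lapse rate Gamma = g / c_p, regardless of what drives the convection.
g = 9.81        # m s^-2, surface gravity of Earth
c_p = 1004.0    # J kg^-1 K^-1, specific heat of dry air at constant pressure

gamma = g / c_p                      # K per metre
print(f"{gamma * 1000:.2f} K/km")    # roughly 9.8 K/km for dry air
```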
There is no greenhouse effect on the moon, as there is no atmosphere there.
How can you determine the correctness of the Science of Doom calculations if you don’t have the math and physics chops to check them yourself?
And for the HAHAHAHAHAHAHAHAHA
do grow up,
I shan’t be responding to you again.
You are right, there is no greenhouse effect on the moon. There is also no greenhouse effect on the earth, if you want to get technical. On the earth we are calling radiative effects “greenhouse”, whereas an actual greenhouse works by limiting convection.
On the moon, the heating and cooling of the soil and rocks provide an interesting effect similar to a greenhouse. The typical thought experiment, which uses only radiative calculations, computes a higher day temperature and essentially no night temperature. When we take into account the energy conducted deeper into the soil, then conducted back out and radiated, we see that actual moon temperatures do NOT get as high during the day as the pure radiative computations suggest and do NOT get as cold at night.
This is similar to what we call the greenhouse effect here on earth: a moderation of the temperature extremes. Of course, here on earth the modelers do NOT take this KNOWN flux into account in the models, so extra forcing is attributed to CO2 or other components to cover this lack. One of many small fluxes not accounted for in the models.
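The moderation effect described above can be illustrated with a toy lumped-heat-capacity model (all parameter values here are illustrative round numbers, not measured lunar properties): a surface with thermal mass ends up cooler than the instantaneous radiative-equilibrium value at noon and warmer than it at night.

```python
import math

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1200.0       # absorbed noon solar flux, W m^-2 (illustrative)
DAY = 2.55e6      # lunar synodic day, s
C = 1.0e6         # areal heat capacity of the regolith, J m^-2 K^-1 (illustrative)

def absorbed(t):
    """Absorbed flux over one day-night cycle: positive half-sine, zero at night."""
    s = math.sin(2 * math.pi * t / DAY)
    return S0 * max(s, 0.0)

# Integrate C dT/dt = absorbed(t) - sigma*T^4 with a simple Euler step,
# running three cycles so the arbitrary initial temperature is forgotten.
dt, T = 200.0, 250.0
history = []
for i in range(int(3 * DAY / dt)):
    t = i * dt
    T += dt * (absorbed(t) - SIGMA * T**4) / C
    if t >= 2 * DAY:                 # keep only the last, settled cycle
        history.append(T)

T_eq_max = (S0 / SIGMA) ** 0.25      # noon temperature with zero heat capacity
print(f"radiative-only noon peak: {T_eq_max:.0f} K")
print(f"with thermal mass: max {max(history):.0f} K, min {min(history):.0f} K")
```

With no heat capacity the surface would swing between roughly 381 K at noon and essentially nothing at night; the stored heat clips the daytime peak and props up the night-side minimum, which is the moderation the comment describes.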
SOD of course found an error affecting the magnitude of the work showing this. He then claimed the effect was disproven. Well, the magnitude was disproven, but the effect still exists, even if it is much smaller. Whether CO2 gives 3 °C per doubling or only 0.05 °C per doubling, it would still be a legitimate effect.
Most of these “proofs” that the greenhouse effect is impossible are based on mixing gross heat flow and net heat flow at some point. I.e., they start with the correct argument that the net heat flow is always from the hot to the cold, but then they apply it separately to one component, the gross heat flow from the colder to the hotter, which can of course be positive without contradicting the second law as long as the gross flow from hot to cold is larger. Postma is making just that error.
Classical thermodynamics discusses only the net heat transfer, but that doesn’t prevent us from going deeper and looking at both directions separately, which is easy and natural in the case of radiative heat transfer, but impractical in the case of conduction. Classical thermodynamics is a highly abstracted mathematical theory, which leaves many details out of consideration. That doesn’t make those details any less real and true.
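The gross-versus-net distinction is easy to make concrete with Stefan-Boltzmann exchange between two black surfaces (idealized plane-parallel geometry; the temperatures are merely illustrative):

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

T_hot, T_cold = 288.0, 255.0    # e.g. a surface and a colder layer, in K

gross_hot_to_cold = SIGMA * T_hot ** 4    # both gross flows are real and positive
gross_cold_to_hot = SIGMA * T_cold ** 4   # the colder body radiates too
net = gross_hot_to_cold - gross_cold_to_hot

print(f"hot -> cold (gross):      {gross_hot_to_cold:.1f} W/m^2")
print(f"cold -> hot (gross):      {gross_cold_to_hot:.1f} W/m^2")
print(f"net (always hot -> cold): {net:.1f} W/m^2")
```

The second law constrains only the net term to be positive; the gross flow from cold to hot violates nothing, which is exactly the point being made above.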
“that just” -> “just that”.
I would really appreciate the possibility of correcting comments for a few minutes after they have been posted like some discussion forums have.
Pekka – if you would verify all your heat flow from hot to cold and cold to hot, then your net comment would not need corrections.
IR radiation up and down can be measured separately, has been measured, and has been found to be in agreement with expectations based on standard theory.
Similar measurements are done in numerous industrial applications.
All of us can measure how IR radiation varies with temperature using cheap IR thermometers. They give useful results even when the surface whose temperature is being measured is somewhat colder than the device.
There just isn’t any credible place left for doubt on the validity of the standard theory of radiative heat transfer.
While not qualified to agree, I nevertheless agree wholeheartedly.
I apologize if this is a stupid question, but I still want to know if some of the long-wave radiation leaving the earth’s atmosphere makes its way all the way back to the sun?
Or does the earth conveniently not radiate in the direction of the sun!
JCH – In the absence of a mysterious new principle of physics, we can say that some of the Earth’s longwave IR radiation does reach the sun, keeping the sun slightly warmer than it would be otherwise. The magnitude of the effect is negligible.
Some of the IR radiating from Earth reaches the Sun, but the energy that the Sun receives from Earth is only about 50 trillionths (based on my own calculation from the temperatures of the Sun and Earth, the radius of the Earth’s orbit and the radius of the Earth) of the total energy emitted by the Sun. Thus all planets combined have a totally insignificant influence on the radiative energy balance of the Sun.
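For what it’s worth, the geometry is simple enough to check in a few lines. With standard textbook values (my own assumptions, which may differ from the commenter’s), the returned fraction comes out even smaller, at the 10⁻¹⁵ level, which only strengthens the conclusion of insignificance:

```python
import math

SIGMA = 5.67e-8                    # Stefan-Boltzmann constant, W m^-2 K^-4
T_SUN, R_SUN = 5778.0, 6.96e8      # K, m
T_EARTH, R_EARTH = 255.0, 6.371e6  # effective emission temperature (K), radius (m)
D = 1.496e11                       # Earth-Sun distance, m

P_sun = 4 * math.pi * R_SUN**2 * SIGMA * T_SUN**4        # total solar output
P_earth = 4 * math.pi * R_EARTH**2 * SIGMA * T_EARTH**4  # Earth's thermal emission

# Fraction of Earth's (isotropic) emission geometrically intercepted by the Sun:
intercepted = P_earth * (math.pi * R_SUN**2) / (4 * math.pi * D**2)

print(f"fraction of solar output returned by Earth: {intercepted / P_sun:.1e}")
```

Either way, the planets’ back-radiation is vanishingly small compared with what the Sun emits.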
I believe the original question was –
If there were no planets orbiting our sun, would the sun be its current temperature?
And it occurs to me that there’s another mechanism involved – With no planets, the shape of the Sun would change slightly due to lack of gravitational effects, thus affecting the total area and the radiation characteristics. And, I suspect, the internal processes.
All very minor effects (probably) with very minor changes in output.
But that’s the point: while small, the trace planets are providing a greenhouse effect. If mankind was emitting lots of new planets, eventually the heating of the sun could become significant. Without mitigation of our out-of-control planet creation activities, we might, I don’t know, melt the darn thing out of the sky.
Without mitigation of our out-of-control planet creation activities, we might, I don’t know, melt the darn thing out of the sky.
Not to worry, it’ll eventually melt itself out of the sky.
Have patience, but watch out for that last dying gasp. :-)
Lots of arm waving Pekka. Show us where his math is wrong.
Has it been six weeks since April Fools’ Day already?
To renew my objection now, as it was then, “At the page 16-17 break, Postma appears to explain perfect understanding of the Greenhouse Effect of water vapour comparing desert to rainforest at night.
Why doesn’t this occur to him by the page 1-2 split?”
Special pleading that what applies to water vapor doesn’t apply to CO2, all other things being equal, is just plain illogical.
Can’t we admit when we’re rehashing mistakes uncorrected?
When you find those two special transitions in CO2 in the liveable temp range like H2O has, let us know.
Irrelevant, but also pointless.
Then you need to be more explicit in your complaints:
“Special pleadings that what applies to water vapor doesn’t apply to CO2, all other things being equal, is just plain illogical.”
H2O is one of the more special things in existence and special pleadings are all about its special properties. Isn’t that just special?
You need to read harder before posting reactionary nonsense as a reflex.
For more explicit details, follow the links provided above to the original.
Sorry Bart, I am ignorant. You better explain to me in detail what you are whining about. I simply am too dumb to see it.
“You need to read harder before posting reactionary nonsense as a reflex.”
The author of that needs to dismount his self-crafted pedestal.
In retrospect, I concur.
One need not read very hard at all before posting reactionary nonsense as a reflex.
Thank you for helping me see that more clearly.
Unless there are a great many more scandals, the science will make the same mistakes and keep going on the same track.
Already there is a great shifting of precipitation and cooling occurring that current science cannot explain.
Yet they keep pushing global warming as the cause until ALL credibility of scientists as experts is lost. This then WILL affect government grants.
Dear Dr Curry
This post highlights, to me, an important problem in the climate change debate. This very issue has been covered in the blogosphere and yet no one knows!
Starting from the Climategate inquiries’ failure to go into Phil Jones’s (CRU) refusal to provide the complete metadata required for his seminal publication Jones et al 1990, and Douglas Keenan’s efforts to look into this matter, both Bishop Hill and I confronted multiple rounds of obfuscation about journal policies and the larger issue of data availability in climate science.
The initial chain of inquiry, involving email exchanges with Nature magazine’s editors and Douglas Keenan, eventually led to this post: Data availability in climate science: The case of Jones et al 1990 and Nature. It was linked from Bishop Hill as well, who added details of other dimensions in his post: More on Nature’s data policy.
In January this year, a little over a year after Climategate, came the admission from Jones to Nature magazine that he would not be able to provide the supporting data and did not intend to do so. This occasioned the second detailed post: Data availability and consequences and climate science, where the Duke University cancer researcher Anil Potti’s case was examined at great length, with its immediate relevance to the climate debate on data availability.
This was followed by an editorial at Nature magazine, which took up the exact same issues considered in the previously linked post – data availability in climate science and genomics research. The saga of Anil Potti was revisited once again with this post: The code of Nature: making authors part with their programs, in the broader context of reproducibility of scientific research. This post was picked up by Steve McIntyre on his Climateaudit blog as well, and posted on WUWT. The blog Duke.Fact.Checker has been instrumental in providing an enormous wealth of detail on the Anil Potti case and bringing to light the numerous potential and actual conflicts of interest that led to Duke University handling the issue the way it did (the very reason why it did not ‘admit and correct mistakes’). The end result is damning. If you google “Duke.Fact.Checker” and spend about half an hour reading the posts tagged ‘Potti’, you’ll see how ugly the whole mess is.
Even in its latest editorial, Nature notes that Anil Potti, curiously enough, has managed to repair his online image because Internet searches throw up misleading positive results on the first page, even though pages and pages of text and miles of newsprint documenting the sorry Duke episode exist online. It seems Nature’s editor is seized with promoting open data access and research data availability, as his latest initiative with the Royal Society shows.
In short, the ‘skeptical blogs’ have covered this extraordinary event more than adequately, and the context of the Duke story in climate science issues is acknowledged even in leading journals, but there is absolutely no response, no reaction, and no feedback from the ‘mainstream climate community’. People like Darrel Ince, not to take away from his efforts, have really only followed along the same pathway, even hitting upon the very same landmarks noted in the above posts. The Potti episode has been an enormous confidence-shaker in bioinformatics research – a field that shares many similarities with climate science in statistical analysis, data complexity and high dimensionality. It has been an enormous confidence-shaker in the integrity of oncology research in recent years.
It is taking the lazy route in research to just follow temperature data and then generate a model with no concern for any planetary changes or precipitation pattern changes. It is simpler to generate a mathematical formula that may be tweaked than to look at all the areas that have influence on temperatures (and for only the last 150 years out of 4.5 billion). This does not include cloud cover, the fact that storms NEVER cross the equator, the planetary mechanics of rotation and motion, or the shape of this planet.
Shub, thanks much for this background info and your efforts on this issue.
Thanks for the kind words, Dr C. Just went off on a bit of a rant there. :)
The fact that, over a year after climategate, the same people who were exposed in climategate are doing exactly what they were described as doing in the e-mails should deeply disturb any climate scientist who is concerned about ethics.
When is a group of climate scientists going to stand up and be counted on this, in no uncertain terms?
The response of organizations like the AGU – hiring transparent hacks to ‘improve communications’ – is not working, and is not going to work.
The entire strategy of focusing on ‘improving communication’ is a fraudulent effort.
There is only one acceptable response:
To call out those who are fibbing, toss out their work and all work based on it, and to start over.
The rest is wasting time and tax payer money.
I completely agree.
Talk about the role of uncertainty in climate science just doesn’t cut it. All science is uncertain if people make up the results!
Major components of this ‘science’ consist of deliberate fraud – either at the time the work was done, or shortly afterwards, when those responsible discovered that they could dismiss even the most cogent criticism with waffle – and get away with it. Think for a moment about the clear language that the Oxford physics professor Jonathan Jones used in relation to Mann’s Nature graph. If only all his colleagues had joined him.
Nobody will take any notice until a number of big names give interviews in which they call foul in the most unambiguous way.
Thank you for the kind words.
My bet is our hostess will be involved with some of the inevitable calling out of the hypesters.
A decade or so ago I worked on my MPhil project at a teaching hospital associated with the University of Manchester, and just happened to be billeted in the room next to the resident PhD statistician. Nearly every day, medical staff involved in research would troop in, and I often overheard what was going on, as the door was usually left open.
The medical staff would consult the statistician right from the start, before the experimental design was finalised, and keep in contact as their research progressed.
Thinking about this in retrospect, I can see how useful it was in ensuring the quality of the research being carried out at the hospital. The thing is, though, the medics knew they weren’t trained scientists (still less statisticians), and realised it was in their interests to avail themselves of whatever help they could get in studies that almost always require statistical analysis.
It seems to me that what is required is a change in attitude of trained scientists whereby they become acutely aware that they are often neither competent statisticians nor software engineers. It should be accepted as a matter of course that experts need to be consulted and involved, and that there is no loss of kudos in doing so.
Academia needs to be able to fund consultative services and accord them a certain amount of clout within the purview of their particular fields. In the end, it would improve the conduct of research, and the confidence of the public in its outcomes.
“Rules” is rules!
Funny thing about “rules” they’re very often all about protecting the rule makers, not the public, or the profession, or…. etc., etc., etc.
Funny thing about BIG organizations, too, their first reaction is often to defend their work (and employees) against all comers by asserting the integrity and validity of the matter in question, rather than keeping their PR mouth shut and learning the truth. Very human, but not good for business or their reputation. What a waste!
Where oh where did “integrity” go? Everything today is about rules and laws and whatnot. Honor is dead! It’s just one, big, CYA world!
Good post, Judith. However, until the climate science funding agencies demand full disclosure, this will go nowhere. They have the leverage, but lack the interest. Universities will be of no help on this issue. To guard their reputation they will lie, cheat, and conceal in the most clever ways. I am not holding my breath!
I have an increasing feeling that western science has become complacent and corrupt. The BBC uncovered a major scandal regarding the fact that cancer researchers were cutting corners to save time and money by not checking their cell lines for contamination. The problem is that some contaminants grow faster than the desired cells, and gradually replace them. Research done on the result is totally meaningless.
The date of that program was 2007, and the scandal had already been brewing for some time, yet a quick Google search turns up further instances of the same problem 4 years later! As the program explained, once a cell line was found to be contaminated, numbers of published papers could become invalidated – which made researchers even less keen on getting the checks done!
These were people spending money painstakingly gathered through fundraising events!
As Shub Niggurath points out above, these awful scandals never seem to bite – people stay in post, and the general public barely remembers that there was a problem.
Yes, it is very corrupted, and it did not start with climate science. This is very difficult for many people to see through and accept. It was very difficult for me to accept when I realized it a decade ago. I was a believer and lover of science, and I accepted everything the established science claimed. Now I think it’s all cargo cult science.
There is no real science in the establishment. Science must be “contrarian” to be fruitful. All else is bureaucracy.
Just to be clear, I am not claiming there are no honest and uncorrupted scientists in the establishment, but that the system is bad. All the forcings and feedbacks are positive (dogma, suppression, corruption, bureaucracy…).
We have runaway science corruption.
My brother and my brother-in-law are both 7-year survivors of cancers that were once highly lethal. They were not saved by laetrile, or any other contrarian-science crap.
I think even more people could be saved if science were less bureaucratic and more open-minded.
I don’t think Edim said anything contrary to that. It is good news that your relatives were saved, but clearly more people could potentially have been saved if research money had not been wasted on experiments performed on the wrong kind of cells.
Science isn’t totally a cargo cult, but the real problem is that it is all but impossible to know which bits to believe and which are garbage. The truth – as in climate science – is going to be hidden really deeply.
And my father, a niece and, I think, a couple of other relatives DIED of cancer. It proves nothing about our current level of medical capability or the honesty/dishonesty in medical science.
Let’s treat the analogy as valid, leaving aside the concerns with that. It may provide a useful perspective. Here we have a study about chemotherapy which was apparently flawed, and needed greater transparency and maybe help from a statistician. The scientists involved reached incorrect conclusions, and took a long time to recognize that fact.
Suppose someone were to come forward and, on the basis of this, claim that cancer did not exist, but was a delusion dreamed up by “progressive” physicians to make money and expand their influence.
I think most of us would say that was absurd — to go from criticizing a particular study to discounting a massive amount of direct observations from the physical world. At best, that would be an extraordinary claim requiring extraordinary evidence.
Suppose after a massive factual beat-down, this lobby accepted that cancer existed, but stated there was no way to know if it was dangerous, and that even if it were, all the therapies were risky (true!) and expensive (true!). Therefore all cancer research should stop, and treatment should not even be considered. This fall-back position, again, would probably not be well received.
Suppose some well-meaning(?) folk tried to split the difference between the scientists and the screamers, and asserted that the existence of the screamers meant the cancer scientists had lost the trust of the public, and should win it back by opening their research to the screamers and refraining from any involvement in clinical medicine – which is evidence of a strong anti-cancer bias among supposedly dispassionate scientists.
I think we would need to hear a very good argument to think that such a trust-building exercise was either warranted or likely to be useful.
And exactly how does all that relate to the need for transparency?
Fake but accurate?
This is how it relates: increased transparency may or may not further the process of scientific discovery and speed the identification of errors. Feel free to make that case. Howsoever that may prove, the case for greater transparency doesn’t advance the larger case being made for paranoid disavowal of scientific facts as they threaten a particular right-wing worldview.
And you are free to make the case that greater transparency is not desirable.
Let us know how that works out for you.
I don’t really think that I do. If you want to change the standards by which science operates, standards that have been in place for a long time, and have produced a lot of success in terms of greater understanding of the physical world, then you need to make a positive case for that.
If I don’t, I don’t need to do anything, because I don’t need to change anything.
I’m friendly to the idea, so I should be easy to persuade. I await your case.
You might want to try looking up the requirements the EPA has to follow to set standards.
I really don’t have to persuade you. The precedent on rule-making and openness is quite clear.
It’s a very simple difference between “science as usual” and “policy making”. Real easy to get in over your head in the latter.
You’re comparing apples to oranges: how science is done, as compared to how the EPA designs regulations. Do you understand what a “precedent” is? In no case can two totally dissimilar things provide “precedents” for each other.
I get it: like parents who oppose the teaching of evolution, you feel that the science will eventually lead to social changes, and therefore we should regulate the science to make it nonthreatening to your concept of what society should be.
It doesn’t work that way, I’m afraid. Better to apply the rules of science to science, and the rules of policy-making to policy-making. If you want to borrow an idea of one process and apply to the other, feel free to make the argument, but it’s not an argument from “precedent.”
Let’s cut to the chase. Is it your position that publicly funded “science” does not need to be open and transparent?
I want to know a simple yes or no answer to that.
No apples. No oranges.
Just an honest answer for you to stand behind.
We’ll let the courts and Congress sort out what level of rigor the EPA needs to follow.
Robert doesn’t want you to mess with his science. He likes it the way it is – perfect, nothing more to learn, everyone just learn the facts from the IPCC. He’s quite conservative.
BTW, he only chews Trident gum because the consensus has settled it – 4 out of 5 dentists agreed.
And so the charade is officially over again for Robert.
Science is politics.
Because I recognize the politics that motivate most “skeptics,” this is somehow supposed to discredit me? How so?
I’d be happy to share with you the statistics on the association between conservative/libertarian politics and climate “skepticism.” It’s similar to the association between those fine Americans and “skepticism” about the president’s birthplace.
Facts are facts. ;)
Will you also share the stats available on the association between leftism/progressivism/liberalism/statism and Global Warming Belief? You can start with yourself.
Oh, sorry, Andrew, once you start raving about “statists,” you lose the game. Good luck with your fringe ideology, and I hope you find a better use for your dogma than trying to make science fit within it.
“Statists.” I laughed out loud. :)
Just because you declare I “lost” doesn’t mean I have.
Oh and thanks for not sharing the info you said you were happy to share. If you decide to actually share it in the future, I will look at it.
Are you still talking, Andrew? You know, the statists could be tracking you with their satellites RIGHT NOW. You might want to put on another layer of tinfoil, just in case.
I offered to share the figure with you and you started calling names and demanding an account of my political beliefs. That’s not a “yes, I would like to see the studies” response. But I guess you expect other people to be serious and helpful while you act childish and irrational.
My impression is that most skeptics are motivated by a love of the scientific process, and disgust at seeing it abused.
Politics generally doesn’t mix with science. I am generally somewhat on the left of politics – certainly pro Obama, but that doesn’t blind me to the fact that the Greens (and Obama) have made a huge mistake over carbon. I wish they had stayed with issues that really matter – like the destruction of rain forests.
Big corporations will find a way to make money whatever way we generate energy, but the poorest will really suffer if we insist that energy must be produced in the most expensive and unreliable way possible.
So to be skeptical of the idea that we are facing a planetary climate crisis is part of a right wing world view?
Please do tell.
FWIW, we on the skeptical side have had a lot of experience with much better trolling than you seem capable of.
But good luck and all of that,
Before everyone decides you are not worth the trouble of reading, please clarify this:
If increased transparency may or may not further the process of scientific discovery and speed the identification of error, do you also question whether an increase in opacity may or may not further the process of scientific discovery and the identification of error?
Ummmm. Robert, was that supposed to be an argument or are you really a skeptic trying to sabotage AGWers under a false flag?
Your analogy doesn’t really work. Cancer killed before anyone knew anything about it. Science did not start from the question, “is cancer a problem?”, it started from the need to solve a serious disease.
In the case of climate change, the so called evidence, seems to be little more than observations of background fluctuations, helped along by efforts to underestimate the natural variability of the climate (by hockey sticks or other means).
The analogy fails for two big reasons. First, as David Bailey points out, the supposed threat of climate change is itself based on questionable publications, primarily by political advocates. It is not, like cancer, something that is known to exist. It is not even based on scientific publications, but rather on assessments of these publications by advocacy groups like the IPCC and USGCRP.
But second, the “screamers”, as you call them, are not screaming about a single bad study, although the hockey stick gets a deserved amount of attention. The skeptics are pointing at a large body of scientific evidence against dangerous AGW. Moreover, they have been doing this for over 20 years. The only things recent are the Climategate scandal and the failure of the AGW political movement.
It’s also worth pointing out that the issue of the hockey stick isn’t limited to a “single bad study.” There are multiple faulty papers claiming to “get” a hockey stick. In fact, (I think) the hockey stick is still part of the “consensus.”
The damning thing about the hockey stick wasn’t that it was shoddy work. The damning thing is how much the shoddy work got promoted/defended.
“The analogy fails . . .”
Take it up with the OP. I’m just tracing out the further implications of the comparison.
I’m just tracing out the further implications of the comparison.
And you failed to do that.
That’s your assertion, Jim, but I’m afraid you’re wrong.
“Statistics are like a bikini; What is revealed is interesting; What is concealed is crucial.”
Slight correction. Add “topless” before “bikini”.
Watts and McIntyre covered this last year, making pretty much identical points.
Like them, you make far more of Ince’s ‘study’ than it objectively deserves, from any perspective; but he does emphasize the importance of Open Source software (as do I) so there are professional and personal implications for your commercial company, if you like this study as much as you say. And … a potential point of agreement. :-)
To be clear, Ince is using information from 1990 to 1994, about mostly clinical (medical) disciplines, and making an error in analogical reasoning to infer something about climate science. It’s hit and miss, this business of arguing from analogy; and it’s a clear miss. No actual analyses of or references to climate modelling processes are made, never mind current analysis. Apparently you think that because he writes from outside the climate science establishment, it’s good stuff. In some respects, you would sometimes do better to quote Peter Pan.
From him, it’s fine – he’s not a climate scientist, he has not read any climate science, and he is a progressive/free education advocate who was speaking with such goals in mind.
But from you – it is surprising that you could be so unfamiliar with current analyses and completely unable to do your own pragmatic evaluations of the relevance of information. You are content to compare apples to oranges, and pretend you are doing something more. Anyone familiar with science, basic logic, and critical analysis — no matter one’s perspective on climate change action and policymaking — should find this of zero interest.
“This is hugely worrying when you realise that just one error — just one — will usually invalidate a computer program”
Good grief. Do none of your denizens with statistical and math backgrounds recognize this remarkably inaccurate statement about defects in code, and responses to defects? Here, something IS wrong with him. Ince must have been reacting to something instead of using his noggin. Something is very wrong with this statement, as anyone (especially those in the relevant disciplines) should easily see.
“So, if you are publishing research articles that use computer programs, if you want to claim that you are engaging in science, the programs are in your possession and you will not release them then I would not regard you as a scientist; I would also regard any papers based on the software as null and void”
This ‘nullifies’ entire disciplines, including the disciplines of quite a few of your denizens, and you … so it’s a very dramatic statement. But I am not unsympathetic to the inference regarding your private company’s proprietary software and activity. You like Ince, so hopefully you get this point — in which case, we have this small area of agreement. :-)
Martha, not sure what you are talking about. I will comment on one thing: my private company’s proprietary software and activity. For starters, we don’t publish 90% of the scientific advances made by the company; those are for the benefit of our clients. Anything we publish is based on publicly available data or data sets that we make available. The services we provide to our clients have nothing to do with the IPCC or climate policy.
What Ince article are you looking at? The one that is the subject of this post is related to a 2006 publication on chemotherapy, not stuff from 1990-1994.
Sometimes your posts are entertaining and even interesting; this one is just incomprehensible and way off the mark.
Martha, are you saying that climate science research and climate scientists operate on exclusive rules that are unrelated to other forms of research (in this case, medical)? That is the point of the article. Certainly you can’t be so pharisaical as to think you can’t learn from other types of research.
And can you give one example where that statement about computer programs would nullify an entire discipline?
In reply to Brandon Shollenberger :
Wikipedia discusses the controversy but still presents the Hockey Stick as valid.
It is valid, as far as having numerous studies confirm its basic findings.
Certainly on the point that makes “skeptics” hate it so much — is the present period the hottest in recent history? — the further succession of even hotter times has strengthened the evidence there.
A lot of enthusiastic “skeptics” have lost a few teeth to Mann’s Hockey Stick, yet they hope to find refuge in the “Big Lie” of pretending to have refuted it.
Hey, maybe we could have another investigation of Mann. Thirty-fourth time’s a charm, right?
Having multiple studies get the same results as you got doesn’t make your results correct. This is especially true when those other studies have many issues of their own. It is even more true when every study which has gotten the same results does the exact same erroneous thing.
Your attitude is kind of ridiculous. Mann made a new hockey stick paper a decade after his original, and it was just as flawed. During discussions of it, even Gavin Schmidt admitted the criticisms raised by skeptics were right. At the point even RealClimate admits skeptics are right about something, you know there’s a serious problem.
There are no studies that show conclusively that the present temps are higher than the MWP. Ironically the “hide the decline” fiasco showed that the proxy temps are not accurate enough to make this determination. All we have are uncertainties (and cover ups of uncertainties). As with most climate science, the evidence is vague at best. It may have been warmer in the 1930’s.
No studies show conclusively that there was a MWP globally. The key word is “conclusively.” It’s hard to be conclusive about temperatures a thousand years ago.
As Brandon reminds us, study after study has confirmed Mann’s Hockey Stick. But Brandon feels they are all flawed. Which opinion, when it is published along with evidence in a peer-reviewed journal, I will care about.
Until then, it’s just sour grapes on a comment thread. Oh, careful there: I see a new shiner. Watch out for that handle; it’s vicious.
study after study has confirmed Mann’s Hockey Stick. But Brandon feels they are all flawed.
AFAIK all the hockey sticks have been effectively discredited for good reason.
Can you cite any that did NOT use either Mann’s statistical techniques or the same dendro (or upside-down Tiljander) data that Mann used?
You DO realize, I trust, that the use of dendro data was invalidated by the several “hide the decline” incidents?
Or do you have “other” proxies that we have yet to be told about?
I believe that the following statements are true and acceptable to knowledgeable people on both sides of the controversy:
– Mann’s original analysis was not based on a well-defined, standard statistical method. It used methods of principal component analysis (PCA), but not in the standard way, whose properties are well understood.
– Standard PCA, taking only one component determined by agreement with the instrumental data, leads to the hockey-stick form even when the proxy data contain no signal of that type, as demonstrated by McI & McK.
– The modifications to the standard PCA method in Mann’s original paper gave its results some significance, but using a non-standard method means that the reliability of the results cannot be estimated even to the extent that it can be for the standard method.
– Later analysis with improved methods has always confirmed the hockey-stick shape from several hundred years ago to the most recent history, but the temperature of the MWP remains uncertain: it may be either close to present temperatures or significantly lower.
– It is known (and stated explicitly also by the scientists who have produced reconstructions) that most of the methods tend to give less variability for past temperatures than there actually was, but there are different views on the strength of this bias.
– The proxy data contain so much noise, and also systematic errors, that finding an unbiased signal and estimating its accuracy is extremely difficult. Scientists have tried to invent new methods to maximize their power in finding the signal, but this has also led to a worse understanding of the uncertainties, and even to systematic errors produced by the methods of analysis.
– Professional statisticians emphasize the value of well-analyzed standard methods and dislike the use of imaginative new methods with badly understood properties and a risk of bias. The view of other scientists is often the opposite: finding the signal is most important, even if that makes the estimation of reliability incomplete. Both views have merit. The methods often have valid power even when they are not fully understood, but the results of such methods must always be taken with a grain of salt.
There. Fixed it for you.
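The decentering point in the list above can be illustrated with a short Python sketch. This is a hedged illustration, not Mann’s actual code: it generates AR(1) red-noise “proxies” containing no common signal, then compares the leading principal component under full centering with the one obtained under “short centering” on only a recent calibration window (the decentering step McIntyre and McKitrick criticized). The parameter choices (50 proxies, 600 steps, phi = 0.95, 100-step calibration window) are illustrative assumptions.

```python
import numpy as np

def make_proxies(seed, n_series=50, n_steps=600, phi=0.95):
    # AR(1) red-noise "proxies" with no common climate signal
    rng = np.random.default_rng(seed)
    x = np.zeros((n_series, n_steps))
    eps = rng.standard_normal((n_series, n_steps))
    for t in range(1, n_steps):
        x[:, t] = phi * x[:, t - 1] + eps[:, t]
    return x

def pc1(data, cal=100, short_center=False):
    # Short centering subtracts only the calibration-period mean;
    # full centering subtracts the mean over the whole record.
    ref = data[:, -cal:] if short_center else data
    centered = data - ref.mean(axis=1, keepdims=True)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]  # leading principal component as a time series

def hockey_stick_index(pc, cal=100):
    # How far the "blade" (calibration period) departs from the "shaft"
    return abs(pc[-cal:].mean()) / pc[:-cal].std()

short = np.mean([hockey_stick_index(pc1(make_proxies(s), short_center=True))
                 for s in range(20)])
full = np.mean([hockey_stick_index(pc1(make_proxies(s), short_center=False))
                for s in range(20)])
print(short, full)  # short centering typically yields the larger index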
I have a few issues with these statements. They aren’t really significant, but for clarity:
Standard PCA, taking only one component determined by agreement with the instrumental data, leads to the hockey-stick form even when the proxy data contain no signal of that type, as demonstrated by McI & McK.
I’m not clear on what you mean by “when the proxy data contain no signal of that type.” If you mean when the overall data (combined in whatever sensible way you’d like) doesn’t have a hockey stick shape, then I agree. On the other hand, some portion of the proxy data must have that shape to create a hockey stick. This may just be unclear wording.
The modifications to the standard PCA method in Mann’s original paper gave its results some significance, but using a non-standard method means that the reliability of the results cannot be estimated even to the extent that it can be for the standard method.
It’s hard to tell what is meant by “some significance.” Ultimately, Mann’s work showed little more than a small subset of his data (bristlecones) contained a hockey stick. Even worse, Mann willfully hid the statistical significance results he calculated which showed his paper’s conclusions were unjustified. These are damning criticisms, but maybe there could still be “some significance” in his conclusions.
Later analysis with improved methods has always confirmed the hockey-stick shape from several hundred years ago to the most recent history
I don’t like the idea of calling it a “hockey stick shape” when all we’re really talking about is the blade, and a hockey stick requires a flat shaft. However, that’s a mostly irrelevant issue. My real question is this. Have the methods actually improved?
“You DO realize, I trust, that the use of dendro data was invalidated by the several “hide the decline” incidents? ”
I decline to participate in your hallucination.*
As I’ve said, when you’ve backed up your assertions with a strong scientific case that has survived peer review, drop me a line.
*h/t Scott Adams
Apparently you do not understand my point about uncertainty. There are many paleo studies, and most get results that differ significantly from the hockey stick. Most show dramatic oscillations, which is probably correct, and this contradicts the hockey stick. Moreover, these proxies all fail to agree with the 20th-century instrumental records, because the proxies decline. If we cannot tell what the MWP temp was, we certainly cannot confirm the hockey stick.
So if the temp oscillates and we do not know what the temp was, how is the hockey stick confirmed? It is not.
Note too that the above is a scientific argument.
Robert, do you care about the opinion both of Mann’s hockey sticks are faulty? That opinion has been published along with evidence in peer-reviewed journals.
If we both agree both of those are without merit, and the criticisms of them stemmed from skeptics on the blogosphere, we have good groundwork to build from.
“Conclusively” is a mighty big word.
There are also no studies which show conclusively that there was not a MWP globally.
However, there are many studies from all over the world, using different paleo-climate methodologies, which all point to a slightly warmer MWP than today.
In addition there are historical records from all over the civilized world at the time pointing to a slightly warmer climate.
And there is also physical evidence, such as signs of ancient vegetation and civilization under receding alpine glaciers, medieval farms buried in Greenland permafrost, etc.
IOW the evidence for a global MWP slightly warmer than today is overwhelming (even if not “conclusive” in your mind).
Hey, maybe we could have another investigation of Mann.
He admits it himself:
There is no reason to give them any data, in my opinion, and I think we do so at our own peril!
MM is so unethical – taking our pocket money and not delivering. Shameless pseudoscientist should be cut off from public funds immediately.
Robert, which skeptic lost to Mann? Most have been correct in their criticism.
William, Wikipedia is a wonderful site, but for controversial issues it takes the side of the majority of the contributors or the moderators. Besides, among those claiming it’s valid in the Wikipedia article is Mann et al.
Indeed. It’s sad how poorly it has handled controversy in global warming issues.
I think they do a pretty good job of covering the issue — but it’s a political and ideological controversy, not a scientific one, and “skeptics” are never going to be happy when a source is straight about that, whether the “controversy” is “Do vaccines cause autism?” or “Is the Earth 6,000 years old?” or “Is AGW real and serious?”
Your comparison is foolish. There is ample scientific evidence that AGW is very probably false. Are you not aware of the skeptical side of the science? It sounds like it, but how is this possible? Note that the issue is not who is right; you are making a preposterous claim about the science.
“There is ample scientific evidence that AGW is very probably false.”
OK, I see we have an advanced case here. Let’s start from the beginning: do you know what “science” is? And how it is different from really, really wishing something was true?
Let’s start from the beginning: do you know what “science” is?
Yes – David does, as do I – and many, if not most of those who comment here.
But apparently you wouldn’t know “science” if it bit you on the ankle. You’ve so far shown no knowledge of what it is, or where it came from, or how it operates. Your only weapon, apparently, is nonsense like “Do vaccines cause autism?” or “Is the Earth 6,000 years old?” or “Is AGW real and serious?”. And that’s kinda like taking a rubber knife to a gunfight.
As much as I’m sure you cherish your 2nd amendment rights, your guns will not help you win an argument. ;)
I’m glad you concede that doubt whether AGW is real and serious is “nonsense.” Or did you mean to admit that? Durn it. Better go clean ’em guns again.
As much as I’m sure you cherish your 2nd amendment rights, your guns will not help you win an argument.
Depends on the argument. And they have. :-)
Lucky you – you’ve obviously never needed them. But then, life is uncertain and you “could” get unlucky tomorrow. But having been there, I wouldn’t wish that on you.
My Ph.D. is in the logic of science and my research focus is the logic of complex issues. Climate science issues have been my case study for 18 years. How about you?
Yes, David, you have quite the resume: the Cato Institute, the Heartland Institute, the laughably misnamed “Greening Earth Society.” But none of them, I think, paid you for your scientific chops. Your degree is in philosophy, I believe?
To make a ludicrous assertion like “There is ample scientific evidence that AGW is very probably false” suggests that you have a problem with the concept of “science” or “evidence” or “false.” I’d still like to hear your answer to the question, as I think that is most likely where you have gone astray, your prestigious humanities degree notwithstanding.
My working definition of science is the mathematical explanation of nature based on observation. Yours?
I do cognitive science not humanities, including the mathematical modeling of science. Do you have anything intelligent to say or are you just wasting our time? How about participating in one of the technical threads here, to show that you actually know something.
Do you have anything intelligent to say or are you just wasting our time?
He’s just wasting time, David. His, yours, and (he thinks) mine. But he was wrong about that, too. :-)
Oh my, a new troll.
The barrel is getting scraped.
“This ‘nullifies’ entire disciplines, including the disciplines of quite a few of your denizens, and you … so it’s a very dramatic statement.”
I found it dramatic as well, but fortunately I remembered no one cares what Steven Mosher thinks is science. ;)
We ought to draw a clear and bright line between improving the process of science, as for example incorporating new standards for sharing and reviewing data in the computer age, and in making absolutist value judgments and being outraged when an institution resists change. Anyone with a good idea can and should advocate for the former; the latter requires a bit more preexisting credibility.
I found it dramatic as well, but fortunately I remembered no one cares what Steven Mosher thinks is science
Nor does anyone care what you think is science.
Some of what you’ve written here is nonsense and contrary to what science has been for the last several hundred years. You really should learn more about what it is before arguing about it.
making absolutist value judgments
There are people on both sides of the dance floor who do that – apparently, including you.
“Nor does anyone care what you think is science.”
Then it’s a good thing I didn’t try to tell anyone what qualifies as science and what doesn’t. Dodged a bullet there! I think the key to my success was NOT being an arrogant douche.
“Some of what you’ve written here is nonsense . . .”
What a spirited yet vague criticism. I am indeterminately chastised.
I think the key to my success was NOT being an arrogant douche.
Nope – you failed that test.
Jim, if there were such a test, I’m sure you’d be a highly sought-after consultant.
Since there isn’t, I’m afraid you’re just the guy who lost the argument and doesn’t realize it yet.
What argument – you haven’t said anything worth arguing about.
“I found it dramatic as well, but fortunately I remembered no one cares what Steven Mosher thinks is science.”
This is not about what I think is science. As I’ve noted from the beginning of my efforts in 2007, it’s about the work of Jon Claerbout. Claerbout and those who followed him noted a few disturbing things. Chief among them was that researchers could not reproduce their OWN results, much less the results of others, particularly in the sciences with large computational components. The insight, of course, is that a paper is not actually the science itself. A paper is a collection of words and numbers and figures. It advertises (we hope faithfully) the ACTUAL scientific behavior that the researcher undertook. Papers essentially say, “if you perform this behavior, you will witness these results.” That’s what a METHOD section describes. Now, how does one verify a method section? How does one verify that the published results actually came from the method described?
However you answer those questions is fine by me. I am not trying to convince you of my ideal for science. My argument is simply this: I am not rationally compelled to believe claims made by those who refuse to share code and data. I am not buying what they are selling. Since 2007 no one has been able to give an argument for why I should believe someone who refuses to share code and data. You are welcome to try.
The best evidence about what data was used is not a description of that data or a pointer to that data. The best evidence is the data ITSELF. The best evidence of what method was actually employed is not a verbal description of that code/algorithm, but rather the code itself. I see no rational basis for insisting that I settle for second-best evidence when the best evidence is readily available. This is not to say that I must disbelieve someone who fails to provide data and code. I have these choices:
1. Suspension of judgment
generally, being a methodological skeptic, I will suspend judgment. Convince me that I should not suspend judgment. Understand that your argument on this point will be weaker than the argument the paper is asking me to buy. Wrap your head around that.
Hang in there, Steven.
Basically everyone knows something is basically wrong with the entire global warming story. While the world economy is collapsing, world governments are still focusing on this non-problem.
Many people have concluded that our governments are completely out of control. But those that created this debacle are not going to accept responsibility now.
Some have even suggested that “Global Warming Fraud Creates Third World Food Crisis:”
It must be the requirement of publishing!
One single paper, or the work of one single research team, should never be taken as correct, but only as partial evidence, unless all steps are totally clear and fully verified, which in practice is not possible for a very large part of research. Being open and providing maximal information on the methods and inputs of the work helps, but even that is usually not enough.
The real confirmation of important results comes when they are applied or extended in ways that would fail if the original work were incorrect. If such successive work is never done, then the result cannot be that important. Unfortunately it may take years or decades before a practically (not absolutely) final verdict can be given.
Until then we are in the present problem: choices must be made based on incomplete knowledge. Then we must compare the costs and risks under great uncertainty, and the views and attitudes of people will have – rightly – a weight comparable to what the incomplete science tells us. This doesn’t mean that we should not try to make maximal use of the existing knowledge.
On the inclusion of statisticians as urged in the Wegman Report:
Wegman, Edward J., David W. Scott, and Yasmin H. Said. 2006. Ad Hoc Committee Report On The “Hockey Stick” Global Climate Reconstruction. July 11. http://republicans.energycommerce.house.gov/108/home/07142006_Wegman_Report.pdf
McIntyre, Steve. 2007. The Wegman and North (NAS) Reports for Newbies. Scientific Blog. Climate Audit. November 6. http://climateaudit.org/2007/11/06/the-wegman-and-north-reports-for-newbies/
DIFFERENCE BETWEEN GISTEMP AND HADCRUT3
Pat Frank has found the reason for the reduced early 20th-century warming of about 0.04 deg C per decade in GISTEMP compared to HadCRUT3.
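As a minimal illustration of what a trend difference of 0.04 deg C per decade means, here is a hedged Python sketch. It uses synthetic stand-in series, not the actual GISTEMP or HadCRUT3 data; the slopes, years, and noise level are illustrative assumptions.

```python
import numpy as np

def decadal_trend(years, temps):
    # Least-squares linear trend, converted from deg C/year to deg C/decade
    slope_per_year = np.polyfit(years, temps, 1)[0]
    return 10.0 * slope_per_year

# Synthetic stand-ins for the early-20th-century portions of two records
years = np.arange(1910, 1945)
rng = np.random.default_rng(0)
gistemp_like = 0.010 * (years - 1910) + 0.02 * rng.standard_normal(years.size)
hadcrut_like = 0.014 * (years - 1910) + 0.02 * rng.standard_normal(years.size)

diff = decadal_trend(years, gistemp_like) - decadal_trend(years, hadcrut_like)
print(round(diff, 3))  # on the order of -0.04 deg C per decade
```

With 35 annual values and small noise, the fitted trends recover the built-in slopes closely, so the difference lands near the -0.04 deg C per decade figure discussed above.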
To Steven Mosher,
Where he writes:
I have these choices:
1. Suspension of judgment
generally, being a methodological skeptic, I will suspend judgement.
Steve again uses the above philosophy to criticize me, claiming that I do not share data and codes etc. See also his other posts above.
Indeed, Mosher’s three choices are severely incomplete. There exists a fourth choice. This is to spend time and resources studying the issue and trying to actually reproduce the results.
In my specific case, Mosher should first read my paper, where he will find that my data are readily available (I used the ACRIM data from http://acrim.com and the CRU data from http://www.cru.uea.ac.uk/cru/data/temperature/hadcrut3sh.txt), and then study the wavelet algorithms, which are written up, with a lot of examples, in the book by Percival and Walden, “Wavelet Methods for Time Series Analysis”.
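For readers without the Percival and Walden book at hand, the flavor of the wavelet methods it covers can be sketched with a minimal Haar MODWT (maximal overlap discrete wavelet transform) in Python. This is only an illustrative sketch of the class of algorithms the book describes, applied to a synthetic series; it is not a reproduction of the paper’s actual analysis, and the wavelet choice, levels, and test series are assumptions.

```python
import numpy as np

def haar_modwt(x, levels):
    # Haar MODWT pyramid with circular boundary handling: at level j the
    # Haar filters are upsampled by 2**(j-1), so the detail (wavelet)
    # coefficients are half-differences and the smooth (scaling)
    # coefficients are half-sums at lag 2**(j-1).
    x = np.asarray(x, dtype=float)
    details, smooth = [], x
    for j in range(1, levels + 1):
        lagged = np.roll(smooth, 2 ** (j - 1))
        details.append((smooth - lagged) / 2.0)
        smooth = (smooth + lagged) / 2.0
    return details, smooth

# Synthetic stand-in for a temperature-anomaly series (not the CRU data)
t = np.arange(512)
series = (0.2 * np.sin(2 * np.pi * t / 128)
          + 0.05 * np.random.default_rng(1).standard_normal(512))

details, smooth = haar_modwt(series, levels=4)
# The MODWT partitions the energy (variance) of the series across scales
energy = sum(np.sum(d ** 2) for d in details) + np.sum(smooth ** 2)
print(np.isclose(energy, np.sum(series ** 2)))  # True
```

The slowest oscillations end up in the deepest smooth series, which is the kind of scale separation used when comparing solar and temperature variability at decadal scales.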
Unfortunately, Steve does not want to even take into consideration the fourth choice, and prefers to continuously attack me claiming that I do not want to share codes and data.
The fourth above choice is the right way to proceed in these cases. But it requires time and efforts which people do not want or may not want to spend.
I have written Dr. Curry an email explaining the details of the story about this topic and formally asked her to spend time, or at least involve one of her students, in the verification of the paper, and I offered to help. Her reply was: “I think you should make your data public as well as your code. Neither I nor my students have any particular interest in your data or code, but there are other people that have requested this information from you.”
Curry’s answer follows the same logic used by Steve: “I do not have the time or will to even read your paper and try to understand what you did and which data or methodologies you were using,” etc. However, improperly criticizing Duke University was easy.
But it is ok.
After all people cannot be forced to spend time and resources in something they are not really interested in. But they should also not harass people in something they are not interested in.
But now let us come to the further logical step.
If somebody does not have the capacity or the will to choose the fourth choice (that is, spending time and/or resources to address the issue), does it make sense to choose Steven’s preferred option and state that “suspension of judgment” is the best that can be done?
Now, the issue has been extensively debated in philosophy since ancient times. Fortunately for Steven, the conclusion of the ancient wise men was that the best choice among the other three is “Belief,” and not “Suspension of judgment”.
The reason for the above conclusion is simple. More than 99% of our entire knowledge is based on “belief” and “trust”, including the scientific knowledge scientists have. If we had to suspend our judgment every time something is not formally and experimentally proven to us directly, we would be trapped in a sad and terrible ignorance and darkness that would drive us crazy.
Let us explain this with a simple example.
I am sure that Steven has parents, a mom and a dad, as everybody does. Has Steven required a formal DNA test from them before giving them his love and respect as a son? Or has he “suspended his judgment” because he could not “believe” that those two persons are his parents unless scientific proof was properly given to him?
How sad Steven would be, truly living by his “methodological skepticism”!
Think of the surreal situation.
Steven accuses his parents on web-blogs because he wants scientific proof from them that they are his parents. His parents protest his pointless skepticism. Steven insists on his position because he is a “methodological skeptic”! His parents reply: “this is where you can find our blood; take it and pay for the DNA tests you wish.” Steven continues to insist that he will NEVER consider them his parents, claiming also that he does not want to spend time or money on the DNA test, or even on looking for where the blood is: it is his parents’ duty; they should provide the proper verification, confirmed by others, and pay for everything. Things continue in this way. Steven gets crazy and obsessed, etc.
Everybody can think about many other examples involving or not Steven and his methodological skeptical philosophy of life.
Now, about my papers: they passed the peer-review process and have not been disproved by anybody interested in properly analyzing them. My response to Benestad and Schmidt’s criticism of my past papers (in 2005 and 2006) is here
Curry knows many other details about that specific situation, as I wrote her. However, Curry herself said she is not interested in spending time on properly studying the issue.
Data are available on the web, and the codes are also on the web and written up in a textbook that everybody can buy on Amazon. Other technical details are explained in my paper. So, I do not know what more I can do.
Let us just hope that Steven does not get crazy with this story!
My response is in Webster’s Dictionary. It is available from Amazon. It should be no problem for you to select my words and assemble them in the correct order.
Do your homework.
I am very precise about where the data and codes are. And how to use the codes etc.
Moreover, the results of my paper have been confirmed by other independent peer-reviewed studies, some of which are also included in the IPCC 2007 report. Just read the references of my papers and the comments about them.
Just one recent study is this one:
Eichler, A., Olivier, S., Henderson, K., Laube, A., Beer, J., Papina, T., Gäggeler, H.W., Schwikowski, M., 2009. Temperature response in the Altai region lags solar forcing. Geophys. Res. Lett. 36, L01808, doi:10.1029/2008GL035930.
At the end of the paper the authors write:
Our results are in agreement with studies based on NH temperature reconstructions [Scafetta and West, 2007] revealing that only up to approximately 50% of the observed global warming in the last 100 years can be explained by the Sun.
Excellent. I appreciate you response. I mistook your statements to mean that you did not describe specific algorithms and how they were applied, and why. I assumed you meant that anybody who reviews your work should start at the beginning like you did and work through the details of your efforts on their own.
I see now that you actually meant that people can read you papers and verify the results using the same exact data and computer code. I’m glad you did not simply give the names of organizations with data sources and text books for folks to winnow out what they might need from them. I hate that old school teacher thing about “It’s in the book, look it up!”
Thank you Gary for having understood my logic.
It is very disturbing when people criticize you, and when you reply to them, “study the issue carefully and you will see that things are different from what you think,” the same critics answer that they have no intention, no time and no capacity for studying the issue carefully.
Their time and efforts are not meant to be used for “learning”, but only to give emphasis to their biases and prejudices, and for defaming others, knowing that they can get away with it because other people in their own environment behave the same way.
I have read 2 or 3 of your papers; with some I agree, with some I do not, but that is not the reason for my comment.
Climate (or temperature) change is a result of complex interaction between solar energy, atmosphere, land and the oceans. If these are considered globally then task is overwhelming. Going ‘local’ tends to reveal some of the ‘secrets’ of solar-terrestrial link as clearly shown here:
(1850-present appears to be very promising)
I am putting it all together and will be available on line soon.
In my opinion your findings are interesting.
I just believe that they need to be well presented and properly argued. You need to convince not just me, but people who may not have a good experience with data pattern recognition.
The CET/AMO case may be a fluke, but if PDO and ENSO are also subject to the same type of local forcing, then concentrating on some kind of single global factor is plainly counterproductive.
Not being a professional scientist, but an engineer with some experience in waveform processing, convincing anyone is not my aim; rather, I want to point at a possibly misguided belief that the number of hypotheses floating around may be the answer.
Once the article is completed it will be available online; then anyone who thinks the ‘findings are interesting’ can, if so inclined, pursue it further and show the irrelevance, or possibly confirm the viability, of the method used and the results so obtained.
I wish you success with your research.
Part of the problem is the history Steven McIntyre has had in trying to get data and code from the Hockey Team. If he pulls data from online, the Hockey Team claims he used the wrong data set. If he tries to reverse engineer their code based on the description in a paper, they claim he made a mistake. Time and again McIntyre has proven it was the Hockey Team who made the mistakes. They were simply trying to hide their mistakes by putting up roadblocks and hindrances to prevent McIntyre from reproducing their results.
In your case, I do not think you are trying to stonewall researchers. However, this response from you can be seen that way. Take, for example, the URLs you provided. The HadCRUT3 dataset is fine. For ACRIM, I have not looked, but isn’t it possible ACRIM keeps more than one dataset? People want to know they are looking at exactly the data you examined. Why not provide a URL to the exact dataset?
Regarding the code, you write “study the wavelet algorithms” in a book. I’m sorry, but disagreements could arise over which of these algorithms would be the correct choice in a given situation. If someone tried to replicate your work and could not, you could say they chose the wrong algorithm. That just wastes everyone’s time. On the other hand, if you identify which algorithms you used and why, it would advance learning and people would be able to follow your thinking. As it stands, you are actually standing in the way of your own ideas progressing.
Steve McIntyre leads by example. Whenever he does a calculation, he posts his code online in R. Anyone can follow his steps and will get the same results he got. I encourage you to set the same high standard for yourself. Don’t ever allow anyone to criticize you for lack of openness.
I tend to agree with you, BUT, I can download Steve’s code and data and run the program. I then still know NOTHING about why it gives the output it does or why the underlying physical effects work that way. Nicola, unfortunately, is working from the education aspect. He already did the work for them. He can’t give them understanding. He can only point them to what is needed to learn.
Of course that attitude does not move acceptance of his work in a hostile community which isn’t interested in learning and doesn’t want the results.
I understand what you’re saying. People still have to think through the code to understand the different operations and what the code is really doing. But at least McIntyre provides it for all to examine. If you don’t understand some of the terms in the code, you can look them up.
Ideally, a researcher would provide the code and references to explain why he chose to do things in a certain way. Understanding why a researcher chose one algorithm over another could be important. I think Scafetta wants to advance learning and I like that he is citing a book where the algorithms are explained, but it seems to me he is only going part way. I would still like to see him release the entire code. I cannot think of a reasonable reason not to.
Let me sum up this way. McIntyre is releasing the code and sometimes without references. In this instance, Scafetta has released references but not the code. I would like to see both (when necessary), but if only one – I would rather have the code.
Ron, please read Scafetta’s post explaining what he DID put into his paper. He apparently provided enough for any knowledgeable person to duplicate his work, and some have. Maybe Schmidt et al. aren’t as educated as they think. Maybe they should read the papers of those replicating the work. Claiming that he has NOT provided the necessary information to replicate the work is not correct either.
Bottom line is whether he provided enough information in his paper to replicate his work. Apparently he did. The fact that Gavin and friends couldn’t should only reflect on their lack of knowledge in that area and not on the author.
Ron, thanks for this comment. I agree that Scafetta is not trying to stonewall researchers, it mainly seems that he doesn’t want to be criticized by people that he views as lazy researchers unwilling to dig in and reproduce his work. You make the case effectively for what I think needs to be done.
Is Scafetta supplying all code and data, or is he not?
If the latter he, is indeed stonewalling. It’s that simple.
Did you read my paper, or not?
Which data and which code are used, and how they are used, is written there.
To understand how I used the algorithm and why I used it the way I did, you need to study the book. Did you study the book I indicated above?
If the answer to the above questions is “NO”, try to understand that your behavior is not appropriate in education. No teacher would tolerate such lazy behavior from a student.
Moreover, the results of my study have already been confirmed by several other studies. Just look on Google Scholar to see who references my studies. Again, read my papers, where I also add references confirming that my results are reasonable.
For example, one of the results of my old 2005 paper was that the 11-year solar cycle leaves a signature of about 0.1 °C on the global surface temperature record. This is what the IPCC 2007 report
page 674 says:
“A number of independent analyses have identified tropospheric changes that appear to be associated with the solar cycle (van Loon and Shea, 2000; Gleisner and Thejll, 2003; Haigh, 2003; White et al., 2003; Coughlin and Tung, 2004; Labitzke, 2004; Crooks and Gray, 2005), suggesting an overall warmer and moister troposphere during solar maximum. The peak-to-trough amplitude of the response to the solar cycle globally is estimated to be approximately 0.1 °C near the surface.”
So, the result is the same as found by many authors using different techniques. This is what confirms the robustness of my calculations.
Try to understand that this story that I am not clear about the data and the codes is nothing but a defamation attempt by certain people who wrote a paper against me filled with naive mathematical errors, as I have publicly proved. See here
and other web-blogs did the same. A reproduction of my response with the original figures is still here
So read the above web-page.
You know how much I respect your work. In a bygone era, your response would be appropriate. But the climate science debate has become poisoned by incompetence, lack of openness, and in some cases (in my opinion) outright fraud. There are a great many educated people watching the climate science debates now and there is a distinct lack of trust. I am talking about engineers, software developers, mathematicians, statisticians and others. They may not be able to follow the arguments between you and Gavin Schmidt. Or, they may think they are following the argument but arrive at the wrong conclusion. But if you were to make your code available exactly as you used it, then they could follow the arguments. To say “other peer-reviewed papers agree with me” is not enough. Those experts could be wrong. I refuse to think you are stonewalling because you are not confident of your results. But I do think your response is standing in the way of your ideas gaining acceptance. Why not release the code and prove Gavin and the rest wrong?
I think you should give more, explain more. I think you are trying, but saying that people need to study the book is not so helpful. Surely you didn’t use the entire book (for example, the copyright page, the preface, every exercise in section 10.10, etc.). Many publications cite many books (sometimes hundreds), but to claim you need to read each one to understand or validate the publication would be difficult.
I have to give you a mark of ‘incomplete’ on this.
Also, your analogy of parents is difficult too; typically one trusts one’s parents unconditionally. In this case, a doctor or accountant would be more appropriate, although some might argue that at least they have well-established professional affiliations that encourage ethics.
Teddy and Ron,
Unfortunately I need to repeat again and again that you need to first read my paper, where the precise data and algorithms are indicated.
As is overwhelmingly clear in my paper, I am using the Maximal Overlap Discrete Wavelet Transform (MODWT), which is extensively discussed in chapter 5 of the book. Of course, to understand chapter 5 you need to read chapters 1, 2, 3 and 4 first.
About Ron’s question, “Why not release the code and prove Gavin and the rest wrong?”: there is no need to release the code to prove Gavin wrong. Any person expert in time series decomposition analysis would understand Gavin’s errors.
Their errors are evident. Benestad was using a precompiled package for the “R” program with a function called “modwt”; that is the right function.
The manual for the R function is here; see page 16.
The function that Benestad used was
modwt(X, filter="la8", n.levels, boundary="periodic", fast=TRUE)
Benestad’s biggest error was the boundary flag he chose. He used
boundary="periodic"
Instead, he should have used the flag
boundary="reflection"
as anybody expert in the technique would have done. The error by Benestad and Schmidt is much bigger than a mountain!
By making that macroscopic error, Benestad and Schmidt concluded that when the Sun goes up, its effect on the temperature is a cooling, and on that basis claimed my papers are wrong!
Which only proves how incompetent those people are in mathematics and physics.
A reproduction of my response with the original figures is still here. Read it well, the things are evident.
Then there was an issue about the dates, which had to be about 1980 and 2002, as is evident in all the figures of my 2005 paper, plus a simple re-sampling of the temporal unit of the temperature data (from the monthly scale to a shorter one, to optimize the wavelet filter; this too was understood by some people commenting on the blogs), which was clearly explained in my paper.
That is all, really! So simple!
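The point about the boundary flag can be illustrated outside of R. Below is a minimal numpy sketch (an illustration of padding behavior only, not Scafetta’s or Benestad’s actual code) showing why periodic padding distorts a trending series, such as a warming temperature record, at its edges, while reflection padding does not:

```python
import numpy as np

# A trending series, like a temperature record with a steady warming trend.
x = np.linspace(0.0, 1.0, 64)

# Periodic (wrap) padding glues the series' warm end onto its cold start,
# creating a large artificial jump at the boundary.
periodic = np.pad(x, 8, mode="wrap")

# Reflection padding mirrors the series at its edges,
# so the padded extension stays continuous.
reflection = np.pad(x, 8, mode="reflect")

jump_periodic = np.max(np.abs(np.diff(periodic)))
jump_reflection = np.max(np.abs(np.diff(reflection)))
print(jump_periodic, jump_reflection)
```

The spurious discontinuity introduced by the periodic extension is what a wavelet filter then picks up as edge artifacts; the reflected extension has no such jump.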
You write: “Any person expert in time series decomposition analysis would understand Gavin’s errors.”
I cannot help but think this is an oversimplification. Gavin works on time series as part of his job. I don’t believe Gavin has admitted his errors; if he had, you could provide a link where he withdrew his objections.
The point I am trying to make is that your goal should not be to convert the experts, but to convert the non-experts. When all the facts are made available, the non-experts can grasp the debate. Then it doesn’t matter what the Gavins of the world say or do.
As I mentioned before, there are a great many highly educated people watching this debate. Many of them simply will not put in the time or effort if they think you are withholding information. In my opinion, it would take someone with the patience and skill of Steve McIntyre to delve into the debate to figure out who is right or wrong. Steve McIntyre is a rarity.
It is also my opinion that you do not think you are withholding anything. But again, it is the perceptions of other people that matter. My best advice to you is to release everything in the form they want so you take this issue off the table. It is a distraction that is slowing the progress of your ideas.
Nicola shows graphs explaining this error in detail in his link: http://climaterealists.com/index.php/forum/?id=3813
Even though the data is formally at ACRIM etc., I think it would help your communication to post a copy of that data, as you actually used it, at your website.
THE SCIENCE PROCESS
You seem to have missed my point. All I’m saying is this:
– If you have NOT supplied all your code and data as requested, then you ARE stonewalling (à la the Climategate crooks).
– But equally if you HAVE supplied all your code and data as requested, then you are NOT stonewalling.
So – which is it ?
(So as I hope you now see, I am not discussing your science per se, but rather your science process; the obvious background being that this is what Jones et al went to such trouble to sabotage.)
you and Mosh and a number of other people don’t seem to be understanding what Nicola has written. Let me try to help this understanding.
My mother is 83, does not have a high school diploma, and can play solitaire on a computer. Just how much time is Nicola supposed to take to ensure that she can not only find the code and data, download it, and run it, but UNDERSTAND IT WELL ENOUGH TO KNOW WHETHER IT DOES WHAT HE SAYS!!!
I believe Nicola is trying to make the point that it is pointless to give run-it-yourself details to someone who cannot understand why it was done the way he did it, or whether he is wrong; someone who cannot even understand the generic types of analysis he is doing. If you cannot understand it, what is the point of having it? You can only verify that it runs and comes up with the same results. That is pretty pointless, although it is a beginning of replication.
Your giving Nicola two choices does NOT cover the reasonable middle ground where he provides enough information that someone who is capable of actually EVALUATING the work can replicate it.
I also notice that most of those complaining about a Holy Perfection haven’t even attempted to replicate the work. That shows that they do NOT have the knowledge to evaluate it, so it would be pointless for them to try.
The important thing is that his work HAS been replicated, whether you, Mosh, and others are happy with the way he provided this important information or not.
A case of whine anyone??
It really is very simple. Scafetta has NOT produced all code and data as requested. Therefore he IS stonewalling.
No amount of wriggling by you and him can change or evade that. Your (and his) only response has been to try and justify his stonewalling, thus implicitly admitting it.
It very much fits the pattern of Jones’s “Why should I show you data when I know you’ll try and find something wrong with it”. IOW, prima facie evidence of fraud and/or incompetence.
What you fail to realize is that the burden of proof is on you. I have read your papers and claims. The first checks that I want to make are the basic QA checks.
1. That you actually used the data you cited. I have reason to doubt this. So, I request of you the same thing I requested of CRU: a copy of the data AS YOU USED IT. We all know that data sources change over time. We all know that copying data sources is not an error-free action. So, I request a copy of the data that you actually used. Not a pointer to the source you claim to have gotten it from, but the copy you used. This will allow me to make the most fundamental QA check.
2. That the figures you published were actually produced by the code you ran. Again, my experience, and the experience of others who write in the field of reproducible research, indicates that many researchers cannot reproduce the very results they published. Simply, when the code they have is rerun, the figures produced do not match those published.
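One cheap mechanism for the first check, offered here only as an illustration and not as anything the commenters proposed, is for the author to publish a cryptographic hash of the exact data file used; anyone can then verify that the copy they downloaded is byte-for-byte the same file:

```python
import hashlib

def file_sha256(path):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# If the digest the author publishes matches the digest of your local copy,
# you are analyzing the same bytes the author analyzed.
```

A published digest does not prove the analysis is right, but it removes the "wrong data set" excuse entirely.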
Sadly today journals do not perform such basic QA checks. Those of us who care about such things are left with 3 choices.
I hold no respect for peer review. Some people whom I do not know have read your papers and failed to ask for code and data. Why would I care what such sloppy thinkers think?
Speaking of sloppy thinkers and sloppy analogies:
“His parents that reply: “this is where you can find our blood, take it and pay for the DNA tests you wish.” ”
WRT my parents. If my parents told me where I could find their blood then I would surely ask them for fresh samples. Simply, if I doubted my parentage WHY would I trust the people I doubt to point me to the correct location?
I want their blood from their bodies. I want your data from your hands. chain of custody. I want your code that you claim produced the results.
Your behavior here and at CA indicates to me that you are not to be trusted. Why should I or anybody else spend our time investigating the work of a man whom we would fail as a student and fire as an employee?
Frankly, I’ve heard all your arguments before. I heard them from Mann, I heard them from Jones. You are in some fine company there.
try to do your homework first.
You need to use the data that I have indicated and the codes that I have indicated, use the instructions I have added in the paper with the mind of somebody who would like to understand things, and do all the rest as I have indicated.
If you fail to get the same results, then we can discuss. This is how things are done in science.
Your request is not appropriate.
For example, if I were an experimentalist who does a certain experiment in my lab, and I write a paper claiming that by doing the experiment in a determined way it is possible to obtain a certain result, other research groups that want to check my claim would not harass me by demanding to use my own lab, my own equipment and my own samples, and demanding also that I repeat the experiment under their own eyes.
What people do is create their own lab and, following the instructions written in my paper and using the known science implicit in the issue (in fact, not everything must be explicit in the paper; things evident to the experts in the field can be omitted), they try to replicate my result.
The cases of Mann and Jones are not comparable with mine because they revealed neither the data nor the details of the mathematical methods they used. So nobody could verify anything, because data and methodologies were hidden.
In my case I have indicated the data and the codes that need to be used very precisely. Then it is the duty of the critics to properly use the data and properly learn the techniques if they want to disprove my results.
This is how scientific verification works, dear Steven!
Your case kind of illustrates the need to define reasonable transparency. With your “as used” data and a description of the methods you used, it should be no problem to replicate your work (well, “no problem” is relative; “doable” is better). Requiring more steps to rebuild your databases and then write the code to replicate your results just needlessly complicates review. That may be your point: “unless it is your specialty, don’t try to review my work”.
I can see not releasing your code. It may be valuable; it is your intellectual property, and it belongs to you and your institution to do with as you wish. Your data is not protected unless it is a “unique” database, so you may wish to keep that secret for whatever reason. But if you want your results taken seriously, there should be some reasonable minimum standard of transparency that protects your intellectual property while allowing a reasonable expectation of replication of your results, if you are publishing into this climate mess, anyway.
Steven, be careful what you are saying. You claim that Scafetta is equivalent to Mann and Jones. Neither of them put enough information in the public view to replicate their work; Scafetta has. You are going overboard with your witch hunt. Becoming a fundamentalist will not help your cause.
The pattern here is the same as with McIntyre-Mann : an attempt to see and audit work (a cheap exercise), is stonewalled with the response that the would-be auditor should instead try and replicate it (an expensive exercise).
Could it be that the stonewallers bask in tax dollars as their day jobs, whereas the would-be auditors have only their spare time?
you and Mosh are being obtuse. Accusing Nicola of what Mann did is going way too far. You are slandering him. Let me try to explain, as you are obviously too lost in your holy war to drive out the evils of hiding data and code to see reality.
1) Mann did not make his data available or even tell which data set(s) were used; Nicola told everyone what data set(s) were used.
2) Mann used a new algorithm that was unknown, did not provide code, and did not provide enough of an explanation that someone could replicate it in a reasonable amount of time even with the right data; Nicola used a standard algorithm that is known and easily duplicated, specified that algorithm, and gave a reference explaining how to use it.
3) Mann offered absolutely no assistance to those trying to replicate his work, and he and the team misled and heaped abuse on those who tried; Nicola offered to help anyone who actually TRIED to replicate his work.
4) No one was able to replicate Mann’s work in a reasonable amount of time; several people have replicated Nicola’s work.
But hey, you have your religious war to fight so please continue to insult a man who is doing it right compared to many in the climate community.
Sorry Punksta, you are apparently delusional. The replications have already been done. You are weeks late and it didn’t cost you a dime that you didn’t waste yourself by coming on this and probably other sites to slander an honorable man who is simply trying to teach those able to learn.
Your dogged avoidance of Scafetta’s sabotaging of the science process (by refusing to provide his original code and data) is duly noted.
Yes, I agree Mann is more corrupt than Scafetta.
Fundamental sabotaging of the science process is only “nit-picking” in the eyes of those with pre-committed conclusions and an agenda that trumps that of science.
And regarding the issue of stonewalling any attempt at auditing (which is cheap) by telling the would-be auditor to instead do replication (which is expensive), I note that you remain silent.
Thank you for recognizing my efforts in trying to head off useless time wasting nit pickers!!
What science has failed to comprehend is the multiple tasks required in science to get accuracy.
Ever do a chart on angles of deflection on a rotating orb?
I did this to understand how individual energy interacts in power generation on a turbine blade (extremely poor). Many factors were included to understand exactly how energy was being harnessed and its interaction with neighboring molecules.
For planetary mechanics, there are multiple energies at different speeds from a rotating sun hitting a rotating, tilted planet, with different materials that deflect or slightly absorb and release energy, passing through gases of different densities that can also absorb or deflect some of that energy.
An extremely complex and highly interactive process. Then many other factors also have to be included in this frame, such as cloud cover density, wind movement, pressure differences, etc.
So rather than having an argument fest on right or wrong, we need to continue with science, and that WILL show what is correct and what is incorrect.
“So rather than having an argument fest on right or wrong, we need to continue with science, and that WILL show what is correct and what is incorrect.”
Alarmists are robbing our pocket money. Do we have the luxury of time to wait for a conclusion of right or wrong on your time scale, while letting them keep robbing us? No.
In Mathematica one can write a paper with live (executable) equations in it, including the graphics. There are no problems with different compilers, compiler settings, or hidden scripts. I love it. A paper could be published that, when downloaded, could be checked by any reader. Longer code could go in the supplementary info but still be easily checkable.
This may be an interesting technological improvement for the future.
However, the issue will remain the same: critics would still have to spend time opening the file, reading the paper, understanding the math used in the paper (which may require further study), pushing the button to make the program run, having the patience to look at the results with fairness, etc.
If the critics do not want to spend any time learning, no technological improvement will be able to compensate for their unwillingness.
I suggest you read Steven Mosher more closely and more thoughtfully.
On including statisticians: there are difficulties. 1) some statisticians are only interested in pure math type stuff (high theoretical content) because that is what they are rewarded for professionally. Many projects fail that test. 2) The ones who do consulting find lots of work from people with money, so you may be competing for their time. 3) Statisticians can be dense on certain topics and not understand the science, just like anyone else.
I have been successful in many cases getting statisticians interested in my work and joining me on a project without funding, but not always.
The attitude of journals that a paper finding a flaw in a previously published paper is not real science is simply astonishing and must change. I have also encountered this.
Craig, well stated. Many important climate problems that could benefit from collaboration with a statistician simply aren’t sexy enough to interest the statisticians. One solution is to substantially ramp up the education in statistics and data analysis for graduate students in climate research.
A new graduate level course… Climate Statistical Analysis?
An excellent suggestion, I believe!
But why grad-level?
Improving the mathematical standards of students in climate research might be a solution. But that should not be limited to statistics alone.
There exists the science of complexity, which aims directly at describing how complex systems evolve and which uses nonlinear phenomenological models to capture the essence of complex dynamics. This science is the key to understanding complex systems that cannot be efficiently modeled with analytical models starting from the fundamental equations of physics.
Physicists know these things already. Unfortunately, in geophysics departments students are told that using analytical models such as general circulation models is the only “scientific” way to study climate, without really evaluating the possible merits of those old-style engineering models against their actual errors, which are currently larger than the signals they want to interpret.
The science of complexity is a revolution in the philosophical way people address complex problems (top-down rather than bottom-up), but it is the only approach that can really work in geophysics, where phenomena are too complex to be handled with pure analytical models that simply ignore what is unknown, compensate for such ignorance by playing with the huge errors of some parameters, and end up producing just noise plus an upward trend.
Actually, the science of complexity and complex statistical analysis of time series is my field of expertise. May I be of help?
‘The science of complexity is a revolution in the philosophical way people address complex problems (top-down rather than bottom-up), but it is the only approach that can really work in geophysics…’
I like the way you refer to it as a philosophical revolution. It is still too little understood what the limitations of the maths are when it comes to complex systems. This is still limited to an increase in autocorrelation (slowing down) and noisy bifurcation (the catastrophe of René Thom or the dragon-kings of Sornette).
This Ghil et al paper – http://www.nonlin-processes-geophys.net/18/295/2011/npg-18-295-2011.html – uses Boolean delay equations. From Wikipedia – as ‘a novel type of semi-discrete dynamical systems, Boolean delay equations (BDEs) are models with Boolean-valued variables that evolve in continuous time. Since at the present time, most phenomena are too complex to be modeled by partial differential equations (as continuous infinite-dimensional systems), BDEs are intended as a (heuristic) first step on the challenging road to further understanding and modeling them. For instance, one can mention complex problems in fluid dynamics, climate dynamics, solid-earth geophysics, and many problems elsewhere in natural sciences where much of the discourse is still conceptual.’
I think this is a very humbling development in climate science – and a post from you would be most welcome.
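For readers curious what a Boolean delay equation looks like in practice, here is a toy discretization in Python. The delay values and the update rule are invented for illustration only; Ghil et al. work with genuinely continuous time, of which this fixed time grid is just a crude approximation:

```python
import numpy as np

dt = 0.01                         # time step of the discretization grid
T = 10.0                          # total simulated time
tau1, tau2 = 0.17, 0.43           # hypothetical delays (not from Ghil et al.)
n = int(T / dt)
d1, d2 = int(tau1 / dt), int(tau2 / dt)

# Two Boolean-valued variables evolving on the grid:
#   x(t) = y(t - tau1)
#   y(t) = x(t - tau2) XOR y(t - tau1)
x = np.zeros(n, dtype=bool)
y = np.zeros(n, dtype=bool)
x[:max(d1, d2)] = True            # initial history on [0, max delay)

for t in range(max(d1, d2), n):
    x[t] = y[t - d1]
    y[t] = x[t - d2] ^ y[t - d1]

print(x.sum(), y.sum())           # how often each variable was "on"
```

Even this crude sketch shows the characteristic BDE behavior: with incommensurate delays, simple Boolean rules generate long, irregular on/off sequences rather than settling into a short cycle.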
If climate research were a purely scientific field, I would agree with you 100%.
But it isn’t, as we both know.
Unfortunately, it is to a large extent a political field, in which there are many true scientists (like you) who do their work diligently and objectively, but several of the key players (Hansen, Mann, Trenberth, Jones, and others) already know the desired answer and make sure that’s what the “science” will show. This has been referred to as “agenda-driven science”.
One can only hope that newly trained climate scientists do not fall into the same trap, but the risk is great, since that is the way the field appears to be skewed.
The saying goes that statistics don’t lie, but liars use statistics.
I’d personally prefer that climate scientists do their work (calling in a statistical expert in the few cases where this might be advisable), that the data be presented openly and transparently, and that the statistical validity be checked, when this seems worthwhile and advisable, by independent auditors, such as Steve McIntyre, who do not have a “dog in the race”.
[Maybe I have become overly skeptical as a result of all the latest happenings, starting with all the questionable stuff in AR4, but I really think that climate scientists should stick to doing their work honestly, objectively and diligently and let some one else audit the statistical validity where this is deemed worthwhile and advisable, as was the case for the “hockey stick”.]
This is a reply to John Carpenter’s remarks
“No… I understand the 2nd law. Your comment was phrased in a way that looked as though you did not understand what thermodynamic law was being talked about, further you did not specify net heat flow… similar to the way Postma did not specify which thermodynamic law he was talking about… it appears to me you are fishing for an argument with “deniers”… some call it trolling.
Can you elaborate on this comment? I’m not sure what you mean here. How does the greenhouse effect influence whether there is more or less CO2 or for that matter deuterium?
That is a new one for me, I have never seen the 2nd law of thermodynamics used to disprove evolution, how does that work?
First comment: as to the net heat flow, I didn’t mention it because it is irrelevant. Using the second law of thermodynamics in an attempt to disprove the greenhouse effect has been tried and has failed too many times for me to bother refuting that argument anymore. Perhaps it is the trolls who keep trying to say there is no greenhouse effect.
Second comment: I would much rather leave it to your own research to answer the questions of why there is an excess of deuterium in Venus’s atmosphere and how all the CO2 got there in the first place.
Current thinking is that at one point the atmosphere of Venus was similar to the Earth’s, with a fair amount of water. There is almost no water left on Venus, and the excess deuterium is evidence that the water on Venus evaporated and was dissociated by UV radiation high in the atmosphere. The lighter hydrogen isotopes were then more likely to escape Venus’s gravity, resulting in the excess deuterium.
Do you want me to explain current theories of stellar evolution and the main sequence for the rest of the story or can you look it up yourself?
The creationists also use a flawed interpretation of the second law of thermodynamics to prove evolution false, and it goes something like this.
The second law can also be stated thus: there is no allowable process that decreases the entropy of a system without the addition of work. Or: entropy always increases, unless work or energy is added.
Entropy is commonly described as disorder, and the claim is that evolution violates the second law because it decreases disorder as more complex organisms evolve. That’s how the argument goes.
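The standard rebuttal to that argument is compact enough to state as an inequality. For a system exchanging energy with its surroundings, the second law constrains only the total entropy:

```latex
\Delta S_{\text{total}} \;=\; \Delta S_{\text{system}} + \Delta S_{\text{surroundings}} \;\ge\; 0
```

So \(\Delta S_{\text{system}} < 0\) (order increasing locally, as in evolution on a sun-driven Earth) is perfectly allowed, provided the surroundings gain at least as much entropy. This is the textbook statement of the law, added here for context; it is not an analysis of Sewell’s papers discussed below.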
If I can alter the understanding of one denier then I can put my lamp away.
“I would much rather leave this your own research to answer the questions of why there is an excess of deuterium in Venus’s atmosphere as well as how all the CO2 got there in the first place.”
Thanks, It appears you learned what you know from Wikipedia… it’s all good though.
“Do you want me to explain current theories of stellar evolution and the main sequence for the rest of the story or can you look it up yourself?”
No thanks, not what I was asking for.
As for the 2nd law and evolution, which I was curious about… I honestly never heard that one before… so thanks for that one, gave me a good chuckle.
Thanks for the replies and yes, put your lamp away, all the ‘deniers’ around here are getting blinded.
Jeez, maybe I had read of Hansen before Wikipedia came into being; can you give a guy the benefit of the doubt before accusing him of learning everything from Wikipedia?
Stellar evolution and the main sequence are important to explain how Venus came to the situation it is now in.
The sun brightened and boiled away the oceans, which dissolved the carbonaceous rocks and liberated some of the carbon in the form of CO2 gas.
Were you not asking where all the CO2 came from and where the excess deuterium came from?
“Were you not asking where all the CO2 came from and where the excess deuterium came from?”
Actually no… you were
My reply was:
“If I had to hazard a guess, you already have an answer and are chumming the water.”
was I wrong? :)
You then replied:
“If there wasn’t a greenhouse effect, there wouldn’t be all that CO2 in the Venusian atmosphere…”
Inferring that the greenhouse effect on Venus was responsible for “all that CO2”.
I asked for you to elaborate on that idea… but you only replied about deuterium, until you just now replied with:
“The sun brightened and boiled away the oceans, which dissolved the carbonaceous rocks and liberated some of the carbon in the form of CO2 gas.”
So whatever… the greenhouse effect apparently was not responsible for CO2 in the Venusian atmosphere… it was the sun getting brighter. Perhaps there’s a little confusion on my part in interpreting what you are saying, but now I totally understand.
No, you are still not getting anywhere.
Here is where the greenhouse effect comes into explaining the composition of the Venusian atmosphere.
First, the sun got brighter, putting more water vapor into the Venusian atmosphere, which caused the planet to heat even more due to the water vapor feedback. The enhanced hydrological cycle then weathered away more of the carbonaceous rocks, which liberated more CO2 into the atmosphere, which caused further heating.
Venus has high pressure due to the heat, not high heat due to the high pressure.
Isn’t this what you asked me?
“How does the greenhouse effect influence whether there is more or less CO2 or for that matter deuterium?”
“The sun brightened”? If you mean that a lower-temperature body such as Venus or the Earth radiated energy towards the Sun and brightened it, then that is total nonsense.
That’s not what I’m saying at all.
Check out any source discussing stellar evolution and the main sequence, or google the faint young sun paradox.
Instead of ad hominem attacks on “deniers”, re the 2nd law, may I encourage you to study and address Granville Sewell’s development of the second law equations in the Appendix to his 2005 textbook and 2001 paper.
Can “ANYTHING” Happen in an Open System? 2005
From Can ANYTHING Happen in an Open System? 2001
Sewell explores the tautology:
Particularly note his development of the boundary conditions:
What mathematical, physics, or logical issues do you have with Sewell’s development?
Sewell’s development underlies Scafetta’s findings on the influence of the sun and planets on earth’s climate, and also the work of those examining the impact of Forbush events on clouds and earth’s climate.
(If you are brave enough, you could then begin exploring the impact of Sewell’s theorems on the origin of life.)
IPCC News Note: On Friday the IPCC concluded its 33rd session, which was devoted in large part to reform issues. According to IISD press coverage “The decision on IPCC Process and Procedures addresses: the sources of data and literature; the handling of the full range of views; the quality of the review; the confidentiality of draft reports; the procedure for handling potential errors; and the evaluation of evidence and treatment of uncertainty.” But I have not yet seen the actual IPCC decision document(s). Perhaps a new thread on this, with links? IPCC also passed a “Conflict of Interest Policy.”
More info, including links to the IISD summaries of the meeting, is at
David, thanks for posting the IISD link …
The impression I have (from far too many hours of poking around the IPCC site) is that decision documents (along with official reports of a Session) may not be posted until a few months prior to the subsequent Session. Well, perhaps they are … but apart from any Opening Statements, I haven’t succeeded in finding any!
So, I found the IISD’s ENB Summary to be quite informative – and their analysis insightful. The Summary provided me with answers to some (fairly simple!) May 14 questions I had posed to the 2 media contacts listed on the IPCC’s May 13 Press Release.
Neither seemed to be able to answer; although one of the two did indicate that he’d asked for “experts” to reply back (presumably to him because I haven’t subsequently heard from anyone) within 48 hours.
So much for the IPCC’s new-found dedication to “transparency” and “rapid response” vis a vis “communications”.
Update: Sorry, I must have been mistaken. I received an E-mail a few hours ago from the “Deputy Secretary Inter-Governmental Panel on Climate Change (IPCC).”
The Deputy Secretary did not answer my questions, but he did indicate that:
The topic is:
Here’s a good example of a mistake.
In 1988, James E. Hansen made a (now famous) prediction of global warming, defining three scenarios of CO2 growth rate:
What happened actually?
The compounded annual growth rate (CAGR or exponential rate) was higher from 1988 to 2011 (around 0.48% per year) than from 1970 to 1990 (0.43% per year), so the closest scenario is scenario A (actual CO2 growth rates were a bit higher than those in scenario A).
This projection was made based on the GISS climate models, using the 2xCO2 climate sensitivity assumption of 3°C on average.
It called for an average increase of 0.34°C per decade. By 2011 the temperature should have warmed 0.78°C over the 1988 value.
The CO2 increase occurred as predicted.
But the warming did not.
The actual temperature increase over 1988 was 0.36°C, or less than half of Hansen’s prediction.
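The arithmetic behind this comparison can be checked in a few lines. This is only a sketch using the figures quoted in the comment above (the 0.34 °C/decade rate and the 0.36 °C observed change); it is not Hansen's model output:

```python
# Checking the arithmetic quoted above (a sketch; the rate and observed
# value are simply the numbers cited in the comment, not new data).
predicted_rate = 0.34            # °C per decade, as quoted
years = 2011 - 1988              # 23 years
predicted = predicted_rate * years / 10.0
observed = 0.36                  # °C warming over 1988, as quoted

print(round(predicted, 2))             # → 0.78
print(round(observed / predicted, 2))  # → 0.46, i.e. less than half
```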
So has Hansen admitted his mistake?
Has he corrected his estimate for 2xCO2 climate sensitivity accordingly (to 1.4°C)?
Instead he gets his NASA buddy, Gavin Schmidt, to write a nice double-talk rationalization on RealClimate. Schmidt fogs up the issue nicely, by referring to Hansen’s forecast with reduced CO2 growth rates (scenario B) as the most appropriate comparison.
Oh well. What did you expect?
You are wrong about the climate sensitivity Hansen used for his forecasts.
The model he used had a climate sensitivity of 4 for doubled CO2.
But that was in the Real Climate post you cited, what’s the matter, didn’t you read it?
You did not actually help your position on that one.
Does it matter in a post about admitting mistakes, that Max makes a comment that is directly contradicted by the post he cites?
That’s a rhetorical question if there ever was one.
And anyway, the predicted temperature is for equilibrium, which, in case you haven’t noticed, we haven’t reached yet. So his entire analysis is faulty.
No, Bob. Hansen’s temperature prediction was NOT “for equilibrium”.
It was for annual mean global temperature change.
Did you actually read the paper, Bob? If so, how could you make such a silly error?
Pay attention next time before you make snide remarks about others.
I was wrong about the temperature being for equilibrium,
But is the current global mean temperature within two standard deviations of any of the models results or within two standard deviations of the control run?
you apparently don’t understand the argument. It is over the climate sensitivity. Hansen claimed it was much higher, claiming that we would have higher temps from the extra GHGs because of the high climate sensitivity. It matters not at all that some outlier models come close to matching the satellite record by NOT having Hansen’s high sensitivity!! The models as advertised claim over 2°C per doubling of CO2. If we have less, there isn’t a problem. Heck, if there is only 2°C per doubling there isn’t a problem!!
When will you start settling for 3SD??
Sorry, bobdroege, I missed that in Hansen’s original paper (guess I got sidetracked where the paper refers to a “3°±1.5°C range”).
That does not change the fact that his forecast was off by more than 2X.
But he has not “corrected” his “mistake” yet.
It’s not a mistake if both the projection and actual events are within the experimental error or natural variability even though they differ a fair amount.
It was a fair first shot at something no one had ever attempted.
But wasn’t he dead on correct about the Northwest Passage?
I mean the “fabled northwest passage”
Gimme a break, Bob. Tell it to Roald Amundsen (or the many others who crossed this over the 20th century). But this has nothing to do with the topic at hand.
The subject here is “admitting and correcting mistakes”.
Hansen made a mistake in his 1988 prediction because he used false assumptions to get there.
He was RIGHT on the assumed CO2 increase (in fact the real increase was a bit higher than his “worst case” scenario A).
He was WRONG on his assumption of what the temperature impact of the increased GHG would be – by a factor of more than 2 to 1.
As you pointed out, he started off with a grossly exaggerated assumption for the 2xCO2 climate sensitivity.
Of course, what he wanted to do was frighten people with an absurdly exaggerated prediction (and it probably worked for some).
Has he now “admitted” his mistake and “corrected” the false assumptions that led to it?
Of course not. He is still using fear mongering to sell his story and he cannot admit that he made a big, bad boo-boo back in 1988.
Readers should be aware that Hansen and Hansen et al. (as well as Schmidt on the Real Climate post Max cites above) have explained why actual emissions followed Scenario B more closely than Scenario A. Max’s assertions that “…the closest scenario is scenario A” and “…his forecast was off by more than 2X” are false.
Sorry, Pat, Schmidt’s “explanation” is a canard.
Hansen’s scenario A forecasts slightly slower CO2 growth than what actually occurred, as I pointed out in detail.
If you don’t believe it, go back to the Mauna Loa record and check it out for yourself.
[But don’t rely on Gavin Schmidt to give you a correct answer.]
Readers should be aware that Pat Cassen has got this one wrong, probably because he relied on someone else’s word, rather than going back to the actual data; based on these data, the closest scenario was scenario A, and Hansen’s forecast was off by more than 2X.
But, hey, Hansen’s only human and humans make mistakes…
Max – I agree with you that the actual data must be consulted, and I agree with Schmidt and Hansen that it’s the forcing that counts. So use all the data to calculate the forcing of all components, and you will find (as Hansen did) that Scenario B produces the forcing history closest to reality.
FYI, Hansen scenario A assumes CO2 emissions will increase at same rates as prior to 1988, to wit:
I have already shown based on Mauna Loa data that the atmospheric CO2 concentration increased at a slightly higher CAGR after 1988 than that, which was “typical of the 1970s and 1980s”, but I have now also checked the emission rates as published by the USEIA “World Carbon Dioxide Emissions from the Consumption and Flaring of Fossil Fuels”
These also are greater than those Hansen assumed for his scenario A:
prior to 1988 = 1.5% increase per year CAGR = Hansen scenario A assumption
after 1988 = 1.85% increase per year CAGR
No matter how you slice it, Pat, the actual CO2 increase was slightly higher than Hansen’s scenario A assumption and not at all comparable to his much slower scenario B.
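For readers who want to replicate this kind of comparison, the compound annual growth rate is straightforward to compute from two endpoint values. A minimal sketch follows; the Mauna Loa endpoints below are approximate annual means used only for illustration, not the exact USEIA or observatory figures cited above:

```python
# Minimal CAGR sketch. The endpoint values are approximate annual means,
# used purely for illustration.
def cagr(start, end, years):
    """Compound annual growth rate between two values over `years` years."""
    return (end / start) ** (1.0 / years) - 1.0

# Approximate Mauna Loa annual mean CO2 (ppm): ~351 in 1988, ~392 in 2011
print(round(100 * cagr(351.0, 392.0, 2011 - 1988), 2))  # → 0.48 (% per year)
```

The same function applied to emission totals instead of concentrations reproduces the 1.5%-per-year versus 1.85%-per-year style comparison made in the comment above.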
Those are the facts, regardless of how Schmidt tries to distort them now.
Pat, I simply posted this as a good example of the topic at hand not occurring, i.e. “admitting and correcting mistakes”.
You are beating around the bush.
Sure, the temperature change forecast for Hansen’s scenario B (reduced CO2 emissions) turned out closer to the actual than that for scenario A (emissions at the same level as before).
But the emissions followed scenario A more closely (in fact, were even higher than scenario A, as I demonstrated based on the published data).
This is most likely because Hansen’s assumed climate sensitivity was off by a factor of more than 2x.
That was his “mistake”
Hansen believed GHG emissions would drop, which is why he considered Scenario B the most plausible. GHGs dropped.
Hansen says in this one sentence that the emissions typical of the 1970s and 1980s will continue, but doesn’t say what he means quantitatively by that: what is typical for the period? It’s not really possible to make reconstructions based on this information.
Instead he describes the concentration trends sufficiently to allow for a rather precise reconstruction, and the result is that scenario B is much closer than scenario A, which is totally off the mark. Presently even scenario B is above the observed concentration.
The paper is internally inconsistent or Hansen has used a very strange relationship between emissions and atmospheric concentration. For judging the other parts of the paper than this relationship, it’s clear that scenario B should be used rather than A.
Actually, Pekka, it is quite easy to determine what the “emissions typical of the 1970s and 1980s” were (because these are recorded). If you check the published record, you will see that they showed a CAGR of 1.5% per year (as Hansen stated correctly in his report).
After 1988 the rate increased slightly. Between 1988 and 2007 the CAGR was 1.75% per year. IOW, the actual CO2 growth was higher than that projected in Hansen’s scenario A.
Sorry, Pekka, but you are wrong again. As far as the increase in CO2 concentration is concerned, scenario A is closest to the actual development (in fact, the actual was slightly higher than scenario A, as pointed out above for the emission rate).
Agree that Hansen’s paper is “internally inconsistent”. But most of all, it was a grossly exaggerated prediction of warming, which did not materialize despite the CO2 levels rising even more rapidly than his assumption for scenario A. IOW it was a “mistake”, which should be “admitted” and then “corrected”.
Wrong, Pekka. You can’t compare the temperature trend for a scenario with decreasing CO2 growth rates with an actual temperature with exponentially increasing emissions (and concentrations). You have to compare “apples with apples”.
Hope this clears it up for you.
PS You are in the hole on this one, Pekka. My advice: stop digging.
You see a hole. I have not seen evidence that anybody else sees it. Is there a hole, or is it only in your imagination?
when I came to this blog it appeared that you were a relatively honest person who stuck by the science. This exchange over Hansen’s purposely inflated scenarios, used to scare the population, takes you out of that small group. I must now inspect everything you post where I am unknowledgeable for its warmer bias.
Pat – Thanks for the Hansen et al reference. Bringing up the Hansen 1988 model projections is guaranteed to perpetuate unending arguments in this blog. At this point, little can be said that doesn’t repeat comments made many times earlier. I do tend to agree with your assessment – when I read the references themselves, I reach a conclusion quite different from the one Max Manacker reaches above and below based on the selected parts he cites. I’m of course tempted to explain why, but that would do what I’m trying to discourage – perpetuate the argument with no prospect of an agreement.
My recommendation is for readers to visit the references you cite and any others they wish to form their own judgments.
Fred – Agreed. I will not pursue the matter.
You recommend that readers check Schmidt and Hansen to see whether or not Hansen’s 1988 forecast was right or wrong.
I have a much better suggestion:
They should check the actual published data, which I cited (Mauna Loa for atmospheric CO2 concentrations and the USEIA data for global CO2 emissions).
When you check these data, Fred, you will see that the actual CO2 emissions and atmospheric levels actually experienced after 1988 were slightly higher than those assumed for Hansen’s scenario A.
Yet the warming actually observed was less than half as much.
In a nutshell, Fred, this is the “mistake” which Hansen has neither “admitted” nor ““corrected” as yet.
Just the facts, Fred.
This is OK for readers who believe that Hansen and Schmidt are going to give them a fair and accurate analysis after the fact on whether Hansen’s 1988 forecast was good or bad.
For readers who may be a bit less naïve and a bit more rationally skeptical (in the scientific sense) my recommendation would be to:
– read Hansen’s original 1988 paper, checking out in detail his assumptions for scenarios A, B, and C
– check the Mauna Loa record to see which Hansen scenario came closest to the actually observed change in CO2 level
– check the USEIA data on global CO2 emissions to see which Hansen scenario came closest to the actually observed change in rate of CO2 emission
Once they have done this, and confirmed based on the actual data that scenario A is closest to what actually happened as far as CO2 is concerned:
– check to see how Hansen’s warming projection to 2011 compares with the actually observed warming
Then decide for themselves.
Sailed it, not crossed it, in one season.
Never happened until after Hansen predicted it would happen.
Hansen, that is.
Never happened until after Hansen predicted it would happen.
Not unless Hansen predicted it before 1969.
If you are truly interested in the history of NW Passage crossings, let me know (although this is a bit OT here).
There have been several prior to 1980.
NW Passage crossings prior to 1980:
In 1906, Norwegian explorer Roald Amundsen successfully completed a three-year traverse of the Northwest Passage in an ice-fortified ship. In 1924-1926 Amundsen made his second crossing in the three-masted schooner ‘Maud’.
The next crossing was in 1942 when the Canadian ship ‘St Roch’ and its captain Henry Larsen made the complete transit and the first from West to East. Larsen then turned round and went back through the other way.
In 1957, three United States Coast Guard Cutters, ‘Storis’, ‘Bramble’ and ‘SPAR’ became the first ships to cross the Northwest Passage along a deep draft route. They covered the 4,500 miles of semi-charted water in 64 days.
The first ship capable of carrying significant cargo to traverse the Passage was the ‘SS Manhattan’, a specially reinforced supertanker, in 1969.
In 1977 the Belgian sailor Willy de Roos and his steel ketch ‘Willywaw’ became the 3rd yacht to go through, largely single handed.
In 2007 the Northwest Passage was open during the summer months for the first time in recorded history, but it remains to be seen how stable this opening is. In 2009, two German ships, Beluga Fraternity and Beluga Foresight, completed the first commercial journey across the Northern Sea Route (or Northeast Passage) linking Busan to Rotterdam with several stopovers.
The consideration of arctic routes for commercial navigation purposes remains a very speculative endeavor, mainly for three reasons:
· First, it is highly uncertain to what extent the receding perennial ice cover is a confirmed trend or simply part of a long term climatic cycle.
· Second, there is very limited economic activity around the Arctic Circle, implying that shipping services crossing the Arctic have almost no opportunity to drop and pick-up cargo as they pass through. Thus, unlike other long distance commercial shipping routes there is limited revenue generation potential for shipping lines along the Arctic route, which forbids the emergence of transshipment hubs. This value proposition could improve if resources (oil and mining) around the Arctic are extracted in greater quantities.
· Third, the Arctic remains a frontier in terms of charting and building a navigation system, implying uncertainties and unreliability for navigation. This implies that substantial efforts have to be made to ensure that navigation can take place in a safe manner.
In view of all of the above maritime shipping companies are not yet considering seriously the commercial potential of the Arctic.
Interesting, but OT here.
“In 2007 the Northwest Passage was open during the summer months for the first time in recorded history”
From your post, this is exactly what Hansen predicted.
From your post, this is exactly what Hansen predicted.
That’s what’s called “cherrypicking”. You ignored all the evidence that it was untrue and used it to confirm your own bias.
The line you jumped on came from the link that max provided, which was apparently written by someone with no knowledge of the history. Happens all the time. But that does not excuse your “cherrypicking”.
Bob, the 2007 opening was based on ESA satellite imaging estimations for which the records only went back to about 1979 so the claim is not clear. I know when they say “recorded history” you tend to think hundreds of years but there is a great exaggeration on the part of many newspapers for climate change. They know what sells papers.
I was watching a recent documentary of a ship crossing the passage and they were stuck in ice most of the time. It’s still a very difficult crossing. For example, in 2009, of the 10 vessels that attempted the crossing, only 1 made it.
thank you for demonstrating why this blog post was written. The warmers simply will not admit any error.
There is no end to Noble Causes as Climate Change is but one. The problem arises when someone hands climate scientists a microphone; that’s when things go terribly wrong: exaggeration, self importance, and when challenged, denial, even fabrication. The issue degenerates into who’s had the microphone first, I said that I am right, you prove me wrong!
I agree with Pekka, partially: the validity of science, or more accurately science’s utility, comes when that bit of science is used later on to perform other work. If that building block is not valid, subsequent science based upon that idea is also not valid. My perspective differs from Pekka’s in the sense of urgency of having to act now without vetting the likelihood of projections by the present GCMs. Unlike JC, trying to develop more elaborate methods of assessing uncertainty, which require opinions by “experts”, I prefer waiting and continually investigating. No one has demonstrated that we have hit an iceberg. In fact, all the ships that stopped engines that night in 1912 to await the light of day were not only the survivors of the night, but the rescuers and heroes. Collectively, the science community had best ignore the blaring from the microphones, do one’s thing, and await the light of day to see which direction is perilous and which direction is more prudent. In my mind, the logic of Tomas Milanovic and the nonlinear temporal-spatial relationship of climate makes a full-speed-ahead decision-making attitude imprudent.
But isn’t the problem exactly that we’re going “full speed ahead” (or nearly so) with CO2 emissions?
For that to matter, CO2 emissions must be assumed to have more than a theoretical/wet-lab-bench impact. That also means that one should be able to detect a signal of CO2 in the climate now. As you are aware, such detection is unavailable. Therefore the impact of CO2 leading to a runaway climate is just one of many plausible hypotheses, with strengths and weaknesses. In my mind, the weaknesses far outweigh the plausibility.
Yes, and it is doing wonderful things for biomass on the planet!! I would point out that if the lower solar output translates to lower temps the extra CO2 will be important for keeping outputs from dropping drastically!!
The problem is that CO2 is being treated as a problem, when there’s no problem, dude!
Concerning urgency my view is not quite so simple:
Some, possibly rather weak, evidence on the risk of severe consequences is sufficient for looking at issue more.
Rather weak evidence is sufficient also as factor that could be taken decisive, when the alternatives are otherwise on equal footing.
The stronger actions are considered, the more evidence is required, and the requirement concerns all relevant issues:
– the risks of climate change and their likelihoods,
– the effectiveness of the proposed actions and uncertainties of the effectiveness,
– the costs and other risks of the proposed actions.
At this point some quantitative understanding of all these factors is required. A full benefit-cost analysis (BCA) is in most cases not achievable, but something on a semiquantitative level is needed. One obstacle for a full BCA is that much of the data that can be collected is difficult to value. Its valuation involves ethical problems both on the level affecting directly presently living people and on the level of intergenerational equity.
In my view the problem is that the various issues of the previous paragraph are finally decisive. Neither side of the controversy should be allowed to think that the decision follows automatically, when some proof of AGW has been obtained or shown to be lacking. We are living in a world where more complex judgment is required all the time for various issues. Often it’s done based on past experience, following normal practices of political decision-making. That works well enough when issues are familiar enough for the decision-makers, but the risks and mitigation measures of climate change don’t satisfy that requirement. Therefore supporting analysis is needed, but it must be comprehensive, not limited to climate science proper.
Thank you for your thoughtful reply. Like you, I also believe that a formal cost/benefit analysis is warranted yet not possible with the current state of knowledge. What I find so vexing is that I do not trust some members of the climate science community and particularly the “science” they report. By this I mean data that is adjusted or found to be weak, start dates used for drama’s sake, failure to acknowledge thought-provoking paradigms such as Tomas Milanovic’s, etc. Weakness is not acknowledged, let alone uncertainty. Certainly there have been no apologies, retractions, etc. So I have found climate science to be weak and its practitioners to be less than believable, particularly when they claim the stakes are so high. I am willing to live with a great deal of uncertainty. I am relatively intolerant of those messing with my head.
Speaking of owning up;
It’s only the third inning, many more to follow as they hit pension age.
ON ADMITTING AND CORRECTING MISTAKES
The increase in carbon emission has been exponential.
It has been much higher than Hansen’s annual growth, for scenario A, of 1.5%.
The actual growth rate from 1900 to 2007 had been 2.67% as shown in the following graph.
As a result, the prediction of Hansen 1988 to be compared with observation is scenario A.
He was completely, utterly wrong in his prediction as shown in the following result:
When is Hansen going to acknowledge the discrepancy between his prediction and observation?
Thanks, Girma, for graphs showing how bad Hansen’s 1988 forecasting mistake was.
I have been trying to explain this to bobdroege and Fred Moolten, but they are having a hard time understanding it.
Hopefully your graphs will help them (a picture is often worth 1,000 words).
Actually, the observed temperature matches the case for:
Scenario C drastically reduces trace gas growth between 1990 and 2000…
Which leads to the conclusion that carbon has no effect on the global mean temperature.
The observed temperature might go lower than scenario C in the near future. Much lower.
We know that there are multiple limitations to models. Significant and seemingly little understood are the chaotic aspects – irreducible imprecision in the words of James McWilliams. This is an imprecision that necessitates a qualitative judgement about solution behaviour. That is – there are large differences in possible solutions as a result of chaotic bifurcation between plausibly formulated models and a solution is selected from the range of possible solutions based on a priori assumptions.
‘Sensitive dependence and structural instability are humbling twin properties for chaotic dynamical systems, indicating limits about which kinds of questions are theoretically answerable. They echo other famous limitations on scientist’s expectations, namely the undecidability of some propositions within axiomatic mathematical systems (Gödel’s theorem) and the uncomputability of some algorithms due to excessive size of the calculation.’ McWilliams (2007)
Whether the models could produce a decadal and longer temperature decline as a result of a ‘plausible’ model formulation is not known – because we are not informed about any systematic exploration of the solution space. If they do not – they would seem to fail an a priori formulation test – because temperatures in the real world certainly can decline.
It seems very premature to me to be worrying about fine detail when the broad brush of model behaviour is so uncertain.
I was looking through my e-library yesterday and came across this.
‘El Niño Southern Oscillation (ENSO) is the dominant mode of climate variability in the Pacific, having socio-economic impacts on surrounding regions. ENSO exhibits significant modulation on decadal to inter-decadal time scales which is related to changes in its characteristics (onset, amplitude, frequency, propagation, and predictability). Some of these characteristics tend to be overlooked in ENSO studies, such as its asymmetry (the number and amplitude of warm and cold events are not equal) and the deviation of its statistics from those of the Gaussian distribution.’
ENSO’s non-stationary and non-Gaussian character: the role of climate shifts (2009) – J. Boucharel, B. Dewitte, B. Garel, and Y. du Penhoat
The paper is openly available at ‘Nonlinear Processes in Geophysics’. ENSO is indeed the major source of variability in global climate and hydrology on interannual to possibly millennial timescales. ENSO is well known to be associated with secular changes in cloud radiative forcing. Low level marine stratiform cloud in the eastern and central Pacific (from both surface and satellite observations) is negatively correlated with sea surface temperature – producing very large changes in cloud radiative forcing seen in both ERBE (edition 3) and ISCCP-FD data. Regardless of the accuracy of the satellite data – it seems again premature to neglect changes in cloud radiative forcing on at least decadal scales. This can only be dismissed by dismissing ERBE, ISCCP, HIRS and CERES data – all the relevant satellite data – as James Hansen does in his recent draft. I think it says more about James Hansen than the data.
ENSO is most certainly chaotic in the terms of chaos theory. That is – a system evolving over space and time with control variables and multiple feedbacks (dynamically complex and sensitive to changes in initial conditions) and exhibiting abrupt and non-linear change.
Dynamical complexity is indeed a fundamental property of climate resulting in the ‘internal climate variability’ which has become undeniable but is usually unspecified. There are control variables that theoretically include anthropogenic greenhouse gases – and abrupt and non-linear climate shifts that occur fairly frequently.
We are in a cool multi-decadal phase of the Pacific Interdecadal Oscillation (IPO). A recent study uses the PDO as an IPO proxy. ‘A negative tendency of the predicted PDO phase in the coming decade will enhance the rising trend in surface air-temperature (SAT) over east Asia and over the KOE region, and suppress it along the west coasts of North and South America and over the equatorial Pacific. This suppression will contribute to a slowing down of the global-mean SAT rise.’ Takashi Mochizuki et al 2011.
So no temperature increase for another decade at least? I suggest that if we want a broad solution to the carbon impasse we need to go past the unproductive dichotomies of the climate wars to reach a broad political agreement. This does involve giving up carbon taxes and cap and trade. No great loss, as these are mostly moot at any rate – and will certainly remain so as the global temperature continues to stubbornly refuse to rise.
There are risks from greenhouse gas emissions in a chaotic climate system. But much of what has been said about global warming is fundamentally in error and has been used in support of fringe cultural values. This has been quite unfortunate, as it undermines practical and pragmatic policy formulation for the rational goal of limiting the great atmospheric experiment.
I suppose everyone knows that Mann’s statistical techniques produce hockey stick shapes from red noise, according to McIntyre and McKitrick.
But do you know that the signal strength of the blade in these cases is on the order of 0.02°C, while the actual blade as described by Mann et al. is on the order of 0.6°C?
Does it matter?
You ask (concerning the “noise signal” in Mann’s comprehensively discredited and falsified “hockey stick”):
No. It doesn’t.
Mann’s HS is a “mistake” which (except for a few die-hards) has been “admitted”, and has also been “corrected” by several studies from all over the world, using different paleo-climate technologies, all showing a MWP that was slightly warmer than today.
It is dead and buried as a piece of junk science. Let it RIP.
No, I was talking about the noise signal in McIntyre and McKitrick’s almost totally discredited paper.
The hockey sticks they find using Mann’s method are at the noise level, about 0.02°C, whereas the hockey stick of Mann has a level of 0.6°C.
How about Loehle’s study, which showed that the Medieval Warm Period was slightly warmer than the last three decades of the 20th century, but not statistically significantly warmer? But the first decade of the 21st century was warmer still, so you can say that the Medieval Warm Period was not warmer than today, at least not statistically significantly warmer.
I’m not going to enter a long discussion with you re Mann’s HS. That’s a dead issue.
As far as Craig Loehle’s study is concerned, it did conclude that the MWP was marginally warmer than the current warm period, thereby raising serious questions concerning the controversial IPCC AR4 SPM claim (which was apparently still based on the old HS and some “spaghetti copies”):
To compare the MWP with today in the Loehle study you have to download the annual temperatures.
Then you have to put them on the same baseline basis as the modern temperature record.
I did this (last year), comparing the warmest 50-year period in Loehle’s study with the most recent 50-year period using HadCRUT and got the following comparison:
0.471°C year 857 to 906
0.311°C year 1960 to 2009
0.16°C MWP warmer than most recent period
As you can see, the warmest 50-year period in the MWP was only marginally warmer than the current warmest 50-year period, but it was warmer, as the Loehle study concluded.
Craig Loehle is a frequent visitor to this site, so you can ask him in person, whether I did the analysis correctly.
PS And there are around 30 studies from all over the world, using different paleoclimate methodologies, which have also concluded that the MWP was slightly warmer than the current warm period, so I think the conclusion is fairly robust.
That is a slightly different interpretation of the data than what I was referring to regarding Loehle.
Loehle had to publish a correction to his paper in E&E, adding the not statistically significant part to his statement that the three warmest decades of the medieval warm period were warmer than the last three decades of the 20th century.
Of course if you add the cool 60s into the average, you get a different answer.
But you are right that whether or not the Medieval Warm Period was warmer than today is irrelevant.
My point was that if you blow up a signal, you can always find any shape you want, and that is the basic criticism of Mann et al: that his statistical techniques produce hockey sticks from red noise. Which is true, but the blades are only 0.02 C in height, whereas the real blade is 0.6 C in height.
Bottom line is that it will continue to warm unless we start doing something about it, and even then it will be some time before the temperature signal stops increasing.
I would rather have an economy with no growth than no economy at all.
Why do you feel that a warmer world would be bad?
What is it that makes you believe that we NEED to do something about it, or that what you propose would have a beneficial effect worth the cost???
I won’t make statements or arguments – I’ll just ask questions. I’ll be interested in your answers.
Bottom line is that it will continue to warm unless we start doing something about it
Since you apparently believe that, what’s the scientific/factual basis for your belief? IOW – what evidence is there that a/ it will continue to warm? and b/ that it will stop if we do “something”? And – specifically, what is it you would have us do?
even then it will be some time before the temperature signal stops increasing.
Who told you that? And what evidence do they present to support that contention? Why do you believe them?
I would rather have an economy with no growth than no economy at all.
Many questions, but I’ll confine myself to these –
a/ What would produce your no economy at all scenario?
b/ What would produce your economy with no growth scenario?
C/ Do you understand that the end result of both is exactly the same, just on different time scales?
Specifically, the scientific basis is the infrared absorption properties of CO2, the radiative properties of the earth and sun, and the basic laws of thermodynamics.
In other words, CO2 absorbs the thermal radiation from the earth causing the earth to warm to maintain the heat in versus heat out balance.
The increase in CO2 takes time for the whole effect which includes various feedbacks.
We still need the energy and it will take time to replace all the coal burning power plants with alternatives, so we will continue to add CO2 to the atmosphere for some time.
We could also drive more efficient automobiles for a while until electrics are more feasible.
For the middle questions, it’s pretty much in the IPCC reports.
For the last questions, these projections I would make would be at least 500 years out.
What would the CO2 levels be if we burned every drop of oil, every scrap of coal, all the shale oil, and all the permafrost melted and released all the methane stored there and all the methane clathrates frozen on the sea floor became CO2 in the atmosphere.
That could easily melt all the ice caps, which could turn the polar regions into warm tropical areas. Which would put more water vapor into the atmosphere causing even more warmth.
Warmer and wetter is eventually uninhabitable for us mammals, as certain combinations of heat and humidity prevent us from shedding heat and remaining alive.
So we will all cluster at the poles till the very end.
It’s reasonable to try to limit the temperature increase to 2C.
So put another way- you have no evidence that a warmer world is worse overall for humanity, and you have nothing of consequence to support your belief that actions should be taken to lower CO2 emissions even if they are damaging economically.
We could also drive more efficient automobiles for a while until electrics are more feasible
OK – so everything down to this point is just standard talking points and “standard” science – some of which is questionable and/or uncertain, but I won’t argue that right now. The alternative energy thing raises the question of what you consider it to be: solar, wind, geo, nuke? But we’ll probably do that later anyway, so…
I do agree about the more efficient cars – but I’ll also add that many, if not most people fail to understand that those cars will be far more expensive, both in initial cost and in maintenance costs. And so, not nearly as accessible. And that the carbon footprint of the manufacturing process will skyrocket.
What would the CO2 levels be if we burned every drop of oil etc.
That’s been calculated here several times – and IIRC, it would be about 1000 ppmv. Given the temp increase over the last 50 years or so, it would seem that the sensitivity is not nearly what the IPCC claims and that temp doesn’t track CO2 as advertised. Therefore, the massive temp increases that you seem to expect are not likely. In fact, I would suggest that given the latest round of sensitivity estimates, the temp, even at 1000 ppmv might fail to reach your 2 degC. If we were to increase temp by only 2 DegC, the methane thing is a non-starter. And frankly, I suspect it is anyway. No guarantee on that, of course, but then there’s no guarantee on anything in life – including the expectation that anything we CAN do will have any effect at all. Remember that ALL the warming from 1850 to 1970 was accomplished without the CO2 that you seem to think is the culprit for the last 60 years of warming.
For the rest of your comment – someone’s been feeding you horror stories – but they’ve failed to give you the REAL horror story. Which is that the planet’s climate has repeatedly crashed in the past – and that it has NEVER crashed such that the temps increased, but rather always in the other direction – into an ice age. Which few of today’s population would likely survive. Given the choice, I’ll take warm rather than cold – even though I’m one of those who likely have the necessary knowledge, skills and gear to survive (for a while, anyway).
One more thing – your warmer, wetter, cluster around the poles scenario is not one I find credible for several reasons. One of which is that it ignores natural feedback mechanisms which, apparently your “story source” also fails to understand. And another of which is that it’s a long way from northern Canada – or Scandinavia – or Russia to the nearest Pole, which is surrounded by a LOT of water. And neither you nor I are likely to be able to get to the other Pole if things do crash into hot.
I’ve said this several times before – I will NOT live my life with fear – not mine – not yours.
“Specifically, the scientific basis is the infrared absorption properties of CO2”
CO2 absorption is negligible over the spectrum of the long wave radiation.
“In other words, CO2 absorbs the thermal radiation from the earth causing the earth to warm to maintain the heat in versus heat out balance.”
This is a misconception. CO2 absorbs only a minute bandwidth of the Earth’s longwave radiation. Be realistic: a trace gas covering a narrow bandwidth of the whole longwave spectrum has no effect, or an effect tending toward zero, considering the mass of the atmosphere.
“The increase in CO2 takes time for the whole effect which includes various feedbacks.”
The various feedbacks were manufactured by the alarmists to scare people and steal their pocket money in order to stay on the gravy train. These alarmists are unethical.
You know Sam, for a trace gas there is about 1.5 grams of CO2 above every square inch of the earth’s surface.
Unethical alarmists on the gravy train?
Stealing pocket money?
You know that’s an ad hominem argument don’t you?
Sorry Bob, FACTS are NEVER ad homs!!!
You can look at 30-year periods, 50-year periods or any other length to compare the MWP with the current warming period (CWP).
IPCC has chosen a 50-year period when it stated:
So I downloaded the complete record of Loehle’s study and HadCRUT.
There was a period of overlap between 1850 and 1980.
So I took the average of each of the two records over this period.
The difference between these two long-term averages is 0.207°C (so this is the difference in the “baseline” value of the two records).
HadCRUT shows an average for “the last half century”, 1960-2009 (also the warmest since 1850) of 0.104°C. Adjusting this to the same baseline as Loehle gives an average for “the last half century” of 0.311°C.
Then I picked the warmest 50-year period in the Loehle record. This was the period 857 to 906, with an average temperature of 0.471°C.
This period was 0.16°C warmer than “the last half century”, thereby falsifying the IPCC statement.
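The baseline-adjustment arithmetic above can be sketched in a few lines of Python; the numbers are only the summary figures quoted in this comment, not the underlying data sets:

```python
# Sketch of the baseline alignment described above, using the
# summary figures quoted in the comment (not the full records).

baseline_offset = 0.207   # degC: Loehle minus HadCRUT over the 1850-1980 overlap
hadcrut_recent = 0.104    # degC: HadCRUT average, 1960-2009, on its own baseline

# Shift the modern 50-year average onto the Loehle baseline:
recent_on_loehle = hadcrut_recent + baseline_offset   # 0.311 degC

mwp_warmest = 0.471       # degC: warmest 50-year period in Loehle (857-906)

# Net difference quoted in the comment (~0.16 degC, MWP warmer):
difference = mwp_warmest - recent_on_loehle
print(f"MWP minus most recent 50-year period: {difference:.2f} degC")
```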
I agree in principle with your statement
even if IPCC apparently did not believe it was irrelevant (nor did Michael Mann).
I cannot agree with your statement, however:
This statement is purely conjectural. As a matter of fact, it has stopped warming since the end of 2000 (again according to HadCRUT), even though we have not done anything about it, so the temperature signal has already stopped increasing (at least for now).
What will happen in the future is anyone’s guess. I would tend to agree with Girma, who has concluded that it is likely that we will have another multi-decadal (roughly 30-year) period of slight cooling, before the next warming cycle starts again.
But I am not going to make any forecasts for the future.
It’s like the US baseball star and philosopher, Yogi Berra, once said:
(A message James E. Hansen should have heeded in 1988 when he made his now infamous warming prediction that did not materialize.)
Now Hansen has to console himself with another Berra quote:
Now you are talking about 50-year periods, moving the goalposts; I was referring to the 30-year periods mentioned by Loehle in his paper.
Let’s leave the MWP and discuss how McIntyre discredited Mann’s hockey stick by showing Mann’s statistical techniques produce hockey stick shapes with blades of 0.02 C, which is 30 times smaller than Mann’s hockey stick blade.
Have you heard the Wegman report was retracted?
I heard it was the paper, and not the report. The report was filed with a congressional committee, the paper published in a scientific journal.
Is my understanding correct?
Again, I am wrong, or mistaken.
Yes, it was the paper that was retracted.
You are beginning to become tedious here with your silly remarks.
It is IPCC that picked 50-year periods, not Loehle nor myself. Read what I wrote rather than simply making ridiculous accusations of “moving goal posts”.
I simply checked the IPCC claim that
against the Loehle data set and found out that the IPCC claim was false.
FYI a “half century” equals a 50-year period.
If one looks at 30-year periods (as Loehle did in his study) the net difference between the warmest period in the MWP and the CWP is around 0.3C (MWP warmer than today).
Give up on this one, Bob. You are only making yourself look silly.
You have asked to change the subject to Wegman and Mann.
The latest Wegman episode does not interest me much, since it has little to do with the accuracy of his committee’s report under oath to the US House committee regarding the McIntyre and McKitrick deconstruction of Mann’s hockeystick.
As you will see below, the validity of the conclusions of the Wegman committee was subsequently confirmed to the congressional committee under oath by a NAS panel, so that’s good enough for me.
Wegman’s committee reported:
In addition to supporting the McIntyre and McKitrick study, which pointed out errors in the statistical approach used by Mann et al., the Wegman report concluded:
At a subsequent hearing before the House Committee, Gerald North, the chairman of the NAS panel, and panel member Peter Bloomfield were asked whether or not they agreed with Wegman’s criticisms. They replied as follows:
For a comprehensive summary (with lots of scientific references, all of which you can check out) read Andrew Montford’s, “The Hockey Stick Illusion – Climategate and the Corruption of Science”
It is described as follows:
This story (along with Mann’s discredited hockey stick) is dead and buried. Let’s leave it like that.
PS In addition to the Loehle study, which we have already discussed, there have been twenty-odd studies from all over the world, using different paleoclimate methodologies, which all conclude that the MWP was slightly warmer than today. If you would like, I can post the references. But I really think we have beaten this dog to death. Bottom line is that the IPCC made a claim of unusual 20th century warmth that has not held up. Yet, so far they have neither “admitted” nor “corrected” their “mistake” (to get back on topic here).
Read the correction to his paper that Loehle published in E&E, then get back to me.
I read it before already, but thanks for the tip.
Does it matter? No. McIntyre never claimed the hockey sticks generated from red noise are comparable in size to Mann’s. He used the red noise experiment to show Mann’s process mined for hockey sticks, giving them undue weight. Moreover, this process gives spurious RE statistics to the mined hockey sticks. This is important because Mann’s hockey stick passed RE verification but failed R2 verification (as did the hockey sticks created with red noise).
The size of these hockeysticks is completely irrelevant to the issue, so no, it doesn’t matter. I’d advise you exercise more caution when reading things from people like DeepClimate. They are not trustworthy sources.
Also, the Wegman Report has not been retracted. A paper which covered some of the same topics as it was retracted, but obviously that’s not the same thing. You really ought to be more careful with your reading.
Oh, sorry. I see you already realized your mistake about the Wegman Report.
You are quite wrong about the size of the hockey sticks in M&Ms analysis.
They are saying they found the hockey sticks in the shaft part of the reconstruction, but they are at the level of the noise. Not important, I don’t think so.
I couldn’t possibly be wrong about the size off the hockey sticks in M&M’s work as I didn’t say anything about what their size was. It’s strange for you to tell me I am wrong about things I never said, especially when you don’t respond to anything I said. Quite frankly, your response here makes no sense.
Incidentally, M&M were not saying they found hockey sticks in the shaft part of the reconstruction.
My life for an edit/preview feature. Obviously that should be “of,” not “off.”
You just need to go read M&M again.
bobdroege, this is a ridiculous response. You cannot simply say, “Look it up” and expect it to fly in any real sort of discussion. If you have a reason to think I’m wrong, say it. Otherwise, all you’re doing is trolling.
I’ll say it again, you are wrong and here is why.
The “hockey stick” shaped temperature reconstruction of Mann et al. (1998, 1999) has been widely applied. However it has not been previously noted in print that, prior to their principal components (PCs) analysis on tree ring networks, they carried out an unusual data transformation which strongly affects the resulting PCs. Their method, when tested on persistent red noise, nearly always produces a hockey stick shaped first principal component (PC1) and overstates the first eigenvalue.
Nothing in this response contradicts anything I said. Indeed, it doesn’t even support what you said. There were two parts to your comment. First, you said I was wrong about the size of the hockey sticks generated from red noise. I responded by pointing out I hadn’t said anything about their size, so your claim made no sense. Your current response obviously does not address this issue.
The second part of your response claimed MM “found the hockey sticks in the shaft part of the reconstruction.” I pointed out MM never did anything of the sort. A hockey stick is made up of a shaft and blade. You cannot find a hockey stick in just the shaft. That would be like claiming you found a car inside the wheel part of the car. Your current response does nothing to address this issue either.
I think there is some sort of confusion on your part. You aren’t responding to what I say in my comments, so it’s hard to tell what you are thinking. To solve this issue, I have an idea. In your next response, in addition to stating your position, try restating mine. If what you come up with isn’t the same as what my position actually is, we’ll know the source of our problem.
Yes, and many of us KNOW that the shaft is created by basically random sizes of rings and data averaged together. Now, in the blade there was a selection process to pick the alleged treemometers. Where was the selection process to pick the treemometers for the shaft?? Oh yeah, there wasn’t one, leading to the flat AVERAGED shaft. Sorry Bob, you are just convincing everyone that you are simply regurgitating talking points. Basically a sock puppet.
No, the blade is the historical thermometer reading, do keep up.
The blade was not reconstructed from tree rings.
No, the blade is the historical thermometer reading, do keep up.
The blade was not reconstructed from tree rings.
From what I’m seeing, you’re not even at a beginner level wrt warming. Do the words “Hide the Decline” mean anything to you?
If the handle is the reconstruction and the blade is the historical thermometer record, then you desperately need to watch this –
And learn something real about science.
OK Bob, you just PROVED you have no idea what you are talking about. Please go back to your mentor and have him/her explain what Mann actually admits to doing and what Steven Mc. and others discovered. Only ONE reconstruction in Mann’s spaghetti graph was replaced by actual thermometer readings.
Please keep up.
Sorry, that should be PARTIALLY replaced by “thermometer” readings.
The global mean temperature pattern shown below is continuing:
Here is a paper that confirms the above prediction.
Thanks Chief for the link.
Pacific decadal oscillation hindcasts relevant to near-term climate prediction
Takashi Mochizuki et al, 2009
Decadal-scale climate variations over the Pacific Ocean and its surroundings are strongly related to the so-called Pacific decadal oscillation (PDO) which is coherent with wintertime climate over North America and Asian monsoon, and have important impacts on marine ecosystems and fisheries. In a near-term climate prediction covering the period up to 2030, we require knowledge of the future state of internal variations in the climate system such as the PDO as well as the global warming signal. We perform sets of ensemble hindcast and forecast experiments using a coupled atmosphere-ocean climate model to examine the predictability of internal variations on decadal timescales, in addition to the response to external forcing due to changes in concentrations of greenhouse gases and aerosols, volcanic activity, and solar cycle variations. Our results highlight that an initialization of the upper-ocean state using historical observations is effective for successful hindcasts of the PDO and has a great impact on future predictions. Ensemble hindcasts for the 20th century demonstrate a predictive skill in the upper-ocean temperature over almost a decade, particularly around the Kuroshio-Oyashio extension (KOE) and subtropical oceanic frontal regions where the PDO signals are observed strongest. A negative tendency of the predicted PDO phase in the coming decade will enhance the rising trend in surface air-temperature (SAT) over east Asia and over the KOE region, and suppress it along the west coasts of North and South America and over the equatorial Pacific. This suppression will contribute to a slowing down of the global-mean SAT rise.
You have to remember that the effects are Pacific wide – and not simply the PDO. They use the PDO as a proxy for the broader changes involving – upwelling and feedbacks in the eastern and central Pacific – and this must be only approximate.
Global Climate Changes as Forecast by Goddard Institute for Space Studies
Here are the scenario definitions
Here is the comparison of forecasts with observation.
I let you judge the forecast yourself.
ON ADMITTING AND CORRECTING MISTAKES
What is the OBSERVED exponential carbon emission growth rate that Hansen forecasted to be 1.5% in Hansen et al., 1988?
The carbon emission curve is shown in the following graph.
From the above data, the approximate annual global carbon emission in G-ton from 1900 to 2007 = 0.53*e^(0.0267*(year-1900))
As a result, the annual exponential growth rate is 2.67%, much higher than the 1.5% assumed by Hansen et al, 1988.
If we substitute Hansen’s growth rate of 1.5%, the carbon emission for 2007 = 0.53*e^(0.015*(2007-1900)) = 0.53*e^(1.605)=0.53*4.978=2.63 G-ton, which is obviously wrong.
If we substitute the actual approximate growth rate of 2.67%, the carbon emission for 2007 = 0.53*e^(0.0267*(2007-1900)) = 0.53*e^(2.857)=0.53*17.409=9.23 G-ton, which is much closer to the actual carbon emission of 8.4 G-ton for 2007.
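As a quick check, the two substitutions above can be reproduced with a short script (the 0.53 GtC baseline, the 1900 start year and both growth rates are taken directly from the comment):

```python
import math

def emissions_gtc(year, rate, base=0.53, base_year=1900):
    """Exponential emissions model from the comment: base * e^(rate*(year - base_year))."""
    return base * math.exp(rate * (year - base_year))

# Hansen's assumed 1.5%/yr rate badly underestimates 2007 emissions:
print(round(emissions_gtc(2007, 0.015), 1))    # ~2.6 G-ton

# The fitted 2.67%/yr rate lands near the ~8.4 G-ton actually observed:
print(round(emissions_gtc(2007, 0.0267), 1))   # ~9.2 G-ton
```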
The OBSERVED exponential carbon emission growth rate is about 2.67%, which was forecasted to be 1.5% in Hansen et al., 1988. As a result, among the three scenarios, scenario A is closer to the reality.
Here is the comparison of the three forecasted scenarios with observation.
When is this mistake going to be admitted and corrected?
The analogy is catching on:
“I’m not a meteorologist. All I know is 90 percent of the scientists say climate change is occurring. If 90 percent of the oncological community said something was causing cancer we’d listen to them,”
– Jon Huntsman, in Time.
Regrettably, Huntsman does not credit either Dr. Curry or myself. Still, it’s exciting to be present at the birth of a climate meme!
On admitting and correcting mistakes
I thought this was going to be about Anthony Watts and Edward Wegman. But I understand you want these to blow over quietly. Sssssh, don’t wake up the kids.
Clearly Judith Curry is horribly biased. On May 14th, she failed to write about a story which broke on May 15th. The only possible reason Curry didn’t use her psychic powers to write about the story is because she wants to protect Wegman and Watts.
There is no other possible explanation.
Don’t be silly.
I can’t help it. I’ve been silly since the day I was born.
Oh, great! This means there will be a piece soon on the self-refuting work of Watts and Wegman? Should be interesting, because in these 379 comments Wegman is only named twice, and Watts zero times. Almost as if nothing has happened/is happening.
Considering the thread we’re on, it’s funny you made a criticism of Judith Curry that was completely baseless, and when it was pointed out, you didn’t admit your mistake. It’s going to be hard to have any sort of high ground with behavior like that.
Neven, we are waiting for you to post links or details that show how Wegman’s actual work showing problems with the Hockeystick was flawed.
I realize that you want to be able to ignore this based on inappropriate use of others’ work in his paper, but it really is two separate issues.
If a policeman uses an illegal wiretap to record a murderer confessing to a crime, the case can be dismissed and the murderer walk. He still murdered someone.
Kuhnkat, I suppose there is no need to show how Wegman’s work on describing paleoclimatology and SNA was shoddy, incompetent and in considerable part plagiarized, so here’s a nice link for you to bite into that shows that when it came to the actual statistics Wegman also did a cowardly copypaste of McIntyre’s criticism: Replication and due diligence, Wegman style.
The Wegman Report was presented to Congress as being independent and impartial. It wasn’t. Perhaps Mann is the murderer you so obsessively want him to be, but the PR ploy that is the Wegman Report is not the most honest way to go about it (putting it mildly). But keep defending it. This story is far from over, as new things keep turning up. For instance, the editor approved the now retracted paper for publication within 5 days of reception. The editor was a buddy of Wegman.
Please, defend that too, because we really want that irony meter to explode.
do you really believe that independent and impartial means that the report MUST DISAGREE WITH THE SIDE YOU DON’T LIKE?!?!?!?!
The original report was just that, a report to congress on the facts. If one side has the correct facts why would you try and disagree with them??
Please show us where he was WRONG on the substantive issue of appropriateness or usefulness of the Hockeystick for its purported purpose.
That link is filled with so many falsities, I’d be hard pressed to point them all out. DeepClimate has an annoying habit of taking the word of people he agrees with as gospel even though their comments have been refuted. I’d be fine with the possibility of him thinking they’re right despite the refutations (disagreements are normal), but he doesn’t even acknowledge the disputes. Given the extreme bias shown by behavior like that, I can’t find any motivation to look into the technical issues he might raise.
I think the most amazing part of that post is DeepClimate’s accusation of cherry picking. In actuality, the text clearly explains why the series that were picked were picked. The only reason there is any appearance of cherry picking is apparently DeepClimate can’t read simple sentences.
If anyone actually wants to discuss the details of that article, I’d be happy to, but in my experience, people who believe DeepClimate aren’t interested in having real discussions.
Is any skeptic claiming that Wegman is why AGW calamitism is wrong?
Are you claiming that Wegman’s alleged plagiarism makes his claims and conclusions false?
How do you like my Wonderware computer graphic of the tunnel I made?
Could you please check the carbon emission exponential growth rate was about 2.67%, not 1.5% as assumed in Hansen et al 1988?
Here is the carbon emission data:
Here is how I got the CO2 growth rates.
First I checked Hansen’s basis for scenario A:
I took data from USEIA on global CO2 emissions from fossil fuels. These look to be the same as the chart you posted, but expressed in GtCO2/year rather than GtC/year:
1970 – 15.8
1988 – 21.2
2007 – 30.0 (last year of data)
The compounded annual growth rate (CAGR) from 1970 through 1988 (19 years) was:
(21.2 / 15.8)^(1/19) = 1.0154 = 1.54% per year
This checks with Hansen’s scenario A figure of 1.5% per year.
From 1988 through 2007 (20 years) the CAGR was:
(30.0 / 21.2)^(1/20) = 1.0175 = 1.75% per year
So the actual growth of emissions after 1988 was higher than that assumed by Hansen for scenario A.
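The CAGR arithmetic in these steps can be wrapped in a small helper; the endpoint values and year counts below follow the comment's own convention:

```python
def cagr(start, end, years):
    """Compounded annual growth rate between two endpoint values."""
    return (end / start) ** (1.0 / years) - 1.0

# Global CO2 emissions, GtCO2/year, using the comment's year counts:
r1 = cagr(15.8, 21.2, 19)   # 1970 -> 1988, close to the ~1.5%/yr quoted
r2 = cagr(21.2, 30.0, 20)   # 1988 -> 2007, close to the ~1.75%/yr quoted
print(f"{r1:.2%}  {r2:.2%}")
```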
Then I checked the atmospheric CO2 content as recorded in ppmv at Mauna Loa:
The annual increase bounces all over the place year by year, so I plotted it in Excel and got a linear equation: y = 0.026x + 0.745.
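The Excel trendline step amounts to a degree-1 least-squares fit. The series below is a hypothetical stand-in for the Mauna Loa annual increases (a noisy line built around the quoted slope and intercept), so only the method is illustrated, not the actual data:

```python
import numpy as np

# Hypothetical stand-in for the Mauna Loa annual-increase series:
# a noisy line with roughly the slope and intercept quoted above.
x = np.arange(50, dtype=float)
rng = np.random.default_rng(0)
y = 0.026 * x + 0.745 + rng.normal(0.0, 0.3, 50)

# Degree-1 least-squares fit, equivalent to Excel's linear trendline:
slope, intercept = np.polyfit(x, y, 1)
print(f"y = {slope:.3f}x + {intercept:.3f}")
```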
Since Hansen stated that the GH forcing increases exponentially rather than linearly, I again figured out the CAGR of the annual CO2 concentration itself.
1970 – 324.4 ppmv
1988 – 350.7 ppmv
2010 – 389.1 ppmv
The compounded annual growth rate (CAGR) from 1970 through 1988 (19 years) was:
(350.7 / 324.4)^(1/19) = 1.0041 = 0.41% per year
From 1988 through 2010 (23 years) the CAGR was:
(389.1 / 350.7)^(1/23) = 1.0045 = 0.45% per year
So the exponential increase in atmospheric CO2 content was also slightly higher after 1988 than prior to 1988.
IOW scenario A represents the closest match in CO2 emissions and forcing from CO2 to the actual development (in fact, the actual development was slightly higher)
That is why the comparisons with scenario B (reduced CO2 growth) or scenario C (drastic reduction) are not “apples to apples” comparisons, but simply silly “smoke and mirrors” attempts to try to make Hansen’s 1988 forecast look a bit less preposterous than it actually was.
Thanks very much Max.
In summary, here is the comparison of Hansen et al, 1988 forecast (Scenario A) with observation
Max and I leave the conclusions to you.
It seems we can convert from carbon to carbon dioxide by multiplying by a constant factor. As a result, my estimate of the carbon exponential growth rate of 2.67% should also apply for CO2.
To convert from carbon to carbon dioxide, multiply by 44/12 (=3.67).
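Since 44/12 is a constant, scaling carbon to CO2 leaves any exponential growth rate unchanged; a minimal check of the conversion:

```python
# One mole of C (12 g) burns to one mole of CO2 (12 + 2*16 = 44 g):
C_TO_CO2 = 44.0 / 12.0   # ~3.67

# Example: 8.4 GtC of emissions corresponds to about 30.8 GtCO2.
print(round(8.4 * C_TO_CO2, 1))

# A constant factor cancels out of any growth ratio, so a rate
# fitted in GtC applies unchanged in GtCO2.
```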
Thank you again.
The mistake I made was to start from 1900 to estimate the exponential carbon emission growth rate.
I have made the estimation for the period from 1970 to 2007, and my result matches yours as shown in the following graph.
For this period, the exponential carbon dioxide emission growth rate is 1.64%. As a result, Scenario A in Hansen et al, 1988, is closest to reality.
Here is how I checked your values using the formula:
Carbon dioxide emission (GtCO2) = 3.67*4.3*e^(0.0164*(Year-1970))
For 1970, carbon dioxide emission (GtCO2)= 3.67*4.3=15.8
For 1988, carbon dioxide emission (GtCO2)= 15.8*e^(0.0164*(1988-1970))=15.8*e^(0.295)=15.8*1.343=21.2
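Those two spot checks can be scripted as well; the 4.3 GtC 1970 baseline and the 1.64% rate are the values used just above:

```python
import math

def co2_emissions_gt(year, rate=0.0164, base_gtc=4.3, base_year=1970):
    """Comment's model: 3.67 * base_gtc * e^(rate * (year - base_year)), in GtCO2."""
    return 3.67 * base_gtc * math.exp(rate * (year - base_year))

print(round(co2_emissions_gt(1970), 1))   # 15.8
print(round(co2_emissions_gt(1988), 1))   # 21.2
```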
Max, when is Hansen going to acknowledge these discrepancies?
Max, note that when I average your two estimates of 1.54 and 1.75 I get, (1.54+1.75)/2=1.64, which is the exponential carbon emission growth rate for the whole period that I estimated:
Thanks for that.
Looks like no matter how you calculate it (and how many posters here try to “fog up” the issue) Hansen’s 1988 scenario A is closest to what actually occurred as far as CO2 (and forcing from CO2) is concerned.
The problem with Hansen’s 1988 projection is not the CO2 growth, which was actually even a bit higher than his scenario A, but the resulting temperature increase, which he overestimated by more than 2X.
This is a clear indication that the 2xCO2 climate sensitivity he loaded into his model was much too high.
But instead of “admitting” and “correcting” his “mistake”, he gets Gavin Schmidt to write a defensive “smoke and mirrors” obfuscation why it was correct, after all (by comparing “apples and oranges”).
If this were an isolated exception in “mainstream climate science” one could overlook or excuse it. But it appears more and more that this is not a single incident. The dogma is defended even when it has been challenged and falsified.
Under “lessons learned” this thread states:
Of course, this goes for scientists as well as for the journals.
Too bad James E. Hansen and Gavin Schmidt don’t appear to see it that way.
Do you agree with the following comparison of Hansen et al, 1988 with observation?
I used a digitizer (http://bit.ly/iTxrs9) to reproduce the three scenarios shown in Figure 3 of Hansen et al, 1988, (http://bit.ly/lZD4tq).
The observed values are from woodForTrees.org (http://bit.ly/jSsqEn)
It would be great if someone confirms/contradicts my comparison graph.
Thanks for link to digitizer.
Your analysis looks correct to me.
I plotted Hansen Fig. 3 (annual mean rather than 5-year running mean):
I started the series in 1980 (although Hansen’s forecast was made in 1988). I also plotted the GISS and HadCRUT actual anomaly trends for comparison.
You can see that Hansen’s forecast (scenario A) is off by 2:1. It is a bit worse than that if you start the series in 1988, when Hansen made his forecast.
Since scenario A is the closest to actual reality as far as CO2 emissions and concentrations are concerned, it is the correct scenario to compare with the actual observation, regardless of what Schmidt tries to tell us.
This is the same result you got using the 5-year running mean figures.
Hansen has decided not to admit and correct his mistake, but rather to stonewall and claim he did a good forecasting job.
The problem is, people are not fools and it is hard to hide this big a mistake.
“The problem is, people are not fools and it is hard to hide this big a mistake.”
There is ample evidence to show that on the topic of climate change that people are foolishly ignoring real data (or the lack of it).
I should have written:
“Not all people are fools…”
Your result (http://bit.ly/lLPjXX) shows Hansen’s Scenario C forecast matches the observed global mean temperature.
Scenario C drastically reduces trace gas growth between 1990 and 2000 such that the greenhouse climate forcing ceases to increase after 2000.
However, no such drastic reduction in trace gas growth actually occurred. As a result, trace gases don’t appear to affect global mean temperature.
FYI, I mixed the labels for the GISS and HadCRUT linear equations on the last graph. Here is a link to the corrected one.
I am convinced that we are not the deniers!
Of course we are Deniers. We Deny Junk Science as a basis for remaking the world!!!
“I am convinced that we are not the deniers!”
So you deny it?
My research has been in the investigation of sudden cardiac death due to abnormal rhythms of the heart (“arrhythmias”). I have used fairly complex measurement and signal-processing techniques to investigate patients and to attempt to predict their risk of dying suddenly.
My conclusions are:
1) Medics (I am one!) are statistically illiterate and have difficulty in understanding the most elementary concepts.
2) The further one moves from straightforward data towards abstraction, the more tenuous the link with reality becomes. The number of workers in a field who can actually understand a new method drops rapidly to zero as the complexity of the method increases.
3) In clinical research, paper count is all; content may, or may not, be relevant.
4) Any computational method is wrong until proved otherwise.
In a perfect world this could be remedied by, as suggested in this post, statistical input and open access to data and code.
My view is that two independent groups within a team should analyse the data and scrutinise the experimental methods (calibration, reproducibility, etc.) as part of the protocol. This approach should be as effective as, but less adversarial than, an “audit”. Regrettably, if you want to see a scientific rat dressed as Mickey Mouse, my field is a rich seam of enquiry.
It is unlikely to happen, as many of the “leaders” in the field have not achieved their position through outstanding research but through politics, and would not be receptive to this idea.
Quis custodiet ipsos custodes?
These criticisms of the process of science certainly have merit, but they remind me of Churchill’s famous comment on democracy — the worst system, except for all the others that have been tried.
Does anyone doubt that science has vastly improved our understanding of the physical world over the last hundred years? And the last fifty years? And the last twenty years? Science works, and so does peer review. Could it work better? Like any human institution, of course it could. Modern computers and the internet make new forms of collaboration and review possible, and these ought to be exploited.
Nevertheless, I do not have much time for people who a) lack any real interest in making science better and are just groping around for a talking point to use against scientific work that challenges their worldview, or b) assert they know exactly what changes ought to be made, right now, with no interest in the pros and cons of various changes or in what actual working scientists think. And I’m not even going to talk about c), the people who think science isn’t done the way they prefer in order to suppress their brilliant takedown of the status quo and score grant money.
Since the pros of openness are well represented on this thread, let me list some of the drawbacks, which should be addressed in any change to the process of “admitting and correcting mistakes”:
* Data collection is expensive, and as a result, some of it is proprietary. How do we share data when some of it is pay-per-view?
* Science is highly competitive, so in a perfect world in which everything is always shared instantly and without restraint, how do scientists and labs avoid having their work appropriated by others before they publish it?
* When provided with the raw data, some critics have complained that the unfiltered information is not useful, and have demanded subsets of the data, or that it be processed in a given way to make the information more accessible. Basically, they do not want to do the work to independently reproduce the result. If we want to satisfy them, where does the manpower and money come from to answer dozens or hundreds of requests of like kind?
RC Saumarez mentioned the medical field, which is a good example of the balance that needs to be struck. Lack of openness = bad. Lack of privacy for research subjects, sucking up clinicians’ time with excessive documentation = also bad. Openness is a value, but not the only value.
bobdroege writes “One other point though, the sensitivity for doubling of the CO2 concentration calculation is only good for the one doubling.”
How many more times must I write that the word “calculation” in this context is not just wrong, but completely misleading. There are no exact formulae that allow anyone to “calculate” climate sensitivity. The only way that this has been done is to use non-validated models to ESTIMATE climate sensitivity. This sort of travesty in the name of science is really inexcusable.
I’ll answer this in two parts:
“How many more times must I write that the word “calculation” in this context is not just wrong, but completely misleading. There are no exact formulae that allow anyone to “calculate” climate sensitivity. ”
This is correct, but it really is just a semantic argument; just because there isn’t an exact formula doesn’t mean that the climate models do not calculate an “approximate climate sensitivity”.
“The only way that this has been done is to use non-validated models to ESTIMATE climate sensitivity. This sort of travesty in the name of science is really inexcusable.”
Climate sensitivity has also been “estimated from the historical record” due to there being times in the past when natural large releases of CO2 have happened to warm the climate.
I was just trying to point out that counting the 18 doublings needed to get from the CO2 concentration on Earth to the CO2 concentration on Venus gives you the wrong answer at least in part due to the pressure broadening of the CO2 absorption bands.
Calculate or estimate if you insist, it had no effect on the argument I was making.
You write “Climate sensitivity has also been “estimated from the historical record” due to there being times in the past when natural large releases of CO2 have happened to warm the climate.”
Do you have a reference for this? My reading of the historical record shows that temperature first rises, and then CO2 rises some 800 years later. TIA
Here is one,
Just a clue: you will need to do your own research as to whether CO2 changes have ever preceded temperature changes.
bobdroege. I went to your reference and found
“Ontong Java Plateau eruption as a trigger for the early Aptian oceanic anoxic event”
What this has to do with climate sensitivity, I have no idea.
I don’t have time to chase down this sort of irrelevant nonsense. It is clearly an effort on your part to just pretend you know what you are talking about, when in fact you don’t.
I should have added that you seem to claim that values for climate sensitivity can be derived from historical records. Fine. Give me the reference that shows what these numbers are, and how they were derived.
You cannot, because the reference simply does not exist. The only estimated values for climate sensitivity that exist are those that were estimated from non-validated models.
That reference was for CO2 increases preceding climate change, not for a calculation/estimation of climate sensitivity.
as is this
This one may be of some use, as it does show how climate sensitivity is calculated. It has an impressive list of references, and maybe you can find what you are looking for there. Of 101 references, it looks to me that only 8 or so are from Hansen and the IPCC; 3 are definitely from the skeptic side.
References to Callendar and Arrhenius are there, which definitely predate unvalidated climate models.
If Svensmark’s theories become more generally accepted (which looks likely), most of climate science will need to accept that it was wrong about the proportion and impact of man’s influence. We will see how that goes.
Svensmark’s hypothesis is supported by Shaviv’s analysis of the historical records.
(full paper linked at bottom of page)
ON ADMITTING AND CORRECTING MISTAKES
What is the observed exponential carbon emission growth rate that was forecasted to be 1.5% in Hansen et al., 1988?
The observed carbon emission curve is shown in the following graph.
From the above data, the approximate annual global carbon dioxide emission in G-ton from 1970 to 2007 = 3.67*4.3*e^(0.0164*(year-1970))
As a result, the annual exponential growth rate is 1.64%, a bit higher than the 1.5% assumed by Hansen et al, 1988.
The observed exponential carbon emission growth rate is about 1.64%, which was forecasted to be 1.5% in Hansen et al., 1988. As a result, among the three scenarios, scenario A is closer to the reality.
Here is the comparison of the three forecasted scenarios with observation (GREEN).
When is this mistake going to be admitted and corrected?
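For reference, a growth rate like the 1.64%/yr quoted above can be recovered from an emissions series by a log-linear least-squares fit. The sketch below uses synthetic values generated from that same exponential curve (not the observed data behind the graph), so it simply recovers the assumed rate and illustrates the method:

```python
import math

# Synthetic emissions series (GtC), generated from the 1.64%/yr
# exponential quoted above -- stand-ins for the observed data.
years = list(range(1970, 2008))
emissions = [4.3 * math.exp(0.0164 * (y - 1970)) for y in years]

# Fit ln(E) = a + r*(year - 1970); the slope r is the growth rate.
xs = [y - 1970 for y in years]
ys = [math.log(e) for e in emissions]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
r = sum((x - mx) * (yv - my) for x, yv in zip(xs, ys)) / \
    sum((x - mx) ** 2 for x in xs)
print(f"growth rate ~ {100 * r:.2f}%/yr")  # 1.64
```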
Don’t get bamboozled by bobdroege’s paleo-climate stuff. These data are notoriously inaccurate and their interpretations highly doubtful. As a result, one can prove almost anything one seeks to prove with them.
For this reason they are not a robust basis for estimating CO2 climate sensitivity.
CO2 measurements prior to 1959 (Mauna Loa) are based on ice-core data, which has also been questioned by Ernst Beck based on a record of actual determinations over the late 19th and early 20th century.
But we have hard data since 1959 (CO2 increased from 315 to 390 ppmv), plus a surface temperature record (with all its warts and blemishes). This record shows a linear warming of 0.65C from 1959.
But here is where an estimation of CO2 CS gets dicey.
Let’s assume that the Svensmark hypothesis is invalid and that the GHE (primarily from CO2) has been a major driver of our climate.
Several independent solar studies tell us that around 0.35C of the observed 20th century warming can be attributed to the unusually high level of solar activity. Around three-fourths of this supposedly occurred in the first half of the century, so let’s say that the solar influence from 1959 was only 0.09C.
The late 20th century saw seven major ENSO events, including the very strong 1997/98 El Niño that resulted in the record warm year 1998. NASA has estimated the temperature impact of these events.
Based on these data, we can estimate an ENSO impact on the overall temperature trend from 1959 of around 0.16C.
IPCC AR4 (Fig. SPM.2) assumes that all other anthropogenic factors beside CO2 (other GHGs, aerosols, etc.) have cancelled one another out, so we can ignore these.
Also ignoring any other natural factors, this means that CO2 forcing caused warming of 0.65 – 0.09 – 0.16 = 0.40C from 1959.
Using the logarithmic relationship and ignoring any postulations of “hidden energy” (Hansen et al.), this gives us an observed 2xCO2 climate sensitivity of 1.3C.
C1 = 315 ppmv
C2 = 390 ppmv
C2/C1 = 1.238
ln(C2/C1) = 0.2136
ln2 = 0.6931
dT = 0.4C
dT(2xCO2) = 0.4 * 0.6931 / 0.2136 = 1.3C
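The logarithmic calculation above is easy to reproduce. A minimal sketch using the same inputs (the 0.4C figure is the commenter’s residual after subtracting the assumed solar and ENSO contributions):

```python
import math

# Inputs from the comment above: CO2 concentrations in 1959 and
# today, and the residual warming attributed to CO2.
c1, c2 = 315.0, 390.0  # ppmv
dt_co2 = 0.4           # degrees C

# Logarithmic scaling: dT(2xCO2) = dT * ln(2) / ln(C2/C1)
sensitivity_2x = dt_co2 * math.log(2) / math.log(c2 / c1)
print(f"{sensitivity_2x:.1f} C per doubling")  # 1.3
```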
This could be lower if:
– The Svensmark hypothesis is proven correct and clouds have exerted a natural forcing not considered above, as has also been suggested independently by the Spencer and Pallé observations.
– The land surface record is distorted upward by the UHI effect, as has been suggested by several studies.
– Inaccuracies in the sea surface temperature record prior to around 1990 have introduced a spurious warming bias to the surface temperature record.
– Recent ERBE and CERES satellite observations (Spencer, Lindzen) showing a low climate sensitivity are confirmed and validated.
So, no matter how one looks at it, any estimate of 2xCO2 climate sensitivity with all feedbacks is really not much more than an educated guess at this time, even using relatively reliable modern data.
Under the “this could be lower” section at the end of my post I forgot to add
– The current trend of no warming despite CO2 increase to record levels continues for another few years, thereby demonstrating that CO2 has only a secondary influence on our climate.
Jim can’t get his story straight:
“Give me the reference that shows what these numbers are, and how they were derived.
You cannot, because the reference simply does not exist. The only estimated values for climate sensitivity that exist are those that were estimated from non-validated models.”
“Don’t get bamboozled by bobdroege’s paleo-climate stuff. These data are notoriously inaccurate and their interpretations highly doubtful.”
The estimates don’t exist, and anyway they are notoriously inaccurate. Spot the tinnie-tiny flaw in the logic there. Classic stuff, Jim!
Robert writes “The estimates don’t exist, and anyway they are notoriously inaccurate. Spot the tinnie-tiny flaw in the logic there. Classic stuff, Jim!”
My statement was that the estimates don’t exist. manaker said they were inaccurate. I don’t agree with manaker. My position is completely consistent.
I have yet to see any reference which produces estimates of climate sensitivity from “historical records”. What I am looking for is a reference and the key words and numbers, WITHIN THAT REFERENCE, showing the values of the climate sensitivity. It is all very well producing a huge reference and claiming that somewhere in it there are numbers relating to climate sensitivity. Give me the words and numbers from the reference. I cannot prove a negative by trying to read a huge reference and then claiming I cannot find the numbers. Where are the numeric values of climate sensitivity in the references from the historic records?
But now Jim does give a contradiction
“I have looked at all the science, and I am convinced that CAGW is a hoax. ”
“I cannot prove a negative by trying to read a huge reference and then claim I cannot find the numbers. ”
I thought you looked at all the science.
bobdroege writes “I thought you looked at all the science.”
This goes to show an old adage: on the blogosphere it is possible for someone to completely misinterpret what one writes. Clearly I do not have the time or the expertise to look, literally, at all the science. What I wrote was a figure of speech.
But you have still not provided a reference, along with the words and numbers from that reference, showing that estimates of climate sensitivity have been produced from historical records. Where is the reference, together with an extract from that reference, with the actual numbers?
On “all the science”, see the detailed review of the science in “Climate Change Revisited: The Report of the Non-governmental International Panel on Climate Change.” That reviewed science from the IPCC’s AR4 up to 2009, as well as science ignored by the IPCC.
When this full range of science is examined, the conclusions are substantially different from that of the IPCC. Uncertainties are higher. The Sun has a bigger impact.
Spencer’s latest modeling of galactic cosmic rays suggests
Consequently, IPCC’s argument from ignorance on the magnitude of anthropogenic causation is not as strong as stated. (Surprise!) No need to panic!
Nice try there Robert, but I’ll say it before Jim and Max jump on you.
Max said one quote and Jim said the other.
As an aside, I notice you raising Beck’s excellent work. I’m sure you know most warmers reject it out of hand, as if those old guys couldn’t possibly know enough to prevent contamination by measuring in reasonable areas, and the most common method had an error of WOW 3%!!!!!! (Let’s see, 3% of 400 ppm is (get out the calculator and math reference for formulas) uhh 12 ppm??? Boy, that really changes things. That measurement might only be 388 ppm!! Naaah, he probably breathed on it.) The warmers assure us that it was never over 300 ppm since the last ice age!!
On the other hand they want us to accept the albedo data for Venus from the early 1800’s because the scientists then did excellent work and understood how to do the measurements with little error.
There are lots of estimates of climate sensitivity floating around. It’s just that the one favoured by the IPCC is the one based on Hansen’s computer model, which fills the gap in Trenberth’s energy budget.
For an alternative assessment, see for example:
BLouis79. Thank you, but this is not the issue. The issue is whether there are estimates of climate sensitivity from the historical record.
Jim – Surely you have read Knutti and Hegerl, in particular the section ‘Constraints from the instrumental period’. The numbers are summarized at the top of Figure 3; methods and other specifics are given in the dozen or so references to that section.
And surely you are dismissive of those results; but there they are anyway.
I thought these were the historical record:
The different time scales included in my analysis are:
The 11-year solar cycle (averaged over the past 300 years).
Warming over the 20th century
Warming since the last glacial maximum (i.e., 20,000 years ago)
Cooling from the Eocene to the present epoch
Cooling from the mid-Cretaceous
Comparison between the Phanerozoic temperature variations (over the past 550 million years) and the different CO2 reconstructions
Comparison between the Phanerozoic temperature variations and the cosmic ray flux reaching the Earth (as reconstructed using Iron-Meteorites and astronomical data, e.g., read this).
Thank you. You are probably right. I am afraid my physics simply is not up to understanding how the author explained what he did. The reference leaves me cold.