by Judith Curry
In Nature, Ian Boyd calls for an auditing process to help policy-makers to navigate research bias.
Echoes of Steve McIntyre, coming from a very big chamber.
Ian Boyd is the science advisor to DEFRA, the UK government department for environment, food and rural affairs. His commentary entitled A standard for policy-relevant science is published in the latest issue of Nature. Read the whole thing; it's available online. Here are a few excerpts:
To counsel politicians, I must recognize systematic bias in research. Bias is cryptic enough in individual studies, let alone in whole bodies of literature that contain important inaccuracies.
These problems are amplified in complex issues such as the environmental effects of GM organisms or chemical pollutants, including pesticides and endocrine disrupters. The problem is amplified further when statistical inference is used.
Systematic bias across whole fields of science is even more cryptic and therefore more problematic. It could stem from the combined effects of how science is commissioned, conducted, reported and used, and also from how scientists themselves are incentivized to conduct certain research. Such bias results from actively searching for a particular outcome, rather than performing balanced hypothesis testing. For example, in 2006, researchers in the United Kingdom and in the Netherlands found that the number of insect pollinators might have declined. A consequent call for proposals contained the underlying assumption that there was a decline, rather than conveying a need to establish whether current information about declines was robust.
JC comment: This issue regarding calls for proposals is rampant in climate change research. Even if there is no bias in reviewing a proposal that challenges the AGW assumption, there are relatively few opportunities for such proposals, since so many calls for proposals contain this underlying assumption.
Another problem is the tendency to treat different studies as statistically independent, even when they have emerged from connected commissioning processes and could therefore amount to multiple testing of the same hypothesis, meaning that every extra study must overcome an increasingly rigorous statistical hurdle to demonstrate efficacy. In combination, these kinds of bias can make individual or groups of studies that report certain effects seem more important than they really are.
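Boyd's multiple-testing point can be made concrete with a back-of-envelope calculation. This sketch (my illustration; the function names are not from Boyd's commentary) shows why treating connected studies as independent inflates apparent effects, and why each extra test of the same hypothesis should face a stricter hurdle:

```python
def familywise_error(alpha: float, k: int) -> float:
    """Chance of at least one false positive across k independent
    tests of the same (true) null hypothesis, each run at level alpha."""
    return 1 - (1 - alpha) ** k

def bonferroni_threshold(alpha: float, k: int) -> float:
    """Stricter per-study cutoff that keeps the familywise error
    near alpha -- the 'increasingly rigorous statistical hurdle'."""
    return alpha / k

# Ten nominally independent studies, each testing one hypothesis at p < 0.05:
print(round(familywise_error(0.05, 10), 3))  # -> 0.401, i.e. ~40% odds of a spurious "effect"
print(bonferroni_threshold(0.05, 10))        # -> 0.005, the corrected per-study threshold
```

The ~40% figure is the sense in which "groups of studies that report certain effects seem more important than they really are": without correction, a cluster of related studies is quite likely to contain at least one false positive by chance alone.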
JC message to the IPCC: please pay attention to this when assessing confidence in your collection of climate models.
A common reaction to such controversy is to commission subject reviews or meta-analyses that assess the weight of evidence for certain effects across many individual studies. But reviews also contain pitfalls. First, they risk amplifying rather than eliminating systematic bias — which could be more common in some subjects than others.
JC comment: Here is how the IPCC assesses confidence (text pulled from the leaked final draft of AR5 summary for policy makers): Confidence in the validity of a finding is based on the type, amount, quality, and consistency of evidence (e.g., mechanistic understanding, theory, data, models, expert judgment) and the degree of agreement.
We need an international audited standard that grades studies, or perhaps journals. It would evaluate how research was commissioned, designed, conducted and reported. This audit procedure would assess many of the fundamental components of scientific studies, such as appropriate statistical power; precision and accuracy of measurements; and validation data for assays and models. It would also consider conflicts of interest, actual or implied, and more challenging issues about the extent to which the conclusions follow from the data. Any research paper or journal that does not present all the information needed for audit would automatically attract a low grade.
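One of the audit items Boyd names, appropriate statistical power, lends itself to a concrete check. This sketch (my illustration, not part of Boyd's proposal) uses the standard normal-approximation formula for a two-sample comparison of means, built on the Python standard library:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate sample size per group for a two-sided, two-sample
    comparison of means (normal approximation).

    effect_size: standardized mean difference (Cohen's d).
    """
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value of the test
    z_beta = z.inv_cdf(power)           # quantile for the desired power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A "medium" effect (d = 0.5) needs roughly 63 subjects per group at
# alpha = 0.05 and 80% power; a study run with far fewer subjects is
# exactly the kind of design an audit procedure would grade down.
print(n_per_group(0.5))  # -> 63
```

An auditor could compare a paper's reported sample size against this kind of minimum for the effect size the paper claims to detect; studies falling well short would "automatically attract a low grade" under Boyd's scheme.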
JC comment: Absolutely eloquent description of the bias problem. I am not so sure about his proposed solution. Something for us to discuss: how to implement wholesale auditing for climate research.
JC request of the IPCC: Please replace Rajendra Pachauri with Ian Boyd.