Climate Etc.

On confusing expertise and objectivity

by Judith Curry

Having great intelligence or specialized knowledge is no assurance that a person’s public opinions are unbiased.

I’ve been collecting material on this topic for several months; Tamsin’s essay has motivated me to actually write a post. I’ve written previous posts on the politics of expertise, mostly in the context of anecdotal examples. The issue of objectivity in the context of expertise used in public policy debates raises profound ethical and epistemic issues and responsibilities. Here are some perspectives on expertise and objectivity.

Intellectual Conservative

The Intellectual Conservative blog has an interesting post, Philosophical Aspects of the Climate Change Controversy. Excerpts:

Few of us have the specialized knowledge necessary to make absolute pronouncements on [global warming/climate change concerns], yet all of us have a right, or even an obligation, to philosophically cross-examine the arguments presented for rational consistency.

We should realize that evidence never exists in a vacuum. All evidence requires interpretation, and all too often the interpretation of evidence is influenced by pre-existing ideology, not ruthless objectivity.

A second observation is what I call “the fallacy of appealing to expertise.” Let’s develop this point. It goes something like this: a consensus of credentialed scientists nearly all believe a certain thing, therefore it is true. This reasoning assumes that someone must be objective in the same proportion that they are an expert, or, said another way, that an expert can never be biased or affected by groupthink.

Suppose you go in for a dental examination with a new dentist, and while examining your mouth, your dentist says, “Have you considered taking out a loan?” Now are you dealing with an oral hygiene expert speaking objectively, or a businessperson speaking out of self-interest? You have to use your own judgment to discern the difference. In that case you have no difficulty seeing how bias can work contrary to knowledge. The appeal to expertise is not as strong an argument as it would appear to be, because specialized knowledge is not necessarily tantamount to pure objectivity.

Or take an example from our legal system. In a court case both the defense and prosecution may provide testimony from expert witnesses. But the opinions of equally qualified people are often in diametric opposition. What accounts for this? As a juror you must discern who offers the more plausible explanation, though you are not a specialized expert on the topic in question.

So what am I saying? Are all these experts liars? Of course not. I am saying that I doubt every expert comes to their own conclusions independently from scratch, and that reputations and careers are sometimes of primary consideration when such persons publicly take a position.

In general, people confuse two concepts: expertise and objectivity. Having great intelligence or specialized knowledge is no assurance that a person’s public opinions are unbiased. Persons of all stripes are generally loyal to their source of income. We shouldn’t assume that every expert begins their search tabula rasa, that is to say, without an agenda or wholly independent of prevailing consensus. That is why appeals to credentials or expertise are never as conclusive as they ought to be.

Objectivity vs ‘Objectivity’

Nate Silver’s recent transition from the NYTimes to ESPN sparked an interesting piece at fair.org; here is the relevant text:

This is what I like to describe as the difference between objectivity and “objectivity.” Objectivity is the belief that there is a real world out there that’s more or less knowable; the “objectivity” that journalists practice holds that it’s impossible to know what’s real, so all you can do is report the claims made by various (powerful) people. The chief benefit of “objectivity” is that it means you will never have to tell any powerful person that they’re wrong about anything.

If someone comes along and tells you that, no, there are ways to figure out what’s actually happening with the world, and simply repeating without question what interested parties claim to be happening is not a very helpful approach, that’s going to be, as Sullivan put it, “disruptive.” That’s what I think she’s really getting at when she says, “I don’t think Nate Silver ever really fit into the Times culture and I think he was aware of that.”

Hotwhopper

A reaction to Tamsin’s post from hotwhopper makes an important point:

Although it may be true (or not) that [Tamsin] doesn’t have the knowledge or experience, it doesn’t follow that other scientists don’t have it.

People who develop policy don’t have answers; they have questions first and foremost. They weave answers from others into solutions. Their expertise is rarely at the technical level; it’s in policy formulation itself. Policy developers and advisers turn to the technical experts for advice. Those technical experts will work in science, economics, finance, human services and other arenas. There are no sharp lines dividing technical experts from each other, or dividing the technical experts from the policy developers and advisers. Some scientists will end up in policy development roles. They won’t suddenly jump from working in a laboratory to working in the West Wing, a Minister’s office or on the executive floor. They will be drawn into the role gradually. For example, they may be tapped on the shoulder to sit on a committee or two. They may be invited to take a short-term assignment in a research advisory role or a management role.

In the same way, these people who will help shape the future will not suddenly find their ideas fully formed as they venture into these roles. It doesn’t happen like that – or if it does, it’s rare, and I’d say it’s not a good thing when it does, being more likely driven by ideology than by new skills learnt or a gradual appreciation of the subtleties of policy development and the myriad implications of broad-ranging policy alternatives.

Common cognitive biases

You are probably familiar with these, but I flagged an interesting blog post at io9 entitled The most common cognitive biases that prevent you from being rational. The article is well worth reading. Excerpts:

Before we start, it’s important to distinguish between cognitive biases and logical fallacies. A logical fallacy is an error in logical argumentation (e.g. ad hominem attacks, slippery slopes, circular arguments, appeal to force, etc.). A cognitive bias, on the other hand, is a genuine deficiency or limitation in our thinking — a flaw in judgment that arises from errors of memory, social attribution, and miscalculations (such as statistical errors or a false sense of probability).

Some social psychologists believe our cognitive biases help us process information more efficiently, especially in dangerous situations. Still, they lead us to make grave mistakes. We may be prone to such errors in judgment, but at least we can be aware of them.

The post describes a number of common biases. One that I hadn’t seen described before is ingroup bias, a neurological explanation for tribalism:

Somewhat similar to the confirmation bias is the ingroup bias, a manifestation of our innate tribalistic tendencies. And strangely, much of this effect may have to do with oxytocin — the so-called “love molecule.” This neurotransmitter, while helping us to forge tighter bonds with people in our ingroup, performs the exact opposite function for those on the outside — it makes us suspicious, fearful, and even disdainful of others. Ultimately, the ingroup bias causes us to overestimate the abilities and value of our immediate group at the expense of people we don’t really know.

JC message to Joshua: I can already anticipate your comments on this. There is a clearly defined ‘tribe’ associated with the IPCC and its supporters. The other side is extremely diverse, including skeptical academics who don’t have a high opinion of each other, data libertarians from the open knowledge movement, people interested in the accountability of publicly relevant science, and yes, those who are politically motivated by allegiance to fossil fuels. Hence I argue that the tribalism issue is asymmetrical on the two sides of the debate.

JC comments:

Scientists play an important role in many public debates. While advocacy by scientists is appropriate when pretty much everyone agrees on both the problem and the solution (e.g. tornado warnings), it is much less appropriate for wicked problems, where there is substantial disagreement about the problem and/or solution and heavy reliance on expert judgment rather than on more objective and well-quantified analyses. Trust in the experts then becomes a paramount issue, and political advocacy is a sure path towards public distrust of the experts.

The climate science-policy interface is a particularly difficult one to navigate, owing to the complexity of the science and its uncertainties, the complex socioeconomic impacts of climate variability and change, and the high cost and potential unintended consequences of proposed policies. I became interested in this issue circa 1999, and I am actively trying to learn more about the policy process and the issues at the science-policy interface. I have found the dialogue at Climate Etc. particularly helpful in developing an understanding of how to bridge the science-policy gap.

Stealth advocacy is one thing; I argue here that inadvertent advocacy is something different, heavily influenced by cognitive biases. We are all subject to cognitive biases, even scientists within their own fields. As scientists, we need to be aware of these biases and actively fight against them. Instead, in the climate debate I see far too much effort, in the guise of ‘communication’, that attempts to pander to the cognitive biases of the public.

If I can be forgiven for generalizing a bit, it seems that the cognitive biases are asymmetrical among vocal proponents on the two sides of the debate.

Both sides seem subject to observational selection bias and confirmation bias. Each of us as individual scientists needs to continually challenge our own objectivity and be on the lookout for cognitive biases.

The ‘bias’ in the climate debate is often ascribed to political motivation, but a host of other cognitive biases come into play. As the Intellectual Conservative points out, reputations and careers are sometimes of primary consideration when experts publicly take a position.

I would also like to pick up on the issue raised by hotwhopper about scientists being gradually drawn into the policy role and taking time to fully form their ideas. This introduces the ‘age’ issue into the science-policy process, something that was alluded to in Tamsin’s twittersphere discussion. Being effective at the science-policy interface requires perspective that comes only with experience, which implies seniority.

Leadership roles in the IPCC (as lead or coordinating lead authors) require experience and perspective at the science-policy interface. Assigning a lead author before the ink is dry on their PhD thesis (e.g. Mann), or a coordinating lead author within 5 years of a PhD (e.g. Santer), seems ill-advised to me.

Finally, climate scientists need to become more effective at the climate science-policy interface, and I do not mean effective at advocacy. On the Tamsin thread, I quoted David Westcott as saying that “advocacy” from many climate scientists has just plain sucked. It seems extremely naive to me to have expected a ‘speaking consensus to power’ approach to be effective at implementing a costly international energy policy. There is no easy recipe to follow, since climate change is arguably the mother of all wicked messes.

But recognition of these challenges, and working to understand and eliminate our own biases, would be a good starting point.
