Climate Etc.

We are all confident idiots

by Judith Curry

Stumbling through all our cognitive clutter just to recognize a true “I don’t know” may not constitute failure as much as it does an enviable success, a crucial signpost that shows us we are traveling in the right direction toward the truth. – David Dunning

In pondering how we rationalize the ‘hiatus’ in the context of theories and predictions of anthropogenic global warming, I have been looking to the fields of philosophy of science and psychology for insights.

The linkage between philosophy of science and psychology in the context of epistemology is articulated in this statement by Quine: epistemology itself “falls into place as a chapter of psychology and hence of natural science.” The point is not that epistemology should simply be abandoned in favor of psychology, but instead that there is ultimately no way to draw a meaningful distinction between the two.

Below are some articles I’ve recently come across that provide some insights.

David Dunning

David Dunning has penned an article for the Pacific Standard entitled We are all confident idiots. Subtitle: The trouble with ignorance is that it feels so much like expertise. This is a fascinating article; some excerpts:

For more than 20 years, I have researched people’s understanding of their own expertise—formally known as the study of metacognition, the processes by which human beings evaluate and regulate their knowledge, reasoning, and learning—and the results have been consistently sobering, occasionally comical, and never dull.

The American author and aphorist William Feather once wrote that being educated means “being able to differentiate between what you know and what you don’t.”  To a great degree, we fail to recognize the frequency and scope of our ignorance.

In 1999, in the Journal of Personality and Social Psychology, my then graduate student Justin Kruger and I published a paper that documented how, in many areas of life, incompetent people do not recognize—scratch that, cannot recognize—just how incompetent they are, a phenomenon that has come to be known as the Dunning-Kruger effect. 

What’s curious is that, in many cases, incompetence does not leave people disoriented, perplexed, or cautious. Instead, the incompetent are often blessed with an inappropriate confidence, buoyed by something that feels to them like knowledge.

Because it’s so easy to judge the idiocy of others, it may be sorely tempting to think this doesn’t apply to you. But the problem of unrecognized ignorance is one that visits us all. And over the years, I’ve become convinced of one key, overarching fact about the ignorant mind. One should not think of it as uninformed. Rather, one should think of it as misinformed.

 As the humorist Josh Billings once put it, “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.”

Because of the way we are built, and because of the way we learn from our environment, we are all engines of misbelief. And the better we understand how our wonderful yet kludge-ridden, Rube Goldberg engine works, the better we—as individuals and as a society—can harness it to navigate toward a more objective understanding of the truth.

Some of our most stubborn misbeliefs arise not from primitive childlike intuitions or careless category errors, but from the very values and philosophies that define who we are as individuals. Each of us possesses certain foundational beliefs—narratives about the self, ideas about the social order—that essentially cannot be violated: To contradict them would call into question our very self-worth. And any information that we glean from the world is amended, distorted, diminished, or forgotten in order to make sure that these sacrosanct beliefs remain whole and unharmed.

The way we traditionally conceive of ignorance—as an absence of knowledge—leads us to think of education as its natural antidote. But education can produce illusory confidence.

It is perhaps not so surprising to hear that facts, logic, and knowledge can be bent to accord with a person’s subjective worldview; after all, we accuse our political opponents of this kind of “motivated reasoning” all the time. But the extent of this bending can be remarkable.

But, of course, guarding people from their own ignorance by sheltering them from the risks of life is seldom an option. Actually getting people to part with their misbeliefs is a far trickier, far more important task. Luckily, a science is emerging, led by such scholars as Stephan Lewandowsky at the University of Bristol and Ullrich Ecker of the University of Western Australia, that could help.

But here is the real challenge: How can we learn to recognize our own ignorance and misbeliefs?  Behavioral scientists often recommend that small groups appoint someone to serve as a devil’s advocate—a person whose job is to question and criticize the group’s logic. While this approach can prolong group discussions, irritate the group, and be uncomfortable, the decisions that groups ultimately reach are usually more accurate and more solidly grounded than they otherwise would be. For individuals, the trick is to be your own devil’s advocate: to think through how your favored conclusions might be misguided; to ask yourself how you might be wrong, or how things might turn out differently from what you expect. 

Another quote sometimes attributed to Franklin has it that “the doorstep to the temple of wisdom is a knowledge of our own ignorance.”

The built-in features of our brains, and the life experiences we accumulate, do in fact fill our heads with immense knowledge; what they do not confer is insight into the dimensions of our ignorance. As such, wisdom may not involve facts and formulas so much as the ability to recognize when a limit has been reached. Stumbling through all our cognitive clutter just to recognize a true “I don’t know” may not constitute failure as much as it does an enviable success, a crucial signpost that shows us we are traveling in the right direction toward the truth.

Lewandowsky

Dunning refers to Stephan Lewandowsky in a favorable light.  Lewandowsky conducts psychological research on the subject of bias. In the context of the climate debate, Lewandowsky’s psychological research is highly controversial; see discussions by Steve McIntyre and Joe Duarte.

At WUWT, Andy West has just published a lengthy three part series:  Wrapped in Lew papers: the psychology of climate psychologization [Part I, Part II, Part III].  The main point is that Lew is so busy dissecting the ‘bias’ of climate change skeptics that he misses his own rather glaring biases.

Lewandowsky’s latest essay is Are you a poor logician? Logically you might never know.  Lew applies the Dunning-Kruger ideas to dismiss AGW skepticism.  Ben Pile counters with a post Lewandowsky’s logic.  I just spotted Paul Mathews’ post Lewandowsky’s loopy logic, which provides a good overview and extensive links.

I decided not to pull excerpts from these posts, but my summary point is this: psychologization can be a dangerous tool in ideological warfare.

It pays to be overconfident

An article in New York Magazine:  It pays to be overconfident, even if you have no idea what you’re doing.  Excerpts:

We deceive ourselves about our superiority so that we may better deceive our potential competitors, collaborators, benefactors, and mates. To be a good salesman, you have to buy your own pitch.

It turns out, we tend to (over)use confidence as a useful proxy for competence — if you speak firmly, it sounds like you know what you’re talking about. People who showed more confidence, regardless of their actual ability, were judged to be more capable and accorded more regard by their peers.

As for the effect of confidence on perceived ability even after actual ability has been reported, the authors note that the lasting power of first impressions has long been known to disproportionately affect our judgments of others. All of this suggests that even when we’re unmasked as less skilled than our self-assured manner would suggest, there are ancillary social benefits to overconfidence.

Maybe this is how pundits (and Times columnists) maintain their audience, and why political candidates feel free to make undeliverable campaign pledges: There may simply be insufficient downside to their overpromising. 

Oh my, that certainly puts the IPCC’s confidence levels in a new light.

Assertions of scientific confidence

On the other hand, Pacific Standard has a post Assertions of scientific certainty are greeted with skepticism.  Subtitle: New German research suggests the public is wary of statements suggesting a scientific debate has been closed.  Excerpts:

On many fronts, scientists continue to be frustrated by the public’s unwillingness to accept their conclusions. On issues ranging from Ebola to climate change, their impulse is often to re-state their case in ever-more-vigorous terms, forcefully noting that there is no serious doubt about their assertions.

Newly published research from Germany suggests that sort of language may, in fact, be counterproductive.

“This means that readers were not persuaded by powerful formulations which described scientific evidence as very certain, but seemed to be skeptical when information was presented as too simple.”

JC reflections

I am interested in the overlap between epistemology and psychology; I’ve only dabbled in the relevant psychological literature (as pointed to from blog posts), so I have no idea what interesting papers might be out there that I’m missing.

What I would like to see are some studies on the psychology and social psychology of scientific belief among scientists.  If you know of relevant papers, I would appreciate a pointer.

I am very concerned about the brand of psychological research conducted by Stephan Lewandowsky, which seems to be more of a tool in ideological warfare than anything else.

I like Dunning’s suggestion of devil’s advocates, which is similar to suggestions made by Steve Koonin and John Christy regarding a red team to critique consensus statements.

Dunning-Kruger is a popular rationale for dismissing skeptics (particularly in the blogosphere); however, I remain very concerned about the general phenomenon in the scientific community.  I will pick this issue up in a future post on scientific underdetermination.

The most disturbing point here is that overconfidence seems to ‘pay’ in terms of an individual’s influence in political debates about science.  There doesn’t seem to be much downside for individuals or groups in eventually being proven wrong.  So scientific overconfidence seems to be a victimless crime, with the only ‘victim’ being science itself.

How does the New York Magazine article (overconfidence pays) square with the Pacific Standard article (certainty greeted with skepticism)?  Well, within the group of the ‘converted’, overconfidence pays.  However, in the broader population (e.g., the unconverted), certainty is greeted with skepticism.  In the latter group, I’ve found that humility and discussing uncertainty work to build trust.

But of course I am absolutely not confident of any of this.
