by Andy West
On the origin of public skepticism and its entanglement with science.
Lewandowsky has a habit of raising fundamental truths1 and asking pertinent questions, yet then, for the climate change domain, turning psychology (and according to analyses his data and ethics too) on its head in order to ensure agreement with his über-orthodox viewpoint on risk2, rather than embracing the outcomes that the fundamentals and questions actually point to. This involves accepting questionable conclusions as valid for other domains too, if these reinforce his climate change position. So for instance, from Science and the Public: Debate, Denial, and Skepticism by Lewandowsky et al (L2016), here is a useful question:
‘What characterizes the public response to scientific discoveries that are “inconvenient”, or threatening to one’s lifestyle, livelihood, or deeply-held beliefs? Is it debate, denial, or skepticism?’
In answering this Lewandowsky et al3 adopt a particular notion, which in the climate change domain minimizes the scope for any serious questioning of their particular (calamitous) orthodox view. This notion is that in regard to testing for a dividing line between ‘denial’ and skepticism, ‘existing research permits its identification with relative ease’. They refer to three papers4 immediately after this quoted text as their first line of support (hereafter ‘3P’), which cover various disputed domains yet unfortunately provide neither an objective test nor one based upon any theory of causation regarding these reactive behaviors5. So inevitably, they will sometimes fail.
However the very need for such a distinguishing test acknowledges a crucial truth: the reactive behaviors of skepticism and so-called ‘denial’ are fundamentally related. And while the authors frame science as only ‘discoveries’ (so implying factual) rather than the wider set of ‘theories plus discoveries’ (so not all factual), their line of inquiry is nevertheless appropriate. As we shall see, cultural resistance is indeed important where the related reactions of skepticism and ‘denial’ are aroused in the public by science issues29a, such resistance being tied to lifestyle, livelihood and especially deeply-held beliefs.
L2016 doesn’t pursue these valid clues about the nature of public skepticism6 (and incidentally features significant hypocrisy7), while the claimed ‘dividing line’ is certainly not easy to identify reliably in the generic case. Yet notwithstanding the above difficulties, and also that the role of cultural support as well as resistance needs consideration (hence: what characterizes the public response to scientific theories or discoveries that are “convenient”, or encouraging to one’s lifestyle, livelihood, or deeply-held beliefs?), Lewandowsky et al’s question and concept of testing raises deep and pertinent issues regarding the nature of public skepticism. These are briefly explored below.
The two faces of skepticism
Barry Marshall, the doctor who discovered the role of Helicobacter pylori in causing gastritis, a precursor of ulcers and stomach cancer, said in an interview with Discover Magazine: “I presented that work at the annual meeting of the Royal Australasian College of Physicians in Perth. That was my first experience of people being totally skeptical. To gastroenterologists, the concept of a germ causing ulcers was like saying that the Earth is flat.” [emphasis mine].
To the detriment of patients, it took some years for his theory to be accepted8. Yet the essential issue is that highly plausible historic evidence didn’t prevent skepticism of bacterial causation from growing to widespread dominance before Marshall.
So why does skepticism appear to have two faces; often healthy and indeed crucial to scientific inquiry and progress, yet at other times an unhealthy impediment? To get a better understanding we should go back to the beginning and ask, along with the implication of cultural resistance from L2016 as a clue, how old is skepticism and where does it come from? What actually is skepticism?
How old is skepticism?
The quote from L2016 in section 1 refers to the ‘public response’. Initially I will strictly adhere to this, i.e. I will answer my questions above in regard to folks who overall are not deeply informed about the disputed issues. So for now we’ll leave behind gastroenterologists and climate scientists along with other experts. This should furnish us with the most straightforward case.
When members of the public respond in a skeptical manner to a promoted narrative, be this religious or political or scientific in origin, their response is not typically shaped by knowledge of philosophical skeptics such as Socrates or Pyrrho or Descartes or Hume. While philosophical skepticism as founded by the ancient Greeks has, in addition to its role within the theory of knowledge, contributed to a wider and more practical skepticism over the millennia, this is not the main font of reactive public behavior. Indeed there was skeptical behavior long before even the precursors of formal philosophical skepticism, which arose with the 5th century BC Sophists9.
It is strongly argued that there is skepticism in the works of poets like Homer10 (if he was even real, he lived centuries before the Sophists), and likely most of his content was inherited from oral traditions stretching back much further. There is certainly skepticism in some of the Harper’s songs originally from the Middle Kingdom of Ancient Egypt, which goes back to ~2000BC, mainly in the form of serious doubt about life after death, in direct contradiction to the dominant religion11. Even more emphatic on this issue is the Immortality of Writers (end of the 19th dynasty, ~1190BC). These examples don’t prove that considerable skepticism resided in the public consciousness, yet none are exclusive to intellectual elites11a. And the personal art and graffiti of ordinary folks provide a more direct seam of skepticism. I particularly like the private drawing of an Egyptian tomb-painter, who humorously12 depicts his god-pharaoh as a dog driving a chariot pulled by a rabbit13.
Detecting skepticism this far back depends upon the evidence from written records14. However, Homeric works are not the only indicator that prior oral societies also foster skepticism15. Given too that where records exist skepticism seems to stalk every religion (indeed every social consensus15a), and also that religious behavioral practices of one form or another have been around for a very long time, a plausible hypothesis is that this pattern always held and so skepticism is very old indeed. Old enough to be part of evolutionary processes16, maybe as old as our species.
Of course there was no formal science to be skeptical about in such early times. Yet public reactions to promoted narratives about how the world works or how we should best act to prosper within it, which per our scope above spring from people possessing only shallow domain knowledge at best, aren’t different in nature if such narratives happen to be religious or scientific or philosophical in origin. Not being priests or theologians or scientists or philosophers, the public is largely uninformed. Hence necessarily, their reaction must arise from factors largely independent of detailed domain knowledge.
Where does skepticism come from?
The Oxford online dictionary defines scepticism / skepticism as: ‘A sceptical attitude; doubt as to the truth of something.’ This straightforward definition hides an apparently fiendish complication regarding public skepticism. Lacking domain knowledge, how do the uninitiated know whether their doubts are well or ill founded? What framework are they even using to judge?
A long evolved framework, says evolutionary theory. Deception and the ability to detect deception have evolved in many species including our own17, an arms race17a producing complex strategies and balances, especially in humans17b. Indeed the same pressures selecting for group co-operation, a key characteristic of humans, appear to have selected for deception too17c.
The simpler end of this arms race is driven by individual deception, the complex end by group deception. In each there is a balance point of withheld judgment18, i.e. where the detection is sufficient to uncover issues, yet not sufficient to prove deception. Thus a suspicion of deception arising from skills long honed by the evolutionary arms race (doubt as to the truth of something) is skepticism. An underlying manifestation from evolutionary processes fits well with the evidential long timescales noted in section 3.
So when by virtue of evolved abilities (maybe at times aided by conscious application) someone detects or at least suspects deceit in an individual, it’s not too hard to see how this suspicion can be independent of any knowledge related to the concept the individual is conveying. Michael Shermer lists a subset of clues19 such as nervousness, excess control20, apparent rehearsal, consistency of a story over time, plus the increased likelihood of such clues manifesting when the subject is under cognitive load. To these can be added weightings derived from the level of intimacy and trust. Hence to assess doubts about the veracity of the concept, domain knowledge is not required.
Yet what if the concept in question isn’t presented to us by an individual, but by an entire social group? What then does the arms race look like?
Group deception and skepticism
Ultimately a product of group selection21 and gene-culture co-evolution21a, the social groups caught up in the deception versus detection arms race are cultural, i.e. they feature an emergent socially enforced consensus, a cultural consensus21b. Despite underlying social truths, a cultural consensus is in itself a collective deception22a,b that provides major advantages including: a coalition system for combating individual dominance, an underwriting of altruism within group, and mechanisms to achieve common action in the face of the unknown. The fairy-tales put out by religions, which have been dominant cultural narratives throughout history, are examples of powerful collective deceptions.
Just as in the case of individual deception, there are various domain-independent clues available for instinct to exploit in detecting the typical deceptions of a whole group (advocating a cultural consensus). All told these can invoke sufficient doubt, at least, for folks to withhold judgment regarding the truth of a promoted concept. Typically most adherents of a culture are honestly motivated, so the clues in section 4 aren’t useful. Relevant signs include that the group’s overall expression is:
- too coherent and coordinated (policing naturally occurs within emergent cultural consensuses),
- too certain (that which challenges a cultural consensus, including uncertainty, is belittled or bypassed),
- too forceful (e.g. suppressing other views),
- too emotive (positive passions as well as say fear and worry; these are how the cultural narrative gets iteratively selected in the first place),
- too arrogant (e.g. demeaning and / or demonizing dissenters),
- too universal (the applicability of the concept across society, via tenuous connections, is exaggerated),
- too existential (potential threats exaggerated; becomes the bogeyman), and
- too associated with convenient belief23.
Similarly to the individual case, familiarity and trust in the messaging sources will also weight the assessment.
While not all examples are so blunt, and many would be considered part of normal social messaging, an obvious case typically revealing itself via most of these clues is state propaganda, especially as put out by an extremist regime. Via several papers24 (2005 to 2012, and not focused on climate change), Lewandowsky and co-authors confirm that a ‘stable personality trait’ of skepticism boosts our resistance to this type of misinformation, i.e. deceptions resulting from cultural / worldview biased transmission.
So public skepticism arises from an evolved instinct that detects collective deception and doesn’t need detailed domain knowledge in order to operate. To distinguish this from ‘philosophical skepticism’ (PSk) or ‘scientific skepticism’ (SSk), we can call this ‘innate skepticism’ (ISk).
The limits of Innate Skepticism
ISk works independently of detailed domain knowledge, but not independently of pre-existing social values and aspirations. So for example, if the expression of a rising cultural consensus aligns well with an individual’s pre-existing values:
- universality will seem quite ‘natural’,
- emotions will resonate, thwarting the objectivity needed to sense that these may be inappropriate,
- coherence and certainty will confirm the comforting value framework upon which social identity is based,
- convenient belief will be overlooked,
- the demonizing of others may seem justified considering the stakes, or at least a blind eye will be turned, and
- the existential narrative will confirm (once emotions are engaged) that those high stakes are valid.
All of this applies especially if the information sources are largely trusted (which may simply mean they are sympathetically biased). Yet if the individual’s existing values are unaligned with the rising cultural consensus, clues to collective deceit will seem much more obvious, meaning skepticism is likely, and will be particularly sharp in those with opposed values.
Some of the pre-existing values will themselves derive from collective deceptions, so across a population ISk would be modulated depending on the rival or ally status of the newer culture25. Section 7 includes the context for individuals regarding this concept.
So ISk can be undermined, as the powerful sway of many cultures makes clear, yet never wholly within a population26. In modern times, and also as evidenced by the historic references above, significant skepticism always seems to persist in association with cultural beliefs (footnote 15 is also relevant here). And given the typically low profile of historic skepticism relative to consensus messaging (i.e. veiled, or relegated to unofficial channels or graffiti etc.), it may have been stronger than its surviving footprint suggests. Note: the balance between ISk and belief is dynamic rather than static27.
Innate Skepticism as Cultural Resistance
Notwithstanding specific co-evolutionary linkages, the selection pressure upon biology is not for cultures per se but for social thinking28; the way we think, hence also our identity, is bound up with the values of our social groups. In complex modern societies we can both buy into and repudiate various social groups at various levels, simultaneously. So per section 6 each of these relationships attenuates or sharpens ISk, which protects our precious identity and aspirations. This is consistent with the measured effect Dan Kahan calls identity protective cognition29. We’re all genetically and culturally unique so our modulation profiles are likewise individual, yet major statistical assemblies are readily identifiable in the supportive or skeptical responses that reflect our lifestyle, livelihood, and deeply-held beliefs29a.
So ISk can be thought of as cultural resistance to that which threatens our identity; a useful model (per section 1 invoked by L2016) although it tends to make us think more about those with opposed rather than merely unaligned values. (Equivalently, cultural beliefs are conceptually an emotive part of identity that restrains the ISk against them30). This defensive role31 of ISk provides some guard against local decadence as well as alien culture31a.
The link to identity and values means that an individual’s ISk in one domain (say GMO) says nothing about their ISk in another (say climate change) that invokes different value challenges (but see 12/8a below).
Innate Skepticism and truth
Discovering truth isn’t needed to fulfill the above role. Indeed on some cultural consensus issues, e.g. human origins, historically the truth simply wasn’t available. Withheld judgment can be sufficient. Or bounded skepticism, e.g. not a disbelief in God but a disbelief in his current agents upon Earth (which is, equivalently, a modified belief). And in a cultural competition occurring within domain overlap, one cultural ‘truth’ can form the focal point for skeptical resistance to another. Or sustained strong skepticism of an established culture (maybe due to decadence) can create conditions in which a breakaway culture eventually arises32. Yet in a literal sense all competing cultures are just as untrue; they’re all collective deceptions.
The entanglement of Science
Science with social impact, or perceived impact, is tangled in the group deception / detection arms race because (at least):
1) Correct science may be associated with, or promoted or transmitted by, a specific culture.
2) Correct science may challenge values and contradict knowledge established by a culture.
3) Scientific theories often have genuine and significant uncertainty, opening a window to cultural judgments and bias.
4) Via a raft of bias mechanisms, culture can divert or hi-jack science in a particular domain33.
5) Science as an enterprise has picked up cultural characteristics33a, 22a.
6) Strong ISk about a promoted theory may motivate a pursuit of truth via science (SSk)43.
As noted in section 3, ISk is the only framework via which the uninitiated public can interpret and judge the many competing claims and alleged uncertainties, the often obscured loyalties of information and funding sources, plus all the other domain complexities associated with a scientific theory or discovery that becomes socially contentious.
So regarding 1) correct science may be rejected because the cultural package it comes in is rejected by innate skepticism correctly detecting clues of collective deception33b.
Regarding 2), science faces a long-evolved system looking not for truth but for specific clues. Some of these clues will incorrectly be detected by individuals with unaligned values. For instance the absolute certainty granted by straightforward replication ironically looks just like the unwarranted certainty that cultural consensuses enforce. Scientific zeal may trigger the detection of emotive content, as would occur in a cultural narrative (and if the zeal for a particular theory is way OTT, this detection isn’t really wrong). A kind of convenient belief is exhibited by those who don’t understand the theory, yet simply believe ‘because it is science’. The authority science projects, maybe too arrogantly by some scientists, can trigger within some of the public a detection of the demeaning function that cultures employ.
Point 3) means that not only will ISk and cultural beliefs get much more freedom to operate in members of the public regarding the concept at issue, but also in scientists embedded within that public. Even scientists are not Vulcans, able to rise above all long-evolved behaviors. So this may sometimes lead to 4), the worst case of which is a cultural consensus hi-jacking science and hence posing as a scientific consensus33. Such a consensus will correctly trigger the detection mechanisms of ISk, yet not in individuals having closely allied value systems. And ISk may also correctly detect 5).
Hence in resisting that which is bannered as science, ISk will sometimes be apt, and sometimes inapt.
Innate Skepticism and ‘denialism’
So ISk in members of the public will be undermined, maybe overridden by belief and support, regarding promoted scientific theories or discoveries that are “convenient” or encouraging to one’s lifestyle, livelihood, or deeply-held beliefs. And returning to L2016’s pertinent question from section 1, ISk will indeed be strongly aroused by scientific theories or discoveries that are “inconvenient” or threatening to one’s lifestyle, livelihood, or deeply-held beliefs. No one group will be anti-science generally, but anti the science that challenges their values, and pro the science that aligns with their values34. So where does so-called ‘denialism’ fit in? Well most likely, it doesn’t; certainly not if we assume the context of the ‘denialism’ meme’s vague and falsely negative implications, i.e. of a pathological condition, or systemic individual lying, or both34a.
We need to confront a much more challenging reality. ISk isn’t a different reaction depending on whether history eventually proves it right or wrong. Public reaction to a scientific theory that turns out to be apt, sometimes forming a groundswell that helps overturn dogma and aid progress, and public reaction that turns out to be inapt, sometimes forming a groundswell that resists progress, both spring from the same cause, even though in the latter case resistance may continue after clear-cut scientific replication is available. These are the two faces of innate skepticism.
Distinguishing Apt and Inapt ISk
Because of cultural alliances and other effects35, there is typically cultural behavior on both sides of an entrenched and polarized socially contentious issue (likewise rhetoric and poor behavior governed mainly by individual factors). So when suspecting that science faces inapt ISk (‘denial’ is only this, once the section 10 negativities are discounted), observing such behavior does not confirm which side is which.
The 3P tests4 for a dividing line between skepticism (aka apt ISk) and ‘denial’ (really, inapt ISk), which evaluate rhetoric and source authorities, amount to little more than justifications for the authors’ own biases35a. Providing no theoretical basis for either skepticism or ‘denialism’, these papers also fail to mark the critical difference between sides and groups. Further, as the ‘fundamental relationship’ from section 1 is essentially equivalence (the same mechanisms drive apt and inapt ISk), we can’t reliably distinguish them by this means.
Fortunately, when there is social data, social analysis can tell us which issues are essentially cultural, i.e. promoted by a group with a cultural consensus, against which ISk is most certainly apt. Yet such a group is usually only part of a side. Where two groups occupy a side, only one owns the issue, so perceiving the allied group as the cultural source of the issue is inappropriate, albeit easily done if the allied group has a higher profile than the owning group. Likewise where a group occupies the same side as an evidential position, this side cannot be viewed as monolithic; the cultural behavior stemming from the former does not invalidate the latter, plus the ISk from the side is still apt. In both cases the allied groups are along for the ride, reaping some benefit from their relationships40.
E.g. ‘Who is who’ identifies the three main groups in each of the US climate and creationism domains (which feature opposite asymmetrical alliances40a).
NOTE: Social analysis cannot say what science is true; only which narratives are cultural collective deceptions, which are never true22a,b,42 para4. An evidential position is no guarantor of truth41, yet it is not a collective deception. Apt ISk is no guarantor of good behavior! (Incidentally, having acquired sufficient terminology now, footnote 42 translates the cultural view into Kip Hansen’s MSC view).
This theory of Innate Skepticism (ISk), as relevant to a non-expert public, says:
1) ISk works independently of detailed knowledge of a contested domain.
2) ISk attempts to detect collective deception, not pursue truth.
3) ISk leads to doubt, withheld judgment, and sometimes modified beliefs.
4) Acquired beliefs suppress ISk, yet never wholly within a population.
5) Apt and inapt ISk are fundamentally the same reactive behavior.
6) ISk not pitched against a cultural consensus36 (collective deception) is inapt.
7) Apt ISk points to falsity (collective deception), not what alternatives are true.
8) Individual ISk cannot be assumed to cross domain boundaries, though…
8a) If domains have strong cultural alliance, ISk will more likely be aligned.
9) The capability for ISk arises from evolutionary processes.
10) ISk is entangled with the enterprise of science.
These characteristics have fundamental implications for all socially contentious science issues; just one example is the likely failure of ‘climate change education’37.
The ISk of experts
Expanding our scope from the public back to experts, what about their innate skepticism? I used the Helicobacter reference above because it seems the hardest kind of case for out-of-domain tests, perhaps never determinable39, and also because it raises critical questions.
The strong 1980s consensus view said gastric disorders had a physiological basis and weren’t due to infection, a dogma that had dominated since the 1950s despite significant historic evidence. Given this almost exclusive focus for decades, gastroenterologists didn’t have expertise regarding bacterial possibilities. So was their assessment framework largely reduced to that used by the public? Did they judge Marshall’s and prior theories via ISk, attuned to their peer group? If so, this ISk proved inapt. These are fundamental questions, because if most scientists who lack expertise typically fall back on ISk:
1) Any sufficiently left-field concept challenging an established view will always face ISk (strong ISk, if socially contentious), even from scientists and despite plausible evidence.
2) In fields incorporating many disciplines, most scientists will lack expertise in most of the disciplines. So the process of building the big picture is very vulnerable to warring beliefs and ISk. Culture / identity could outbid truth; an observational match with CAGW.
Conflation of ISk and SSk
ISk is very different from SSk. Unless the former is characterized and distinguished38, then considering the deep entanglements of science with powerful cultural mechanics (of which ISk is a part), we’ll never be certain in contentious debates which is which, which is dominant where, and whether SSk is operating objectively as it should or is mixed with lesser or greater ISk.
Where ISk (or similar identity-related accounts like Kahan’s) isn’t even acknowledged, debate about the nature of skepticism will be confused. For instance Michael Shermer comments: ‘science and skepticism are synonymous’. Yet this ignores innate skepticism and its entanglement with science, making conflation of ISk and SSk inevitable. Indeed Shermer’s recommendation to investigate the sources of claims to help establish truth / untruth is a slip into ISk. Sources and their authority are about identity, not argument; they’re irrelevant for objective in-domain SSk. They are relevant to out-of-domain social analysis (also a scientific inquiry, yet topic independent), though only to the extent of establishing ownership or not by a cultural consensus, a procedure neither Shermer nor the 3P tests address.
Further conflation is apparent in the same article (discussed at Climate Etc.) and even phrased using terms of (religious) culture: ‘It is to find the essential balance between orthodoxy and heresy, between a total commitment to the status quo and the blind pursuit of new ideas, between being open-minded enough to accept radical new ideas and so open-minded that your brains fall out. Skepticism is about finding that balance.’
In the sense of social conformance to particular views, science should have no need of orthodoxy and no commitment to a status quo (a social device). Hence there’s no need for heresy either, and no balance between these culturally defined poles; just alternate propositions that, like the challenged ideas, should stand or fall purely on their merits, so ultimately upon evidence. And exercising withheld judgment, a principle common to all skepticisms, means our brains are unlikely to ‘fall out’ should distinguishing evidence not yet be obtainable; note also that no alternative theory is necessary for validly withheld judgment. Nor would a blind (to domain conventions) pursuit of new ideas typically represent a critical science-related danger to society44. A much greater danger is that institutional SSk fails to maintain objectivity and drifts into ISk modes, at which point a whole scientific mainstream, often pushing widely applied policy, may owe a lot more to a collective deception than to science.
The Ancient Greeks attempted to formalize instinctive skepticism and make it independent of values; a strong linkage to cultural values still challenges scientific skepticism where science has social impact.
Social psychology asks pertinent questions about skepticism in the public. Yet it seems that, especially for those practitioners covering the domain of climate change (who in the great majority of cases assume the orthodoxy of certain calamity as a hard-baked prior), the field not only avoids the answers it doesn’t like, but typically avoids investigating the actual root causes of public skepticism. Maybe because that activity could lead away from a comfort zone of ‘approved’ consensuses, with ‘denialists’ firmly in the naughty box. This post may not provide the best, and certainly not the fullest, answers, but at the least it highlights a productive area needing much more attention.
Moderation note: As with all guest posts, please keep your comments civil and relevant.