by Judith Curry
“No matter what the hard risk sciences may tell us the facts are about a risk, the social sciences tell us that our interpretation of those facts is ultimately subjective.”
Dave Ropeik’s essay
The entire essay is well worth reading, and there are some good tables and figures as well. Some excerpts:
While this system has done a good job getting us this far along evolution’s winding road, it also gets us into trouble because sometimes, no matter how right our perceptions feel, we get risk wrong. We worry about some things more than the evidence warrants (vaccines, nuclear radiation, genetically modified food), and less about some threats than the evidence warns (climate change, obesity, using our mobiles when we drive). That produces what I have labeled The Perception Gap, the gap between our fears and the facts, which is a huge risk in and of itself.
The Perception Gap produces dangerous personal choices that hurt us and those around us (declining vaccination rates are fueling the resurgence of nearly eradicated diseases). It causes the profound health harms of chronic stress (for those who worry more than necessary). And it produces social policies that protect us more from what we’re afraid of than from what in fact threatens us the most (we spend more to protect ourselves from terrorism than heart disease)…which in effect raises our overall risk.
Here’s a mad dash through the literature on risk perception:
• Neuroscience by Joseph LeDoux et al. has discovered neural pathways that ensure that we respond initially to risky stimuli subconsciously/instinctively, before cognition kicks in. And in the ongoing risk response that follows, the wiring and chemistry of the brain also ensure that instinct and affect (feelings) play a significant role, sometimes the primary role, in how we perceive and respond to danger. Simplistically, the brain is designed to subconsciously feel first and consciously think second, and to feel more and think less.
• The research of Daniel Kahneman et al. has discovered a mental toolbox (as Gerd Gigerenzer puts it) of heuristics and biases we use to quickly make sense of partial information and turn a few facts into the full picture of our judgment. These mental shortcuts occur subconsciously, outside (and often before) conscious reasoning. This research further confirms that we are far more Homo Naturalis than Homo Rationalis.
• The Psychometric Paradigm research of Paul Slovic et al. has revealed a suite of psychological characteristics that make risks feel “more” frightening, or less, the facts notwithstanding. Recent research on the theory of Cultural Cognition by Dan Kahan et al. has found that our views on risks are shaped to agree with those we most strongly identify with, based on our group’s underlying feelings about how society should operate. We fall into four general groups about the sort of social organization we prefer, defined along two continua, represented as a grid. We all fall somewhere along these two continua, depending on the issue.
Individualists prefer a society that maximizes the individual’s control over his or her life. Communitarians prefer a society in which the collective group is more actively engaged in making the rules and solving society’s problems (Individualists deny environmental problems like climate change because such problems require a ‘we’re all in this together’ communal response. Communitarians see climate change as a huge threat in part because it requires a social response). Along the other continuum, Hierarchists prefer a society with rigid structure and class and a stable, predictable status quo, while Egalitarians prefer a society that is more flexible, that allows more social and economic mobility, and is less constrained by ‘the way it’s always been’. (Hierarchists deny climate change because they fear the response means shaking up the free market–fossil fuel status quo. Shaking up the status quo is music to the ears of Egalitarians, who are therefore more likely to believe in climate change.)
That risk is inescapably subjective is disconcerting for those who place their faith in the ultimate power of Pure Cartesian “I think, therefore I am” Reason. But the robust evidence summarized above makes clear that:
1. Risk perception is inescapably subjective
2. No matter how well educated or informed we may be, we will sometimes get risk wrong, producing a host of profound harms.
3. In the interest of public and environmental health, we need a more holistic, and more realistic, approach to what risk means. Societal risk management has to recognize the risk of risk misperception, the risk that arises when our fears don’t match the evidence, the risks of The Perception Gap.
The challenge is to rationally let go of our irrational belief in the mythical God of Perfect Reason, and use what we know about the psychology of risk perception to more rationally manage the risks that arise when our subjective risk perception system gets things dangerously wrong.
Dave Ropeik’s comment at collide-a-scape
DR has commented on Jonathan Gilligan’s post:
May I suggest that the most important point he makes is the lesson, for climate change, of how things turned out for Yucca Mountain, and WHY. The DOE and Congress, in their infinite lack of wisdom and intellectually naive belief in the facts and science and ‘reason’, effectively told Nevada “Tag! You’re IT!”, and ignored the findings of Slovic et al. that people fear imposed risks far more than risks they choose to take themselves. This key oversight, embodied in the ‘Screw Nevada Bill’, predictably doomed the Yucca project to decades of delay and opposition. Indeed, this is the precise factor cited by Phil Sharp of Resources for the Future and The President’s Blue Ribbon Commission (BRC) on America’s Nuclear Future when he says in their preliminary report that the reason Yucca failed was that it was jammed down people’s throats. Compare that to the way the Finns sited a high-level nuclear waste repository…offering potential host communities $$$ to study what would be involved, BUT GIVING THEM VETO POWER IF AFTER THEIR RESEARCH THEY STILL WANTED TO SAY NO. Of 6 possible host sites, 4 said no and 2 fought to host it! It got done in under 10 years (it’s nearing completion). The Swedes are copying this, and the Spaniards are trying. The BRC is wisely studying all those models, and visited Finland to learn how they did it.
This is a valuable lesson for climate change. As Prof. Gilligan points out, risk is not just about the facts, but how those facts FEEL. If we understand WHY people feel the way they do about climate change, (not HOW they feel, but WHY), we can respect the powerful psychological underpinnings of where people are coming from as we look for ways to encourage actions to mitigate and adapt. THAT’s where we will find progress, not in arguing the facts alone and trying to convince people to change their minds about the evidence per se.
(by the way, not to be too self-promoting, but “the research on the complex psychology of preferences, emotions, and uncertainty by Paul Slovic, Dan Kahan, et al.” is precisely what I have brought together and summarized in my book “How Risky Is It, Really? Why Our Fears Don’t Always Match the Facts”. Ch. 5 has initial proposed solutions to some big problems, using these insights, and is available free online at http://www.dropeik.com)
Nullius in Verba counters with this:
“I am not sure I entirely agree with David Ropeik on his “getting past the intellectual argument” stance.”
Me neither. It’s all very well to switch from arguments to motives, but the same logic applies in reverse. Should sceptics, instead of trying to argue climate science, try to find out why AGW believers believe as they do? And then find ways we can respect the powerful psychological underpinnings of where people are coming from as we look for ways to discourage hasty and counter-productive action. If the method works at all, it should work for us too, shouldn’t it?
JC comments: I have been reading some of the risk perception literature, but this essay really synthesizes and clarifies things. Much of the disagreement that is commonly assumed to be political is really more deeply rooted in psychology and cultural identities. Awareness of these issues is critical:
“. . . use what we know about the psychology of risk perception to more rationally manage the risks that arise when our subjective risk perception system gets things dangerously wrong.”