
5 logical fallacies that make you more wrong than you think

by Judith Curry

The Internet has introduced a golden age of ill-informed arguments.  But with all those different perspectives on important issues flying around, you’d think we’d be getting smarter and more informed. Unfortunately, the very wiring of our brains ensures that all these lively debates only make us dumber and more narrow-minded. – Kathy Benjamin, CRACKED

CRACKED magazine has a funny but insightful article entitled 5 Logical Fallacies That Make You Wrong More Than You Think.  Here are excerpts describing the 5 fallacies, along with some commentary from me on examples from the climate debate:

#5.  We’re not programmed to seek ‘truth’, we’re programmed to ‘win’

Think about the last time you ran into a coworker or family member spouting some easily disproven conspiracy theory. When they were shown proof that their conspiracy theory was wrong, did they back down? Did they get this look of realization on their face and say, “Wow … if this is untrue, then maybe the other ‘facts’ upon which I’ve based my fringe beliefs also aren’t true. Thank you, kind stranger, for helping me rethink my entire political philosophy!”

That has literally never happened in the history of human conversation. Whether it’s a politician whose point has been refuted or a conspiracy theorist who has been definitively proven insane, they will immediately shift to the next talking point or conspiracy theory that backs up their side, not even skipping a beat. They keep fighting to defend their position even after it is factually shown to be untrue. But what’s really weird is that process — of sticking to your guns even after you’ve been proven definitively wrong — is apparently the entire reason humans invented arguing.

It’s called the argumentative theory of reasoning, and it says that humans didn’t learn to ask questions and offer answers in order to find universal truths. We did it as a way to gain authority over others. That’s right — they think that reason itself evolved to help us bully people into getting what we want.

And as evidence, the researchers point out that after thousands of years of humans sitting around campfires and arguing about issues, these glaring flaws in our logic still exist. Why hasn’t evolution weeded them out? The answer, they say, is that these cognitive flaws are adaptations to a system that’s working perfectly fine, thank you. Our evolutionary compulsion is to triumph, even if it means being totally, illogically, proudly wrong.

Back when evolution was still sculpting your ancestor’s brains, admitting you were wrong to the person you were debating got you bred out of existence. These days, being able to admit you’re wrong is the greatest skill you can develop if you want to stay married.

JC comment:  The warrior-like tactics being used by proponents on both sides of the climate debate aren’t getting us anywhere; if either side actually ‘wins’ in terms of policies, they might end up being totally, illogically, proudly wrong.  Staying married is a good analogy for what we should be doing:  seeking solutions and areas of agreement that support overall well-being.

#4.  Our brains don’t understand probability

 Do you know a guy who keeps a loaded shotgun under his bed? You know, in case a gang of European terrorists storm into his house and try to kidnap his family?

If you throw a bunch of statistics at him about how unlikely that is (for example, that he lives in a low-crime suburb in Wisconsin where there’s only been one murder in the last 40 years, that he’s statistically more likely to accidentally do something stupid than ward off a criminal and that more people were struck by lightning last year than successfully shot bad guys in the middle of committing crimes), it won’t change his mind. Instead, he’ll rebut you by citing a news story or an anecdote about a guy who successfully fended off a Die Hard bad guy thanks to his trusty 12-gauge. For him, that single, vivid example completely overrides all talk of statistics or probability.

It’s called neglect of probability. Our brains are great for doing a lot of things. Calculating probability is not one of them.

As experts point out, when there is strong emotion tied to the unlikely event, our ability to continue to see it as unlikely goes out the window. Thus, any statement of “It’s very unlikely your child will be eaten by a bear, these bear traps in the yard are unnecessary and keep injuring the neighborhood kids” will always be answered with, “Say that when it’s your child being eaten!”

JC comment:  this one really strikes a nerve.  It explains how extreme events (especially as they affect you or your country), or the lack thereof, drive the waxing and waning of support for climate change policies in an individual country.  It also explains the role grandchildren play in the debate.

#3.  We think everyone’s out to get us

If you’re smart and savvy, you know not to trust anyone. This is why we can excuse ourselves for using shady or flat-out dishonest tactics to win an argument. We’re sure the other guy is doing much, much worse.

The world is so full of hidden agendas and stupid ideologies that we have to do whatever we can to keep up. And “whatever we can” is often code for lying.

Think about all the people you’ve disagreed with this month. How many of them do you think were being intentionally dishonest? Experts say you’re almost definitely overshooting the truth. It’s called the trust gap, and scientists see it crop up every time one human is asked to estimate how trustworthy another one is.

We start assuming people have ulterior motives and hidden agendas as early as age 7 and from that point on, we never have to lose another argument for the rest of our lives. After all, if we assume the person we’re arguing with is lying, the only thing they can prove to us is that they’re a really good liar.

The more arguments you get into with those lying extremists from the other side of the aisle, the more you learn about how they lie, the faster your brain turns off after they start talking.

JC comment:  this explains the Peter Gleick episode as well as anything I’ve seen.  It also explains the refusal of consensus climate scientists to enter into debates with skeptics.

#2.  We’re hard-wired to have a double standard

[T]he fundamental attribution error . . . is a universal thought process that says when other people screw up, it’s because they’re stupid or evil. But when we screw up, it’s totally circumstantial.

The process feels so obvious when explained — we simply lack information about the context in which the other person screwed up, and so we fill it in with our own.

The reality is, of course, that you were on completely different roads. The assumption that everyone’s circumstances are identical is so plainly wrong as to be borderline insane, but everyone does it.

JC comment:  Skeptics, pay attention to this one.  Accusing scientists of fraud and malfeasance whenever a mistake is identified is not useful.

#1.  Facts don’t change our minds

Let’s go back to the . . . theory that people figured out how to build arguments as a form of verbal bullying rather than a method of spreading correct information. That means that there are actually two reasons somebody might be arguing with you: because they actually want to get you to think the right thing, and because they’re trying to establish dominance over you to lower your status in the tribe (or office or forum) and elevate their own. That means there’s a pretty severe cost to being on the wrong side of an issue completely separate from the issue itself.

That is why confirmation bias exists. We read a news article that supports what we believe, and we add it to the “I’m right about this” column. News articles that contradict what we believe are dismissed. We make up a reason — maybe the source is part of the conspiracy from the other side, or whatever it takes to make sure the “I’m wrong about this” column remains empty.

Researchers have done experiments where they hooked up people’s brains to scanners and then made them read a story pointing out something stupid their favorite candidate said. The logical parts of the brain stayed quiet, while the emotional parts of the brain lit up. Their brains were weighing the story, not based on what it logically meant for their position, but on the emotional/social consequences of that position being wrong.

Backing down  . . . means letting down your team. Every inch of your psychology will fight it.

JC comment:  Ahhhh, the tribe and the team!  The emotional/social costs of being ‘wrong’ in the climate debate have reached mammoth proportions. This is probably the scariest fallacy in terms of science, and we see ample evidence of it in the CRU emails.
