Facts don’t appear to change our minds


By Jeff Linville - [email protected]



I learned a new phrase this week.

I knew the condition well enough already, but I didn’t have a catchy title for it.

Confirmation bias.

Are you familiar with this phrase? Well, even if you aren’t, you’ll certainly recognize the symptoms.

You are talking about an issue with your friend, relative or neighbor. They give an opinion that is just ridiculous. Surely no reasonable person could think that way.

You try to give some facts that disprove what they say, but it doesn’t do any good. You try to give some instances where their logic would fail, and they refuse to see your point.

Sounds a lot like the 2016 election season, doesn’t it?

We certainly saw plenty of this on both sides of the presidential race. No matter how much evidence someone could bring up about Hillary Clinton’s shady past, Democrats were unshaken. No matter how many times someone brought up that Trump has botched many business deals, Republicans were convinced that Trump was a successful businessman and would make a great leader.

Turns out, it’s not just politics where we have this confirmation bias. It shows up in sports arguments, debates over the best way to educate children, nutrition and diets, and much more.

Humans have a tendency to look at a small amount of data, make a decision based on that inadequate knowledge, and then steadfastly refuse to change that opinion.

As a sports fan, I certainly know this to be true. Some dude is convinced that one player is better than another. But when I bring up stats that show the second guy is performing better, the dude blows off my research, saying you can’t tell greatness from numbers (though if he has facts that support his side, you can be sure he will bring them up).

An article this week in The New Yorker looks at a couple of books by researchers on this phenomenon.

“The Enigma of Reason,” by cognitive scientists Hugo Mercier and Dan Sperber, discusses a well-known experiment concerning the death penalty.

A group of college students was given two phony studies to examine. One fake report gave data showing that capital punishment deters crime. The other gave equally phony stats showing that capital punishment does not deter crime.

The ones who favored the death penalty thought the first report sounded highly credible, but the second one was either suspicious or just unconvincing. Those who were against the death penalty felt just the opposite.

Despite the fact that both reports were fake, students came out of the experiment feeling that their respective positions had been strengthened.

Scientists tend to agree on the idea that natural selection shaped who we are today. From the caveman days up to about the start of the 20th century, those who were weak or sickly died off while the stronger remained.

Cognitive scientists feel the same way about how the brain works. We evolved in ways that helped us survive our harsh conditions.

And yet, confirmation bias seems to be a trait that would not be selected for, Mercier and Sperber note. If a caveman saw no wild animals around his cave for a few days and drew the conclusion that no danger was present, he would soon be dinner for some big predator because he ignored the warning signs.

The authors suggest that this trait developed not as a way to survive Mother Nature, but rather as a way to cope with living and working in tribes. If a tribe leader decided it would be best for everyone else that Grog go hunting, the others might agree, but Grog would protest out of his own self-interest.

And let’s face it, when it comes to poking holes in a belief, we are great at spotting someone else’s weaknesses, but blind to seeing our own.

College professors Steven Sloman and Philip Fernbach penned a book called “The Knowledge Illusion.”

People in general think they know more about a particular topic than they actually do.

When you start drilling down into that knowledge, however, their ignorance shows through.

Sloman and Fernbach mention a study at Yale where graduate students were asked to rate their understanding of everyday objects such as zippers and toilets. Students tended to rate themselves as pretty knowledgeable about these things.

So then the study asked them to write step-by-step details of how the objects function, and the students suddenly realized they didn’t know as much as they thought.

Some might argue, “Well, I don’t need to know how the toilet works in order to flush it.”

That’s true, but the same point holds true when it comes to things like improving schools and testing a child’s level of learning.

Sloman and Fernbach gave an example of a study in 2014 after Russia annexed the Ukrainian territory of Crimea. The respondents were so ignorant of geography that the median guess on where Crimea was located was off by 1,800 miles, like saying Charlotte sits on the other side of Santa Fe, New Mexico.

The farther off-base people were about where Crimea is (it’s on the Black Sea), the more likely they were to favor military intervention.

I’ve asked several detractors about “Obamacare,” and almost none of them knew that the Affordable Care Act isn’t what Barack Obama proposed (the Senate’s version more closely matched GOP Gov. Mitt Romney’s insurance law in Massachusetts).

And yet, in a democratic society, an ignorant man’s vote counts as much as an educated man’s.

Jeff is the associate editor and can be reached at 415-4692.
