Opinions are individual and subjective. But facts are facts. Trouble is, they’re usually communicated or interpreted by someone. That’s where opinion comes back in…
And, says the Boston Globe, the facts backfire.
Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
…most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.
“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study.
“Cognitive dissonance” is more than mumbo-jumbo. This is where we are, culturally and academically. People are increasingly afraid to be wrong, claims First Things’ R.R. Reno.
For a long time as a young teacher, I believed the danger of prostituting their minds by believing falsehoods was the preeminent, or even singular, intellectual danger my students faced. So I challenged them and tried to teach them always to be self-critical, questioning, skeptical. What are your assumptions? How can you defend your position? Where’s your evidence? Why do you believe that?
I thought I was helping my students by training them to think critically. And no doubt I was. However, reading John Henry Newman has helped me see another danger, perhaps a graver one: to be so afraid of being wrong that we fail to believe as true that which is true. He worried about the modern tendency to make a god of critical reason, as if avoiding error, rather than finding truth, were the great goal of life.
That tendency has only grown in modern times, as the Boston Globe piece suggests.
Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts…
Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information. And then we vote.
Or take some other highly consequential action.
Reno says we have to want to know “the truth” and “risk error as we leap forward to grasp” it.
Sometimes, the dangers of failing to affirm the truth are far greater than the dangers of wrongly affirming falsehood.
If we see this danger—the danger of truths lost, insights missed, convictions never formed—then the complexion of intellectual inquiry changes, and the burdens of proof shift.
As Chesterton said, it’s not what you look at, but what you see.