Strange Science: Why we prefer our opinions to facts 


Our society has become so contentious that we often can’t even agree on what is a fact and what is an opinion. Many of us have lost the ability to be objective about things that matter, e.g. politics and religion. Many more never had it in the first place. It’s become difficult for some folks to get through the day without yelling “fake news!” at least once, and we don’t even agree on what that vague term means: is it the mainstream media, or is Fox News the real “fake news”?

However, the problem of distinguishing true from false isn’t just a matter of drawing an intellectually or morally correct conclusion. First, we have to deal with our own unconscious biases, according to a scientific study of “involuntary opinion confirmation”—our tendency to automatically rate as “true” any information with which we already agree.

When I first read this information, I said to myself, “I’m a professional journalist. I don’t ‘involuntarily confirm’ any opinions. I analyze them objectively. With neurons firing at near light-speed, the information jumps across synapses to brand new conclusions that are based only on facts.” 

Yeah, sure. I wish. 

The study I referred to above (“That’s My Truth: Evidence for Involuntary Opinion Confirmation”) indicates that all of us may have trouble distinguishing between facts and opinions. The scientists wrote, “Past research has investigated deliberate mental acts that allow people to remain entrenched in their convictions. The purpose of the current investigation was to examine whether opinion-confirmation processes can occur involuntarily. We conducted experiments wherein participants made speeded judgments of the grammatical accuracy of statements pertaining to various matters of opinion, and subsequently rated their agreement with those statements. The results show that participants more readily verify the grammaticality of a statement when it corresponds to their opinion. These findings may help explain why opinions are sometimes change resistant, in showing that acceptance (rejection) of confirmatory (contradictory) opinions can occur involuntarily.” In other words, the researchers claim that when we are exposed to opinions with which we already agree, our brains automatically label them as “facts”. No conscious thought required. Knee-jerk reactions welcome.

If involuntary opinion confirmation turns out to be a fact itself, it will fit right in with two other well-known phenomena—“confirmation bias” and “the backfire effect”. The former is our hard-wired tendency to prefer information that confirms our biases; this is why conservatives generally read writings that are slanted to the right, and liberals stick with those that lean to the left. The latter is our propensity for resisting facts that contradict our opinions; the more facts you throw out to prove your point to your opponent, the deeper your opponent digs in to resist. Put all these biases together and you have a pretty good explanation of why it’s so hard to change people’s minds about things they care about.





About the author

Dave Segal

Dave Segal, a Detroit native, has been a journalist since 1977. He has worked as a reporter, commentator, and news director at radio stations in Detroit, Denver, and Montrose.

Dave has been writing and editing for the Monitor since its first print issue in 2003. He is editor and senior writer for the digital magazine. On the side, Dave has also done freelance writing, media relations, and a variety of volunteer work.