Hit them in the feels: confirmation bias and the emotional component of reason

Author

Brian Macrae Davis (http://broconvo.com/)
Brian Macrae Davis primarily works in construction and is a yoga teacher. His interests lie in the topics of critical thinking and the art of conversation. That interest led him to create a podcast with his brother, Colin, called Brother VS Brother in which they discuss topics ranging from politics to social issues, usually from opposing viewpoints.


Imagine that you are a willing participant in a study. You are shown a photo of a man and given his name and his credentials in the field of meteorology, and the credentials are impressive. You are asked to rate this man’s expertise on the subject of climate change, from low to high. Without knowing the specific credentials yourself, how do you imagine you might answer the question?

In the next stage of the study, you are given his scientific opinion on climate change and find that he is dismissive of the idea that humans are causing the rise in global temperatures. You are then asked to rate his expertise again. Would your rating change?

Now imagine that you are placed in the other group being tested. You are shown the same photo and given the same list of credentials, and of course you give the same rating of the man’s expertise. The next step is different: you are given his scientific opinion, but this time it supports the idea that humans have caused the rise in global temperatures. This time around, would your rating of his expertise change?

When this study was conducted by Dan Kahan, who was exploring the Politically Motivated Reasoning Paradigm (PMRP), whether people’s opinion of the man’s expertise changed depended on their political ideology. Liberals were more likely to lower their rating of his scientific credibility when he dismissed global warming, and conservatives lowered theirs when he expressed support for the idea. The study suggests that what matters more than someone’s expertise is whether their perspectives align with our own.

The study illustrates that we tend to look for the data that supports our beliefs, and dismiss the rest. This is known as Confirmation Bias, which is well known among those who value a skeptical mindset. It is very easy to find Confirmation Bias in the arguments of others, but very difficult to recognise it in our own views.

Return to the study and your opinion of the expert. If your opinion of his expertise did not change once you learned that his opinion contradicted your own, you resisted the completely natural impulse to dismiss information that contradicts your belief. On the other hand, if your opinion of his expertise changed, you are a victim of your own Confirmation Bias. At this moment, you might want to consider whether you are proud of yourself for your impartiality, or whether you justified your shift in opinion with a comment that starts with something like, “Yeah, but he was wrong because the science shows…”

Debates about changing temperatures fall into a category of topics that we cannot settle by personal experience. We can’t look out the window and point at climate change. We might be able to point to an effect of climate change, but that is a faulty premise for an argument: all your opponent needs to do is point to a regional weather pattern that counters your experience, and your argument flounders. Convincing a person from Iowa that the drought in India is caused by climate change, as they watch their town disappear under the Mississippi River, is a tough message to sell because it does not fit with their experience.

That one winter which was incredibly harsh in your neighbourhood also saw record high temperatures somewhere else. That one summer with the highest number of hurricanes on record was followed by four years of normal hurricane seasons. When we discuss climate, we must set aside our personal experience and look at data spanning the entire globe and many decades. Our personal experience does not prove that humans cause a rise in global temperature; the data does.

Separation by distance or time can diminish the urgency with which an issue weighs on us. We may understand the problem intellectually, but we don’t feel pressured to act. The further removed we are from an issue, the harder it is to respond to it. To put it more simply, emotional response is proportional to proximity.

Most people are moved by feelings, not facts. Motivation is an emotional driver, and it is difficult to feel an emotional response to an abstract issue like climate change. Other issues in this category include famine or war on the other side of the world (separated by distance), the loss of fossil fuels in the future (separated by time), and recycling (we assume the soda bottle we put in the bin is being repurposed, when in reality it often ends up in a landfill in some other country). This is why those trying to earn your donations show images or tell stories of a victim: they attempt to personalise the suffering and bring it closer to you, so that you will have an emotional response.

A white woman (Sally Struthers) cuddling a black child

If you are old enough and lived in the US, you may remember the Save The Children commercials with Sally Struthers, who informed you that for $1.38 per day you could have a profound impact on a particular impoverished child. This commercial, and others like it, attempted to humanise the problem by giving it a face. Not only is the problem given a face, but you are offered the opportunity to help a specific child with an inconsequentially small amount of money (when it is broken down into a daily amount). Other programs allow you to send a goat to a village, or to purchase a decorated gourd made by an indigenous artisan.

This emotional disconnect from certain problems is, in a way, the opposite of Confirmation Bias, which is itself nothing more than an emotional reaction that should be suppressed. Confirmation Bias is an emotional response to an intellectual challenge. When Confirmation Bias influences our mindset, evidence loses its relevance: what matters most is how the information fits our existing worldview, not how accurate it is. We irrationally defend perspectives that align with our worldview and dismiss those that contradict it.

Remember the original study of the scientist whose perceived expertise diminished when he contradicted participants’ views? The facts did not change; the emotions did.

When someone presents information to you, how quickly do you assimilate or dismiss it? As soon as you determine whether it aligns with your existing beliefs, or only after you have weighed the facts? Are you willing to dispassionately hear information that contradicts something you take as fact, or is your mind made up? How deep does your Confirmation Bias go?

