Why do people ignore evidence, and what actually changes minds?

Author

Zion Lights (https://www.zionlights.co.uk/)
Zion Lights is an award-winning Science Communicator and environmental advocate who makes complex science clear and compelling. She explores energy, climate, and technology through the lens of human challenges, showing how curiosity and evidence can shape a better future. Her work combines technical clarity with a sense of wonder, helping people see the big questions and the choices that define our civilisation.


In 2020, amidst the torrent of pandemic misinformation, a tweet claimed that a Covid-19 vaccine would implant tracking microchips in people. Within hours it had been shared thousands of times, spreading fear faster than any scientific explanation could counter it. Health authorities rushed to post facts, but the viral falsehood had already taken hold. The episode illustrated a frightening truth: while misinformation isn’t new, in the age of social media it can spread more rapidly than we are able to deal with it.

As Jonathan Swift once wrote,

“Falsehood flies, and the Truth comes limping after it; so that when Men come to be undeceiv’d, it is too late; the Jest is over, and the Tale has had its Effect.”

This quote is also a fitting illustration of the problem it describes, as it is frequently misquoted and misattributed to other writers. But, arguably, a misattributed quote matters little compared with misinformation that leads people to fear and reject things essential to human health, such as effective medical interventions. So what can we do about that?

For decades, early science communication relied on a simple strategy: present the evidence and expect people to change their minds based on numbers and graphs alone. In science communication studies this is known as the Deficit Model, and it has been thoroughly debunked: experience and research have shown in recent years that the approach rarely works. People cling to beliefs not because they fail to understand fact-based arguments, but because those beliefs are entangled with identity, ideology, and community.

To persuade people effectively, scientists have to go beyond data alone. They have to combine evidence with empathy and storytelling, crafting messages that speak to both the mind and the heart.

If this sounds challenging, it helps to understand why it works. The first thing to understand is that cognitive biases make false beliefs stubborn. Humans favour information that confirms what they already think – a tendency psychologists call confirmation bias. The more emotionally charged a false claim, and the more it fits with someone’s existing belief system, the harder it is to dislodge. In such cases, attempting to correct a belief with facts alone may even backfire, reinforcing it in the other person’s mind – a phenomenon known as the Backfire Effect (though the latest evidence suggests those fears may be overstated). This helps to explain why vaccine misinformation persists despite decades of public health campaigns sharing vaccine data.

Misinformation sticks not only because of psychology but also because it travels as a story. Facts are inert, while stories are memorable and personal. Often, the outlier in a dataset tells the most compelling story: a dramatic anecdote about a vaccine side effect, or a single extreme weather event, can carry more weight than tables of data that tell a different truth. We are a storytelling species: the human brain evolved to respond to narrative, to remember lessons and patterns embedded in social context. When scientists ignore this, evidence struggles to compete with storytellers who may not have the best intentions.

However, understanding why people believe false claims is only half the battle. Communicators also need strategies to make corrections stick, and this is where research in cognitive science and psychology offers insights.

First is the principle of identity-protective cognition: people reject information that threatens their social, political, or cultural identity. A correction framed as a direct challenge to someone’s worldview will almost certainly fail to convince them. This means that when countering a viewpoint, it’s essential to take into account and speak to the person’s belief system, meeting them where they are, rather than demanding that they step away from the groups and values that shape their sense of self.

Then there is the role of empathy, which is a surprisingly powerful – and often underused – tool. Science communicators who acknowledge concerns rather than dismiss them create space for dialogue. Studies show that people are more receptive to corrections framed around shared values than to confrontational messaging. For example, a climate skeptic may resist charts of carbon emissions, which represent abstract statistics, but is more likely to engage with a story about how extreme heatwaves are disrupting a local community. The human experience grounds abstract data in an emotional narrative, allowing the evidence to be reframed. Similarly, public health campaigns that frame vaccination as protecting loved ones or the broader community tap into emotional responses, reducing resistance.

Another important element is communicator trust. This refers to the confidence an audience has in the person or source delivering a message. It’s the belief that the communicator is honest, knowledgeable, and has good intentions. When people trust a communicator, they are more likely to accept, understand, and act on the information that person shares – even if it is scientifically false. Communicator trust is often built through credibility, authenticity, and consistency over time, and it is usually linked with the most effective storytellers.

[Image: a doctor in a white lab coat and surgical mask giving a thumbs up.] Why wouldn’t you trust your doctor? By Fotos, via Unsplash

It can be daunting to start from scratch with building trust in the digital world, but the good news is that some groups are already naturally more trusted than others, and they can leverage this to debunk misinformation more successfully. For example, doctors are generally highly trusted because of their expertise and perceived good intentions.

You may ask: if trust in doctors is so high, why do people reject vaccinations in the first place? That is a very good question. In those instances, doctor–patient trust has almost always been eroded through a negative experience of some kind. Again, there is positive news: studies show that even once lost, trust in doctors can be regained through transparency, empathy, and demonstrated competence. Being honest about uncertainties, listening carefully to patients, showing care for their well-being, and consistently providing reliable medical advice all help rebuild trust over time. This is an underused element of science communication that can yield significant results – studies have found that vaccine-hesitant parents often change their minds after being invited to a one-to-one conversation with a qualified doctor, through having their fears heard and addressed respectfully.

Even in the most resistant populations, messenger choice matters. Research shows that peers, community leaders, or individuals with shared identity traits can influence beliefs far more effectively than distant experts. Similarly, relatable voices telling stories of climate impacts can persuade audiences that data alone cannot reach. It may seem counterintuitive that people are more likely to trust a single individual than broad scientific consensus, but it reflects how the human brain is wired – we respond more strongly to personal stories and relatable messengers than to abstract data, however robust it may be.

The implications of these insights are significant. To be effective, science communicators need to learn to craft messages that engage empathy through storytelling, in recognition of the fact that persuasion is as much an art as a science. Changing minds is not easy, but it is possible. Evidence shows that respectful, narrative-driven communication can reduce the influence of misinformation and encourage people to rethink deeply held beliefs. The goal is not to shame or lecture people, but to connect with them and guide them toward understanding without triggering defensive resistance. In a world saturated with misinformation, the ability to communicate science effectively is as critical as the science itself.

Ultimately, combating misinformation requires humility and persistence, but it is also necessary to build trust, strengthen understanding, and create a foundation for informed decision-making in society. If we want truth to keep pace with fake news, we need to learn to meet people where they are, communicate with empathy, and commit to the long, patient work of rebuilding confidence in reliable information. Only then can evidence claim a central position in public discourse, guiding decisions instead of being drowned out by convincing but misleading claims.

The Skeptic is made possible thanks to support from our readers. If you enjoyed this article, please consider taking out a voluntary monthly subscription on Patreon.

