Facts won’t change minds about animal medicine, so should we bother trying?

Author

Robyn Lowe
Robyn J Lowe BSc Hons, Dip AVN (Small Animal), Dip HE CVN is a small animal Registered Veterinary Nurse (RVN) who regularly writes articles for academic journals and publications for animal owners. Robyn has a passion for evidence-based medicine, volunteers for Canine Arthritis Management, runs the Veterinary Voices Public Page, and campaigns on mental health and animal welfare issues.


In veterinary medicine (as in human medicine) we are often faced with people who believe that, despite our oath to advocate for our patients, there are veterinary professionals who are actively trying to harm animals.

As history shows with the likes of Harold Shipman, there are certainly sad and disastrous events that occur in human and veterinary medicine that we can learn from to ensure they never happen again. They are, thankfully, rare.

Yet sometimes, people believe – magnified and exacerbated by social media – that there is widespread harm being done because a mass of professionals is ‘in’ on a conspiracy, trying to make money, trying to harm pets, and more. This is not true. But, as with most things, the loud and vocal minority can make a lasting impact – especially when they set out to attack the integrity of a compassionate profession.

One example is anti-vaccination rhetoric. A 2020 study into human vaccination concluded that, over a 20-year period, vaccines were remarkably safe. Similarly, a 2004 canine study found no temporal association between vaccination and ill health in dogs. Although there is always risk with any medication, and medication reactions do occur, the preponderance of the evidence supports minimal risk of harm from vaccines. Even when we look more closely at specific vaccine brands that are particularly vilified, the evidence still suggests that adverse events are incredibly rare.

You need only look at the re-emergence of measles in the UK and USA – countries that had previously, thanks to vaccination, almost managed to eliminate the disease. New data suggests that it could soon become endemic again if vaccination levels remain as they are.

This re-emergence was fundamentally caused by a now-disproven paper that linked autism to a vaccine, and the subsequent erosion of trust in science by misinformation. The General Medical Council found the paper’s author, former doctor Andrew Wakefield, guilty of serious professional misconduct – but, despite the paper’s retraction and its claims being disproven by numerous studies, some people still cite Wakefield’s research as evidence of the harms of vaccines.

Medical skepticism has been the subject of social-psychological research, which found strong correlations between vaccine skepticism, adherence to complementary and alternative medicine, and conspiracy ideation. A 2019 study by Cuevas et al. suggests that mistrust of healthcare may unfavourably affect patient-clinician interactions and patient outcomes. To tackle it, we can’t just present facts, because facts alone do not change minds; we need a more systemic, long-term strategy that addresses the root causes of medical mistrust.

One 2020 paper (Scherer et al.) looked at three theoretical perspectives on why certain people are susceptible to online misinformation: lacking the knowledge or literacy to discriminate between true and false information; holding strong pre-existing beliefs or ideological motivations; and neglecting to reflect sufficiently on the truth or accuracy of news content encountered on social media.

Addressing pre-existing beliefs, research from Toomey et al. found that factors such as worldview, religion, and political beliefs have strong associations with rejection of science related to controversial issues. Not only that, but our bias towards those factors also means that we see data compatible with the beliefs we currently hold as more ‘valid’ than data that could refute those beliefs. This was further supported by a 2015 paper showing that acceptance of scientific evidence depends on the availability of alternative points of view, such as religious faith and political ideology: if scientific results conflict with a readily available alternative, individuals are less likely to defer to the research.

Furthermore, we are highly sensitive to the beliefs and actions of those in our immediate circle or community – is this our hunter-gatherer ancestral background rearing its head? One paper by Douglas et al. (2017) reported social motives (the desire to belong and to maintain a positive image of the self and the in-group) as a driver of conspiracy ideology.

Browsing Facebook: a very effective way to build mistrust in medicine – human or animal. Image by cottonbro studio, Pexels

So, while behavioural change can occur at the individual level, broader impacts require a focus on social networks and systems. Social media platforms are a prime example of how these beliefs are exacerbated, as people enter an echo chamber and feel part of a community or movement. Trying to engage in reasonable discussion with individuals who choose to engage aggressively is likely futile; data that runs contrary to their view is likely to drive them further away.

Additionally, the drain that direct confrontation places on your time, resources and emotions means it is unlikely to change minds. Does this mean we shouldn’t ever ‘call out’ or counteract medical misinformation? Perhaps not.

There is often a silent majority, quietly reading, sitting on the fence. These people, not yet aligned to any particular view, may read your professional, calm, polite and robust replies, and you may make a difference to them. Evidence shows that people exposed to correct information first are less susceptible to conspiracy or misinformation than those who have not been. This is called inoculation theory – it offers a logical basis for developing a psychological “vaccine” against misinformation.

A paper by Douglas et al. (2017) discusses factors such as epistemic motives (the human desire to find causal explanations for events, building up a stable, accurate, and internally consistent understanding of the world) and existential motives (the need for people to feel safe and secure in their environment and to exert control over it) as further reasons people latch on to alternative or conspiracy views – they provide an answer, and a sense of control over a situation that otherwise feels uncontrollable.

This is noteworthy in veterinary medicine – sadly, animals get ill and die, and sometimes these events are unexpected. We do not always perform a post-mortem to confirm a diagnosis, perhaps due to financial cost or the owner’s wishes. Owners, in their understandable state of grief or mental duress (anyone who has lost an animal companion can attest to this), try to find answers as to what happened – and can settle on blaming the vaccine or medication the pet had a week, a month or even a year prior, especially when anti-vax information is easily accessible once you start to search for it. Again, vaccine reactions do occur – hence the importance of pharmacovigilance – but overall vaccines have been proven incredibly safe.

Toomey et al.’s 2023 study concluded that most attitudes and behaviours regarding research decision-making are not based on the rational evaluation of evidence, but are determined instead by a host of contextual, social, and cultural factors and values. Therefore, providing additional information – even in accessible formats – is not likely to lead to significant changes.

So, facts will not always change minds, and a social media argument is unlikely to be beneficial, helpful or fruitful to you or the other person, and will probably cause both of you considerable anger, anxiety and frustration. Ultimately, you both believe you’re right, and on reflection want the same thing – a healthy, happy pet (you’re just going about it via different routes, one of which may not be evidence-based). But there is some indication that effective techniques exist.

With that in mind, how do we best get people to evaluate, remember and engage in evidence-based information?

‘Message elaboration’ broadly refers to the amount of effort an audience must use to process and evaluate a message, remember it, and subsequently accept or reject it. One paper from 2022 looked into the presentation of a message and its contents, evaluating how successfully it was received. The results indicated that including statistical evidence in messaging reduced elaboration, improving audiences’ understanding, with fewer misperceptions and increased perceived message believability. Facebook messages presented in this way were also associated with higher audience intentions to share, like and comment, showing greater engagement and favourability. The research also found that messages combining text and an image produced better message elaboration than image-only messages. This shows that, if we want to engage effectively with people on social media, we should consider how we present our messaging as well as what we’re saying.

Benecke and DeYoung’s 2019 study looks at the broader picture and identifies the need for a long-term educational strategy. In their work, they explain that medical professionals must take a different approach to education, including outreach to vulnerable communities and individuals; they also note that social media platforms have an active role to play in monitoring and removing false information.

In terms of bridging the gap between medical professionals and the public, and opening up conversations that are more likely to help than hinder, we need more spaces for group dialogue. Listening to multiple perspectives and “embodied knowledge” can help us alter our message to make it more likely to be well received, and to think more carefully about whom we seek to target. In essence, we should tackle medical misinformation, but we must be strategic in how we do so.

Direct contact between clients and veterinary teams can also end in a positive or negative interaction. Perceived dismissiveness, judgement or defensive behaviour by a veterinary team will likely build bigger walls and shut down opportunities for conversation and compromise.

Some of these issues can be mitigated by ‘physician-focused’ changes. For example, one study found that medical professionals may be able to buffer patients’ levels of medical mistrust using patient-centred communication skills. These include soliciting the patient’s concerns and priorities, and being responsive to the healthcare needs and belief system the patient identifies with.

Medical misinformation is rife in human and veterinary medicine. It is frustrating, and can cause poor patient outcomes. But we, as veterinary professionals, have to accept some accountability for how our behaviour shapes communication and our perceived image. There is no simple answer as to how we tackle it, but engaging in the right way can ensure we help, rather than hinder, communication with those who hold alternative views.


The Skeptic is made possible thanks to support from our readers. If you enjoyed this article, please consider taking out a voluntary monthly subscription on Patreon.
