Conspiracy theorists, as I have written in a previous article, employ a series of rhetorical tactics to prove their theories. This is, for them, a necessity: they cannot rely on the evidence for their position, because it is insufficient, and they cannot rely on logic, because their arguments will typically fall apart when exposed to it. What they are left with is a toolbox of tactics known as “informal fallacies.”
They are informal in that they do not violate any of the laws of formal logic, or, as I called it in graduate school, ‘letter math’. In a formal setting we have a simple syllogism of the AAA type: all S are P, all P are Q, therefore all S are Q. That argument could be written as, “All penguins are birds, all birds can fly, therefore all penguins can fly.” The argument follows the rules of logic and is formally correct, but it commits an error in evidence: the second premise is false. This renders the argument valid but unsound. Informal fallacies, by contrast, are errors in the content or context of an argument rather than in its form. Conspiracy theories that cherry-pick their data, for instance, are committing an informal fallacy. Most informal errors are contained within the conspiracy theory itself, while others only function within a debate against someone who is antagonistic toward the theory.
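For readers who like to see the machinery, the syllogism described above is the classical AAA-1 form (traditionally nicknamed “Barbara”), and its validity can be sketched schematically; note that the inference rule holds no matter whether the premises are actually true, which is exactly why the penguin argument is valid yet unsound:

```latex
% AAA-1 ("Barbara"): the form is valid regardless of premise truth.
\[
\frac{\forall x\,\bigl(S(x) \rightarrow P(x)\bigr)
      \qquad
      \forall x\,\bigl(P(x) \rightarrow Q(x)\bigr)}
     {\forall x\,\bigl(S(x) \rightarrow Q(x)\bigr)}
\]
% With S = penguins, P = birds, Q = things that can fly:
% the conclusion follows from the premises (validity),
% but the premise "all birds can fly" is false,
% so the argument is unsound.
```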
One of these debating tactics is for the conspiracy theorist to ‘move the goalpost’. The conspiracy theorist posits an epistemic standard that they claim would disabuse them of their position. They say, “I will no longer believe my theory if you can provide X.”
You, the intelligent (and good looking) skeptic reply, “You mean this evidence? Here you are.”
At this point, in an ideal world, the conversation should be over. The conspiracy theorist should shrug and reply, “Oh, that evidence does exist. I guess I was mistaken. Perhaps we should share a cup of tea and have a friendly chuckle about this.”
Instead, the conspiracy theorist shifts the goalposts. Rather than reconsidering their belief, they ask for X+1. If you provide X+1, it will not matter, because they will then ask for X+2, then X+3…and so on. The temptation for us skeptics is to keep providing the evidence; if we do, we are falling into their trap, which will be perceived as a ‘victory’ for their side. I’ve written that conspiracy theories are emotionally compelling, not rationally compelling; goalpost shifting is evidence of this.
There are a few aspects of this tactic that we must keep in mind. The first is that the conspiracy theorist is not acting in good faith; they do not want the evidence. We can reach this conclusion inductively: if they did want it, they would have given up their theory once the evidence was presented. The second is that they have no idea whether or not the evidence they initially asked for exists. It is safe to assume that we know their theory far better than they know ours. This is not to say that they are stupid; rather, their emotional attachment to their position will often cloud their minds from self-criticism. When they ask for X, they genuinely do not conceive that X exists. Finally, and most importantly, the reality of conflicting evidence is not something that really matters to their belief. Their motivations for believing the conspiracy theory extend beyond the factual evidence itself, so presenting the evidence they have asked for is typically going to be an impotent exercise.
The feeling might be that if we keep providing the evidence, the conspiracy theory will eventually collapse under the weight of the contrary evidence. That, however, is exactly what they are hoping we will try. Our skeptical worldview is supposed to compel us to abandon positions that we cannot support through evidence or logic. Ideally, if the positions were reversed, they would “win” the debate because we are supposed to acquiesce when counter-evidence is presented. They are hoping that by pushing the goalposts far enough back, they will eventually hit some gap in our knowledge, something that we will have to admit is not there. Eventually, I am not going to be able to offer proof that Adam Weishaupt was just some eccentric theology professor. There are so many examples of this tactic in use that it is hard to pick just one.
The best example I can think of comes from the seemingly quaint old days when the biggest issue facing American skepticism was stopping Evangelical Christians from forcing Creationism (or Intelligent Design) into public school science classes. The tactic of shifting the goalposts is delightfully portrayed in one of the best episodes of the animated Futurama (not the sad one with the dog), in a debate between Professor Farnsworth and Dr. Banjo. Every time Banjo proclaims that there is a missing link, Farnsworth supplies it… until Farnsworth runs out of examples. Banjo claims victory, yet his victory is only in rhetoric, not in substance.
The most common occurrence of this tactic today belongs to the anti-vaccination crowd (if I were writing this twelve years ago, I could have said exactly the same thing). The conversation begins with their claim that they just “don’t know about this vaccine.”
We bite, “What do you not know?”
“It came out a bit fast, didn’t it?” is their reply.
Which, to be fair, does look strange. While previous vaccines have taken years to develop, in a single year four viable vaccines were developed, tested, and released to the public. It is a speed record for sure, but that framing ignores a couple of factors that can be presented to the vaccine “hesitant.” The first is that the mRNA vaccine method was adapted from existing technology; it wasn’t constructed out of whole cloth. You can also offer the explanation that the most time-consuming aspect of medical research is generally not the science but finding the money. When a government cuts the red tape and throws money at the problem, it eliminates a significant portion of the delay. No longer do scientists and researchers have to write grant applications and funding requests (especially as they aren’t known for their writing ability to begin with), wait to hear a response, get rejected, and then start the entire process over again. Part of that process is having to justify why the research is important; in our present pandemic, that point was obvious. Without this drag on the process, the actual research and pre-clinical experimentation stages went quite a bit faster.
Suddenly, though, the speed of development is no longer the issue. Now they want proof that it is safe. You point to the phase three, double-blind, placebo-controlled studies that were conducted prior to the release of the vaccine. You also point out that the vaccination programs have been rolled out for nearly a year at this point, and if the vaccines were dangerous, we would have seen huge numbers of serious side effects, including deaths, by now.
Your interlocutor, with increasing frustration in their voice, is no longer concerned with that kind of safety, but instead wants some guarantee that it is not going to affect their fertility. They will further want information about fetal stem cells, something they heard about named “luciferase,” and on and on. What you have to realise is that none of the information you provide is doing anything. They shift from topic to topic not because your information is educating them, but because their emotionally held position forces them to pivot in a new direction, one they hope you will not be prepared for.
What can be done? Well, fortunately we have an easy response to this tactic, provided we succeed at the most important part: identifying it. Once it is identified, the best method is to set the moving goalpost in concrete. Stop the conversation and say something like, “You keep changing your request. Is there any information that would actually satisfy you?”
To be clear, it is vital to do this politely, as raising their ire will only cause them to dig in on their position. Secondly, the only thing that pointing this out can accomplish is to make the other person aware of what they are doing. Nothing works one hundred percent of the time, and some questions are valid, while others betray the hope that there is no answer. Eventually, even the hardest of sciences will admit that there is some border to knowledge, questions that cannot yet be answered. That is not a fault of human inquiry; it is in fact what drives us. After all, it is the search for the unknown that has led us to that which we know.