Why are some people just so obstinate? – Bayes’ Theorem & The Backfire Effect

What do Flat-earthers, Anti-vaxxers, 9/11-conspiracists, Jesus-mythicists and Obama citizenship deniers have in common?

Well, subconsciously at least, they all display a contempt for Bayes’ theorem, and arguably they strongly manifest the ‘Backfire effect.’ Welcome to another instalment of Cognitive Bias Wednesday, today looking at the ‘backfire effect,’ with a prelude on Bayes’ theorem.

Why is it that, even when presented with all the data and information, some people refuse to modify their beliefs? Even with all the weight of evidence that the world is round, or that vaccines don’t cause autism, they refuse to budge, and even seem to become more set in their ways. While some of this behaviour is certainly conscious, at least some is also the product of a cognitive bias operating behind the scenes: the ‘backfire effect.’ This is also why most arguments that occur on the internet are relatively fruitless. Before we get to the cognitive bias, though, it is worth taking a brief journey through a neat cognitive feature: the theory of Bayesian inference.

Thomas Bayes was an 18th-century Presbyterian minister, philosopher, and statistician; while he published works in both theology and mathematics, he is best known, posthumously, for his contribution to statistics. His work on what would eventually become known as Bayes’ theorem was only published after his death, and its impact would be completely unknown to him. Bayesian modelling and statistics are applicable across a wide spectrum of fields and problems, from the memory theory and list-length effects that my previous lab worked on (MaLL), 1 through to Richard Carrier’s application to historiography, 2 and many more (just don’t get me started on the null hypothesis significance testing debate). 3 The key factor for this investigation is the Bayesian logic applied to the belief-revision loop. In lay terms, Bayesian statistics can be used to predict how much someone will believe in a proposition when presented with a certain evidence set.
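In code, the belief-revision loop is just repeated application of Bayes’ theorem. Here is a minimal Python sketch (the priors and likelihoods are illustrative numbers of my own, not drawn from any study) showing how two people with different starting beliefs update on the same piece of evidence:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]."""
    numerator = p_evidence_if_true * prior
    total_evidence = numerator + p_evidence_if_false * (1 - prior)
    return numerator / total_evidence

# Two observers see the same evidence, which is four times more
# likely if the hypothesis is true than if it is false.
believer = bayes_update(0.9, 0.8, 0.2)  # prior 0.9 -> ~0.97
sceptic = bayes_update(0.1, 0.8, 0.2)   # prior 0.1 -> ~0.31
print(round(believer, 2), round(sceptic, 2))
```

Both beliefs move in the direction of the evidence, but conservatively: the same data leaves the two observers far apart, which is the Bayesian picture of why persuasion is slow even when it is working.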

Take for example the good old coin toss: suppose you have a coin with one side heads and the other tails. Logic and the laws of probability indicate that each toss should be a 50/50 chance of heads. But what happens if you flip a coin 5 times and get 5 heads in a row? Statistically speaking, the next toss is still a 50/50 chance, even though the probability of getting n consecutive heads falls off as (1/2)^n. What about if you get 92 heads in a row, 4 or 122 heads, 5 do the odds change then? Probability and statistics give us a clear no: it is still 50/50 no matter how big n is. However, if you ask gamblers at a casino, or for that matter most people on the street, you will get a startling response. Many will say that, since it is a 50/50 probability, the chance of the next coin toss being tails increases, to even out the overall trend. Why? It is a faulty belief-revision loop, and this trend can be predicted by Bayes’ theorem. Applying Bayesian inference to epistemology, we can model the belief-revision loop and see that degrees of belief in the outcome of a coin toss will rise and fall depending on the results, even though the statistics remain the same. Furthermore, these modifications are overwhelmingly conservative in most people, and this should give us pause when we find evidence that challenges our beliefs.
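The coin-toss example can be made concrete with a Beta–binomial model, the standard Bayesian treatment of a coin’s unknown bias. This is only a sketch, assuming a uniform Beta(1, 1) prior over the coin’s heads-probability:

```python
def run_probability(n):
    # Chance of n consecutive heads from a fair coin: (1/2)^n
    return 0.5 ** n

def posterior_mean_heads(heads, tails, alpha=1.0, beta=1.0):
    # Posterior mean of P(heads) under a Beta(alpha, beta) prior,
    # after observing the given counts of heads and tails.
    return (alpha + heads) / (alpha + beta + heads + tails)

print(run_probability(5))          # 0.03125 -- 5 heads in a row is rare
print(posterior_mean_heads(5, 0))  # ~0.857 -- a Bayesian starts to suspect bias
```

Note the direction of the rational update: after a run of heads, a Bayesian shifts belief toward *heads* (a possibly biased coin), whereas the gambler’s fallacy expects tails, the exact opposite.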

But what does this have to do with the backfire effect, I hear you ask? Well, the backfire effect is essentially where the Bayesian-inference model of the belief-revision loop fails, and fails badly. Normally, when people are presented with information that challenges their beliefs and presuppositions, they engage in the Bayesian belief-revision loop as above and slowly change (even if more slowly than you would expect). However, when testing how people respond to corrections of misinformation, Nyhan and Reifler found that, in some cases, rather than modifying their beliefs to accommodate the information they had received, people instead clung to their beliefs more strongly than before. 6 Essentially, in their tests the presence of correcting material for political misconceptions served to strengthen the misconception rather than modify it. They dubbed this the ‘backfire effect.’

Now, this isn’t displayed by everyone in the populace, although I would argue that it operates in our subconscious all the time. Some early research shows that the backfire effect commonly rears its head when the matters are of emotive salience. So even though some of the more amusing instances of the backfire effect that are commonly highlighted involve people sharing satirical news stories from The Onion or The Backburner as if they were real news articles, others are less benign. Indeed, for almost every amusing instance of people failing to check their Bayesian revision loops and falling prey to the backfire effect, there are just as many where people are strongly reinforced in faulty beliefs on matters that count. One notable item recently has been vaccination, where I have seen several acquaintances hold firmly to the discredited and fraudulent research that ‘linked’ vaccines with autism. Here the overwhelming body of evidence finds no link between the two, and yet they strenuously hold to it.

So what can be done about it? Well, this blog post is one useful step. Being aware of the backfire effect should help us evaluate our own belief systems when we are challenged with contradictory evidence; after all, we are just as susceptible to the backfire effect as any other human being. So we should evaluate our own arguments and beliefs, see where our Bayesian inference leads us, and do so with the humility that comes from knowing our own cognitive biases and the fact that we might be wrong. It should also help us sympathise with those whom we think are displaying the backfire effect, and hopefully help us contextualise and relate in a way that defuses some of the barriers that trigger it.

Please weigh in on the comments as to what you thought about my explanation of Bayesian inference and the backfire effect. Also let me know what other cognitive biases you would like to see covered.

  1. Dennis, Simon, Michael D. Lee, and Angela Kinnell. “Bayesian Analysis of Recognition Memory: The Case of the List-Length Effect.” Journal of Memory & Language 59, no. 3 (2008): 361–76.
  2. Carrier, Richard. Proving History: Bayes’s Theorem and the Quest for the Historical Jesus. First edition. Amherst, N.Y: Prometheus Books, 2012.
  3. Lee, Michael D., and Eric-Jan Wagenmakers. “Bayesian Statistical Inference in Psychology: Comment on Trafimow (2003).” Psychological Review 112, no. 3 (July 2005): 662–68; discussion 669–74. doi:10.1037/0033-295X.112.3.662.
  4. Rosencrantz and Guildenstern Are Dead: Act 1
  5. Lutece Twins, Bioshock: Infinite
  6. Nyhan, Brendan, and Jason Reifler. “When Corrections Fail: The Persistence of Political Misperceptions.” Political Behavior 32, no. 2 (March 30, 2010): 303–30. doi:10.1007/s11109-010-9112-2.
  • Casp

    Hey Chris,

    Appreciate you trying to address these things. I would have thought that I’ve been living with a working understanding of the “Backfire Effect” but hadn’t really named it or made it a thing. What I always think is: what is the deeper presupposition that the person is holding to that leads them to hold a view contrary to common or scientific belief? Usually there’s a very emotional belief about identity or an experience underlying that which needs to be addressed first, but I’m assuming that’s exactly the thing you’re going to expand toward. Or are you going to go in the direction of attempting to explain why people will end up researching to prove their point rather than researching to learn which is true? (Just as true for the Googler or the Uni Researcher!)

    Just a couple of tips on making your writing more engaging. The tagline plus paragraph one is almost the same as paragraph two for overall content, so watch out for that. I’m glad that the rest of the article wasn’t like that, because that leads me to want to check out if it was going to be overly repetitive. Also, I know you’re trying to explain something that is filled with technical jargon, but I still think there are ways to use simpler language that makes it easier to flow through and not get slowed down on unnecessarily large words. I know it’s a natural symptom of working in an area that values things on word count and the need to look smart. Simple & Direct is a good book on it. And watch out for the clip-art/stock-photography-looking headers; they make it look like one of my client’s old blogs, which says to me that it’s worth ignoring. What got me to click through is that I know you and appreciate your insights. (Kottke, Gruber & Medium are the way they are for a reason)

    • Heya Casper. You are right, I will eventually look at some of those aspects. However, one of the problems in the area is that, with the ‘Backfire effect’ only recently documented, the research is patchy and doesn’t necessarily control for emotions. In fact, some follow-up studies have found the same effect even on topics that the participants self-rated as ‘disinterested.’ So in some ways we don’t entirely know what triggers the backfire effect yet.

      Thanks for the other reflections too. I’m still experimenting with how the WordPress delivery system works best, especially with the Facebook OpenGraph protocol and the various RSS readers which often do odd things like preview certain sections, but only if they are short enough. Hence the double introduction etc. The header images are linked together in each M/W/F series to tie the content together, primarily as a suggestion from another friend. 🙂
      Tips on writing style appreciated, will be writing about that next week on Friday.

  • Interesting that you would use Bayesian statistics for this discussion, as many Bayesian enthusiasts fit well into the category of inflexible thinkers. Bayes’ theorem itself is just a restatement of the definition of conditional probability, so shouldn’t be taken too philosophically.
    As for vaccinations, hopefully it makes more sense if one considers that it took a very long time, and a protracted legal battle, before smoking was found to be bad for people. Medical science is not as clear cut as conditional probability.

  • It’s strange that you would include Jesus mythicists. That is a position that has growing academic support. It would be more accurate to include Jesus historicists as examples of backfire effect.

    It’s become increasingly obvious how weak is the Jesus historicist position. That isn’t to say that Jesus mythicists are therefore necessarily right in all ways, but it is to acknowledge that their position is becoming more compelling. To dismiss their position without seriously and genuinely engaging it would be a clear example of backfire effect.

    It’s interesting that you cite Richard Carrier in your notes. He is a Jesus mythicist.
