What do Flat-earthers, Anti-vaxxers, 9/11-conspiracists, Jesus-mythicists and Obama citizenship deniers have in common?

Well, subconsciously at least, they all display a contempt for Bayes’ theorem, and arguably they strongly manifest the ‘backfire effect.’ Welcome to another instalment of Cognitive Bias Wednesday, today looking at the backfire effect, with a prelude on Bayes’ theorem.

Why is it that, even when presented with all the data and information, some people refuse to modify their beliefs? Why, despite all the weight of evidence that the world is round, or that vaccines don’t cause autism, do they refuse to budge, and even seem to become more set in their ways? While some of this behaviour is certainly conscious, at least some of it is the product of a cognitive bias operating behind the scenes: the ‘backfire effect.’ This is also why most arguments on the internet are relatively fruitless. Before we get to the cognitive bias, however, it is worth a brief journey through a neat cognitive feature: the theory of Bayesian inference.

Thomas Bayes was an 18th-century Presbyterian minister, philosopher, and statistician; and while he published works in both theology and mathematics, he is best known, posthumously, for his contribution to statistics. His work on what would eventually become known as Bayes’ theorem was only published after his death, and he never knew the impact it would have. Bayesian modelling and statistics are applicable across a wide spectrum of fields and problems, from the memory theory and list-length effects that my previous lab worked on (MaLL),^1 through to Richard Carrier’s application to historiography,^2 and many more (just don’t get me started on the null hypothesis significance testing debate).^3 The key factor for this investigation is the Bayesian logic applied to the belief-revision loop. In lay terms, Bayesian statistics can be used to predict how much someone will believe a proposition when presented with a given set of evidence.
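In symbols, the theorem says that your updated confidence in a hypothesis H after seeing evidence E is P(H|E) = P(E|H)·P(H) / P(E). Here is a minimal sketch of that belief-revision step in Python; the numbers are purely illustrative assumptions of mine, not anything from a particular study:

```python
def bayes_update(prior, likelihood, likelihood_if_false):
    """Return the posterior P(H|E).

    prior               -- P(H), belief before seeing the evidence
    likelihood          -- P(E|H), chance of the evidence if H is true
    likelihood_if_false -- P(E|~H), chance of the evidence if H is false
    """
    # P(E) by the law of total probability
    evidence = likelihood * prior + likelihood_if_false * (1 - prior)
    return likelihood * prior / evidence

# Start only 20% confident in a claim; then see evidence that is
# three times likelier if the claim is true than if it is false.
posterior = bayes_update(prior=0.2, likelihood=0.9, likelihood_if_false=0.3)
print(round(posterior, 3))  # 0.429 -- belief rises, but well short of certainty
```

Note how the update is gradual: one piece of moderately favourable evidence lifts belief from 20% to about 43%, not to 100%. That gradualness is the healthy version of the belief-revision loop.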

Take, for example, the good old coin toss: suppose you have a coin with one side heads and the other tails. Logic and the laws of probability indicate that each flip should be a 50/50 chance of heads. But what happens if you flip the coin 5 times and get 5 heads in a row? Statistically speaking, the next flip is still a 50-50 chance, even though the probability of getting n consecutive heads in the first place is only 1 in 2^n (1 in 32 for five heads). What about 92 heads in a row,^4 or 122 heads?^5 Do the odds change then? Probability and statistics give us a clear no: it is still 50-50 no matter how big *n* is. However, ask gamblers at a casino, or for that matter most people on the street, and you will get a startling response. Many respondents will say that, since it is a 50-50 probability, the chance of the next coin toss being tails *increases* to even out the overall trend. Why? It is a faulty belief-revision loop, and the trend can be predicted by Bayes’ theorem. Applying Bayesian inference to epistemology, we can model the belief-revision loop and see that degrees of belief in the outcome of a coin toss rise and fall depending on the results, even though the statistics remain the same. Furthermore, these modifications are overwhelmingly conservative in most people, and this should give us pause for thought when we find evidence that challenges our beliefs.
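To see how a run of heads shifts a Bayesian observer’s beliefs, here is a toy model of my own (the 0.9 bias is an arbitrary illustrative alternative, not anything from the post): the observer weighs “the coin is fair” against “the coin is heads-biased” and updates after each run of heads.

```python
def posterior_fair(n_heads, prior_fair=0.5, p_biased=0.9):
    """P(coin is fair | n_heads consecutive heads).

    Compares two hypotheses: fair (p=0.5) vs heads-biased (p=p_biased),
    starting from prior_fair belief that the coin is fair.
    """
    like_fair = 0.5 ** n_heads          # P(run | fair coin)
    like_biased = p_biased ** n_heads   # P(run | biased coin)
    numerator = like_fair * prior_fair
    return numerator / (numerator + like_biased * (1 - prior_fair))

for n in (1, 5, 10):
    print(n, round(posterior_fair(n), 4))
# 1  0.3571
# 5  0.0503
# 10 0.0028
```

Notice the direction of the rational update: a long run of heads should make you suspect the coin is *biased towards heads*, so betting on tails to “even things out” (the gambler’s fallacy) gets the Bayesian answer exactly backwards.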

But what does this have to do with the backfire effect, I hear you ask? The backfire effect is essentially where the Bayesian model of the belief-revision loop fails, and fails badly. Normally, when people are presented with information that challenges their beliefs and presuppositions, they engage the Bayesian belief-revision loop described above and slowly change (even if more slowly than you would expect). However, when testing how people respond to corrections of misinformation, Nyhan and Reifler found that in some cases, rather than modifying their beliefs to accommodate the information they had received, participants instead clung to their beliefs more strongly than before.^6 Essentially, in their tests the presence of corrective material for political misconceptions served to strengthen the misconception rather than dispel it. They dubbed this the ‘backfire effect.’

Now, this isn’t displayed by everyone in the populace, although I would argue that it works within our subconscious all the time. Some early research shows that the backfire effect commonly rears its head when the matter is emotionally salient. So while some of the more amusing instances of the backfire effect that are commonly highlighted involve people sharing satirical news stories from *The Onion* or *Backburner* as if they were real news articles, others are less benign. Indeed, for almost every amusing instance of people failing to check their Bayesian revision loops and falling prey to the backfire effect, there are just as many where people are strongly reinforced in faulty beliefs on matters that count. One notable example recently has been vaccination, where I have seen several acquaintances hold firmly to the discredited and fraudulent research that ‘linked’ vaccines with autism. The overwhelming body of evidence finds no link between the two, and yet they strenuously hold to the link.

So what can be done about it? Well, this blog post is one useful step. Being aware of the backfire effect should help us evaluate our own belief systems when we are challenged with contradictory evidence; after all, we are just as susceptible to the backfire effect as any other human being. We should be evaluating our own arguments and beliefs, and seeing where our Bayesian inference leads us, with the humility that comes from knowing our own cognitive biases and accepting that *we* might be wrong. It should also help us sympathise with those whom we think are displaying the backfire effect, and hopefully help us contextualise and relate in a way that defuses some of the barriers that trigger it.

Please weigh in in the comments with what you thought of my explanation of Bayesian inference and the backfire effect. Also let me know which other cognitive biases you would like to see covered.

Notes:

- Dennis, Simon, Michael D. Lee, and Angela Kinnell. “Bayesian Analysis of Recognition Memory: The Case of the List-Length Effect.” Journal of Memory & Language 59, no. 3 (2008): 361–76. ↩
- Carrier, Richard. Proving History: Bayes’s Theorem and the Quest for the Historical Jesus. 1st ed. Amherst, N.Y.: Prometheus Books, 2012. ↩
- Lee, Michael D., and Eric-Jan Wagenmakers. “Bayesian Statistical Inference in Psychology: Comment on Trafimow (2003).” Psychological Review 112, no. 3 (July 2005): 662–68; discussion 669–74. doi:10.1037/0033-295X.112.3.662. ↩
- Rosencrantz and Guildenstern Are Dead: Act 1 ↩
- Lutece Twins, Bioshock: Infinite ↩
- Nyhan, Brendan, and Jason Reifler. “When Corrections Fail: The Persistence of Political Misperceptions.” Political Behavior 32, no. 2 (March 30, 2010): 303–30. doi:10.1007/s11109-010-9112-2. ↩