In today’s heated political arena, where everyone has a soapbox thanks to Facebook, Twitter, Instagram, and countless personal blogs, I’ve tried my best not to share my political views publicly. And I’ve failed miserably. I use my own Facebook page and profile to talk about science, books, and photography, but I can’t resist browsing other people’s posts. Most of my friends are not as shy as I am about making their political views heard, and that’s when I fall into the trap: I comment. And then someone replies. And I comment back. And on and on it goes until one of us drops out of the conversation because clearly we’re not getting anywhere.
Science has taught me to be humble and rational. And yet I’m human, and every time I make a mistake in my line of work I feel something inside my brain stir and protest: “How’s that possible? Surely they sent me the wrong data, or they didn’t give me the correct information, or the world collapsed and my computer exploded, but there’s no way I could’ve made that stupid mistake.”
Apparently, I’m not unique. We all go through this kind of mental distress whenever we encounter an inconsistency between reality and our expectations, and between other people’s opinions or choices and our own. It’s called “cognitive dissonance.” According to Wikipedia, social psychologist Leon Festinger described four ways our brain deals with this:
In an example case where a person has adopted the attitude that they will no longer eat high fat food, but eats a high-fat doughnut, the four methods of reduction are:
1. Change behavior or cognition (“I will not eat any more of this doughnut”)
2. Justify behavior or cognition by changing the conflicting cognition (“I’m allowed to cheat every once in a while”)
3. Justify behavior or cognition by adding new cognitions (“I’ll spend 30 extra minutes at the gym to work this off”)
4. Ignore or deny any information that conflicts with existing beliefs (“This doughnut is not high in fat”)
What determines what choice we make?
In my case, I end up going back to my computer program. I typically find the bug (which I unknowingly introduced as I was coding), correct it, and rerun the analyses. Admitting my mistake costs me emotional distress, in addition to that nagging doubt at the back of my head — will my boss still like me even though I made a stupid mistake? — but in the long run it would cost me a lot more not to correct the error and hand the wrong analyses to our collaborators.
So why can’t we do the same when we are heatedly debating politics or religion? Why do some of us even resort to insults rather than admitting that our own logic is faulty?
One possible reason is that there are no consequences to being disrespectful or even offensive when debating online. After all, even when we use our real names, we are still hiding behind a shield of impersonality when typing our thoughts on an electronic device. On the other hand, if I hand out the wrong results and my collaborators publish them, there will be huge consequences for me. And frankly, trial and error is part of the scientific process: we all make mistakes, we correct them, and we repeat the process over and over again until we have clean and sensible results. Only then do we publish a paper.
But in a political or religious debate the consequences can be far more costly if we suddenly admit that we may have been wrong all along. Changing our mind affects our self-esteem and may lead to self-blame, possibly disrupting the relationships around us. That’s why our brain tends to choose the easier path, which often means reinvigorating present beliefs rather than shifting to new ones. As Nyhan and Reifler note in a 2010 paper, there’s a difference between being uninformed and being misinformed, and the latter is much harder to correct. In the paper, the authors claim that “humans are goal-directed information processors who tend to evaluate information with a directional bias toward reinforcing their pre-existing views,” and conclude: “Indeed, in several cases, we find that corrections actually strengthened misperceptions among the most strongly committed subjects.”
This behavior of reinforcing one’s beliefs the more contrasting evidence is presented is called “confirmation bias.” Patterson et al. define this bias as the tendency to favor explanations that conform to our own beliefs and/or emotional responses, and classify it as “cognitive” or “emotional” depending on whether it reflects the former or the latter. It’s a very familiar bias: we’ve all seen it everywhere around us, whether in defending our favorite presidential candidate or debating climate change. It’s a little harder to pin down when we are engaging in this behavior ourselves, but rest assured, we all do it at some point, each of us to a different extent.
“Because of this mechanism,” explains Robin S. Cohen, a Los Angeles-based psychoanalyst, “not only are we biased to favor perceptions that are in line with our beliefs, but we are also very likely to organize our world in order to only experience things that conform to our own ideas. This makes it less likely to be confronted with alternative opinions. Our own beliefs are so thoroughly reinforced through this process that new perceptions gain very little traction.”
Interestingly, as Leonid Perlovsky describes in a 2013 review, experiments have shown that music helps abate the stressful consequences of cognitive dissonance. So maybe I could try playing a little music in the background next time I’m trying to convince a Trump supporter to find a better presidential candidate. What do you think? Mozart or Metallica?
Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303–330. DOI: 10.1007/s11109-010-9112-2
Patterson, R., Operskalski, J., & Barbey, A. (2015). Motivated explanation. Frontiers in Human Neuroscience, 9. DOI: 10.3389/fnhum.2015.00559
Perlovsky, L. (2013). A challenge to human evolution—cognitive dissonance. Frontiers in Psychology, 4. DOI: 10.3389/fpsyg.2013.00179