The Backfire Effect. Even With The Facts On Your Side, You Lose
Have you ever wondered how people can hang onto a belief long after the facts have proved them wrong?
Well, it turns out that for many people with strongly held opinions, the presentation of contradictory facts actually increases their certainty rather than demolishing it.
This is a human failing, a cognitive quirk to which we are all prone. Most of us usually avoid it only because we aren't that certain to begin with, and so remain open to a change of mind.
But if you wonder why people cling to climate change denial now that 99.8% of scientists have concluded that man-made climate change is real, or how the anti-vaccination crowd holds onto its beliefs in the face of a rising whooping cough epidemic and resurgent mumps, both carrying far greater cost and mortality than their misplaced fears, this is the reason. When you believe something strongly enough, the facts do not affect you.
So much of American politics is explained by this. And it is a global issue.
On a personal note, I suggest this simple exercise. If you are dead certain you are right about something, and yet there is also controversy, try assuming that you might not know the whole story. From that position, you leave open the possibility that new information will lead you closer to the truth.
On the other hand, you could simply say that what you “know” is what you know, and the truth doesn't matter. I recommend the more open approach, because the truth actually matters.
Here's some more on this phenomenon: How Politics Makes Us Stupid