Some things, you just can't make up. Here's a very earnest story in Stuff about why anyone with a different opinion from yours is wrong.
It comes down to rabbit holes.
But I will let the author tell the story himself. Here it is.
CHARLIE MITCHELL • NATIONAL CORRESPONDENT STUFF, 16 October 2021
Why do we believe things that aren’t true? And why do we struggle to change our minds, even when shown evidence that we’re wrong?
One answer is a hardwired part of human psychology, which makes us particularly vulnerable to online misinformation and disinformation. It’s called “confirmation bias”.
A basic summary: Confirmation bias means we look for information that supports our beliefs and dismiss information that doesn’t. Over time, this hardens our point of view and makes us less willing to accept or understand other arguments.
Let’s say you believe people driving green cars are worse than blue car drivers. You might see a green car driver cut you off, and interpret that as proof you were correct. But what if a blue car driver cuts you off? You might dismiss it as just a bad driver.
Research tells us no one is immune to confirmation bias. It has no link to intelligence or education: It affects scientists, journalists, plumbers, teachers. If you think it doesn’t apply to you, you are likely falling prey to the bias blind spot.
Studies suggest confirmation bias is tied to our brain chemistry. Finding information we think confirms our views creates a dopamine hit, much like listening to music or eating a nice dinner.
This phenomenon likely had an evolutionary function but has become problematic in the internet era.
Social media algorithms push us towards information we want to see, which inevitably reinforces our biases and makes us feel good. This is the “rabbit hole” — an endless loop that tells us everything we believe is correct, warping our view of the world.
This is evident in groups opposed to the SARS-CoV-2 vaccines. For example, if you believe Covid-19 vaccines cause widespread deaths, you might speculate that a vaccinated person who died from natural causes actually died from the vaccine — doing so reinforces your belief, which is pleasurable, even if there’s no evidence to support that view. It could run the other way, too. If you accept the vaccines are safe, you might dismiss cases of side effects potentially linked to vaccination, because that, too, reinforces your belief.
There are ways for us to break our confirmation biases, but it’s not easy. One method is to continually ask ourselves if we’re trying to build a one-sided case: Are we looking at a wide range of sources, and making sure those sources are reliable, and not just telling us what we already think?
Ultimately, it comes down to making sure we don’t end up in a rabbit hole, and being alert that our brains are prone to biases we’re not always conscious of.
So that clears it up nicely. We now know everyone is at risk of going down a rabbit hole. Stuff journalists may be an exception to the rule, but for the sake of argument, let's assume they are prone to rabbit holes just like every other journalist, as well as scientists, plumbers, teachers and presumably every other occupation, including those without occupations.
Isn't that why Stuff should publish two sides of each rabbit hole? That way, a reader can choose which rabbit hole they go down.
It's why they should ask themselves, "Are we trying to build a one-sided rabbit hole?" well before they type the first word of a story.
And it's why the media should avoid the $55 million snare at the entrance to the rabbit hole. It's sign-posted and not hard to see - "The Public Interest Journalism Fund - please enter".
The other little gem is this. We now know that if we feel we are heading down a rabbit hole, we could instead get the same pleasure from listening to music or eating a nice dinner.
It's so good having options. Eeny, meeny, miny, moe: Barry Manilow, sausages and chips, or rabbit hole?
Frank Newman is a political commentator, investment analyst, and a former local body councillor.