
UQx DENIAL101x 6.3.3.1 Flu Shots

November 7, 2019


Winston Churchill summed up one of the problems
with misinformation: “A lie gets halfway around the world before
the truth has a chance to get its pants on.” Churchill said this before the Internet and
the age of Twitter. If he saw how quickly misconceptions can go viral these days, it
would make his head spin. While myths can spread very quickly, they unfortunately don’t
disappear so easily. On the contrary, many myths are sticky. They can be memorable, persistent
and notoriously difficult to dislodge. So this lecture explains how to debunk a myth.
We’ll look at how to structure a debunking in order to effectively reduce the influence
of misconceptions. But we’ll come to that later. The golden rule of debunking is “Fight Sticky
Myths with Stickier Facts”. To debunk a myth, you need to replace it with an alternative
fact that is more plausible and more compelling than the myth. The stickier the fact, the
more effective the debunking. You also shouldn’t put too much emphasis
on the myth. Otherwise, there’s a risk that, over time, the myth is all people will remember
from your debunking. So you might ask, “Why mention the myth at all?” The scientific research into Inoculation Theory
says we do need to specify the myth. Inoculation theory comes from a line of psychological
research that applies the metaphor of inoculation to knowledge. The research finds that just explaining the
science isn’t enough. It doesn’t necessarily equip people to make sense of myths that distort
the science. We do need to specify the myth. So it’s
a balancing act. Don’t put too much emphasis on the myth, but don’t ignore it altogether. What inoculation theory tells us is that in
order to build resistance to misconceptions, we need to expose people to a weak form of
the myth. Just like a flu shot. What do I mean by a weak, or ineffective, form of the
myth? Well, a few things. Before repeating a myth, you need to warn
people that you’re about to mention the myth. This can be something as simple as saying
“a common myth is…”. Or if your debunking is visual, then use visual cues to make it
obvious that it’s a myth. This puts people on guard so they’re less likely to be influenced
by the myth. The second way of presenting a weak form of
the myth is to explain why the myth is wrong. Typically, you do this by explaining the fallacy
that the myth uses to distort the science. A useful framework for denial fallacies comes
from a paper by Pascal Diethelm & Martin McKee. They found that movements that deny a scientific
consensus share five characteristics of science denial: Fake Experts, Logical Fallacies, Impossible
Expectations, Cherry Picking and Conspiracy Theories. The way I remember the five traits is with
the acronym FLICC. Fake Experts are used to try to foster the
false impression of an ongoing scientific debate. Logical fallacies distort the science by drawing
incorrect or inappropriate conclusions from the data. Impossible expectations demand standards of
evidence that are impossible to achieve. One version of this argument is that if we don’t
know everything, then we know nothing. But this ignores the parts of climate science
where we have a high level of understanding. Cherry picking involves using small, select
pieces of data, while ignoring any inconvenient data. You know someone is cherry picking when
the conclusion they get from a small piece of data conflicts with the conclusion arising
from the full body of evidence. Conspiracy theories abound among groups who
disagree with an overwhelming consensus across a global scientific community. How else do
you explain nearly every scientist in the world disagreeing with you? There are subcategories of fallacies too.
For example, under logical fallacies, you find red herrings which distract people with
irrelevant information. Other logical fallacies include misrepresenting or over-simplifying
the science. Making faulty leaps of logic is called jumping to conclusions. Presenting
only two choices when other options are available is a false dichotomy. If you successfully explain the fallacy of
a myth, you neutralise the myth – in fact, you can even make it backfire. In my psychology
research, I’ve been experimentally testing the impact of misinformation and how to neutralise
it. In one experiment, I showed participants an online petition signed by 31,000 scientists
or science graduates who don’t think human activity is disrupting the climate. I also
asked them to estimate how many climate scientists agree that humans are causing global warming
– a measure of perceived consensus. This graph shows the change in perceived consensus
as a result of reading the misinformation. The horizontal axis represents political ideology
– on the left are people who are politically more liberal, and on the right are people
who are more politically conservative. The red line shows the change in perceived consensus
– anything below the dotted line means a decrease in perceived consensus. After being
told that 31,000 scientists didn’t believe that human activity was disrupting climate,
perceived consensus didn’t change for people at the liberal end of the political spectrum.
But perceived consensus fell by 20% among people at the right or conservative end. The
misinformation had the biggest effect among conservatives. The blue line represents another group who
first read an explanation of the technique of fake experts, then they read the misinformation
that used fake experts. When people had been informed about the fake expert strategy prior
to reading the misinformation, then the misinformation was completely neutralised. Intriguingly,
it even caused a slight increase in conservatives’ perceived consensus after reading the misinformation.
If people already understand the fallacy of a myth when they encounter that myth, then
that myth can backfire. So let’s bring all this together. An effective
debunking requires the following elements: First, a sticky fact. Your debunking should
emphasise a sticky alternative fact. Second, you do need to specify the myth, but
make sure you provide a warning before specifying it. Lastly, explain how the myth distorts the
science. What is the technique or fallacy that it uses? The science tells us that the
Fact, Myth, Fallacy format is an effective way of debunking myths. As Winston Churchill reminds us, myths can
spread quickly. This is why we need to explain how the myth distorts the science. This can
neutralise misinformation – or even make it backfire.
