Fake news’ power to influence shrinks with a contextual warning, study finds

Research conducted by social psychologists at Cambridge University in the UK and at Yale and George Mason universities in the US offers a potential strategy for mitigating the spread of misinformation online: proactive warnings designed to contextualize and pre-expose web users to related but fake information, in order to debunk factual distortion in advance.

The researchers found that combining facts about climate change with a small dose of misinformation — in the form of a warning about potential distortion — helped study participants resist the influence of the false information.

This could be either a general warning, i.e. about the risk of skewed data on the topic, or a warning that debunked a specific climate science myth in more detail; in both cases the warning was presented before the misinformation. The researchers found that the more detailed warning was about twice as effective as the general warning at shifting opinion towards acceptance of the climate science consensus, despite exposure to fake news.

Conversely, when false information was presented alongside truth — i.e. without the added context of the warnings — they found it weakened people’s belief in the veracity of climate science, with climate myths effectively negating climate facts.

False information presented on its own was also shown to have a potent effect at spreading doubt about the veracity of the overwhelming scientific consensus on human-caused climate change.

Inoculating against misinformation

The researchers characterize their strategy to mitigate the potency of misinformation to influence opinion as akin to a medical inoculation, whereby a small amount of a virus can be used to bolster immune resistance to an illness ahead of time, before a person encounters the virus in the wild.

“In medicine we can try to build resistance to infections by injecting people with a weakened strain of the virus or small dose of the virus that will trigger antibodies in people’s immune system to confer resistance. The attitudinal theory of inoculation is exactly the same — in the sense that what we try to do is pre-expose people to a bit of the misinformation… then debunk that information with specific facts,” explains lead author of the study, Dr Sander van der Linden, in an interview with TechCrunch.

“This process helps arm people with what I call a cognitive repertoire to resist basically future encounters with misinformation. That’s the basic idea… We see that we can incrementally protect people’s beliefs about science, through inoculation.”

The public opinion dynamics research, which was conducted more than a year ago and published today in the journal Global Challenges, is especially timely given the ongoing debate around the role social media platforms are playing in spreading fake news and other misinformation.

Facebook, for example, has increasingly found itself in the firing line over the veracity of content shared across its network — not least because the company initially refused to accept any editorial responsibility for the views being distributed via its technology (last month CEO Mark Zuckerberg finally conceded the social media giant has a “greater responsibility than just building technology that information flows through”).

It is currently testing measures, in the US and in Germany, to mitigate the power of misinformation to influence its users — including labelling suspect links with warnings about the veracity of the content. Facebook relies on user reports and outside fact-checking organizations to identify and flag disputed content.

The research, while not affiliated with Facebook in any way, suggests the company’s approach to fighting fake news on its platform could help boost users’ ability to weigh more critically the content they are exposed to via Facebook’s algorithmically driven News Feed.

Although the study is focused on opinions around a single topic (climate science consensus), van der Linden believes the same strategy of prefixing contentious content with contextual warnings to bolster resistance to fake news could be far more widely applicable, as a way to mitigate the power of misinformation to gather momentum online.

“We did this over a year ago — in fact when fake news wasn’t as big of a story as it is now, but climate change has always been a contentious issue — but yes, I would say this is very generally applicable, with Donald Trump or with other fake news that has been perpetuated in the media. Trying to preventatively inoculate people to be more sceptical and be more resistant to fake news,” says van der Linden.

“I think it might be helpful,” he adds of Facebook’s current trial strategy of badging problem content with warnings. “Simply because… one of the things that I’m interested in is what I call the psychology of consensus — which is this idea of when lots of people agree on something implicitly, like when it’s been shared a million times, or this video has a million hits, people are inclined to just click on it out of curiosity because there’s some social consensus that this must be important.

“And people aren’t really necessarily thinking about it — and so, again, putting that warning label on there might switch people from this system one/mode one thinking to being a bit more selective and conscious and deliberate. And of course there’s no guarantee — but I think it could be helpful for that reason.”

Might the effect of labeling contentious topics with warnings wear off over time — i.e. if people become so desensitized to seeing warnings that they start to discount them? While van der Linden concedes that could become “a factor”, and emphasizes that the long-term solution is clearly better education to foster more critical thinking, he argues there is an interim need to help the public debunk myths being dressed up as facts.

“Of course education is probably the ultimate vaccine in the long term, if you educate children and then adults to be sceptical consumers… But in the interim I think it would be important to try to develop tools that could help with that,” he says.

In terms of study specifics, the researchers conducted two surveys. The first polled a representative sample of 1,000 people in the US, aiming to identify the most compelling falsehood associated with global warming — which turned out to be a fake claim, made on the Oregon Global Warming Petition Project website, that there is no scientific consensus on human-caused climate change. They then used this as the misinformation statement in the second study: an online randomized survey of 2,167 people, conducted via Amazon Mechanical Turk, which tested different combinations of factual statements and misinformation in order to assess how public opinion on the climate science consensus varied as a result.

Interestingly, they found their warnings to be equally effective across US political lines — despite also finding that Republican voters were more likely than their Democratic counterparts to believe climate misinformation over facts.

What surprised the researchers most about the results of the study? Firstly, how powerful misinformation can be, says van der Linden; but also — on the positive side — that ‘inoculating’ people against misinformation can be effective even when a person holds an entrenched prior viewpoint.

“I didn’t anticipate that the misinformation would be so overwhelming for people,” he says. “I would have anticipated that the misinformation would have some influence — but not that it would cancel out the facts completely. I think that was quite surprising and also quite concerning, in some way, that people are paying so much attention to this idea of balance. And it’s tricky for people because they don’t know what the sources are, and how credible each side of the debate is — so it is difficult.”

“Similarly we weren’t sure if inoculating people, depending on their prior position, is going to be effective because some people might already have certain prior beliefs — and we were surprised by the fact that on average the inoculation worked well, regardless of what your affiliation or your prior beliefs were. This is not to say there aren’t individuals in the study for whom it didn’t work — and similarly vaccines work for most people but they can’t guarantee they work for everyone — so I would say that’s very much the same here too. But we were surprised that it worked across the board — and that is quite promising.”

Asked about the global challenge posed by the propagation of misinformation, van der Linden is also relatively upbeat — pointing out that misinformation is nothing new, even if the popularity of social media platforms has amplified the volume of fake news in recent years.

“We’re perhaps now at the peak of fake news… Perhaps it’s viral right now but I think it will subside in years to come — at least that’s my hope — but that’s not to downplay the importance of doing something about it… It is affecting people’s opinions and the types of decisions they make — both for themselves, as well as for other people.

“If we think about Donald Trump, about Brexit, about situations where people — regardless of what the right or wrong position here is — it is influencing the way in which people make decisions and it could potentially undermine the democratic process, and I think that at least that’s important to take into account.”

He agrees powerful social media platforms such as Facebook should be proactively trying to mitigate the spread of misinformation across their networks.

“Given the influence and power that Zuckerberg’s platform has I do think Facebook has a responsibility to try to moderate what’s going on on their website,” he adds.