
Sander van der Linden is a world-leading expert on misinformation and he still fell for it: a video of NASA’s rover Perseverance landing on Mars that arrived in his Twitter feed in February, 2021, courtesy of a retweet by author Stephen King. Van der Linden, a psychologist at the University of Cambridge, watched the video, enthralled by the “sound of Martian winds,” and enthusiastically retweeted it to 13,000 followers.

“The video was quite sophisticated,” he says, a little sheepishly, on a recent video call. A brief fact check, however, would have revealed it as fake. The date was wrong, for starters.


That’s one of the findings of Van der Linden’s research, detailed in a book, Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity, published in Canada this month: everyone’s susceptible to catching – and spreading – misinformation.

So what if we could protect ourselves in advance? Van der Linden and his team at Cambridge are experimenting with a promising idea called psychological inoculation – the theory that a prebunking “vaccine” can prime our brains to recognize and reject fake news. Immune to its persuasive ploys, we would also stop spreading it.

“When people don’t think they’re vulnerable, we stop actively paying attention,” Van der Linden says. “We just assume, most of that time, that stuff is more or less true.”

Misinformation can influence whether people get a life-saving vaccine. It can grow into conspiracy theories that erode faith in public institutions and trust in expert advice. The more often you hear a statement, the more believable it sounds, and the more people believe something is true, the more likely they are to share it.

Van der Linden cites a 2018 study by MIT researchers who analyzed Twitter data, and estimated the truth took six times longer than a lie to reach 1,500 people. Misinformation that inspired negative emotions, such as disgust and fear, travelled fastest of all.

The rapid rise of artificial intelligence chatbots, with their bad habit of confidently spewing fake news, will only make spotting misinformation – and more pernicious, deliberate disinformation – more difficult. But not everyone needs to be fooled for misinformation to be dangerous, Van der Linden points out; elections are often decided by small margins.

Van der Linden’s idea, then, is to dose our brains with a shot of critical thinking, so misinformation is less likely to find a gullible host in the first place. His research expands on a 1961 paper by a Columbia University psychologist named William McGuire, who proposed that American soldiers could be protected from brainwashing if they were taught strategies to deflect it.

Psychological inoculation works like a biological vaccine: expose people to a weakened dose of misinformation by forewarning them about it, then supply the correct information so they have the cognitive tools to fend off the real thing.

While his work suggests this form of prebunking helps adults become better at identifying misinformation, he expects it will work even better with teenagers, who are still forming their political identities.

Along with a series of YouTube videos, Van der Linden and his colleagues developed free online games: Bad News and, for younger players, Bad News Junior. The games use a script-based, choose-your-own-adventure design to guide people through examples of the persuasive tricks often employed in Twitter-based misinformation, such as fake experts, emotional language and scapegoating.

In one experiment, testing people before and after they played the game showed it boosted their relative protection against misinformation by about 25 per cent. Not perfect, but even incremental changes can make a significant difference at a population level, Van der Linden argues, especially if people “spread” their knowledge to those around them.

The best approach, he suggests, would be to combine a prebunking vaccine to protect against infection with fact-checking to limit spread.

A study published last year in Nature found that fact checks reduced misconceptions about COVID-19 in the United States, Great Britain and Canada, even among the strongest believers.

But as the authors reported, their experiments found “discouragingly little evidence that fact checks have enduring effects on beliefs.” People appeared to accept the correction in the moment but returned to believing misinformation once exposed to it again. The same was true for prebunking: The protection wore off over time, necessitating regular boosters.

To deliver them, Van der Linden suggests that social-media sites regularly circulate reminders about the strategies used to spread misinformation. Ahead of events such as elections, policy makers could highlight techniques to help citizens identify when facts are being twisted or embellished.

But misinformation also spreads fastest when trust in society is low, says Mathieu Lavigne, a senior researcher at the Centre for Media, Technology and Democracy at McGill University. So countries, including Canada, will also have to address why people are losing faith in institutions. “When we can’t agree on basic facts,” he says, “it is much harder to find solutions to the problems we face.”
