
Adrian Monck is the managing director of the World Economic Forum.

Canada is a country that punches above its weight internationally, is a leading NATO member and has a strong and vocal Ukrainian diaspora. For these reasons, and others, Canada is a target of Russian disinformation.

And while Russia may be struggling to prosecute its war of aggression militarily in Ukraine, its disinformation campaigns are executed with more competence and better results.

A study by Canadian researchers found that Russian disinformation “attempts to amplify mistrust of democratic institutions, be it the media, international institutions, or the Liberal government.” The government, too, has spoken out on the issue, recently publishing a webpage dedicated to countering Kremlin narratives.

Yet the challenge remains: Canada – like other countries targeted by Russian disinformation – is an open, democratic society. This openness has been key to Canada’s success. It has forced governments to justify their mandates to voters and has allowed loyal opposition parties and the public to hold those in power accountable.

But the openness of a system like Canada’s is also a staggering vulnerability.

As citizens of open societies, we may well mistrust elites and institutions. We may well feel that our interests are not aligned with or well represented by our governments. Those feelings are both reasonable and legitimate.

Yet such sentiments, which are often steeped in feelings of uncertainty and the loss of a sense of control, are powerful drivers of conspiratorial beliefs and make societies ripe for disinformation. Malevolent state actors can exploit them to undermine political resolve and weaken or prevent military and diplomatic action. This was evident in part during the recent truckers’ protest, which, regardless of its motivations, was covered heavily by Russian state media, according to multiple analyses.

However, framing the debate as one of rational, concerned citizens versus the gullible and easily manipulated hardly helps.

So how do you tell someone that their cherished beliefs are the result of manipulation? Well, you don’t. But what you can do is protect the public sphere from the toxic sludge of Russian-backed disinformation that infects the digital information space and leaks into political debate. This is especially crucial when dealing with political actors who are eager to capture a small but significant percentage of disaffected voters and are incentivized to tailor their rhetoric accordingly, even if it means accentuating conspiracy messaging. So, what can be done?

The good news is that there are solutions – ones that protect the polity while allowing for, and even facilitating, healthy disagreements between responsible politicians and citizens.

In some areas, this protection has taken the form of more traditional approaches, like increasing media literacy. Russia’s neighbours, like Finland and Estonia, have built a defence against disinformation into their curriculums from kindergarten, with some success.

Indeed, the Canadian government has made inroads in boosting media literacy. These efforts include the annual Media Literacy Week, a government-supported program to raise awareness of disinformation and encourage critical thinking online. And according to a recent study, increased digital literacy improves people’s ability to assess the accuracy of information. However, it doesn’t stop people from sharing inaccurate information. The reason is that, for the most part, people share information socially to reinforce their sense of community or their beliefs, not to signal their credentials as arbiters of accuracy.

Moreover, social media platforms are relatively open, and the penalties for sharing inaccurate information are nearly non-existent. This openness means that manufactured outrage pushed by Russian bots and legitimate political critiques are often hard to distinguish. Because of this, governments and social media platforms must work together to build strong, but transparent, digital barriers that degrade disinformation pushed by malicious foreign powers.

Lastly, consensus around modern approaches must be fostered. In democracies, the battlefield is one of ideas and policies. Parties, their supporters and their leaders can disagree fundamentally on these. But when it comes to protecting the space in which those debates can be had, there should be agreement on a general set of rules. Such rules could range from backing broad efforts to diminish foreign disinformation to political parties pledging to shun social media bots.

After all, even duels had rules.