Facebook Canada says it is taking measures to ensure election integrity ahead of the Oct. 24 vote in British Columbia, but would not share how many posts containing misinformation, if any, have been removed or fact-checked.
Kevin Chan, Facebook Canada’s head of public policy, said the social media company has two methods for dealing with misinformation on its platforms.
Content promoting voter suppression will be removed under the company’s community standards and advertising policies, and it also works with independent fact-checkers to review “fake news” posts, he said.
Agence France-Presse and Radio-Canada are in charge of fact-checking for the B.C. election, Chan said, but Facebook does not monitor their work or know whether any posts relating to the campaign have been flagged. He also did not say whether any voter-suppression posts had been removed.
Marisha Goldhamer, senior editor for AFP fact-checking in Canada, said most claims it has fact-checked from B.C. relate to provincial health officer Dr. Bonnie Henry and COVID-19.
“We have not yet rated a misinformation claim directly related to the B.C. election on Facebook, but we are watching closely,” she said.
As part of the program, Canadians on Facebook will be informed if a story they shared has been rated as false, and pages that repeatedly share false news will be seen less across users' news feeds.
“We are a platform and we do very much want to remain neutral. What we do is we have the partnership with the fact-checkers, but they are completely independent in terms of what they fact-check, how much they fact-check and obviously what their conclusions are,” Chan said in a teleconference with journalists.
The company does not moderate content in private messages, which are encrypted. However, it has limited the number of recipients that can receive a forwarded message to five, Chan said.
Facebook Canada has also offered political parties and candidates access to an emergency hotline to report concerns like suspected hacks, training in social media security and other kinds of support, Chan said.
Its advertising transparency policy means it now requires a “rigorous” identity authentication process and a labelling system for ads related to policy or possible campaign issues, he said.
Advertisements that don’t meet those standards are blocked, Chan added. He did not say if any ads related to the B.C. election had been blocked.
The company has not identified any attempts at foreign interference in the B.C. election, Chan said.
Philip Mai, co-director of Ryerson University’s Social Media Lab, described the fact-checking strategy as a “minuscule” step in the right direction.
However, he said the approach can be misleading because a fact-check doesn’t necessarily mean the original post is removed from Facebook; the post only gains a label indicating it has been fact-checked, with a link to the fact-checking article. If a user doesn’t read closely, the label can make the post appear more official, he said.
Misinformation related to elections becomes a concern when it is amplified or planted for nefarious reasons, he said, adding two people can have a relatively harmless public conversation on social media that includes some errors.
A local or provincial election would likely only see interference if it had national or international implications, he said.
“Then you will see actors from outside the province who will try to tip the scale by putting their finger on it,” Mai said.
Mai also said there is a good reason for Facebook not to disclose the volume and types of posts it has taken down or moderated. That can give bad actors more information about what they can and cannot get away with, he said.
Chan said that because Facebook is a platform that wants to encourage expression, differentiating posts that are opinions from intentional misstatements can be difficult.
For that reason, the company tends to focus on reviewing suspicious user behaviour, which can help it identify fake accounts, for example, and leaves content reviews to third parties, he said.