
Navneet Alang is a Toronto-based freelance technology-culture columnist.

The world is still reeling from the election of Donald Trump. People on all sides of the political spectrum have been casting about for explanations, desperate to make sense of it all. Among the wave of reasons offered for the surprising outcome is one that also happens to represent a more general threat to democracy: fake news on Facebook.

Made-up stories online have been a common phenomenon for a couple of decades now. They've been found in e-mail chain letters, hoax sites and faux blogs. It's why Snopes.com became so popular: Its entire purpose is to debunk false claims circulating on the Internet. Is it any wonder that Oxford Dictionaries just declared "post-truth" its word of the year?

While false information has thus far been mostly a minor annoyance, it may have been a factor in Mr. Trump's unexpected victory. During the campaign, Facebook was full of fake news: stories that the Pope endorsed Mr. Trump, that President Barack Obama was born in Kenya, that Hillary Clinton suggested that Mr. Trump should run for president. All had thousands of shares, and all were false.

Fake news stories are only part of the problem of social media's effect on democracy. Also common are image memes shared on Facebook and other platforms – pictures with text superimposed on them, carrying arguments or simple ideas that are widely shared and often untrue.

Whether these actually affected the outcome of the election is difficult to know with any certainty. Officially, Facebook's answer is that they didn't. Company chief executive officer Mark Zuckerberg has gone so far as to call the notion "a pretty crazy idea."

A number of Facebook employees have gone rogue and created a task force to address the problem, saying anonymously that "fake news ran wild on our platform during the entire campaign season."

Facebook's official stance is meant to maintain an air of neutrality, but when a false story can get a million shares, there's clearly something wrong. There is also a more obvious issue: Facebook has recently claimed not only that it can influence purchasing decisions, but, more broadly, that it can play a part in the spread of ideas around the world. How, then, can it insist that it has an effect in one arena but not the other?

Not only is the assertion that fake news didn't affect the election naive, it also ignores a broader trend. The threat to democracy is not just fake news, or even Facebook itself – it's that no one can agree any longer on what is actually true.

All of this fits into the wider issue of filter bubbles – the cocoons of information created by Facebook's algorithms, which show users things based on their past browsing history. The result is a vicious cycle: users turn to Facebook for new information, but are always shown things with the same slant.
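To see why such a cycle is self-reinforcing, consider a toy sketch (an illustrative assumption, not Facebook's actual system): if a feed ranks stories purely by how well their slant matches a user's past clicks, the click history and the ranking feed each other until only one slant remains.

    # A toy model of a filter bubble: hypothetical, not Facebook's real algorithm.
    # Stories are ranked purely by how often their slant appears in past clicks.
    from collections import Counter

    def rank_feed(stories, click_history):
        """Score each story by how common its slant is in the click history."""
        slant_counts = Counter(story["slant"] for story in click_history)
        return sorted(stories, key=lambda s: slant_counts[s["slant"]], reverse=True)

    stories = [
        {"title": "Story A", "slant": "left"},
        {"title": "Story B", "slant": "right"},
        {"title": "Story C", "slant": "left"},
    ]
    clicks = [{"title": "Old story", "slant": "left"}]  # one past click seeds the bias

    # On each visit, the user clicks the top-ranked story, which skews the
    # ranking further toward the same slant on the next visit.
    for _ in range(3):
        clicks.append(rank_feed(stories, clicks)[0])

    print([c["slant"] for c in clicks])  # the history converges on a single slant

Nothing in this loop asks whether a story is true; relevance to past behaviour is the only signal, which is exactly the dynamic described above.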

How Facebook might go about solving this problem is extremely complicated. There aren't really algorithms that can detect what is and is not true, and the sheer scale of the issue makes human intervention just as impossible: people post millions of links every day.

There are complicating factors, such as the fact that some websites publish both true and false information, making it impossible to ban entire sites, or that it is extremely easy for people looking to cash in on clicks to keep creating new websites. Moreover, there is the very tricky issue of Facebook becoming an arbiter of truth, a role that, in an already deeply polarized political climate, would come under intense scrutiny.

So where does Facebook fit into the shifting notion of what can be called truth? Consider the reaction to this very election. In the storm of commentary, everyone is jockeying to give the right explanation: that it was a revolt of the white working class, or an embrace of racism and misogyny, or the weakness of Ms. Clinton as a candidate, or the rejection of political correctness.

There are a hundred competing reasons, and all must be considered. But when everyone is ensconced in their own world view, and so many do not trust the media – and when fake or slanted news exists to support every view – how does anyone agree on what is true, let alone debate it? The authority and consensus-forming that once helped people coalesce around truth have been greatly diminished.

Both Google and Facebook have now instituted some measures to deny ad revenue to sites that share fake news, which may blunt the appeal of running such stories. But the bigger issue remains: that we are all locked into a mindset, encouraged by the ubiquity and blinkered nature of our social media feeds, with no clear voice to separate the false from the real. When everything is equally true, then nothing is.
