
Taylor Owen is the director of The Centre for Media, Technology and Democracy at McGill University and the host of the Big Tech podcast.

The recent hate crime in London, Ont., has sparked renewed calls for the government to do more about Islamophobia and other forms of hate. As part of this response, a wide range of civil society organizations, including prominent anti-hate leaders, are calling for the government to introduce the online harms legislation it has long promised. This week the government took an initial step by introducing amendments to the Criminal Code to better address online hate.

Canada is not alone. Governments around the world are stepping – albeit slowly and cumbersomely – into the perilous space of governing online speech.


For understandable historical reasons, Germany has banned online hate speech (including Nazi propaganda) and forces platforms to take it down within 24 hours or face fines of up to €50-million. In response to the livestreamed Christchurch mass shooting, Australia banned the sharing of violent material. Addressing concerns about content that targets and exploits children, Britain has created new rules aimed at minors and mandates the removal of material depicting child sexual abuse or promoting suicide. Even in the U.S., the right to free speech online is not absolute. While platforms are broadly protected from responsibility for the behaviour of their users, they are liable for publishing child pornography and terrorist content.

At its core, the problem these laws all seek to address is relatively straightforward: There are a lot of awful things on the internet. And while some of this is a result of there being lots of awful people, the problem is magnified by the very way social-media platforms are designed. The problem of online hate is a difference in kind, not just degree.

By deciding who is seen and heard, by shaping what we read and watch, and by controlling the means of collective action, social-media platforms don’t just facilitate free speech, they shape the character of it. And as such, they bear some responsibility for the ensuing harms.

What’s more, because these companies now look and behave like traditional monopolists, they have little incentive to self-regulate. So citizens are left with little choice other than to accept the harms embedded in the system. Put simply, the free market has failed.

Fortunately, this is not a new problem. When a market leads to social or economic harm and the private sector is unwilling or unable to self-regulate, then that is precisely when we have traditionally looked to governments to govern. We do so with the financial sector, the pharmaceutical and petrochemical industries, and for food and aviation safety. In each we have developed intricate and at times highly invasive means to minimize the downside risks of these industries while maximizing their benefits.

Platforms are no different. And platforms can be regulated in a wide number of ways that both address the root causes (the business model and abuses of data) and the symptoms (the harmful speech itself). Enter the Canadian government.

Bill C-10 began as an effort to update our regime of cultural protectionism. Whether or not one agrees with these policies, what is undeniable is that our current broadcast and CanCon regulations were built for a different era and need to be either updated or scrapped. The government chose the former. Reasonable people can disagree on this.


What drew C-10 into the online speech debate were last-minute changes lifting the exemption on user-generated content (to ensure YouTube music would be included) and empowering the CRTC to regulate discoverability (to prioritize Canadian content in our feeds). What’s worse, the government has shamefully tried to shut down debate and force through a bill that at least ostensibly touches on free speech.

The unfortunate irony, however, is that by clumsily and unnecessarily expanding C-10 into the domain of free speech, the government has shown itself ill-prepared to defend – and has put in jeopardy – the passage of separate legislation that explicitly deals with speech: its planned online harms bill.

Learning from similar efforts in Europe, the government is thought to be developing plans to force platforms to remove already-illegal speech, and is considering a regulator to enforce this new take-down policy as well as to potentially implement a range of additional accountability measures, such as mandatory transparency reports and algorithmic audits. Other measures, such as dispute-resolution mechanisms that would let citizens appeal take-down decisions, are being discussed.

All of this is important, but it is easier said than done. Regulating speech is far more difficult than updating competition policy or reforming data privacy law (neither of which this government has delivered on to date).

The difficulty is in part because each country weighs the balance between the right to speech and the right to be protected from speech differently. The world will not uniformly adopt Silicon Valley’s, or America’s, definition of free speech. This means that as much as platforms might like to have one set of rules for the whole planet, there will be different approaches taken in different countries. It’s going to be messy.

It is also not always clear what is an act of online speech. Is it the comment typed by the user, or the algorithm that amplifies that speech to an audience, or the recommendation that a user join a group where that speech is posted? All of these are arguably speech acts but demand very different governance tools to regulate.


And once you have decided what counts as online speech, you need to determine who should be liable for it. Should platforms be viewed as neutral hosts for the speech of their users (like a utility), or should they be liable for the content that they distribute and profit from on their sites (like a publisher)? The answer is a bit of both.

Perhaps most worryingly, speech regulation is further complicated by the reality that illiberal-leaning regimes around the world, including Poland, Hungary, the Philippines, Turkey, Brazil and India, are increasingly using very similarly worded laws to crack down on the media, civil society and the free expression of their citizens.

For example, the Indian government has recently imposed a new set of sweeping regulations, what it calls “IT Rules.” While these rules sound familiar, as they require, among other things, that platforms remove content deemed defamatory, obscene and harmful to children, they are widely seen as a means of cracking down on the speech and political activities of Indian civil society.

This raises the question: should democratic governments abandon their efforts to govern online harms because their initiatives may be instrumentalized by illiberal regimes?

I would argue that the opposite is true.

Those of us lucky enough to live in democratic societies have a responsibility to show how online speech can be governed in a manner that both minimizes harm and prioritizes and reinforces core democratic principles.


It is undeniable that social-media platforms increasingly govern like states, some more responsibly than others. But they invariably do so in a manner driven by private rather than public interests, without the norms and institutions of democratic accountability we have rightly placed on our governments. And this has come at a significant cost.

Few people are calling for a completely deregulated platform ecosystem, where terrorists and child pornographers have free rein. So the debate is not really whether online content on social media should be regulated, but rather what the extent of that regulation should be.

While the question of how we govern online speech will rightly spark important and heated debates, doing it in a manner that prioritizes free expression and accountability will make the internet more democratic, not less.
