Former Supreme Court chief justice Beverley McLachlin and Taylor Owen, director of the Centre for Media, Technology & Democracy, are co-chairs of the Canadian Commission on Democratic Expression. Peter MacLeod is the chair of the Canadian Citizens’ Assembly on Democratic Expression.
Last week’s Senate testimony by Facebook whistle-blower Frances Haugen, along with the Wall Street Journal’s reporting on the thousands of pages of internal research and communications she leaked, has the potential to change the debate about regulating social media, and should inform how Canada does so.
There have, of course, been many controversies over Facebook and other social media platforms. There have been previous whistle-blowers, many leaks and regular government hearings around the world. But this time is different, for three reasons.
First, while civil society leaders, researchers, investigative journalists and policy-makers have long identified and documented the harms of social media, Ms. Haugen has provided us with detailed documentation of internal research. While Facebook has denied these harms, and been very reluctant to share data that would allow for their independent study, we now know that they knew. These problems include harming the well-being of our children, the proliferation of hate speech and undermining the integrity of our democratic norms.
Second, Ms. Haugen has also shed light on the causes of these harms. While Facebook and other social platforms would like to suggest they are simply mirrors of society, reflecting back our own prejudices, divisions and social problems, they also play an important role in shaping them. Their algorithms shape the behaviour of their users, playing an important role in who and what is seen and heard, and these algorithms are calibrated for engagement. It turns out that too often content that engages us is also content that causes harm. And they know that.
Third, Ms. Haugen’s leak demonstrates the limits and failures of relying on self-regulation to mitigate these harms. The documents she leaked and her testimony show that when faced with choices between minimizing the harm identified by their own research or maximizing profit through growth and engagement, Facebook often chose the latter. This should not be surprising. Facebook is one of the most profitable companies in history, and it got here by being, since its founding, singularly focused on growth and by wiring its incentive structures for it.
In short, Ms. Haugen has finally focused our conversation on the right problem: corporate decisions, product design and incentive structures that too often prioritize profit and growth over public safety and democratic responsibility. Instead of starting with the outcome of this structural problem – harmful speech – Ms. Haugen rightly calls for governments to focus on ensuring greater accountability and transparency over the companies that shape it. Fortunately, we have many of the regulatory tools we need to solve this problem.
The Canadian government has spent the past year developing legislation to address online hate speech. It is understandable why this was targeted first; hate speech is experienced viscerally by both politicians and the public – especially minority and marginalized communities. There are certainly some sensible things that could be done to ensure that already illegal speech is sufficiently enforced online, as we recommended last year. And while proposed legislation has helpfully provided the governance architecture to regulate social media (a new regulator and a council of experts to advise it), the government has been criticized for empowering these new authorities to act as public censors, overly focusing on the symptom of the problem (harmful speech) at the expense of the cause (the scale and incentives of the platforms themselves).
We believe the government now has an opportunity to empower this new regulator to focus on precisely what Ms. Haugen calls for: accountability and transparency. This is why the Commission and Citizens’ Assembly on Democratic Expression, an initiative led by Canada’s Public Policy Forum, are focusing our work this year on policies that provide greater transparency into the behaviour and societal effects of these companies and greater accountability over their actions.
This is not a novel idea. It is precisely the oversight we demand of other sectors. We don’t simply trust Pfizer to develop a safe COVID-19 vaccine, we demand to see the evidence that it is safe. For social media, the policy tool kit could include better data sharing, transparency in the online ad market, robust auditing of algorithms, mandated harm and risk assessments, and exploring new forms of liability for the companies themselves.
This approach would minimize the hazards of restricting speech and instead focus on applying the same daylight provisions we demand of other industries. Greater transparency and accountability will make our public sphere healthier and strengthen our democracy. This is where governments should start.