Opinion

Kevin Chan is a policy director at Facebook and a 2020/2021 Technology and Democracy Fellow at Harvard. Rachel Curran is a policy manager at Facebook and a former director of policy in the office of Prime Minister Stephen Harper. Joelle Pineau is the Director of Facebook Artificial Intelligence Research Labs and an Associate Professor and William Dawson Scholar at the School of Computer Science at McGill University.

Every day, tens of millions of Canadians use Facebook because they find value in the connection and community that it provides. At the same time, concerns have been raised around the world about the impact of technology companies on our daily lives, and in recent days, about our company in particular.

This encapsulates the central challenge that companies like Facebook face: how to maximize the good and minimize harms. As three Canadians working directly on public policy and research at Facebook, we take very seriously the opportunity and responsibility to contribute to this effort, and to always strive to do better.

Importantly, we hear the calls for more regulation, and we agree. Matters of hate speech, online safety and freedom of expression are some of the most challenging issues of our time, and we have been vocal in calling for a new set of public rules for all technology companies to follow. As Canadian lawmakers seek to construct new frameworks for platform governance, we stand ready to collaborate with them.

Canada has the tools to regulate the Internet and has done so for many years on matters such as privacy, hate speech, consumer safety and election integrity. Just look at the significant platform regulation that was achieved with the Elections Modernization Act. As new regulations are developed in Canada, we need a constructive dialogue about how to achieve good social and public policy outcomes for all Canadians.

Important groundwork for this new set of regulations has already been laid. At Facebook, we agree that there should be greater transparency and accountability among social media platforms and have been building industry-leading solutions for many years. Take our Ad Library, for example. We are the only platform to provide this level of ad transparency in Canada.

We publish quarterly Community Standards Enforcement Reports that show what Facebook is doing to address multiple areas of prohibited content, per our Community Standards, including continuing to reduce the prevalence of hate speech. These reports are the most comprehensive of our industry and we have opened them up for independent auditing. We also recently released our Content Distribution Guidelines about the kind of content we demote on Facebook, such as fact-checked misinformation, borderline content, and suspicious virality.

When it comes to accountability, we have created an independent Oversight Board that renders final and binding content decisions on Facebook. This board provides external, transparent decisions on some of the most challenging cases we face.

Addressing harmful speech is important, but also complicated and requires careful balancing with freedom of expression considerations. A recent report by the Public Policy Forum noted that it could not arrive at a consensus on how to best address this significant question of fundamental human rights, and recent federal consultations have prompted significant concerns from around the world.

No one sector or institution, whether public or private, can fully address these challenges on its own. Sustained dialogue among government, society and industry on how to achieve the right balance across all these issues is needed. Collaboration like this during the pandemic has allowed us to identify and remove harmful COVID-19 misinformation, something we could not have done without the co-operation of governments and scientific bodies.

As for the oft-reported leaked data on the well-being of youth on Instagram, this is an issue we’ve never taken lightly. According to this research, most teen girls say that Instagram either makes them feel better or it doesn’t make very much difference one way or the other. But if our products are making things worse for even one person, it’s important for us to know that – and to understand why. While there will always be more work to do, it’s false to suggest that we ignore these issues. Our research efforts, which we subsequently released in full, show the exact opposite.

It is only by better understanding the risks that we are able to develop products, policies and partnerships to address areas of concern. The claim that Facebook is incentivized to maximize profit at all costs ignores the reality of this work, as well as the fact that neither users nor advertisers want to see hate on our platform. There is no incentive, whether moral or economic, for Facebook to build products that make people angry or depressed.

And despite what has been said about Facebook’s algorithms shaping people’s behaviours, we in fact provide industry-leading tools for Canadians to control for themselves what they see and interact with on Facebook.

Consider our 2021 Canadian Election Integrity Initiative. Our research helped us understand that Canadians wanted less politics in their feeds and we made product changes to address those concerns. We partnered with civic organizations like Equal Voice and Apathy is Boring to address candidate safety and civic engagement. We are cautiously optimistic that our collective efforts contributed to the integrity of the 2021 federal election and remain vigilant.

Making progress on these challenging questions requires space for nuanced discussion among a plurality of voices and perspectives, and collaboration across government, business, and civil society. This is the basis of how we operate in Canada. We look forward to continuing our ongoing work with stakeholders in a shared goal of good public policy outcomes for all of society.
