This is the weekly Amplify newsletter. If you’re reading this on the web or someone forwarded this e-mail newsletter to you, you can sign up for Amplify and all Globe newsletters here.

I remember what the air felt like the night I heard comedian Marcia Belsky talk about being banned from Facebook on an episode of the podcast Lady Lovin’. It was a biting December evening in Montreal, and I had popped in my earphones on the bus ride home from a professor’s office hours. As I listened to Belsky (at the time a stranger to me) describe what it feels like to be publicly harassed on a major social media platform, then banned for responding to the harassment, I was overcome with rage. When the bus reached my street, I stepped off and stomped through the fresh snow, up the stairs to my apartment, where I immediately sat down and opened up a new e-mail. I drew up a research pitch with a summary of Belsky’s story and links upon links to news articles on her case, addressed it to a professor and hit “send.”

I’m Audrey Carleton, a content editor in The Globe and Mail’s summer staff program. I recently graduated from McGill University where I conducted an independent study on Facebook’s content moderation policies and their application across demographic and identity groups.

The research process was daunting. The first step was to identify the policies surrounding hate speech and user expression that guide Facebook’s content moderation practices. At the time of my research, the social media giant had published few of these policies (it only made many of its internal guidelines publicly available in April, 2018) and offered only a vague set of Community Standards. I had to rely on leaked internal training documents, boldly made public by investigative journalists like The Guardian’s Nick Hopkins and ProPublica’s Julia Angwin.

I quickly learned that Facebook treats reported hate speech differently depending on the targeted group. “Protected” categories include religious affiliation, sexual orientation and race; “unprotected” ones include social class, political ideology and age. Content moderators (most of them outsourced workers in developing countries such as the Philippines) are taught to delete any statement posted on Facebook that targets only protected categories, but to leave up statements that target a protected category and an unprotected category in the same turn of phrase. In practice, this means a statement that targets white men will be quickly removed, because it attacks two protected categories: gender and race. Meanwhile, a statement that targets, say, black children will not be removed, because alongside a racial group (otherwise protected) it attacks children, a group within the unprotected “age” category. Confused? This New York Times quiz lists a few examples.
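For readers who think in code, the leaked rule amounts to a simple conjunction: a post comes down only if every group it attacks falls into a protected category. Here is a minimal sketch in Python; the category lists and the should_remove function are my own illustrative assumptions, not Facebook’s actual code or complete policy.

```python
# Hypothetical sketch of the moderation rule described above.
# Category lists are illustrative assumptions, not Facebook's.
PROTECTED = {"race", "religious affiliation", "sexual orientation", "gender"}
UNPROTECTED = {"social class", "political ideology", "age"}

def should_remove(targeted_categories):
    # Delete only when every group the statement attacks is in a
    # protected category; one unprotected category in the mix means
    # the statement stays up.
    return all(category in PROTECTED for category in targeted_categories)

# "White men" attacks gender and race, both protected -> removed.
print(should_remove({"gender", "race"}))  # True
# "Black children" attacks race (protected) and age (unprotected) -> kept.
print(should_remove({"race", "age"}))     # False
```

The sketch makes the asymmetry plain: adding any unprotected modifier to an otherwise protected group flips the outcome from removal to retention.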

Searching primarily through other social media sites, I unearthed countless testimonies from women, many of them women of colour, who had their words erased from the site and their accounts briefly shut down with little explanation. I also discovered the Facebook Jailed project, a website and Twitter account run by comedian Kayla Avery devoted to sharing the stories of Facebook users who have been banned from the site.

The common thread uniting many of the cases was that the user had been reported and removed for publishing words that challenge forms of privilege and structures of oppression. There’s Belsky, who was famously removed from Facebook after commenting “men are scum” on her friend’s album of screenshots of the rape and death threats the friend had received on Twitter.

Then there’s Black Lives Matter activist DiDi Delgado, whose account was taken down after she posted a status saying, “All white people are racist. Start from this reference point, or you’ve already failed.” Or Molly McIsaac, a YouTuber who was banned for calling out a man threatening to attack and sexually assault her.

While I personally have never been banned from Facebook, I developed a profound sense of empathy with those who have. I rely heavily on social media for connecting with other professionals in my industry, cultivating a reputable public image and promoting my work. For comedians like Belsky, Facebook is a platform for promoting shows and boosting ticket sales. For activists like Delgado, it’s a space for educating and mobilizing others. Many users rely on the site to make a living, so being forced off it can have tangible (read: financial) consequences, perpetuating existing gaps in access to resources and socioeconomic power across demographic groups.

Facebook’s content moderation practices are built upon forms of supremacy that govern real life – and worse, the site upholds, even perpetuates, these very structures by silencing already marginalized groups whose speech threatens those in power. I came to realize over the course of my analysis that for any social media site to be truly safe for all of its users, it has to actively work against systems of oppression, moderating content in a way that acknowledges those supremacies.

And to users who find themselves on the receiving end of an unjust ban, my advice would be to remember that you are far from alone. If you have the energy, share your story on other platforms, continue to hold this chronically opaque site to account, and exercise your right to have a voice on the web.

What else we’re reading

I’ve become obsessed with The Pudding, a data visualization site full of projects analyzing pop culture phenomena. I stumbled upon the site when I found Russell Goldenberg and Matt Daniels’s interactive multimedia breakdown of why Ali Wong’s comedy set in Baby Cobra is so successful. This took me to other projects that are revolutionizing data visualization and dismantling power structures within mainstream entertainment at the same time, like Ash Ngu’s analysis of the airtime gender gap on This American Life and Hanah Anderson and Matt Daniels’s exploration of differences in film dialogue by gender across 2,000 American screenplays.

Inspired by something in this newsletter? If so, we hope you’ll amplify it by passing it on. And if there’s a woman you think our readers should know about, tell us about her. Send us an e-mail at amplify@globeandmail.com.