
The Cleaners will be at the Hot Docs festival next week. (Courtesy of Hot Docs)

Child pornography. Beheadings. Immolation videos. Every day, thousands of pieces of repugnant content are removed from Facebook by a secret army of censors. (After the van attack in Toronto on Monday, Facebook deactivated the account of the alleged assailant, who had posted a call for femicide.) But this censorship also overreaches, deleting important examples of political dissent.

The disturbing new documentary The Cleaners, which will be at the Hot Docs festival next week, pulls back the curtain on Facebook’s global content moderation, revealing the extraordinary psychological toll on cubicle workers in the Philippines who are paid a pittance to sift through the world’s digital sewage. On Tuesday, we spoke with Hans Block, one of the film’s co-directors.

You’ve got impressive timing: This week, Facebook succumbed to years of pressure and publicly released its internal guidelines on permissible content. How long was this film in the works?


There was a case in 2013 of a child sexual-abuse video that went online in the United States and was shared about 60,000 times on Facebook and “liked” thousands of times. We started asking, “Why is that happening, and is there something that’s filtering the platform, curating the content we upload?”

You presumed the moderation was being done by artificial intelligence.

Yes, and the media scholar Sarah T. Roberts, who’s in our movie, told us there are thousands of people in the developing world sitting in front of a screen eight to 10 hours a day, reviewing the content.

At first, it seems like economic and cultural imperialism, paying people in Manila a couple of dollars an hour to do this numbing, awful work. But then it appears that there’s a kind of reverse cultural imperialism: They’re applying a Filipino standard across the world. You profile one moderator, a very religious woman who considers penises “sinful.”

The Philippines is a very Catholic country. And their way of thinking, their mindset, has a huge effect on how they decide about content.

Some of the moderators also don’t seem comfortable with satire or strong political speech. So they remove satirical content that is critical of Donald Trump or Turkish President Recep Tayyip Erdogan.

One of the reasons Facebook chose the Philippines for outsourcing – they could have gone to Pakistan or India, which are also low-wage countries – is that the Philippines has a history of colonization. Spain occupied the country for 300 years, and the Americans [occupied it] for 100. Companies use this to promote outsourcing there: They say Filipinos share the same values as the Western world. Our experience was quite different. It’s sort of a postcolonial continuation of history, and all these decisions have an impact on our digital public sphere.


Moderators are each asked to review about 25,000 pieces of content every day, which cover a huge range of issues.

It’s so complicated to know all the cultural background of the content. Satire is just one example. It becomes much more difficult in regions where a crisis or a war is going on: How do you distinguish between a terrorist group and a freedom-fighter group that is trying to bring freedom to the country?

Moderators often seem to miss content they should be taking down, such as hate speech that incites violence.

You can make mistakes on one side when you delete too much, but you can also make mistakes when you ignore too much. That can lead to genocide, as in Myanmar, where thousands of Rohingya refugees have had to flee the country because hate speech is amplified every day on these social-media platforms. This hate speech has real consequences.

Someone in the film notes that we like to believe that technology is neutral, free of judgment, but it’s clearly not – the moderators are only one of the more obviously capricious elements determining what we see.

When they started building those technologies, those tools, they were just interested in – as [Facebook’s CEO] Mark Zuckerberg says – connecting people. Building a platform where people can spread the word, publish statements or content. They don’t accept that they make editorial decisions. But they do. Not just through content moderation, but through the news feed, which is also curated: You see things Facebook thinks are interesting to you, and it hides other things. That is also an editorial decision.


While making the film, you were exposed to some of the awful content that the moderators deal with. What kind of an effect did that have on you?

It is tough, but the main difference is that I can talk about my feelings, I can take a break and talk to psychologists. These content moderators are not allowed to speak a word to their closest friends or their families, because they signed non-disclosure agreements. That is the worst thing you can do if you’re traumatized, and it can lead to the worst consequences, which is also part of our movie: The suicide rate is extremely high.

The Cleaners will screen as part of the Big Ideas series on Monday, Apr. 30 at 6:30 p.m., followed by a Q&A with directors Hans Block and Moritz Riesewieck, and the UCLA assistant professor of information studies, Sarah T. Roberts. It will screen again May 2 and 4. For more information: hotdocs.ca

This interview has been condensed and edited.
