
‘A convenient fiction’: Author Tarleton Gillespie on Alex Jones and social media’s responsibility to moderate content

Apple, Facebook, YouTube and Spotify have removed from their services large portions of content posted by far-right conspiracy theorist Alex Jones and his Infowars site, while Twitter has not done so, fostering debate over the policing role social-media platforms should play in dealing with hate speech.

ILANA PANICH-LINSMAN/The New York Times

As social-media platforms such as Twitter and Facebook have become central hubs of conversation this century, their role in amplifying harmful rhetoric has itself become a source of debate. Swarms of bots, “fake news,” hate speech and conspiracy theories have led even Twitter chief executive Jack Dorsey to acknowledge that the “health, openness, and civility of public conversation” is at a low point.

Enter Alex Jones, the once-niche, far-right-wing digital broadcaster who through his brand, Infowars, has perpetuated, among other conspiracy theories, the claim that the 2012 Sandy Hook Elementary School shooting was staged with “crisis actors.”

Earlier this week, social media and content platforms including Apple, Facebook, YouTube and Spotify removed some content and accounts affiliated with Mr. Jones, for reasons including hate speech and the glorification of violence. Twitter, however, did not, which Mr. Dorsey said in a tweet was because he hadn’t broken the platform’s safety rules, which say that someone must “cross the line” into threatening violence to commit a violation. (A spokesperson for Twitter Canada declined to add further comment on Mr. Jones, citing Mr. Dorsey’s tweets as the company’s statement.)


The discrepancy between Twitter and other platforms' responses to Mr. Jones only further fuels the debate over the policing role platforms should play in addressing problems such as hate speech. Tarleton Gillespie, principal researcher at Microsoft Research New England in Boston, studies social media and has published a new book exploring these questions, Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media, with Yale University Press. He spoke with The Globe and Mail by e-mail this week about Twitter and other platforms’ decision-making.

Let’s start with Alex Jones. He’s known for pushing fake news on the broadest scale, including awful conspiracy theories about the Sandy Hook shooting. How have social-media platforms enabled this kind of voice to propagate?

Social-media platforms have spent years convincing users, and themselves, that they’re just hosts that provide an open space for users to speak. But this is a convenient fiction. Now that we’re getting a clearer understanding of how platforms work, and of how people take advantage of them, it’s increasingly clear that they invite and amplify certain kinds of really troubling speech.

Because platforms seem to presume that everyone is participating on genuine and fair terms, they often overlook those who tactically use the system to their advantage while appearing genuine. Alex Jones perfectly presses on a fault line that runs through social media today: He is willing to say things that are false and cruel, but dresses them up sometimes as legitimate political speech, at other times as mere theatre, and his readers like and forward them like the latest viral cat video. He produces the commodity they want and pretends to be the contribution they swear to protect.

Twitter took a different step from Facebook, YouTube and Spotify, which removed some of Mr. Jones’s accounts and content. Do you think there’s anything fundamentally different about Twitter’s approach to dealing with users who allegedly post hate speech or encourage violence?

For years, platforms have been making these moderation decisions on our behalf. Whether they make good decisions or bad ones, the fact that they do it for us may be the core problem, because they’re decisions that belong to the public: Where’s the line?

Public outrage is the closest we have right now to collectively considering these hard cases. It’s not necessarily a bad thing, at least in principle, that Twitter came to a different conclusion. I don’t agree with their decision, myself, but it’s probably a positive thing that platforms take different approaches to content moderation based on their values. Twitter’s decision makes a kind of sense given how their platform works and their past philosophy. We’ll see how strong the public backlash is.


Twitter CEO Jack Dorsey tweeted that “critical journalists document, validate and refute such information directly so people can form their own opinions.” Given what you’ve found in your research on moderation, is it fair for a platform such as this to pass off this responsibility?

The reality is, the major platforms lean on other experts and institutions in moderation all the time. Twitter has a “Trust and Safety Council”; Facebook brings in cultural and linguistic experts to help them moderate posts. And they ask us to flag objectionable content.

But this kind of support is meant to help them make a sensible intervention, not to justify shifting the responsibility of discerning harm onto someone else. It’s true that journalists should be reporting on the problematic aspects of Infowars. But whether they do, or how well they do it, shouldn’t shape a platform’s policy about whether Alex Jones gets to enjoy the benefits of that platform.

Mr. Jones’s discrediting of true stories can be seen as an attempt, as well, to discredit the mainstream media – the very people Twitter expects to verify the controversial statements made by people such as Mr. Jones. Do you think this process gives misinformation and its sources an inherent advantage on a platform such as Twitter?

If the news media today were widely seen as unassailable truth-tellers, able to perfectly and fairly expose lies and call out those who try to defraud the public debate, maybe platforms could rest a bit easier about what speech they should and should not circulate. But, right or wrong, we currently do not have that; the news media are fighting to stay afloat in an environment of distrust and confusion, sown in part by people like Alex Jones. We don’t need platforms stepping back from a responsibility for the health of public speech right now – we need them stepping forward.

Mr. Dorsey also tweeted on Wednesday about moderating the discourse on Twitter: “Relying on algorithms alone will not work. … We need to figure out how to help with economic incentives too. We’re behind on that, but thinking deeply about it.” In this context, what do you think could work?


I think public debate and political pressure are beginning to work, to a degree. If critics are calling the platforms to task publicly for content that they didn’t already see as worth removing, and the counterarguments for keeping Alex Jones are being heard, that’s some version of a public dialogue.

The reality is that content moderation is constantly happening, it impacts low-profile users as much as high-profile ones, and it is regularly being tested by new hard cases that are difficult to anticipate. So while public debate is good, something needs to gather these individual cases into a deeper and more coherent recognition of what speech like Alex Jones’s represents – conspiratorial bluster dressed as legitimate speech but built to delegitimize other speakers using any means necessary. And someone, whether it’s the platforms, the public or policy makers acting on behalf of the public, needs to put forward a coherent approach to that kind of speech.

This interview has been edited and condensed.
