Taylor Owen is the director of the Centre for Media, Technology and Democracy at McGill and the host of the Big Tech podcast.
Last week, social-media companies held hands, plugged their noses and jumped together, suspending the President of the United States from their platforms and delivering, in effect, the de-platforming of Donald Trump, which many thought was long overdue.
The kindest interpretation of this move is that the platforms, responding to the insurrection in Washington, acted swiftly and decisively to protect the country from further unrest, marking an evolution from their youthful and naive pretenses of neutrality to a more mature posture as responsible gatekeepers. More cynically, this moment could be seen as a deferential gesture toward an incoming president whose party now has a lock on the White House and Congress, made in the dying days of an administration that these very same companies had accommodated for four years.
The ban may well be all of these things. But it is also a distraction from the core challenges we continue to face in our online ecosystem.
Every social-media platform has different content moderation policies and systems, and while all have the veneer of technocratic due process, in reality they all operate in a largely ad-hoc manner, applying policies differently to each of their billions of global users. Last week’s events demonstrated the failure of this approach. Just take Facebook, which arguably takes content moderation more seriously than other platforms. It’s far from obvious how Mr. Trump’s disqualifying posts were worse than previous ones, why Facebook should ban him while other divisive world leaders remain, or what to make of the untold other egregious violations by political figures and citizens alike. And as legal scholar Evelyn Douek pointed out, Facebook’s new oversight board, intended to be an arm’s-length adjudicator of its policies, can’t even review the suspension, because it is mandated to review only posts that are taken down, not accounts – an abdication of responsibility, on a technicality.
A focus on the banning of Mr. Trump’s accounts also distracts from the power displayed by recent de-platforming efforts. Parler, which promotes itself as a free-speech social-media haven for its more than 12 million users, has become home to some of the more toxic elements of U.S. politics, including the organizers of the Stop the Steal conspiracy theory and the violent insurrection at the Capitol. In response, Google and Apple have banned new Parler downloads from their app stores; Amazon’s AWS has announced it would kick Parler off its servers entirely. This de-platforming of a platform, however justified, demonstrates the consequences of market concentration: companies that own the infrastructure can make unilateral decisions about the companies that use it, and in so doing radically shape the nature of political and economic activity.
Ultimately, playing whack-a-mole with bad content, including many of Mr. Trump’s tweets, mistakes the symptoms for the structural causes of the problem. Conspiracy movements such as QAnon, for instance, did not arise organically; they grew as a function of the platforms’ own design and business models.
Platforms make money by using vast amounts of data about users to determine how to target content designed to influence our behaviour (to sell ads), and how to hold our attention (so we see more ads). To manage the mind-boggling scale required – Facebook, for one, handles more than 100 billion transactions a day – platforms use either vast teams of content moderators, or highly imperfect AI, which increasingly dictate the character of our public sphere. This scale is simply impossible to manage responsibly, and the result is systemic failure. One internal Facebook study, for instance, found that 64 per cent of all new users joining extremist groups were a result of its recommendation tools, while the New York Times reported that, after the U.S. election, “Facebook’s algorithm drove 100 new people to join the first ‘Stop The Steal’ group every 10 seconds.”
These platform companies have become so large that we can no longer rely on the free market to correct for potential harms. And if the market can’t address that power imbalance, then the only alternative is democratic governance. In the interests of political expediency and a hope that the market could regulate itself, governments have left the governance of the digital domain to a handful of private companies, whose goals may or may not be aligned with the public good. This approach has failed. And so democratic governments must now lead difficult conversations: about what speech should be allowed online, about implementing accountability and transparency mechanisms for the data-driven economy, and about the necessity of competition in what has become an oligarchic industry.
Banning Mr. Trump’s accounts won’t begin to address the harms so clearly on display last week. Only these debates can.