As elected officials place internet giants such as Google and Facebook under an increasingly intense microscope, the pressure mounts on those companies to play more proactive roles in policing content on their networks. In recent weeks, the demands have come from seemingly every direction: privacy commissioners seeking rules on the removal of search results, politicians calling for increased efforts to address fake news on internet platforms, and internet users wondering why the companies are slow to take down allegedly defamatory or harmful postings.
Internet companies can undoubtedly do more, but laying the responsibility primarily at their feet poses its own risks as governments and regulators effectively cede responsibility for content moderation and policing to private, for-profit companies. In doing so, there is a real chance that the internet giants will become even more powerful, limiting future competition and entrenching an uncomfortable reliance on private organizations for activities that are traditionally conducted by courts and regulators.
Contrary to some claims, there has never been a fully hands-off approach to internet regulation. All internet companies – like any other company – respond to court orders to take down content or to disclose the identity of their subscribers or users. The major companies such as Google, Facebook, Microsoft and Twitter also regularly release detailed transparency reports that provide insights into lawful requests and take-down efforts.
Some companies have proactively attempted to block or mute certain content. Facebook CEO Mark Zuckerberg emphasized his company’s success in combatting terrorist materials during his U.S. congressional appearance last week, noting that the technology is sufficiently effective to ensure that the vast majority of such posts are never viewed by anyone.
Similarly, YouTube, the world’s largest video site, automatically flags copyright-infringing content identified by rights holders, which is then muted, taken down or used to generate advertising revenues for the rights holder. These content-moderation efforts require significant resources, with hundreds of millions of dollars invested in staff and in automated systems that help identify content.
Before politicians or regulators mandate additional requirements, we should recognize the risks associated with outsourcing responsibility for content moderation to internet companies.
First, mandating broader content moderation and take-downs virtually ensures that the big players will only get bigger given the technology, research and personnel costs that will be out of the reach of smaller companies. At a time when some of the internet companies already seem too big, content moderation of billions of posts or videos would reaffirm their power, rendering it virtually impossible for upstart players to compete.
In fact, we are already perilously close to entrenching the large internet players. At a conference on large-scale content moderation held earlier this year in California, there was a wide gap between companies such as Google and Facebook (which deploy thousands of people to the task) and smaller companies such as Medium, Reddit and Dropbox, which have hundreds of millions of users but only a handful of people focused on content moderation issues.
Second, there remains considerable uncertainty about what politicians actually want. Last week, members of Congress took turns criticizing Facebook for not doing enough to take down content and for doing too much. For example, Representative David McKinley wanted to know why Facebook was slow to remove posts promoting opioids, while Representative Joe Barton raised concerns about Facebook taking down conservative content.
Similar issues arise in other countries. For example, Facebook faces potential liability in the millions of dollars for failing to remove hate content in Germany, but earlier this month a German court ordered the company to restore comments it had deemed offensive.
Third, supporters of shifting more responsibility to internet companies argue that our court systems or other administrative mechanisms were never designed to adjudicate content-related issues on the internet’s massive scale. Many internet companies were never designed for it either, but we should at least recognize the cost associated with turning public adjudication over to private entities.
Leaving it to search engines, rather than the courts, to determine what is harmful and should therefore be removed from search indexes ultimately empowers Google and weakens our system of due process. Similarly, requiring hosting providers to identify instances of copyright infringement removes much of the nuance in copyright analysis, creating real risks to freedom of expression.
Advocates of increased regulation are quick to point out that the internet is not a no-law land. Yet if the determination of the legality of online content is left largely to private internet companies, we may be consigning courts and regulators to a diminished role while strengthening the Googles and Facebooks, even as concern grows over excessive power in the hands of a few internet giants.
Michael Geist holds the Canada Research Chair in Internet and E-commerce Law at the University of Ottawa, Faculty of Law. He can be reached at mgeist@uottawa.ca or online at www.michaelgeist.ca.