Navneet Alang is a technology and culture journalist based in Toronto
There is something depressingly familiar about it now: A new Internet technology emerges, full of promise, and it is soon used for something terrible. It happened with YouTube. It happened with Twitter. And, most recently, it has happened with Facebook Live, which was used in a horrific incident in which a Thai man broadcast the murder of his toddler before killing himself.
The relatively new feature was supposed to let users simply stream live video from their phones wherever they are. When it was introduced in April of 2016, Facebook CEO Mark Zuckerberg said it represented "a big shift in how we communicate" and would "create new opportunities for people to come together."
But in addition to the most recent tragedy, Facebook Live has also been used to show the beating of a disabled man, a gang rape and the murder of a man in Cleveland in a video that was left up for more than two hours.
That Facebook neither predicted this possibility nor had adequate protections in place is a symptom of a sickness in Silicon Valley, whereby companies build features and worry about consequences later. It is part of a broader Facebook issue, too, in that it simply refuses to take responsibility for what it has become: A hybrid of media and tech that is now an enormous gatekeeper of culture.
Unfortunately, these sorts of events that turn murder into media predate Facebook Live. In 2015, Alison Parker and Adam Ward were gunned down during a live TV interview. The video of their deaths was put on Twitter and Facebook for millions to see.
It was a tragedy, but also an extreme example of a broader trend. From abuse on Twitter to child pornography and crime online, it has become clear that any time a technology is opened up with no boundaries at all, both amazing and awful things can happen. It is up to technology companies to foster the former while clamping down on the latter.
Abuse of Facebook's technology should thus have been easy to foresee. But the fact that so little forethought was given to potential misuse is a product of how digital media companies thrive on attention – literally. Because their financial model relies on capturing eyeballs, everything they do is attuned to keeping people glued to their screens.
The obvious downside: In prioritizing eyeballs, the cultural effects of what people are paying attention to fall by the wayside.
Technology is not simply some neutral thing. It neither just reflects who we are, nor entirely changes us. Instead, it creates a new set of circumstances that we must deal with anew. In the case of Facebook Live, those seeking attention will be drawn to it, and given the vast number of people using Facebook – now creeping close to two billion users, a full quarter of the world's population – there will, of course, be extreme events.
The obvious question, then, is what is to be done? Reactions on social media ranged from calls to shut the service down to proposals to limit live broadcasts to verified users who already have a public presence. But each is less than ideal. Consider the shooting of Philando Castile by Minnesota police, which was recorded by Mr. Castile's girlfriend, Diamond Reynolds, using Facebook Live. Without video evidence, it is perhaps less likely that the officer involved would have been charged with manslaughter, or that the public would show an increasing awareness of police brutality. Technology is ambivalent, and when we ban it or constrain its use to the few, we lose its power to help as well as harm.
Instead, the answer is for Facebook to take responsibility for what it is: A massive cultural presence that is part media, part tech, and hugely influential. Time and time again, whether regarding privacy, advertising, the media industry or the content displayed on its network, Facebook has had a policy of acting in its own interest first and thinking about consequences later.
As with the issues of fake news and trending topics from a few months ago, the likely solution for Facebook Live rests on more human intervention – a bigger, much more responsive team ready to take down violent content when it appears. In short, the solution is for Facebook to behave like the media behemoth that it is.
Failing that, there is another option. When corporations repeatedly fail to regulate themselves and show no concern for the public interest, usually only one other entity can rein them in: The government.