The Globe and Mail

The danger of using data to prevent crimes before they occur

Opinion

As we reckon with the latest incident in a long line of mass shootings, a deeper question than effective gun control emerges: When we have so much data about everything and everyone, why wait for killings to happen?

Tim Wu is a professor at Columbia Law School whose books include The Attention Merchants: The Epic Scramble to Get Inside Our Heads and The Master Switch: The Rise and Fall of Information Empires.

Following the horrific school shooting in Parkland, Fla., many began to ask: Might something have been done earlier? The FBI, it turns out, did receive a tip that Nikolas Cruz, the suspect, might just want to shoot up a school. Mr. Cruz himself had made disturbing social-media posts and comments. If that kind of data and more were findable, shouldn't someone have taken action?

There is one thing already obvious to the majority of North Americans – that disturbed individuals such as Mr. Cruz should not be allowed to buy military assault rifles and ammunition. (If only this were obvious to the U.S. Congress.) But embedded here is another, deeper question that goes beyond gun control: Why wait for killings to happen? When we have so much data about everything and everyone, if we can predict Tuesday's traffic and next week's hurricane, surely we should be able to prevent far more violence. Shouldn't we be using all technological means possible to stop crime before it occurs?

The concept of such "preventive policing" offers great temptations, particularly in the aftermath of tragedy. It recalls the science-fiction film Minority Report, based on Philip K. Dick's short story, in which widespread "future crime" arrests are the norm, based on a prediction system run by mutants floating in a tank. The fact is we aren't really that far away from such a future, relying not on mutant brains but those of Silicon Valley, coupled with the power of big data. But, whatever the attraction, the prospect for very dark results means it is a temptation that should be resisted.

The fantasy of the individual case – stopping a mass shooter such as Nikolas Cruz – needs to be weighed against the broader costs, namely, the prospect of false detentions and the targeting of particular groups. There are many people who think or post disturbing ideas, and plenty more who might be, for reasons of age, gender, race or other indicia, deemed "suspicious." As mathematician Cathy O'Neil, author of Weapons of Math Destruction, argues, our prediction algorithms are never really neutral, but invariably reflect our prejudices and past experiences.

A better idea of what might happen comes from looking at China, the world's leader in big-data preventive policing. In recent years, Chinese security forces have been building "Police Clouds" designed to predict and prevent both crime and political unrest. The most extreme manifestation is found in the Western Chinese region of Xinjiang, from which much can be learned.

Xinjiang (the "new frontier") is a remote desert region, bordering the Himalayas, that is sometimes called East Turkestan. To a visitor, it looks and feels Central Asian, and its majority population is not Han Chinese, but the Muslim Uyghur. As in Quebec during the bad old days, the Uyghur are discouraged from speaking their language and face discrimination in employment, among other indignities. Many want their own country; consequently, in recent years, as Human Rights Watch documents, the members of this ethnic group have come under electronic surveillance – tracking what they buy, where they go, with whom they meet. Actions such as buying large supplies of food, amassing a supply of books or meeting suspicious people are fed into an algorithm that determines each individual's potential dangerousness and level of potential political disobedience. Those who amass enough suspicion points on an algorithmic scale become targets for an involuntary "re-education" program – in other words, arrest and detention based on future dangerousness.

That's China, and surely, you might think, such a thing would be impossible here in the West. But on closer examination, many of the pieces are already in place. Our phones know everywhere we've been. Amazon and credit-card companies know what we buy. Google has our e-mail, while Facebook knows with whom we are friends and why, and roughly what is going on in our personal lives. Governments hold arrest and other records. It is true that in China surveillance is involuntary, while here we "agree" to data collection by clicking on a mass of fine print. But if anything, given how wired the population is, the data available on North Americans is actually more detailed and complete. It just needs someone – law enforcement – to bring it all together.

Meanwhile, the legal system has evolved to punish "inchoate" crimes, that is, crimes that haven't happened yet. The crime of "conspiracy," widely charged, is really an agreement to commit a crime. In many jurisdictions an "attempted" crime is not just, as you might think, the failed bank robbery, but mere preparation for crime. Misdemeanour crimes (petty offences such as littering or traffic violations) can be and are used as a means to detain people who might be thought to be dangerous in the future. In cities such as New York, hundreds of thousands are arrested yearly on the premise that doing so might prevent future criminal acts. The New Orleans and Chicago police departments have already been using data-driven predictive policing to find "hot spots" and "targets" based on big-data technologies provided by Silicon Valley data-mining firms.

Put all of this together and it is just a short hop to a fuller system of pre-emptive detention that, if not quite the Chinese system or Minority Report, comes close. The next step would be the FBI, or perhaps Canadian law enforcement, beginning to systematically demand big data from Google and Facebook and using it to arrest or detain those it thought posed a future danger of serious crimes. It is not hard to see how, if promoted in the United States as an alternative to gun control, an aggressive prevention program could yield widely popular results. For there might very well be future domestic abusers, burglars, pedophiles and murderers who get stopped before they act. In that sense, some might say, what's not to like?

It does sound appealing – that is, at least, until you start thinking not about the crime prevented, but the innocent people caught in the web of arrest and detainment, their reputations ruined forever and lives shattered. Or when you consider the prospect of living in an environment where even one's private musings and idle thoughts become prospects for detention. It goes back to ancient principles: In 1765, William Blackstone argued that "it is better that 10 guilty persons escape than that one innocent suffer."

Today, what we are facing is the question of whether we really believe that, or whether it was something we said when the technology wasn't good enough to make the odds a bit better. Let us live by our principles, even if the fantasy of stopping the mass killer at the right time is hard to resist. We must ask whether it would be worth it if it meant always wondering whether your turn might be next.
