
Police forces across Canada have already begun using technology to predict who may become involved in illegal activity or where crimes might take place, an expert group warned Tuesday as it called on the federal government to protect residents from the potential perils of such tactics.

A report developed by the University of Toronto’s Citizen Lab and International Human Rights Program said Canadian law enforcement agencies have generally been much more cautious than their international counterparts when it comes to deploying algorithmic tools in their crime-fighting efforts.

But the report said algorithmic policing has undoubtedly arrived, opening the door for a host of potential constitutional and human rights violations that the country’s legal system is currently not equipped to handle.

“Canada is still in the early stages of adoption relative to other jurisdictions, but we have already gone too far from the perspective of making sure our human rights and constitutional laws keep up,” said Cynthia Khoo, report co-author and lawyer specializing in technology and human rights. “... The good news is that it’s not too late. The government has the chance to act now to implement robust legal safeguards that protect our rights to equality, privacy, and liberty.”

Khoo said the term “predictive policing” often conjures images of the film “Minority Report,” in which sophisticated surveillance is used to arrest would-be criminals before they can act on potentially nefarious intentions.

Tales abound in the United States about policing efforts that bring the film to mind, the report said. They include a now-abandoned Chicago Police initiative to mine personal data to determine someone’s risk of carrying out a shooting, and a scuttled Los Angeles Police strategy that saw officers targeting possible crime hotspots based on information gleaned from utility bills, foreclosure records and social service files.

Khoo said Canadian police forces have access to some of the same technologies, but have been more circumspect about using them in predictive policing scenarios.

But she said some such initiatives are already underway, including a Vancouver Police Department effort to forecast property crime.

She said the Geodash Algorithmic Policing System draws on historical data from within the force to determine when and where break-and-entry crimes may take place.

Vancouver Police officials said the project, launched at the end of 2016, led to a 20-per-cent decline in burglaries in the first six months alone. But in an article written for the International Chiefs of Police Magazine, the officer leading the effort said the force is aware that the benefits come with risks that law enforcement officials must take care to mitigate.

Special Const. Ryan Prox said the force is aware that algorithmic policing can disproportionately target marginalized communities and said the force doesn’t use such tools when that risk is elevated.

“Vancouver’s Downtown Eastside includes some of the most marginalized and vulnerable populations in Canada,” Prox wrote. “To better address the needs of this very diverse community, a community policing approach ... is being used instead of predictive policing.”

Elsewhere, Khoo said the Saskatoon Police Service has established a similar partnership with the University of Saskatchewan and the provincial government.

Khoo said the Saskatchewan Police Predictive Analytics Lab is developing an algorithm to help identify those at risk of going missing, noting the program has not yet gone live.

Saskatoon Police spokeswoman Alyson Edwards said the project, funded in part by Defence Research and Development Canada, has led to “a better understanding of risk factors and patterns associated with the likelihood of being reported missing to police.”

Projects in the works include analytics to support child death reviews and narcotics cases, she added.

Khoo said numerous Canadian police forces have been less cautious in their deployment of facial recognition technology.

She said such tools, while lacking predictive capabilities, are often launched without public notice and used to track and log residents as they attend protests or otherwise take part in constitutionally protected activities. Social media monitoring is another common practice, she added.

Khoo said any tech-driven law enforcement solution risks not only violating privacy and liberty rights, but further entrenching systems that have disproportionately targeted marginalized groups for years.

“While algorithmic policing technology may seem futuristic, it is inseparable from the past,” she said. “The historical and ongoing patterns of systemic discrimination in Canada’s criminal justice system are embedded in police data and related databases. Where this data is used to train algorithms ... the resulting algorithmic technologies will replicate, amplify, and exacerbate those discriminatory patterns.”

Jonathan Rudin, program director at Aboriginal Legal Services, agreed, noting algorithms have already been used to monitor Indigenous rights protesters and to assess a person’s risk of breaching conditions when released on bail.

“These are the ones that we’ve seen, and they cause us great concern,” he said.

Khoo said numerous police forces are exploring the possibility of adopting further tools despite the fact that Canadian laws written before such options were available are not equipped to regulate how they can be used.

The Citizen Lab report contains 20 recommendations to ensure fundamental rights are not infringed amid the shift toward predictive policing.

They include a call for the federal government to ban algorithmic tools that rely on historical data until it can hold a judicial inquiry to assess the implications of using such technology. Khoo also called on all police forces to disclose past, current and future algorithmic tools in use.

“We cannot have accountability without transparency,” she said. “The public and lawmakers must know the full factual state of affairs to ... determine whether or how these technologies should be used at all.”

Public Safety Canada said government officials would take time to review the report’s findings and recommendations, but offered no further comment.
