Hoan Ton-That, founder of Clearview AI, shows the results of a search for a photo of himself, in New York, Jan. 10, 2019.

AMR ALFIKY/NYTNS

Canada’s largest municipal police service has acknowledged “informally testing” a powerful surveillance tool, but said it has stopped its use pending reviews.

The Toronto Police Service had initially denied last month that it was using the Clearview AI software. The technology has sparked concerns from privacy advocates who say the use of emerging surveillance tools by law enforcement agencies is not being adequately scrutinized.

In Canada, such technologies can exist in a legal grey area, allowing detectives to advance their investigations while not necessarily disclosing techniques to the accused, judges or the wider public.

“Some members of the Toronto Police Service began using Clearview AI in October, 2019, with the intent of informally testing this new and evolving technology,” the police service said in a statement e-mailed to reporters on Thursday. But Chief Mark Saunders last week “directed that its use be halted immediately upon his awareness,” the statement added.

The police service did not say where or why the tool was used, or by which unit, nor did it explain why the chief would not have been told about its use at the outset. The statement also did not clarify whether the tool was used in any criminal investigations, or whether judges had authorized its use.

The TPS statement says that the police force has reached out to Ontario’s Information and Privacy Commissioner and the Ministry of the Attorney-General to review the technology “and its appropriateness as an investigative tool for our purposes.”

The statement says that until reviews are completed, Clearview AI “will not be used by the Toronto Police Service.”

Clearview AI was thrust into the spotlight in January by a New York Times article revealing that a technology company by that same name was selling its facial-recognition product to police across North America.

Forms of facial-recognition software are now routinely used by police services in the U.S. and Canada, although often in relatively limited capacities – for example, to scan police-held mugshot databases in an effort to compare images.

Clearview’s app was built as a police tool by using billions of publicly accessible photos sourced from popular social-media websites. Because these images were taken from the websites without explicit user consent, social-media giants such as Facebook, Google and Twitter reacted to the revelations by serving cease-and-desist letters on Clearview AI.

In recent weeks, several Canadian police forces, including the Toronto Police Service, have released statements saying that they were not using the tool.

The Toronto Police Service has previously had to correct other misstatements about the specific surveillance technologies it uses.

In 2015, for example, a TPS spokesman told the Toronto Star that the force’s officers were not using devices known as IMSI catchers. Such surveillance tools are used by police to indiscriminately grab data from all cellphones in an area, in hopes of tipping off detectives to the phones carried by crime suspects.

But court documents later surfaced showing that police had in fact been using the technology as early as 2014.

Privacy expert Ann Cavoukian said she was very disappointed to learn police in Toronto had been using Clearview AI’s technology.

“Your facial image is your most sensitive biometric,” said Ms. Cavoukian, formerly the Ontario government’s privacy commissioner. “This kind of unacceptable technology certainly should have gone up the ranks and gotten approval from the chief, not just used incidentally by one arm of the Toronto Police Service,” she said.

“We have jurisdictions in the United States that are outright banning facial recognition because of all its inaccuracies,” she added. “It’s a nightmare if you are incorrectly accused.”

Ms. Cavoukian, however, applauded Chief Saunders for putting the tool’s use on hold. “He took the right action,” she said. “We need a proper investigation.”
