Ottawa says it will take steps to protect Canadians’ data from malicious and unnecessary collection and abuse after years of relying on outdated legislation that left the country lagging behind jurisdictions such as California and the European Union.
Innovation Minister François-Philippe Champagne tabled a bill on Thursday morning to enact the Consumer Privacy Protection Act, replacing similar legislation that was tabled in 2020 but never passed. It reintroduces what would be some of the biggest fines in the world for companies that abuse personal data: up to $25-million or 5 per cent of a company’s global revenue.
Bill C-27 also promises to increase the powers of Canada’s privacy commissioner, including the capacity to order a business to halt data collection. Among the new features are heightened protections for minors’ personal information, including an enhanced ability for them and their parents to request the deletion of their data.
“It sends a very clear message to the industry that we are serious in terms of the way that data should be protected, stored and transferred, and giving more rights to the people,” Mr. Champagne said in an interview.
The government also introduced new legislation called the Artificial Intelligence and Data Act, which would expand on the privacy protection act’s rules governing algorithmic decision-making. It would require that possible harm or bias from many AI systems used in Canada be identified and mitigated, and that users be clearly told how their data will be used.
Concerns about data misuse have been rampant since the Cambridge Analytica scandal in 2018, in which users’ Facebook data were used to refine the targeting of political ads without their consent. Jurisdictions worldwide have sought ways to prevent companies from (or punish them for) using people’s data for purposes other than those for which it was provided – such as selling it for profit.
The federal Liberals had long promised to update the Personal Information Protection and Electronic Documents Act, introducing the first Consumer Privacy Protection Act in late 2020. It was left to die after the 2021 election was called, even as technology companies’ collection and use of data became more prevalent during the COVID-19 pandemic.
Although the 2020 bill featured significant fines and restrictions for the abuse of data collection, many, including privacy commissioner Daniel Therrien and his successor, Philippe Dufresne, said it lacked teeth. In the year and a half since the bill was announced, other countries have levied significant fines against tech companies over alleged data misuse – including Google in Spain and Meta’s WhatsApp in Ireland.
The proposed legislation reintroduces consumer rights from the 2020 bill that align with Europe’s General Data Protection Regulation (GDPR), including the right to know how data will be used; the ability to transfer data between organizations; the right to ask for data to be deleted; and the right to transparency in how algorithms or other automated decision systems use data.
The bill delineates more clearly than its 2020 predecessor how businesses can use personal data that have been stripped of identifying details. Blake, Cassels & Graydon LLP privacy counsel Ronak Shah said this will make doing business in Canada easier, including by being more compatible with Quebec’s recent privacy bill. “There’s some consistency now coming through all the different privacy reforms,” Mr. Shah said.
Combined with requirements for corporate cybersecurity announced earlier this week, Ottawa’s latest data-related bills “put an emphasis on what organizations need to do in terms of data governance,” he continued. Many aspects of the 2020 version of the privacy law were designed to ensure compatibility with GDPR so as to not damage Canada’s trade relationship with the European bloc; Mr. Shah said the new bill appears to meet the same conditions.
“I have a high degree of confidence that what we have presented will be well received in light of GDPR,” Mr. Champagne told The Globe and Mail.
Bill C-27 reintroduces a proposal to launch an administrative tribunal that would enforce privacy laws, including by levying fines the federal privacy commissioner recommends, and hearing appeals of those decisions. (The last time a tribunal was proposed, Mr. Therrien said it would slow decision-making.)
The privacy commissioner would also have the power to work with the Competition Bureau and the Canadian Radio-television and Telecommunications Commission to share information and combine research. This would address data experts’ contention that governments must consider competition, privacy and other implications of data use together.
It would also create a federal commissioner position to help Ottawa monitor compliance with the new Artificial Intelligence and Data Act, with the power to order audits of AI and other algorithmic systems.
Philip Dawson, the AI policy director at the Schwartz Reisman Institute for Technology and Society, who worked with Ottawa on the new legislation, said the new AI rules are meant to be flexible, not prescriptive, so they can react to rapid developments in the AI world. Taking this approach could help circumvent the kind of situation unfolding in the European Union, whose year-old proposed Artificial Intelligence Act has already received more than 3,000 proposed amendments.
Canada’s proposed new AI law would be the second of its kind after the EU’s, Mr. Dawson said. Although many fine details will come through consultations, he said taking a flexible approach is “a means to be more iterative and agile.”
Mr. Dawson also said he hopes the AI and data act could lead Canadian companies to develop services to help businesses that use algorithms stay compliant. “It’s not enough to put the new rules in ink – we have to start thinking pro-actively about how to foster the creation of a new market for compliance products and services,” Mr. Dawson said.
Carole Piovesan, managing partner of INQ Law, which focuses on data law, said Bill C-27 “really signals the degree to which the government is taking violations of AI systems seriously.” But some details about the legislation remain unclear, including a reference to “high-impact” AI systems. “There are still really important definitions from which a whole bunch of obligations flow,” she said.
Asked for an explanation of the term, Mr. Champagne gave an example of AI systems that would determine loan approvals or insurance quotes: “We’ll be able to intervene if we see that some of these algorithms would be detrimental in using people’s information and having some biases.”