
CEO of Facebook Mark Zuckerberg testifies remotely during a Senate hearing on Capitol Hill in Washington, D.C., on Oct. 28, 2020. POOL/Reuters

Kean Birch, associate professor, graduate program in science & technology studies, York University

For many people, the digital economy – especially the promise of automated decision-making systems powered by artificial intelligence (AI) – will solve all sorts of social and political problems, from rising health care costs to the spread of misinformation. And yet, regulation of digital technologies, personal data and AI is almost always framed by their commercial applications; it’s rarely considered an issue for collective societal decision-making.

Over the past week, two Canadian initiatives tackled the future of Canada’s digital economy. Canada’s Office of the Privacy Commissioner (OPC) released its proposed Regulatory Framework for AI: Recommendations for PIPEDA [Personal Information Protection and Electronic Documents Act] Reform. Then on Tuesday, Navdeep Bains, the Minister of Innovation, Science and Industry, released the federal government’s proposed Digital Charter Implementation Act. The latter legislation is meant to form the basis for digital regulation. This might seem like a technical matter, but the legislation will shape which technologies get developed in the coming years.


Ottawa’s legislation is a welcome sight, but regrettably, it’s premised on regulating the impacts of digital technologies rather than regulating their purposes. It is, in this sense, a form of post-hoc governance in which regulators will be responsible for putting the horse back in the stable after it has bolted. In particular, the proposals don’t deal with some of the key issues in Canada’s digital economy.

First, the proposed legislation recommends that the rules governing the use of our personal data can be simplified through the “de-identification” of personal information – basically, removing individually identifiable information from it. So, data controllers – such as Big Tech firms – can use our data without asking for our consent if the data have been de-identified. Unfortunately, there is growing evidence that re-identification is increasingly easy to do, so the emphasis on de-identification seems problematic. It’s good to see the legislation attempt to prohibit such re-identification, but I’m not sure how regulators can police this when it comes to AI technologies.
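To make the re-identification worry concrete, consider a minimal sketch of a classic linkage attack. Nothing here is drawn from the legislation or from any real dataset – the field names and records are hypothetical – but it illustrates the basic mechanism: even after names and e-mail addresses are stripped, a handful of remaining quasi-identifiers (postal code, birth year, sex) can be matched against a public dataset to re-attach a name.

```python
# Purely illustrative sketch of a linkage attack; all datasets,
# field names and records are hypothetical.

# Step 1: a data controller "de-identifies" records by dropping
# direct identifiers such as name and e-mail.
health_records = [
    {"name": "A. Smith", "email": "a.smith@example.ca", "postal": "M5V 2T6",
     "birth_year": 1984, "sex": "F", "diagnosis": "asthma"},
    {"name": "B. Jones", "email": "b.jones@example.ca", "postal": "K1A 0B1",
     "birth_year": 1990, "sex": "M", "diagnosis": "diabetes"},
]

DIRECT_IDENTIFIERS = ("name", "email")
QUASI_IDENTIFIERS = ("postal", "birth_year", "sex")

de_identified = [
    {k: v for k, v in rec.items() if k not in DIRECT_IDENTIFIERS}
    for rec in health_records
]

# Step 2: an attacker links the remaining quasi-identifiers against a
# public dataset (say, a voter roll) that still carries names.
voter_roll = [
    {"name": "A. Smith", "postal": "M5V 2T6", "birth_year": 1984, "sex": "F"},
    {"name": "C. Lee", "postal": "V6B 4Y8", "birth_year": 1975, "sex": "M"},
]

def reidentify(shared, public):
    """Re-attach names wherever a quasi-identifier combination matches uniquely."""
    linked = []
    for rec in shared:
        key = tuple(rec[q] for q in QUASI_IDENTIFIERS)
        hits = [p for p in public if tuple(p[q] for q in QUASI_IDENTIFIERS) == key]
        if len(hits) == 1:  # a unique match re-identifies the record
            linked.append({"name": hits[0]["name"], **rec})
    return linked

print(reidentify(de_identified, voter_roll))
# The "anonymous" asthma record is linked back to A. Smith.
```

The more auxiliary datasets an attacker can draw on – and AI systems are trained on many – the more often those quasi-identifier combinations become unique, which is why a legal prohibition on re-identification is so hard to police in practice.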

A more pertinent issue is that the legislation seems to provide significant leeway for firms to avoid requiring consent to use our data when providing products or services to customers. This means that firms can continue to collect and use our data for things such as targeted advertising, on which the digital economy depends. An opt-out right would be better, enabling anyone who doesn’t want their data to be collected and used to this end to object beforehand.

Second, the legislation takes a rights-based approach to the issue, focusing on privacy and data protection. This is all well and good, although this focus on privacy does look increasingly outdated. As Frank Pasquale, a professor at Brooklyn Law School, points out, holding automated decision-making systems accountable has moved beyond privacy and ethical concerns to more structural issues about identifying what sorts of digital and AI technologies we want to develop in the first place.

At issue here is that rights-based approaches depend on things such as access to courts or tribunals, which is highly unequal, and fail to address the very real digital elephant in the room: Big Tech. The legislation proposes to enable us to withdraw our consent to the use of our data, although this is subject “to the reasonable terms of a contract.” What these “reasonable terms” entail will be the sticking point.

I expect this proposal to have little meaningful impact on the current or future data-collection practices of Big Tech companies such as Facebook, Google, Apple, Amazon and Microsoft. Why? It leaves the onus on us to pursue this option ourselves – with the added expectation that we understand the complexities involved – but doesn’t enable us to simply avoid all those terms-and-conditions agreements we sign daily (when we download an app, for example). It would be better to remove data collection almost entirely from these terms-and-conditions agreements, limiting such collection to the narrowly defined operational functionality of a digital platform alone. So, despite the continuing public and policy condemnation of their business strategies and investigations of their alleged anti-competitive practices, it looks as if Big Tech is going to continue setting the agenda of our digital economy, whether we like it or not.

Now, there are alternatives to this contractual regulation of data collection and use: People such as Katharina Pistor, a professor at Columbia Law School, argue we could create collective forms of data governance, which would aggregate data in publicly governed databases or data trusts, but this legislation doesn’t do that.


Third, the legislation aims to ensure greater transparency in automated decision-making, including an individual’s right to an explanation and information on how the personal data feeding into AI systems are collected. I agree with all this but think it comes up against the problems facing the EU’s General Data Protection Regulation: These rights depend on whether they trump intellectual property rights, especially trade secrets. Companies developing AI technologies depend on trade secrets to protect their valuable assets, and transparency is of little use if there is no way to ensure companies stop using our data.

Finally, the legislation seeks to strengthen regulatory oversight. A major change is the introduction of new financial penalties – enforcement powers the privacy commissioner desperately needs – and this should be applauded. However, the preference for a post-hoc regulatory system essentially allows firms to engage in wholesale societal experiments until someone complains. A better system would require AI developers, for example, to submit their plans to a regulator before they initiate development, creating something like an FDA for AI.

That way, societal benefits and outcomes can be defined at the start, and their realization can be supported while problems are avoided from the outset.
