
As countries grapple with the data-driven economy, the need for stronger legislation to balance privacy, personal security and innovation has become apparent. Canada’s response to this is Bill C-11, the Digital Charter Implementation Act of 2020, tabled in Parliament last November. The bill brings Canadian privacy law into the digital era and creates space for envisaging privacy protective governance structures for innovation, such as data trusts.

The bill takes many steps in the right direction: it is pragmatic, principled and remarkably thoughtful in addressing competing considerations. Canadian companies participating in the innovation economy require large amounts of data to produce globally competitive products. At the same time, as recent data breaches and discussion around the “surveillance economy” have shown, Canadians must feel, and actually be, secure as they go about their data-dominated daily lives. A misstep in C-11, however, may actually jeopardize innovation in Canada.

Under current Canadian privacy law, where information no longer relates to an identifiable individual, it is liberated for use and disclosure and, therefore, for innovation. C-11, instead, restricts the disclosure of such “de-identified” information without knowledge or consent exclusively to a few specified public institutions or an organization mandated by such an institution, for a “socially beneficial purpose.” This essentially cuts out the Canadian private sector from the use of de-identified data for socially beneficial purposes, even with governance structures, such as data trusts, established to address residual privacy risk.

C-11 starts from the premise that, technologically, we can no longer accept the notion that de-identified data is “anonymous” or “anonymized” because no method completely protects it from reidentification. Technologists agree. An Imperial College London study found that “even heavily sampled anonymized datasets are unlikely to satisfy the modern standards for anonymization set forth by the EU’s General Data Protection Regulation and seriously challenge the technical and legal adequacy of the de-identification release-and-forget model.”

But responsible innovation does not reside in simply excluding de-identified data from private sector use. It resides in applying commensurate safeguards, such as data trusts, to govern the responsible sharing of de-identified data. Data trusts are legal entities or contracts that designate a trustee or group of trustees to manage personal information to be used for a specific purpose such as health research or management of the environment.

Consider the following hypothetical scenario currently allowed under Canadian privacy law, but prohibited under the tabled Bill C-11.

A company offers a free app through which users suffering from arthritis can record their symptoms and how various medications alleviate them. The app does not require real names but collects IP addresses, a username, gender, year of birth and information relevant to arthritis pain management, with express consent from the user.

Having opened an account, the user records the type of arthritis diagnosis, symptoms, activity levels, medications taken and specific data on symptom alleviation. Essentially, the user is offered a platform to manage arthritis pain.

For a fee, the company gives researchers, pharmaceutical companies, physiotherapy clinics and any other private organization seeking to improve the treatment of arthritis symptoms access to the curated, de-identified data, through a strictly monitored protocol that acts as a data trust.
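To make the scenario concrete, here is a minimal sketch of the kind of de-identification step such a company might apply before release through the data trust. The field names, the salted-hash linkage key and the decade-banding of birth year are all illustrative assumptions, not anything specified in C-11 or by the authors; real de-identification programs involve far more rigorous risk assessment.

```python
import hashlib

# Hypothetical raw record from the arthritis app (field names are illustrative).
raw_record = {
    "ip_address": "203.0.113.7",
    "username": "painfree_2021",
    "gender": "F",
    "year_of_birth": 1962,
    "diagnosis": "osteoarthritis",
    "medication": "naproxen",
    "symptom_relief_score": 7,
}

def deidentify(record: dict, salt: str) -> dict:
    """Strip direct identifiers and generalize quasi-identifiers
    before a record is released through the data trust."""
    out = dict(record)
    # Drop the network identifier entirely.
    out.pop("ip_address")
    # Replace the username with a salted one-way hash so records from the
    # same user can still be linked without revealing the username itself.
    username = out.pop("username")
    out["user_key"] = hashlib.sha256((salt + username).encode()).hexdigest()[:16]
    # Generalize year of birth into a decade band to reduce reidentification risk.
    decade = (out.pop("year_of_birth") // 10) * 10
    out["birth_decade"] = f"{decade}s"
    return out

released = deidentify(raw_record, salt="trust-secret-salt")
```

As the Imperial College London finding cited above suggests, steps like these reduce but do not eliminate reidentification risk, which is exactly why the authors argue for pairing them with governance structures such as data trusts rather than treating the data as anonymous.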

C-11 would prohibit this sharing of de-identified information for innovation, even through a data trust. One option would be to seek consent for that data use, but we know from experience that every friction point reduces uptake, so the number of volunteers could be too low for valid data analysis. It is also questionable whether that consent could be valid, since the purposes could evolve as the research progresses. Another option would be to provide access to the data through a post-secondary educational institution, as provided in Section 39 of C-11, but this would subject innovation to the vagaries of academic research and funding, risk misalignment with private sector aims and create a burdensome data-sharing process.

Section 21 of the bill provides some latitude for innovation by allowing an organization to use de-identified information without knowledge or consent for the organization’s internal research and development purposes. This does not allow, however, the sharing of de-identified information with other organizations that could accelerate innovation.

Achieving responsible innovation in Canada requires liberating de-identified data through governance structures that both protect privacy – a right enshrined in the Canadian Charter of Rights and Freedoms – and pursue socially beneficial purposes. Public and private organizations, working within the same policy regime, can help achieve this aim.

Chantal Bernier is the national practice leader of privacy and cybersecurity with Dentons Canada and advises leading-edge national and international companies as they expand into Canadian and European markets.

Rohinton P. Medhora is president of the Centre for International Governance Innovation. He sits on The Lancet and the Financial Times Commission on Governing Health Futures 2030, as well as the Commission on Global Economic Transformation.

This piece is based on a longer analysis forthcoming from the Centre for International Governance Innovation.