Jim Balsillie is an entrepreneur, philanthropist and founder of the Centre for Digital Rights
Privacy is more than just the “right to be left alone.” It allows us to be humans, citizens and consumers in ways that are authentic and fulfilling. Privacy is a fundamental human right that serves as a gateway to other rights and freedoms such as freedom of expression, individual and collective autonomy, and freedom from harassment or invasion. Privacy is critical for the healthy development of the human brain, identity, close relationships and social existence. It plays a central role in protecting against tyranny by allowing creativity and individuality to flourish. As privacy scholar Beate Roessler put it, “True realization of freedom, that is a life led autonomously, is only possible in conditions where privacy is protected.”
But more than just a personal predilection, privacy is an important functional requirement for the effective operation of society, from democratic governance to competitive markets. This is especially true in today’s surveillance-based economy. Data about an individual have enormous spillover effects, enabling the tracking and profiling of adjacent individuals and whole communities, creating and exacerbating deep asymmetries of information, which lead to an imbalance of power. Data are uniquely versatile – they can be reprocessed and analyzed in new ways in the future that are unanticipated at the time of collection. This has profound implications not just for the privacy of individuals but for the operation of the economy, national security and democracy.
Behavioural monitoring, analysis and targeting are no longer restricted to unscrupulous social-media companies but have spread across all sectors of the economy, including retail, finance, telecommunications, health care, entertainment, education, transportation and others. In the early 21st century, every business became a technology business – turning every internet-enabled device, product or service into a supply chain interface for the unobstructed flow of data used to power the contemporary surveillance economy. Many valuable products and services became free of charge only to be later found ruinously expensive in other ways. As Harvard professor Shoshana Zuboff has shown in her research, surveillance-based business models have not just resulted in a wholesale destruction of privacy but brought with them intensification of social inequality, mental-health crises among children and youth, poisoning of social discourse with misinformation, demolition of social norms, and weakening of competitive markets and democratic institutions.
Yet Canada’s federal government has repeatedly failed to take privacy seriously and construct a legal and regulatory framework that protects the rights of Canadians in the digital age. The federal government’s most recent attempt at private-sector privacy regulation, the Digital Charter Implementation Act 2022, normalizes and expands surveillance and treats privacy as an obstacle to corporate profits, not as a fundamental human right or even a right to effective consumer protection.
After years of cozying up to Big Tech and meeting with its lobbyists as often as twice a week, the Canadian government is finally coming to terms with the fact that the digital economy needs to be regulated. In November, 2020, the federal government introduced an update to Canada’s private-sector privacy law aiming to replace the 20-year-old Personal Information Protection and Electronic Documents Act with the Digital Charter Implementation Act 2020. The proposed law was so untenable it was quickly dismissed by the federal Privacy Commissioner, data-governance experts, civil-liberties associations, digital-rights activists and consumer-protection organizations. Among its many flaws, the proposed law stripped away previously established privacy protections for children, youth and other vulnerable persons. Facebook, which planned to introduce Instagram for Kids, was undoubtedly thrilled, but such flagrant regulatory capture was impossible to defend to parents and concerned citizens, so this ill-conceived proposed law died on the order paper in September, 2021.
In June, 2022, the federal government introduced an updated proposed law promising “world-class” privacy protections but delivering high-minded moral language that conceals a range of unjustifiable business surveillance practices in generous carve-outs and exceptions.
The Digital Charter Implementation Act 2022, like its predecessor, doubles down on a foundationally flawed approach to privacy and data governance. It uses a “notice and consent” model as its legal framework, which creates a pseudo-compliance system – one that enables large-scale personal data harvesting and intrusive digital profiling by continuously spamming everyone with annoying and misleading “consent” banners.
Facebook notoriously builds “shadow profiles” of individuals, even those with no Facebook accounts, from information gathered from their contacts, without asking their consent. These practices have led privacy and data-governance experts to argue that a consent model represents an insufficient legal framework for redressing the lopsided relationship between technology users and data collectors and processors. Some privacy experts go so far as to label current consent-based contracts between individuals and data intermediaries invalid, calling them “immoral and unconscionable.”
But the federal government prefers the consent model because such a framework allows for the creation of exceptions so businesses can continue unrestrained monetization of consumer surveillance and behaviour modification. For example, the proposed new law creates a broad carve-out for surveillance without knowledge or consent based on “legitimate interests.” Facebook can still track you across the web even if you don’t have a Facebook account, Tim Hortons can still know where you work, sleep and travel, and data brokers can still access and sell all your children’s personal information while they use educational tools. To make matters worse, it’s the businesses themselves that determine what constitutes “legitimate interest” for surveillance and they are under no obligation to tell the individual they are tracking and profiling them. Canada’s leading privacy and data-governance expert, Teresa Scassa, says this carve-out “trivializes the human and social value of privacy.”
The desire to leave most surveillance unregulated is why the federal government refuses to designate privacy as a fundamental human right. This is particularly ironic given that Canada is a signatory to numerous conventions that have declared privacy a fundamental human right, including the UN Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights, making Canada a country that respects a higher standard of privacy abroad than at home. What is needed is the inclusion of privacy as a human right in the body of the legislation itself, giving that right legal effect.
The EU’s landmark General Data Protection Regulation (GDPR), a law that sets the baseline for modern privacy protections around the world, also designates privacy a fundamental human right, as does Quebec’s privacy law. The Privacy Commissioner of Canada has not only asked for privacy to be declared a fundamental human right but obtained a legal opinion in March, 2022, confirming that such an approach is constitutional. Yet the federal government has yet to provide a legal explanation, justification, or even a constitutional law opinion refuting the one recently obtained by the Privacy Commissioner.
Another area that the federal government is unwilling to take seriously is protecting children and youth from surveillance. In its proposed November, 2020, privacy law, pre-existing privacy protections for children, youth and other vulnerable persons were inexplicably stripped away. This was nothing short of outrageous, especially given highly publicized fines for companies such as Google that unlawfully track minors, and the abundance of harmful practices that have emerged from social-media companies manipulating their underage users.
The 2022 proposed law gestures at children’s needs for privacy by calling their data “sensitive” but contains no measures that curtail the prevailing online surveillance and behavioural manipulation practices or even reduce the incentive for companies to track, profile and target children, youth and other vulnerable persons. There is nothing in the Digital Charter Implementation Act 2022 that forces companies such as Google, Facebook or TikTok to treat the privacy of children and youth differently than they do now. This is no way to protect minors or even to prepare them to handle what privacy experts call “the most highly surveilled environment in the history of humanity.”
The federal government’s approach to protecting the privacy of minors runs counter to what progressive jurisdictions around the world are doing, such as developing and passing laws that pay special attention to the commodification and extraction of children’s data. Last month, California unanimously passed the bipartisan California Age-Appropriate Design Code Act to ensure that Big Tech platforms prioritize the safety, well-being and privacy of children and put an end to the pervasive tracking, targeting and manipulation children face online. In Europe, the U.K.’s Children’s Code, developed by privacy experts led by Baroness Beeban Kidron, has served as a model for jurisdictions around the world that want to protect their kids from online harms.
When the Children’s Code passed into law in 2020, Britain’s privacy commissioner at the time said future generations will be “astonished to think that we ever didn’t protect kids online.” Canadians reading the proposed 2022 bill will be equally astonished at how our lawmakers used creative legal language to give our kids absurdly minimal online protections.
This year also marked the unexpected arrival of the Artificial Intelligence and Data Act (AIDA) in which the federal government gallops through content-free pages only to arrive at a deeply flawed outcome: There will be no independent and expert regulator in Canada for automated decision systems. (An automated decision system is a computerized process using data, machines and algorithms to make decisions in a defined context to assist or replace the judgment of humans.) The proposed law fails to provide even the shell of a framework for responsible artificial-intelligence regulation and oversight. All the regulations will be determined at some future date and decisions will come from the Minister of Innovation, Science and Economic Development (ISED) or the minister’s designate. Digital-governance experts have pointed out that the same person drafting laws and regulations as well as providing oversight and enforcement runs completely counter to the OECD’s guide on AI regulations.
But bad governance is the point of this law. It includes the creation of the “Personal Information and Data Tribunal” and “AI and Data Commissioner,” which move enforcement decision-making to the minister or the minister’s designated ISED official. Both of these appointments are political and are designed to overturn the Privacy Commissioner’s decisions and undermine that office’s independence and expertise. There is also no mention of regulating facial-recognition technology – one of the most pernicious surveillance techniques – despite the fact EU regulators are moving toward banning it.
This effort to advance corporate interests over citizen rights is also visible in the federal government’s refusal to give Canadians the ability to contest automated decisions that are used in numerous life-changing situations such as school admissions, credit scoring and insurance. Such rights are provided in European privacy law and Quebec’s new privacy law, and are proposed in the upcoming Ontario privacy law update.
Out of the gate, AIDA came under heavy criticism from voices ranging from the chair of the government’s own AI Council to advocates for marginalized communities for failing to curb the range of harms caused by automated decision systems. It is focused narrowly on individual harms. There is no evidence in the proposed law that the federal government understands that much of the current controversy over social media’s harmful application of automated decision systems revolves around issues of collective harms such as social polarization and the erosion of democratic norms.
For all the creative effort to advance corporate interests over citizen rights, the government cannot even claim that it’s advancing the Canadian economy and innovation. The Digital Charter Implementation Act 2022 fails to consider the unique characteristics of data as the most valuable economic and security asset in today’s economy. Data’s network effects, what economists call an “economy of scope, scale and information asymmetry,” mean that the more data a company gathers, the more value it gains from it. Every new data set makes all pre-existing data sets in the hands of the same few companies more valuable, disproportionately enhancing the power of established data giants and their vested assets – all of whom are foreign.
This unique structure of the data-driven economy has in less than a decade fostered the greatest market and wealth concentrations in economic history, reduced the rate of entrepreneurship, innovation and business dynamism, and lowered wages – all outcomes under investigation by U.S. federal, U.S. state and European antitrust authorities. Properly regulating insidious data collection and trafficking would not only address the concentrations of Big Tech’s economic power but also force all businesses to compete on the level of quality and innovation, not surveillance and manipulation.
The surveillance economy and the unprecedented centralization of information in the hands of a few companies is not an inevitable result of digital technology. Rather, it’s the outcome of a legal, economic and policy architecture that industry lobbyists and lawyers have designed, and which politicians have allowed, permitting entire industries to work against the public interest. The Digital Charter Implementation Act 2022 is not a serious proposal to modernize privacy laws in Canada, it’s regulation theatre.
It’s the cynicism of the proposed law that stings most. The document opens with a lofty preamble that declares commitments to privacy as an inalienable right and to protecting the online privacy of children and youth, only to descend into pages of deceptive and overly broad carve-outs and exceptions that undermine these purported values. It’s genuflecting to vulnerable Canadians while designing what Erica Ifill, an advocate for marginalized communities, calls “a regulatory regime that threatens to allow the systemic discrimination of Black, Indigenous, people of colour, non-binary and transgender people in Canada.” It’s the cynicism of declaring that the proposed law delivers the “highest fines” for privacy violations in the G7, undercut by the narrowest possible definitions of harm to ensure achieving such fines is impossible. It is creating novel definitions for “de-identified” and “anonymized” personal data to allow businesses an ever-widening scope of surveillance under the guise of privacy protection.
Lobbyists for Big Tech will find a lot to inspire them in the Digital Charter Implementation Act 2022, but responsible Canadians will not fall for this mischief just as they didn’t fall for the federal government’s proposed 2020 private-sector privacy law. Canada needs privacy regulation that builds trust in the digital economy where Canadians can take advantage of new technologies without being needlessly tracked, profiled and discriminated against. Meaningful privacy and data regulation must encourage trustworthy information relationships between organizations and individuals and reflect the changed power dynamics that arise in the data-driven economy.
The Digital Charter Implementation Act 2022 sets a dangerous precedent by prioritizing corporate interests, allowing corporations to offload onto individuals, including children, youth and other vulnerable persons, the harmful economic, political and social consequences of the surveillance economy. The federal government must do the hard work of making the digital economy and its business practices compatible with our economy, democracy and the human rights of free people. The latest privacy regulation proposal is evidence they don’t know how to do this, or, worse yet, that they have no intention to do so.
Digital privacy: More from The Globe and Mail
The pandemic was a boom time for online learning platforms, but many of them collected data on children’s daily activities in ways that violated their privacy rights, Human Rights Watch warned in a report earlier this year. Lead researcher Hye Jung Han spoke with The Decibel about the findings.