Google senior vice-president and general counsel Kent Walker appears before the House Intelligence Committee on Capitol Hill in Washington on Nov. 1, 2017. Aaron Bernstein/Reuters

A senior Google executive is urging Canadian lawmakers not to go too far as they prepare to debate a wave of government bills aimed at imposing new rules on web giants.

In a recent interview with The Globe and Mail’s editorial board, Google’s president of global affairs and chief legal officer, Kent Walker, said the company is open to new requirements. But he cautioned that some proposals circulating in Canada and other countries could dramatically worsen people’s online experiences.

Mr. Walker noted that Canada is among many countries debating new ways to regulate the internet. He said some of the proposed ideas could be workable and acceptable to Google, while others concern the company.

“And the closer you get to that extreme, the more concern,” he said. “Whether that’s on bespoke content regulation, or local content requirements, or government mandates for link taxes and other sorts of things – any flavour of one of those could actually really be bad.”

Large, U.S.-based tech companies have shown a willingness to push back against government measures they don’t like. Google threatened to pull its search services from Australia last year and Facebook temporarily banned news from its platform as the two companies opposed the country’s legislation requiring platforms to support the news sector financially. The companies and Australia ultimately reached a compromise.

Google also vowed last year to take legal action against the German government over that country’s latest hate-speech legislation.

Canada’s federal Liberal government has indicated that three pieces of internet-related legislation will be priorities when the House of Commons resumes sitting on Jan. 31. The bills include a Broadcasting Act update, which was introduced as C-10 in the previous Parliament but not passed. The bill aims to bring platforms like Netflix, Disney+ and Google under some of the rules that already apply to traditional broadcasters. That would make the platforms subject to requirements to fund and promote Canadian content.

A second bill will be aimed at addressing harmful content online, such as child exploitation, terrorist content and hate speech. The government introduced C-36 related to hate speech in June and held consultations over the summer on broader measures to address harmful content online. It’s unclear whether this policy area will be covered in one bill or two in the new Parliament.

A third bill, which the government has promised to introduce but so far has not, would require platforms that generate revenue from news content to share a portion of that revenue with Canadian news outlets. The bill is inspired in part by Australia’s law.

Ottawa also plans to pursue a fourth bill with implications for internet policy: the Consumer Privacy Protection Act, previously known as C-11, which the government has said will advance the federal “Digital Charter” and set clear rules to ensure fair competition in the online marketplace.

Discussions around C-10, the broadcasting bill, generated considerable controversy last year in the previous Parliament. Conservative MPs and some policy experts said the proposals would impose excessive government interference online. The Bloc Québécois and the NDP were generally supportive of the bill, siding with arguments from members of Canada’s arts sector, who said rules are urgently needed to ensure streaming platforms support Canadian programs.

Debates in the previous Parliament involved a high degree of speculation over how the government’s proposals would have applied in practice. That was due in part to the fact that both C-10 and C-36 proposed granting extensive powers to independent regulators, who would have determined at a later date exactly what rules would be required to achieve their legislated mandates.

The global and Canadian debates over new internet regulations have included discussions over whether governments should require platforms to be more transparent regarding the algorithms used to determine which results appear in web searches, or which videos or songs are recommended on platforms like YouTube and Spotify.

An independent broadcasting and telecommunications review commissioned by Ottawa highlighted the role of algorithms in its 2020 report, “Communications Future: Time to Act.” The report made suggestions for new internet regulations, including that online service providers should be more forthcoming about the factors that influence content recommendations.

“Algorithms are one of the key tools that enable misinformation and manipulation, causing direct and indirect social harms when used for delivering news and information to the public,” the report says.

Two American Democratic senators introduced a bill last year that would force online platforms to disclose how algorithms use personal information and how they act to promote certain types of content.

“It is time to open up Big Tech’s hood, enact strict prohibitions on harmful algorithms, and prioritize justice for communities who have long been discriminated against as we work toward platform accountability,” Senator Edward Markey said in a statement when the bill was introduced in May.

Asked during his Jan. 14 interview with The Globe about the calls for more transparency related to algorithms, Google’s Mr. Walker said making such changes risks doing more harm than good. He said Google is constantly updating its algorithms to prevent “black hat” individuals from gaming the system.

“It’s very challenging, to be direct about it,” he said.

People think of algorithmic transparency proposals as being useful, he added. “But much of our value is to keep people from giving you irrelevant or even harmful information. So the more we have to disclose about exactly what we’re looking at, what signals we are trying to use to detect these kinds of bad actors who’ve gotten very sophisticated, the easier it is for them to circumvent those systems … So it would actually degrade the quality of the results if we made the algorithm completely transparent.”

Editor’s note: An earlier version of this story said incorrectly that Bill C-36 in the previous parliament dealt with child exploitation, terrorist content and hate speech. In fact, the content of that bill focused on hate speech. The government held consultations last year on proposals for broader legislative and regulatory measures related to online harms, including child exploitation and terrorist content.