How long would it take for government to regulate an innovative product that demonstrably made people sadder and sicker? And what if the maker was revealed to misuse client information and then fib about it?

Would it take weeks for the state to act? Months?

For the world’s pre-eminent social network, the answer is five years and counting.

Leaving aside the recent revelations involving Cambridge Analytica, it’s been clear since at least 2013 that there is a credible link between Facebook usage and clinical depression, among other adverse mental-health effects. At least 65 academic papers exist on the subject.

Should we compel Facebook, and internet giants like it, to submit to the same consumer, product safety and liability laws as car companies, Big Pharma or food manufacturers?

It’s not a new idea: Then-New York University law professor James Grimmelmann proposed it in a seminal 2010 paper as a framework to regulate data privacy.

The better question is: Why has no one experimented with his ideas in a serious way?

Some of it is surely due to the way governments, such as Canada’s, approach the tech sector – usually as a matter of telecommunications or commercial policy. Another part of it relates to the tech sector’s broad refusal to countenance oversight, laws and national borders.

But that shouldn’t stop us from thinking about this issue. Around the time Mr. Grimmelmann published his paper, tech researcher Danah Boyd suggested in a New York Times essay that data and privacy should be thought of as a “social utility,” and that companies like Facebook could be made to live up to their obligations the way a gas company must.

It’s a thought-provoking premise, but one that applies better to some aspects of the internet than others.

Internet service providers, for example, can be likened to electrical grids or municipal water supplies; we regulate those closely in terms of pricing, the way they are planned and built, and who has access to them.

That’s appropriate for the companies that control the pipes through which the internet flows, but what of the companies, like Facebook, that harness those utilities for consumers? Some experts, including Harvard professor Susan Crawford, believe the “utility” label on software firms mostly serves to muddy the definitional waters to the advantage of service providers.

Furthermore, it’s a fool’s errand to try to police internet content – even surveillance states like China can’t do it unless tech companies play ball (many do).

But it is still possible to restrict the ways tech companies compile and sell our personal data.

Some of that could be done via updated privacy laws that, for example, require opt-in consent for information sharing.

The Centre for International Governance Innovation added another ingredient in a recent paper: The servers designed to “collect environmental or human data must be either owned by, or wholly accessible to, government.” In turn, open architecture would force transparency not just on industry but also on government.

Another intriguing option is to view data the way we do health records or legal advice: digital companies as information fiduciaries, with legally enforceable restraints.

We could also start treating software engineers like actual, proper engineers (again, this is not a new idea). There’s a certification process for engineers who design buildings and bridges, as well as professional standards, ethics codes and disciplinary mechanisms. Why not for software designers?

How would internet sites and apps look today if their designers had worked under codified obligations that made them responsible for their work?

Hindsight bias notwithstanding, it’s hard to imagine the unfettered development of apps that prey on psychological vulnerabilities and steal personal data from unwitting users.

And would software coders, in the context of a professional order, have gotten away with creating systems that could be manipulated by foreign powers to influence elections?

No one is advocating for a state-controlled internet, but the era of the unregulated online world is coming to an end. Those who create the new rules need to find solutions that are well-adapted to the realities of the internet.

Which means no more old and limited thinking, something there has been too much of in Canada to date.
