Wendy H. Wong is a professor of political science and principal’s research chair at the University of British Columbia, Okanagan. Her latest book, We, the Data: Human Rights in the Digital Age, is a finalist for this year’s Lionel Gelber Prize, presented by the Munk School of Global Affairs & Public Policy.

Like any other invention, AI is an artifact of human ingenuity. At the moment, the models that are attracting the most attention rely on inconceivable amounts of data. Not just any data: data that mirror human behaviour, choices and creativity. It is critical to be cautious about the widespread deployment of AI that is dependent on human-generated data.

Advances in AI technologies feel harried. And in terms of human time, they are. Motor vehicles were in development for nearly 200 years before the first modern version was produced. While computerized AI has been in development since the 1950s, it was not until the beginning of this century that its capacities became what we know today. That’s partly because vast amounts of data about people became available to power deep-learning models. Before, we simply didn’t have the data necessary to train machines to act as though they’re human.

There are several ways that the hastiness of AI can leave us vulnerable to rash choices if we don’t account for the ways such technologies change us. To start, we haven’t fully acknowledged how the data that enable the seamlessness of our experiences with AI such as ChatGPT are not only gathered from humans, but are also sticky.

Data stick like gum does at the bottom of your shoe: easy to step on, exceedingly difficult to take off. Data are largely about mundane things – everyday choices and behaviours you can’t easily change, such as your gait or your morning routine. Data are sticky because we make data in order to generate important insights and make predictions by linking them to other data. Data stick because once created, they last effectively forever. It’s hard to know something has “really” been deleted. Perhaps most sticky of all, data are co-created. The easiest way to think about this is that you and I are sources of data through our behaviours. Data collectors strategize what behaviours they care about, and then systematically create data about them. Sources and collectors together co-create data.

All of this stickiness affects international human-rights values: autonomy, dignity, equality and community. Currently, data about people are traded as commodities. But data come from rights-bearing individuals. Why do we treat human data as though their sources do not have inherent dignity? We do this in spite of the fact that we know data about people can have negative effects on the very people they come from. It’s not just inconvenient. It can be a contravention of human-rights values. Data stickiness can be risky and even dangerous – false arrests, denial of essential services – especially for people who are from marginalized communities or who are trying to move past regrettable histories. Kids today feel trapped, unhappy and dependent on the algorithms that feed them content.

Finally, one must ask: are the benefits worth the costs? AI’s overall social good has not yet been articulated. In some applications, it seems to hold promise, from counselling to creating vaccines. AI has clearly changed software programming.

But what about other areas? AI leaders tell us they can use it to solve climate change. Yet we know the most advanced AI systems are energy hogs. Companies unleash AI that writes “original” work. The New York Times has a pending lawsuit against Microsoft and OpenAI accusing ChatGPT of copyright infringement. Furthermore, researchers have found that AI systems fed AI-generated writing over time produce garbage (“model collapse”) without infusions of new human ingenuity (i.e. data).

Digitized human interactions have fundamentally challenged our analog understandings of autonomy, dignity, equality and community. What’s different about AI is the sticky data. As a species, we value tools to help individuals and societies do more or better. At the moment, what we face is a world in which unfettered data creation for data-hungry AI disrupts collective experience, all without a clear articulation of what social good could emerge from those disturbances.
