
Toronto human rights lawyer sounds the alarm on Canada’s plans to use AI in immigration


This is part of Stepping Up, a series introducing Canadians to their country’s new sources of inspiration and leadership.

Petra Molnar

Location: Toronto

Biggest influence: “The refugees I’ve met over the years – they have shown me what true resilience looks like.”

Best piece of advice: “Listen to your gut – she knows best.”

Imagine a not-so-distant future in which automated bots appraise refugees’ stories about their own lives, probing whether their marriages are real, their children are their own, or whether they pose a security threat. Then imagine these artificial intelligence arbiters meting out inscrutable rulings that push people out of Canada and back to precarious lives in countries where they may face war, oppressive regimes or persecution.

It’s a dystopian scenario newcomers could one day face here, according to Petra Molnar, a Toronto human rights and refugee lawyer who has been steadily shining a light on the more troubling realities of this country’s immigration system. In September, Ms. Molnar co-authored a pivotal report on the ethical perils of Canada’s plans to use artificial intelligence to help vet immigrant and refugee claims.

Petra Molnar in the gardens of Darwin College, Cambridge, U.K., which is part of the University of Cambridge.

Richard Marsham/The Globe and Mail

Ms. Molnar is sounding an urgent alarm. Though technology is often viewed as impartial, it’s anything but, the lawyer argues. Discrimination, bias and violations of due process and privacy are just the tip of the iceberg with unchecked AI assisting or replacing the judgment of human decision-makers in the immigration sphere.


“These systems will have life-and-death ramifications for ordinary people, many of whom are fleeing for their lives,” read the 88-page report, a joint project between the International Human Rights Program at the University of Toronto’s Faculty of Law and the Citizen Lab at the Munk School of Global Affairs and Public Policy.

As an immigrant who stared down her own difficult circumstances, Ms. Molnar finds herself feeling personally invested in helping people rebuild their lives in Canada. Her parents immigrated to Winnipeg from the Czech Republic in 2000. Family turmoil, including domestic violence, nearly derailed her education.

“My whole childhood was punctuated by really difficult family relationships,” said Ms. Molnar, whose father left her mother 10 years ago. After forfeiting a University of Toronto scholarship to help her single mother back in Winnipeg, Ms. Molnar eventually became the first lawyer in her family.

Read more in the Stepping Up series: Dalhousie Indigenous student showing Canada the way to reconciliation.

“A lot of these issues are personal,” said Ms. Molnar, who articled at Toronto’s Barbra Schlifer Commemorative Clinic, which aids women who have experienced violence. She worked with refugee women who were struggling with trauma and precarious housing and employment as they escaped spouses threatening them and their children with harm.

The work with refugee women and the groundbreaking AI immigration screening research both ignite her “fire” for protecting human rights, Ms. Molnar said.


This spring, in an attempt to deal with a backlog, the federal government piloted an artificial intelligence program to assist with immigration applications made on humanitarian and compassionate grounds – processes for people who often believe they will face harm back home. The use of AI with such immigrants is “a laboratory for high-risk experiments within an already highly discretionary system,” reads Ms. Molnar’s report, co-authored by Citizen Lab research fellow Lex Gill.

Canadians need to shed longstanding myths about artificial intelligence before turning to it for such dire work, Ms. Molnar argues. We often falsely assume that technology is mechanical and objective, even though its algorithms are designed by human beings who hold various biases. This can include prejudiced views about how people look, which religion they practice and where they travel.


There is also a mistaken belief that technology can read people better than people can, even though it is non-sentient and prone to system error. Given that AI technologies are in their infancy, Ms. Molnar warns that they may be too oversimplified to offer nuanced appraisals of people in complex, high-risk situations.

Ms. Molnar has met with government officials to call for transparency and accountability. The lawyer wants to see the creation of an independent task force to ensure the technologies fall within domestic and international human rights laws. She’s urging a freeze on the rollout of such systems until standards, safeguards and robust appeals processes are in place.

In the field of human rights law, advocacy usually arises after people are violated. The AI work is unique because it looks to prevent future harms. “This was uncharted territory. There was no meaningful focus on this before [Ms. Molnar’s] report,” said Samer Muscati, director of the International Human Rights Program.

Prior to the AI research, Ms. Molnar had been working on migrants’ rights for a decade, on the front lines near the Syrian-Turkish border and closer to home, helping resettle refugees in Toronto. Mr. Muscati recalls her sensitivity working with undocumented migrant workers from the Philippines. “She’s able to do very delicate interviews on tough issues,” Mr. Muscati said. “One of the challenging parts of this job is to be able to have these types of relationships with people you just met and to win over their trust. You can only do that if you’re a genuine person who has humility.”


Today, alongside the work on AI-assisted border controls, Ms. Molnar is also investigating the trauma of immigration detention centres, where thousands of migrants and asylum seekers are held each year in this country. For refugees escaping from conflict zones and dislocated from home, Ms. Molnar sees long-term mental health harms in detention. “It stays with people,” she said.

The lawyer’s fears about immigration detention and automated border technology echo graver concerns. Ms. Molnar is uneasy about what she views as a resurgence of xenophobia, especially around ideas of “old stock Canadians” versus “others” in this country.

“Canada needs to look at how we are thinking through these issues and why every couple of years, we are falling back on these tropes of being ‘overrun by migrants,’” Ms. Molnar said. “At the end of the day, if you’re not Indigenous, we’re all newcomers. We just arrived at different times.”

