For the past six years, Toronto-based NuraLogix has been working on technology that translates blood flow patterns on a person’s face into insights about their health and psychology.

Based on a 30-second video, the company’s app uses artificial intelligence to track changes in the translucency of facial skin, enabling health professionals to remotely monitor high blood pressure and cardiovascular disease.

These measurements could potentially be used to determine an individual’s emotional state – and even to detect lies. That points to another, more sinister application for the technology: as a public security tool.

That has garnered interest from Chinese companies selling mass surveillance systems, leading human-rights researchers to question the implications of NuraLogix’s work. The Canadian firm at one point signed an agreement to share information and resources with Joyware Electronics, a Hangzhou-based firm with government ties. No commercial products for public security resulted from the deal, according to NuraLogix’s CEO.

Maya Wang, senior China researcher for Human Rights Watch, said that Chinese public security companies are keenly interested in augmenting mass surveillance systems with emotional recognition technology. “China’s mass surveillance system emphasizes facial recognition, but the intention is to develop a portrait of citizens, comprising DNA, voice biometrics, iris scans, and gait information,” she said. “Some companies are arguing for emotional recognition as a next step along from face recognition.”

While NuraLogix insists its focus is now digital health, the company has its genesis in pioneering research on deception in children by its co-founder, professor Kang Lee of the University of Toronto’s Ontario Institute for Studies in Education.

Dr. Lee has decades of experience in the field. His work at U of T involved mapping how children’s faces responded to images designed to elicit disgust, joy and sadness, as well as a so-called “neutral” emotion.

“We trained [a] machine using facial blood flow data to tell what emotional state the person is in,” Dr. Lee said in an interview. “The idea was to take this tech for use in lie detection.”

U of T’s Innovations and Partnerships office and Toronto Innovation Acceleration Partners, a non-profit academic incubator, assisted Dr. Lee with launching NuraLogix in 2015. The Natural Sciences and Engineering Research Council (NSERC), a federal agency, provided grant funding worth $115,000 to enable NuraLogix to purchase new equipment for Dr. Lee’s U of T lab, where the company retains a base.

NuraLogix has since garnered international plaudits for pioneering Transdermal Optical Imaging (TOI). Users of NuraLogix’s Anura app take a 30-second video of their face, which is then analyzed in near real-time by an AI engine, DeepAffex, trained to recognize how the translucency of facial skin changes in relation to heart rate.
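The article does not detail NuraLogix’s proprietary method, but the general idea behind this family of techniques, known as remote photoplethysmography, can be sketched in a few lines: subtle periodic colour changes in facial skin track the cardiac cycle, so the dominant frequency of that signal yields the pulse rate. The function name, frame rate and synthetic signal below are illustrative assumptions, not NuraLogix code:

```python
import numpy as np

def estimate_heart_rate(green_means, fps=30.0):
    """Estimate pulse rate in BPM from a per-frame mean skin-colour signal.

    Illustrative sketch only: the dominant frequency of the signal within
    the plausible human heart-rate band is taken as the pulse rate.
    """
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()             # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))      # magnitude spectrum
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)      # 42-240 BPM
    return freqs[band][np.argmax(spectrum[band])] * 60.0

# A synthetic 30-second "clip" at 30 fps with a 1.2 Hz (72 BPM) pulse
np.random.seed(0)
fps, seconds = 30.0, 30
t = np.arange(int(fps * seconds)) / fps
clip = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.05 * np.random.randn(t.size)
print(round(estimate_heart_rate(clip, fps)))    # prints 72
```

A production system would add face detection, skin-region tracking and bandpass filtering, but the core measurement is a frequency estimate of this kind.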

In 2016, Dr. Lee pitched the technology to the U.S. Department of Homeland Security and Transportation Security Administration in Boston. At that point, just one year after it was founded, NuraLogix presented TOI as “a new frontier of threat and deception detection [that] uses conventional video cameras to remotely, non-invasively, and covertly reveal hidden emotions associated with threat and deception.”

Dr. Lee said that although they had set out to give a talk about the future of airport security, the U.S. government officials suggested health care would be a more beneficial application for a technology capable of remotely detecting shifts in blood pressure and stress levels. “So, on the way back to Toronto, we decided that we would not do anything with lie detection for the time being, but we would help people with their health,” Dr. Lee said.

NuraLogix has subsequently signed several high-profile partnerships for its health and wellness applications, including with Sanitas, a leading Spanish medical insurance company.

But in 2018, despite the advice of the U.S. security officials, the company agreed to co-operate with Hangzhou-based Joyware Electronics to develop emotional recognition technology for use in public security settings such as railway stations and airports.

Joyware sells facial recognition-enabled public security services, including the “Sharp Eyes” system that has been used to persecute Uyghur Muslims in Xinjiang, where Canada’s Parliament has declared a genocide is ongoing. Sharp Eyes combines advanced surveillance technologies with crowdsourced monitoring that hearkens back to the Mao era, and could potentially be used to infringe on individual privacy and to persecute dissent in China, according to Dahlia Peterson, a research analyst at the Center for Security and Emerging Technology at Georgetown University.

NuraLogix (Hangzhou) Artificial Intelligence Co., NuraLogix’s wholly owned affiliate in China, agreed to exchange information and resources to assist Joyware with detecting psychological stress, blood pressure and breathing rate, according to a disclosure to the Shenzhen Stock Exchange.

The companies aimed to develop two major product lines: AI-enabled public safety analysis systems and polygraph products that can detect emotions in interrogation settings, the disclosure said.

While Joyware was ostensibly a private company when the agreement with NuraLogix was signed, experts suggest that any public security company working in China will, by definition, maintain an integral relationship with the state.

“The Party and the Chinese entity are partners, working towards a common goal – to acquire cutting edge foreign technology for the benefit of China as a whole,” said Steve Dickenson, a lawyer at Harris Bricken who co-authors the popular China Law blog.

In August, 2020, Joyware’s controlling shareholder transferred an 8-per-cent stake to an investment fund backed by a local city government in Henan province, which assumed control of the company after the transaction. That an 8-per-cent transfer was enough to tip control indicates the government-backed fund was already a significant minority shareholder, suggesting that NuraLogix had been licensing its technology to a government-affiliated company throughout their relationship.

Indeed, Joyware chairman Xu Gangshi remained in place after the share transfer. Mr. Xu is on the board of the China Security and Protection Industry Association and is a vice-president of the Zhejiang Association of Security Technology and Protection.

“That’s huge, especially if it is in Zhejiang province, which is where you have a lot of public security companies based in Hangzhou,” said Shazeda Ahmed, a Berkeley School of Information PhD student focusing on cybersecurity and internet policy in China.

“A local city government is the state, and if the use cases are related to public security, then you are working with the Ministry of Public Security,” Ms. Ahmed added. “Why would the government buy this technology and not use it for public security?”

NuraLogix told The Globe and Mail that while it did explore public security applications with Joyware, the co-operation did not result in any commercial applications. “Yes, we originally said there is a market for public security here in China, so let’s explore that, but we have done nothing with Joyware,” NuraLogix chief executive Marzio Pozzuoli told The Globe.

NuraLogix maintains that it is not aware of its technology ever being used to test surveillance systems on the Chinese public without their consent, or of any products or services sold by Joyware in China that use its technology.

The company allowed the agreement with Joyware to expire shortly after the share transfer, in November, 2020. Joyware can no longer experiment using NuraLogix’s technology because it no longer has access to the DeepAffex engine.

NuraLogix said that it had sold its products to two universities in China to assist with assessing students’ mental health. Students consented to these trials, which ended in 2020.

The Globe has reported on the introduction of cameras at a Chinese school in Hangzhou that can recognize the emotions of students and track their movements in class.

NuraLogix is not alone in licensing emotional recognition software to companies affiliated with the Chinese government. An Amnesty International investigation found that three EU-based companies exported similar digital surveillance technology that is now used by Chinese public security bureaus and criminal law enforcement agencies, including those in Xinjiang.

Article 19, a freedom of expression advocacy group, has called for emotional recognition technology to be banned outright after conducting an investigation into the human-rights implications of the technology’s use in China.

Its November, 2020, report warns that claims about the efficacy of emotional recognition are often accepted at face value in China, and that the concept is gaining traction in areas including public security, education and driving.

“We are saying ban emotional recognition outright so we don’t end up in the same weeds we see with facial recognition, where the argument is that if you could just iron out bias it would be OK,” said Ms. Ahmed, who co-authored the report with Article 19′s digital programme officer, Vidushi Marda. “It would not be OK, it would still be terrible.”

Critics say Canadian companies have no place sharing technology that claims to be able to infer people’s emotions, particularly if they assist Chinese public security companies with refining surveillance systems.

“A Canadian company working in China cannot do that for all sorts of reasons,” said Andrew McStay, a professor of digital life at Bangor University in Wales, who questions the effectiveness of the technology itself. “You cannot gauge whether a person is happy or sad by looking at their face – you cannot infer what a person is really feeling, or which emotion they are going through, it just does not work.”

For Prof. McStay, the use of emotional recognition in China raises deep ethical and privacy concerns.

“For the Canadian company to be [licensing] this in security contexts in China – that’s profoundly wrong,” Prof. McStay said, adding that the issue raises questions over how NuraLogix’s research came to be funded.

“It’s an interesting question for the Canadian government; maybe it needs to be discussed under the same banner as arms sales and dealership.”

Dr. Lee’s research has been funded by multiple Canadian government grants since he emigrated from China in 1989. He graduated with a Bachelor of Science in Psychology from China’s Hangzhou University in 1983, and moved to Canada in the early nineties to take up a teaching post at the University of New Brunswick.

One of the agencies supporting his work, NSERC, has been criticized for failing to safeguard Canada’s economic and national security after collaborating with the Canadian arm of Huawei Technologies Co. Ltd. to fund leading-edge computer and electrical engineering research at Canadian universities.

“Our experience with universities that have this kind of co-operation with China’s MPS or surveillance companies is that even the world’s top universities have no process to screen from a human-rights perspective,” said Human Rights Watch’s Ms. Wang. “That loophole needs to be closed.”
