
In China, classroom cameras scan student faces for emotion, stoking fears of new form of state monitoring


A Chinese school has equipped several classrooms with cameras that can recognize the emotions of students, introducing a potent new form of artificial intelligence into education, while also raising alarms about a novel method of monitoring children for classroom compliance.

The cameras, installed at Hangzhou No. 11 High School, are designed to automatically take attendance and track what students are doing at any moment, including reading, writing or listening.

But they also promise to provide real-time data on students’ outward expressions, tracking whether they look scared, happy, disgusted, sad, surprised, angry or neutral. The system has been touted as a way to ensure students are attentive and happy, learning quickly and, ultimately, scoring well on tests.


Using the system, installed in March, the school can sort students into six behaviour categories: some might be immersed in learning, while others might be distracted. “We have a minimum score. If a student’s classroom score is lower than that, it means this student is failing to focus during class time,” said vice-principal Zhang Guanchao in an interview with The Paper, a Chinese state-run online news outlet. The system will then notify the teacher.

In its first month, Mr. Zhang said, the system has prompted students to “voluntarily change their behaviours and classroom habits,” allowing students to “attend classes more happily now.”

Hangzhou No. 11 High School has positioned itself at the leading edge of new technology, using facial recognition in its cafeteria and library for the convenience of students, who no longer need to carry meal and library cards.

But details of the emotion-tracking system, initially reported by Caijing media, have sparked a broad debate in China, where some parents raced to fundraise for similar technology in their own children’s classrooms while others raised fears about a new frontier of state monitoring in the classroom.

“Why not install this in university classrooms for political lectures, and score students on the degree of enthusiasm they show?” wrote one person on China’s Twitter-like Weibo social media.

“This technology is so twisted. It’s anti-human,” said Zhang Jing, a 23-year-old photographer who spoke out online about the Hangzhou classroom. He envisioned a future where teachers demand that students smile in class and “then there’s no difference between students and robots, right?”

The advent of emotion-tracking technology comes amid a broader push by Chinese authorities “to use education more and more as a form of social control,” said Jiang Xueqin, a researcher at the Harvard Graduate School of Education, who is a leading expert on the Chinese educational system.


Work units were once the Communist Party’s key lever of control. Now, in a country where labour is far more mobile, schools are being pressed into that role. Mr. Jiang recently travelled to a school in Hunan province, where he was told teachers are now turning their attention to educating parents, too, on the importance of ensuring their children understand “correct moral behaviour, the importance of the party and all that.”

“Schools in the future, I think, are going to become these laboratories for mass experimentation on how to control and predict human behaviours,” he added.

Those worried that authorities could take an interest in students’ emotional responses can point to history as grounds for their fear.

“Under Maoist penology, the goal was to ‘reform’ the malefactor’s thinking and that reform was judged by manifestations of sincerity,” said Andrew Nathan, a Columbia University political scientist who has studied China for decades.


George Orwell described and named such a system in his dystopian novel, 1984, calling it “facecrime.” The term has been revived by Chinese internet users discussing the Hangzhou school’s technology.

Chinese firms are hardly alone in pushing the boundaries of computational intelligence.

Microsoft, through its cognitive-services division, says it can parse “anger, contempt, disgust, fear, happiness, neutral, sadness and surprise” in pictures and videos. Affectiva, a Boston-based company, is studying how emotion recognition could be used by retailers to increase impulse buys and by cars to monitor drivers and passengers for drowsiness and discomfort.

Russia’s most prominent facial recognition company, Ntechlab, has also developed emotion-recognition algorithms. But even the best current technology doesn’t work well on an individual basis, warned founder Artem Kukharenko. “It’s very difficult to label the underlying emotion,” he said, since external expression and internal feeling don’t always correlate well.

The best current use is to examine averages of hundreds of people, say, entering and exiting a shopping mall, to see if emotions improve, as a way of evaluating their shopping experience. In a classroom, picking out individual emotions is unlikely to work well, he said. A better use might be to see whether the emotions of an entire class changed over time – a more effective way to assess the teacher than the students.

Some Chinese parents look at the Hangzhou school’s emotion-recognition system and see the potential to gain an advantage for their children; some took to social media to raise funds to pay for installing the technology in their own schools.

Outside China, however, the Hangzhou school’s system has raised alarm.

“My reaction was like, wow, that is scary,” said Jeffrey Ding, a researcher at the Future of Humanity Institute at the University of Oxford, who studies China’s development of artificial intelligence.

Western governments have also employed sophisticated surveillance and facial recognition systems, he said. But such technology often takes on a different complexion when it is used in authoritarian China, whose companies have already begun to export their products.

“It kind of portends a future where this all-seeing eye could be more and more implemented in a lot of our public and private spaces,” he said.

With reporting by Alexandra Li
