
This photo taken on June 4, 2019 shows the Chinese flag behind razor wire at a housing compound in Yangisar, south of Kashgar, in China's western Xinjiang region.


As the Chinese government tracked and persecuted members of predominantly Muslim minority groups, technology giant Alibaba taught its corporate customers how they could play a part.

Alibaba’s website for its cloud computing business showed how clients could use its software to detect the faces of Uyghurs and other ethnic minorities within images and videos, according to pages on the site that were discovered by the surveillance industry publication IPVM and shared with The New York Times. The feature was built into Alibaba software that helps web platforms monitor digital content for material related to terrorism, pornography and other red-flag categories, the website said.

The discovery could thrust one of the world’s most valuable internet companies into the storm of international condemnation surrounding China’s treatment of its Muslim minorities.


The Chinese government has swept hundreds of thousands of Uyghurs and others into indoctrination camps as part of what it calls an anti-terrorism campaign. It has also rolled out a broad surveillance dragnet, using facial recognition and genetic testing, to monitor them. The U.S. government, among others, has denounced the program and penalized Chinese companies that are believed to be involved.

It could not be determined whether or how Alibaba’s clients had used the minority detection tool. But the potential for troubling use is high. A social-media platform, for instance, could automatically flag videos for additional scrutiny or even alert authorities if the videos contain faces that the software predicts are Uyghur.

After the Times asked Alibaba about the tool this week, the company edited its website to remove the references to Uyghur and minority faces.

“The ethnicity mention refers to a feature/function that was used within a testing environment during an exploration of our technical capability,” an Alibaba Cloud representative said in a written statement. “It was never used outside the testing environment.”

The company declined to say more about its testing or explain why information about the feature had been included in the official documentation of its software. It also declined to comment on why it had been testing tools for detecting ethnic minority faces.

Alibaba is a Chinese corporate giant with worldwide reach. It is perhaps Amazon’s sole global peer, a behemoth of digital commerce that has sprawled into logistics, groceries, bricks-and-mortar retail and cloud services. Alibaba’s shares trade on the New York Stock Exchange and are owned by major international investors. Global brands such as Nike, Starbucks and Ralph Lauren use its platforms to sell to Chinese shoppers. Alibaba is the official cloud services partner for the Olympic Games.

But the Trump administration has viewed Chinese technology companies with growing suspicion, particularly those that are seen as participating in human-rights abuses in Xinjiang, the western Chinese region that is home to many Uyghurs.


The administration last year added 28 Chinese entities, including manufacturers of surveillance gear and artificial-intelligence startups, to a trade blacklist over concerns about their role in the crackdown. Last month, the White House barred Americans from investing in a list of companies with ties to the Chinese military, a step toward severing Chinese businesses’ access to U.S. capital markets.

Chinese officials have defended the campaign in Xinjiang as a non-lethal way of fighting extremism. They have pointed to racial tensions in the United States to deflect U.S. officials’ criticisms.

Surveillance technology has been crucial to China’s efforts. The vast majority of the country’s population is Han Chinese, and members of other ethnic groups can look distinct enough from the Han majority that software can more easily single them out.

The Washington Post reported last week that Huawei, another Chinese tech giant, had tested software that could automatically alert police when its surveillance cameras detected Uyghur faces. The Post’s reporting, which cited a document that had been found on Huawei’s website, led a French soccer star, Antoine Griezmann, to cut ties with the company. He had been a brand ambassador for Huawei’s smartphones.

A Huawei spokesperson told the Post that the tool had been “simply a test.”

Facial recognition technology has posed ethical challenges in many places. In the U.S., the potential for inaccuracy and bias has led some local governments to block its use for law enforcement. Amazon this year imposed a one-year moratorium on police use of its facial recognition service to give lawmakers time to consider stronger regulations.


The ways in which the technology has been deployed in China have raised starker questions.

Before this week, Alibaba’s website had said that tools for detecting the faces of Uyghurs and other minorities were part of its “content security” service. The service helps Alibaba’s cloud clients flag potentially risky material within the images, videos, texts and documents that are uploaded to their digital platforms.

“As government regulation gets stricter by the day, these are tasks that all websites and platforms must urgently handle and manage seriously,” Alibaba’s website explains. The company is China’s leading provider of cloud services and a partner to international companies that have online operations in China.

The content security service can perform “facial recognition of sensitive people” using still pictures and videos, according to Alibaba’s website. When given an image of a face, the software can look for attributes including whether the person is wearing glasses or smiling, the website’s descriptions say.

Before Alibaba edited those descriptions this week, they had said that the software could evaluate two other attributes as well: whether a person is of Asian descent and whether they are a minority – which, as a description on another page added in parentheses, referred to Uyghurs.

The company’s online documentation in English for the same software contained no mention of detecting ethnic minorities, a possible indication the feature was intended principally for Chinese clients’ use.


Alibaba had not been alone in China in touting tools for automated racial profiling.

Another Chinese cloud provider, Kingsoft Cloud, had described on its website technology that could use an image of a face to predict “race,” among other attributes. According to a page and a document on Kingsoft Cloud’s website that were discovered by IPVM and shared with the Times, the company’s software could evaluate whether a person’s race was Uyghur or non-Uyghur.

After the Times asked Kingsoft Cloud about the software, the company purged those pages from its website. In a written statement, it said that the tool in question had never been sold to customers and that it had not been able to distinguish Uyghur faces.

The statement said that the software had slipped past the company’s internal review processes and that the company was evaluating those mechanisms to ensure proper oversight.

“The labeling on the basis of any race is inappropriate and inconsistent with Kingsoft Cloud’s policies and values,” the statement said. “Our products will never include any attempt to identify and label specific ethnic groups.”


Kingsoft Cloud is listed on the Nasdaq stock exchange.
