In this Nov. 2, 2017, file photo, a police officer guards a security post leading into a centre believed to be used for re-education in Korla in western China’s Xinjiang region.

Ng Han Guan/AP

Barely seven months ago, a senior Chinese official promised that artificial intelligence could one day help authorities spot crime before it happens.

In the country's far western Xinjiang region, it's already happening, with the establishment of a system that critics call "Orwellian" in scope and ambition, and which is being used to place people in political re-education.

Called the Integrated Joint Operations Platform, or IJOP, it assembles and parses data from facial-recognition cameras, WiFi internet sniffers, licence-plate cameras, police checkpoints, banking records and police reports made on mobile apps from home visits, a new report from Human Rights Watch finds.

If the system flags anything suspicious – a large purchase of fertilizer, perhaps, or stockpiles of food considered a marker of terrorism – it notifies police, who are expected to respond the same day and act according to what they find. "Who ought to be taken, should be taken," says a work report located by the rights organization.

Another official report shows how reports generated by IJOP are used to send people to an "Occupational Skills and Education Training Centre" where political re-education is carried out.

"We have documented the connection between a big-data program and detentions," said Maya Wang, senior China researcher at Human Rights Watch. "We are no longer saying that mass surveillance is deeply and widely intrusive when it comes to privacy rights, which of course is a big alarm. It goes further than that. People are being detained in an arbitrary manner because they are put in these political-education facilities."

Such re-education can involve forcibly detaining people for months at a time without charges to inculcate them in political doctrine considered acceptable by the Chinese state.

The system is being used in Xinjiang, a region whose largely Muslim Uyghur population has been accused of committing acts of terror in China and abroad. Uyghurs have fought in Afghanistan and Syria, and China has launched a series of "Strike Hard" campaigns in response.

The widespread use of political re-education is the latest attempt to root out what China calls extremism. Critics call it a racially motivated campaign directed at Uyghurs, who are being forced to pledge fealty to the Chinese state, study Mandarin Chinese and participate in cultural customs of the majority Han Chinese population.

Big data in policing "often exacerbates some of the biases," Ms. Wang said.

Chinese police theorists have identified specific "extremist behaviours, which include if you store a large amount of food in your home, if your child suddenly quits school and so on," she said. Train a computer to look for such conduct, and "then you have a big data program modelled upon pretty racist ideas about peaceful behaviours that are part of a Uyghur identity," she said.

The report "adds some pieces to the puzzle" over what is happening in Xinjiang, where it became clear over the last year "that tens or perhaps hundreds of thousands of Uyghurs were disappearing without having done anything illegal," said Rian Thum, a historian at Loyola University in New Orleans who has travelled extensively in Xinjiang.

"No Uyghur in Xinjiang today, not even the most submissive party loyalist, can go to sleep feeling certain that they won't be taken to the re-education camps," he said. "The notion of 'predictive policing' would go some way to explaining how people can disappear without having crossed any obvious line."

Chinese officials, however, have boasted that their new skill in sifting through information allows them to prevent the personal and societal damage that comes from crime.

The big-data platform in Xinjiang's Jiashi County, for example, "covers all sorts of information, such as geography, the migrant population, fertilizer purchases, gasoline and vehicles. Once it detects abnormal data, the system automatically raises an alarm," a police officer named Xu Linglei told China's Nanfang magazine.

He added: "Before the application of big data, police often only arrested people after they had committed wrongdoings and the victims suffered losses as a result. Now, relying on information technology, they can take preventive measures in advance."

Jiashi County was seen as a template for the rest of Xinjiang, the article said.

A report on a website maintained by the Communist Party's Committee of Political and Legal Affairs further describes how "public security organs throughout Xinjiang have built an integrated information prevention and control circle, intensifying and increasing the information collection of citizens' 'eating, living, travelling, consumption and entertainment.'"

Real-time data collection and analysis across Xinjiang provided "an effective means for timely detection of the whereabouts of those who may be involved in terrorism activities," the report said.

China's Ministry of Public Security has formed a leading work group with a focus on big data. At a national conference held this January, Minister of Public Security Zhao Kezhi described China's ambitions.

The conference outlined efforts at all levels of government to "realize the acute perception and accurate prediction of various hidden dangers and crime," China Youth Daily reported.

Elements of the policing system in Xinjiang are being set in place elsewhere in China, too, including the collection of data and integration of systems. But Xinjiang appears to be unique in the use of artificial intelligence to detain people in political re-education.

Chinese companies are also boasting about their big data prowess abroad.

At the Winter Olympics in South Korea, for example, Alibaba built a large pavilion to describe its capabilities and offer its services to the outside world. Its "ET City Brain," for example, can be used to improve timing of traffic lights and employ artificial intelligence to quickly route emergency services to an accident.

But Alibaba also boasts about the system's value in "social governance and public security," saying in an information presentation: "With video recognition technology and location-based services, authorities can respond to incidents precisely and quickly."

Alibaba provided a representative to describe the company's pavilion on the condition that the conversation remain off the record. The company declined interview requests at the Olympics.

With reporting by Alexandra Li.
