China’s population of more than 1.4 billion people is under near-constant surveillance. They are captured by police cameras installed on street corners and subway ceilings, as well as in hotel lobbies and apartment buildings. Their phones are traced, their purchases are scrutinised, and their internet activity is restricted.
But the story doesn’t end here. Even their future is now being watched.
The most recent generation of technology mines the vast troves of data collected on people’s daily activities for patterns and anomalies, promising to detect crimes or protests before they occur. These systems target people the Chinese government regards as potential troublemakers, along with people with a history of mental illness.
They can alert the authorities if a fraud victim tries to travel to Beijing to petition the government for compensation, or if a drug user makes an excessive number of calls to the same number. They can alert the police if a person with a history of mental illness approaches a school.
Procurement documents and other records obtained by The New York Times reveal the scope of this new technology. Though its effectiveness is unproven, it extends the reach of China’s social and political controls and embeds them ever deeper into everyday life, normalising sweeping surveillance and the erosion of privacy as a matter of principle.
Social stability is of utmost importance to the government, and any threat to it must be removed. According to The Times, the government has even cancelled weddings it considered unsafe.
Requests for comment sent to the Ministry of Public Security’s headquarters in Beijing and to six regional offices around the country received no response.
People are often unaware that they are being observed. Outside scrutiny of the technology’s effectiveness, or of the actions it prompts, is scant. In China, the police face few constraints in collecting personal information, and no warrant is required.
But how can we know the future has been accurately anticipated if police intervene before it occurs?
Experts say that even if the software fails to accurately predict human behaviour, the authorities may deem it a success, because the surveillance itself deters unrest and crime.
“This is an invisible cage of technology imposed on society,” said Maya Wang, a senior China researcher with Human Rights Watch, “the disproportionate brunt of it being felt by groups of people that are already severely discriminated against in Chinese society.”