"Ordinary people here in China aren't happy about this technology but they have no choice. If the police say there have to be cameras in a community, people will just have to live with it."

So says Chen Wei at Taigusys, a company specialising in emotion-recognition technology, the latest evolution in the broader world of surveillance systems that play a part in nearly every aspect of Chinese society.

Emotion-recognition technologies – in which facial expressions of anger, sadness, happiness and boredom are tracked, along with other biometric data – are supposedly able to infer a person's feelings from traits such as facial muscle movements, vocal tone and body movements. They go beyond facial-recognition technologies, which simply compare faces to determine a match. But like facial recognition, they involve the mass collection of sensitive personal data to track, monitor and profile people, and use machine learning to analyse expressions and other clues.

The industry is booming in China, where since at least 2012 figures including President Xi Jinping have emphasised the creation of "positive energy" as part of an ideological campaign to encourage certain kinds of expression and limit others.

Critics say the technology is based on a pseudoscience of stereotypes, and a growing number of researchers, lawyers and rights activists believe it has serious implications for human rights, privacy and freedom of expression. With the global industry forecast to be worth nearly $36bn by 2023, growing at nearly 30% a year, rights groups say action needs to be taken now.

The main office of Taigusys is tucked behind a few low-rise office buildings in Shenzhen. Visitors are greeted at the doorway by a series of cameras capturing their images on a big screen that displays body temperature, age estimates and other statistics.
Chen, a general manager at the company, says the system in the doorway is the company's bestseller at the moment because of high demand during the coronavirus pandemic.

Chen hails emotion recognition as a way to predict dangerous behaviour by prisoners, detect potential criminals at police checkpoints, identify problem pupils in schools and spot elderly people experiencing dementia in care homes. Taigusys systems are installed in about 300 prisons, detention centres and remand facilities around China, connecting 60,000 cameras.

"Violence and suicide are very common in detention centres," says Chen. "Even if police nowadays don't beat prisoners, they often try to wear them down by not allowing them to fall asleep."