China’s growing use of emotion recognition tech raises rights concerns

Posted by: Telegraf


Technology that measures emotions based on biometric indicators such as facial movements, tone of voice or body movements is increasingly being marketed in China, researchers say, despite concerns about its accuracy and wider human rights implications.

Drawing upon artificial intelligence, the tools range from cameras to help police monitor a suspect’s face during an interrogation to eye-tracking devices in schools that identify students who are not paying attention.

A report released this week by U.K.-based human rights group Article 19 identified dozens of companies offering such tools in the education, public security and transportation sectors in China.

“We believe that their design, development, deployment, sale and transfers should be banned due to the racist foundations and fundamental incompatibility with human rights,” said Vidushi Marda, a senior program officer at Article 19.

Human emotions cannot be reliably measured and quantified by technology tools, said Shazeda Ahmed, a doctoral candidate studying cybersecurity at the University of California, Berkeley, and co-author of the report.

Such systems can perpetuate bias, especially those sold to police that purport to identify criminality based on biometric indicators, she added.

The systems also raise concerns about an emerging trend of collecting emotional data to monitor people, from students and criminal suspects to drivers of cars equipped with recognition tech that detects fatigue and unsafe movements.

“A lot of these systems don’t talk about how they would act on the data and use it long term,” Ahmed said in a phone interview.

“We are quite concerned about function creep,” she added, referring to the use of data for purposes other than those for which it was collected.

Even seemingly benign applications of emotion recognition tech could lead to harm, Marda noted. “There’s a slippery slope with this kind of surveillance,” she said.

“Let’s say a school wants to introduce a camera system to see if students are eating nutritious meals at school — it then may be quickly transformed into an emotion recognition system, and then who knows what next?”

Many of the companies identified in the report are smaller Chinese startups that specialize in a particular kind of emotion recognition tool.

But the report also pointed out some major international firms involved in the market.

Lenovo, the world’s biggest personal computer maker, for example, markets “smart education solutions” that include “speech, gesture and facial emotion recognition,” the report noted.

The firm has sold education technology to more than a dozen Chinese provinces, but Article 19 researchers say it is unclear how many have deployed the systems.

Lenovo did not immediately respond to a request for comment.

Article 19 is worried that the kind of technology being marketed in China could be increasingly hooked into surveillance systems all over the world.

“When you have CCTV all around a city, it doesn’t cost that much to add a new emotion recognition service,” explained Marda. “This is not just a China problem.”

At this stage, the global market for emotion recognition technology is relatively small, the report said.

But the researchers cautioned that it is developing quickly, and without much scrutiny.

“We documented around 30 companies selling this technology,” Ahmed said. “That very well could be just the tip of the iceberg.”
