'Creepy technology': Zoom reportedly may develop AI tool that detects user emotions on video calls. Human rights groups call it invasion of privacy.
Photo by Robin Utrecht/SOPA Images/LightRocket via Getty Images

Video-conferencing outfit Zoom reportedly may develop an artificial intelligence tool that detects users' emotions by scanning facial expressions and examining vocal tones — but human rights groups say such technology would be an invasion of privacy, the Thomson Reuters Foundation reported.

The outlet said tech publication Protocol reported on the subject last month.

What are the details?

More than 25 groups — including Access Now, the American Civil Liberties Union, and the Muslim Justice League — on Wednesday sent a joint letter to Zoom chief executive Eric Yuan opposing the idea, the Thomson Reuters Foundation said.

"If Zoom advances with these plans, this feature will discriminate against people of certain ethnicities and people with disabilities, hardcoding stereotypes into millions of devices," Caitlin Seeley George — director of campaign and operations at Fight for the Future, a digital rights group — said, according to TRF. "Beyond mining users for profit and allowing businesses to capitalize on them, this technology could take on far more sinister and punitive uses."

The Thomson Reuters Foundation said Zoom didn't immediately respond to a request for comment.

More from TRF:

Zoom Video Communications Inc emerged as a major video conferencing platform around the world during COVID-19 lockdowns as education and work shifted online, reporting more than 200 million daily users at the height of the pandemic in 2020.

The company has already built tools that purport to analyze the sentiment of meetings based on text transcripts of video calls, and according to Protocol it also plans to explore more advanced emotion reading tools across its products.

In a blog post, Zoom said sentiment analysis technology can measure the "emotional tone of the conversations" to help salespeople do a better job, TRF reported.

However, the letter from the human rights groups said detection of users' emotions "is a violation of privacy and human rights" and that "Zoom needs to halt plans to advance this feature," TRF added.

'Creepy technology'

The outlet said emotion recognition tools are increasingly common even though critics compare them to facial recognition tools, which have high error rates on non-white faces and have resulted in wrongful arrests.

Esha Bhandari — deputy director of the ACLU Speech, Privacy, and Technology Project — called emotion detection AI "junk science," TRF said.

"There is no good reason for Zoom to mine its users' facial expressions, vocal tones, and eye movements to develop this creepy technology," Bhandari told the outlet.
