Fight for the Future, a non-profit organisation that advocates for digital rights, together with 27 other human rights organisations, has issued an open letter to Zoom requesting that the company stop experimenting with artificial intelligence (AI) designed to evaluate human emotions.
The letter was written in response to a Protocol report stating that Zoom is actively studying ways to apply emotion AI to its future products. That report is part of a larger piece examining how companies have begun employing artificial intelligence to detect the emotional state of potential clients during sales calls.
In the wake of the global pandemic, video conferencing has become more commonplace worldwide. Without the ability to read prospects’ body language through a screen, salespeople have had difficulty gauging how receptive potential customers are to their products and services. As a result, companies have begun using technology that can assess people’s moods during calls, and Protocol reported that Zoom intends to offer the same capability.
Fight for the Future and the other signatories are seeking to persuade Zoom to abandon these plans. The open letter describes the technology as “discriminatory, manipulative, potentially dangerous and based on assumptions that all people use the same facial expressions, voice patterns, and body language.”
The groups also asserted that the technology is inherently biased and racist, much like facial recognition, arguing that Zoom would be discriminating against people of colour and people with disabilities if it implemented the feature. They further warned that it could be used to discipline students or employees who exhibit the “wrong” emotions.
Alexa Hagerty, a researcher at the University of Cambridge, demonstrated the limitations of emotion recognition AI in 2021 through an experiment she directed, showing how easily such systems can be fooled. Previous research has also shown that emotion recognition programs fail racial bias tests and struggle to identify Black faces.
Pointing to Zoom’s earlier decision not to implement face-tracking functionality, the groups called this another opportunity for the company to do the right thing for its customers. They are now asking Zoom to commit, by May 20, 2022, to keeping emotion AI out of its products.