Emotion Sensing Software On Zoom, Intel Calls Soon? - Vibes Of India



Updated: May 25, 2022 18:06

The talking point: Artificial intelligence software can now detect emotions during sales calls or in classrooms. This April, a press release from Intel and Classroom Technologies summed up their current joint project: a set of artificial intelligence tools that can identify the emotions of students in virtual classrooms. Zoom, too, is experimenting with ways to detect and analyze human emotions.

Is that a good thing? You decide. Suppose you are bored during a sales assessment call. Would you want your boss to know you think he is talking gibberish? Would any student want their teacher to know that the style of teaching is falling flat?

So, where is this heading? The debate continues over the use of artificial intelligence in our lives. Is it ethical? Meanwhile, Zoom is still in the planning stages of how to add emotion-detecting software to its platform.

On a technical note: “The premise of these technologies is that our faces and inner feelings are correlated in a very predictable way,” explains Alexa Hagerty, a researcher at the University of Cambridge Leverhulme Centre for the Future of Intelligence and the Centre for the Study of Existential Risk. “If I smile, I’m happy. If I frown, I’m angry. But the APA did this big review of the evidence in 2019, and they found that people’s emotional space cannot be readily inferred from their facial movements.” These emotion-reading systems also show racial bias: in one study, the AI assigned more negative emotions to Black men’s faces than to white men’s faces.
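The premise Hagerty describes can be reduced to a fixed lookup from facial expression to a single emotion label. The sketch below is purely illustrative (the expressions and labels are invented for this example, and real systems run classifiers over video frames); it shows the one-to-one mapping whose reliability the 2019 APA review disputed.

```python
# Naive premise: one facial expression maps to exactly one emotion.
# Illustrative mapping only; the APA's 2019 evidence review found that
# emotions cannot be readily inferred from facial movements this way.
EXPRESSION_TO_EMOTION = {
    "smile": "happy",
    "frown": "angry",
    "furrowed_brow": "confused",
    "neutral": "neutral",
}

def infer_emotion(expression: str) -> str:
    """Return the single label the naive premise would assign."""
    return EXPRESSION_TO_EMOTION.get(expression, "unknown")

print(infer_emotion("smile"))  # -> happy
```

The brittleness is visible even in the sketch: any expression outside the hand-built table falls through to "unknown", and a smile made in frustration is still labeled "happy".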

And that is a violation of personal privacy too: Correct, which is why some want this research wound up. Others believe it can be harmful to label a person with a single word or description: humans experience a myriad of emotions, so reducing someone’s state to “happy” or “bored” may be counterproductive. Beyond questions about the accuracy and helpfulness of the technology, “emotion AI” critics also question the morality of student surveillance.

In terms of business: The market for emotion detection and recognition is projected to grow from $23.6 billion in 2022 to $43.3 billion by 2027, according to a market forecast.
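The forecast figures imply a compound annual growth rate of roughly 12.9% per year. That rate is an inference from the two numbers in the article, not a figure stated in the forecast itself; the arithmetic is:

```python
# Implied compound annual growth rate (CAGR) from the article's figures:
# $23.6 billion in 2022 growing to $43.3 billion by 2027 (5 years).
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate over the given number of years."""
    return (end / start) ** (1 / years) - 1

rate = cagr(23.6, 43.3, 5)
print(f"Implied CAGR: {rate:.1%}")  # -> Implied CAGR: 12.9%
```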

In their defence: Sinem Aslan, a research scientist at Intel, stated the intention behind the tech was not surveillance. “We did not start this technology as a surveillance system. In fact, we don’t want this technology to be a surveillance system. As with all research projects, Intel and its collaborators abide by strict data privacy policies and adhere to ongoing oversight.”

