Are you 80% angry and 2% sad? Why ‘emotional AI’ is fraught with problems



AI that purports to read our feelings may enhance user experience, but concerns over misuse and bias mean the field is fraught with potential dangers

It’s Wednesday evening and I’m at my kitchen table, scowling into my laptop as I pour all the bile I can muster into three little words: “I love you.”

“I appreciate your kind words, I’m here to support you,” Hume’s Empathic Voice Interface replies in a friendly, almost-human voice while my declaration of love appears transcribed and analysed on the screen: 1 for “love”, 0.642 for “adoration”, and 0.601 for “romance”.
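What the interface returns, in effect, is a set of per-emotion confidence scores alongside the transcript. A minimal sketch of how such scores might be represented and ranked follows; the score values are the ones quoted above, but the dictionary shape and the extra "anger" entry are illustrative assumptions, not Hume's actual API.

```python
# Hypothetical per-emotion confidence scores, as described in the passage
# above. The data structure is an assumption for illustration, not the
# real shape of Hume's EVI output.
scores = {"love": 1.0, "adoration": 0.642, "romance": 0.601, "anger": 0.013}

# Rank emotions from most to least confident and show the top three,
# mirroring the transcript-plus-scores readout on the screen.
for emotion, confidence in sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:3]:
    print(f"{emotion}: {confidence:.3f}")
# love: 1.000
# adoration: 0.642
# romance: 0.601
```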

Emotional AI’s essential problem is that we can’t definitively say what emotions are. “Put a room of psychologists together and you will have fundamental disagreements,” says McStay. “There is no baseline, agreed definition of what emotion is.”

“We only allow developers to deploy their applications if they’re listed as supported use cases,” Cowen says via email. “Of course, the Hume Initiative welcomes feedback and is open to reviewing new use cases as they emerge.”

Still, making predictions from statistical abstractions doesn’t mean an AI can’t be right, and certain uses of emotional AI could conceivably sidestep some of these issues.

A week after putting Hume’s EVI through its paces, I have a decidedly more sincere conversation with Lennart Högman, assistant professor in psychology at Stockholm University.

 


