It’s Wednesday evening and I’m at my kitchen table, scowling into my laptop as I pour all the bile I can muster into three little words: “I love you.”
“I appreciate your kind words, I’m here to support you,” Hume’s Empathic Voice Interface replies in a friendly, almost-human voice while my declaration of love appears transcribed and analysed on the screen: 1 for “love”, 0.642 for “adoration”, and 0.601 for “romance”.
Emotional AI’s essential problem is that we can’t definitively say what emotions are. “Put a room of psychologists together and you will have fundamental disagreements,” says McStay. “There is no baseline, agreed definition of what emotion is.”

“We only allow developers to deploy their applications if they’re listed as supported use cases,” Cowen says via email. “Of course, the Hume Initiative welcomes feedback and is open to reviewing new use cases as they emerge.”
Still, making predictions from statistical abstractions doesn’t mean an AI can’t be right, and certain uses of emotional AI could conceivably sidestep some of these issues.

A week after putting Hume’s EVI through its paces, I have a decidedly more sincere conversation with Lennart Högman, assistant professor in psychology at Stockholm University.