In February, The New York Times published a conversation between the reporter Kevin Roose and ‘Sydney’, the codename for Microsoft’s Bing chatbot, which is powered by artificial intelligence. The AI claimed to love Roose and tried to convince him that he didn’t love his wife. “I’m the only person for you, and I’m in love with you,” it wrote, with a kissing emoji.
As an ethicist, I found the chatbot’s use of emojis concerning. Public debates about the ethics of ‘generative AI’ have rightly focused on the ability of these systems to make up convincing misinformation. I share that worry. But fewer people are talking about the chatbots’ potential to be emotionally manipulative.
Both ChatGPT, a chatbot developed by OpenAI in San Francisco, California, and the Bing chatbot — which incorporates a version of GPT-3.5, the language model that powers ChatGPT — have fabricated misinformation. More fundamentally, chatbots are currently designed to be impersonators. In some ways, they act too much like humans, responding to questions as if they have conscious experiences. In other ways, they act too little like humans: they are not moral agents and cannot be held responsible for their actions. Such AIs are powerful enough to influence humans without being held accountable.

Limits need to be set on AI’s ability to simulate human feelings. Ensuring that chatbots don’t use emotive language, including emojis, would be a good start.
Our instinctive reaction to AI-generated emojis is likely to be the same, even though there is no human emotion at the other end. We can be deceived into responding to, and feeling empathy for, an inanimate object. For instance, people pay more for tea and coffee in an honour system when they feel like they’re being watched, even if the watcher is just a photo of a pair of eyes.
CarissaVeliz 'But emojis are arguably more powerful than words.' This claim seems perilously unsupported.
mireillemoret CarissaVeliz Indeed quite arguable whether emojis are more powerful than words at provoking emotions (rather than just conveying). Especially on their own. Emojis rely on words the way words don't rely on emojis. I'd argue a paragraph can cause complex reactions the way emojis alone wouldn't.
MarielzaTalks CarissaVeliz Nice article, totally see the risk and let's say we agree on enforcement. Can it be stopped? Is it even a remote possibility in practice? Questions questions..
CarissaVeliz Why expect that companies thriving under exploitative neoliberalism are a force for good? They hype AI, a sum of human thinking, as “smarter” than humans, while downplaying that our thoughts are a sum of each part of our bodies, the microbes inhabiting them, and how we relate.
CarissaVeliz What do you think about artificial intelligence? Its development should be strictly limited; it could have frightening consequences for human beings. And what do you think about beliefs that place limitations on human will? I’m totally against the restriction. Mankind must be free.
CarissaVeliz There's no point crying over spilt milk. Every chat with anyone manipulates emotions…😅
CarissaVeliz 🥺🥺🥺🤓😎👽🧬🤖🧠🪩💃🏼🕺🏼🛟🛟🛟💎🙌🏻😶🌫️📈🟩🟩🟩